Sample records for empirical analysis application

  1. The role of empirical Bayes methodology as a leading principle in modern medical statistics.

    PubMed

    van Houwelingen, Hans C

    2014-11-01

    This paper reviews and discusses the role of Empirical Bayes methodology in medical statistics in the last 50 years. It gives some background on the origin of the empirical Bayes approach and its link with the famous Stein estimator. The paper describes the application in four important areas in medical statistics: disease mapping, health care monitoring, meta-analysis, and multiple testing. It ends with a warning that the application of the outcome of an empirical Bayes analysis to the individual "subjects" is a delicate matter that should be handled with prudence and care. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
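
    The Stein-estimator connection mentioned above can be made concrete in a few lines of Python. This is a minimal illustrative sketch of positive-part James-Stein shrinkage on simulated data, not anything from the paper; the simulation setup and all names are assumptions.

      import numpy as np

      def james_stein(y):
          """Shrink unit-variance observations toward zero (positive-part
          James-Stein); the shrinkage factor is estimated from the data
          themselves, which is the empirical Bayes idea."""
          p = len(y)
          assert p >= 3, "shrinkage dominates only for 3+ coordinates"
          factor = max(0.0, 1.0 - (p - 2) / np.sum(y ** 2))
          return factor * y

      rng = np.random.default_rng(0)
      theta = rng.normal(0.0, 0.5, size=20)       # true effects
      y = theta + rng.normal(0.0, 1.0, size=20)   # one noisy observation each
      print(np.sum((james_stein(y) - theta) ** 2),  # shrunken: smaller error
            np.sum((y - theta) ** 2))               # raw estimates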

  2. Regional Morphology Analysis Package (RMAP): Empirical Orthogonal Function Analysis, Background and Examples

    DTIC Science & Technology

    2007-10-01

    Snippet (citation fragments): … 1984. Complex principal component analysis: Theory and examples. Journal of Climate and Applied Meteorology 23: 1660-1673. Hotelling, H. 1933. … Sediments 99. ASCE: 2,566-2,581. Von Storch, H., and A. Navarra. 1995. Analysis of climate variability: Applications of statistical techniques. Berlin. … ERDC TN-SWWRP-07-9, October 2007. Regional Morphology Analysis Package (RMAP): Empirical Orthogonal Function Analysis, Background and Examples.

  3. Evaluation of LTPP Climatic Data for Use in Mechanistic-Empirical Pavement Design Guide (MEPDG) Calibration and Other Pavement Analysis

    DOT National Transportation Integrated Search

    2015-02-01

    This TechBrief describes an evaluation of the Modern-Era Retrospective Analysis for Research and Applications (MERRA) product as an alternative climatic data source for the Mechanistic-Empirical Pavement Design Guide (MEPDG) and other transportation…

  4. Application of empirical mode decomposition in removing fidgeting interference in doppler radar life signs monitoring devices.

    PubMed

    Mostafanezhad, Isar; Boric-Lubecke, Olga; Lubecke, Victor; Mandic, Danilo P

    2009-01-01

    Empirical Mode Decomposition has been shown to be effective in the analysis of non-stationary and non-linear signals. As an application to wireless life-signs monitoring, in this paper we use this method to condition the signals obtained from the Doppler device. Random physical movements (fidgeting) of the human subject during a measurement can fall at the same frequency as the heart or respiration rate and interfere with the measurement. It is shown how Empirical Mode Decomposition can break the radar signal down into its components and help separate and remove the fidgeting interference.
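
    As a hedged sketch of the scheme this abstract describes: decompose the radar baseband signal with EMD, drop burst-like intrinsic mode functions (IMFs), and reconstruct. The PyEMD package and its EMD().emd() call are assumptions about a third-party library, the synthetic signal is invented, and the high-kurtosis selection rule is an illustrative stand-in for the paper's criterion.

      import numpy as np
      from scipy.stats import kurtosis
      from PyEMD import EMD  # assumed package (pip install EMD-signal)

      fs = 100.0                                   # sampling rate, Hz (assumed)
      t = np.arange(0, 30, 1 / fs)
      radar = (0.3 * np.sin(2 * np.pi * 1.2 * t)   # ~72 beats/min heartbeat
               + np.sin(2 * np.pi * 0.25 * t)      # respiration
               + 2.0 * np.exp(-(t - 15.0) ** 2)    # short fidgeting burst...
               * np.sin(2 * np.pi * 0.3 * t))      # ...overlapping respiration

      imfs = EMD().emd(radar)                      # intrinsic mode functions

      # Burst-like interference concentrates in IMFs with heavy-tailed
      # (high-kurtosis) amplitude distributions; keep the rest.
      cleaned = np.sum([m for m in imfs if kurtosis(m) < 5.0], axis=0)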

  5. Behavioral economics and empirical public policy.

    PubMed

    Hursh, Steven R; Roma, Peter G

    2013-01-01

    The application of economics principles to the analysis of behavior has yielded novel insights on value and choice across contexts ranging from laboratory animal research to clinical populations to national trends of global impact. Recent innovations in demand curve methods provide a credible means of quantitatively comparing qualitatively different reinforcers as well as quantifying the choice relations between concurrently available reinforcers. The potential of the behavioral economic approach to inform public policy is illustrated with examples from basic research, pre-clinical behavioral pharmacology, and clinical drug abuse research as well as emerging applications to public transportation and social behavior. Behavioral Economics can serve as a broadly applicable conceptual, methodological, and analytical framework for the development and evaluation of empirical public policy. © Society for the Experimental Analysis of Behavior.
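
    The "demand curve methods" referred to here are commonly written (e.g., in Hursh and Silberberg's 2008 exponential-demand work) as the equation below; the notation is the one standard in that literature, not quoted from this article:

      \log_{10} Q = \log_{10} Q_0 + k\,(e^{-\alpha Q_0 C} - 1)

    where Q is consumption at unit price C, Q_0 is consumption at zero price, k sets the range of consumption in log units, and alpha indexes the rate of decline in consumption (the reinforcer's "essential value"), which is what lets qualitatively different reinforcers be compared quantitatively.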

  6. Lexical Link Analysis Application: Improving Web Service to Acquisition Visibility Portal

    DTIC Science & Technology

    2013-09-30

    Snippet (fragments): … during the Empire Challenge 2008 and 2009 (EC08/09) field experiments and for numerous other field experiments of new technologies during Trident Warrior … Empirical Methods in Natural Language Processing and Very Large Corpora (EMNLP/VLC-2000) (pp. 63–70). Retrieved from http://nlp.stanford.edu/manning

  7. Exponential model for option prices: Application to the Brazilian market

    NASA Astrophysics Data System (ADS)

    Ramos, Antônio M. T.; Carvalho, J. A.; Vasconcelos, G. L.

    2016-03-01

    In this paper we report an empirical analysis of the Ibovespa index of the São Paulo Stock Exchange and its respective option contracts. We compare the empirical data on the Ibovespa options with two option pricing models, namely the standard Black-Scholes model and an empirical model that assumes that the returns are exponentially distributed. It is found that at times near the option expiration date the exponential model performs better than the Black-Scholes model, in the sense that it fits the empirical data better than does the latter model.
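
    For reference, the standard Black-Scholes benchmark the authors compare against can be sketched in a few lines of Python (textbook formula; the numbers in the example call are placeholders, and the exponential-returns pricing model itself is not reproduced here):

      from math import exp, log, sqrt
      from statistics import NormalDist

      def bs_call(S, K, T, r, sigma):
          """Black-Scholes price of a European call: S spot, K strike,
          T years to expiry, r risk-free rate, sigma volatility."""
          N = NormalDist().cdf
          d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
          d2 = d1 - sigma * sqrt(T)
          return S * N(d1) - K * exp(-r * T) * N(d2)

      print(bs_call(S=100.0, K=105.0, T=30 / 365, r=0.10, sigma=0.30))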

  8. Online Business Simulations: A Sustainable or Disruptive Innovation in Management Education?

    ERIC Educational Resources Information Center

    Earl, Jason Scott

    2012-01-01

    The focal goal of this research was to extend the empirical effort on business simulations as a form of experiential learning by providing the first empirical analysis of business acumen and knowledge application skills. Disruptions in technology are providing more opportunities to improve the simulation gaming learning experience and a number of…

  9. Regression Analysis by Example. 5th Edition

    ERIC Educational Resources Information Center

    Chatterjee, Samprit; Hadi, Ali S.

    2012-01-01

    Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…

  10. Community College Recruitment: An Analysis of Applicant Reactions.

    ERIC Educational Resources Information Center

    Winter, Paul A.; Kjorlien, Chad L.

    The purpose of this study was to: (1) conduct an empirical examination of applicant reactions to faculty jobs described in recruitment advertisements for business faculty vacancies at community colleges; and (2) assess factors that potentially impact applicant decisions to apply for and pursue position vacancies. The results of this study have…

  11. A comparison of a technical and a participatory application of social impact assessment.

    Treesearch

    Dennis R Becker; Charles C Harris; Erik A Nielsen; William J. McLaughlin

    2004-01-01

    Results of independent applications of a technical and a participatory approach to SIA are compared for an assessment of impacts of the proposed removal of hydroelectric dams to recover threatened and endangered salmon in the Pacific Northwest of the United States. The analysis focuses on empirical differences and similarities between the technical social analysis...

  12. Bayesian model reduction and empirical Bayes for group (DCM) studies

    PubMed Central

    Friston, Karl J.; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E.; van Wijk, Bernadette C.M.; Ziegler, Gabriel; Zeidman, Peter

    2016-01-01

    This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level – e.g., dynamic causal models – and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. PMID:26569570

  13. Final Report A Multi-Language Environment For Programmable Code Optimization and Empirical Tuning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yi, Qing; Whaley, Richard Clint; Qasem, Apan

    This report summarizes our effort and results in building an integrated optimization environment to effectively combine the programmable control and the empirical tuning of source-to-source compiler optimizations within the framework of multiple existing languages, specifically C, C++, and Fortran. The environment contains two main components: the ROSE analysis engine, which is based on the ROSE C/C++/Fortran2003 source-to-source compiler developed by Co-PI Dr. Quinlan et al. at DOE/LLNL, and the POET transformation engine, which is based on an interpreted program transformation language developed by Dr. Yi at the University of Texas at San Antonio (UTSA). The ROSE analysis engine performs advanced compiler analysis, identifies profitable code transformations, and then produces output in POET, a language designed to provide programmable control of compiler optimizations to application developers and to support the parameterization of architecture-sensitive optimizations so that their configurations can be empirically tuned later. This POET output can then be ported to different machines together with the user application, where a POET-based search engine empirically reconfigures the parameterized optimizations until satisfactory performance is found. Computational specialists can write POET scripts to directly control the optimization of their code. Application developers can interact with ROSE to obtain optimization feedback as well as provide domain-specific knowledge and high-level optimization strategies. The optimization environment is expected to support different levels of automation and programmer intervention, from fully-automated tuning to semi-automated development and to manual programmable control.

  14. How Can We Stop Our Children from Hurting Themselves? Stages of Change, Motivational Interviewing, and Exposure Therapy Applications for Non-Suicidal Self-Injury in Children and Adolescents

    ERIC Educational Resources Information Center

    Kamen, David G.

    2009-01-01

    Non-suicidal self-injury (NSSI) in children and adolescents is a major public health problem. Fortunately, we can apply functional analysis, in conjunction with empirically validated NSSI assessment measurements, to precisely evaluate the biopsychosocial risk factors and reinforcements that contextualize NSSI. Empirically validated behavioral…

  15. Survey and analysis of research on supersonic drag-due-to-lift minimization with recommendations for wing design

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Mann, Michael J.

    1992-01-01

    A survey of research on drag-due-to-lift minimization at supersonic speeds, including a study of the effectiveness of current design and analysis methods, was conducted. The results show that a linearized theory analysis with estimated attainable thrust and vortex force effects can predict with reasonable accuracy the lifting efficiency of flat wings. Significantly better wing performance can be achieved through the use of twist and camber. Although linearized theory methods tend to overestimate the amount of twist and camber required for a given application and provide an overly optimistic performance prediction, these deficiencies can be overcome by implementation of recently developed empirical corrections. Numerous examples of the correlation of experiment and theory are presented to demonstrate the applicability and limitations of linearized theory methods with and without empirical corrections. The use of an Euler code for the estimation of aerodynamic characteristics of a twisted and cambered wing and its application to design by iteration are discussed.

  16. Critical Realism and Empirical Bioethics: A Methodological Exposition.

    PubMed

    McKeown, Alex

    2017-09-01

    This paper shows how critical realism can be used to integrate empirical data and philosophical analysis within 'empirical bioethics'. The term empirical bioethics, whilst appearing oxymoronic, simply refers to an interdisciplinary approach to the resolution of practical ethical issues within the biological and life sciences, integrating social scientific, empirical data with philosophical analysis. It seeks to achieve a balanced form of ethical deliberation that is both logically rigorous and sensitive to context, to generate normative conclusions that are practically applicable to the problem, challenge, or dilemma. Since it incorporates both philosophical and social scientific components, empirical bioethics is a field that is consistent with the use of critical realism as a research methodology. The integration of philosophical and social scientific approaches to ethics has been beset with difficulties, not least because of the irreducibly normative, rather than descriptive, nature of ethical analysis and the contested relation between fact and value. However, given that facts about states of affairs inform potential courses of action and their consequences, there is a need to overcome these difficulties and successfully integrate data with theory. Previous approaches have been formulated to overcome obstacles in combining philosophical and social scientific perspectives in bioethical analysis; however each has shortcomings. As a mature interdisciplinary approach critical realism is well suited to empirical bioethics, although it has hitherto not been widely used. Here I show how it can be applied to this kind of research and explain how it represents an improvement on previous approaches.

  17. Systematic review of empiricism and theory in domestic minor sex trafficking research.

    PubMed

    Twis, Mary K; Shelton, Beth Anne

    2018-01-01

    Empiricism and the application of human behavior theory to inquiry are regarded as markers of high-quality research. Unfortunately, scholars have noted that there are many gaps in theory and empiricism within the human trafficking literature, calling into question the legitimacy of policies and practices that are derived from the available data. To date, there has not been an analysis of the extent to which empirical methods and human behavior theory have been applied to domestic minor sex trafficking (DMST) research as a subcategory of human trafficking inquiry. To fill this gap in the literature, this systematic review was designed to assess the degree to which DMST publications (a) are empirical and (b) apply human behavior theory to inquiry. This analysis also focuses on answering research questions related to patterns within DMST study data sources, and patterns of human behavior theory application. The results of this review indicate that a minority of sampled DMST publications are empirical, that a minority of the empirical articles apply a specific human behavior theory within the research design and reporting of results, that a minority of articles utilize data collected directly from DMST victims, and that there are no discernible patterns in the application of human behavior theory to DMST research. This research note suggests that DMST research is limited by the same challenges as the larger body of human trafficking scholarship. Based upon these overarching findings, specific recommendations are offered to DMST researchers who are committed to enhancing the quality of DMST scholarship.

  18. Bayesian model reduction and empirical Bayes for group (DCM) studies.

    PubMed

    Friston, Karl J; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E; van Wijk, Bernadette C M; Ziegler, Gabriel; Zeidman, Peter

    2016-03-01

    This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level - e.g., dynamic causal models - and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  19. An empirical analysis of an innovative application for an underutilized resource: small-diameter roundwood in recreational buildings

    Treesearch

    Randall Cantrell

    2004-01-01

    Builders were surveyed to explore perceptions regarding small-diameter roundwood (SDR). The study empirically tests a model of builders’ attitudes and opinions about using SDR as a building material in recreational buildings. Findings suggest that, of the 130 builders surveyed, most are likely to use SDR in recreational buildings when it meets the following criteria: 1...

  20. Quantifying the benefits to the national economy from secondary applications of NASA technology, executive summary

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The feasibility of systematically quantifying the economic benefits of secondary applications of NASA related R and D was investigated. Based upon the tools of economic theory and econometric analysis, a set of empirical methods was developed and selected applications were made to demonstrate their workability. Analyses of the technological developments related to integrated circuits, cryogenic insulation, gas turbines, and computer programs for structural analysis indicated substantial secondary benefits accruing from NASA's R and D in these areas.

  21. Quantifying the benefits to the national economy from secondary applications of NASA technology

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The feasibility of systematically quantifying the economic benefits of secondary applications of NASA related R and D is investigated. Based upon the tools of economic theory and econometric analysis, a set of empirical methods is developed and selected applications are made to demonstrate their workability. Analyses of the technological developments related to integrated circuits, cryogenic insulation, gas turbines, and computer programs for structural analysis indicated substantial secondary benefits accruing from NASA's R and D in these areas.

  22. Econometrics of exhaustible resource supply: a theory and an application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epple, D.

    1983-01-01

    This report takes a major step toward developing a fruitful approach to empirical analysis of resource supply. It is the first empirical application of resource theory that has successfully integrated the effects of depletion of nonrenewable resources with the effects of uncertainty about future costs and prices on supply behavior. Thus, the model is a major improvement over traditional engineering-optimization models that assume complete certainty, and over traditional econometric models that are only implicitly related to the theory of resource supply. The model is used to test hypotheses about interdependence of oil and natural gas discoveries, depletion, ultimate recovery, and the role of price expectations. This paper demonstrates the feasibility of using exhaustible resource theory in the development of empirically testable models. 19 refs., 1 fig., 5 tabs.

  23. Expanding the use of empiricism in nursing: can we bridge the gap between knowledge and clinical practice?

    PubMed

    Giuliano, Karen K

    2003-04-01

    The philosophy of Aristotle and its impact on the process of empirical scientific inquiry has been substantial. The influence of the clarity and orderliness of his thinking, when applied to the acquisition of knowledge in nursing, cannot be overstated. Traditional empirical approaches have had, and will continue to have, an important influence on the development of nursing knowledge through nursing research. However, as nursing is primarily a practice discipline, the transition from empirical and syllogistic reasoning to clinical practice is problematic. Other types of inquiry are essential in the application of nursing knowledge obtained by empirical scientific approaches and to understand how that knowledge can best be used in the care of patients. This paper reviews the strengths and limitations of syllogistic reasoning by applying it to a recently published study on temperature measurement in nursing. It then discusses possible ways that the empirical knowledge gained from that study, and confirmed in its reasoning by logical analysis, could be used in the daily care of critically ill patients. It concludes by highlighting the utility of broader approaches to knowledge development, including interpretative approaches and contemporary empiricism, as a way to bridge the gap between factual empirical knowledge and the practical application of that knowledge in everyday clinical nursing practice.

  24. A study of fault prediction and reliability assessment in the SEL environment

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Patnaik, Debabrata

    1986-01-01

    An empirical study on estimation and prediction of faults, prediction of fault detection and correction effort, and reliability assessment in the Software Engineering Laboratory (SEL) environment is presented. Fault estimation using empirical relationships and fault prediction using a curve-fitting method are investigated. Relationships between debugging efforts (fault detection and correction effort) in different test phases are provided, in order to make an early estimate of future debugging effort. The study concludes with the fault analysis, application of a reliability model, and analysis of a normalized metric for reliability assessment and reliability monitoring during development of software.

  25. An empirical comparison of a dynamic software testability metric to static cyclomatic complexity

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.

    1993-01-01

    This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.

  26. Innovation Attributes, Policy Intervention, and the Diffusion of Computer Applications Among Local Governments

    ERIC Educational Resources Information Center

    Perry, James L.; Kraemer, Kenneth L.

    1978-01-01

    Argues that innovation attributes, together with policies associated with the diffusion of an innovation, account for significant differences in diffusion patterns. An empirical analysis of this thesis focuses on the diffusion of computer applications software in local government. Available from Elsevier Scientific Publishing Co., Box 211,…

  27. Mixture Distribution Latent State-Trait Analysis: Basic Ideas and Applications

    ERIC Educational Resources Information Center

    Courvoisier, Delphine S.; Eid, Michael; Nussbeck, Fridtjof W.

    2007-01-01

    Extensions of latent state-trait models for continuous observed variables to mixture latent state-trait models with and without covariates of change are presented that can separate individuals differing in their occasion-specific variability. An empirical application to the repeated measurement of mood states (N = 501) revealed that a model with 2…

  28. Empirical intrinsic geometry for nonlinear modeling and time series filtering.

    PubMed

    Talmon, Ronen; Coifman, Ronald R

    2013-07-30

    In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.

  29. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented. (From Authors)

  30. One Hundred Years of Research: Prudent Aspirations

    ERIC Educational Resources Information Center

    Glass, Gene V.

    2016-01-01

    The statistical method "meta-analysis" is perhaps unique as a contribution to empirical inquiry of many types because it arose entirely within the practice of education research. In spite of its origins, meta-analysis has found its widest application and most important contributions in the field of medicine. Contrasting the success of…

  31. Technology's Effect on Achievement in Higher Education: A Stage I Meta-Analysis of Classroom Applications

    ERIC Educational Resources Information Center

    Schmid, Richard F.; Bernard, Robert M.; Borokhovski, Eugene; Tamim, Rana; Abrami, Philip C.; Wade, C. Anne; Surkes, Michael A.; Lowerison, Gretchen

    2009-01-01

    This paper reports the findings of a Stage I meta-analysis exploring the achievement effects of computer-based technology use in higher education classrooms (non-distance education). An extensive literature search revealed more than 6,000 potentially relevant primary empirical studies. Analysis of a representative sample of 231 studies (k = 310)…

  32. Empirical prediction of peak pressure levels in anthropogenic impulsive noise. Part I: Airgun arrays signals.

    PubMed

    Galindo-Romero, Marta; Lippert, Tristan; Gavrilov, Alexander

    2015-12-01

    This paper presents an empirical linear equation to predict peak pressure level of anthropogenic impulsive signals based on its correlation with the sound exposure level. The regression coefficients are shown to be weakly dependent on the environmental characteristics but governed by the source type and parameters. The equation can be applied to values of the sound exposure level predicted with a numerical model, which provides a significant improvement in the prediction of the peak pressure level. Part I presents the analysis for airgun arrays signals, and Part II considers the application of the empirical equation to offshore impact piling noise.
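
    The kind of empirical linear relation described here, fitted and then applied to modelled sound exposure levels, can be sketched as follows; the synthetic data, coefficients and noise level are invented for illustration and are not the paper's regression results.

      import numpy as np

      rng = np.random.default_rng(1)
      sel = rng.uniform(140.0, 180.0, size=200)        # sound exposure level, dB
      pk = 1.1 * sel + 12.0 + rng.normal(0, 1.5, 200)  # peak level, dB (synthetic)

      slope, intercept = np.polyfit(sel, pk, 1)        # least-squares line

      # Apply the fitted relation to SEL values from a propagation model:
      sel_modelled = np.array([150.0, 160.0, 170.0])
      print(slope * sel_modelled + intercept)          # predicted peak levels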

  33. Falsification Testing of Instrumental Variables Methods for Comparative Effectiveness Research.

    PubMed

    Pizer, Steven D

    2016-04-01

    Objective: To demonstrate how falsification tests can be used to evaluate instrumental variables methods applicable to a wide variety of comparative effectiveness research questions. Study design: Brief conceptual review of instrumental variables and falsification testing principles and techniques, accompanied by an empirical application. Sample STATA code related to the empirical application is provided in the Appendix. Empirical application: Comparative long-term risks of sulfonylureas and thiazolidinediones for management of type 2 diabetes. Outcomes include mortality and hospitalization for an ambulatory care-sensitive condition. Prescribing pattern variations are used as instrumental variables. Conclusions: Falsification testing is an easily computed and powerful way to evaluate the validity of the key assumption underlying instrumental variables analysis. If falsification tests are used, instrumental variables techniques can help answer a multitude of important clinical questions. © Health Research and Educational Trust.

  34. Evidence-based ethics – What it should be and what it shouldn't

    PubMed Central

    Strech, Daniel

    2008-01-01

    Background: The concept of evidence-based medicine has strongly influenced the appraisal and application of empirical information in health care decision-making. One principal characteristic of this concept is the distinction between "evidence" in the sense of high-quality empirical information on the one hand and rather low-quality empirical information on the other hand. In the last 5 to 10 years an increasing number of articles published in international journals have made use of the term "evidence-based ethics", making a systematic analysis and explication of the term and its applicability in ethics important. Discussion: In this article four descriptive and two normative characteristics of the general concept "evidence-based" are presented and explained systematically. These characteristics are then to serve as a framework for assessing the methodological and practical challenges of evidence-based ethics as a developing methodology. The superiority of evidence in contrast to other empirical information has several normative implications, such as the legitimization of decisions in medicine and ethics. This implicit normativity poses ethical concerns if there is no formal consent on which sort of empirical information deserves the label "evidence" and which does not. In empirical ethics, which relies primarily on interview research and other methods from the social sciences, we still lack gold standards for assessing the quality of study designs and appraising their findings. Conclusion: The use of the term "evidence-based ethics" should be discouraged, unless there is enough consensus on how to differentiate between high- and low-quality information produced by empirical ethics. In the meantime, whenever empirical information plays a role, the process of ethical decision-making should make use of systematic reviews of empirical studies that involve a critical appraisal and comparative discussion of data. PMID:18937838

  35. Analysis of experts' perception of the effectiveness of teaching methods

    NASA Astrophysics Data System (ADS)

    Kindra, Gurprit S.

    1984-03-01

    The present study attempts to shed light on the perceptions of business educators regarding the effectiveness of six methodologies in achieving Gagné's five learning outcomes. Results of this study empirically confirm the oft-stated contention that no one method is globally effective for the attainment of all objectives. Specifically, business games, traditional lecture, and case study methods are perceived to be most effective for the learning of application, knowledge acquisition, and analysis and application, respectively.

  36. Heuristic Evaluation of E-Learning Courses: A Comparative Analysis of Two E-Learning Heuristic Sets

    ERIC Educational Resources Information Center

    Zaharias, Panagiotis; Koutsabasis, Panayiotis

    2012-01-01

    Purpose: The purpose of this paper is to discuss heuristic evaluation as a method for evaluating e-learning courses and applications and more specifically to investigate the applicability and empirical use of two customized e-learning heuristic protocols. Design/methodology/approach: Two representative e-learning heuristic protocols were chosen…

  37. Applications of Nonlinear Principal Components Analysis to Behavioral Data.

    ERIC Educational Resources Information Center

    Hicks, Marilyn Maginley

    1981-01-01

    An empirical investigation of the statistical procedure entitled nonlinear principal components analysis was conducted on a known equation and on measurement data in order to demonstrate the procedure and examine its potential usefulness. This method was suggested by R. Gnanadesikan and based on an early paper of Karl Pearson. (Author/AL)

  38. Formalization and Analysis of Reasoning by Assumption

    ERIC Educational Resources Information Center

    Bosse, Tibor; Jonker, Catholijn M.; Treur, Jan

    2006-01-01

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically…

  39. Advances in the application of amino acid nitrogen isotopic analysis in ecological and biogeochemical studies

    USDA-ARS?s Scientific Manuscript database

    Compound-specific isotopic analysis of amino acids (CSIA-AA) has emerged in the last decade as a powerful approach for tracing the origins and fate of nitrogen in ecological and biogeochemical studies. This approach is based on the empirical knowledge that source AAs (i.e., phenylalanine), fractiona...

  40. Analysis methods for Kevlar shield response to rotor fragments

    NASA Technical Reports Server (NTRS)

    Gerstle, J. H.

    1977-01-01

    Several empirical and analytical approaches to rotor burst shield sizing are compared and principal differences in metal and fabric dynamic behavior are discussed. The application of transient structural response computer programs to predict Kevlar containment limits is described. For preliminary shield sizing, present analytical methods are useful if insufficient test data for empirical modeling are available. To provide other information useful for engineering design, analytical methods require further developments in material characterization, failure criteria, loads definition, and post-impact fragment trajectory prediction.

  41. Five ways of being "theoretical": applications to provider-patient communication research.

    PubMed

    Hall, Judith A; Schmid Mast, Marianne

    2009-03-01

    Analyzes the term "theoretical" as it applies to the area of provider-patient communication research, in order to understand better at a conceptual level what the term may mean for authors and critics. Based on literature on provider-patient communication. Offers, and discusses, five definitions of the term "theoretical" as it applies to empirical research and its exposition: (1) grounding, (2) referencing, (3) design and analysis, (4) interpretation, and (5) impact. Each of these definitions embodies a different standard for evaluating the theoretical aspects of research. Although it is often said that research on provider-patient communication is not "theoretical" enough, the term is ambiguous and often applied vaguely. A multidimensional analysis reveals that there are several distinct ways in which empirical research can be strong or weak theoretically. Researchers, educators, editors, and reviewers could use the "Five Ways" framework to appraise the theory-relevant strengths and weaknesses of empirical research and its exposition.

  42. Application of the Semi-Empirical Force-Limiting Approach for the CoNNeCT SCAN Testbed

    NASA Technical Reports Server (NTRS)

    Staab, Lucas D.; McNelis, Mark E.; Akers, James C.; Suarez, Vicente J.; Jones, Trevor M.

    2012-01-01

    The semi-empirical force-limiting vibration method was developed and implemented for payload testing to limit the structural impedance mismatch (high force) that occurs during shaker vibration testing. The method has since been extended for use in analytical models. The Space Communications and Navigation Testbed (SCAN Testbed) project, known at NASA as the Communications, Navigation, and Networking re-Configurable Testbed (CoNNeCT), utilized force-limiting testing and analysis following the semi-empirical approach. This paper presents the steps in performing a force-limiting analysis and then compares the results to test data recovered during the CoNNeCT force-limiting random vibration qualification test that took place at the NASA Glenn Research Center (GRC) Structural Dynamics Laboratory (SDL) from December 19, 2010 to January 7, 2011. A compilation of lessons learned and considerations for future force-limiting tests is also included.

  43. Modern Empirical Statistical Spectral Analysis.

    DTIC Science & Technology

    1980-05-01

    Snippet (citation fragments): … 716-723. Akaike, H. (1977). On entropy maximization principle. Applications of Statistics, P. R. Krishnaiah, ed., North-Holland, Amsterdam, 27-41. … by P. Krishnaiah, North-Holland: Amsterdam, 283-295. Parzen, E. (1979). Forecasting and whitening filter estimation. TIMS Studies in the Management …

  44. Clustering and Dimensionality Reduction to Discover Interesting Patterns in Binary Data

    NASA Astrophysics Data System (ADS)

    Palumbo, Francesco; D'Enza, Alfonso Iodice

    Attention to binary data coding has increased considerably in the last decade, for several reasons. The analysis of binary data characterizes several fields of application, such as market basket analysis, DNA microarray data, image mining, text mining and web-clickstream mining. The paper illustrates two different approaches exploiting a profitable combination of clustering and dimensionality reduction for the identification of non-trivial association structures in binary data. An application in the Association Rules framework supports the theory with empirical evidence.

  45. A Geographic-Information-Systems-Based Approach to Analysis of Characteristics Predicting Student Persistence and Graduation

    ERIC Educational Resources Information Center

    Ousley, Chris

    2010-01-01

    This study sought to provide empirical evidence regarding the use of spatial analysis in enrollment management to predict persistence and graduation. The research utilized data from the 2000 U.S. Census and applicant records from The University of Arizona to study the spatial distributions of enrollments. Based on the initial results, stepwise…

  46. Classical Item Analysis Using Latent Variable Modeling: A Note on a Direct Evaluation Procedure

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2011-01-01

    A directly applicable latent variable modeling procedure for classical item analysis is outlined. The method allows one to point and interval estimate item difficulty, item correlations, and item-total correlations for composites consisting of categorical items. The approach is readily employed in empirical research and as a by-product permits…

  47. Sorption and reemission of formaldehyde by gypsum wallboard. Report for June 1990-August 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, J.C.S.

    1993-01-01

    The paper gives results of an analysis of the sorption and desorption of formaldehyde by unpainted wallboard, using a mass transfer model based on the Langmuir sorption isotherm. The sorption and desorption rate constants are determined by short-term experimental data. Long-term sorption and desorption curves are developed by the mass transfer model without any adjustable parameters. Compared with other empirically developed models, the mass transfer model has more extensive applicability and provides an elucidation of the sorption and desorption mechanism that empirical models cannot. The mass transfer model is also more feasible and accurate than empirical models for applications such as scale-up and exposure assessment. For a typical indoor environment, the model predicts that gypsum wallboard is a much stronger sink for formaldehyde than for other indoor air pollutants such as tetrachloroethylene and ethylbenzene. The strong sink effects are reflected by the high equilibrium capacity and slow decay of the desorption curve.
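
    The kinetics underlying a Langmuir-isotherm mass-transfer model can be written in the standard textbook form below; the symbols are chosen here for exposition and are not taken from the report:

      \frac{dM}{dt} = k_a\,C\,(M_{\max} - M) - k_d\,M

    where C is the gas-phase formaldehyde concentration, M the sorbed mass per unit wallboard area, M_max the monolayer capacity, and k_a, k_d the sorption and desorption rate constants; setting dM/dt = 0 recovers the Langmuir isotherm M/M_max = KC/(1 + KC) with K = k_a/k_d.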

  48. A multiple indicator solution approach to endogeneity in discrete-choice models for environmental valuation.

    PubMed

    Mariel, Petr; Hoyos, David; Artabe, Alaitz; Guevara, C Angelo

    2018-08-15

    Endogeneity is an often neglected issue in empirical applications of discrete choice modelling despite its severe consequences in terms of inconsistent parameter estimation and biased welfare measures. This article analyses the performance of the multiple indicator solution method to deal with endogeneity arising from omitted explanatory variables in discrete choice models for environmental valuation. We also propose and illustrate a factor analysis procedure for the selection of the indicators in practice. Additionally, the performance of this method is compared with the recently proposed hybrid choice modelling framework. In an empirical application we find that the multiple indicator solution method and the hybrid model approach provide similar results in terms of welfare estimates, although the multiple indicator solution method is more parsimonious and notably easier to implement. The empirical results open a path to explore the performance of this method when endogeneity is thought to have a different cause or under a different set of indicators. Copyright © 2018 Elsevier B.V. All rights reserved.

  49. Quantitative genetic versions of Hamilton's rule with empirical applications

    PubMed Central

    McGlothlin, Joel W.; Wolf, Jason B.; Brodie, Edmund D.; Moore, Allen J.

    2014-01-01

    Hamilton's theory of inclusive fitness revolutionized our understanding of the evolution of social interactions. Surprisingly, an incorporation of Hamilton's perspective into the quantitative genetic theory of phenotypic evolution has been slow, despite the popularity of quantitative genetics in evolutionary studies. Here, we discuss several versions of Hamilton's rule for social evolution from a quantitative genetic perspective, emphasizing its utility in empirical applications. Although evolutionary quantitative genetics offers methods to measure each of the critical parameters of Hamilton's rule, empirical work has lagged behind theory. In particular, we lack studies of selection on altruistic traits in the wild. Fitness costs and benefits of altruism can be estimated using a simple extension of phenotypic selection analysis that incorporates the traits of social interactants. We also discuss the importance of considering the genetic influence of the social environment, or indirect genetic effects (IGEs), in the context of Hamilton's rule. Research in social evolution has generated an extensive body of empirical work focusing—with good reason—almost solely on relatedness. We argue that quantifying the roles of social and non-social components of selection and IGEs, in addition to relatedness, is now timely and should provide unique additional insights into social evolution. PMID:24686930
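
    For orientation, Hamilton's rule in its textbook form states that altruism can evolve when

      r\,b - c > 0,

    where c is the fitness cost to the actor, b the fitness benefit to the recipient, and r their relatedness. In the phenotypic selection extension the abstract alludes to, c and b are estimated as partial regression coefficients of relative fitness on the focal individual's trait and on the social partner's trait; that reading of the coefficients is a common formulation rather than a quotation from this paper.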

  50. Evaluation of Regression Models of Balance Calibration Data Using an Empirical Criterion

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert; Volden, Thomas R.

    2012-01-01

    An empirical criterion for assessing the significance of individual terms of regression models of wind tunnel strain gage balance outputs is evaluated. The criterion is based on the percent contribution of a regression model term. It considers a term to be significant if its percent contribution exceeds the empirical threshold of 0.05%. The criterion has the advantage that it can easily be computed using the regression coefficients of the gage outputs and the load capacities of the balance. First, a definition of the empirical criterion is provided. Then, it is compared with an alternate statistical criterion that is widely used in regression analysis. Finally, calibration data sets from a variety of balances are used to illustrate the connection between the empirical and the statistical criterion. A review of these results indicated that the empirical criterion seems to be suitable for a crude assessment of the significance of a regression model term as the boundary between a significant and an insignificant term cannot be defined very well. Therefore, regression model term reduction should only be performed by using the more universally applicable statistical criterion.
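
    A sketch of how a percent-contribution screen of this kind can be computed in Python. The definition below (a term's coefficient evaluated at the load capacities, expressed as a percentage of the gage output range) and all numbers are illustrative assumptions, not the paper's exact criterion:

      def percent_contribution(coef, term_at_capacity, output_range):
          """Percent contribution of one regression-model term when the
          balance is loaded at capacity (illustrative definition)."""
          return 100.0 * abs(coef * term_at_capacity) / output_range

      # Quadratic term c * N1**2, N1 capacity 1000 lbs, 2.0 mV/V output range.
      pc = percent_contribution(coef=1.0e-10,
                                term_at_capacity=1000.0 ** 2,
                                output_range=2.0)
      print(pc, "significant" if pc > 0.05 else "insignificant")  # 0.05 % cut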

  51. An Analysis of the Application and Enrollment Processes for In-State and Out-of-State Students at a Large Public University.

    ERIC Educational Resources Information Center

    Curs, Bradley; Singell, Larry D., Jr.

    2002-01-01

    Two separate empirical analyses use time-series data for the University of Oregon to estimate and compare the responsiveness of applicants and enrollees to variations in the net price. Results show that prior studies may understate student price responsiveness. Finds that elasticity estimates differ for in-state and out-of-state students. Suggests…

  52. Conceptualisations of infinity by primary pre-service teachers

    NASA Astrophysics Data System (ADS)

    Date-Huxtable, Elizabeth; Cavanagh, Michael; Coady, Carmel; Easey, Michael

    2018-05-01

    As part of the Opening Real Science: Authentic Mathematics and Science Education for Australia project, an online mathematics learning module embedding conceptual thinking about infinity in science-based contexts, was designed and trialled with a cohort of 22 pre-service teachers during 1 week of intensive study. This research addressed the question: "How do pre-service teachers conceptualise infinity mathematically?" Participants argued the existence of infinity in a summative reflective task, using mathematical and empirical arguments that were coded according to five themes: definition, examples, application, philosophy and teaching; and 17 codes. Participants' reflections were differentiated as to whether infinity was referred to as an abstract (A) or a real (R) concept or whether both (B) codes were used. Principal component analysis of the reflections, using frequency of codings, revealed that A and R codes occurred at different frequencies in three groups of reflections. Distinct methods of argument were associated with each group of reflections: mathematical numerical examples and empirical measurement comparisons characterised arguments for infinity as an abstract concept, geometric and empirical dynamic examples and belief statements characterised arguments for infinity as a real concept and empirical measurement and mathematical examples and belief statements characterised arguments for infinity as both an abstract and a real concept. An implication of the results is that connections between mathematical and empirical applications of infinity may assist pre-service teachers to contrast finite with infinite models of the world.

  53. Statistical detection of EEG synchrony using empirical bayesian inference.

    PubMed

    Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven

    2015-01-01

    There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high-dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit complex dependence structure between hypotheses that vary in spectral, temporal and spatial dimension. Previously, we showed that a hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis to compute FDR as a posterior probability that an observed statistic belongs to a null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach for PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that locFDR can effectively control false positives without compromising on the power of PLV synchrony inference. Our results from applying locFDR to the experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries.
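
    A rough sketch of the two-groups local FDR computation in Efron's sense, applied to z-valued synchrony statistics: estimate the mixture density from the data, take the theoretical null, and report the posterior null probability. The kernel density estimate, the null proportion pi0 = 1 and the 0.2 cutoff are simplifying assumptions, not the authors' implementation.

      import numpy as np
      from scipy.stats import gaussian_kde, norm

      rng = np.random.default_rng(2)
      # Synthetic z-scores: 90% null, 10% shifted alternatives.
      z = np.concatenate([rng.normal(0, 1, 900), rng.normal(3, 1, 100)])

      f = gaussian_kde(z)          # estimated mixture density f(z)
      pi0 = 1.0                    # conservative null proportion (assumption)
      locfdr = np.clip(pi0 * norm.pdf(z) / f(z), 0.0, 1.0)

      print(np.sum(locfdr < 0.2))  # "discoveries" at a conventional cutoff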

  54. Beyond Autism Treatment: The Application of Applied Behavior Analysis in the Treatment of Emotional and Psychological Disorders

    ERIC Educational Resources Information Center

    Ross, Robert K.

    2007-01-01

    The field of applied behavior analysis (ABA) has increasingly come to be associated with the treatment of autism in young children. This phenomenon is largely the result of empirical research demonstrating effective treatment outcomes in this population. The same cannot be said with regard to the treatment of conditions often referred to as…

  55. Nonparametric bootstrap analysis with applications to demographic effects in demand functions.

    PubMed

    Gozalo, P L

    1997-12-01

    "A new bootstrap proposal, labeled smooth conditional moment (SCM) bootstrap, is introduced for independent but not necessarily identically distributed data, where the classical bootstrap procedure fails.... A good example of the benefits of using nonparametric and bootstrap methods is the area of empirical demand analysis. In particular, we will be concerned with their application to the study of two important topics: what are the most relevant effects of household demographic variables on demand behavior, and to what extent present parametric specifications capture these effects." excerpt

  56. An empirical comparative study on biological age estimation algorithms with an application of Work Ability Index (WAI).

    PubMed

    Cho, Il Haeng; Park, Kyung S; Lim, Chang Joo

    2010-02-01

    In this study, we described the characteristics of five different biological age (BA) estimation algorithms, including (i) multiple linear regression, (ii) principal component analysis, and the somewhat unique methods developed by (iii) Hochschild, (iv) Klemera and Doubal, and (v) a variant of Klemera and Doubal's method. The objective of this study is to find the most appropriate method of BA estimation by examining the association between the Work Ability Index (WAI) and the differences of each algorithm's estimates from chronological age (CA). The WAI was found to be a measure that reflects an individual's current health status rather than deterioration strongly dependent on age. Experiments were conducted on 200 Korean male participants using a BA estimation system developed principally under the concepts of being non-invasive, simple to operate, and human function-based. Using the empirical data, BA estimation as well as various analyses, including correlation analysis and discriminant function analysis, was performed. As a result, it was confirmed by the empirical data that Klemera and Doubal's method with uncorrelated variables from principal component analysis produces relatively reliable and acceptable BA estimates. © 2009 Elsevier Ireland Ltd. All rights reserved.

  57. An Analysis of Base Pressure at Supersonic Velocities and Comparison with Experiment

    NASA Technical Reports Server (NTRS)

    Chapman, Dean R

    1951-01-01

    In the first part of the investigation an analysis is made of base pressure in an inviscid fluid, both for two-dimensional and axially symmetric flow. It is shown that for two-dimensional flow, and also for the flow over a body of revolution with a cylindrical sting attached to the base, there are an infinite number of possible solutions satisfying all necessary boundary conditions at any given free-stream Mach number. For the particular case of a body having no sting attached only one solution is possible in an inviscid flow, but it corresponds to zero base drag. Accordingly, it is concluded that a strictly inviscid-fluid theory cannot be satisfactory for practical applications. An approximate semi-empirical analysis for base pressure in a viscous fluid is developed in a second part of the investigation. The semi-empirical analysis is based partly on inviscid-flow calculations.

  58. Applicability of empirical data currently used in predicting solid propellant exhaust plumes

    NASA Technical Reports Server (NTRS)

    Tevepaugh, J. A.; Smith, S. D.; Penny, M. M.; Greenwood, T.; Roberts, B. B.

    1977-01-01

    Theoretical and experimental approaches to exhaust plume analysis are compared. A two-phase model is extended to include treatment of reacting gas chemistry, and thermodynamical modeling of the gaseous phase of the flow field is considered. The applicability of empirical data currently available to define particle drag coefficients, heat transfer coefficients, mean particle size, and particle size distributions is investigated. Experimental and analytical comparisons are presented for subscale solid rocket motors operating at three altitudes with attention to pitot total pressure and stagnation point heating rate measurements. The mathematical treatment input requirements are explained. The two-phase flow field solution adequately predicts gasdynamic properties in the inviscid portion of two-phase exhaust plumes. It is found that prediction of exhaust plume gas pressures requires an adequate model of flow field dynamics.

  59. Application-Specific Graph Sampling for Frequent Subgraph Mining and Community Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purohit, Sumit; Choudhury, Sutanay; Holder, Lawrence B.

    Graph mining is an important data analysis methodology, but it struggles as the input graph size increases. The scalability and usability challenges posed by such large graphs make it imperative to sample the input graph and reduce its size. The critical challenge in sampling is to identify the appropriate algorithm to ensure the resulting analysis does not suffer heavily from the data reduction. Predicting the expected performance degradation for a given graph and sampling algorithm is also useful. In this paper, we present different sampling approaches for graph mining applications such as Frequent Subgraph Mining (FSM) and Community Detection (CD). We explore graph metrics such as PageRank, Triangles, and Diversity to sample a graph and conclude that for heterogeneous graphs Triangles and Diversity perform better than degree-based metrics. We also present two new sampling variations for targeted graph mining applications. We present empirical results to show that knowledge of the target application, along with input graph properties, can be used to select the best sampling algorithm. We also conclude that performance degradation is an abrupt, rather than gradual, phenomenon as the sample size decreases. Finally, we present empirical results showing that the performance degradation follows a logistic function.

  20. Behavioral Economics and Empirical Public Policy

    ERIC Educational Resources Information Center

    Hursh, Steven R.; Roma, Peter G.

    2013-01-01

    The application of economics principles to the analysis of behavior has yielded novel insights on value and choice across contexts ranging from laboratory animal research to clinical populations to national trends of global impact. Recent innovations in demand curve methods provide a credible means of quantitatively comparing qualitatively…

  1. Prior robust empirical Bayes inference for large-scale data by conditioning on rank with application to microarray data

    PubMed Central

    Liao, J. G.; Mcmurry, Timothy; Berg, Arthur

    2014-01-01

    Empirical Bayes methods have been extensively used for microarray data analysis by modeling the large number of unknown parameters as random effects. Empirical Bayes allows borrowing information across genes and can automatically adjust for multiple testing and selection bias. However, the standard empirical Bayes model can perform poorly if the assumed working prior deviates from the true prior. This paper proposes a new rank-conditioned inference in which the shrinkage and confidence intervals are based on the distribution of the error conditioned on rank of the data. Our approach is in contrast to a Bayesian posterior, which conditions on the data themselves. The new method is almost as efficient as standard Bayesian methods when the working prior is close to the true prior, and it is much more robust when the working prior is not close. In addition, it allows a more accurate (but also more complex) non-parametric estimate of the prior to be easily incorporated, resulting in improved inference. The new method’s prior robustness is demonstrated via simulation experiments. Application to a breast cancer gene expression microarray dataset is presented. Our R package rank.Shrinkage provides a ready-to-use implementation of the proposed methodology. PMID:23934072
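
    For contrast with the paper's rank-conditioned approach, here is a minimal sketch of the standard empirical Bayes shrinkage it is compared against: normal observations with a normal working prior whose parameters are estimated from the data (all names and the method-of-moments step are illustrative, not the paper's method):

      import numpy as np

      def eb_normal_shrinkage(x, s2):
          # x_i ~ N(theta_i, s2), theta_i ~ N(mu, tau2); estimate (mu, tau2)
          # from the data and return the posterior means.
          mu = x.mean()
          tau2 = max(x.var(ddof=1) - s2, 0.0)  # method-of-moments prior variance
          w = tau2 / (tau2 + s2)               # shrinkage weight
          return mu + w * (x - mu)

      rng = np.random.default_rng(0)
      theta = rng.normal(0.0, 1.0, size=5000)        # true effects
      x = theta + rng.normal(0.0, 0.5, size=5000)    # noisy observations
      print(np.round(eb_normal_shrinkage(x, 0.25)[:5], 3))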

  2. Pre- and Post-equinox ROSINA production rates calculated using a realistic empirical coma model derived from AMPS-DSMC simulations of comet 67P/Churyumov-Gerasimenko

    NASA Astrophysics Data System (ADS)

    Hansen, Kenneth; Altwegg, Kathrin; Berthelier, Jean-Jacques; Bieler, Andre; Calmonte, Ursina; Combi, Michael; De Keyser, Johan; Fiethe, Björn; Fougere, Nicolas; Fuselier, Stephen; Gombosi, Tamas; Hässig, Myrtha; Huang, Zhenguang; Le Roy, Lena; Rubin, Martin; Tenishev, Valeriy; Toth, Gabor; Tzou, Chia-Yu

    2016-04-01

We have previously used results from the AMPS DSMC (Adaptive Mesh Particle Simulator Direct Simulation Monte Carlo) model to create an empirical model of the near-comet coma (<400 km) of comet 67P/Churyumov-Gerasimenko for its pre-equinox orbit. In this work we extend the empirical model to the post-equinox, post-perihelion time period. In addition, we extend the coma model to significantly larger distances from the comet (~100,000-1,000,000 km). The empirical model characterizes the neutral coma in a comet-centered, sun-fixed reference frame as a function of heliocentric distance, radial distance from the comet, local time and declination. Furthermore, we have generalized the model beyond application to 67P by replacing the heliocentric distance parameterizations and mapping them to production rates. With this change, the model becomes significantly more general and can be applied to any comet. The model is a significant improvement over simpler empirical models, such as the Haser model. For 67P, the DSMC results are, of course, a more accurate representation of the coma at any given time, but the advantage of a mean-state, empirical model is the ease and speed of use. One application of the empirical model is to de-trend the spacecraft motion from the ROSINA COPS and DFMS data (Rosetta Orbiter Spectrometer for Ion and Neutral Analysis, Comet Pressure Sensor, Double Focusing Mass Spectrometer). The ROSINA instrument measures the neutral coma density at a single point, and the measured value is influenced by the location of the spacecraft relative to the comet and the comet-sun line. Using the empirical coma model we can correct for the position of the spacecraft and compute a total production rate based on the single-point measurement. In this presentation we will show the coma production rate as a function of heliocentric distance both pre- and post-equinox and perihelion.
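
    As a point of reference, the simpler Haser-type comparator mentioned above can be written in a few lines; the production rate Q, outflow speed v and scale length lam below are illustrative values, not fitted 67P parameters:

      import numpy as np

      def haser_density(r_m, Q=1e27, v=700.0, lam=1e6):
          # n(r) = Q / (4 pi v r^2) * exp(-r / lam), with r in metres:
          # spherically symmetric outflow with photodissociation decay.
          return Q / (4.0 * np.pi * v * r_m**2) * np.exp(-r_m / lam)

      r = np.logspace(3, 8, 6)  # 1 km out to 100,000 km
      print(haser_density(r))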

  3. Circulation Clusters--An Empirical Approach to Decentralization of Academic Libraries.

    ERIC Educational Resources Information Center

    McGrath, William E.

    1986-01-01

    Discusses the issue of centralization or decentralization of academic library collections, and describes a statistical analysis of book circulation at the University of Southwestern Louisiana that yielded subject area clusters as a compromise solution to the problem. Applications of the cluster model for all types of library catalogs are…

  4. An Application of Mathematical Programming to Assess Managerial Efficiency in the Houston Independent School District

    DTIC Science & Technology

    1981-11-01

documenting attempts to define production functions in education. Levin and Hanushek characterize current methodologies as being deficient both conceptually... Evaluation and Policy Analysis, 2(1), 1980, 27-36. 16. Hanushek, E. A., "Conceptual and Empirical Issues in Estimation of Educational Production

  5. The Dynamics of Online User Behavior and IS Policy Implications

    ERIC Educational Resources Information Center

    Kim, Keehyung

    2016-01-01

    This dissertation, which comprises three independent essays, explores the dynamics of online user behavior and provides IS policy implications across three different applications. The first essay employs an econometric empirical analysis to examine the role of IT interventions on online users' gambling behavior, based on field data collected over…

  6. Theory, Method and Practice of Neuroscientific Findings in Science Education

    ERIC Educational Resources Information Center

    Liu, Chia-Ju; Chiang, Wen-Wei

    2014-01-01

    This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

  7. The Review of Internet Marketing Use in Latvia's Companies

    ERIC Educational Resources Information Center

    Sloka, Biruta; Kantane, Inara; Walczak, Renata

    2017-01-01

Development of new technologies and increasing competition require new solutions in business applications of internet marketing and advertising. The paper deals with issues related to advertising activities in internet marketing. It presents both theoretical findings and an empirical analysis of a survey conducted among Latvia's companies.…

  8. Empirical expression for DC magnetization curve of immobilized magnetic nanoparticles for use in biomedical applications

    NASA Astrophysics Data System (ADS)

    Elrefai, Ahmed L.; Sasayama, Teruyoshi; Yoshida, Takashi; Enpuku, Keiji

    2018-05-01

We studied the magnetization (M-H) curve of immobilized magnetic nanoparticles (MNPs) used for biomedical applications. First, we performed numerical simulations of the DC M-H curve over a wide range of MNP parameters. Based on the simulation results, we obtained an empirical expression for the DC M-H curve. The empirical expression was compared with the measured M-H curves of various MNP samples, and quantitative agreement was obtained between them. The basic parameters of an MNP can also be estimated from the comparison. The empirical expression is therefore useful for analyzing the M-H curve of immobilized MNPs for specific biomedical applications.
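
    The paper's own empirical expression is not reproduced in the abstract; as a hedged sketch of the general fitting workflow, the following fits a Langevin-type M-H curve, the usual superparamagnetic starting point, with scipy (Ms and m are illustrative fit parameters and the "measured" data are synthetic):

      import numpy as np
      from scipy.optimize import curve_fit

      MU0 = 4e-7 * np.pi
      KT = 1.381e-23 * 300.0  # thermal energy at 300 K

      def langevin_M(H, Ms, m):
          # M(H) = Ms * (coth(x) - 1/x), x = mu0 * m * H / kT
          x = MU0 * m * H / KT
          x = np.where(np.abs(x) < 1e-8, 1e-8, x)  # guard x = 0
          return Ms * (1.0 / np.tanh(x) - 1.0 / x)

      H = np.linspace(-4e4, 4e4, 81)  # applied field in A/m
      M_obs = langevin_M(H, 3e5, 2e-19) \
          + np.random.default_rng(1).normal(0, 1e3, H.size)
      (Ms_fit, m_fit), _ = curve_fit(langevin_M, H, M_obs, p0=(1e5, 1e-19))
      print(Ms_fit, m_fit)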

  9. Evaluation of theoretical and empirical water vapor sorption isotherm models for soils

    NASA Astrophysics Data System (ADS)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per; de Jonge, Lis W.

    2016-01-01

    The mathematical characterization of water vapor sorption isotherms of soils is crucial for modeling processes such as volatilization of pesticides and diffusive and convective water vapor transport. Although numerous physically based and empirical models were previously proposed to describe sorption isotherms of building materials, food, and other industrial products, knowledge about the applicability of these functions for soils is noticeably lacking. We present an evaluation of nine models for characterizing adsorption/desorption isotherms for a water activity range from 0.03 to 0.93 based on measured data of 207 soils with widely varying textures, organic carbon contents, and clay mineralogy. In addition, the potential applicability of the models for prediction of sorption isotherms from known clay content was investigated. While in general, all investigated models described measured adsorption and desorption isotherms reasonably well, distinct differences were observed between physical and empirical models and due to the different degrees of freedom of the model equations. There were also considerable differences in model performance for adsorption and desorption data. While regression analysis relating model parameters and clay content and subsequent model application for prediction of measured isotherms showed promise for the majority of investigated soils, for soils with distinct kaolinitic and smectitic clay mineralogy predicted isotherms did not closely match the measurements.
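
    As a sketch of how one commonly used isotherm model can be fitted over the stated water-activity range, the following uses the GAB equation (the paper evaluates nine models, not necessarily this one; the data and starting values are synthetic):

      import numpy as np
      from scipy.optimize import curve_fit

      def gab(aw, wm, C, K):
          # GAB isotherm: w = wm*C*K*aw / ((1 - K*aw)(1 - K*aw + C*K*aw))
          return wm * C * K * aw / ((1 - K * aw) * (1 - K * aw + C * K * aw))

      aw = np.linspace(0.03, 0.93, 10)  # water activity range from the study
      w_obs = gab(aw, 0.05, 10.0, 0.8) \
          + np.random.default_rng(2).normal(0, 1e-3, aw.size)
      (wm, C, K), _ = curve_fit(gab, aw, w_obs, p0=(0.04, 5.0, 0.7), maxfev=10000)
      print(wm, C, K)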

  10. Pi2 detection using Empirical Mode Decomposition (EMD)

    NASA Astrophysics Data System (ADS)

    Mieth, Johannes Z. D.; Frühauff, Dennis; Glassmeier, Karl-Heinz

    2017-04-01

Empirical Mode Decomposition has been used as an alternative method to wavelet transformation to identify onset times of Pi2 pulsations in data sets of the Scandinavian Magnetometer Array (SMA). Pi2 pulsations are magnetohydrodynamic waves occurring during magnetospheric substorms. Pi2 are almost always observed at substorm onset at mid to low latitudes on Earth's nightside. They are fed by the magnetic energy released by dipolarization processes. Their periods lie between 40 and 150 seconds. Usually, Pi2 are detected using wavelet transformation. Here, Empirical Mode Decomposition (EMD) is presented as an alternative to the traditional procedure. EMD is a young signal decomposition method designed for nonlinear and non-stationary time series. It provides an adaptive, data-driven, and complete decomposition of a time series into slow and fast oscillations. An optimized version using Monte-Carlo-type noise assistance is used here. By displaying the results in time-frequency space, a characteristic frequency modulation is observed. This frequency modulation can be correlated with the onset of Pi2 pulsations. A basic algorithm to find the onset is presented. Finally, the results are compared to classical wavelet-based analysis. The use of different SMA stations furthermore allows a spatial analysis of Pi2 onset times. EMD mostly finds application in the fields of engineering and medicine; this work demonstrates the applicability of the method to geomagnetic time series.
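
    A minimal decomposition step of the kind described above can be sketched with the PyEMD package (installable as EMD-signal); the toy signal, sampling rate and onset below are illustrative, and the SMA onset-detection algorithm itself is not reproduced:

      import numpy as np
      from PyEMD import EMD

      t = np.linspace(0, 600, 6000)                    # 600 s at 10 Hz
      pi2 = np.sin(2 * np.pi * t / 80.0) * (t > 300)   # toy 80 s pulsation after onset
      drift = 0.5 * np.sin(2 * np.pi * t / 600.0)      # slow background variation
      noise = 0.1 * np.random.default_rng(3).normal(size=t.size)
      signal = pi2 + drift + noise

      imfs = EMD().emd(signal)  # intrinsic mode functions, fastest first
      print(imfs.shape)         # (n_imfs, n_samples)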

  11. On Allometry Relations

    NASA Astrophysics Data System (ADS)

    West, Damien; West, Bruce J.

    2012-07-01

There are a substantial number of empirical relations that began with the identification of a pattern in data; were shown to have a terse power-law description; were interpreted using existing theory; reached the level of "law" and were given a name; only to subsequently fade away when it proved impossible to connect the "law" with a larger body of theory and/or data. Various forms of allometry relations (ARs) have followed this path. The ARs in biology are nearly two hundred years old, and those in ecology, geophysics, physiology and other areas of investigation are not that much younger. In general, if X is a measure of the size of a complex host network and Y is a property of a complex subnetwork embedded within the host network, a theoretical AR exists between the two when Y = aX^b. We emphasize that the reductionistic models of AR interpret X and Y as dynamic variables, albeit the ARs themselves are explicitly time independent even though in some cases the parameter values change over time. On the other hand, the phenomenological models of AR are based on the statistical analysis of data and interpret X and Y as averages, yielding the empirical AR ⟨Y⟩ = a⟨X⟩^b. Modern explanations of AR begin with the application of fractal geometry and fractal statistics to scaling phenomena. The detailed application of fractal geometry to the explanation of theoretical ARs in living networks is slightly more than a decade old and, although well received, it has not been universally accepted. An alternate perspective is given by the empirical AR that is derived using linear regression analysis of fluctuating data sets. We emphasize that the theoretical and empirical ARs are not the same and review theories "explaining" AR from both the reductionist and statistical fractal perspectives. The probability calculus is used to systematically incorporate both views into a single modeling strategy. We conclude that the empirical AR is entailed by the scaling behavior of the probability density, which is derived using the probability calculus.
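
    The central distinction can be restated compactly; the display below is added for clarity and simply rewrites the two relations from the abstract in LaTeX:

      \[ Y = aX^{b} \qquad \text{(theoretical AR; } X, Y \text{ dynamic variables)} \]
      \[ \langle Y \rangle = a\langle X \rangle^{b} \qquad \text{(empirical AR; } X, Y \text{ averages)} \]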

  12. Extreme learning machine for ranking: generalization analysis and applications.

    PubMed

    Chen, Hong; Peng, Jiangtao; Zhou, Yicong; Li, Luoqing; Pan, Zhibin

    2014-05-01

    The extreme learning machine (ELM) has attracted increasing attention recently with its successful applications in classification and regression. In this paper, we investigate the generalization performance of ELM-based ranking. A new regularized ranking algorithm is proposed based on the combinations of activation functions in ELM. The generalization analysis is established for the ELM-based ranking (ELMRank) in terms of the covering numbers of hypothesis space. Empirical results on the benchmark datasets show the competitive performance of the ELMRank over the state-of-the-art ranking methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
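
    The ELM idea underlying ELMRank, a random hidden layer with output weights solved in closed form, can be sketched in a few lines of numpy (this is a plain regression version under illustrative settings, not the paper's ranking loss):

      import numpy as np

      rng = np.random.default_rng(4)
      X = rng.normal(size=(200, 5))
      y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

      L = 50                            # hidden nodes
      W = rng.normal(size=(5, L))       # random input weights (never trained)
      b = rng.normal(size=L)            # random biases (never trained)
      H = np.tanh(X @ W + b)            # hidden-layer activations

      lam = 1e-3                        # ridge regularization
      beta = np.linalg.solve(H.T @ H + lam * np.eye(L), H.T @ y)
      print(np.mean((H @ beta - y) ** 2))  # training MSE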

  13. Calculation of K-shell fluorescence yields for low-Z elements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Nekkab, M.; Kahoul, A.

Analytical methods based on X-ray fluorescence are advantageous for practical applications in a variety of fields including atomic physics, X-ray fluorescence surface chemical analysis and medical research, so accurate fluorescence yields (ω_K) are required for these applications. In this contribution we report new parameters for the calculation of K-shell fluorescence yields (ω_K) of elements in the range 11≤Z≤30. The experimental data are interpolated using the well-known analytical function (ω_K/(1−ω_K))^(1/q) (where q = 3, 3.5 and 4) versus Z to deduce the empirical K-shell fluorescence yields. A comparison is made between the results of the procedures followed here and theoretical and other semi-empirical fluorescence yield values. Reasonable agreement was typically obtained between our results and other works.

  14. Calculation of K-shell fluorescence yields for low-Z elements

    NASA Astrophysics Data System (ADS)

    Nekkab, M.; Kahoul, A.; Deghfel, B.; Aylikci, N. Küp; Aylikçi, V.

    2015-03-01

Analytical methods based on X-ray fluorescence are advantageous for practical applications in a variety of fields including atomic physics, X-ray fluorescence surface chemical analysis and medical research, so accurate fluorescence yields (ω_K) are required for these applications. In this contribution we report new parameters for the calculation of K-shell fluorescence yields (ω_K) of elements in the range 11≤Z≤30. The experimental data are interpolated using the well-known analytical function (ω_K/(1−ω_K))^(1/q) (where q = 3, 3.5 and 4) versus Z to deduce the empirical K-shell fluorescence yields. A comparison is made between the results of the procedures followed here and theoretical and other semi-empirical fluorescence yield values. Reasonable agreement was typically obtained between our results and other works.
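
    The fitting procedure described in the abstract can be sketched as follows; the yield values below are placeholders, not the paper's experimental data, and the polynomial order is an assumption:

      import numpy as np

      Z = np.arange(11, 31)                           # elements 11 <= Z <= 30
      w_meas = 1.0 / (1.0 + np.exp(-(Z - 22) / 4.0))  # placeholder yields in (0, 1)

      q = 3.0
      ytrans = (w_meas / (1.0 - w_meas)) ** (1.0 / q)  # transform the yields
      coef = np.polyfit(Z, ytrans, deg=2)              # low-order fit versus Z

      # Invert the transform to recover semi-empirical omega_K(Z).
      yfit = np.polyval(coef, Z)
      w_fit = yfit**q / (1.0 + yfit**q)
      print(np.max(np.abs(w_fit - w_meas)))  # worst-case fit error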

  15. Experimental investigation on secondary combustion characteristics of airbreathing rockets

    NASA Astrophysics Data System (ADS)

    Mano, Takeshi; Eguchi, Akihiro; Shinohara, Suetsugu; Etou, Takao; Kaneko, Yutaka; Yamamoto, Youichi; Nakagawa, Ichirou

    Empirical correlations of the secondary combustion efficiency of the airbreathing rocket were derived. From the results of a series of experiments employing a connected pipe facility, the combustion efficiency was related to dominant parameters. The feasibility of the performance prediction by one-dimensional analysis was also discussed. The analysis was found to be applicable to the flow processes in the secondary combustor, which include two-stream mixing and combustion.

  16. Concept analysis: lack of anonymity.

    PubMed

    Swan, Marilyn A; Hobbs, Barbara B

    2017-05-01

To re-examine and expand understanding of the concept 'lack of anonymity' as a component of rural nursing theory. Early healthcare literature reports lack of anonymity as part of social and working environments, particularly rural nursing. Rural nursing theory included the first published concept analysis on lack of anonymity but lacked empirical referents. Workforce, societal and rural healthcare changes support an updated analysis. To further understand lack of anonymity, its present-day use and applicability to diverse environments, research from multiple disciplines was reviewed. Concept analysis. A literature search using eight terms in eleven databases was conducted on literature published between 2008 and 2013. Walker and Avant's concept analysis methodology guided the analysis. The previous concept analysis is supported in part by current literature. The defining attributes, 'identifiable', 'establishing boundaries for public and private self' and 'interconnectedness' in a community, were updated. Updated antecedents include: (i) environmental context; (ii) opportunities to become visible; (iii) developing relationships and (iv) unconscious or limited awareness of public or personal privacy. Consequences are: (i) familiarity; (ii) visibility; (iii) awareness of privacy and (iv) managing or balancing lack of anonymity. Cases were constructed and empirical referents identified. The concept of lack of anonymity was updated; portions of the original definition remain unchanged. Empirical referents reveal the defining attributes in daily life and may guide future research on the effect of lack of anonymity on nursing practice. This analysis advances the conceptual understanding of rural nursing theory. © 2016 John Wiley & Sons Ltd.

  17. The application of the electrodynamic separator in minerals beneficiation

    NASA Astrophysics Data System (ADS)

    Skowron, M.; Syrek, P.; Surowiak, A.

    2017-05-01

The aim of the presented paper is to elaborate a methodology for upgrading natural minerals, illustrated with a chalcocite and bornite sample. The results were obtained by means of a laboratory drum separator. This device operates according to the properties of the materials, in this case electrical conductivity. The study contains an analysis of the forces occurring inside the electrodynamic separator chamber that act on particles of various electrical properties. Both the potential and the electric field strength distributions were calculated for a set of separator setpoints. The theoretical analysis informed the choice of separator parameters and hence also shaped the empirical results. Next, the authors conducted empirical research on chalcocite and bornite beneficiation by means of electrodynamic separation. The results of this process are shown graphically in the form of upgrading curves of chalcocite with respect to elementary copper and lead.

  18. Outbreak patterns of the novel avian influenza (H7N9)

    NASA Astrophysics Data System (ADS)

    Pan, Ya-Nan; Lou, Jing-Jing; Han, Xiao-Pu

    2014-05-01

The attack of the novel avian influenza (H7N9) in East China caused a serious health crisis and public panic. In this paper, we empirically analyze the onset patterns of human cases of the novel avian influenza and observe several spatial and temporal properties that are similar to other infectious diseases. More specifically, using empirical analysis and modeling studies, we find that the spatio-temporal network that connects the cities with human cases along the order of outbreak timing exhibits a two-regime power-law edge-length distribution, indicating that several islands with higher and heterogeneous risk are scattered across East China. The proposed method is applicable to the analysis of the spreading situation in the early stage of a disease outbreak using a quite limited dataset.

  19. Empirical Bayes estimation of proportions with application to cowbird parasitism rates

    USGS Publications Warehouse

    Link, W.A.; Hahn, D.C.

    1996-01-01

    Bayesian models provide a structure for studying collections of parameters such as are considered in the investigation of communities, ecosystems, and landscapes. This structure allows for improved estimation of individual parameters, by considering them in the context of a group of related parameters. Individual estimates are differentially adjusted toward an overall mean, with the magnitude of their adjustment based on their precision. Consequently, Bayesian estimation allows for a more credible identification of extreme values in a collection of estimates. Bayesian models regard individual parameters as values sampled from a specified probability distribution, called a prior. The requirement that the prior be known is often regarded as an unattractive feature of Bayesian analysis and may be the reason why Bayesian analyses are not frequently applied in ecological studies. Empirical Bayes methods provide an alternative approach that incorporates the structural advantages of Bayesian models while requiring a less stringent specification of prior knowledge. Rather than requiring that the prior distribution be known, empirical Bayes methods require only that it be in a certain family of distributions, indexed by hyperparameters that can be estimated from the available data. This structure is of interest per se, in addition to its value in allowing for improved estimation of individual parameters; for example, hypotheses regarding the existence of distinct subgroups in a collection of parameters can be considered under the empirical Bayes framework by allowing the hyperparameters to vary among subgroups. Though empirical Bayes methods have been applied in a variety of contexts, they have received little attention in the ecological literature. We describe the empirical Bayes approach in application to estimation of proportions, using data obtained in a community-wide study of cowbird parasitism rates for illustration. Since observed proportions based on small sample sizes are heavily adjusted toward the mean, extreme values among empirical Bayes estimates identify those species for which there is the greatest evidence of extreme parasitism rates. Applying a subgroup analysis to our data on cowbird parasitism rates, we conclude that parasitism rates for Neotropical Migrants as a group are no greater than those of Resident/Short-distance Migrant species in this forest community. Our data and analyses demonstrate that the parasitism rates for certain Neotropical Migrant species are remarkably low (Wood Thrush and Rose-breasted Grosbeak) while those for others are remarkably high (Ovenbird and Red-eyed Vireo).
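
    A minimal sketch of this approach for proportions: parasitism counts with a Beta prior whose hyperparameters are estimated from the data by a crude method of moments (the counts are illustrative, and the moment step ignores the varying sample sizes that a full analysis would weight):

      import numpy as np

      x = np.array([1, 9, 3, 0, 12])     # parasitized nests per species
      n = np.array([10, 15, 30, 4, 20])  # nests sampled per species

      p = x / n
      m, v = p.mean(), p.var(ddof=1)
      # Method-of-moments Beta(alpha, beta) hyperparameters (needs v < m(1-m)).
      common = m * (1 - m) / v - 1
      alpha, beta = m * common, (1 - m) * common

      # Shrunken estimates: small samples are pulled hardest toward the mean.
      p_eb = (x + alpha) / (n + alpha + beta)
      print(np.round(p_eb, 3))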

  20. A Procedure for Structural Weight Estimation of Single Stage to Orbit Launch Vehicles (Interim User's Manual)

    NASA Technical Reports Server (NTRS)

    Martinovic, Zoran N.; Cerro, Jeffrey A.

    2002-01-01

    This is an interim user's manual for current procedures used in the Vehicle Analysis Branch at NASA Langley Research Center, Hampton, Virginia, for launch vehicle structural subsystem weight estimation based on finite element modeling and structural analysis. The process is intended to complement traditional methods of conceptual and early preliminary structural design such as the application of empirical weight estimation or application of classical engineering design equations and criteria on one dimensional "line" models. Functions of two commercially available software codes are coupled together. Vehicle modeling and analysis are done using SDRC/I-DEAS, and structural sizing is performed with the Collier Research Corp. HyperSizer program.

  1. An Analysis of Empathy as Leadership Attributes and Action in Educational Administrators and Teacher Leaders

    ERIC Educational Resources Information Center

    Bruckner, Jill K.

    2017-01-01

    The study of empathy, as both a concept and a construct, spans disciplines and decades. As such, its relevance to relationships, empirical definition, significance to leadership, motivational factors, and position in emotional intelligence comprise a wide range of perceptions, applications, and examination across fields ranging from psychology to…

  2. Using Generic Inductive Approach in Qualitative Educational Research: A Case Study Analysis

    ERIC Educational Resources Information Center

    Liu, Lisha

    2016-01-01

    Qualitative research strategy has been widely adopted by educational researchers in order to improve the quality of their empirical studies. This paper aims to introduce a generic inductive approach, pragmatic and flexible in qualitative theoretical support, by describing its application in a study of non-English major undergraduates' English…

  3. Re/Theorising Gender: Female Masculinity and Male Femininity in the Classroom?

    ERIC Educational Resources Information Center

    Francis, Becky

    2010-01-01

    Recent gender theorising has been enlivened by post-structuralist accounts of gender as "disembodied"; the reading of gender performances as distinct from sexed bodies. However, there has been little application of such theoretical positions to empirical analysis in gender and education. This article employs two such positions--that of…

  4. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    NASA Astrophysics Data System (ADS)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

Normal mixture distributions models have been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using a two-component univariate normal mixture distributions model. First, we present the application of the normal mixture distributions model in empirical finance, where we fit our real data. Second, we present its application in risk analysis, where we apply the normal mixture distributions model to evaluate value at risk (VaR) and conditional value at risk (CVaR), with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model can fit the data well and performs better in estimating value at risk (VaR) and conditional value at risk (CVaR), as it can capture the stylized facts of non-normality and leptokurtosis in the returns distribution.
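
    The workflow described above can be sketched with scikit-learn; the return series below is synthetic, and VaR/CVaR are read off simulated draws from the fitted mixture rather than computed in closed form:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(5)
      returns = np.concatenate([rng.normal(0.01, 0.03, 900),
                                rng.normal(-0.04, 0.08, 100)])  # calm + turbulent

      gm = GaussianMixture(n_components=2, random_state=0)
      gm.fit(returns.reshape(-1, 1))

      draws = gm.sample(200000)[0].ravel()
      var95 = -np.quantile(draws, 0.05)        # 95% value at risk (as a loss)
      cvar95 = -draws[draws <= -var95].mean()  # expected loss beyond the VaR
      print(var95, cvar95)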

  5. Empirical research in medical ethics: how conceptual accounts on normative-empirical collaboration may improve research practice.

    PubMed

    Salloch, Sabine; Schildmann, Jan; Vollmann, Jochen

    2012-04-13

    The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such linkage between the empirical research and the normative analysis. In the first part of this paper, we will outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) The complete lack of normative analysis, and (2) cryptonormativity and a missing account with regard to the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration will be presented and how these concepts may contribute to improve the linkage between normative and empirical aspects of empirical research in medical ethics will be demonstrated. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis.

  6. Empirical research in medical ethics: How conceptual accounts on normative-empirical collaboration may improve research practice

    PubMed Central

    2012-01-01

    Background The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. Discussion A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such linkage between the empirical research and the normative analysis. In the first part of this paper, we will outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) The complete lack of normative analysis, and (2) cryptonormativity and a missing account with regard to the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration will be presented and how these concepts may contribute to improve the linkage between normative and empirical aspects of empirical research in medical ethics will be demonstrated. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. Summary High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis. PMID:22500496

  7. Empirical Corrections to Nutation Amplitudes and Precession Computed from a Global VLBI Solution

    NASA Astrophysics Data System (ADS)

    Schuh, H.; Ferrandiz, J. M.; Belda-Palazón, S.; Heinkelmann, R.; Karbon, M.; Nilsson, T.

    2017-12-01

The IAU2000A nutation and IAU2006 precession models were adopted to provide accurate estimations and predictions of the Celestial Intermediate Pole (CIP). However, they are not fully accurate, and VLBI (Very Long Baseline Interferometry) observations show that the CIP deviates from the position resulting from the application of the IAU2006/2000A model. Currently, those deviations or offsets of the CIP (Celestial Pole Offsets - CPO) can only be obtained by the VLBI technique. An accuracy of the order of 0.1 milliseconds of arc (mas) allows comparison of the observed nutation with theoretical predictions for a rigid Earth and constrains geophysical parameters describing the Earth's interior. In this study, we empirically evaluate the consistency, systematics and deviations of the IAU 2006/2000A precession-nutation model using several CPO time series derived from the global analysis of VLBI sessions. The final objective is the reassessment of the precession offset and rate, and of the amplitudes of the principal terms of nutation, in an attempt to empirically improve the conventional values derived from the precession/nutation theories. The statistical analysis of the residuals after re-fitting the main nutation terms demonstrates that our empirical corrections attain an error reduction of almost 15 microarcseconds.

  8. Implementation and reporting of causal mediation analysis in 2015: a systematic review in epidemiological studies.

    PubMed

    Liu, Shao-Hsien; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L

    2016-07-20

    Causal mediation analysis is often used to understand the impact of variables along the causal pathway of an occurrence relation. How well studies apply and report the elements of causal mediation analysis remains unknown. We systematically reviewed epidemiological studies published in 2015 that employed causal mediation analysis to estimate direct and indirect effects of observed associations between an exposure on an outcome. We identified potential epidemiological studies through conducting a citation search within Web of Science and a keyword search within PubMed. Two reviewers independently screened studies for eligibility. For eligible studies, one reviewer performed data extraction, and a senior epidemiologist confirmed the extracted information. Empirical application and methodological details of the technique were extracted and summarized. Thirteen studies were eligible for data extraction. While the majority of studies reported and identified the effects of measures, most studies lacked sufficient details on the extent to which identifiability assumptions were satisfied. Although most studies addressed issues of unmeasured confounders either from empirical approaches or sensitivity analyses, the majority did not examine the potential bias arising from the measurement error of the mediator. Some studies allowed for exposure-mediator interaction and only a few presented results from models both with and without interactions. Power calculations were scarce. Reporting of causal mediation analysis is varied and suboptimal. Given that the application of causal mediation analysis will likely continue to increase, developing standards of reporting of causal mediation analysis in epidemiological research would be prudent.

  9. Analytical Investigation and Improvement of Performance of a Proton Exchange Membrane (Pem) Fuel Cell in Mobile Applications

    NASA Astrophysics Data System (ADS)

    Khazaee, I.

    2015-05-01

In this study, the performance of a proton exchange membrane fuel cell in mobile applications is investigated analytically. At present, the main use and advantages of fuel cells impact particularly strongly on mobile applications such as vehicles, mobile computers and mobile telephones. External parameters such as the cell temperature (T_cell), the operating pressure of the gases (P) and the air stoichiometry (λ_air) affect the performance and voltage losses of the PEM fuel cell. Because many theoretical, empirical and semi-empirical models of the PEM fuel cell exist, it is necessary to compare their accuracy. Theoretical models, obtained from thermodynamic and electrochemical approaches, are very exact but complex, so it is easier to use empirical and semi-empirical models to forecast fuel cell system performance in many applications such as mobile applications. The main purpose of this study is to obtain the semi-empirical relation of a PEM fuel cell with the least voltage losses. The results are compared with existing experimental results in the literature, and good agreement is seen.
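
    One widely used semi-empirical polarization form of the kind the study refers to is V(i) = E0 - b*ln(i/i0) - R*i - m*exp(n*i); the sketch below evaluates it with illustrative parameter values, not values fitted to this study:

      import numpy as np

      def cell_voltage(i, E0=1.0, b=0.05, i0=1e-4, R=0.2, m=2e-5, n=8.0):
          # i: current density in A/cm^2; the three loss terms model
          # activation, ohmic, and mass-transport losses respectively.
          return E0 - b * np.log(i / i0) - R * i - m * np.exp(n * i)

      i = np.linspace(0.01, 1.2, 5)
      print(np.round(cell_voltage(i), 3))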

  10. Application of empirical and mechanistic-empirical pavement design procedures to Mn/ROAD concrete pavement test sections

    DOT National Transportation Integrated Search

    1997-05-01

    Current pavement design procedures are based principally on empirical approaches. The current trend toward developing more mechanistic-empirical type pavement design methods led Minnesota to develop the Minnesota Road Research Project (Mn/ROAD), a lo...

  11. Criminal psychological profiling of serial arson crimes.

    PubMed

    Kocsis, Richard N; Cooksey, Ray W

    2002-12-01

    The practice of criminal psychological profiling is frequently cited as being applicable to serial arson crimes. Despite this claim, there does not appear to be any empirical research that examines serial arson offence behaviors in the context of profiling. This study seeks to develop an empirical model of serial arsonist behaviors that can be systematically associated with probable offender characteristics. Analysis has produced a model of offence behaviors that identify four discrete behavior patterns, all of which share a constellation of common nondiscriminatory behaviors. The inherent behavioral themes of each of these patterns are explored with discussion of their broader implications for our understanding of serial arson and directions for future research.

  12. Concentration analysis of breast tissue phantoms with terahertz spectroscopy

    PubMed Central

    Truong, Bao C. Q.; Fitzgerald, Anthony J.; Fan, Shuting; Wallace, Vincent P.

    2018-01-01

    Terahertz imaging has been previously shown to be capable of distinguishing normal breast tissue from its cancerous form, indicating its applicability to breast conserving surgery. The heterogeneous composition of breast tissue is among the main challenges to progressing this potential research towards a practical application. In this paper, two concentration analysis methods are proposed for analyzing phantoms mimicking breast tissue. The dielectric properties and the double Debye parameters were used to determine the phantom composition. The first method is wholly based on the conventional effective medium theory while the second one combines this theoretical model with empirical polynomial models. Through assessing the accuracy of these methods, their potential for application to quantifying breast tissue pathology was confirmed. PMID:29541525

  13. Constructing the principles: Method and metaphysics in the progress of theoretical physics

    NASA Astrophysics Data System (ADS)

    Glass, Lawrence C.

This thesis presents a new framework for the philosophy of physics focused on methodological differences found in the practice of modern theoretical physics. The starting point for this investigation is the longstanding debate over scientific realism. Some philosophers have argued that it is the aim of science to produce an accurate description of the world, including explanations for observable phenomena. These scientific realists hold that our best-confirmed theories are approximately true and that the entities they propose actually populate the world, whether or not they have been observed. Others have argued that science achieves only frameworks for the prediction and manipulation of observable phenomena. These anti-realists argue that truth is a misleading concept when applied to empirical knowledge. Instead, focus should be on the empirical adequacy of scientific theories. This thesis argues that the fundamental distinction at issue, a division between true scientific theories and ones which are empirically adequate, is best explored in terms of methodological differences. In analogy with the realism debate, there are at least two methodological strategies. Rather than focusing on scientific theories as wholes, this thesis takes as units of analysis physical principles, which are systematic empirical generalizations. The first possible strategy, the conservative, takes the assumption that the empirical adequacy of a theory in one domain serves as good evidence for such adequacy in other domains. This then motivates the application of the principle to new domains. The second strategy, the innovative, assumes that empirical adequacy in one domain does not justify the expectation of adequacy in other domains. New principles are offered as explanations in the new domain. The final part of the thesis is the application of this framework to two examples. In the first, Lorentz's use of the aether is reconstructed as an application of the conservative strategy with respect to the principles of Galilean relativity. In the second, an application of the conservative strategy is compared with TeVeS as an application of the innovative one.

  14. Judgement heuristics and bias in evidence interpretation: The effects of computer generated exhibits.

    PubMed

    Norris, Gareth

    2015-01-01

The increasing use of multi-media applications, trial presentation software and computer generated exhibits (CGE) has raised questions as to the potential impact of the use of presentation technology on juror decision making. Much of the commentary on the manner in which CGE exerts legal influence is anecdotal; empirical examinations too are often devoid of established theoretical rationalisations. This paper will examine a range of established judgement heuristics (for example, the attribution error, representativeness, simulation) in order to establish their appropriate application for comprehending legal decisions. Analysis of both past cases and empirical studies will highlight the potential for heuristics and biases to be restricted or confounded by the use of CGE. The paper will conclude with some wider discussion on admissibility, access to justice, and emerging issues in the use of multi-media in court. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Determination of a Limited Scope Network's Lightning Detection Efficiency

    NASA Technical Reports Server (NTRS)

    Rompala, John T.; Blakeslee, R.

    2008-01-01

This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground-based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD, together with information regarding site signal detection thresholds, the type of solution algorithm used, and range attenuation, to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, this modeling technique gives an estimate of the number, strength, and distribution of events going undetected. This approach leads to a variety of event density contour maps. This application is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented. A new method for producing an analytical representation of the empirical PCD is also introduced.
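
    The probability formulation described above lends itself to a small Monte Carlo sketch; the lognormal PCD, the 1/r attenuation, the four-site solution criterion and all numbers below are assumptions for illustration, not the paper's model:

      import numpy as np

      rng = np.random.default_rng(6)
      sites = np.array([[0, 0], [200, 0], [0, 200], [200, 200], [100, 300]])  # km
      thresh = 5.0  # site trigger threshold, arbitrary signal units

      def detection_efficiency(flash_xy, n=20000):
          # Draw flash peak currents from the assumed PCD, attenuate with
          # range to each site, and count flashes seen by >= 4 sites.
          peak = rng.lognormal(mean=3.0, sigma=0.6, size=n)  # kA
          r = np.linalg.norm(sites - flash_xy, axis=1)       # km to each site
          signal = peak[:, None] * 100.0 / np.maximum(r, 1.0)[None, :]
          detected = (signal > thresh).sum(axis=1)
          return (detected >= 4).mean()  # fraction yielding a solution

      print(detection_efficiency(np.array([100.0, 100.0])))  # inside the array
      print(detection_efficiency(np.array([600.0, 600.0])))  # far outside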

  16. Cross validation issues in multiobjective clustering

    PubMed Central

    Brusco, Michael J.; Steinley, Douglas

    2018-01-01

The implementation of multiobjective programming methods in combinatorial data analysis is an emergent area of study with a variety of pragmatic applications in the behavioural sciences. Most notably, multiobjective programming provides a tool for analysts to model trade-offs among competing criteria in clustering, seriation, and unidimensional scaling tasks. Although multiobjective programming has considerable promise, the technique can produce numerically appealing results that lack empirical validity. With this issue in mind, the purpose of this paper is to briefly review viable areas of application for multiobjective programming and, more importantly, to outline the importance of cross-validation when using this method in cluster analysis.

  17. Damage Tolerance of Composite Laminates from an Empirical Perspective

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2009-01-01

    Damage tolerance consists of analysis and experimentation working together. Impact damage is usually of most concern for laminated composites. Once impacted, the residual compression strength is usually of most interest. Other properties may be of more interest than compression (application dependent). A damage tolerance program is application specific (not everyone is building aircraft). The "Building Block Approach" is suggested for damage tolerance. Advantage can be taken of the excellent fatigue resistance of damaged laminates to save time and costs.

  18. Time-frequency analysis : mathematical analysis of the empirical mode decomposition.

    DOT National Transportation Integrated Search

    2009-01-01

Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...

  19. Fertility Determinants: A Theory, Evidence, and an Application to Policy Evaluation.

    ERIC Educational Resources Information Center

    Schultz, T. Paul

    This report surveys the first generation of theoretical and empirical research on the determinants of parent "demand" for children. A large fraction of this literature was first published as Rand reports and papers. The pragmatic question discussed here is the strengths and shortcomings of the state of the art in economic analysis of…

  20. L1 Use in L2 Vocabulary Learning: Facilitator or Barrier

    ERIC Educational Resources Information Center

    Liu, Jing

    2008-01-01

    Based on empirical research and qualitative analysis, this paper aims to explore the effects of L1 use on L2 vocabulary teaching. The results show that, during L2 vocabulary teaching process, the proper application of L1 can effectively facilitate the memorization of new words, and the bilingual method (both English explanation and Chinese…

  1. Community choices and housing demands: a spatial analysis of the southern Appalachian highlands

    Treesearch

    Seong-Hoon Cho; David H. Newman; David N. Wear

    2005-01-01

    This paper examines housing demand using an integrated approach that combines residential decisions about choices of community in the Southern Appalachian region with the application of a Geographic Information System (GIS). The empirical model infers a distinctive heterogeneity in the characteristics of community choices. The results also indicate that socio-economic...

  2. Integrating biological knowledge into variable selection: an empirical Bayes approach with an application in cancer biology

    PubMed Central

    2012-01-01

    Background An important question in the analysis of biochemical data is that of identifying subsets of molecular variables that may jointly influence a biological response. Statistical variable selection methods have been widely used for this purpose. In many settings, it may be important to incorporate ancillary biological information concerning the variables of interest. Pathway and network maps are one example of a source of such information. However, although ancillary information is increasingly available, it is not always clear how it should be used nor how it should be weighted in relation to primary data. Results We put forward an approach in which biological knowledge is incorporated using informative prior distributions over variable subsets, with prior information selected and weighted in an automated, objective manner using an empirical Bayes formulation. We employ continuous, linear models with interaction terms and exploit biochemically-motivated sparsity constraints to permit exact inference. We show an example of priors for pathway- and network-based information and illustrate our proposed method on both synthetic response data and by an application to cancer drug response data. Comparisons are also made to alternative Bayesian and frequentist penalised-likelihood methods for incorporating network-based information. Conclusions The empirical Bayes method proposed here can aid prior elicitation for Bayesian variable selection studies and help to guard against mis-specification of priors. Empirical Bayes, together with the proposed pathway-based priors, results in an approach with a competitive variable selection performance. In addition, the overall procedure is fast, deterministic, and has very few user-set parameters, yet is capable of capturing interplay between molecular players. The approach presented is general and readily applicable in any setting with multiple sources of biological prior knowledge. PMID:22578440

  3. [Methodological deficits in neuroethics: do we need theoretical neuroethics?].

    PubMed

    Northoff, G

    2013-10-01

    Current neuroethics can be characterized best as empirical neuroethics: it is strongly empirically oriented in that it not only includes empirical findings from neuroscience but also searches for applications within neuroscience. This, however, neglects the social and political contexts which could be subject to a future social neuroethics. In addition, methodological issues need to be considered as in theoretical neuroethics. The focus in this article is on two such methodological issues: (1) the analysis of the different levels and their inferences among each other which is exemplified by the inference of consciousness from the otherwise purely neuronal data in patients with vegetative state and (2) the problem of linking descriptive and normative concepts in a non-reductive and non-inferential way for which I suggest the mutual contextualization between both concepts. This results in a methodological strategy that can be described as contextual fact-norm iterativity.

  4. A protocol for the creation of useful geometric shape metrics illustrated with a newly derived geometric measure of leaf circularity.

    PubMed

    Krieger, Jonathan D

    2014-08-01

    I present a protocol for creating geometric leaf shape metrics to facilitate widespread application of geometric morphometric methods to leaf shape measurement. • To quantify circularity, I created a novel shape metric in the form of the vector between a circle and a line, termed geometric circularity. Using leaves from 17 fern taxa, I performed a coordinate-point eigenshape analysis to empirically identify patterns of shape covariation. I then compared the geometric circularity metric to the empirically derived shape space and the standard metric, circularity shape factor. • The geometric circularity metric was consistent with empirical patterns of shape covariation and appeared more biologically meaningful than the standard approach, the circularity shape factor. The protocol described here has the potential to make geometric morphometrics more accessible to plant biologists by generalizing the approach to developing synthetic shape metrics based on classic, qualitative shape descriptors.
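
    For comparison, the standard metric named above, the circularity shape factor 4πA/P², can be computed directly from outline coordinates (the paper's geometric circularity metric is a different, eigenshape-based construction and is not reproduced here):

      import numpy as np

      def circularity_shape_factor(xy):
          # 4*pi*A / P^2: 1.0 for a circle, smaller for elongated outlines.
          x, y = xy[:, 0], xy[:, 1]
          # Shoelace formula for the area of the closed polygon.
          area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
          perim = np.sum(np.linalg.norm(np.roll(xy, -1, axis=0) - xy, axis=1))
          return 4.0 * np.pi * area / perim**2

      theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
      circle = np.c_[np.cos(theta), np.sin(theta)]
      print(circularity_shape_factor(circle))  # ~1.0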

  5. Semi-Empirical Prediction of Aircraft Low-Speed Aerodynamic Characteristics

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.

    2015-01-01

    This paper lays out a comprehensive methodology for computing a low-speed, high-lift polar, without requiring additional details about the aircraft design beyond what is typically available at the conceptual design stage. Introducing low-order, physics-based aerodynamic analyses allows the methodology to be more applicable to unconventional aircraft concepts than traditional, fully-empirical methods. The methodology uses empirical relationships for flap lift effectiveness, chord extension, drag-coefficient increment and maximum lift coefficient of various types of flap systems as a function of flap deflection, and combines these increments with the characteristics of the unflapped airfoils. Once the aerodynamic characteristics of the flapped sections are known, a vortex-lattice analysis calculates the three-dimensional lift, drag and moment coefficients of the whole aircraft configuration. This paper details the results of two validation cases: a supercritical airfoil model with several types of flaps; and a 12-foot, full-span aircraft model with slats and double-slotted flaps.

  6. Enhancing predictive accuracy and reproducibility in clinical evaluation research: Commentary on the special section of the Journal of Evaluation in Clinical Practice.

    PubMed

    Bryant, Fred B

    2016-12-01

    This paper introduces a special section of the current issue of the Journal of Evaluation in Clinical Practice that includes a set of 6 empirical articles showcasing a versatile, new machine-learning statistical method, known as optimal data (or discriminant) analysis (ODA), specifically designed to produce statistical models that maximize predictive accuracy. As this set of papers clearly illustrates, ODA offers numerous important advantages over traditional statistical methods-advantages that enhance the validity and reproducibility of statistical conclusions in empirical research. This issue of the journal also includes a review of a recently published book that provides a comprehensive introduction to the logic, theory, and application of ODA in empirical research. It is argued that researchers have much to gain by using ODA to analyze their data. © 2016 John Wiley & Sons, Ltd.

  7. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    PubMed

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and they are highly dependent on the comprehensive analysis of the chemical components of the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs for the analysis of medicinal plants. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications in the comprehensive analysis of medicinal plants. Applications of various MS and hyphenated MS techniques for the analysis of medicinal plants, including but not limited to one-dimensional chromatography and multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, are reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a great role as the major platform for medicinal plant research, providing insight into both empirical therapeutic efficacy and quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  8. Seismic facies analysis based on self-organizing map and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian

    2015-01-01

    Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time windows has a marked effect on the validity of classification and requires iterative experimentation and prior knowledge. In general, waveform-based cluster analysis is sensitive to noise, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are used for validation. The application results show that seismic facies analysis can be improved and can better support interpretation. Its tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
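
    The two-stage procedure described above (EMD de-noising followed by 1D grid SOM clustering) can be sketched in Python, assuming the PyEMD and MiniSom packages are available; the synthetic traces, grid size and number of discarded IMFs are illustrative choices, not values from the paper.

      import numpy as np
      from PyEMD import EMD          # pip install EMD-signal
      from minisom import MiniSom    # pip install minisom

      def emd_denoise(trace, drop_imfs=1):
          """Decompose a trace with EMD and reconstruct it without the
          first (highest-frequency, noise-dominated) IMFs."""
          imfs = EMD()(trace)
          return imfs[drop_imfs:].sum(axis=0)

      # Synthetic stand-in for windowed seismic traces (n_traces x n_samples)
      rng = np.random.default_rng(1)
      t = np.linspace(0, 1, 64)
      traces = np.array([np.sin(2 * np.pi * (5 + k % 3) * t) +
                         0.4 * rng.standard_normal(64) for k in range(200)])
      denoised = np.array([emd_denoise(tr) for tr in traces])

      # 1D grid SOM for waveform classification
      som = MiniSom(1, 6, denoised.shape[1], sigma=1.0, learning_rate=0.5,
                    random_seed=0)
      som.train_random(denoised, 1000)
      facies = np.array([som.winner(tr)[1] for tr in denoised])
      print(np.bincount(facies))    # trace count per facies class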

  9. Rate My Sleep: Examining the Information, Function, and Basis in Empirical Evidence Within Sleep Applications for Mobile Devices.

    PubMed

    Lee-Tobin, Peta A; Ogeil, Rowan P; Savic, Michael; Lubman, Dan I

    2017-11-15

    Sleep applications (apps) have proliferated in online spaces, but few studies have examined the validity of the information contained within the apps. This study aimed to examine the information and functions found within sleep apps, determine whether the information is based on empirical evidence, and whether user ratings were affected by these factors. Sleep apps found in the Google Play store (n = 76) were coded using content analysis to examine the types of information, functions, and evidence base of each app. Only 32.9% of sleep apps contained empirical evidence supporting their claims, 15.8% contained clinical input, and 13.2% contained links to sleep literature. Apps also contained information on how sleep is affected by alcohol or drugs (23.7%), food (13.2%), daily activities (13.2%), and stress (13.2%). A mean difference in average user rating was found between apps that contained at least one source of information and those that did not. App user ratings were not associated with an app having multiple functions, or with an app drawing on multiple sources of evidence (except for sleep literature only). Last, there was a higher average user rating among apps that contained a sleep tip function. Sleep apps are increasingly popular, demonstrated by the large number of downloads in the Google Play store. Users favored apps that contained sleep tips; however, these tips and other information in the apps were generally not based on empirical evidence. Future research in the area of sleep apps should consider constructing sleep apps derived from empirical evidence and examining their effectiveness. © 2017 American Academy of Sleep Medicine

  10. Correspondence Analysis-Theory and Application in Management Accounting Research

    NASA Astrophysics Data System (ADS)

    Duller, Christine

    2010-09-01

    Correspondence analysis is an exploratory data-analytic technique used to identify systematic relations between categorical variables. It is related to principal component analysis, and its results provide information on the structure of categorical variables similar to that given by a principal component analysis in the case of metric variables. Classical correspondence analysis handles two variables, whereas multiple correspondence analysis is an extension to more than two. After an introductory overview of the idea and its implementation in standard software packages (PASW, SAS, R), an example from recent research is presented, which deals with strategic management accounting in family and non-family enterprises in Austria, where 70% to 80% of all enterprises can be classified as family firms. Although there is a growing body of literature focusing on various management issues in family firms, the state of the art of strategic management accounting in family firms remains an empirically under-researched subject. In the relevant literature only the (empirically untested) hypothesis can be found that family firms tend to have less formalized management accounting systems than non-family enterprises. Creating a correspondence analysis will help to identify the underlying structure that is responsible for differences in strategic management accounting.
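
    Classical correspondence analysis reduces to an SVD of the standardized residuals of the contingency table. The following minimal sketch shows the computation; the toy table (firm type by formality of management accounting) is invented for illustration.

      import numpy as np

      def correspondence_analysis(table):
          """Classical correspondence analysis of a two-way contingency
          table via SVD of the standardized residuals."""
          P = table / table.sum()
          r, c = P.sum(axis=1), P.sum(axis=0)
          # Standardized residuals: D_r^{-1/2} (P - r c^T) D_c^{-1/2}
          S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
          U, sv, Vt = np.linalg.svd(S, full_matrices=False)
          row_coords = U * sv / np.sqrt(r)[:, None]   # principal coordinates
          col_coords = Vt.T * sv / np.sqrt(c)[:, None]
          return row_coords, col_coords, sv ** 2      # inertia per dimension

      # Toy table: firm type (rows) x management-accounting formality (cols)
      table = np.array([[42.0, 28.0, 10.0],
                        [12.0, 25.0, 33.0]])
      rows, cols, inertia = correspondence_analysis(table)
      print("inertia per dimension:", inertia.round(4))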

  11. Education Policy as an Act of White Supremacy: Whiteness, Critical Race Theory and Education Reform

    ERIC Educational Resources Information Center

    Gillborn, David

    2005-01-01

    The paper presents an empirical analysis of education policy in England that is informed by recent developments in US critical theory. In particular, I draw on 'whiteness studies' and the application of critical race theory (CRT). These perspectives offer a new and radical way of conceptualizing the role of racism in education. Although the US…

  12. The Application of Various Nonlinear Models to Describe Academic Growth Trajectories: An Empirical Analysis Using Four-Wave Longitudinal Achievement Data from a Large Urban School District

    ERIC Educational Resources Information Center

    Shin, Tacksoo

    2012-01-01

    This study introduced various nonlinear growth models, including the quadratic conventional polynomial model, the fractional polynomial model, the Sigmoid model, the growth model with negative exponential functions, the multidimensional scaling technique, and the unstructured growth curve model. It investigated which growth models effectively…
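
    As an illustration of fitting one of the named forms, the sketch below fits a negative exponential growth curve to four-wave data with scipy; the achievement values and starting parameters are synthetic, not taken from the study.

      import numpy as np
      from scipy.optimize import curve_fit

      # Negative exponential growth: score rises toward an asymptote.
      def neg_exp(t, asymptote, gain, rate):
          return asymptote - gain * np.exp(-rate * t)

      waves = np.array([0.0, 1.0, 2.0, 3.0])           # four measurement waves
      scores = np.array([420.0, 463.0, 489.0, 504.0])  # synthetic achievement means

      params, _ = curve_fit(neg_exp, waves, scores, p0=(520.0, 100.0, 0.5))
      print(dict(zip(("asymptote", "gain", "rate"), params.round(2))))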

  13. A Growth Model for the Academic Program Life Cycle (APLC): A Theoretical and Empirical Analysis. IR Applications, Volume 33

    ERIC Educational Resources Information Center

    Acquah, Edward H. K.

    2012-01-01

    The academic program life cycle (APLC) concept states each program's life flows through several stages: introduction, growth, maturity, and decline. A mixed-influence diffusion growth model is fitted to annual enrollment data on academic programs to analyze the factors determining progress of academic programs through their life cycles. The…
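
    A mixed-influence diffusion model is commonly written in Bass form, with an external (innovation) coefficient p and an internal (imitation) coefficient q; the sketch below fits the closed-form cumulative Bass curve to synthetic annual enrollment counts. All numbers are illustrative, not from the paper.

      import numpy as np
      from scipy.optimize import curve_fit

      def mixed_influence(t, m, p, q):
          """Cumulative adoption N(t) of the Bass mixed-influence model
          dN/dt = (p + q*N/m) * (m - N), in closed form."""
          return m * (1 - np.exp(-(p + q) * t)) / (1 + (q / p) * np.exp(-(p + q) * t))

      years = np.arange(10, dtype=float)
      enrollment = np.array([12, 30, 68, 140, 240, 340, 410, 450, 470, 478], float)

      (m, p, q), _ = curve_fit(mixed_influence, years, enrollment,
                               p0=(500.0, 0.03, 0.4), maxfev=10000)
      print(f"ceiling m={m:.0f}, external p={p:.3f}, internal q={q:.3f}")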

  14. Security Vulnerability Profiles of Mission Critical Software: Empirical Analysis of Security Related Bug Reports

    NASA Technical Reports Server (NTRS)

    Goseva-Popstojanova, Katerina; Tyo, Jacob

    2017-01-01

    While some prior research work exists on characteristics of software faults (i.e., bugs) and failures, very little work has been published on the analysis of software application vulnerabilities. This paper aims to contribute towards filling that gap by presenting an empirical investigation of application vulnerabilities. The results are based on data extracted from issue tracking systems of two NASA missions. These data were organized in three datasets: Ground mission IVV issues, Flight mission IVV issues, and Flight mission Developers issues. In each dataset, we identified security related software bugs and classified them in specific vulnerability classes. Then, we created the security vulnerability profiles, i.e., determined where and when the security vulnerabilities were introduced and what were the dominating vulnerability classes. Our main findings include: (1) In the IVV issues datasets the majority of vulnerabilities were code related and were introduced in the Implementation phase. (2) For all datasets, around 90% of the vulnerabilities were located in two to four subsystems. (3) Out of 21 primary classes, five dominated: Exception Management, Memory Access, Other, Risky Values, and Unused Entities. Together, they contributed 80% to 90% of the vulnerabilities in each dataset.

  15. Adaptive-projection intrinsically transformed multivariate empirical mode decomposition in cooperative brain-computer interface applications.

    PubMed

    Hemakom, Apit; Goverdovsky, Valentin; Looney, David; Mandic, Danilo P

    2016-04-13

    An extension to multivariate empirical mode decomposition (MEMD), termed adaptive-projection intrinsically transformed MEMD (APIT-MEMD), is proposed to cater for power imbalances and inter-channel correlations in real-world multichannel data. It is shown that APIT-MEMD exhibits similar or better performance than MEMD for a large number of projection vectors, whereas it outperforms MEMD for the critical case of a small number of projection vectors within the sifting algorithm. We also employ the noise-assisted APIT-MEMD within our proposed intrinsic multiscale analysis framework and illustrate the advantages of such an approach in a notoriously noise-dominated cooperative brain-computer interface (BCI) based on steady-state visual evoked potentials and P300 responses. Finally, we show that for a joint cognitive BCI task, the proposed intrinsic multiscale analysis framework improves system performance in terms of the information transfer rate. © 2016 The Author(s).

  16. Novel Multidimensional Cross-Correlation Data Comparison Techniques for Spectroscopic Discernment in a Volumetrically Sensitive, Moderating Type Neutron Spectrometer

    NASA Astrophysics Data System (ADS)

    Hoshor, Cory; Young, Stephan; Rogers, Brent; Currie, James; Oakes, Thomas; Scott, Paul; Miller, William; Caruso, Anthony

    2014-03-01

    A novel application of the Pearson Cross-Correlation to neutron spectral discernment in a moderating type neutron spectrometer is introduced. This cross-correlation analysis will be applied to spectral response data collected through both MCNP simulation and empirical measurement by the volumetrically sensitive spectrometer for comparison in 1, 2, and 3 spatial dimensions. The spectroscopic analysis methods discussed will be demonstrated to discern various common spectral and monoenergetic neutron sources.
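
    The underlying comparison is a Pearson cross-correlation between a measured response and a library reference, which for multi-dimensional responses can be computed on the flattened arrays. A minimal sketch with synthetic stand-ins for the detector responses:

      import numpy as np

      def pearson_score(measured, reference):
          """Pearson cross-correlation between a measured detector
          response and a library reference response."""
          return np.corrcoef(measured.ravel(), reference.ravel())[0, 1]

      # Synthetic stand-ins for volumetric (3D) detector responses
      rng = np.random.default_rng(2)
      reference = rng.random((4, 4, 4))
      measured = reference + 0.1 * rng.standard_normal((4, 4, 4))
      print(f"r = {pearson_score(measured, reference):.3f}")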

  17. Directional variance adjustment: bias reduction in covariance matrices based on factor analysis with an application to portfolio optimization.

    PubMed

    Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W; Müller, Klaus-Robert; Lemm, Steven

    2013-01-01

    Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study for the US, European, and Hong Kong stock markets we show that our proposed method leads to improved portfolio allocation.
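
    The contrast the paper draws can be reproduced in miniature by comparing the sample covariance with a factor-model covariance on simulated returns; the sketch below uses scikit-learn's FactorAnalysis and synthetic data, and does not implement the DVA algorithm itself.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(3)
      n_obs, n_assets, n_factors = 120, 50, 3

      # Simulate returns from a true 3-factor model
      loadings = rng.standard_normal((n_assets, n_factors))
      factors = rng.standard_normal((n_obs, n_factors))
      returns = factors @ loadings.T + 0.5 * rng.standard_normal((n_obs, n_assets))

      sample_cov = np.cov(returns, rowvar=False)
      fa = FactorAnalysis(n_components=n_factors).fit(returns)
      factor_cov = fa.get_covariance()   # loadings @ loadings.T + diag(noise)

      # Eigenvalue spread illustrates the spectral error the paper targets
      print(np.linalg.eigvalsh(sample_cov)[[0, -1]])
      print(np.linalg.eigvalsh(factor_cov)[[0, -1]])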

  19. MOGO: Model-Oriented Global Optimization of Petascale Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malony, Allen D.; Shende, Sameer S.

    The MOGO project was initiated in 2008 under the DOE Program Announcement for Software Development Tools for Improved Ease-of-Use on Petascale Systems (LAB 08-19). The MOGO team consisted of Oak Ridge National Lab, Argonne National Lab, and the University of Oregon. The overall goal of MOGO was to attack petascale performance analysis by developing a general framework where empirical performance data could be efficiently and accurately compared with performance expectations at various levels of abstraction. This information could then be used to automatically identify and remediate performance problems. MOGO was based on performance models derived from application knowledge, performance experiments, and symbolic analysis. MOGO made a reasonable impact on existing DOE applications and systems. New tools and techniques were developed which, in turn, were used on important DOE applications on DOE LCF systems to show significant performance improvements.

  20. Design and experimentation of an empirical multistructure framework for accurate, sharp and reliable hydrological ensembles

    NASA Astrophysics Data System (ADS)

    Seiller, G.; Anctil, F.; Roy, R.

    2017-09-01

    This paper outlines the design and experimentation of an Empirical Multistructure Framework (EMF) for lumped conceptual hydrological modeling. The concept is inspired by modular frameworks, empirical model development, and multimodel applications, and encompasses the overproduce-and-select paradigm. The EMF concept aims to reduce subjectivity in conceptual hydrological modeling practice and includes model selection in the optimisation steps, reducing initial assumptions about the prior perception of the dominant rainfall-runoff transformation processes. EMF generates thousands of new modeling options from, at present, twelve parent models that share their functional components and parameters. Optimisation resorts to ensemble calibration, ranking and selection of individual child time series based on optimal bias and reliability trade-offs, as well as accuracy and sharpness improvement of the ensemble. Results on 37 snow-dominated Canadian catchments and 20 climatically diverse American catchments reveal the excellent potential of the EMF for generating new individual model alternatives, with high respective performance values, that may be pooled efficiently into ensembles of seven to sixty constitutive members, with low bias and high accuracy, sharpness, and reliability. A group of 1446 new models is highlighted as offering good potential on other catchments or applications, based on their individual and collective merits. An analysis of the preferred functional components reveals the importance of the production and total flow elements. Overall, results from this research confirm the added value of ensemble and flexible approaches for hydrological applications, especially in uncertain contexts, and open up new modeling possibilities.

  1. A simple empirical model for the clarification-thickening process in wastewater treatment plants.

    PubMed

    Zhang, Y K; Wang, H C; Qi, L; Liu, G H; He, Z J; Fan, H T

    2015-01-01

    In wastewater treatment plants (WWTPs), activated sludge is thickened in secondary settling tanks and recycled into the biological reactor to maintain enough biomass for wastewater treatment. Accurately estimating the activated sludge concentration in the lower portion of the secondary clarifiers is of great importance for evaluating and controlling the sludge recycled ratio, ensuring smooth and efficient operation of the WWTP. By dividing the overall activated sludge-thickening curve into a hindered zone and a compression zone, an empirical model describing activated sludge thickening in the compression zone was obtained by empirical regression. This empirical model was developed through experiments conducted using sludge from five WWTPs, and validated by the measured data from a sixth WWTP, which fit the model well (R² = 0.98, p < 0.001). The model requires application of only one parameter, the sludge volume index (SVI), which is readily incorporated into routine analysis. By combining this model with the conservation of mass equation, an empirical model for compression settling was also developed. Finally, the effects of denitrification and addition of a polymer were also analysed because of their effect on sludge thickening, which can be useful for WWTP operation, e.g., improving wastewater treatment or the proper use of the polymer.

  2. Calculation of lava discharge rates during effusive eruptions: an empirical approach using MODIS Middle InfraRed data

    NASA Astrophysics Data System (ADS)

    Coppola, Diego; Laiolo, Marco; Cigolini, Corrado

    2016-04-01

    The rate at which lava is erupted is a crucial parameter to monitor during any volcanic eruption. However, its accurate and systematic measurement throughout the whole duration of an event remains a big challenge, even for volcanologists working on highly studied and well monitored volcanoes. The thermal approach (also known as the thermal proxy) is currently one of the most promising techniques adopted during effusive eruptions, since it allows Time Averaged lava Discharge Rates (TADR) to be estimated from remote-sensed infrared data acquired several times per day. However, due to the complexity of the physics behind the effusive phenomenon and the difficulty of obtaining field validation, the application of the thermal proxy is still debated and limited to a few volcanoes only. Here we present the analysis of MODIS Middle InfraRed data collected during several distinct eruptions, in order to show how an alternative, empirical method (called the radiant density approach; Coppola et al., 2013) permits TADR estimation over a wide range of emplacement styles and lava compositions. We suggest that the simplicity of this empirical approach allows its rapid application during eruptive crises, and provides the basis for more complex models based on the cooling and spreading processes of the active lava bodies.
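
    The thermal proxy reduces to a division of the satellite-derived volcanic radiative power (VRP) by an empirical coefficient. A minimal sketch with an illustrative coefficient value (Coppola et al. derive the coefficient per volcano from silica content; the number below is not theirs):

      # Convert MODIS-derived volcanic radiative power (VRP, watts) to a
      # time-averaged lava discharge rate (TADR, m^3/s) through an
      # empirical "radiant density" c_rad (J/m^3). The coefficient below
      # is an order-of-magnitude placeholder for illustration only.

      def tadr_from_vrp(vrp_watts, c_rad=2.5e8):
          return vrp_watts / c_rad

      for vrp in (1e8, 1e9, 1e10):
          print(f"VRP = {vrp:.0e} W  ->  TADR = {tadr_from_vrp(vrp):.2f} m^3/s")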

  3. A practical application of practice-based learning: development of an algorithm for empiric antibiotic coverage in ventilator-associated pneumonia.

    PubMed

    Miller, Preston R; Partrick, Matthew S; Hoth, J Jason; Meredith, J Wayne; Chang, Michael C

    2006-04-01

    Development of practice-based learning (PBL) is one of the core competencies required for resident education by the Accreditation Council for Graduate Medical Education, and specialty organizations including the American College of Surgeons have formed task forces to understand and disseminate information on this important concept. However, translating this concept into daily practice may be difficult. Our goal was to describe the successful application of PBL to patient care improvement with development of an algorithm for the empiric therapy of ventilator-associated pneumonia (VAP). The algorithm development occurred in two phases. In phase 1, the microbiology and timing of VAP as diagnosed by bronchoalveolar lavage were reviewed over a 2-year period to allow for recognition of patterns of infection. In phase 2, based on these data, an algorithm for empiric antibiotic coverage that would ensure that the large majority of patients with VAP received adequate initial empiric therapy was developed and put into practice. The period of algorithm use was then examined to determine the rate of adequate coverage and outcome. In phase 1, from January 1, 2000 to December 31, 2001, 110 patients were diagnosed with VAP. Analysis of microbiology revealed a sharp increase in the recovery of nosocomial pathogens on postinjury day 7 (19% before day 7 versus 47% on or after day 7, p = 0.003). Adequate initial antibiotic coverage was seen in 74%. In phase 2, an algorithm employing ampicillin-sulbactam for coverage of community-acquired pathogens before day 7 and cefepime for nosocomial coverage on or after day 7 was employed from January 1, 2002 to December 31, 2003. Evaluation of 186 VAP cases during this interval revealed a similar distribution of nosocomial cases (13% before day 7 versus 64% on or after day 7, p < 0.0001). Empiric antibiotic therapy was adequate in 82% of cases, and overall accuracy improved to 83% (p = 0.05). Mortality from phase 1 to phase 2 trended toward a decrease (21% versus 13%, p = 0.1). Application of the concept of PBL allowed for identification of local patterns of infection and development of an institution-specific treatment algorithm that resulted in >80% adequate initial empiric coverage for VAP, with a trend toward decreased mortality. PBL allows for alteration in practice based on local patterns and outcomes and has the potential to improve patient care.

  4. xEMD procedures as a data - Assisted filtering method

    NASA Astrophysics Data System (ADS)

    Machrowska, Anna; Jonak, Józef

    2018-01-01

    The article presents the possibility of using the Empirical Mode Decomposition (EMD), Ensemble Empirical Mode Decomposition (EEMD), Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and Improved Complete Ensemble Empirical Mode Decomposition (ICEEMD) algorithms for mechanical system condition monitoring applications. Results are presented for the xEMD procedures applied to vibration signals of a system in different states of wear.

  5. Application of low-order potential solutions to higher-order vertical traction boundary problems in an elastic half-space

    PubMed Central

    Taylor, Adam G.

    2018-01-01

    New solutions of potential functions for the bilinear vertical traction boundary condition are derived and presented. The discretization and interpolation of higher-order tractions and the superposition of the bilinear solutions provide a method of forming approximate and continuous solutions for the equilibrium state of a homogeneous and isotropic elastic half-space subjected to arbitrary normal surface tractions. Past experimental measurements of contact pressure distributions in granular media are reviewed in conjunction with the application of the proposed solution method to analysis of elastic settlement in shallow foundations. A numerical example is presented for an empirical ‘saddle-shaped’ traction distribution at the contact interface between a rigid square footing and a supporting soil medium. Non-dimensional soil resistance is computed as the reciprocal of normalized surface displacements under this empirical traction boundary condition, and the resulting internal stresses are compared to classical solutions to uniform traction boundary conditions. PMID:29892456

  6. Price-volume multifractal analysis and its application in Chinese stock markets

    NASA Astrophysics Data System (ADS)

    Yuan, Ying; Zhuang, Xin-tian; Liu, Zhi-ying

    2012-06-01

    Empirical research on Chinese stock markets is conducted using statistical tools. First, the multifractality of the stock price return series, r_i = ln(P_{t+1}) - ln(P_t), and the trading volume variation series, v_i = ln(V_{t+1}) - ln(V_t), is confirmed using multifractal detrended fluctuation analysis. Furthermore, a multifractal detrended cross-correlation analysis between stock price return and trading volume variation in Chinese stock markets is also conducted. It is shown that the cross relationship between them is also multifractal. Second, the cross-correlation between stock price P_i and trading volume V_i is empirically studied using the cross-correlation function and detrended cross-correlation analysis. It is found that both the Shanghai and Shenzhen stock markets show pronounced long-range cross-correlations between stock price and trading volume. Third, a composite index R based on price and trading volume is introduced. Compared with the stock price return series r_i and the trading volume variation series v_i, the R variation series not only retains the characteristics of the original series but also captures the relative correlation between stock price and trading volume. Finally, we analyze the multifractal characteristics of the R variation series before and after three financial events in China (namely, Price Limits, the Reform of Non-tradable Shares and the financial crisis in 2008) over the whole sample period to study the changes in stock market fluctuation and financial risk. The empirical results verify the validity of R.
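
    A compact MF-DFA implementation conveys the method: integrate the series, detrend it segment-wise with polynomials, form q-th order fluctuation functions and read the generalized Hurst exponent h(q) from their scaling. The sketch below uses white noise, for which h(q) close to 0.5 is expected; the scales and q values are arbitrary choices.

      import numpy as np

      def mfdfa(series, scales, qs, order=1):
          """Minimal multifractal detrended fluctuation analysis: returns
          the generalized Hurst exponent h(q) from the scaling of the
          q-th order fluctuation functions F_q(s)."""
          profile = np.cumsum(series - series.mean())
          logF = np.empty((len(qs), len(scales)))
          for j, s in enumerate(scales):
              n_seg = len(profile) // s
              segs = profile[:n_seg * s].reshape(n_seg, s)
              t = np.arange(s)
              # Variance of residuals after polynomial detrending per segment
              var = np.array([np.var(seg - np.polyval(np.polyfit(t, seg, order), t))
                              for seg in segs])
              for i, q in enumerate(qs):
                  if q == 0:
                      logF[i, j] = 0.5 * np.mean(np.log(var))
                  else:
                      logF[i, j] = np.log(np.mean(var ** (q / 2))) / q
          # h(q) is the slope of log F_q(s) versus log s
          return np.array([np.polyfit(np.log(scales), logF[i], 1)[0]
                           for i in range(len(qs))])

      rng = np.random.default_rng(4)
      returns = rng.standard_normal(4096)              # stand-in for r_i
      scales = np.array([16, 32, 64, 128, 256])
      print(mfdfa(returns, scales, qs=[-2, 0, 2]).round(2))  # ~0.5 for white noise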

  7. The Solidarities and Cultural Practices of Russia's Young People at the Beginning of the Twenty-First Century: The Theoretical Context

    ERIC Educational Resources Information Center

    Omel'chenko, E. L.

    2015-01-01

    The article looks at the experience of studying young people in today's Russia and the way the experience correlates with Western traditions of research. The analysis that is proposed is oriented toward understanding the analytical and empirical potential of the concept of solidarity applicable to the current agenda. [This article was translated…

  8. Empirical Network Model of Human Higher Cognitive Brain Functions

    DTIC Science & Technology

    1990-03-31

    [Scanned DTIC record (contract F49620-87-0047, AFOSR); only fragments of the abstract are recoverable.] The project developed the 'Workbench', an interactive exploratory data analysis and display program; other technical developments include methods and programs. Cited fragments: Electroencephalogr. Clin. Neurophysiol., 74:147-160; Illes, J. (1989), Neurolinguistic features of spontaneous language production.

  9. The Impact of ICT on Educational Performance and its Efficiency in Selected EU and OECD Countries: A Non-Parametric Analysis

    ERIC Educational Resources Information Center

    Aristovnik, Aleksander

    2012-01-01

    The purpose of the paper is to review previous research examining ICT efficiency and the impact of ICT on educational output/outcome, as well as different conceptual and methodological issues related to performance measurement. Moreover, a definition, measurements and the empirical application of a model measuring the efficiency of ICT use…

  10. Non-Intrusive Gaze Tracking Using Artificial Neural Networks

    DTIC Science & Technology

    1994-01-05

    We have developed an artificial neural network based gaze tracking system which can be customized to individual users. A three layer feed forward...empirical analysis of the performance of a large number of artificial neural network architectures for this task. Suggestions for further explorations...for neurally based gaze trackers are presented, and are related to other similar artificial neural network applications such as autonomous road following.

  11. Exploratory Analysis of the Comprehensive Application of the Islamic Concept of Zuhd in the Contemporary World

    ERIC Educational Resources Information Center

    Olohunfunmi, Ismail Abdul Fatai

    2015-01-01

    The main aim of the present study is to present a clear framework for how to practically apply the concept of "Zuhd" to individual Muslim life. It is empirical research on the Islamic concept of "Zuhd." The method employed in the study is a qualitative approach, whereby interviews were staged, recorded and transcribed…

  12. Exploring earthquake databases for the creation of magnitude-homogeneous catalogues: tools for application on a regional and global scale

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Pagani, M.; Garcia, J.

    2016-09-01

    The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Centre (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
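
    Harmonizing magnitude scales typically involves regressing one magnitude on another while allowing for error in both, e.g. by general orthogonal (Deming) regression. A minimal sketch on synthetic mb/Mw pairs; the coefficients and noise levels are invented for the example.

      import numpy as np

      def orthogonal_regression(mb, mw, var_ratio=1.0):
          """General orthogonal regression Mw = a*mb + b, accounting for
          measurement error in both scales (var_ratio = var(Mw)/var(mb))."""
          x, y = mb - mb.mean(), mw - mw.mean()
          sxx, syy, sxy = (x * x).mean(), (y * y).mean(), (x * y).mean()
          d = syy - var_ratio * sxx
          a = (d + np.sqrt(d * d + 4 * var_ratio * sxy * sxy)) / (2 * sxy)
          b = mw.mean() - a * mb.mean()
          return a, b

      rng = np.random.default_rng(5)
      true_mb = rng.uniform(4.0, 6.5, 300)
      mb = true_mb + 0.15 * rng.standard_normal(300)     # noisy mb solutions
      mw = 1.05 * true_mb - 0.3 + 0.15 * rng.standard_normal(300)
      a, b = orthogonal_regression(mb, mw)
      print(f"Mw ~= {a:.2f} * mb + {b:.2f}")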

  13. An empirical study of statistical properties of variance partition coefficients for multi-level logistic regression models

    USGS Publications Warehouse

    Li, Ji; Gray, B.R.; Bates, D.M.

    2008-01-01

    Partitioning the variance of a response by design levels is challenging for binomial and other discrete outcomes. Goldstein (2003) proposed four definitions for variance partitioning coefficients (VPC) under a two-level logistic regression model. In this study, we explicitly derived formulae for the multi-level logistic regression model and subsequently studied the distributional properties of the calculated VPCs. Using simulations and a vegetation dataset, we demonstrated associations between different VPC definitions, the importance of methods for estimating VPCs (by comparing VPCs obtained using Laplace and penalized quasi-likelihood methods), and bivariate dependence between VPCs calculated at different levels. Such an empirical study lends immediate support to wider applications of VPCs in scientific data analysis.
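
    Under the latent-variable definition (one of the four discussed by Goldstein), the VPC of a two-level logistic model has a closed form, since the level-1 residual on the latent scale carries the standard-logistic variance pi^2/3. A minimal sketch:

      import numpy as np

      def vpc_latent(sigma2_u):
          """Variance partition coefficient for a two-level logistic model
          under the latent-variable definition: the level-1 residual on
          the latent scale has the standard-logistic variance pi^2 / 3."""
          return sigma2_u / (sigma2_u + np.pi ** 2 / 3)

      for s2 in (0.1, 0.5, 1.0, 2.0):   # cluster-level variances
          print(f"sigma_u^2 = {s2:.1f}  ->  VPC = {vpc_latent(s2):.3f}")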

  14. Quantitative simultaneous multi-element microprobe analysis using combined wavelength and energy dispersive systems

    NASA Technical Reports Server (NTRS)

    Walter, L. S.; Doan, A. S., Jr.; Wood, F. M., Jr.; Bredekamp, J. H.

    1972-01-01

    A combined WDS-EDS system obviates the severe X-ray peak overlap problems encountered with Na, Mg, Al and Si common to pure EDS systems. By application of easily measured empirical correction factors for pulse pile-up and peak overlaps which are normally observed in the analysis of silicate minerals, the accuracy of analysis is comparable with that expected for WDS electron microprobe analyses. The continuum backgrounds are subtracted from the spectra by a spline fitting technique based on integrated intensities between the peaks. The preprocessed data are then reduced to chemical analyses by existing data reduction programs.
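
    Spline-based continuum subtraction can be sketched as follows: fit a smoothing spline only through between-peak regions of the spectrum, then subtract it channel-wise. The spectrum shape, peak position and smoothing factor below are synthetic illustrations, not calibration values from the paper.

      import numpy as np
      from scipy.interpolate import UnivariateSpline

      rng = np.random.default_rng(6)
      energy = np.linspace(0.5, 10.0, 500)                       # keV channels
      continuum = 40 * np.exp(-0.25 * energy)                    # bremsstrahlung-like
      peak = 120 * np.exp(-0.5 * ((energy - 1.74) / 0.06) ** 2)  # e.g. Si K-alpha
      spectrum = continuum + peak + rng.poisson(3, energy.size)

      # Fit the spline only through between-peak regions, then subtract
      between_peaks = np.abs(energy - 1.74) > 0.3
      background = UnivariateSpline(energy[between_peaks],
                                    spectrum[between_peaks], s=5000)
      net_counts = spectrum - background(energy)
      print(f"net peak area ~ {net_counts[np.abs(energy - 1.74) < 0.3].sum():.0f}")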

  15. Prediction of Partition Coefficients of Organic Compounds between SPME/PDMS and Aqueous Solution

    PubMed Central

    Chao, Keh-Ping; Lu, Yu-Ting; Yang, Hsiu-Wen

    2014-01-01

    Polydimethylsiloxane (PDMS) is commonly used as the coated polymer in the solid phase microextraction (SPME) technique. In this study, the partition coefficients of organic compounds between SPME/PDMS and the aqueous solution were compiled from the literature sources. The correlation analysis for partition coefficients was conducted to interpret the effect of their physicochemical properties and descriptors on the partitioning process. The PDMS-water partition coefficients were significantly correlated to the polarizability of organic compounds (r = 0.977, p < 0.05). An empirical model, consisting of the polarizability, the molecular connectivity index, and an indicator variable, was developed to appropriately predict the partition coefficients of 61 organic compounds for the training set. The predictive ability of the empirical model was demonstrated by using it on a test set of 26 chemicals not included in the training set. The empirical model, applying the straightforward calculated molecular descriptors, for estimating the PDMS-water partition coefficient will contribute to the practical applications of the SPME technique. PMID:24534804
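
    The empirical model is a multiple linear regression of the log partition coefficient on three descriptors. A minimal sketch with synthetic descriptor values; the coefficients below are invented, not the paper's fitted values.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      # Synthetic stand-ins for the three descriptors in the paper's model:
      # polarizability, molecular connectivity index, and a 0/1 indicator.
      rng = np.random.default_rng(7)
      n = 61
      X = np.column_stack([rng.uniform(5, 25, n),     # polarizability
                           rng.uniform(2, 8, n),      # connectivity index
                           rng.integers(0, 2, n)])    # indicator variable
      log_k = (0.12 * X[:, 0] + 0.3 * X[:, 1] - 0.8 * X[:, 2]
               + rng.normal(0, 0.2, n))               # synthetic log K values

      model = LinearRegression().fit(X, log_k)
      print("coefficients:", model.coef_.round(2),
            "R^2 =", round(model.score(X, log_k), 3))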

  16. Automatic Detection of Previously-Unseen Application States for Deployment Environment Testing and Analysis

    PubMed Central

    Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail

    2010-01-01

    For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140
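
    The gating idea is simple: fingerprint the application state and run the instrumented tests only when the fingerprint has not been seen before. A minimal sketch, with a hypothetical dictionary-valued state:

      import hashlib
      import json

      seen_states = set()

      def should_run_tests(app_state: dict) -> bool:
          """Run deployment-environment tests only when the (serialized)
          application state has not been encountered before."""
          digest = hashlib.sha256(
              json.dumps(app_state, sort_keys=True).encode()).hexdigest()
          if digest in seen_states:
              return False          # state already tested; skip the overhead
          seen_states.add(digest)
          return True

      print(should_run_tests({"cache": "warm", "conns": 4}))   # True -> test
      print(should_run_tests({"cache": "warm", "conns": 4}))   # False -> skip
      print(should_run_tests({"cache": "cold", "conns": 4}))   # True -> test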

  17. Verification of GCM-generated regional seasonal precipitation for current climate and of statistical downscaling estimates under changing climate conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busuioc, A.; Storch, H. von; Schnur, R.

    Empirical downscaling procedures relate large-scale atmospheric features to local features such as station rainfall in order to facilitate local scenarios of climate change. The purpose of the present paper is twofold: first, a downscaling technique is used as a diagnostic tool to verify the performance of climate models on the regional scale; second, a technique is proposed for verifying the validity of empirical downscaling procedures in climate change applications. The case considered is regional seasonal precipitation in Romania. The downscaling model is a regression based on canonical correlation analysis between observed station precipitation and European-scale sea level pressure (SLP). The climate models considered here are the T21 and T42 versions of the Hamburg ECHAM3 atmospheric GCM run in time-slice mode. The climate change scenario refers to the expected time of doubled carbon dioxide concentrations around the year 2050. Generally, applications of statistical downscaling to climate change scenarios have been based on the assumption that the empirical link between the large-scale and regional parameters remains valid under a changed climate. In this study, a rationale is proposed for this assumption by showing the consistency of the 2xCO2 GCM scenarios in winter, derived directly from the gridpoint data, with the regional scenarios obtained through empirical downscaling. Since the skill of the GCMs in regional terms is already established, it is concluded that the downscaling technique is adequate for describing climatically changing regional and local conditions, at least for precipitation in Romania during winter.
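
    A regression-based downscaling model of this kind can be sketched with canonical correlation analysis between the large-scale field and the station network; scikit-learn's CCA supports fit/predict directly. The data below are synthetic, constructed to share one common mode.

      import numpy as np
      from sklearn.cross_decomposition import CCA

      rng = np.random.default_rng(8)
      n_seasons, n_gridpoints, n_stations = 120, 60, 12

      # Synthetic large-scale SLP anomalies and station precipitation that
      # share one common mode, mimicking the observed-data setup
      mode = rng.standard_normal(n_seasons)
      slp = (np.outer(mode, rng.standard_normal(n_gridpoints))
             + 0.5 * rng.standard_normal((n_seasons, n_gridpoints)))
      precip = (np.outer(mode, rng.standard_normal(n_stations))
                + 0.5 * rng.standard_normal((n_seasons, n_stations)))

      cca = CCA(n_components=2).fit(slp, precip)
      # Downscale: predict station precipitation from the large-scale field
      precip_hat = cca.predict(slp)
      corr = np.corrcoef(precip_hat[:, 0], precip[:, 0])[0, 1]
      print(f"station 1 hindcast correlation: {corr:.2f}")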

  18. Community health nursing advocacy: a concept analysis.

    PubMed

    Ezeonwu, Mabel C

    2015-01-01

    The purpose of this article is to present an in-depth analysis of the concept of community health nursing (CHN) advocacy. Walker and Avant's (2010) 8-step concept analysis methodology was used. A broad inquiry into the literature between 1994 and 2014 resulted in the identification of the uses, defining attributes, empirical referents, antecedents, and consequences, as well as the articulation of an operational definition of CHN advocacy. Model and contrary cases were identified to demonstrate the concept's application and to clarify its meaning. This analysis contributes to the advancement of knowledge of CHN advocacy and provides nurse clinicians, educators, and researchers with some conceptual clarity to help improve community health outcomes.

  19. Time of Concentration equations: the role of morphometric uncertainties in flood risk analysis and management

    NASA Astrophysics Data System (ADS)

    Martins, Luciano; Díez-Herrero, Andrés; Bodoque, Jose M.; Bateira, Carlos

    2016-04-01

    The perception of flood risk by the authorities responsible for flood disaster management and mitigation strategies should be based on an overall evaluation of the uncertainties associated with the procedures for risk assessment and mapping. This contribution presents the results of a mapping evaluation of the time of concentration (tc). This parameter reflects the time at which a watershed responds to rainfall events; it is the most frequently used time parameter and is of great importance in many hydrologic analyses. Accurate estimates of tc are very important: if tc is under-estimated, the result is an over-estimated peak discharge and vice versa, producing significant variations in the flooded areas, with potentially important consequences for land use and occupation of the territory as well as for flood risk management itself. The methodology evaluates 20 different empirical, semi-empirical and kinematic equations for tc at different cartographic scales (1:200000; 1:100000; 1:25000; LiDAR 5x5 m and 1x1 m) in two hydrographic basins with distinct dimensions and geomorphological characteristics, located in the Gredos mountain range (Spain). The results suggest that changes in cartographic scale do not have as significant an influence as one might expect. The most important variations arise from the characteristics of the equations themselves, which use different morphometric parameters in the calculations. Some are based only on geomorphological criteria, while others emphasize the hydraulic characteristics of the channels, resulting in very different tc values. However, we highlight the role of cartographic scale particularly in the application of semi-empirical equations that take into account changes in land use and occupation. In this case, the determination of parameters such as the flow coefficient, curve number and roughness coefficient is very sensitive to cartographic scale. Sensitivity analysis demonstrates that the empirical equations are simpler (e.g. Giandotti, Chow, Temez), since they are based only on the geometrical characteristics of the basin; consequently the results tend not to reflect the basin's dynamics, leading to poorer tc estimates. Equations calibrated on local parameters should not be applied to other regions with distinct geomorphological and climatic characteristics, since this greatly influences the results. For the semi-empirical and kinematic equations (e.g. SCS, Kinematic Wave), tc is reflected mainly in the form of the hydrograph, particularly in the lag time, which seems appropriate for the integrated analysis of hydrographic basins. Moreover, these methods are fundamental to understanding spatio-temporal dynamics within the basin, even if some parameters are difficult to calculate. The best way to calibrate and evaluate the obtained time of concentration values is against known events, calibrated by rating-curve records.
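
    The spread among empirical formulas is easy to demonstrate: the sketch below evaluates two widely used equations, Giandotti and Kirpich, for the same hypothetical basin, using coefficient values as commonly tabulated (the basin numbers are invented).

      import math

      def tc_giandotti(area_km2, length_km, dz_m):
          """Giandotti (empirical): tc in hours from basin area, main
          channel length, and the elevation difference between the mean
          basin elevation and the outlet."""
          return (4 * math.sqrt(area_km2) + 1.5 * length_km) / \
                 (0.8 * math.sqrt(dz_m))

      def tc_kirpich(length_m, slope):
          """Kirpich (empirical): tc in minutes from channel length (m)
          and dimensionless average channel slope."""
          return 0.0195 * length_m ** 0.77 * slope ** -0.385

      # Same hypothetical basin, two equations: the spread is the point
      print(f"Giandotti: {tc_giandotti(45.0, 12.0, 350.0):.1f} h")
      print(f"Kirpich:   {tc_kirpich(12000.0, 0.03) / 60:.1f} h")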

  20. The "SMART Travel Health" Mobile Application Assessment.

    PubMed

    Gallos, Parisis; Mantas, John

    2015-01-01

    An empirical study was conducted to evaluate users' perception of a pilot mobile application ("SMART Travel Health"), their attitude towards use, and their intention to use it. A theoretical model was constructed based on TAM and other related works. The population was 88 travellers who used the pilot application. Data analysis was performed using partial least squares path modeling. Results highlight the very strong significant effect of perceived ease of use on perceived usefulness, the strong significant effect of perceived usefulness on attitude towards use, as well as the significant effect of perceived ease of use on attitude towards using the application. Also, the strong significant effect of attitude towards use on behavioral intention to use reflects the population's positive perception of this mobile application.

  1. Developing measures for information ergonomics in knowledge work.

    PubMed

    Franssila, Heljä; Okkonen, Jussi; Savolainen, Reijo

    2016-03-01

    Information ergonomics is an evolving application domain of ergonomics focusing on the management of workload in the real-world contexts of information-intensive tasks. This study introduces a method for the evaluation of information ergonomics in knowledge work. To this end, five key dimensions of information ergonomics were identified: contextual factors of knowledge work, multitasking, interruptions at work, practices for managing information load, and perceived job control and productivity. In total, 24 measures focusing on the above dimensions were constructed. The measures include, for example, the number of fragmented work tasks per work day. The measures were preliminarily tested in two Finnish organisations, making use of empirical data gathered by interviews, electronic questionnaires and log data applications tracking work processes on personal computers. The measures are applicable to the evaluation of information ergonomics, even though individual measures vary with regard to the amount of work and time needed for data analysis. Practitioner Summary: The study introduces a method for the evaluation of information ergonomics in knowledge work. To this end, 24 measures were constructed and tested empirically. The measures focus on contextual factors of knowledge work, multitasking, interruptions at work, practices for managing information load, and perceived job control and productivity.

  2. Empirical Investigation of Job Applicants' Reactions to Taking a Pre-Employment Honesty Test.

    ERIC Educational Resources Information Center

    Jones, John W.; Joy, Dennis

    Employee theft is widespread and difficult to detect. Many companies have attempted to control the employee theft problem through pre-employment screening. The use of paper-and-pencil honesty tests in this process has become increasingly common. These two studies empirically investigated job applicants' (N=450) reactions to taking a pre-employment…

  3. Structural analysis for preliminary design of High Speed Civil Transport (HSCT)

    NASA Technical Reports Server (NTRS)

    Bhatia, Kumar G.

    1992-01-01

    In the preliminary design environment, there is a need for quick evaluation of configuration and material concepts. The simplified beam representations used for subsonic, high aspect ratio wing planforms are not applicable to low aspect ratio configurations typical of supersonic transports. There is a requirement to develop methods for efficient generation of structural arrangements and finite element representations to support multidisciplinary analysis and optimization. In addition, the empirical databases required to validate prediction methods need to be improved for high speed civil transport (HSCT) type configurations.

  4. The challenge of informed consent and return of results in translational genomics: empirical analysis and recommendations.

    PubMed

    Henderson, Gail E; Wolf, Susan M; Kuczynski, Kristine J; Joffe, Steven; Sharp, Richard R; Parsons, D Williams; Knoppers, Bartha M; Yu, Joon-Ho; Appelbaum, Paul S

    2014-01-01

    As exome and genome sequencing move into clinical application, questions surround how to elicit consent and handle potential return of individual genomic results. This study analyzes nine consent forms used in NIH-funded sequencing studies. Content analysis reveals considerable heterogeneity, including in defining results that may be returned, identifying potential benefits and risks of return, protecting privacy, addressing placement of results in the medical record, and data-sharing. In response to lack of consensus, we offer recommendations. © 2014 American Society of Law, Medicine & Ethics, Inc.

  5. Current Results and Proposed Activities in Microgravity Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Polezhaev, V. I.

    1996-01-01

    The Institute for Problems in Mechanics' Laboratory work in mathematical and physical modelling of fluid mechanics develops models, methods, and software for analysis of fluid flow, instability analysis, direct numerical modelling and semi-empirical models of turbulence, as well as experimental research and verification of these models and their applications in technological fluid dynamics, microgravity fluid mechanics, geophysics, and a number of engineering problems. This paper presents an overview of the results in microgravity fluid dynamics research during the last two years. Nonlinear problems of weakly compressible and compressible fluid flows are discussed.

  6. Evaluation of Conceptual Frameworks Applicable to the Study of Isolation Precautions Effectiveness

    PubMed Central

    Crawford, Catherine; Shang, Jingjing

    2015-01-01

    Aims: A discussion of conceptual frameworks applicable to the study of isolation precautions effectiveness according to Fawcett and DeSanto-Madeya's (2013) evaluation technique, and their relative merits and drawbacks for this purpose. Background: Isolation precautions are recommended to control infectious diseases with high morbidity and mortality, but effectiveness is not established due to numerous methodological challenges. These challenges, such as identifying empirical indicators and refining operational definitions, could be alleviated through use of an appropriate conceptual framework. Design: Discussion paper. Data Sources: In mid-April 2014, the primary author searched five electronic, scientific literature databases for conceptual frameworks applicable to the study of isolation precautions, without limiting searches by publication date. Implications for Nursing: By reviewing promising conceptual frameworks to support isolation precautions effectiveness research, this paper exemplifies the process of choosing an appropriate conceptual framework for empirical research. Hence, researchers may build on these analyses to improve the study design of empirical research in multiple disciplines, which may lead to improved research and practice. Conclusion: Three frameworks were reviewed: the epidemiologic triad of disease, Donabedian's healthcare quality framework and the Quality Health Outcomes model. Each has been used in nursing research to evaluate health outcomes and contains concepts relevant to nursing domains. Which framework is most useful likely depends on whether the study question necessitates testing multiple interventions, concerns pathogen-specific characteristics and yields cross-sectional or longitudinal data. The Quality Health Outcomes model may be slightly preferred as it assumes reciprocal relationships, supports multi-level analysis and is sensitive to cultural inputs. PMID:26179813

  7. Measuring farm sustainability using data envelope analysis with principal components: the case of Wisconsin cranberry.

    PubMed

    Dong, Fengxia; Mitchell, Paul D; Colquhoun, Jed

    2015-01-01

    Measuring farm sustainability performance is a crucial component of improving agricultural sustainability. While extensive assessments and indicators exist that reflect the different facets of agricultural sustainability, because of the relatively large number of measures and the interactions among them, a composite indicator that integrates and aggregates over all variables is particularly useful. This paper describes and empirically evaluates a method for constructing a composite sustainability indicator that individually scores and ranks farm sustainability performance. The method first uses non-negative polychoric principal component analysis to reduce the number of variables, to remove correlation among variables and to transform categorical variables into continuous variables. Next, the method applies common-weight data envelopment analysis to these principal components to individually score each farm. The method solves for the weights endogenously and allows identification of the practices important in sustainability evaluation. An empirical application to Wisconsin cranberry farms finds heterogeneity in sustainability practice adoption, implying that some farms could adopt relevant practices to improve the overall sustainability performance of the industry. Copyright © 2014 Elsevier Ltd. All rights reserved.
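
    The scoring stage can be illustrated with a standard input-oriented CCR DEA solved as a linear program per farm; note this is the textbook formulation, not the paper's common-weight variant on polychoric principal components, and all data below are synthetic.

      import numpy as np
      from scipy.optimize import linprog

      def dea_efficiency(X, Y, k):
          """Input-oriented CCR DEA efficiency of unit k.
          X: inputs (n x m), Y: outputs (n x s); decision variables are
          [theta, lambda_1, ..., lambda_n]."""
          n, m = X.shape
          s = Y.shape[1]
          c = np.zeros(n + 1)
          c[0] = 1.0                                  # minimize theta
          # inputs:  sum_j lam_j x_ij - theta * x_ik <= 0
          A_in = np.hstack([-X[k].reshape(m, 1), X.T])
          # outputs: -sum_j lam_j y_rj <= -y_rk
          A_out = np.hstack([np.zeros((s, 1)), -Y.T])
          res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                        b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                        bounds=[(0, None)] * (n + 1))
          return res.fun

      rng = np.random.default_rng(9)
      X = rng.uniform(1, 10, (8, 2))                  # 8 farms, 2 inputs
      Y = rng.uniform(1, 10, (8, 1))                  # 1 composite output
      print([round(dea_efficiency(X, Y, k), 2) for k in range(8)])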

  8. Equation of state for dense nucleonic matter from metamodeling. I. Foundational aspects

    NASA Astrophysics Data System (ADS)

    Margueron, Jérôme; Hoffmann Casali, Rudiney; Gulminelli, Francesca

    2018-02-01

    Metamodeling for the nucleonic equation of state (EOS), inspired by a Taylor expansion around the saturation density of symmetric nuclear matter, is proposed and parameterized in terms of the empirical parameters. The present knowledge of the nuclear empirical parameters is first reviewed in order to estimate their average values and associated uncertainties, thus defining the parameter space of the metamodeling. They are divided into isoscalar and isovector types, and ordered according to their power in the density expansion. The goodness of the metamodeling is analyzed against the predictions of the original models. In addition, since no correlation among the empirical parameters is assumed a priori, all arbitrary density dependences can be explored, including some that might not be accessible in existing functionals. Spurious correlations due to the assumed functional form are also removed. This meta-EOS allows direct relations between the uncertainties on the empirical parameters and the density dependence of the nuclear equation of state and its derivatives, and the mapping between the two can be done with standard Bayesian techniques. A sensitivity analysis shows that the most influential empirical parameters are the isovector parameters L_sym and K_sym, and that laboratory constraints at supersaturation densities are essential to reduce the present uncertainties. The present metamodeling for the nuclear matter EOS is proposed for further applications to neutron stars and supernova matter.
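
    The expansion itself is compact enough to sketch directly: with x = (n - n_sat)/(3 n_sat), the energy per nucleon is a polynomial in x with the empirical parameters as coefficients, evaluated here at second order and with typical textbook parameter values, for illustration only.

      N_SAT = 0.16                     # saturation density (fm^-3), illustrative
      E_SAT, K_SAT = -15.8, 230.0      # isoscalar empirical parameters (MeV)
      E_SYM, L_SYM, K_SYM = 32.0, 60.0, -100.0   # isovector parameters (MeV)

      def energy_per_nucleon(n, delta):
          """Taylor-expansion metamodel around saturation, truncated at
          second order, with quadratic isospin dependence."""
          x = (n - N_SAT) / (3.0 * N_SAT)
          e_is = E_SAT + 0.5 * K_SAT * x ** 2        # no linear term: P(n_sat) = 0
          e_sym = E_SYM + L_SYM * x + 0.5 * K_SYM * x ** 2
          return e_is + delta ** 2 * e_sym

      for n in (0.08, 0.16, 0.32):
          print(f"n = {n:.2f} fm^-3: e_SNM = {energy_per_nucleon(n, 0.0):6.1f} MeV, "
                f"e_PNM = {energy_per_nucleon(n, 1.0):6.1f} MeV")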

  9. Application and Validation of Workload Assessment Techniques

    DTIC Science & Technology

    1993-03-01

    This technical report documents the process and outcome of meeting this objective. Procedure: A series of eight separate studies was conducted using three...development process. The task analysis and simulation technique was shown to have the capability to track empirical workload ratings. More research is...operator workload during the systems acquisition process, and (b) a pamphlet for the managers of Army systems that describes the need and some procedures

  10. Exploring behavior of an unusual megaherbivore: A spatially explicit foraging model of the hippopotamus

    USGS Publications Warehouse

    Lewison, R.L.; Carter, J.

    2004-01-01

    Herbivore foraging theories have been developed for and tested on herbivores across a range of sizes. Due to logistical constraints, however, little research has focused on foraging behavior of megaherbivores. Here we present a research approach that explores megaherbivore foraging behavior, and assesses the applicability of foraging theories developed on smaller herbivores to megafauna. With simulation models as reference points for the analysis of empirical data, we investigate foraging strategies of the common hippopotamus (Hippopotamus amphibius). Using a spatially explicit individual based foraging model, we apply traditional herbivore foraging strategies to a model hippopotamus, compare model output, and then relate these results to field data from wild hippopotami. Hippopotami appear to employ foraging strategies that respond to vegetation characteristics, such as vegetation quality, as well as spatial reference information, namely distance to a water source. Model predictions, field observations, and comparisons of the two support that hippopotami generally conform to the central place foraging construct. These analyses point to the applicability of general herbivore foraging concepts to megaherbivores, but also point to important differences between hippopotami and other herbivores. Our synergistic approach of models as reference points for empirical data highlights a useful method of behavioral analysis for hard-to-study megafauna. © 2003 Elsevier B.V. All rights reserved.

  11. Development and application of an empirical probability distribution for the prediction error of re-entry body maximum dynamic pressure

    NASA Technical Reports Server (NTRS)

    Lanzi, R. James; Vincent, Brett T.

    1993-01-01

    The relationship between actual and predicted re-entry maximum dynamic pressure is characterized using a probability density function and a cumulative distribution function derived from sounding rocket flight data. This paper explores the properties of this distribution and demonstrates applications of this data with observed sounding rocket re-entry body damage characteristics to assess probabilities of sustaining various levels of heating damage. The results from this paper effectively bridge the gap existing in sounding rocket reentry analysis between the known damage level/flight environment relationships and the predicted flight environment.
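
    Once the ratio of actual to predicted maximum dynamic pressure is tabulated from flight data, exceedance probabilities follow directly from the empirical distribution. A minimal sketch with synthetic ratios:

      import numpy as np

      # Build an empirical distribution of the ratio of actual to predicted
      # maximum dynamic pressure (values below are synthetic, not flight data),
      # then read off exceedance probabilities.
      rng = np.random.default_rng(10)
      ratio = rng.lognormal(mean=0.0, sigma=0.12, size=80)   # actual / predicted

      def exceedance_probability(samples, threshold):
          """P(ratio > threshold) from the empirical CDF."""
          return np.mean(samples > threshold)

      for thr in (1.0, 1.1, 1.25):
          print(f"P(q_max exceeds {thr:.2f} x prediction) "
                f"~ {exceedance_probability(ratio, thr):.2f}")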

  12. In vitro chemo-sensitivity assay guided chemotherapy is associated with prolonged overall survival in cancer patients.

    PubMed

    Udelnow, Andrej; Schönfęlder, Manfred; Würl, Peter; Halloul, Zuhir; Meyer, Frank; Lippert, Hans; Mroczkowski, Paweł

    2013-06-01

    The overall survival (OS) of patients suffering from various tumour entities was correlated with the results of an in vitro chemo-sensitivity assay (CSA) of the drugs applied in vivo. Tumour specimens (n=611) were dissected in 514 patients and incubated for primary tumour cell culture. The histocytological regression assay was performed 5 days after adding chemotherapeutic substances to the cell cultures. In total, 329 patients undergoing chemotherapy were included in the in vitro/in vivo association analysis. OS was assessed and in vitro response groups were compared using survival analysis. Furthermore, Cox regression analysis was performed on OS including CSA, age, TNM classification and treatment course. The growth rate of the primary cultures was 73-96%, depending on tumour entity. The in vitro response rate varied with histology and drug (e.g. 8-18% for methotrexate and 33-83% for epirubicin). OS was significantly prolonged for patients treated with in vitro effective drugs compared to empiric therapy (log-rank test, p=0.0435). Cox regression revealed that application of in vitro effective drugs, residual tumour and postoperative radiotherapy determined the death risk independently. When patients were treated with drugs effective in our CSA, OS was significantly prolonged compared to empiric therapy. CSA-guided chemotherapy should be compared with empiric treatment in a prospective randomized trial.

  13. Publication Trends in Thanatology: An Analysis of Leading Journals.

    PubMed

    Wittkowski, Joachim; Doka, Kenneth J; Neimeyer, Robert A; Vallerga, Michael

    2015-01-01

    To identify important trends in thanatology as a discipline, the authors analyzed over 1,500 articles that appeared in Death Studies and Omega over a 20-year period, coding the category of articles (e.g., theory, application, empirical research), their content focus (e.g., bereavement, death attitudes, end-of-life), and for empirical studies, their methodology (e.g., quantitative, qualitative). In general, empirical research predominates in both journals, with quantitative methods outnumbering qualitative procedures 2 to 1 across the period studied, despite an uptick in the latter methods in recent years. Purely theoretical articles, in contrast, decline in frequency. Research on grief and bereavement is the most commonly occurring (and increasing) content focus of this work, with a declining but still substantial body of basic research addressing death attitudes. Suicidology is also well represented in the corpus of articles analyzed. In contrast, publications on topics such as death education, medical ethics, and end-of-life issues occur with lower frequency, in the latter instances likely due to the submission of such work to more specialized medical journals. Differences in emphasis of Death Studies and Omega are noted, and the analysis of publication patterns is interpreted with respect to overall trends in the discipline and the culture, yielding a broad depiction of the field and some predictions regarding its possible future.

  14. Empirical Analysis of Optical Attenuator Performance in Quantum Key Distribution Systems Using a Particle Model

    DTIC Science & Technology

    2012-03-01

    AFIT/GCS/ENG/12-01. Distribution is unlimited. ... challenging as the complexity of actual implementation specifics is considered. Two components common to most quantum key distribution ...

  15. Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang

    2017-09-01

    Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI) conducted by the International Atomic Energy Agency (IAEA) on 18-20 November, 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's functions approaches) to the engineering application of simulated ground motion for the analysis of seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice to assess seismic hazard in terms of nuclear safety in low seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models to evaluate and understand the physical causes of observed and empirical data, as well as to predict ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.

  16. Constitutive Equation with Varying Parameters for Superplastic Flow Behavior

    NASA Astrophysics Data System (ADS)

    Guan, Zhiping; Ren, Mingwen; Jia, Hongjie; Zhao, Po; Ma, Pinkui

    2014-03-01

    In this study, constitutive equations for superplastic materials with extra large elongations were investigated through mechanical analysis. From the view of phenomenology, firstly, some traditional empirical constitutive relations were standardized by restricting strain paths and parameter conditions, and the coefficients in these relations were given strict new mechanical definitions. Subsequently, a new, general constitutive equation with varying parameters was theoretically deduced based on the general mechanical equation of state. Superplastic tension test data for Zn-5%Al alloy at 340 °C under various strain rates, velocities, and loads were employed to build the new constitutive equation and examine its validity. The analysis indicated that the constitutive equation with varying parameters can characterize superplastic flow behavior in practical superplastic forming with high prediction accuracy and without any restriction on strain path or deformation conditions, which is of genuine industrial and scientific interest. In contrast, the traditional empirical equations show low predictive capability, owing to their constant parameters, and poor applicability, owing to the special strain paths or parameter conditions to which their strictly phenomenological basis confines them.

  17. Formalization and analysis of reasoning by assumption.

    PubMed

    Bosse, Tibor; Jonker, Catholijn M; Treur, Jan

    2006-01-02

    This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability for the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against dynamic properties they fulfill. To this end, for the pattern of reasoning by assumption a variety of dynamic properties have been specified, some of which are considered characteristic for the reasoning pattern, whereas some other properties can be used to discriminate among different approaches to the reasoning. These properties have been automatically checked for the traces acquired in experiments undertaken. The approach turned out to be beneficial from two perspectives. First, checking characteristic properties contributes to the empirical validation of a theory on reasoning by assumption. Second, checking discriminating properties allows the analyst to identify different classes of human reasoners. 2006 Lawrence Erlbaum Associates, Inc.

  18. The Boundaries of Woman's Spirituality in the Beliefs-Spirituality-Religiousness (B-S-R) Model: A Third Perspective-Beliefs as a Cognitive Basis.

    PubMed

    Skrzypińska, Katarzyna

    2017-10-01

    The real nature of the phenomenon of woman's Spirituality remains a central challenge for contemporary empirical research. The literature needs many more examples of the cognitive genesis of worldviews, Spirituality and Religiousness. The first aim of this article is to present the central tenet of the Threefold Nature of Spirituality model, which theoretically explains the nature of Spirituality and the theoretical relationship between beliefs (worldviews), Spirituality and Religiousness (the B-S-R model). The second aim is the empirical verification of this relationship through the application of a mediation analysis. The 308 participants were women aged 18-50 years (M = 25.25, SD = 9.42). The results obtained indicate that Spirituality is a good mediator between an individual's worldview and Religiousness. The presented mediation analysis allows us to describe the basic functioning mechanism of the spiritual sphere and the relationship between the three elements: worldview, Spirituality and Religiousness.

  19. Study of the cross-market effects of Brexit based on the improved symbolic transfer entropy GARCH model—An empirical analysis of stock–bond correlations

    PubMed Central

    Chen, Xiurong; Zhao, Rubo

    2017-01-01

    In this paper, we study the cross-market effects of Brexit on the stock and bond markets of nine major countries. By incorporating information theory, we introduce time-varying impact weights based on symbolic transfer entropy to improve the traditional GARCH model. The empirical results show that under the influence of Brexit, flight-to-quality not only commonly occurs between the stocks and bonds of each country but also occurs simultaneously among different countries. We also find that the accuracy of the time-varying symbolic transfer entropy GARCH model proposed in this paper is improved compared to the traditional GARCH model, indicating its practical value. PMID:28817712
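
    The symbolic transfer entropy underlying the model can be sketched with ordinal-pattern symbolization and a plug-in estimate; this is a generic illustration of the quantity itself, not the authors' time-varying GARCH weighting scheme, and the coupled series are synthetic.

      import numpy as np
      from collections import Counter

      def symbolize(x, m=3):
          """Map each length-m window to its ordinal (rank-order) pattern."""
          return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

      def transfer_entropy(x, y, m=3):
          """Plug-in symbolic transfer entropy from x to y, in nats."""
          sx, sy = symbolize(x, m), symbolize(y, m)
          n = min(len(sx), len(sy)) - 1
          joint, pair_yx, pair_yy, single = Counter(), Counter(), Counter(), Counter()
          for t in range(n):
              joint[(sy[t + 1], sy[t], sx[t])] += 1
              pair_yx[(sy[t], sx[t])] += 1
              pair_yy[(sy[t + 1], sy[t])] += 1
              single[sy[t]] += 1
          te = 0.0
          for (y1, y0, x0), c in joint.items():
              te += (c / n) * np.log((c / pair_yx[(y0, x0)]) /
                                     (pair_yy[(y1, y0)] / single[y0]))
          return te

      rng = np.random.default_rng(0)
      x = rng.standard_normal(2000)                          # "driver" returns
      y = 0.6 * np.roll(x, 1) + rng.standard_normal(2000)    # driven by lagged x
      print("TE(x -> y) =", transfer_entropy(x, y))          # should be larger
      print("TE(y -> x) =", transfer_entropy(y, x))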

  20. Cyriax's deep friction massage application parameters: Evidence from a cross-sectional study with physiotherapists.

    PubMed

    Chaves, Paula; Simões, Daniela; Paço, Maria; Pinho, Francisco; Duarte, José Alberto; Ribeiro, Fernando

    2017-12-01

    Deep friction massage is one of several physiotherapy interventions suggested for the management of tendinopathy. The aims were to determine the prevalence of deep friction massage use in clinical practice, to characterize the application parameters used by physiotherapists, and to identify empirical model-based patterns of deep friction massage application in degenerative tendinopathy. The study was an observational, analytical, cross-sectional, national web-based survey. A total of 478 physiotherapists were selected through the snowball sampling method. The participants completed an online questionnaire about personal and professional characteristics as well as specific questions regarding the use of deep friction massage. The deep friction massage parameters used by physiotherapists were presented as counts and proportions. Latent class analysis was used to identify the empirical model-based patterns. Crude and adjusted odds ratios and 95% confidence intervals were computed. The use of deep friction massage was reported by 88.1% of the participants; tendinopathy was the clinical condition where it was most frequently used (84.9%) and, of these, 55.9% reported its use in degenerative tendinopathy. The "duration of application" parameter in the chronic phase and the "frequency of application" in the acute and chronic phases are those that diverge most from those recommended by the author of deep friction massage. We found a high prevalence of deep friction massage use, namely in degenerative tendinopathy. Our results show that the application parameters are heterogeneous and diverse. This is reflected in the identification of two application patterns, although neither is in complete agreement with Cyriax's description. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. A hierarchical-multiobjective framework for risk management

    NASA Technical Reports Server (NTRS)

    Haimes, Yacov Y.; Li, Duan

    1991-01-01

    A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. United into the framework are the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components essentially consist of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.

  2. A User's Guide for the Differential Reduced Ejector/Mixer Analysis "DREA" Program. 1.0

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence J.; Nadell, Shari-Beth

    1999-01-01

    A system of analytical and numerical two-dimensional mixer/ejector nozzle models that require minimal empirical input has been developed and programmed for use in conceptual and preliminary design. This report contains a user's guide describing the operation of the computer code, DREA (Differential Reduced Ejector/mixer Analysis), that contains these mathematical models. This program is currently being adopted by the Propulsion Systems Analysis Office at the NASA Glenn Research Center. A brief summary of the DREA method is provided, followed by detailed descriptions of the program input and output files. Sample cases demonstrating the application of the program are presented.

  3. Effects in the network topology due to node aggregation: Empirical evidence from the domestic maritime transportation in Greece

    NASA Astrophysics Data System (ADS)

    Tsiotas, Dimitrios; Polyzos, Serafeim

    2018-02-01

    This article studies the topological consistency of spatial networks under node aggregation, examining the changes between different network representations that result from grouping nodes while referring to the same socioeconomic system. The main purpose of this study is to evaluate what kind of topological information remains unaltered under node aggregation and, further, to develop a framework for linking the data of an empirical network with data of its socioeconomic environment, when the latter are available only at hierarchically higher levels of aggregation, in an effort to promote interdisciplinary research in the field of complex network analysis. The research question is empirically tested on topological and socioeconomic data extracted from the Greek Maritime Network (GMN), which is modeled as a non-directed multilayer (bilayer) graph consisting of a port-layer, where nodes represent ports, and a prefecture-layer, where nodes represent coastal and insular prefectural groups of ports. The analysis highlights that the connectivity (degree) of the GMN is the most consistent aspect of this multilayer network, preserving both topological and socioeconomic information through node aggregation. In terms of spatial analysis and regional science, such effects illustrate the effectiveness of the prefectural administrative division for the functionality of the Greek maritime transportation system. Overall, this approach proposes a methodological framework for studying the grouping effects induced on network topology, providing physical, technical, socioeconomic, strategic or political insights.
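
    A toy illustration of node aggregation using networkx (assumed available): hypothetical ports are grouped into prefectures via a quotient graph, and degrees are compared across the two layers. All names and edges are invented.

      import networkx as nx

      # Hypothetical port-layer network: nodes are ports, edges are ferry links.
      G = nx.Graph([("A1", "B1"), ("A1", "B2"), ("A2", "B1"),
                    ("A2", "C1"), ("B2", "C1")])

      # Aggregation rule: map each port to its (invented) prefecture.
      prefecture = {"A1": "A", "A2": "A", "B1": "B", "B2": "B", "C1": "C"}

      # Prefecture-layer graph: two groups are linked if any of their ports are.
      Q = nx.quotient_graph(G, lambda u, v: prefecture[u] == prefecture[v])

      print("port-layer degrees:      ", dict(G.degree()))
      print("prefecture-layer degrees:", {tuple(sorted(b)): d for b, d in Q.degree()})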

  4. Comparison of safety effect estimates obtained from empirical Bayes before-after study, propensity scores-potential outcomes framework, and regression model with cross-sectional data.

    PubMed

    Wood, Jonathan S; Donnell, Eric T; Porter, Richard J

    2015-02-01

    A variety of different study designs and analysis methods have been used to evaluate the performance of traffic safety countermeasures. The most common study designs and methods include observational before-after studies using the empirical Bayes method and cross-sectional studies using regression models. The propensity scores-potential outcomes framework has recently been proposed as an alternative traffic safety countermeasure evaluation method to address the challenges associated with selection biases that can be part of cross-sectional studies. Crash modification factors derived from the application of all three methods have not yet been compared. This paper compares the results of retrospective, observational evaluations of a traffic safety countermeasure using both before-after and cross-sectional study designs. The paper describes the strengths and limitations of each method, focusing primarily on how each addresses site selection bias, which is a common issue in observational safety studies. The Safety Edge paving technique, which seeks to mitigate crashes related to roadway departure events, is the countermeasure used in the present study to compare the alternative evaluation methods. All three methods yielded results that were consistent with each other and with previous research. The empirical Bayes results had the smallest standard errors. It is concluded that the propensity scores with potential outcomes framework is a viable alternative analysis method to the empirical Bayes before-after study. It should be considered whenever a before-after study is not possible or practical. Copyright © 2014 Elsevier Ltd. All rights reserved.
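
    A minimal sketch of the propensity-score step on synthetic data: a logistic model estimates the score and greedy 1:1 nearest-neighbour matching recovers a treatment effect. This is generic matching for illustration, not the paper's full potential-outcomes analysis.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 500
      X = rng.standard_normal((n, 2))            # hypothetical site covariates
      treat = (X @ np.array([0.8, -0.5]) + rng.standard_normal(n) > 0).astype(int)
      crashes = 5 + X @ np.array([1.0, 0.5]) - 0.7 * treat + rng.standard_normal(n)

      # 1. Estimate propensity scores from the covariates.
      ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]

      # 2. Greedy 1:1 nearest-neighbour matching on the score, no replacement.
      controls = list(np.where(treat == 0)[0])
      effects = []
      for i in np.where(treat == 1)[0]:
          if not controls:
              break
          j = min(controls, key=lambda k: abs(ps[k] - ps[i]))
          controls.remove(j)
          effects.append(crashes[i] - crashes[j])

      print(f"matched effect estimate: {np.mean(effects):.2f} (true value -0.7)")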

  5. Measuring routine nursing service efficiency: a comparison of cost per patient day and data envelopment analysis models.

    PubMed Central

    Nunamaker, T R

    1983-01-01

    This article provides an illustrative application of Data Envelopment Analysis (DEA) methodology to the measurement of routine nursing service efficiency at a group of Wisconsin hospitals. The DEA efficiency ratings and cost savings estimates are then compared to those resulting from application of Medicare's routine cost limitation to the sample data. DEA is also used to determine if any changes in the potential for efficient operations occurred during the 1978-1979 period. Empirical results were representative of the fundamental differences existing between the DEA and cost per patient day approaches. No evidence was found to support the notion that the overall potential for efficient delivery of routine services by the sample institutions was greater in one year than another. PMID:6874357
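
    For readers unfamiliar with DEA, the sketch below solves the textbook input-oriented CCR linear program for each unit with scipy's linprog; the hospital inputs and outputs are invented, and this generic formulation is not necessarily the study's exact specification.

      import numpy as np
      from scipy.optimize import linprog

      def dea_ccr_input(X, Y):
          """Input-oriented CCR efficiency for each unit.
          X: (units, inputs), Y: (units, outputs). Returns theta per unit."""
          n = X.shape[0]
          scores = []
          for o in range(n):
              c = np.r_[1.0, np.zeros(n)]        # variables: [theta, lambdas]
              # Inputs:  sum_j lambda_j x_j <= theta * x_o
              A_in = np.hstack([-X[o:o + 1].T, X.T])
              # Outputs: sum_j lambda_j y_j >= y_o
              A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
              res = linprog(c,
                            A_ub=np.vstack([A_in, A_out]),
                            b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                            bounds=[(None, None)] + [(0, None)] * n)
              scores.append(res.fun)
          return np.array(scores)

      # Hypothetical hospitals: inputs = nursing hours, beds; output = patient days.
      X = np.array([[100.0, 20], [120, 25], [80, 15], [150, 40]])
      Y = np.array([[1000.0], [1100], [900], [1200]])
      print(np.round(dea_ccr_input(X, Y), 3))    # 1.0 marks an efficient unit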

  6. Analyzing the Behavioral Differences between Students of Different Genders, Prior Knowledge and Learning Performance with an Educational MMORPG: A Longitudinal Case Study in an Elementary School

    ERIC Educational Resources Information Center

    Hou, Huei-Tse

    2013-01-01

    Many researchers have studied the effects of game-based learning (GBL) (e.g., Annetta, Minogue, Holmes & Cheng, 2009; Kiili, 2007). However, empirical process analyses of long-term applications of GBL in a school setting are much less common. A process analysis of GBL in a school setting allows us to better understand the role of games in…

  7. Exploring Wider Well-Being in the EU-15 Countries: An Empirical Application of the Stiglitz Report

    ERIC Educational Resources Information Center

    Madonia, G.; Cracolici, M. F.; Cuffaro, M.

    2013-01-01

    We draw on the recommendations of the Stiglitz Report to select a set of economic and social variables that can be used to make cross-country comparisons of wider well-being. Using data for the EU-15 countries for 1999 and 2005, we show how three-way analysis can be used to extract synthetic information from a large data set to determine the main…

  8. Development of IR Contrast Data Analysis Application for Characterizing Delaminations in Graphite-Epoxy Structures

    NASA Technical Reports Server (NTRS)

    Havican, Marie

    2012-01-01

    Objective: Develop an infrared (IR) flash thermography application based on use of a calibration standard for inspecting graphite-epoxy laminated/honeycomb structures. Background: Graphite/epoxy composites (laminated and honeycomb) are widely used on NASA programs. Composite materials are susceptible to impact damage that is not readily detected by visual inspection. IR inspection can provide the required sensitivity to detect surface damage in composites during manufacturing and during service. IR contrast analysis can characterize the depth, size and gap thickness of impact damage. Benefits/Payoffs: The research provides an empirical method of calibrating the flash thermography response in nondestructive evaluation. A physical calibration standard with artificial flaws, such as flat bottom holes of desired diameter and depth in a desired material, is used in calibration. The research devises several probability of detection (POD) analysis approaches to enable a cost-effective POD study that meets program requirements.

  9. Calibrating Detailed Chemical Analysis of M dwarfs

    NASA Astrophysics Data System (ADS)

    Veyette, Mark; Muirhead, Philip Steven; Mann, Andrew; Brewer, John; Allard, France; Homeier, Derek

    2018-01-01

    The ability to perform detailed chemical analysis of Sun-like F-, G-, and K-type stars is a powerful tool with many applications including studying the chemical evolution of the Galaxy, assessing membership in stellar kinematic groups, and constraining planet formation theories. Unfortunately, complications in modeling cooler stellar atmospheres have hindered similar analysis of M-dwarf stars. Large surveys of FGK abundances play an important role in developing methods to measure the compositions of M dwarfs by providing benchmark FGK stars that have widely separated M dwarf companions. These systems allow us to empirically calibrate metallicity-sensitive features in M dwarf spectra. However, current methods to measure metallicity in M dwarfs from moderate-resolution spectra are limited to measuring overall metallicity and largely rely on astrophysical abundance correlations in stellar populations. In this talk, I will discuss how large, homogeneous catalogs of precise FGK abundances are crucial to advancing chemical analysis of M dwarfs beyond overall metallicity to direct measurements of individual elemental abundances. I will present a new method to analyze high-resolution, NIR spectra of M dwarfs that employs an empirical calibration of synthetic M dwarf spectra to infer effective temperature, Fe abundance, and Ti abundance. This work is a step toward detailed chemical analysis of M dwarfs at a precision similar to that achieved for FGK stars.

  10. Hedonic approaches based on spatial econometrics and spatial statistics: application to evaluation of project benefits

    NASA Astrophysics Data System (ADS)

    Tsutsumi, Morito; Seya, Hajime

    2009-12-01

    This study discusses the theoretical foundation of the application of spatial hedonic approaches—the hedonic approach employing spatial econometrics and/or spatial statistics—to benefits evaluation. The study highlights the limitations of the spatial econometrics approach since it uses a spatial weight matrix that is not employed by the spatial statistics approach. Further, the study presents empirical analyses applying the Spatial Autoregressive Error Model (SAEM), which is based on the spatial econometrics approach, and the Spatial Process Model (SPM), which is based on the spatial statistics approach. SPMs are estimated under both isotropy and anisotropy and applied to different mesh sizes. The empirical analysis reveals that the estimated benefits differ considerably, especially between isotropic and anisotropic SPM and between isotropic SPM and SAEM; the estimated benefits are similar for SAEM and anisotropic SPM. The study demonstrates that the mesh size does not affect the estimated amount of benefits. Finally, the study provides a confidence interval for the estimated benefits and raises an issue with regard to benefit evaluation.

  11. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  12. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344
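
    A toy version of the feature-based organization idea: each series is reduced to a few interpretable properties and series are then matched by proximity in feature space. The three features and four generators are illustrative choices, far simpler than the thousands of operations analysed in the paper.

      import numpy as np

      def features(ts):
          """A tiny reduced representation of a time series."""
          x = (ts - ts.mean()) / ts.std()
          lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]            # autocorrelation
          spec = np.abs(np.fft.rfft(x)) ** 2
          spec /= spec.sum()
          entropy = -(spec * np.log(spec + 1e-12)).sum()     # spectral entropy
          return np.array([lag1, entropy, np.abs(np.diff(x)).mean()])

      rng = np.random.default_rng(2)
      library = {
          "white noise": rng.standard_normal(512),
          "random walk": np.cumsum(rng.standard_normal(512)),
          "sine":        np.sin(np.linspace(0, 20 * np.pi, 512)),
          "AR(1)":       np.zeros(512),
      }
      for t in range(1, 512):
          library["AR(1)"][t] = 0.9 * library["AR(1)"][t - 1] + rng.standard_normal()

      F = {name: features(ts) for name, ts in library.items()}
      # Organize by similarity: nearest neighbour of each series in feature space.
      for name in F:
          other = min((m for m in F if m != name),
                      key=lambda m: np.linalg.norm(F[m] - F[name]))
          print(f"{name:12s} -> closest match: {other}")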

  13. The physical and empirical basis for a specific clear-air turbulence risk index

    NASA Technical Reports Server (NTRS)

    Keller, J. L.

    1985-01-01

    An improved operational clear-air turbulence (CAT) detection and forecasting technique, the specific clear air turbulence risk (SCATR) index, is developed and detailed. The index shows promising results. The improvements seen with hand-analyzed data, which result from a more realistic representation of the vertical shear of the horizontal wind, are also realized in the data analysis used in the PROFS/CWP application. The SCATR index should improve further as database enhancements such as profiler and VAS satellite data, which increase resolution in space and time, are brought into more sophisticated objective analysis schemes.

  14. Discrimination surfaces with application to region-specific brain asymmetry analysis.

    PubMed

    Martos, Gabriel; de Carvalho, Miguel

    2018-05-20

    Discrimination surfaces are here introduced as a diagnostic tool for localizing brain regions where discrimination between diseased and nondiseased participants is higher. To estimate discrimination surfaces, we introduce a Mann-Whitney type of statistic for random fields and present large-sample results characterizing its asymptotic behavior. Simulation results demonstrate that our estimator accurately recovers the true surface and corresponding interval of maximal discrimination. The empirical analysis suggests that in the anterior region of the brain, schizophrenic patients tend to present lower local asymmetry scores in comparison with participants in the control group. Copyright © 2018 John Wiley & Sons, Ltd.
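
    A minimal sketch of a pointwise Mann-Whitney discrimination surface on a synthetic 8x8 field: the per-location U statistic is rescaled to an AUC-like value, so departures from 0.5 flag discriminative regions. Group sizes and the affected region are invented.

      import numpy as np
      from scipy.stats import mannwhitneyu

      rng = np.random.default_rng(3)
      # Hypothetical asymmetry scores on an 8x8 grid, 20 subjects per group.
      controls = rng.standard_normal((20, 8, 8))
      patients = rng.standard_normal((20, 8, 8))
      patients[:, 2:4, 2:4] -= 1.0          # lower scores in one small region

      surface = np.zeros((8, 8))
      for i in range(8):
          for j in range(8):
              u, _ = mannwhitneyu(patients[:, i, j], controls[:, i, j])
              surface[i, j] = u / (20 * 20)  # AUC-like rescaling of U

      peak = np.unravel_index(np.argmax(np.abs(surface - 0.5)), surface.shape)
      print("most discriminative grid location:", peak)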

  15. Generating cancelable fingerprint templates.

    PubMed

    Ratha, Nalini K; Chikkerur, Sharat; Connell, Jonathan H; Bolle, Ruud M

    2007-04-01

    Biometrics-based authentication systems offer obvious usability advantages over traditional password and token-based authentication schemes. However, biometrics raises several privacy concerns. A biometric is permanently associated with a user and cannot be changed. Hence, if a biometric identifier is compromised, it is lost forever and possibly for every application where the biometric is used. Moreover, if the same biometric is used in multiple applications, a user can potentially be tracked from one application to the next by cross-matching biometric databases. In this paper, we demonstrate several methods to generate multiple cancelable identifiers from fingerprint images to overcome these problems. In essence, a user can be given as many biometric identifiers as needed by issuing a new transformation "key." The identifiers can be cancelled and replaced when compromised. We empirically compare the performance of several algorithms such as Cartesian, polar, and surface folding transformations of the minutiae positions. It is demonstrated through multiple experiments that we can achieve revocability and prevent cross-matching of biometric databases. It is also shown that the transforms are noninvertible by demonstrating that it is computationally as hard to recover the original biometric identifier from a transformed version as by randomly guessing. Based on these empirical results and a theoretical analysis we conclude that feature-level cancelable biometric construction is practicable in large biometric deployments.
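
    A toy, key-driven Cartesian-style block scrambling of minutiae positions, illustrating the revocability idea only: re-issuing a new key yields a new template from the same fingerprint. The real transforms are more elaborate and engineered for non-invertibility, which this sketch does not guarantee.

      import numpy as np

      def cancelable_transform(minutiae, key, grid=4, size=256):
          """Scramble minutiae by a key-dependent permutation of grid cells."""
          rng = np.random.default_rng(key)
          perm = rng.permutation(grid * grid)
          cell = size // grid
          out = []
          for x, y in minutiae:
              c = (y // cell) * grid + (x // cell)      # source cell index
              tx, ty = perm[c] % grid, perm[c] // grid  # key-chosen target cell
              out.append((tx * cell + x % cell, ty * cell + y % cell))
          return out

      minutiae = [(12, 40), (200, 33), (130, 129), (77, 250)]
      print(cancelable_transform(minutiae, key=42))
      print(cancelable_transform(minutiae, key=43))     # revoked -> new template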

  16. Disease Risk Score (DRS) as a Confounder Summary Method: Systematic Review and Recommendations

    PubMed Central

    Tadrous, Mina; Gagne, Joshua J.; Stürmer, Til; Cadarette, Suzanne M.

    2013-01-01

    Purpose: To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. Methods: We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized methods used in empirical applications overall and by publication year (<2000, ≥2000). Results: Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak in 1979-1980 and a resurgence since 2000. The majority of applications with methodological detail derived DRS using logistic regression (47%), used DRS as a categorical variable in regression (93%), and applied DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Conclusion: Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. PMID:23172692

  17. Disease risk score as a confounder summary method: systematic review and recommendations.

    PubMed

    Tadrous, Mina; Gagne, Joshua J; Stürmer, Til; Cadarette, Suzanne M

    2013-02-01

    To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized methods used in empirical applications overall and by publication year (<2000, ≥2000). Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak 1979-1980, and resurgence since 2000. The majority of applications with methodological detail derived DRS using logistic regression (47%), used DRS as a categorical variable in regression (93%), and applied DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. Copyright © 2012 John Wiley & Sons, Ltd.
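
    A minimal sketch of the DRS workflow on synthetic data: fit the outcome model among the unexposed only, score everyone, then use score quintiles as a categorical stratification variable, which also permits a look at effect modification by baseline risk. All variables and coefficients are invented.

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(4)
      n = 2000
      age = rng.uniform(40, 80, n)
      comorbid = rng.binomial(1, 0.3, n)
      exposed = rng.binomial(1, 0.5, n)
      logit = -6 + 0.06 * age + 0.8 * comorbid - 0.4 * exposed
      outcome = rng.binomial(1, 1 / (1 + np.exp(-logit)))
      df = pd.DataFrame(dict(age=age, comorbid=comorbid,
                             exposed=exposed, outcome=outcome))

      # 1. Fit the outcome model among the unexposed, then score all subjects.
      ref = df[df.exposed == 0]
      model = LogisticRegression().fit(ref[["age", "comorbid"]], ref.outcome)
      df["drs"] = model.predict_proba(df[["age", "comorbid"]])[:, 1]

      # 2. Stratify on DRS quintiles and inspect the exposure effect per stratum.
      df["stratum"] = pd.qcut(df.drs, 5, labels=False)
      for s, g in df.groupby("stratum"):
          rd = g[g.exposed == 1].outcome.mean() - g[g.exposed == 0].outcome.mean()
          print(f"DRS quintile {s}: risk difference = {rd:+.3f}")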

  18. Acoustical Applications of the HHT Method

    NASA Technical Reports Server (NTRS)

    Huang, Norden E.

    2003-01-01

    A document discusses applications of a method based on the Huang-Hilbert transform (HHT). The method was described, without the HHT name, in Analyzing Time Series Using EMD and Hilbert Spectra (GSC-13817), NASA Tech Briefs, Vol. 24, No. 10 (October 2000), page 63. To recapitulate: The method is especially suitable for analyzing time-series data that represent nonstationary and nonlinear physical phenomena. The method involves the empirical mode decomposition (EMD), in which a complicated signal is decomposed into a finite number of functions, called intrinsic mode functions (IMFs), that admit well-behaved Hilbert transforms. The HHT consists of the combination of EMD and Hilbert spectral analysis.
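
    A short sketch of the EMD-plus-Hilbert pipeline on a synthetic nonstationary signal, assuming the third-party PyEMD package (distributed on PyPI as EMD-signal) is installed; the instantaneous frequency is read from the unwrapped phase of each IMF's analytic signal.

      import numpy as np
      from scipy.signal import hilbert
      from PyEMD import EMD            # assumed third-party dependency

      fs = 1000.0
      t = np.arange(0, 1, 1 / fs)
      # Nonstationary test signal: a chirp plus a slow oscillation.
      sig = np.sin(2 * np.pi * (20 * t + 30 * t ** 2)) + 0.5 * np.sin(2 * np.pi * 5 * t)

      imfs = EMD()(sig)                # decompose into intrinsic mode functions

      for k, imf in enumerate(imfs):
          analytic = hilbert(imf)      # well-behaved Hilbert transform per IMF
          inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
          print(f"IMF {k}: mean instantaneous frequency ~ {inst_freq.mean():7.1f} Hz")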

  19. Application of higher harmonic blade feathering for helicopter vibration reduction

    NASA Technical Reports Server (NTRS)

    Powers, R. W.

    1978-01-01

    Higher harmonic blade feathering for helicopter vibration reduction is considered. Recent wind tunnel tests confirmed the effectiveness of higher harmonic control in reducing articulated rotor vibratory hub loads. Several predictive analyses developed in support of the NASA program were shown to be capable of calculating single harmonic control inputs required to minimize a single 4P hub response. In addition, a multiple-input, multiple-output harmonic control predictive analysis was developed. All techniques developed thus far obtain a solution by extracting empirical transfer functions from sampled data. Algorithm data sampling and processing requirements are minimal to encourage adaptive control system application of such techniques in a flight environment.

  20. Quantitative estimation of minimum offset for multichannel surface-wave survey with actively exciting source

    USGS Publications Warehouse

    Xu, Y.; Xia, J.; Miller, R.D.

    2006-01-01

    Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and is normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface wave surveys in near-surface applications. © 2005 Elsevier B.V. All rights reserved.

  1. Structural Patterns in Empirical Research Articles: A Cross-Disciplinary Study

    ERIC Educational Resources Information Center

    Lin, Ling; Evans, Stephen

    2012-01-01

    This paper presents an analysis of the major generic structures of empirical research articles (RAs), with a particular focus on disciplinary variation and the relationship between the adjacent sections in the introductory and concluding parts. The findings were derived from a close "manual" analysis of 433 recent empirical RAs from high-impact…

  2. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

    Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for selection of informative characteristics from signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterization of different parameters defining the states of a turning machining process, such as chip form, tool wear, and the onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by joining characteristics from multiple sensors and by the application of chaotic characteristics.

  3. Distributed Cognition in Cancer Treatment Decision Making: An Application of the DECIDE Decision-Making Styles Typology.

    PubMed

    Krieger, Janice L; Krok-Schoen, Jessica L; Dailey, Phokeng M; Palmer-Wackerly, Angela L; Schoenberg, Nancy; Paskett, Electra D; Dignan, Mark

    2017-07-01

    Distributed cognition occurs when cognitive and affective schemas are shared between two or more people during interpersonal discussion. Although extant research focuses on distributed cognition in decision making between health care providers and patients, studies show that caregivers are also highly influential in the treatment decisions of patients. However, there are little empirical data describing how and when families exert influence. The current article addresses this gap by examining decisional support in the context of cancer randomized clinical trial (RCT) decision making. Data are drawn from in-depth interviews with rural, Appalachian cancer patients ( N = 46). Analysis of transcript data yielded empirical support for four distinct models of health decision making. The implications of these findings for developing interventions to improve the quality of treatment decision making and overall well-being are discussed.

  4. Empirical Research on Spatial Diffusion Process of Knowledge Spillovers

    NASA Astrophysics Data System (ADS)

    Jin, Xuehui

    2018-02-01

    Firstly, this paper gives a brief review of the core issues of previous studies on the spatial distribution of knowledge spillovers, laying the theoretical foundation for further research. Secondly, it describes the diffusion process of solar patents in the Beijing-Tianjin-Hebei and Pearl River Delta regions by means of correlation analysis based on patent information such as the application date and the address of the patentee. The paper then introduces the variables of spatial distance, knowledge absorptive capacity, knowledge gap and pollution control, builds an empirical patent model, and collects data to test it. The results show that knowledge absorptive capacity is the most significant of the four factors, followed by the knowledge gap; the influence of spatial distance on knowledge spillovers is limited, and pollution control is the weakest factor.

  5. Application of spectral methods for high-frequency financial data to quantifying states of market participants

    NASA Astrophysics Data System (ADS)

    Sato, Aki-Hiro

    2008-06-01

    Empirical analysis of the foreign exchange market is conducted based on methods to quantify similarities among multi-dimensional time series with spectral distances introduced in [A.-H. Sato, Physica A 382 (2007) 258-270]. As a result, it is found that the similarities among currency pairs fluctuate with the rotation of the earth, and that the similarities among best quotation rates are associated with those among quotation frequencies. Furthermore, it is shown both empirically and numerically that the Jensen-Shannon spectral divergence is proportional to a mean of the Kullback-Leibler spectral distance. Numerical simulation confirms that these spectral distances are connected with the distributions of behavioural parameters of the market participants. It is concluded that spectral distances of representative quantities of financial markets are related to the diversity of behavioural parameters of the market participants.
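
    A generic sketch of such spectral distances: Welch periodograms are normalized to spectral distributions, and Kullback-Leibler and Jensen-Shannon divergences are computed between them. The series here are synthetic stand-ins, not quotation data.

      import numpy as np
      from scipy.signal import welch

      def spectral_pmf(x, fs=1.0):
          f, p = welch(x, fs=fs, nperseg=256)
          return p / p.sum()                    # normalized spectral distribution

      def kl(p, q):
          return np.sum(p * np.log((p + 1e-12) / (q + 1e-12)))

      def js(p, q):
          m = 0.5 * (p + q)
          return 0.5 * kl(p, m) + 0.5 * kl(q, m)

      rng = np.random.default_rng(5)
      a = np.sin(0.30 * np.arange(4096)) + 0.5 * rng.standard_normal(4096)
      b = np.sin(0.32 * np.arange(4096)) + 0.5 * rng.standard_normal(4096)
      c = rng.standard_normal(4096)

      print("JS(a, b) =", js(spectral_pmf(a), spectral_pmf(b)))  # similar -> small
      print("JS(a, c) =", js(spectral_pmf(a), spectral_pmf(c)))  # dissimilar -> larger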

  6. Radar-aeolian roughness project

    NASA Technical Reports Server (NTRS)

    Greeley, Ronald; Dobrovolskis, A.; Gaddis, L.; Iversen, J. D.; Lancaster, N.; Leach, Rodman N.; Rasmussen, K.; Saunders, S.; Vanzyl, J.; Wall, S.

    1991-01-01

    The objective is to establish an empirical relationship between measurements of radar, aeolian, and surface roughness on a variety of natural surfaces and to understand the underlying physical causes. This relationship will form the basis for developing a predictive equation to derive aeolian roughness from radar backscatter. Results are given from investigations carried out in 1989 on the principal elements of the project, with separate sections on field studies, radar data analysis, laboratory simulations, and development of theory for planetary applications.

  7. Heat transfer correlations for multilayer insulation systems

    NASA Astrophysics Data System (ADS)

    Krishnaprakas, C. K.; Badari Narayana, K.; Dutta, Pradip

    2000-01-01

    Multilayer insulation (MLI) blankets are extensively used in spacecraft as lightweight thermal protection systems. Heat transfer analysis of MLI is sometimes too complex for practical design applications. Hence, for practical engineering design purposes, it is necessary to have simpler procedures to evaluate the heat transfer rate through MLI. In this paper, four different empirical models for heat transfer are evaluated by fitting against experimentally observed heat flux through MLI blankets of various configurations, and the results are discussed.

  8. Species delimitation using Bayes factors: simulations and application to the Sceloporus scalaris species group (Squamata: Phrynosomatidae).

    PubMed

    Grummer, Jared A; Bryson, Robert W; Reeder, Tod W

    2014-03-01

    Current molecular methods of species delimitation are limited by the types of species delimitation models and scenarios that can be tested. Bayes factors allow for more flexibility in testing non-nested species delimitation models and hypotheses of individual assignment to alternative lineages. Here, we examined the efficacy of Bayes factors in delimiting species through simulations and empirical data from the Sceloporus scalaris species group. Marginal-likelihood scores of competing species delimitation models, from which Bayes factors were computed and compared, were estimated with four different methods: harmonic mean estimation (HME), smoothed harmonic mean estimation (sHME), path-sampling/thermodynamic integration (PS), and stepping-stone (SS) analysis. We also performed model selection using a posterior simulation-based analog of the Akaike information criterion through Markov chain Monte Carlo analysis (AICM). Bayes factor species delimitation results from the empirical data were then compared with results from the reversible-jump MCMC (rjMCMC) coalescent-based species delimitation method Bayesian Phylogenetics and Phylogeography (BP&P). Simulation results show that HME and sHME perform poorly compared with PS and SS marginal-likelihood estimators when identifying the true species delimitation model. Furthermore, Bayes factor delimitation (BFD) of species showed improved performance when species limits were tested by reassigning individuals between species, as opposed to either lumping or splitting lineages. In the empirical data, BFD through PS and SS analyses, as well as the rjMCMC method, each provide support for the recognition of all scalaris group taxa as independent evolutionary lineages. Bayes factor species delimitation and BP&P also support the recognition of three previously undescribed lineages. In both simulated and empirical data sets, harmonic and smoothed harmonic mean marginal-likelihood estimators provided much higher marginal-likelihood estimates than PS and SS estimators. The AICM displayed poor repeatability in both simulated and empirical data sets, and produced inconsistent model rankings across replicate runs with the empirical data. Our results suggest that species delimitation through the use of Bayes factors with marginal-likelihood estimates via PS or SS analyses provides a useful and complementary alternative to existing species delimitation methods.

  9. Human growth and body weight dynamics: an integrative systems model.

    PubMed

    Rahmandad, Hazhir

    2014-01-01

    Quantifying human weight and height dynamics due to growth, aging, and energy balance can inform clinical practice and policy analysis. This paper presents the first mechanism-based model spanning a full individual life and capturing changes in body weight, composition and height. Integrating previous empirical and modeling findings and validated against several additional empirical studies, the model replicates key trends in human growth including: A) changes in energy requirements from birth to old age; B) short- and long-term dynamics of body weight and composition; and C) stunted growth with chronic malnutrition and the potential for catch-up growth. From obesity policy analysis to treating malnutrition and tracking growth trajectories, the model can address diverse policy questions. For example, I find that even without further rise in obesity, the gap between healthy and actual Body Mass Indexes (BMIs) has embedded, for different population groups, a surplus of 14%-24% in energy intake, which will be a source of significant inertia in obesity trends. In another analysis, the energy deficit percentage needed to reduce BMI by one unit is found to be relatively constant across ages. An accompanying, documented and freely available simulation model facilitates diverse applications customized to different sub-populations.

  10. Human Growth and Body Weight Dynamics: An Integrative Systems Model

    PubMed Central

    Rahmandad, Hazhir

    2014-01-01

    Quantifying human weight and height dynamics due to growth, aging, and energy balance can inform clinical practice and policy analysis. This paper presents the first mechanism-based model spanning a full individual life and capturing changes in body weight, composition and height. Integrating previous empirical and modeling findings and validated against several additional empirical studies, the model replicates key trends in human growth including: A) changes in energy requirements from birth to old age; B) short- and long-term dynamics of body weight and composition; and C) stunted growth with chronic malnutrition and the potential for catch-up growth. From obesity policy analysis to treating malnutrition and tracking growth trajectories, the model can address diverse policy questions. For example, I find that even without further rise in obesity, the gap between healthy and actual Body Mass Indexes (BMIs) has embedded, for different population groups, a surplus of 14%–24% in energy intake, which will be a source of significant inertia in obesity trends. In another analysis, the energy deficit percentage needed to reduce BMI by one unit is found to be relatively constant across ages. An accompanying, documented and freely available simulation model facilitates diverse applications customized to different sub-populations. PMID:25479101
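
    A deliberately stripped-down energy-balance sketch, not the paper's integrated model: weight changes by the daily energy gap divided by an assumed tissue energy density, with maintenance expenditure proportional to weight. Both constants are rough, assumed values.

      import numpy as np

      RHO = 7700.0     # kcal per kg of tissue change (common approximation)
      C = 30.0         # kcal/day maintenance expenditure per kg (assumed)

      def simulate(weight0, intake_kcal, days):
          w, traj = weight0, [weight0]
          for _ in range(days):
              w += (intake_kcal - C * w) / RHO   # daily Euler step
              traj.append(w)
          return np.array(traj)

      traj = simulate(weight0=90.0, intake_kcal=2400.0, days=3 * 365)
      print(f"start {traj[0]:.1f} kg -> after 3 years {traj[-1]:.1f} kg "
            f"(steady state {2400.0 / C:.1f} kg)")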

  11. Decision-making in healthcare: a practical application of partial least square path modelling to coverage of newborn screening programmes

    PubMed Central

    2012-01-01

    Background: Decision-making in healthcare is complex. Research on coverage decision-making has focused on comparative studies for several countries, statistical analyses for single decision-makers, the decision outcome and appraisal criteria. Accounting for decision processes extends the complexity, as they are multidimensional and process elements need to be regarded as latent constructs (composites) that are not observed directly. The objective of this study was to present a practical application of partial least square path modelling (PLS-PM) to evaluate how it offers a method for empirical analysis of decision-making in healthcare. Methods: Empirical approaches that applied PLS-PM to decision-making in healthcare were identified through a systematic literature search. PLS-PM was used as an estimation technique for a structural equation model that specified hypotheses between the components of decision processes and the reasonableness of decision-making in terms of medical, economic and other ethical criteria. The model was estimated for a sample of 55 coverage decisions on the extension of newborn screening programmes in Europe. Results were evaluated by standard reliability and validity measures for PLS-PM. Results: After modification by dropping two indicators that showed poor measures in the measurement models’ quality assessment and were not meaningful for newborn screening, the structural equation model estimation produced plausible results. The presence of three influences was supported: the links between both stakeholder participation or transparency and the reasonableness of decision-making; and the effect of transparency on the degree of scientific rigour of assessment. Reliable and valid measurement models were obtained to describe the composites of ‘transparency’, ‘participation’, ‘scientific rigour’ and ‘reasonableness’. Conclusions: The structural equation model was among the first applications of PLS-PM to coverage decision-making. It allowed testing of hypotheses in situations where there are links between several non-observable constructs. PLS-PM was compatible in accounting for the complexity of coverage decisions to obtain a more realistic perspective for empirical analysis. The model specification can be used for hypothesis testing by using larger sample sizes and for data in the full domain of health technologies. PMID:22856325

  12. Decision-making in healthcare: a practical application of partial least square path modelling to coverage of newborn screening programmes.

    PubMed

    Fischer, Katharina E

    2012-08-02

    Decision-making in healthcare is complex. Research on coverage decision-making has focused on comparative studies for several countries, statistical analyses for single decision-makers, the decision outcome and appraisal criteria. Accounting for decision processes extends the complexity, as they are multidimensional and process elements need to be regarded as latent constructs (composites) that are not observed directly. The objective of this study was to present a practical application of partial least square path modelling (PLS-PM) to evaluate how it offers a method for empirical analysis of decision-making in healthcare. Empirical approaches that applied PLS-PM to decision-making in healthcare were identified through a systematic literature search. PLS-PM was used as an estimation technique for a structural equation model that specified hypotheses between the components of decision processes and the reasonableness of decision-making in terms of medical, economic and other ethical criteria. The model was estimated for a sample of 55 coverage decisions on the extension of newborn screening programmes in Europe. Results were evaluated by standard reliability and validity measures for PLS-PM. After modification by dropping two indicators that showed poor measures in the measurement models' quality assessment and were not meaningful for newborn screening, the structural equation model estimation produced plausible results. The presence of three influences was supported: the links between both stakeholder participation or transparency and the reasonableness of decision-making; and the effect of transparency on the degree of scientific rigour of assessment. Reliable and valid measurement models were obtained to describe the composites of 'transparency', 'participation', 'scientific rigour' and 'reasonableness'. The structural equation model was among the first applications of PLS-PM to coverage decision-making. It allowed testing of hypotheses in situations where there are links between several non-observable constructs. PLS-PM was compatible in accounting for the complexity of coverage decisions to obtain a more realistic perspective for empirical analysis. The model specification can be used for hypothesis testing by using larger sample sizes and for data in the full domain of health technologies.

  13. A steady state model of agricultural waste pyrolysis: A mini review.

    PubMed

    Trninić, M; Jovović, A; Stojiljković, D

    2016-09-01

    Agricultural waste is one of the main renewable energy resources available, especially in an agricultural country such as Serbia. Pyrolysis has already been considered as an attractive alternative for disposal of agricultural waste, since the technique can convert this special biomass resource into granular charcoal, non-condensable gases and pyrolysis oils, which could furnish profitable energy and chemical products owing to their high calorific value. In this regard, the development of thermochemical processes requires a good understanding of pyrolysis mechanisms. Experimental and some literature data on the pyrolysis characteristics of corn cob and several other agricultural residues under inert atmosphere were structured and analysed in order to obtain conversion behaviour patterns of agricultural residues during pyrolysis within the temperature range from 300 °C to 1000 °C. Based on experimental and literature data analysis, empirical relationships were derived, including relations between the temperature of the process and yields of charcoal, tar and gas (CO2, CO, H2 and CH4). An analytical semi-empirical model was then used as a tool to analyse the general trends of biomass pyrolysis. Although this semi-empirical model needs further refinement before application to all types of biomass, its prediction capability was in good agreement with results obtained by the literature review. The compact representation could be used in other applications, to conveniently extrapolate and interpolate these results to other temperatures and biomass types. © The Author(s) 2016.

  14. Characterization of HPGe gamma spectrometric detectors systems for Instrumental Neutron Activation Analysis (INAA) at the Colombian Geological Survey

    NASA Astrophysics Data System (ADS)

    Sierra, O.; Parrado, G.; Cañón, Y.; Porras, A.; Alonso, D.; Herrera, D. C.; Peña, M.; Orozco, J.

    2016-07-01

    This paper presents the progress made by the Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey (SGC in its Spanish acronym) towards the characterization of its gamma spectrometric systems for Instrumental Neutron Activation Analysis (INAA), with the aim of introducing corrections to the measurements for variations in sample geometry. Characterization includes the empirical determination of the interaction point of gamma radiation inside the germanium crystal, through the application of a linear model, and the use of fast Monte Carlo N-Particle (MCNP) software to estimate correction factors for differences in counting efficiency that arise from variations in density between samples and standards.

  15. The Optics and Alignment of the Divergent Beam Laboratory X-ray Powder Diffractometer and its Calibration Using NIST Standard Reference Materials.

    PubMed

    Cline, James P; Mendenhall, Marcus H; Black, David; Windover, Donald; Henins, Albert

    2015-01-01

    The laboratory X-ray powder diffractometer is one of the primary analytical tools in materials science. It is applicable to nearly any crystalline material, and with advanced data analysis methods, it can provide a wealth of information concerning sample character. Data from these machines, however, are beset by a complex aberration function that can be addressed through calibration with the use of NIST Standard Reference Materials (SRMs). Laboratory diffractometers can be set up in a range of optical geometries; considered herein are those of Bragg-Brentano divergent beam configuration using both incident and diffracted beam monochromators. We review the origin of the various aberrations affecting instruments of this geometry and the methods developed at NIST to align these machines in a first principles context. Data analysis methods are considered as being in two distinct categories: those that use empirical methods to parameterize the nature of the data for subsequent analysis, and those that use model functions to link the observation directly to a specific aspect of the experiment. We consider a multifaceted approach to instrument calibration using both the empirical and model based data analysis methods. The particular benefits of the fundamental parameters approach are reviewed.

  16. MPI Runtime Error Detection with MUST: Advances in Deadlock Detection

    DOE PAGES

    Hilbrich, Tobias; Protze, Joachim; Schulz, Martin; ...

    2013-01-01

    The widely used Message Passing Interface (MPI) is complex and rich. As a result, application developers require automated tools to avoid and to detect MPI programming errors. We present the Marmot Umpire Scalable Tool (MUST) that detects such errors with significantly increased scalability. We present improvements to our graph-based deadlock detection approach for MPI, which cover future MPI extensions. Our enhancements also check complex MPI constructs that no previous graph-based detection approach handled correctly. Finally, we present optimizations for the processing of MPI operations that reduce runtime deadlock detection overheads. Existing approaches often require O(p) analysis time per MPI operation, for p processes. We empirically observe that our improvements lead to sub-linear or better analysis time per operation for a wide range of real world applications.
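
    The following sketch illustrates the basic idea behind graph-based deadlock detection: model blocked processes as a wait-for graph and search it for cycles. MUST's actual analysis (with AND/OR wait semantics for MPI operations) is considerably richer; the graph, function name, and example ranks here are hypothetical.

    ```python
    # Minimal cycle detection on a wait-for graph, as a conceptual
    # illustration of graph-based deadlock detection; MUST's real model
    # (with AND/OR wait semantics for MPI) is more elaborate.
    def has_deadlock(waits_for):
        """waits_for: dict mapping a process to the processes it waits on."""
        WHITE, GRAY, BLACK = 0, 1, 2
        color = {p: WHITE for p in waits_for}

        def dfs(p):
            color[p] = GRAY
            for q in waits_for.get(p, ()):
                if color.get(q, WHITE) == GRAY:   # back edge -> cycle
                    return True
                if color.get(q, WHITE) == WHITE and dfs(q):
                    return True
            color[p] = BLACK
            return False

        return any(color[p] == WHITE and dfs(p) for p in list(waits_for))

    # Hypothetical example: ranks 0 and 1 blocked on each other's receives.
    print(has_deadlock({0: [1], 1: [0], 2: []}))  # True
    ```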

  17. Mediation analysis in nursing research: a methodological review.

    PubMed

    Liu, Jianghong; Ulrich, Connie

    2016-12-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
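
    A minimal sketch of one common mediation technique, the product-of-coefficients approach with a Sobel test, on simulated data; the variable names and effect sizes are illustrative assumptions, not drawn from the tutorial.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Simulated data: X -> M -> Y plus a direct X -> Y path.
    rng = np.random.default_rng(0)
    n = 500
    X = rng.normal(size=n)
    M = 0.5 * X + rng.normal(size=n)               # mediator model
    Y = 0.3 * M + 0.2 * X + rng.normal(size=n)

    a = sm.OLS(M, sm.add_constant(X)).fit()                        # X -> M
    b = sm.OLS(Y, sm.add_constant(np.column_stack([M, X]))).fit()  # M, X -> Y

    a_hat, b_hat = a.params[1], b.params[1]
    indirect = a_hat * b_hat
    # Sobel standard error for the indirect effect a*b
    se = np.sqrt(a_hat**2 * b.bse[1]**2 + b_hat**2 * a.bse[1]**2)
    print(f"indirect effect = {indirect:.3f}, z = {indirect / se:.2f}")
    ```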

  18. A Physically Motivated and Empirically Calibrated Method to Measure the Effective Temperature, Metallicity, and Ti Abundance of M Dwarfs

    NASA Astrophysics Data System (ADS)

    Veyette, Mark J.; Muirhead, Philip S.; Mann, Andrew W.; Brewer, John M.; Allard, France; Homeier, Derek

    2017-12-01

    The ability to perform detailed chemical analysis of Sun-like F-, G-, and K-type stars is a powerful tool with many applications, including studying the chemical evolution of the Galaxy and constraining planet formation theories. Unfortunately, complications in modeling cooler stellar atmospheres hinder similar analyses of M dwarf stars. Empirically calibrated methods to measure M dwarf metallicity from moderate-resolution spectra are currently limited to measuring overall metallicity and rely on astrophysical abundance correlations in stellar populations. We present a new, empirical calibration of synthetic M dwarf spectra that can be used to infer effective temperature, Fe abundance, and Ti abundance. We obtained high-resolution (R ˜ 25,000), Y-band (˜1 μm) spectra of 29 M dwarfs with NIRSPEC on Keck II. Using the PHOENIX stellar atmosphere modeling code (version 15.5), we generated a grid of synthetic spectra covering a range of temperatures, metallicities, and alpha-enhancements. From our observed and synthetic spectra, we measured the equivalent widths of multiple Fe I and Ti I lines and a temperature-sensitive index based on the FeH band head. We used abundances measured from widely separated solar-type companions to empirically calibrate transformations to the observed indices and equivalent widths that force agreement with the models. Our calibration achieves precisions in T eff, [Fe/H], and [Ti/Fe] of 60 K, 0.1 dex, and 0.05 dex, respectively, and is calibrated for 3200 K < T eff < 4100 K, -0.7 < [Fe/H] < +0.3, and -0.05 < [Ti/Fe] < +0.3. This work is a step toward detailed chemical analysis of M dwarfs at a precision similar to what has been achieved for FGK stars.

  19. Development, verification, and application of a simplified method to estimate total-streambed scour at bridge sites in Illinois

    USGS Publications Warehouse

    Holmes, Robert R.; Dunn, Chad J.

    1996-01-01

    A simplified method to estimate total-streambed scour was developed for application to bridges in the State of Illinois. Scour envelope curves, developed as empirical relations between calculated total scour and bridge-site characteristics for 213 State highway bridges in Illinois, are used in the method to estimate the 500-year flood scour. These 213 bridges, geographically distributed throughout Illinois, had been previously evaluated for streambed scour with the application of conventional hydraulic and scour-analysis methods recommended by the Federal Highway Administration. The bridge characteristics necessary for application of the simplified bridge scour-analysis method can be obtained from an office review of bridge plans, examination of topographic maps, and reconnaissance-level site inspection. The estimates computed with the simplified method generally resulted in a larger value of 500-year flood total-streambed scour than with the more detailed conventional method. The simplified method was successfully verified with a separate data set of 106 State highway bridges, which are geographically distributed throughout Illinois, and 15 county highway bridges.

  20. The Use of Empirical Studies in the Development of High End Computing Applications

    DTIC Science & Technology

    2009-12-01

    34, Proceedings of the 5th ACM-IEEE International Symposium on Empirical Software Engineering (ISESE'06), Rio de Janeiro, Brazil, September, 2006. 8. Jeffrey C... Symposium on Empirical Software Engineering (ISESE), Rio de Janeiro, September, 2006. [26] Zelkowitz M., V. Basili, S. Asgari, L. Hochstein, J... data is consistently collected across studies. 4. Sanitization of sensitive data. The framework provides external researchers with access to the

  1. From empirical Bayes to full Bayes : methods for analyzing traffic safety data.

    DOT National Transportation Integrated Search

    2004-10-24

    Traffic safety engineers are among the early adopters of Bayesian statistical tools for analyzing crash data. As in many other areas of application, empirical Bayes methods were their first choice, perhaps because they represent an intuitively ap...

  2. Empirical Bayes Approaches to Multivariate Fuzzy Partitions.

    ERIC Educational Resources Information Center

    Woodbury, Max A.; Manton, Kenneth G.

    1991-01-01

    An empirical Bayes-maximum likelihood estimation procedure is presented for the application of fuzzy partition models in describing high dimensional discrete response data. The model describes individuals in terms of partial membership in multiple latent categories that represent bounded discrete spaces. (SLD)

  3. Application of household production theory to selected natural-resource problems in less-developed countries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mercer, D.E.

    The objectives are threefold: (1) to perform an analytical survey of household production theory as it relates to natural-resource problems in less-developed countries, (2) to develop a household production model of fuelwood decision making, (3) to derive a theoretical framework for travel-cost demand studies of international nature tourism. The model of household fuelwood decision making provides a rich array of implications and predictions for empirical analysis. For example, it is shown that fuelwood and modern fuels may be either substitutes or complements depending on the interaction of the gross-substitution and income-expansion effects. Therefore, empirical analysis should precede adoption of any inter-fuel substitution policies such as subsidizing kerosene. The fuelwood model also provides a framework for analyzing the conditions and factors determining entry and exit by households into the wood-burning subpopulation, a key for designing optimal household energy policies in the Third World. The international nature tourism travel cost model predicts that the demand for nature tourism is an aggregate of the demand for the individual activities undertaken during the trip.

  4. Mean structure analysis from an IRT approach: an application in the context of organizational psychology.

    PubMed

    Revuelta Menéndez, Javier; Ximénez Gómez, Carmen

    2012-11-01

    The application of mean and covariance structure analysis with quantitative data is increasing. However, latent means analysis with qualitative data is not as widespread. This article summarizes the procedures to conduct an analysis of latent means of dichotomous data from an item response theory approach. We illustrate the implementation of these procedures in an empirical example referring to the organizational context, where a multi-group analysis was conducted to compare the latent means of three employee groups in two factors measuring personal preferences and the perceived degree of rewards from the organization. Results show that higher personal motivations are associated with higher perceived importance of the organization, and that these perceptions differ across groups, so that higher-level employees have a lower level of personal and perceived motivation. The article shows how to estimate the factor means and the factor correlation from dichotomous data, and how to assess goodness of fit. Lastly, we provide the M-Plus syntax code in order to facilitate the latent means analyses for applied researchers.

  5. Estimation of trends

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The application of statistical methods to recorded ozone measurements is described. A long-term depletion of ozone at the magnitudes predicted by the NAS would be harmful to most forms of life. Empirical prewhitening filters, whose derivation is independent of the underlying physical mechanisms, were analyzed. Statistical analysis provides a checks-and-balances function: time series filtering separates variations into systematic and random parts, ensures errors are uncorrelated, and identifies significant phase-lag dependencies. The use of time series modeling to enhance the capability of detecting trends is discussed.
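
    A minimal sketch of one common empirical prewhitening step, removing AR(1) serial correlation before estimating a linear trend; the AR coefficient, trend size, and series length are illustrative assumptions, not the report's filters.

    ```python
    import numpy as np

    # Minimal AR(1) prewhitening before trend estimation: remove serial
    # correlation so that a least-squares trend test has honest errors.
    rng = np.random.default_rng(6)
    n = 240                                  # e.g., 20 years of monthly data
    t = np.arange(n)
    noise = np.zeros(n)
    for i in range(1, n):                    # AR(1) noise, phi = 0.6
        noise[i] = 0.6 * noise[i - 1] + rng.standard_normal()
    y = -0.005 * t + noise                   # weak negative trend

    phi = np.corrcoef(y[:-1], y[1:])[0, 1]   # lag-1 autocorrelation estimate
    yw = y[1:] - phi * y[:-1]                # prewhitened series
    tw = t[1:] - phi * t[:-1]                # same filter applied to time
    slope = np.polyfit(tw, yw, 1)[0]
    print(f"estimated phi = {phi:.2f}, trend per step = {slope:.4f}")
    ```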

  6. Multiscale Shannon entropy and its application in the stock market

    NASA Astrophysics Data System (ADS)

    Gu, Rongbao

    2017-10-01

    In this paper, we perform a multiscale entropy analysis on the Dow Jones Industrial Average Index using the Shannon entropy. The stock index shows the characteristic of multiscale entropy that is caused by noise in the market. The entropy is demonstrated to have significant predictive ability for the stock index over both the long term and the short term, and empirical results verify that noise does exist in the market and can affect the stock price. This has important implications for market participants such as noise traders.
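
    A minimal sketch of a multiscale Shannon entropy computation, coarse-graining a series by non-overlapping averaging and taking the histogram entropy at each scale; the binning, scales, and synthetic 'returns' are assumptions, not the paper's exact procedure.

    ```python
    import numpy as np

    def shannon_entropy(x, bins=10):
        """Shannon entropy (nats) of a histogram-discretized series."""
        counts, _ = np.histogram(x, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log(p))

    def multiscale_entropy(x, scales=(1, 2, 4, 8), bins=10):
        """Coarse-grain x by non-overlapping averaging, then take entropy."""
        out = {}
        for s in scales:
            n = len(x) // s
            coarse = x[: n * s].reshape(n, s).mean(axis=1)
            out[s] = shannon_entropy(coarse, bins=bins)
        return out

    # Synthetic 'returns' standing in for the index series used in the paper.
    rng = np.random.default_rng(1)
    returns = rng.standard_normal(4096)
    print(multiscale_entropy(returns))
    ```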

  7. Evaluation of quasi-square wave inverter as a power source for induction motors

    NASA Technical Reports Server (NTRS)

    Guynes, B. V.; Haggard, R. L.; Lanier, J. R., Jr.

    1977-01-01

    The relative merits of quasi-square wave inverter-motor technology versus a sine wave inverter-motor system were investigated. The empirical results of several tests on various sizes of wye-wound induction motors are presented with mathematical analysis to support the conclusions of the study. It was concluded that, within the limitations presented, the quasi-square wave inverter-motor system is superior to the more complex sine wave system for most induction motor applications in space.

  8. An empirical approach for estimating stress-coupling lengths for marine-terminating glaciers

    USGS Publications Warehouse

    Enderlin, Ellyn; Hamilton, Gordon S.; O'Neel, Shad; Bartholomaus, Timothy C.; Morlighem, Mathieu; Holt, John W.

    2016-01-01

    Here we present a new empirical method to estimate the SCL for marine-terminating glaciers using high-resolution observations. We use the empirically-determined periodicity in resistive stress oscillations as a proxy for the SCL. Application of our empirical method to two well-studied tidewater glaciers (Helheim Glacier, SE Greenland, and Columbia Glacier, Alaska, USA) demonstrates that SCL estimates obtained using this approach are consistent with theory (i.e., can be parameterized as a function of the ice thickness) and with prior, independent SCL estimates. In order to accurately resolve stress variations, we suggest that similar empirical stress-coupling parameterizations be employed in future analyses of glacier dynamics.

  9. Empirical studies on usability of mHealth apps: a systematic literature review.

    PubMed

    Zapata, Belén Cruz; Fernández-Alemán, José Luis; Idri, Ali; Toval, Ambrosio

    2015-02-01

    The release of smartphones and tablets, which offer more advanced communication and computing capabilities, has led to the strong emergence of mHealth on the market. mHealth systems are being used to improve patients' lives and their health, in addition to facilitating communication between doctors and patients. Researchers are now proposing mHealth applications for many health conditions such as dementia, autism, dysarthria, Parkinson's disease, and so on. Usability becomes a key factor in the adoption of these applications, which are often used by people who have problems when using mobile devices and who have a limited experience of technology. The aim of this paper is to investigate the empirical usability evaluation processes described in a total of 22 selected studies related to mHealth applications by means of a Systematic Literature Review. Our results show that the empirical evaluation methods employed as regards usability could be improved by the adoption of automated mechanisms. The evaluation processes should also be revised to combine more than one method. This paper will help researchers and developers to create more usable applications. Our study demonstrates the importance of adapting health applications to users' needs.

  10. [An EMD based time-frequency distribution and its application in EEG analysis].

    PubMed

    Li, Xiaobing; Chu, Meng; Qiu, Tianshuang; Bao, Haiping

    2007-10-01

    Hilbert-Huang transform (HHT) is a new time-frequency analytic method to analyze the nonlinear and the non-stationary signals. The key step of this method is the empirical mode decomposition (EMD), with which any complicated signal can be decomposed into a finite and small number of intrinsic mode functions (IMF). In this paper, a new EMD based method for suppressing the cross-term of Wigner-Ville distribution (WVD) is developed and is applied to analyze the epileptic EEG signals. The simulation data and analysis results show that the new method suppresses the cross-term of the WVD effectively with an excellent resolution.
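
    A minimal EMD sketch on a toy signal, assuming the third-party PyEMD package (distributed on PyPI as EMD-signal) and its EMD class; the signal and parameters are illustrative, and the cross-term suppression step on the resulting IMFs is only indicated in a comment.

    ```python
    import numpy as np
    from PyEMD import EMD   # third-party package, on PyPI as 'EMD-signal' (API assumed)

    # Toy signal standing in for an EEG trace: two tones plus a slow drift.
    t = np.linspace(0, 1, 1000)
    s = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t) + t

    imfs = EMD().emd(s)   # intrinsic mode functions, fastest oscillations first
    print("number of IMFs:", imfs.shape[0])
    # Each IMF can then be analyzed separately (e.g., with the WVD), so that
    # cross-terms between well-separated components are avoided.
    ```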

  11. Boundary layers in centrifugal compressors. [application of boundary layer theory to compressor design

    NASA Technical Reports Server (NTRS)

    Dean, R. C., Jr.

    1974-01-01

    The utility of boundary-layer theory in the design of centrifugal compressors is demonstrated. Boundary-layer development in the diffuser entry region is shown to be important to stage efficiency. The result of an earnest attempt to analyze this boundary layer with the best tools available is displayed. Acceptable prediction accuracy was not achieved. The inaccuracy of boundary-layer analysis in this case would result in stage efficiency prediction as much as four points low. Fluid dynamic reasons for analysis failure are discussed with support from flow data. Empirical correlations used today to circumvent the weaknesses of the theory are illustrated.

  12. A Poisson process approximation for generalized K-5 confidence regions

    NASA Technical Reports Server (NTRS)

    Arsham, H.; Miller, D. R.

    1982-01-01

    One-sided confidence regions for continuous cumulative distribution functions are constructed using empirical cumulative distribution functions and the generalized Kolmogorov-Smirnov distance. The band width of such regions becomes narrower in the right or left tail of the distribution. To avoid tedious computation of confidence levels and critical values, an approximation based on the Poisson process is introduced. This approximation provides a conservative confidence region; moreover, the approximation error decreases monotonically to 0 as sample size increases. Critical values necessary for implementation are given. Applications are made to the areas of risk analysis, investment modeling, reliability assessment, and analysis of fault tolerant systems.
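
    For illustration, the sketch below builds a one-sided confidence band from an empirical CDF using the classical one-sided Dvoretzky-Kiefer-Wolfowitz bound, a deliberately simpler stand-in for the paper's generalized distance and Poisson-process approximation; the sample size and data are synthetic.

    ```python
    import numpy as np

    # One-sided lower confidence band for an unknown CDF F, built from the
    # empirical CDF with the one-sided Dvoretzky-Kiefer-Wolfowitz bound.
    rng = np.random.default_rng(2)
    x = np.sort(rng.exponential(size=200))
    n = len(x)
    ecdf = np.arange(1, n + 1) / n

    alpha = 0.05
    margin = np.sqrt(np.log(1 / alpha) / (2 * n))   # one-sided DKW margin
    lower_band = np.clip(ecdf - margin, 0.0, 1.0)

    # With confidence 1 - alpha, F(x[i]) >= lower_band[i] for all i.
    print(f"n={n}, margin={margin:.4f}, band at sample median: {lower_band[n // 2]:.3f}")
    ```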

  13. Laser fringe anemometry for aero engine components

    NASA Technical Reports Server (NTRS)

    Strazisar, A. J.

    1986-01-01

    Advances in flow measurement techniques in turbomachinery continue to be paced by the need to obtain detailed data for use in validating numerical predictions of the flowfield and for use in the development of empirical models for those flow features which cannot be readily modelled numerically. The use of laser anemometry in turbomachinery research has grown over the last 14 years in response to these needs. Based on past applications and current developments, this paper reviews the key issues which are involved when considering the application of laser anemometry to the measurement of turbomachinery flowfields. Aspects of laser fringe anemometer optical design which are applicable to turbomachinery research are briefly reviewed. Application problems which are common to both laser fringe anemometry (LFA) and laser transit anemometry (LTA) such as seed particle injection, optical access to the flowfield, and measurement of rotor rotational position are covered. The efficiency of various data acquisition schemes is analyzed and issues related to data integrity and error estimation are addressed. Real-time data analysis techniques aimed at capturing flow physics in real time are discussed. Finally, data reduction and analysis techniques are discussed and illustrated using examples taken from several LFA turbomachinery applications.

  14. Calculation of free turbulent mixing by interaction approach.

    NASA Technical Reports Server (NTRS)

    Morel, T.; Torda, T. P.

    1973-01-01

    The applicability of Bradshaw's interaction hypothesis to two-dimensional free shear flows was investigated. According to it, flows with velocity extrema may be considered to consist of several interacting layers. The hypothesis leads to a new expression for the shear stress which removes the usual restriction that shear stress vanishes at the velocity extremum. The approach is based on kinetic energy and the length scale equations. The compressible flow equations are simplified by restriction to low Mach numbers, and the range of their applicability is discussed. The empirical functions of the turbulence model are found here to be correlated with the spreading rate of the shear layer. The analysis demonstrates that the interaction hypothesis is a workable concept.

  15. Analysis of the local structure around Cr3+ centers in perovskite KMgF3 using both ab initio (DFT) and semi-empirical (SPM) calculations

    NASA Astrophysics Data System (ADS)

    Emül, Y.; Erbahar, D.; Açıkgöz, M.

    2014-11-01

    The local structure around Cr3+ centers in perovskite KMgF3 crystal has been investigated through the application of both ab initio density functional theory (DFT) and semi-empirical superposition model (SPM) analyses. A supercell approach is used for DFT calculations. All the tetragonal (Cr3+-VMg and Cr3+-Li+), trigonal (Cr3+-VK), and CrF5O cluster centers have been considered with various structural models based on previously suggested experimental inferences. The significant structural changes around the Cr3+ centers induced by Mg2+ or K+ vacancies and the Li substitution at those vacancy sites have been determined and discussed by means of charge distribution. This study provides insight into both the roles of Mg2+ and K+ vacancies and Li+ ions in the local structural properties around Cr3+ centers in KMgF3.

  16. Emergence, evolution and scaling of online social networks.

    PubMed

    Wang, Le-Zhi; Huang, Zi-Gang; Rong, Zhi-Hai; Wang, Xiao-Fan; Lai, Ying-Cheng

    2014-01-01

    Online social networks have become increasingly ubiquitous and understanding their structural, dynamical, and scaling properties not only is of fundamental interest but also has a broad range of applications. Such networks can be extremely dynamic, generated almost instantaneously by, for example, breaking-news items. We investigate a common class of online social networks, the user-user retweeting networks, by analyzing the empirical data collected from Sina Weibo (a massive twitter-like microblogging social network in China) with respect to the topic of the 2011 Japan earthquake. We uncover a number of algebraic scaling relations governing the growth and structure of the network and develop a probabilistic model that captures the basic dynamical features of the system. The model is capable of reproducing all the empirical results. Our analysis not only reveals the basic mechanisms underlying the dynamics of the retweeting networks, but also provides general insights into the control of information spreading on such networks.

  17. Drainage investment and wetland loss: an analysis of the national resources inventory data

    USGS Publications Warehouse

    Douglas, Aaron J.; Johnson, Richard L.

    1994-01-01

    The United States Soil Conservation Service (SCS) conducts a survey for the purpose of establishing an agricultural land use database. This survey is called the National Resources Inventory (NRI) database. The complex NRI land classification system, in conjunction with the quantitative information gathered by the survey, has numerous applications. The current paper uses the wetland area data gathered by the NRI in 1982 and 1987 to examine empirically the factors that generate wetland loss in the United States. The cross-section regression models listed here use the quantity of wetlands, the stock of drainage capital, the realty value of farmland and drainage costs to explain most of the cross-state variation in wetland loss rates. Wetlands preservation efforts by federal agencies assume that pecuniary economic factors play a decisive role in wetland drainage. The empirical models tested in the present paper validate this assumption.

  18. Qgui: A high-throughput interface for automated setup and analysis of free energy calculations and empirical valence bond simulations in biological systems.

    PubMed

    Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav

    2015-07-01

    Structural information and activity data have increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present calculation of high-precision Arrhenius plots to obtain the thermodynamic activation enthalpy and entropy with Qgui from running a large number of EVB simulations. Copyright © 2015 Elsevier Inc. All rights reserved.
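
    As a flavor of the free energy estimates such a tool automates, here is a minimal linear interaction energy (LIE) calculation; the coefficients and the ensemble-average energies are hypothetical placeholders, not Qgui output.

    ```python
    # Linear interaction energy (LIE) estimate of binding free energy from
    # MD ensemble averages, the kind of calculation Qgui automates. The
    # coefficients and energies below are illustrative placeholders.
    alpha, beta = 0.18, 0.5               # common empirical LIE coefficients
    vdw_bound, vdw_free = -35.2, -21.7    # <V_vdW> (kcal/mol), hypothetical
    el_bound, el_free = -12.4, -9.1       # <V_el>  (kcal/mol), hypothetical

    dG_bind = alpha * (vdw_bound - vdw_free) + beta * (el_bound - el_free)
    print(f"LIE binding free energy estimate: {dG_bind:.1f} kcal/mol")
    ```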

  19. Communication: Charge-population based dispersion interactions for molecules and materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stöhr, Martin; Department Chemie, Technische Universität München, Lichtenbergstr. 4, D-85748 Garching; Michelitsch, Georg S.

    2016-04-21

    We introduce a system-independent method to derive effective atomic C6 coefficients and polarizabilities in molecules and materials purely from charge population analysis. This enables the use of dispersion-correction schemes in electronic structure calculations without recourse to electron-density partitioning schemes and expands their applicability to semi-empirical methods and tight-binding Hamiltonians. We show that the accuracy of our method is on par with established electron-density partitioning based approaches in describing intermolecular C6 coefficients as well as dispersion energies of weakly bound molecular dimers, organic crystals, and supramolecular complexes. We showcase the utility of our approach by incorporating the recently developed many-body dispersion method [Tkatchenko et al., Phys. Rev. Lett. 108, 236402 (2012)] into the semi-empirical density functional tight-binding method and propose the latter as a viable technique to study hybrid organic-inorganic interfaces.

  20. Analysis of copper and brass coins of the early roman empire.

    PubMed

    Carter, G F

    1966-01-14

    X-ray fluorescence analysis of 14 copper and brass coins of the early Roman Empire shows differences in composition between coins minted in Rome and in France. Concentrations of tin, lead, and antimony are nearly always less than in coins minted before 29 B.C. or after 54 A.D. Older coins were not melted to make copper coins of the early empire.

  1. Assessing differential expression in two-color microarrays: a resampling-based empirical Bayes approach.

    PubMed

    Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D

    2013-01-01

    Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, the method's fold change criteria are problematic and can critically alter the conclusion of a study, as a result of compositional changes of the control data set in the analysis. We propose a novel approach, combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold change threshold, since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between each approach are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next-generation sequencing RNA-seq data analysis.
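
    A conceptual sketch of the resampling ingredient: permutation-based p-values for gene-wise statistics followed by Benjamini-Hochberg FDR control. The authors' method additionally moderates variances with empirical Bayes, which is not reproduced here; all data are simulated.

    ```python
    import numpy as np

    # Permutation null for gene-wise two-sample t-statistics, then BH FDR.
    rng = np.random.default_rng(3)
    genes, n1, n2 = 1000, 8, 8
    data = rng.standard_normal((genes, n1 + n2))
    data[:50, :n1] += 1.5                      # 50 truly shifted genes

    def tstats(d):
        g1, g2 = d[:, :n1], d[:, n1:]
        se = np.sqrt(g1.var(axis=1, ddof=1) / n1 + g2.var(axis=1, ddof=1) / n2)
        return (g1.mean(axis=1) - g2.mean(axis=1)) / se

    obs = tstats(data)
    B = 200
    null = np.empty((B, genes))
    for b in range(B):
        perm = rng.permutation(n1 + n2)        # shuffle group labels
        null[b] = tstats(data[:, perm])

    pvals = (np.abs(null) >= np.abs(obs)).mean(axis=0)
    # Benjamini-Hochberg step-up at FDR 0.05
    order = np.argsort(pvals)
    thresh = 0.05 * np.arange(1, genes + 1) / genes
    passed = pvals[order] <= thresh
    n_sig = passed.nonzero()[0].max() + 1 if passed.any() else 0
    print("genes called significant:", n_sig)
    ```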

  2. Prediction and verification of creep behavior in metallic materials and components, for the space shuttle thermal protection system. Volume 1, phase 1: Cyclic materials creep predictions

    NASA Technical Reports Server (NTRS)

    Davis, J. W.; Cramer, B. A.

    1974-01-01

    Cyclic creep response was investigated and design methods applicable to thermal protection system structures were developed. The steady-state (constant temperature and load) and cyclic creep response characteristics of four alloys were studied. Steady-state creep data were gathered through a literature survey to establish reference data bases. These data bases were used to develop empirical equations describing creep as a function of time, temperature, and stress and as a basis of comparison for test data. Steady-state creep tests and tensile cyclic tests were conducted. The following factors were investigated: material thickness and rolling direction; material cyclic creep response under varying loads and temperatures; constant stress and temperature cycles representing flight conditions; changing stresses present in a creeping beam as a result of stress redistribution; and complex stress and temperature profiles representative of space shuttle orbiter trajectories. A computer program was written, applying creep hardening theories and empirical equations for creep, to aid in analysis of test data. Results are considered applicable to a variety of structures which are cyclicly exposed to creep producing thermal environments.

  3. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needs to be conducted. In this study, the practical workflow of ERLPS was explicitly illustrated; different experimental variables, such as different MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts, and linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and in quantitative methods from label-free to O18/O16-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Optimal bioprocess design through a gene regulatory network - growth kinetic hybrid model: Towards Replacing Monod kinetics.

    PubMed

    Tsipa, Argyro; Koutinas, Michalis; Usaku, Chonlatep; Mantalaris, Athanasios

    2018-05-02

    Currently, design and optimisation of biotechnological bioprocesses is performed either through exhaustive experimentation and/or with the use of empirical, unstructured growth kinetics models. Although elaborate systems biology approaches have recently been explored, mixed-substrate utilisation is predominantly ignored despite its significance in enhancing bioprocess performance. Herein, bioprocess optimisation for an industrially-relevant bioremediation process involving a mixture of highly toxic substrates, m-xylene and toluene, was achieved through application of a novel experimental-modelling gene regulatory network - growth kinetic (GRN-GK) hybrid framework. The GRN model described the TOL and ortho-cleavage pathways in Pseudomonas putida mt-2 and captured the transcriptional kinetics expression patterns of the promoters. The GRN model informed the formulation of the growth kinetics model, replacing the empirical and unstructured Monod kinetics. The GRN-GK framework's predictive capability and potential as a systematic optimal bioprocess design tool was demonstrated by effectively predicting bioprocess performance, which was in agreement with experimental values, when compared to four commonly used models that deviated significantly from the experimental values. Significantly, a fed-batch biodegradation process was designed and optimised through the model-based control of TOL Pr promoter expression, resulting in 61% and 60% enhanced pollutant removal and biomass formation, respectively, compared to the batch process. This provides strong evidence of model-based bioprocess optimisation at the gene level, rendering the GRN-GK framework a novel and applicable approach to optimal bioprocess design. Finally, model analysis using global sensitivity analysis (GSA) suggests an alternative, systematic approach for model-driven strain modification for synthetic biology and metabolic engineering applications. Copyright © 2018. Published by Elsevier Inc.
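
    For contrast with the GRN-informed model, the sketch below integrates the classical unstructured Monod kinetics for batch growth that the paper aims to replace; the parameter values and initial conditions are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Classical (empirical, unstructured) Monod kinetics for batch growth,
    # the baseline the GRN-GK hybrid framework is designed to replace.
    mu_max, Ks, Yxs = 0.5, 0.2, 0.4   # illustrative parameters (1/h, g/L, g/g)

    def monod(t, z):
        X, S = z                               # biomass, substrate (g/L)
        mu = mu_max * S / (Ks + S)
        return [mu * X, -mu * X / Yxs]

    sol = solve_ivp(monod, (0.0, 24.0), [0.05, 10.0], dense_output=True)
    X_end, S_end = sol.y[:, -1]
    print(f"biomass after 24 h: {X_end:.2f} g/L, residual substrate: {S_end:.2f} g/L")
    ```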

  5. Concurrent enterprise: a conceptual framework for enterprise supply-chain network activities

    NASA Astrophysics Data System (ADS)

    Addo-Tenkorang, Richard; Helo, Petri T.; Kantola, Jussi

    2017-04-01

    Supply-chain management (SCM) in manufacturing industries has evolved significantly over the years. Recently, much more relevant research has picked up on the development of integrated solutions that seek a collaborative optimisation of the geographical, just-in-time (JIT), quality (customer demand/satisfaction) and return-on-investment (profit) aspects of organisational management and planning through 'best practice' business-process management concepts and applications, employing system tools such as certain applications/aspects of enterprise resource planning (ERP) - SCM systems information technology (IT) enablers to enhance enterprise integrated product development/concurrent engineering principles. This article draws on three main organisation theory applications in positioning its assumptions. It proposes a feasible industry-specific framework not currently included within the SCOR model's level four (4) implementation level, or within other existing SCM integration reference models such as the MIT process handbook's Process Interchange Format (PIF), the TOVE project, etc., which could also be replicated in other SCs. However, the wider focus of this paper's contribution is concentrated on a framework complementary to the SCC's SCOR reference model. Quantitative empirical closed-ended questionnaires, in addition to the main data collected from a qualitative empirical real-life industrial-based pilot case study, were used to propose a conceptual concurrent enterprise framework for SCM network activities. This research adopts a design structure matrix simulation analysis approach to propose an optimal enterprise SCM-networked value-adding, customised master data-management platform/portal for efficient SCM network information exchange and an effective supply-chain (SC) network systems-design teams' structure. Furthermore, social network theory analysis is employed in a triangulation approach with statistical correlation analysis to assess the frequency, importance, level of collaborativeness, mutual trust, and roles and responsibilities within the enterprise SCM network for systems product development (PD) design teams' technical communication, alongside extensive literature reviews.

  6. Correlation of refrigerant mass flow rate through adiabatic capillary tubes using mixture refrigerant carbondioxide and ethane for low temperature applications

    NASA Astrophysics Data System (ADS)

    Nasruddin, Syaka, Darwin R. B.; Alhamid, M. Idrus

    2012-06-01

    Various binary mixtures of carbon dioxide and hydrocarbons, especially propane or ethane, as alternative natural refrigerants to chlorofluorocarbons (CFCs) or hydrofluorocarbons (HFCs) are presented in this paper. Their environmental performance is friendly, with an ozone depletion potential (ODP) of zero and a global-warming potential (GWP) smaller than 20. The capillary tube performance for alternative HFC and HC refrigerants and mixed refrigerants has been widely studied. However, studies that discuss the performance of the capillary tube with a mixture of natural refrigerants, in particular the azeotropic mixture of carbon dioxide and ethane, are still lacking. An empirical correlation method for determining the mass flow rate and pipe length has an important role in the design of capillary tubes for industrial refrigeration. Based on the variables that affect the rate of mass flow of refrigerant in the capillary tube, the Buckingham Pi theorem was used to formulate eight non-dimensional parameters to be developed into an empirical correlation. Furthermore, non-linear regression analysis was used to determine the coefficients and exponents of this empirical correlation, based on a database of experimental results.
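
    A minimal sketch of the final regression step: fitting a power-law correlation over dimensionless (Buckingham Pi) groups with non-linear least squares. The two groups, the exponents, and the data are synthetic placeholders; the study works with eight groups derived from capillary-tube experiments.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Power-law correlation over dimensionless groups (Buckingham Pi style):
    # pi_0 = C * pi_1**a * pi_2**b. Groups and data here are synthetic.
    def correlation(P, C, a, b):
        pi1, pi2 = P
        return C * pi1**a * pi2**b

    rng = np.random.default_rng(4)
    pi1 = rng.uniform(1e3, 1e5, 40)     # e.g., a Reynolds-like group
    pi2 = rng.uniform(0.5, 2.0, 40)     # e.g., a geometry ratio
    pi0 = 0.02 * pi1**0.8 * pi2**-0.5 * rng.lognormal(0, 0.05, 40)

    popt, _ = curve_fit(correlation, (pi1, pi2), pi0, p0=[0.01, 1.0, -1.0])
    print("C, a, b =", popt)
    ```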

  7. Contribution of economic evaluation to decision making in early phases of product development: a methodological and empirical review.

    PubMed

    Hartz, Susanne; John, Jürgen

    2008-01-01

    Economic evaluation as an integral part of health technology assessment is today mostly applied to established technologies. Evaluating healthcare innovations in their early stages of development has recently attracted attention. Although it offers several benefits, it also holds methodological challenges. The aim of our study was to investigate the possible contributions of economic evaluation to industry's decision making early in product development and to confront the results with the actual use of early data in economic assessments. We conducted a literature search to identify methodological contributions as well as economic evaluations that used data from early phases of product development. Economic analysis can be beneficially used in early phases of product development for various purposes including early market assessment, R&D portfolio management, and first estimations of pricing and reimbursement scenarios. Analytical tools available for these purposes have been identified. Numerous empirical works were detected, but most do not disclose any concrete decision context and could not be directly matched with the suggested applications. Industry can benefit from starting economic evaluation early in product development in several ways. Empirical evidence suggests that there is still potential left unused.

  8. An Investigation of Document Partitions.

    ERIC Educational Resources Information Center

    Shaw, W. M., Jr.

    1986-01-01

    Empirical significance of document partitions is investigated as a function of index term-weight and similarity thresholds. Results show the same empirically preferred partitions can be detected by two independent strategies: an analysis of cluster-based retrieval and an analysis of regularities in the underlying structure of the document…

  10. Analysis of the UCSF Symptom Management Theory: Implications for Pediatric Oncology Nursing

    PubMed Central

    Linder, Lauri

    2015-01-01

    Symptom management research is a priority for both children and adults with cancer. The UCSF Symptom Management Theory (SMT) is a middle range theory depicting symptom management as a multidimensional process. A theory analysis using the process described by Walker and Avant evaluated the SMT with attention to application in research involving children with cancer. Application of the SMT in studies involving children has been limited to descriptive studies testing only portions of the theory. Findings of these studies have provided empiric support for the relationships proposed within the SMT. Considerations for future research involving children include attention to measurement of symptoms and clarity regarding the location of the parents and family within the model. With additional testing and refinement, the SMT has the potential to guide nursing research and practice to improve symptoms for children with cancer. PMID:20639345

  11. Prevention concept in industry: improvement in occupational safety and health protection--an empirical study.

    PubMed

    Ramsauer, F

    2001-12-01

    This prevention concept offers a contribution to the expansion of the set of instruments for occupational safety and health protection within workplace prevention. The concept involves the multilateral analysis of work conditions. The utilized instruments include a strategy group, a survey, a health issue round table, and an analysis of work demands, and lead to synergy effects at the results level. Employees are drawn into the analysis of work conditions and workplace design solutions for the improvement of the work situation. The prevention concept was tested in a large company and its application established in practice. It was accepted by all participants, and the comparison with the previous situation (defined only through the analysis of work demands) demonstrated a significant improvement in health protection.

  13. 'Feel the Feeling': Psychological practitioners' experience of acceptance and commitment therapy well-being training in the workplace.

    PubMed

    Wardley, Matt Nj; Flaxman, Paul E; Willig, Carla; Gillanders, David

    2016-08-01

    This empirical study investigates psychological practitioners' experience of worksite training in acceptance and commitment therapy using an interpretative phenomenological analysis methodology. Semi-structured interviews were conducted with eight participants, and three themes emerged from the analysis: influence of previous experiences, self and others, and impact and application. The significance of the experiential nature of the acceptance and commitment therapy training is explored, as well as the dual aspects of developing participants' self-care while also considering their own clinical practice. Consistencies and inconsistencies across acceptance and commitment therapy processes are considered, as well as clinical implications, study limitations and future research suggestions. © The Author(s) 2014.

  14. STEAM: a software tool based on empirical analysis for micro electro mechanical systems

    NASA Astrophysics Data System (ADS)

    Devasia, Archana; Pasupuleti, Ajay; Sahin, Ferat

    2006-03-01

    In this research a generalized software framework that enables accurate computer aided design of MEMS devices is developed. The proposed simulation engine utilizes a novel material property estimation technique that generates effective material properties at the microscopic level. The material property models were developed based on empirical analysis and the behavior extraction of standard test structures. A literature review is provided on the physical phenomena that govern the mechanical behavior of thin films materials. This survey indicates that the present day models operate under a wide range of assumptions that may not be applicable to the micro-world. Thus, this methodology is foreseen to be an essential tool for MEMS designers as it would develop empirical models that relate the loading parameters, material properties, and the geometry of the microstructures with its performance characteristics. This process involves learning the relationship between the above parameters using non-parametric learning algorithms such as radial basis function networks and genetic algorithms. The proposed simulation engine has a graphical user interface (GUI) which is very adaptable, flexible, and transparent. The GUI is able to encompass all parameters associated with the determination of the desired material property so as to create models that provide an accurate estimation of the desired property. This technique was verified by fabricating and simulating bilayer cantilevers consisting of aluminum and glass (TEOS oxide) in our previous work. The results obtained were found to be very encouraging.

  15. The hydrodynamic basis of the vacuum cleaner effect in continuous-flow PCNL instruments: an empiric approach and mathematical model.

    PubMed

    Mager, R; Balzereit, C; Gust, K; Hüsch, T; Herrmann, T; Nagele, U; Haferkamp, A; Schilling, D

    2016-05-01

    Passive removal of stone fragments in the irrigation stream is one of the characteristics of continuous-flow PCNL instruments. So far, the physical principle of this so-called vacuum cleaner effect has not been fully understood. The aim of the study was to empirically prove the existence of the vacuum cleaner effect and to develop a physical hypothesis and generate a mathematical model for this phenomenon. In an empiric approach, common low-pressure PCNL instruments and conventional PCNL sheaths were tested using an in vitro model. Flow characteristics were visualized by coloring of irrigation fluid. Influence of irrigation pressure, sheath diameter, sheath design, nephroscope design and position of the nephroscope was assessed. Experiments were digitally recorded for further slow-motion analysis to deduce a physical model. In each tested nephroscope design, we could observe the vacuum cleaner effect. Increase in irrigation pressure and reduction in cross section of sheath sustained the effect. Slow-motion analysis of colored flow revealed a synergism of two effects causing suction and transportation of the stone. For the first time, our model showed a flow reversal in the sheath as an integral part of the origin of the stone transportation during the vacuum cleaner effect. The application of Bernoulli's equation provided the explanation of these effects and confirmed our experimental results. We widen the understanding of PCNL with a conclusive physical model, which explains the fluid mechanics of the vacuum cleaner effect.

  16. Empirical Assessment of the Mean Block Volume of Rock Masses Intersected by Four Joint Sets

    NASA Astrophysics Data System (ADS)

    Morelli, Gian Luca

    2016-05-01

    The estimation of a representative value for the rock block volume (V_b) is of great interest in rock engineering for rock mass characterization purposes. However, while mathematical relationships to precisely estimate this parameter from the spacing of joints can be found in the literature for rock masses intersected by three dominant joint sets, corresponding relationships do not exist when more than three sets occur. In these cases, a consistent assessment of V_b can only be achieved by directly measuring the dimensions of several representative natural rock blocks in the field or by means of more sophisticated 3D numerical modeling approaches. However, Palmström's empirical relationship based on the volumetric joint count J_v and on a block shape factor β is commonly used in practice, although it is strictly valid only for rock masses intersected by three joint sets. Starting from these considerations, the present paper is primarily intended to investigate the reliability of a set of empirical relationships linking the block volume with the indexes most commonly used to characterize the degree of jointing in a rock mass (i.e. J_v and the mean value of the joint set spacings) specifically applicable to rock masses intersected by four sets of persistent discontinuities. Based on the analysis of artificial 3D block assemblies generated using the software AutoCAD, the most accurate best-fit regression has been found between the mean block volume (V_bm) of tested rock mass samples and the geometric mean value of the spacings of the joint sets delimiting blocks, thus indicating this mean value as a promising parameter for the preliminary characterization of block size. Tests on field outcrops have demonstrated that the proposed empirical methodology has the potential of predicting the mean block volume of multiple-set jointed rock masses with an acceptable accuracy for common uses in most practical rock engineering applications.
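
    A minimal sketch of the proposed characterization, assuming hypothetical joint-set spacings: take the geometric mean of the four set spacings and cube it as a first-order block-size proxy. The paper's calibrated regression constants are not reproduced here.

    ```python
    import numpy as np

    # Block size of a rock mass cut by four joint sets, characterized via
    # the geometric mean of the set spacings; the cube of that mean is a
    # first-order volume proxy (the paper's fitted constants are omitted).
    spacings = np.array([0.4, 0.6, 0.9, 1.2])   # mean set spacings, m (hypothetical)

    s_gm = spacings.prod() ** (1.0 / len(spacings))
    vb_proxy = s_gm ** 3
    print(f"geometric mean spacing: {s_gm:.2f} m, block volume proxy: {vb_proxy:.3f} m^3")
    ```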

  17. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    PubMed

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework. So, it is difficult to characterize and evaluate this approach. In this paper, we propose, in the 2-D case, the use of an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang's EMD method. This approach, especially based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed. Despite some effort, 2-D versions for EMD appear poorly performing and are very time consuming. So in this paper, an extension to the 2-D space of the PDE-based approach is extensively described. This approach has been applied in cases of both signal and image decomposition. The obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data. Some results have been provided in the case of image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.

  18. A Tutorial in Bayesian Potential Outcomes Mediation Analysis.

    PubMed

    Miočević, Milica; Gonzalez, Oscar; Valente, Matthew J; MacKinnon, David P

    2018-01-01

    Statistical mediation analysis is used to investigate intermediate variables in the relation between independent and dependent variables. Causal interpretation of mediation analyses is challenging because randomization of subjects to levels of the independent variable does not rule out the possibility of unmeasured confounders of the mediator to outcome relation. Furthermore, commonly used frequentist methods for mediation analysis compute the probability of the data given the null hypothesis, which is not the probability of a hypothesis given the data as in Bayesian analysis. Under certain assumptions, applying the potential outcomes framework to mediation analysis allows for the computation of causal effects, and statistical mediation in the Bayesian framework gives indirect effects probabilistic interpretations. This tutorial combines causal inference and Bayesian methods for mediation analysis so the indirect and direct effects have both causal and probabilistic interpretations. Steps in Bayesian causal mediation analysis are shown in the application to an empirical example.

  19. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
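
    A toy Monte Carlo sketch of the computational route to a hazard curve: Poisson occurrence of tsunamigenic events over a time horizon with a lognormal runup per event, then empirical exceedance probabilities. All rates and runup parameters are invented for illustration.

    ```python
    import numpy as np

    # Toy Monte Carlo hazard curve: Poisson occurrence of tsunamigenic
    # events with a lognormal runup per event; all numbers illustrative.
    rng = np.random.default_rng(5)
    rate, years, n_sim = 0.02, 50.0, 100_000   # events/yr, horizon, trials

    max_runup = np.zeros(n_sim)
    n_events = rng.poisson(rate * years, n_sim)
    for i, k in enumerate(n_events):
        if k > 0:
            max_runup[i] = rng.lognormal(mean=0.5, sigma=0.8, size=k).max()

    levels = np.linspace(0.5, 10.0, 20)
    exceed = [(max_runup > h).mean() for h in levels]
    for h, p in zip(levels[::5], exceed[::5]):
        print(f"P(max runup > {h:4.1f} m in {years:.0f} yr) = {p:.4f}")
    ```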

  20. Using geometric algebra to represent curvature in shell theory with applications to Starling resistors

    PubMed Central

    Agarwal, A.; Lasenby, J.

    2017-01-01

    We present a novel application of rotors in geometric algebra to represent the change of curvature tensor that is used in shell theory as part of the constitutive law. We introduce a new decomposition of the change of curvature tensor, which has explicit terms for changes of curvature due to initial curvature combined with strain, and changes in rotation over the surface. We use this decomposition to perform a scaling analysis of the relative importance of bending and stretching in flexible tubes undergoing self-excited oscillations. These oscillations have relevance to the lung, in which it is believed that they are responsible for wheezing. The new analysis is necessitated by the fact that the working fluid is air, compared to water in most previous work. We use stereographic imaging to empirically measure the relative importance of bending and stretching energy in observed self-excited oscillations. This enables us to validate our scaling analysis. We show that bending energy is dominated by stretching energy, and the scaling analysis makes clear that this will remain true for tubes in the airways of the lung. PMID:29291106

  1. Using geometric algebra to represent curvature in shell theory with applications to Starling resistors.

    PubMed

    Gregory, A L; Agarwal, A; Lasenby, J

    2017-11-01

    We present a novel application of rotors in geometric algebra to represent the change of curvature tensor that is used in shell theory as part of the constitutive law. We introduce a new decomposition of the change of curvature tensor, which has explicit terms for changes of curvature due to initial curvature combined with strain, and changes in rotation over the surface. We use this decomposition to perform a scaling analysis of the relative importance of bending and stretching in flexible tubes undergoing self-excited oscillations. These oscillations have relevance to the lung, in which it is believed that they are responsible for wheezing. The new analysis is necessitated by the fact that the working fluid is air, compared to water in most previous work. We use stereographic imaging to empirically measure the relative importance of bending and stretching energy in observed self-excited oscillations. This enables us to validate our scaling analysis. We show that bending energy is dominated by stretching energy, and the scaling analysis makes clear that this will remain true for tubes in the airways of the lung.

  2. Implementation of a Computerized Decision Support System to Improve the Appropriateness of Antibiotic Therapy Using Local Microbiologic Data

    PubMed Central

    Rodriguez-Maresca, Manuel; Sorlozano, Antonio; Grau, Magnolia; Rodriguez-Castaño, Rocio; Ruiz-Valverde, Andres; Gutierrez-Fernandez, Jose

    2014-01-01

    A prospective quasi-experimental study was undertaken in 218 patients with suspicion of nosocomial infection hospitalized in a polyvalent ICU where a new electronic device (GERB) was designed for antibiotic prescriptions. Two GERB-based applications were developed to provide local resistance maps (LRMs) and preliminary microbiological reports with therapeutic recommendation (PMRTRs). Both applications used the data in the Laboratory Information System of the Microbiology Department to report on the optimal empiric therapeutic option, based on the most likely susceptibility profile of the microorganisms potentially responsible for infection in patients and taking into account the local epidemiology of the hospital department/unit. LRMs were used for antibiotic prescription in 20.2% of the patients and PMRTRs in 78.2%, and active antibiotics against the finally identified bacteria were prescribed in 80.0% of the former group and 82.4% of the latter. When neither LRMs nor PMRTRs were considered for empiric treatment prescription, only around 40% of the antibiotics prescribed were active. Hence, the percentage appropriateness of the empiric antibiotic treatments was significantly higher when LRM or PMRTR guidelines were followed rather than other criteria. The LRM and PMRTR applications are dynamic, highly accessible, and readily interpreted instruments that contribute to the appropriateness of empiric antibiotic treatments. PMID:25197643

  3. Sensitivity Analysis of Empirical Results on Civil War Onset

    ERIC Educational Resources Information Center

    Hegre, Havard; Sambanis, Nicholas

    2006-01-01

    In the literature on civil war onset, several empirical results are not robust or replicable across studies. Studies use different definitions of civil war and analyze different time periods, so readers cannot easily determine if differences in empirical results are due to those factors or if most empirical results are just not robust. The authors…

  4. Rasch Analysis of a New Hierarchical Scoring System for Evaluating Hand Function on the Motor Assessment Scale for Stroke

    PubMed Central

    Sabari, Joyce S.; Woodbury, Michelle; Velozo, Craig A.

    2014-01-01

    Objectives. (1) To develop two independent measurement scales for use as items assessing hand movements and hand activities within the Motor Assessment Scale (MAS), an existing instrument used for clinical assessment of motor performance in stroke survivors; (2) To examine the psychometric properties of these new measurement scales. Design. Scale development, followed by a multicenter observational study. Setting. Inpatient and outpatient occupational therapy programs in eight hospital and rehabilitation facilities in the United States and Canada. Participants. Patients (N = 332) receiving stroke rehabilitation following left (52%) or right (48%) cerebrovascular accident; mean age 64.2 years (sd 15); median 1 month since stroke onset. Intervention. Not applicable. Main Outcome Measures. Data were tested for unidimensionality and reliability, and behavioral criteria were ordered according to difficulty level with Rasch analysis. Results. The new scales assessing hand movements and hand activities met Rasch expectations of unidimensionality and reliability. Conclusion. Following a multistep process of test development, analysis, and refinement, we have redesigned the two scales that comprise the hand function items on the MAS. The hand movement scale contains an empirically validated 10-behavior hierarchy and the hand activities item contains an empirically validated 8-behavior hierarchy. PMID:25177513

  5. The effect of loving-kindness meditation on positive emotions: a meta-analytic review.

    PubMed

    Zeng, Xianglong; Chiu, Cleo P K; Wang, Rong; Oei, Tian P S; Leung, Freedom Y K

    2015-01-01

    While it has been suggested that loving-kindness meditation (LKM) is an effective practice for promoting positive emotions, the empirical evidence in the literature remains unclear. Here, we provide a systematic review of 24 empirical studies (N = 1759) on LKM with self-reported positive emotions. The effect of LKM on positive emotions was estimated with meta-analysis, and the influence of variations across LKM interventions was further explored with subgroup analysis and meta-regression. The meta-analysis showed (1) medium effect sizes for LKM interventions on daily positive emotions in both wait-list controlled RCTs and non-RCT studies; and (2) small to large effect sizes for the on-going practice of LKM on immediate positive emotions across different comparisons. Further analysis showed that (1) interventions focused on loving-kindness had a medium effect size, but interventions focused on compassion showed small effect sizes; (2) the length of interventions and the time spent on meditation did not influence the effect sizes, but the studies without didactic components in interventions had small effect sizes. A few individual studies reported that the nature of positive emotions and individual differences also influenced the results. In sum, LKM practice and interventions are effective in enhancing positive emotions, but more studies are needed to identify the active components of the interventions, to compare different psychological operations, and to explore the applicability in clinical populations.

  6. The effect of loving-kindness meditation on positive emotions: a meta-analytic review

    PubMed Central

    Zeng, Xianglong; Chiu, Cleo P. K.; Wang, Rong; Oei, Tian P. S.; Leung, Freedom Y. K.

    2015-01-01

    While it has been suggested that loving-kindness meditation (LKM) is an effective practice for promoting positive emotions, the empirical evidence in the literature remains unclear. Here, we provide a systematic review of 24 empirical studies (N = 1759) on LKM with self-reported positive emotions. The effect of LKM on positive emotions was estimated with meta-analysis, and the influence of variations across LKM interventions was further explored with subgroup analysis and meta-regression. The meta-analysis showed (1) medium effect sizes for LKM interventions on daily positive emotions in both wait-list controlled RCTs and non-RCT studies; and (2) small to large effect sizes for the on-going practice of LKM on immediate positive emotions across different comparisons. Further analysis showed that (1) interventions focused on loving-kindness had a medium effect size, but interventions focused on compassion showed small effect sizes; (2) the length of interventions and the time spent on meditation did not influence the effect sizes, but the studies without didactic components in interventions had small effect sizes. A few individual studies reported that the nature of positive emotions and individual differences also influenced the results. In sum, LKM practice and interventions are effective in enhancing positive emotions, but more studies are needed to identify the active components of the interventions, to compare different psychological operations, and to explore the applicability in clinical populations. PMID:26579061

  7. The Future of School Board Governance: Relevancy and Revelation

    ERIC Educational Resources Information Center

    Alsbury, Thomas L.

    2008-01-01

    This book combines theoretical debate and empirical evidence of the effectiveness and relevancy of local school boards today. Original theorists of competing school board governance theories, current researchers, and researcher/practitioners provided the latest empirical data about the role of school boards as well as applications for…

  8. Empirical Specification of Utility Functions.

    ERIC Educational Resources Information Center

    Mellenbergh, Gideon J.

    Decision theory can be applied to four types of decision situations in education and psychology: (1) selection; (2) placement; (3) classification; and (4) mastery. For the application of the theory, a utility function must be specified. Usually the utility function is chosen on a priori grounds. In this paper methods for the empirical assessment…

  9. Three Empirical Strategies for Teaching Statistics

    ERIC Educational Resources Information Center

    Marson, Stephen M.

    2007-01-01

    This paper employs a three-step process to analyze three empirically supported strategies for teaching statistics to BSW students. The strategies included: repetition, immediate feedback, and use of original data. First, each strategy is addressed through the literature. Second, the application of employing each of the strategies over the period…

  10. 75 FR 13529 - Combined Notice of Filings #1

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 1 March 15..., 2010. Docket Numbers: ER09-1099-003; ER07-412-004. Applicants: Empire Generating Co, LLC; ECP Energy I, LLC. Description: Quarterly Report of ECP Energy I, LLC, and Empire Generating Co, LLC. Filed Date: 02...

  11. How GPs value guidelines applied to patients with multimorbidity: a qualitative study

    PubMed Central

    Luijks, Hilde; Lucassen, Peter; van Weel, Chris; Loeffen, Maartje; Lagro-Janssen, Antoine; Schermer, Tjard

    2015-01-01

    Objectives To explore and describe the value general practitioners (GPs) attribute to medical guidelines when they are applied to patients with multimorbidity, and to describe which benefits GPs experience from guideline adherence in these patients. Also, we aimed to identify limitations from guideline adherence in patients with multimorbidity, as perceived by GPs, and to describe their empirical solutions to manage these obstacles. Design Focus group study with purposive sampling of participants. Focus groups were guided by an experienced moderator who used an interview guide. Interviews were transcribed verbatim. Data analysis was performed by two researchers using the constant comparison analysis technique and field notes were used in the analysis. Data collection proceeded until saturation was reached. Setting Primary care, eastern part of The Netherlands. Participants Dutch GPs, heterogeneous in age, sex and academic involvement. Results 25 GPs participated in five focus groups. GPs valued the guidance that guidelines provide, but experienced shortcomings when they were applied to patients with multimorbidity. Taking these patients’ personal circumstances into account was regarded as important, but it was impeded by a consistent focus on guideline adherence. Preventative measures were considered less appropriate in (elderly) patients with multimorbidity. Moreover, the applicability of guidelines in patients with multimorbidity was questioned. GPs’ extensive practical experience with managing multimorbidity resulted in several empirical solutions, for example, using their ‘common sense’ to respond to the perceived shortcomings. Conclusions GPs applying guidelines for patients with multimorbidity integrate patient-specific factors in their medical decisions, aiming for patient-centred solutions. Such integration of clinical experience and best evidence is required to practise evidence-based medicine. More flexibility in pay-for-performance systems is needed to facilitate this integration. Several improvements in guideline reporting are necessary to enhance the applicability of guidelines in patients with multimorbidity. PMID:26503382

  12. Filtration of human EEG recordings from physiological artifacts with empirical mode method

    NASA Astrophysics Data System (ADS)

    Grubov, Vadim V.; Runnova, Anastasiya E.; Khramova, Marina V.

    2017-03-01

    In this paper we propose a new method for dealing with noise and physiological artifacts in experimental human EEG recordings. The method is based on analysis of EEG signals with empirical mode decomposition (the Hilbert-Huang transform). We treat noise and physiological artifacts in the EEG as specific oscillatory patterns that cause problems during EEG analysis and that can be detected with additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). The algorithm of the method has the following steps: empirical mode decomposition of the EEG signal, selection of the empirical modes containing artifacts, removal of those modes, and reconstruction of the initial EEG signal. We test the method by filtering experimental human EEG signals for eye-movement artifacts and show the high efficiency of the method.
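
    The four algorithm steps map directly onto code. The sketch below assumes the third-party PyEMD package (installable as EMD-signal) and flags artifact modes by their correlation with a simultaneously recorded reference channel such as the EOG; the 0.4 correlation threshold is an illustrative choice, not a value from the paper.

        import numpy as np
        from PyEMD import EMD  # third-party package, installable as EMD-signal

        def remove_artifact_modes(eeg, reference, corr_threshold=0.4):
            """Step 1: decompose the EEG channel into IMFs; step 2: flag IMFs
            that correlate strongly with the artifact reference channel;
            step 3: drop the flagged IMFs; step 4: reconstruct the signal."""
            imfs = EMD().emd(eeg)
            keep = [imf for imf in imfs
                    if abs(np.corrcoef(imf, reference)[0, 1]) < corr_threshold]
            return np.sum(keep, axis=0) if keep else np.zeros_like(eeg)

        # Synthetic demonstration: 10 Hz 'EEG' plus a slow 'EOG' drift.
        rng = np.random.default_rng(2)
        t = np.linspace(0, 4, 1024)
        eog = 3.0 * np.sin(2 * np.pi * 0.5 * t)
        eeg = np.sin(2 * np.pi * 10 * t) + eog + 0.1 * rng.normal(size=t.size)
        cleaned = remove_artifact_modes(eeg, eog)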

  13. Exploring Empirical Rank-Frequency Distributions Longitudinally through a Simple Stochastic Process

    PubMed Central

    Finley, Benjamin J.; Kilkki, Kalevi

    2014-01-01

    The frequent appearance of empirical rank-frequency laws, such as Zipf’s law, in a wide range of domains reinforces the importance of understanding and modeling these laws and rank-frequency distributions in general. In this spirit, we utilize a simple stochastic cascade process to simulate several empirical rank-frequency distributions longitudinally. We focus especially on limiting the process’s complexity to increase accessibility for non-experts in mathematics. The process provides a good fit for many empirical distributions because the stochastic multiplicative nature of the process leads to an often observed concave rank-frequency distribution (on a log-log scale) and the finiteness of the cascade replicates real-world finite size effects. Furthermore, we show that repeated trials of the process can roughly simulate the longitudinal variation of empirical ranks. However, we find that the empirical variation is often less than the average simulated process variation, likely due to longitudinal dependencies in the empirical datasets. Finally, we discuss the process limitations and practical applications. PMID:24755621

  14. Exploring empirical rank-frequency distributions longitudinally through a simple stochastic process.

    PubMed

    Finley, Benjamin J; Kilkki, Kalevi

    2014-01-01

    The frequent appearance of empirical rank-frequency laws, such as Zipf's law, in a wide range of domains reinforces the importance of understanding and modeling these laws and rank-frequency distributions in general. In this spirit, we utilize a simple stochastic cascade process to simulate several empirical rank-frequency distributions longitudinally. We focus especially on limiting the process's complexity to increase accessibility for non-experts in mathematics. The process provides a good fit for many empirical distributions because the stochastic multiplicative nature of the process leads to an often observed concave rank-frequency distribution (on a log-log scale) and the finiteness of the cascade replicates real-world finite size effects. Furthermore, we show that repeated trials of the process can roughly simulate the longitudinal variation of empirical ranks. However, we find that the empirical variation is often less than the average simulated process variation, likely due to longitudinal dependencies in the empirical datasets. Finally, we discuss the process limitations and practical applications.
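
    A generic multiplicative cascade of the kind described in these two records can be simulated in a few lines; the splitting rule below (binary splits with uniform random fractions) is one simple choice for illustration, not necessarily the authors' exact process.

        import numpy as np

        rng = np.random.default_rng(42)

        def cascade(total=1_000_000.0, depth=12):
            """Finite multiplicative cascade: repeatedly split each part in two
            with a random fraction; the leaves act as item 'frequencies'."""
            parts = np.array([total])
            for _ in range(depth):
                frac = rng.uniform(0.1, 0.9, size=parts.size)
                parts = np.concatenate([parts * frac, parts * (1.0 - frac)])
            return np.sort(parts)[::-1]          # rank-ordered frequencies

        freqs = cascade()
        # On a log-log scale the rank-frequency curve is concave, as in many
        # empirical datasets; rerunning with new seeds mimics longitudinal
        # variation of the ranks.
        for r in (1, 10, 100, 1000):
            print(f"rank {r:5d}: frequency {freqs[r - 1]:12.1f}")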

  15. Identification of Child Pedestrian Training Objectives: The Role of Task Analysis and Empirical Research.

    ERIC Educational Resources Information Center

    van der Molen, Hugo H.

    1984-01-01

    Describes a study designed to demonstrate that child pedestrian training objectives may be identified systematically through various task analysis methods, making use of different types of empirical information. Early approaches to analysis of pedestrian tasks are reviewed, and an outline of the Traffic Research Centre's pedestrian task analysis…

  16. Parricide: An Empirical Analysis of 24 Years of U.S. Data

    ERIC Educational Resources Information Center

    Heide, Kathleen M.; Petee, Thomas A.

    2007-01-01

    Empirical analysis of homicides in which children have killed parents has been limited. The most comprehensive statistical analysis involving parents as victims was undertaken by Heide and used Supplementary Homicide Report (SHR) data for the 10-year period 1977 to 1986. This article provides an updated examination of characteristics of victims,…

  17. Development of Alabama traffic factors for use in mechanistic-empirical pavement design.

    DOT National Transportation Integrated Search

    2015-02-01

    The pavement engineering community is moving toward design practices that use mechanistic-empirical (M-E) approaches to the design and analysis of pavement structures. This effort is : embodied in the Mechanistic-Empirical Pavement Design Guide (MEPD...

  18. The observational and empirical thermospheric CO2 and NO power do not exhibit power-law behavior; an indication of their reliability

    NASA Astrophysics Data System (ADS)

    Varotsos, C. A.; Efstathiou, M. N.

    2018-03-01

    In this paper we investigate the evolution of the energy emitted by CO2 and NO from the Earth's thermosphere on a global scale using both observational and empirically derived data. We first analyze the daily power observations of CO2 and NO received from the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) equipment on the NASA Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite for the entire period 2002-2016. We then perform the same analysis on the empirical daily power emitted by CO2 and NO that was recently derived from the infrared energy budget of the thermosphere during 1947-2016. The tool used for the analysis of the observational and empirical datasets is detrended fluctuation analysis, applied in order to investigate whether the power emitted by CO2 and by NO from the thermosphere exhibits power-law behavior. The results obtained from both observational and empirical data do not support power-law behavior. This conclusion reveals that the empirically derived data are characterized by the same intrinsic properties as the observational ones, thus supporting their reliability.
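
    For reference, a minimal implementation of first-order detrended fluctuation analysis, the tool named above, is sketched below; the window sizes and the white-noise test signal are illustrative choices, not the study's configuration.

        import numpy as np

        def dfa(series, window_sizes):
            """First-order DFA: scaling exponent alpha from a log-log fit of
            the fluctuation function F(n) against the window size n."""
            profile = np.cumsum(series - np.mean(series))  # integrated signal
            flucts = []
            for n in window_sizes:
                f2 = []
                for i in range(profile.size // n):
                    seg = profile[i * n:(i + 1) * n]
                    t = np.arange(n)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)  # local trend
                    f2.append(np.mean((seg - trend) ** 2))
                flucts.append(np.sqrt(np.mean(f2)))
            return np.polyfit(np.log(window_sizes), np.log(flucts), 1)[0]

        # alpha ~ 0.5 for uncorrelated noise; long-range power-law
        # correlations would push alpha well above 0.5.
        noise = np.random.default_rng(0).normal(size=8192)
        print(f"DFA exponent: {dfa(noise, [16, 32, 64, 128, 256, 512]):.2f}")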

  19. Traditional Arabic & Islamic medicine: validation and empirical assessment of a conceptual model in Qatar.

    PubMed

    AlRawi, Sara N; Khidir, Amal; Elnashar, Maha S; Abdelrahim, Huda A; Killawi, Amal K; Hammoud, Maya M; Fetters, Michael D

    2017-03-14

    Evidence indicates traditional medicine is no longer only used for the healthcare of the poor; its prevalence is also increasing in countries where allopathic medicine is predominant in the healthcare system. While these healing practices have been utilized for thousands of years in the Arabian Gulf, only recently has a theoretical model been developed illustrating the linkages and components of such practices articulated as Traditional Arabic & Islamic Medicine (TAIM). Despite previous theoretical work presenting development of the TAIM model, empirical support has been lacking. The objective of this research is to provide empirical support for the TAIM model and illustrate real world applicability. Using an ethnographic approach, we recruited 84 individuals (43 women and 41 men) who were speakers of one of four common languages in Qatar: Arabic, English, Hindi, and Urdu. Through in-depth interviews, we sought confirming and disconfirming evidence of the model components, namely, health practices, beliefs and philosophy to treat, diagnose, and prevent illnesses and/or maintain well-being, as well as patterns of communication about their TAIM practices with their allopathic providers. Based on our analysis, we find empirical support for all elements of the TAIM model. Participants in this research, visitors to major healthcare centers, mentioned using all elements of the TAIM model: herbal medicines, spiritual therapies, dietary practices, mind-body methods, and manual techniques, applied singularly or in combination. Participants had varying levels of comfort sharing information about TAIM practices with allopathic practitioners. These findings confirm an empirical basis for the elements of the TAIM model. Three elements, namely, spiritual healing, herbal medicine, and dietary practices, were most commonly found. Future research should examine the prevalence of TAIM element use, how it differs among various populations, and its impact on health.

  20. On the meaning of the weighted alternative free-response operating characteristic figure of merit.

    PubMed

    Chakraborty, Dev P; Zhai, Xuetong

    2016-05-01

    The free-response receiver operating characteristic (FROC) method is being increasingly used to evaluate observer performance in search tasks. Data analysis requires definition of a figure of merit (FOM) quantifying performance. While a number of FOMs have been proposed, the recommended one, namely, the weighted alternative FROC (wAFROC) FOM, is not well understood. The aim of this work is to clarify the meaning of this FOM by relating it to the empirical area under a proposed wAFROC curve. The wAFROC FOM is defined in terms of a quasi-Wilcoxon statistic that involves weights, coding the clinical importance, assigned to each lesion. A new wAFROC curve is proposed, the y-axis of which incorporates the weights, giving more credit for marking clinically important lesions, while the x-axis is identical to that of the AFROC curve. An expression is derived relating the area under the empirical wAFROC curve to the wAFROC FOM. Examples are presented with small numbers of cases showing how AFROC and wAFROC curves are affected by correct and incorrect decisions and how the corresponding FOMs credit or penalize these decisions. The wAFROC, AFROC, and inferred ROC FOMs were applied to three clinical data sets involving multiple reader FROC interpretations in different modalities. It is shown analytically that the area under the empirical wAFROC curve equals the wAFROC FOM. This theorem is the FROC analog of a well-known theorem developed in 1975 for ROC analysis, which gave meaning to a Wilcoxon-statistic-based ROC FOM. A similar equivalence applies between the area under the empirical AFROC curve and the AFROC FOM. The examples show explicitly that the wAFROC FOM gives equal importance to all diseased cases, regardless of the number of lesions, a desirable statistical property not shared by the AFROC FOM. Applications to the clinical data sets show that the wAFROC FOM yields results comparable to those using the AFROC FOM. The equivalence theorem gives meaning to the weighted AFROC FOM, namely, it is identical to the empirical area under the weighted AFROC curve.
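
    A simplified sketch of the quasi-Wilcoxon form of the wAFROC FOM described above follows; the toy ratings and lesion weights are invented, and unmarked normal cases are represented by a rating of negative infinity. Because the weights on each diseased case sum to one, every diseased case carries equal importance regardless of its lesion count, which is the statistical property highlighted in the abstract.

        def wafroc_fom(fp_ratings, lesion_ratings, lesion_weights):
            """Quasi-Wilcoxon wAFROC FOM: average over all (normal, abnormal)
            case pairs of the lesion-weighted kernel comparing each lesion's
            rating to the normal case's highest false-positive rating.
            Unmarked normal cases carry a rating of -inf; the weights on
            each abnormal case sum to 1."""
            def kernel(tp, fp):
                return 1.0 if tp > fp else (0.5 if tp == fp else 0.0)

            total = 0.0
            for fp in fp_ratings:
                for ratings, weights in zip(lesion_ratings, lesion_weights):
                    total += sum(w * kernel(tp, fp)
                                 for tp, w in zip(ratings, weights))
            return total / (len(fp_ratings) * len(lesion_ratings))

        # Toy data: 3 normal cases (one unmarked); 2 abnormal cases with
        # 1 and 2 lesions respectively.
        fp = [1.0, float("-inf"), 2.5]
        tp = [[3.0], [2.0, 1.5]]
        w = [[1.0], [0.7, 0.3]]
        print(f"wAFROC FOM = {wafroc_fom(fp, tp, w):.3f}")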

  1. From correspondence to complementarity: The emergence of Bohr's Copenhagen interpretation of quantum mechanics

    NASA Astrophysics Data System (ADS)

    Tanona, Scott Daniel

    I develop a new analysis of Niels Bohr's Copenhagen interpretation of quantum mechanics by examining the development of his views from his earlier use of the correspondence principle in the so-called 'old quantum theory' to his articulation of the idea of complementarity in the context of the novel mathematical formalism of quantum mechanics. I argue that Bohr was motivated not by controversial and perhaps dispensable epistemological ideas (positivism or neo-Kantianism, for example) but by his own unique perspective on the difficulties of creating a new working physics of the internal structure of the atom. Bohr's use of the correspondence principle in the old quantum theory was associated with an empirical methodology that used this principle as an epistemological bridge to connect empirical phenomena with quantum models. The application of the correspondence principle required that one determine the validity of the idealizations and approximations necessary for the judicious use of classical physics within quantum theory. Bohr's interpretation of the new quantum mechanics then focused on the largely unexamined ways in which the developing abstract mathematical formalism is given empirical content by precisely this process of approximation. Significant consistency between his later interpretive framework and his forms of argument with the correspondence principle indicates that complementarity is best understood as a relationship among the various approximations and idealizations that must be made when one connects otherwise meaningless quantum mechanical symbols to empirical situations or 'experimental arrangements' described using concepts from classical physics. We discover that this relationship is unavoidable not through any sort of a priori analysis of the priority of classical concepts, but because quantum mechanics incorporates the correspondence approach in the way in which it represents quantum properties with matrices of transition probabilities, the empirical meaning of which depends on the situation but in general is tied to the correspondence connection to the spectra. For Bohr, it is then the commutation relations, which arise from the formalism, that inform us of the complementary nature of this approximate representation of quantum properties via the classical equations through which we connect them to experiments.

  2. Wavelet-bounded empirical mode decomposition for measured time series analysis

    NASA Astrophysics Data System (ADS)

    Moore, Keegan J.; Kurt, Mehmet; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.

    2018-01-01

    Empirical mode decomposition (EMD) is a powerful technique for separating the transient responses of nonlinear and nonstationary systems into finite sets of nearly orthogonal components, called intrinsic mode functions (IMFs), which represent the dynamics on different characteristic time scales. However, a deficiency of EMD is the mixing of two or more components in a single IMF, which can drastically affect the physical meaning of the empirical decomposition results. In this paper, we present a new approach based on EMD, designated as wavelet-bounded empirical mode decomposition (WBEMD), which is a closed-loop, optimization-based solution to the problem of mode mixing. The optimization routine relies on maximizing the isolation of an IMF around a characteristic frequency. This isolation is measured by fitting a bounding function around the IMF in the frequency domain and computing the area under this function. It follows that a large (small) area corresponds to a poorly (well) separated IMF. An optimization routine is developed based on this result with the objective of minimizing the bounding-function area and with the masking signal parameters serving as free parameters, such that a well-separated IMF is extracted. As examples of the application of WBEMD, we apply the proposed method first to a stationary, two-component signal, and then to the numerically simulated response of a cantilever beam with an essentially nonlinear end attachment. We find that WBEMD vastly improves upon EMD and that the extracted sets of IMFs provide insight into the underlying physics of the response of each system.

  3. Theoretical vs. empirical discriminability: the application of ROC methods to eyewitness identification.

    PubMed

    Wixted, John T; Mickes, Laura

    2018-01-01

    Receiver operating characteristic (ROC) analysis was introduced to the field of eyewitness identification 5 years ago. Since that time, it has been both influential and controversial, and the debate has raised an issue about measuring discriminability that is rarely considered. The issue concerns the distinction between empirical discriminability (measured by area under the ROC curve) vs. underlying/theoretical discriminability (measured by d' or variants of it). Under most circumstances, the two measures will agree about a difference between two conditions in terms of discriminability. However, it is possible for them to disagree, and that fact can lead to confusion about which condition actually yields higher discriminability. For example, if the two conditions have implications for real-world practice (e.g., a comparison of competing lineup formats), should a policymaker rely on the area-under-the-curve measure or the theory-based measure? Here, we illustrate the fact that a given empirical ROC yields as many underlying discriminability measures as there are theories that one is willing to take seriously. No matter which theory is correct, for practical purposes, the singular area-under-the-curve measure best identifies the diagnostically superior procedure. For that reason, area under the ROC curve informs policy in a way that underlying theoretical discriminability never can. At the same time, theoretical measures of discriminability are equally important, but for a different reason. Without an adequate theoretical understanding of the relevant task, the field will be in no position to enhance empirical discriminability.

  4. Artifact interactions retard technological improvement: An empirical study

    PubMed Central

    Magee, Christopher L.

    2017-01-01

    Empirical research has shown performance improvement of many different technological domains occurs exponentially but with widely varying improvement rates. What causes some technologies to improve faster than others do? Previous quantitative modeling research has identified artifact interactions, where a design change in one component influences others, as an important determinant of improvement rates. The models predict that improvement rate for a domain is proportional to the inverse of the domain’s interaction parameter. However, no empirical research has previously studied and tested the dependence of improvement rates on artifact interactions. A challenge to testing the dependence is that any method for measuring interactions has to be applicable to a wide variety of technologies. Here we propose a novel patent-based method that is both technology domain-agnostic and less costly than alternative methods. We use textual content from patent sets in 27 domains to find the influence of interactions on improvement rates. Qualitative analysis identified six specific keywords that signal artifact interactions. Patent sets from each domain were then examined to determine the total count of these 6 keywords in each domain, giving an estimate of artifact interactions in each domain. It is found that improvement rates are positively correlated with the inverse of the total count of keywords, with a Pearson correlation coefficient of +0.56 and a p-value of 0.002. The results agree with model predictions, and provide, for the first time, empirical evidence that artifact interactions have a retarding effect on improvement rates of technological domains. PMID:28777798
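
    The core of the method, counting interaction-signalling keywords in each domain's patent text and correlating improvement rates with the inverse counts, reduces to a short computation; the keyword list and the per-domain numbers below are hypothetical stand-ins, not the study's actual keywords or data.

        import numpy as np
        from scipy.stats import pearsonr

        # Hypothetical keywords taken to signal artifact interactions.
        INTERACTION_KEYWORDS = ("adjust", "compensate", "interfere",
                                "balance", "couple", "align")

        def interaction_count(patent_texts):
            """Total keyword occurrences in one domain's patent texts."""
            text = " ".join(patent_texts).lower()
            return sum(text.count(k) for k in INTERACTION_KEYWORDS)

        # Invented per-domain summaries: improvement rate (%/yr) and the
        # keyword totals that interaction_count would return per domain.
        rates = np.array([3.2, 12.5, 35.0, 7.8, 20.1, 5.5])
        counts = np.array([940, 310, 120, 560, 210, 700])

        # The models predict rate proportional to 1/interactions, so test
        # the correlation of the rate with the inverse keyword count.
        r, p = pearsonr(rates, 1.0 / counts)
        print(f"Pearson r = {r:+.2f}, p = {p:.3f}")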

  5. An Empirical Examination of Reverse Auction Appropriateness in B2B Source Selection

    DTIC Science & Technology

    2006-01-01

  6. Phase holdups in three-phase fluidized beds in the presence of disc promoter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murty, M.S.N.; Ramesh, K.V.; Venkateswarlu, P.

    2011-02-15

    Three-phase fluidized beds are found to have wide applications in process industries. The present investigation essentially comprises studies of gas holdup, liquid holdup and bed porosity in three-phase fluidized beds with a coaxially placed disc promoter. Holdup data were obtained from bed expansion and pressure drop measurements. Analysis of the data was done to elucidate the effects of dynamic and geometric parameters on gas holdup, liquid holdup and bed porosity. Data were correlated and useful equations were obtained from empirical modeling. (author)

  7. A Tentative Study on the Evaluation of Community Health Service Quality*

    NASA Astrophysics Data System (ADS)

    Ma, Zhi-qiang; Zhu, Yong-yue

    Community health service is the key point of health reform in China. Based on pertinent studies, this paper constructed an indicator system for the evaluation of community health service quality from five perspectives: visible image, reliability, responsiveness, assurance and sympathy, according to the service quality evaluation scale designed by Parasuraman, Zeithaml and Berry. A multilevel fuzzy synthetic evaluation model was constructed to evaluate community health service using fuzzy mathematics. The applicability and maneuverability of the evaluation indicator system and evaluation model were verified by empirical analysis.
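
    A single level of such a fuzzy synthetic evaluation model can be sketched as a weighted composition of a membership matrix; the weights and respondent fractions below are invented for illustration.

        import numpy as np

        # Hypothetical weights for the five service-quality perspectives.
        w = np.array([0.15, 0.25, 0.20, 0.20, 0.20])

        # Membership matrix R: per perspective, the fraction of respondents
        # rating the service excellent / good / fair / poor (invented data).
        R = np.array([
            [0.30, 0.40, 0.20, 0.10],   # visible image
            [0.25, 0.45, 0.20, 0.10],   # reliability
            [0.20, 0.40, 0.30, 0.10],   # responsiveness
            [0.35, 0.40, 0.15, 0.10],   # assurance
            [0.30, 0.35, 0.25, 0.10],   # sympathy
        ])

        B = w @ R                        # weighted-average fuzzy operator
        B = B / B.sum()                  # normalize the evaluation vector
        grades = ["excellent", "good", "fair", "poor"]
        print(dict(zip(grades, np.round(B, 3))))
        print("overall grade:", grades[int(np.argmax(B))])  # max membership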

  8. Functionality limit of classical simulated annealing

    NASA Astrophysics Data System (ADS)

    Hasegawa, M.

    2015-09-01

    By analyzing the system dynamics in the landscape paradigm, the optimization performance of classical simulated annealing is reviewed on random traveling salesman problems. The properly functioning region of the algorithm is experimentally determined in the size-time plane, and the influence of its boundary on the scalability test is examined in the standard framework of this method. From both results, an empirical choice of temperature length is plausibly explained as a minimum requirement for the algorithm to maintain its scalability within its functionality limit. The study exemplifies the applicability of computational physics analysis to optimization algorithm research.
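
    For concreteness, a classical simulated annealing routine of the kind examined above is sketched below for a random Euclidean travelling salesman instance; the steps_per_temp parameter plays the role of the temperature length, and all parameter values are illustrative rather than taken from the study.

        import numpy as np

        rng = np.random.default_rng(7)
        cities = rng.uniform(size=(50, 2))   # random Euclidean TSP instance

        def tour_length(tour):
            return np.sum(np.linalg.norm(cities[tour] - cities[np.roll(tour, 1)],
                                         axis=1))

        def anneal(t0=1.0, alpha=0.99, steps_per_temp=100, t_min=1e-3):
            """Classical SA with geometric cooling and 2-opt moves;
            steps_per_temp plays the role of the temperature length."""
            tour = rng.permutation(len(cities))
            best_len = tour_length(tour)
            t = t0
            while t > t_min:
                for _ in range(steps_per_temp):
                    i, j = sorted(rng.integers(0, len(cities), 2))
                    if i == j:
                        continue
                    cand = tour.copy()
                    cand[i:j + 1] = cand[i:j + 1][::-1]   # 2-opt segment reversal
                    delta = tour_length(cand) - tour_length(tour)
                    # Metropolis rule: always accept improvements, accept
                    # uphill moves with probability exp(-delta / t).
                    if delta < 0 or rng.uniform() < np.exp(-delta / t):
                        tour = cand
                        best_len = min(best_len, tour_length(tour))
                t *= alpha
            return best_len

        print(f"best tour length found: {anneal():.3f}")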

  9. AN EMPIRICAL FORMULA FOR THE DISTRIBUTION FUNCTION OF A THIN EXPONENTIAL DISC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Sanjib; Bland-Hawthorn, Joss

    2013-08-20

    An empirical formula for a Shu distribution function that reproduces a thin disc with exponential surface density to good accuracy is presented. The formula has two free parameters that specify the functional form of the velocity dispersion. Conventionally, this requires the use of an iterative algorithm to produce the correct solution, which is computationally taxing for applications like Markov Chain Monte Carlo model fitting. The formula has been shown to work for flat, rising, and falling rotation curves. Application of this methodology to one of the Dehnen distribution functions is also shown. Finally, an extension of this formula to reproduce velocity dispersion profiles that are an exponential function of radius is also presented. Our empirical formula should greatly aid the efficient comparison of disc models with large stellar surveys or N-body simulations.

  10. Synchrony in Dyadic Psychotherapy Sessions

    NASA Astrophysics Data System (ADS)

    Ramseyer, Fabian; Tschacher, Wolfgang

    Synchrony is a multi-faceted concept used in diverse domains such as physics, biology, and the social sciences. This chapter reviews some of the evidence of nonverbal synchrony in human communication, with a main focus on the role of synchrony in the psychotherapeutic setting. Nonverbal synchrony describes coordinated behavior of patient and therapist. Its association with empathy, rapport and the therapeutic relationship has been pointed out repeatedly, yet close evaluation of empirical studies suggests that the evidence remains inconclusive. Particularly in naturalistic studies, research with quantitative measures of synchrony is still lacking. We introduce a new empirical approach for the study of synchrony in psychotherapies under field conditions: Motion Energy Analysis (MEA). This is a video-based algorithm that quantifies the amount of movement in freely definable regions of interest. Our statistical analysis detects synchrony on a global level, irrespective of the specific body parts moving. Synchrony thus defined can be considered as a general measure of movement coordination between interacting individuals. Data from a sequence of N = 21 therapy sessions taken from one psychotherapy dyad show a high positive relationship between synchrony and the therapeutic bond. Nonverbal synchrony can thus be considered a promising concept for research on the therapeutic alliance. Further areas of application are discussed.
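
    The two ingredients of this approach, a per-frame motion-energy time series for each region of interest and a lagged cross-correlation between the two series, can be sketched as follows; frames are assumed to arrive as grayscale 2-D arrays, and the noise threshold and ROI layout are illustrative choices, not the chapter's settings.

        import numpy as np

        def motion_energy(frames, roi_a, roi_b, threshold=10):
            """Per-frame motion energy: sum of thresholded absolute pixel
            differences inside two regions of interest (e.g. patient and
            therapist), for grayscale frames given as 2-D uint8 arrays."""
            series_a, series_b = [], []
            prev = frames[0].astype(np.int16)
            for frame in frames[1:]:
                cur = frame.astype(np.int16)
                diff = np.abs(cur - prev)
                diff[diff < threshold] = 0           # suppress sensor noise
                series_a.append(diff[roi_a].sum())
                series_b.append(diff[roi_b].sum())
                prev = cur
            return np.asarray(series_a, float), np.asarray(series_b, float)

        def cross_corr(a, b, max_lag=15):
            """Lagged cross-correlation of the standardized series."""
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return {lag: np.mean(a[max(0, lag):len(a) + min(0, lag)] *
                                 b[max(0, -lag):len(b) + min(0, -lag)])
                    for lag in range(-max_lag, max_lag + 1)}

        # Tiny synthetic demo: random 64x64 frames, two half-frame ROIs.
        rng = np.random.default_rng(9)
        frames = [rng.integers(0, 255, (64, 64)).astype(np.uint8)
                  for _ in range(50)]
        roi_a = (slice(0, 64), slice(0, 32))
        roi_b = (slice(0, 64), slice(32, 64))
        a, b = motion_energy(frames, roi_a, roi_b)
        print(max(cross_corr(a, b, max_lag=5).items(), key=lambda kv: kv[1]))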

  11. Application of the empirical orthogonal function to study the rainfall pattern in Daerah Istimewa Yogyakarta province

    NASA Astrophysics Data System (ADS)

    Adi-Kusumo, Fajar; Gunardi, Utami, Herni; Nurjani, Emilya; Sopaheluwakan, Ardhasena; Aluicius, Irwan Endrayanto; Christiawan, Titus

    2016-02-01

    We apply Empirical Orthogonal Function (EOF) analysis to study the rainfall pattern in Daerah Istimewa Yogyakarta (DIY) Province, Indonesia. EOF analysis is an important method for studying the dominant pattern of a dataset through dimension reduction: it makes it possible to reduce the huge dimension of the observed data to a smaller one without losing significant information about the dataset as a whole. The method is also known as Principal Component Analysis (PCA) and is conducted to find the dominant patterns in the data. DIY Province is one of the provinces in Indonesia with special characteristics related to its rainfall pattern: it has an active volcano, karst, highlands, and also some lower areas, including beaches, and it is bounded by the Indian Ocean, one of the important factors governing its rainfall. We use at least ten years of monthly rainfall data from all stations in this area and study the rainfall characteristics across the four regencies of the province. EOF analysis is conducted to identify groups of stations with similar characteristics.
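
    An EOF analysis of a time-by-station rainfall matrix reduces to a singular value decomposition of the anomaly matrix, as sketched below on synthetic data with one coherent seasonal mode; the station count, loadings, and noise level are invented for illustration.

        import numpy as np

        def eof_analysis(data, n_modes=3):
            """EOF/PCA sketch for data of shape (n_times, n_stations):
            returns leading spatial patterns (EOFs), their time series
            (principal components), and explained-variance fractions."""
            anomalies = data - data.mean(axis=0)       # remove station means
            u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
            explained = s ** 2 / np.sum(s ** 2)
            return vt[:n_modes], u[:, :n_modes] * s[:n_modes], explained[:n_modes]

        # Synthetic example: 120 months at 12 stations, one coherent mode.
        rng = np.random.default_rng(3)
        months = np.arange(120)
        seasonal = np.sin(2 * np.pi * months / 12)[:, None]  # annual cycle
        loadings = rng.uniform(0.5, 2.0, size=(1, 12))       # station pattern
        rain = 100 + 30 * seasonal * loadings + 5 * rng.normal(size=(120, 12))
        eofs, pcs, var = eof_analysis(rain)
        print("explained variance of leading modes:", np.round(var, 3))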

  12. An Empirical Bayes Approach to Mantel-Haenszel DIF Analysis.

    ERIC Educational Resources Information Center

    Zwick, Rebecca; Thayer, Dorothy T.; Lewis, Charles

    1999-01-01

    Developed an empirical Bayes enhancement to Mantel-Haenszel (MH) analysis of differential item functioning (DIF) in which it is assumed that the MH statistics are normally distributed and that the prior distribution of underlying DIF parameters is also normal. (Author/SLD)
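
    The essence of such an empirical Bayes enhancement, shrinking noisy item-level Mantel-Haenszel statistics toward the group mean in proportion to their unreliability, can be sketched as follows; the normal prior is estimated by the method of moments, and the item values are invented, so this is only a generic illustration of the idea, not the authors' exact procedure.

        import numpy as np

        def eb_mh_dif(mh_dif, se):
            """Empirical Bayes shrinkage of Mantel-Haenszel D-DIF statistics.
            Model: mh_dif[i] ~ N(theta_i, se[i]^2) with theta_i ~ N(mu, tau2);
            the normal prior is estimated by the method of moments, and the
            posterior means shrink noisy item statistics toward mu."""
            mh_dif, se = np.asarray(mh_dif, float), np.asarray(se, float)
            mu = mh_dif.mean()
            tau2 = max(0.0, mh_dif.var(ddof=1) - np.mean(se ** 2))
            weight = tau2 / (tau2 + se ** 2)   # reliability of each statistic
            return weight * mh_dif + (1.0 - weight) * mu

        # Toy example: items with larger standard errors are shrunk the most.
        print(np.round(eb_mh_dif([-1.2, 0.3, 1.5, 0.1, -0.4],
                                 [0.9, 0.3, 0.8, 0.2, 0.5]), 2))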

  13. General Strain Theory and School Bullying: An Empirical Test in South Korea

    ERIC Educational Resources Information Center

    Moon, Byongook; Morash, Merry; McCluskey, John D.

    2012-01-01

    Despite recognition of bullying as a serious school and social problem with negative effects on students' well-being and safety, and the overlap between aggressive bullying acts and delinquent behavior, few empirical studies test the applicability of criminological theories to explaining bullying. This limitation in research is especially evident…

  14. Development of Mathematical Literacy: Results of an Empirical Study

    ERIC Educational Resources Information Center

    Kaiser, Gabriele; Willander, Torben

    2005-01-01

    In the paper the results of an empirical study, which has evaluated the development of mathematical literacy in an innovative teaching programme, are presented. The theoretical approach of mathematical literacy relies strongly on applications and modelling and the study follows the approach of R. Bybee, who develops a theoretical concept of…

  15. Theoretical Implementations of Various Mobile Applications Used in English Language Learning

    ERIC Educational Resources Information Center

    Small, Melissa

    2014-01-01

    This review of the theoretical framework for Mastery Learning Theory and Sense of Community theories is provided in conjunction with a review of the literature for mobile technology in relation to language learning. Although empirical research is minimal for mobile phone technology as an aid for language learning, the empirical research that…

  16. An Empirical Study of the Application of Psychological Principles to the Teaching of Orienteering.

    ERIC Educational Resources Information Center

    Martland, J. R.

    1983-01-01

    An empirical study was carried out to explore effects of three sets of schedules developed by Edgar Stones as guidelines conducive to student learning. Guidelines for concept teaching, psychomotor skill development, and teaching problem solving formed the instructional framework for teaching 11-year-old children the principles of navigational…

  17. Computational techniques in tribology and material science at the atomic level

    NASA Technical Reports Server (NTRS)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.

  18. An Empirical Test of Mnemonic Devices to Improve Learning in Elementary Accounting

    ERIC Educational Resources Information Center

    Laing, Gregory Kenneth

    2010-01-01

    The author empirically examined the use of mnemonic devices to enhance learning in first-year accounting at university. The experiment was conducted on three groups, with learning strategy application as a between-participants factor. The means of the scores from pre- and posttests were analyzed using Student's "t" test. No significant…

  19. Diversifying Theory and Science: Expanding the Boundaries of Empirically Supported Interventions in School Psychology.

    ERIC Educational Resources Information Center

    Kratochwill, Thomas R.; Stoiber, Karen Callan

    2000-01-01

    The developmental psychopathology principles advanced in Hughes' target article can be useful for promoting the development, evaluation, and application of empirically supported interventions (ESIs), but embracing a pathological framework is extremely limited given the diversity of theoretical approaches relevant to school-based ESIs. Argues that in order…

  20. The Limits of Outcomes Analysis; A Comment on "Sex Discrimination in Higher Employment: An Empirical Analysis of the Case Law."

    ERIC Educational Resources Information Center

    Lee, Barbara A.

    1990-01-01

    Questions assumptions by Schoenfeld and Zirkel in a study reviewing gender discrimination cases against institutions of higher education. Critiques the methodology used in that study, cautions about the overall utility of "outcomes analysis," and reports more promising routes of empirical legal research. (15 references) (MLF)

  1. Exploring Advertising in Higher Education: An Empirical Analysis in North America, Europe, and Japan

    ERIC Educational Resources Information Center

    Papadimitriou, Antigoni; Blanco Ramírez, Gerardo

    2015-01-01

    This empirical study explores higher education advertising campaigns displayed in five world cities: Boston, New York, Oslo, Tokyo, and Toronto. The study follows a mixed-methods research design relying on content analysis and multimodal semiotic analysis and employs a conceptual framework based on the knowledge triangle of education, research,…

  2. Preparation of the implementation plan of AASHTO Mechanistic-Empirical Pavement Design Guide (M-EPDG) in Connecticut : Phase II : expanded sensitivity analysis and validation with pavement management data.

    DOT National Transportation Integrated Search

    2017-02-08

    The study re-evaluates distress prediction models using the Mechanistic-Empirical Pavement Design Guide (MEPDG) and expands the sensitivity analysis to a wide range of pavement structures and soils. In addition, an extensive validation analysis of th...

  3. Application of cause-and-effect analysis to potentiometric titration.

    PubMed

    Kufelnicki, A; Lis, S; Meinrath, G

    2005-08-01

    A first attempt has been made to interpret physicochemical data from potentiometric titration analysis in accordance with the complete measurement-uncertainty budget approach (bottom-up) of ISO and Eurachem. A cause-and-effect diagram is established and discussed. Titration data for arsenazo III are used as a basis for this discussion. The commercial software Superquad is applied within a computer-intensive resampling framework. The cause-and-effect diagram is applied to evaluation of seven protonation constants of arsenazo III in the pH range 2-10.7. The data interpretation is based on empirical probability distributions and their analysis by second-order correct confidence estimates. The evaluated data are applied in the calculation of a speciation diagram including uncertainty estimates using the probabilistic speciation software Ljungskile.

  4. Assessing the Claims of Participatory Measurement, Reporting and Verification (PMRV) in Achieving REDD+ Outcomes: A Systematic Review

    PubMed Central

    Hawthorne, Sandra; Boissière, Manuel; Felker, Mary Elizabeth; Atmadja, Stibniati

    2016-01-01

    Participation of local communities in the Measurement, Reporting and Verification (MRV) of forest changes has been promoted as a strategy that lowers the cost of MRV and increases their engagement with REDD+. This systematic review of the literature assessed the claims of participatory MRV (PMRV) in achieving REDD+ outcomes. We identified 29 PMRV publications that consisted of 20 peer-reviewed and 9 non peer-reviewed publications, with 14 publications being empirically based studies. The evidence supporting PMRV claims was categorized into empirical finding, citation or assumption. Our analysis of the empirical studies showed that PMRV projects were conducted in 17 countries in three tropical continents and across various forest and land tenure types. Most of these projects tested the feasibility of participatory measurement or monitoring, which limited the participation of local communities to data gathering. PMRV claims of providing accurate local biomass measurements and lowering MRV cost were well-supported with empirical evidence. Claims that PMRV supports REDD+ social outcomes that affect local communities directly, such as increased environmental awareness and equity in benefit sharing, were supported with less empirical evidence than REDD+ technical outcomes. This may be due to the difficulties in measuring social outcomes and the slow progress in the development and implementation of REDD+ components outside of experimental research contexts. Although lessons from other monitoring contexts have been used to support PMRV claims, they are only applicable when the enabling conditions can be replicated in REDD+ contexts. There is a need for more empirical evidence to support PMRV claims on achieving REDD+ social outcomes, which may be addressed with more opportunities and rigorous methods for assessing REDD+ social outcomes. Integrating future PMRV studies into local REDD+ implementations may help create those opportunities, while increasing the participation of local communities as local REDD+ stakeholders. Further development and testing of a participatory reporting framework are required to integrate PMRV data with the national database. Publication of empirical PMRV studies is encouraged to guide when, where and how PMRV should be implemented. PMID:27812110

  5. Probabilistic empirical prediction of seasonal climate: evaluation and potential applications

    NASA Astrophysics Data System (ADS)

    Dieppois, B.; Eden, J.; van Oldenborgh, G. J.

    2017-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a new evaluation of an established empirical system used to predict seasonal climate across the globe. Forecasts for surface air temperature, precipitation and sea level pressure are produced by the KNMI Probabilistic Empirical Prediction (K-PREP) system every month and disseminated via the KNMI Climate Explorer (climexp.knmi.nl). K-PREP is based on multiple linear regression and built on physical principles to the fullest extent with predictive information taken from the global CO2-equivalent concentration, large-scale modes of variability in the climate system and regional-scale information. K-PREP seasonal forecasts for the period 1981-2016 will be compared with corresponding dynamically generated forecasts produced by operational forecast systems. While there are many regions of the world where empirical forecast skill is extremely limited, several areas are identified where K-PREP offers comparable skill to dynamical systems. We discuss two key points in the future development and application of the K-PREP system: (a) the potential for K-PREP to provide a more useful basis for reference forecasts than those based on persistence or climatology, and (b) the added value of including K-PREP forecast information in multi-model forecast products, at least for known regions of good skill. We also discuss the potential development of stakeholder-driven applications of the K-PREP system, including empirical forecasts for circumboreal fire activity.
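
    The regression core of an empirical prediction system of this kind can be sketched in a few lines; the predictors (a CO2-equivalent trend proxy and one mode-of-variability index) and all numbers below are invented placeholders, and the real K-PREP system is considerably more elaborate than this sketch.

        import numpy as np

        rng = np.random.default_rng(11)

        # Invented training data for one location: seasonal temperature
        # anomaly, a CO2-equivalent forcing proxy, and one mode-of-variability
        # index observed at the forecast lead time.
        years = 36
        co2eq = np.linspace(350.0, 480.0, years)
        mode = rng.normal(size=years)
        temp = 0.01 * (co2eq - 400.0) + 0.3 * mode + 0.2 * rng.normal(size=years)

        X = np.column_stack([np.ones(years), co2eq, mode])
        coef, *_ = np.linalg.lstsq(X, temp, rcond=None)

        # Probabilistic forecast: regression mean plus residual spread.
        resid_sd = np.std(temp - X @ coef, ddof=X.shape[1])
        x_new = np.array([1.0, 485.0, 1.2])    # next season's predictor values
        print(f"forecast anomaly: {x_new @ coef:+.2f} K, "
              f"~95% interval half-width {1.96 * resid_sd:.2f} K")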

  6. Random-effects meta-analysis: the number of studies matters.

    PubMed

    Guolo, Annamaria; Varin, Cristiano

    2017-06-01

    This paper investigates the impact of the number of studies on meta-analysis and meta-regression within the random-effects model framework. It is frequently neglected that inference in random-effects models requires a substantial number of studies included in meta-analysis to guarantee reliable conclusions. Several authors warn about the risk of inaccurate results of the traditional DerSimonian and Laird approach especially in the common case of meta-analysis involving a limited number of studies. This paper presents a selection of likelihood and non-likelihood methods for inference in meta-analysis proposed to overcome the limitations of the DerSimonian and Laird procedure, with a focus on the effect of the number of studies. The applicability and the performance of the methods are investigated in terms of Type I error rates and empirical power to detect effects, according to scenarios of practical interest. Simulation studies and applications to real meta-analyses highlight that it is not possible to identify an approach uniformly superior to alternatives. The overall recommendation is to avoid the DerSimonian and Laird method when the number of meta-analysis studies is modest and prefer a more comprehensive procedure that compares alternative inferential approaches. R code for meta-analysis according to all of the inferential methods examined in the paper is provided.
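
    For reference, the DerSimonian and Laird procedure discussed above is short enough to state in full; the toy effect sizes and variances are invented. Its weakness with few studies is visible in the code: the between-study variance rests on a single moment estimate that is itself highly variable when the study count is small.

        import numpy as np

        def dersimonian_laird(effects, variances):
            """DerSimonian-Laird pooling: moment estimate of the between-study
            variance tau2, then inverse-variance weighting. With few studies
            the single moment estimate of tau2 is very noisy, which is the
            caveat raised in the abstract."""
            y, v = np.asarray(effects), np.asarray(variances)
            w = 1.0 / v                                  # fixed-effect weights
            y_fe = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - y_fe) ** 2)              # Cochran's Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(y) - 1)) / c)      # truncated at zero
            w_re = 1.0 / (v + tau2)                      # random-effects weights
            mu = np.sum(w_re * y) / np.sum(w_re)
            return mu, np.sqrt(1.0 / np.sum(w_re)), tau2

        # Toy meta-analysis of five studies (effects, within-study variances).
        mu, se, tau2 = dersimonian_laird([0.2, 0.5, -0.1, 0.4, 0.3],
                                         [0.04, 0.09, 0.05, 0.12, 0.06])
        print(f"pooled effect {mu:.3f} (SE {se:.3f}), tau^2 = {tau2:.3f}")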

  7. Discovering new events beyond the catalogue—application of empirical matched field processing to Salton Sea geothermal field seismicity

    DOE PAGES

    Wang, Jingbo; Templeton, Dennise C.; Harris, David B.

    2015-07-30

    Using empirical matched field processing (MFP), we compare 4 yr of continuous seismic data to a set of 195 master templates from within an active geothermal field and identify over 140 per cent more events than were identified using traditional detection and location techniques alone. In managed underground reservoirs, a substantial fraction of seismic events can be excluded from the official catalogue due to an inability to clearly identify seismic-phase onsets. Empirical MFP can improve the effectiveness of current seismic detection and location methodologies by using conventionally located events with higher signal-to-noise ratios as master events to define wavefield templates that could then be used to map normally discarded indistinct seismicity. Since MFP does not require picking, it can be carried out automatically and rapidly once suitable templates are defined. In this application, we extend MFP by constructing local-distance empirical master templates using Southern California Earthquake Data Center archived waveform data of events originating within the Salton Sea Geothermal Field. We compare the empirical templates to continuous seismic data collected between 1 January 2008 and 31 December 2011. The empirical MFP method successfully identifies 6249 additional events, while the original catalogue reported 4352 events. The majority of these new events are lower-magnitude events with magnitudes between M0.2–M0.8. Here, the increased spatial-temporal resolution of the microseismicity map within the geothermal field illustrates how empirical MFP, when combined with conventional methods, can significantly improve seismic network detection capabilities, which can aid in long-term sustainability and monitoring of managed underground reservoirs.
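
    Full empirical MFP stacks coherence across many stations and phases; the single-channel matched-filter sketch below only illustrates the underlying template-scanning idea, in which a master-event waveform is slid along continuous data and normalized correlation peaks are declared detections. All parameters and the synthetic data are illustrative.

        import numpy as np

        def matched_filter_detect(data, template, threshold=0.7):
            """Slide a master-event template along continuous data and report
            (index, correlation) where the normalized correlation coefficient
            exceeds the threshold. Full empirical MFP would stack this kind
            of measure coherently over many channels and phases."""
            m = template.size
            t = (template - template.mean()) / template.std()
            hits = []
            for i in range(data.size - m):
                win = data[i:i + m]
                w = (win - win.mean()) / (win.std() + 1e-12)
                cc = float(np.dot(t, w)) / m     # normalized cross-correlation
                if cc > threshold:
                    hits.append((i, round(cc, 2)))
            return hits

        # Synthetic demo: bury two scaled copies of a wavelet in noise.
        rng = np.random.default_rng(5)
        wavelet = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
        data = 0.15 * rng.normal(size=5000)
        for start in (1200, 3400):
            data[start:start + 100] += 0.5 * wavelet
        print(matched_filter_detect(data, wavelet)[:5])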

  8. Empirical population and public health ethics: A review and critical analysis to advance robust empirical-normative inquiry.

    PubMed

    Knight, Rod

    2016-05-01

    The field of population and public health ethics (PPHE) has yet to fully embrace the generation of evidence as an important project. This article reviews the philosophical debates related to the 'empirical turn' in clinical bioethics, and critically analyses how PPHE has engaged, and can engage, with the philosophical implications of generating empirical data within the task of normative inquiry. A set of five conceptual and theoretical issues pertaining to population health that are unresolved and could potentially benefit from empirical PPHE approaches to normative inquiry are discussed. Each issue differs from traditional empirical bioethical approaches, in that they emphasize (1) concerns related to the population, (2) 'upstream' policy-relevant health interventions - within and outside of the health care system - and (3) the prevention of illness and disease. Within each theoretical issue, a conceptual example from population and public health approaches to HIV prevention and health promotion is interrogated. Based on the review and critical analysis, this article concludes that empirical-normative approaches to population and public health ethics would be most usefully pursued as an iterative project (rather than as a linear project), in which the normative informs the empirical questions to be asked and new empirical evidence constantly directs conceptualizations of what constitutes morally robust public health practices. Finally, a conceptualization of an empirical population and public health ethics is advanced in order to open up new interdisciplinary 'spaces', in which empirical and normative approaches to ethical inquiry are transparently (and ethically) integrated. © The Author(s) 2015.

  9. Sharing methodology: a worked example of theoretical integration with qualitative data to clarify practical understanding of learning and generate new theoretical development.

    PubMed

    Yardley, Sarah; Brosnan, Caragh; Richardson, Jane

    2013-01-01

    Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved. There are few published examples demonstrating how this can be achieved. This methodological article provides a worked example of research methodology that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. Bi-directional refers to our simultaneous use of theory to guide and interrogate empirical data and the use of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique of existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation and (5) theoretical development through a novel application of Metis. These steps resulted in understanding of how and why different outcomes arose from students participating in AEEs. Our approach offers a mechanism for clarification without which evidence-based, effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice, we hope to assist others seeking to conduct similar research.

  10. Semi-Empirical Validation of the Cross-Band Relative Absorption Technique for the Measurement of Molecular Mixing Ratios

    NASA Technical Reports Server (NTRS)

    Pliutau, Denis; Prasad, Narasimha S

    2013-01-01

    Studies were performed to carry out semi-empirical validation of a new measurement approach we propose for the determination of molecular mixing ratios. The approach is based on relative measurements in bands of O2 and other molecules and as such may be best described as cross-band relative absorption (CoBRA). The current validation studies rely upon well-verified and established theoretical and experimental databases, satellite data assimilations and modeling codes such as HITRAN, the line-by-line radiative transfer model (LBLRTM), and the modern-era retrospective analysis for research and applications (MERRA). The approach holds promise for atmospheric mixing ratio measurements of CO2 and a variety of other molecules currently under investigation for several future satellite lidar missions. One of the advantages of the method is a significant reduction of temperature sensitivity uncertainties, which is illustrated with application to the ASCENDS mission for the measurement of CO2 mixing ratios (XCO2). Additional advantages of the method include the possibility of closely matching cross-band weighting function combinations, which is harder to achieve using conventional differential absorption techniques, and the potential for additional corrections for water vapor and other interferences without using data from numerical weather prediction (NWP) models.

  11. Obstacles to prior art searching by the trilateral patent offices: empirical evidence from International Search Reports.

    PubMed

    Wada, Tetsuo

    Despite many empirical studies having been carried out on examiner patent citations, few have scrutinized the obstacles to prior art searching when adding patent citations during patent prosecution at patent offices. This analysis takes advantage of the longitudinal gap between an International Search Report (ISR) as required by the Patent Cooperation Treaty (PCT) and subsequent national examination procedures. We investigate whether several kinds of distance actually affect the probability that prior art is detected at the time of an ISR; this occurs much earlier than in national phase examinations. Based on triadic PCT applications between 2002 and 2005 for the trilateral patent offices (the European Patent Office, the US Patent and Trademark Office, and the Japan Patent Office) and their family-level citations made by the trilateral offices, we find evidence that geographical distance negatively affects the probability of capture of prior patents in an ISR. In addition, the technological complexity of an application negatively affects the probability of capture, whereas the volume of forward citations of prior art affects it positively. These results demonstrate the presence of obstacles to searching at patent offices, and suggest ways to design work sharing by patent offices, such that the duplication of search costs arises only when patent office search horizons overlap.

  12. A New Load Residual Threshold Definition for the Evaluation of Wind Tunnel Strain-Gage Balance Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2016-01-01

    A new definition of a threshold for the detection of load residual outliers of wind tunnel strain-gage balance data was developed. The new threshold is defined as the product of the inverse of the absolute value of the primary gage sensitivity and an empirical limit of the electrical outputs of a strain gage. The empirical limit of the outputs is 2.5 microV/V for balance calibration or check load residuals. A reduced limit of 0.5 microV/V is recommended for the evaluation of differences between repeat load points because, by design, the calculation of these differences removes errors in the residuals that are associated with the regression analysis of the data itself. The definition of the new threshold and different methods for the determination of the primary gage sensitivity are discussed. In addition, calibration data of a six-component force balance and a five-component semi-span balance are used to illustrate the application of the proposed new threshold definition to different types of strain-gage balances. During the discussion of the force balance example it is also explained how the estimated maximum expected output of a balance gage can be used to better understand results of the application of the new threshold definition.
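
    For concreteness, the threshold definition above reduces to one line of arithmetic; the sketch below encodes it directly. The example gage sensitivity is a made-up value for illustration only.

```python
# Load residual threshold per the definition above: the empirical output
# limit (2.5 microV/V for calibration or check load residuals, 0.5 microV/V
# for repeat-point differences) divided by |primary gage sensitivity|.
def load_threshold(primary_gage_sensitivity, repeat_points=False):
    limit_uV_per_V = 0.5 if repeat_points else 2.5
    return limit_uV_per_V / abs(primary_gage_sensitivity)

# hypothetical sensitivity of 0.05 microV/V per lbf
print(load_threshold(0.05))           # -> 50 lbf for calibration residuals
print(load_threshold(0.05, True))     # -> 10 lbf for repeat-point checks
```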

  13. SDF technology in location and navigation procedures: a survey of applications

    NASA Astrophysics Data System (ADS)

    Kelner, Jan M.; Ziółkowski, Cezary

    2017-04-01

    The basis for the development of the Doppler location method, also called the signal Doppler frequency (SDF) method or technology, is the analytical solution of the wave equation for a mobile source. This paper presents an overview of simulations, numerical analyses and empirical studies of the possibilities and range of applications of the SDF method. The various applications from numerous publications are collected and described. They mainly focus on the use of the SDF method in emitter positioning, electronic warfare, crisis management, search and rescue, and navigation. The developed method is characterized by an innovative property that is unique among location methods, because it allows the simultaneous location of many radio emitters. Moreover, this is the first method based on the Doppler effect that allows positioning of transmitters using a single mobile platform. Results obtained by other teams using the SDF method are also presented.

  14. Towards a universal model for carbon dioxide uptake by plants

    DOE PAGES

    Wang, Han; Prentice, I. Colin; Keenan, Trevor F.; ...

    2017-09-04

    Gross primary production (GPP) - the uptake of carbon dioxide (CO2) by leaves, and its conversion to sugars by photosynthesis - is the basis for life on land. Earth System Models (ESMs) incorporating the interactions of land ecosystems and climate are used to predict the future of the terrestrial sink for anthropogenic CO2. ESMs require accurate representation of GPP. However, current ESMs disagree on how GPP responds to environmental variations, suggesting a need for a more robust theoretical framework for modelling. Here we focus on a key quantity for GPP, the ratio of leaf internal to external CO2 (χ). χ is tightly regulated and depends on environmental conditions, but is represented empirically and incompletely in today's models. We show that a simple evolutionary optimality hypothesis predicts specific quantitative dependencies of χ on temperature, vapour pressure deficit and elevation; and that these same dependencies emerge from an independent analysis of empirical χ values, derived from a worldwide dataset of >3,500 leaf stable carbon isotope measurements. A single global equation embodying these relationships then unifies the empirical light-use efficiency model with the standard model of C3 photosynthesis, and successfully predicts GPP measured at eddy-covariance flux sites. This success is notable given the equation's simplicity and broad applicability across biomes and plant functional types. Finally, it provides a theoretical underpinning for the analysis of plant functional coordination across species and emergent properties of ecosystems, and a potential basis for the reformulation of the controls of GPP in next-generation ESMs.

  15. Redefinition and global estimation of basal ecosystem respiration rate

    NASA Astrophysics Data System (ADS)

    Yuan, Wenping; Luo, Yiqi; Li, Xianglan; Liu, Shuguang; Yu, Guirui; Zhou, Tao; Bahn, Michael; Black, Andy; Desai, Ankur R.; Cescatti, Alessandro; Marcolla, Barbara; Jacobs, Cor; Chen, Jiquan; Aurela, Mika; Bernhofer, Christian; Gielen, Bert; Bohrer, Gil; Cook, David R.; Dragoni, Danilo; Dunn, Allison L.; Gianelle, Damiano; Grünwald, Thomas; Ibrom, Andreas; Leclerc, Monique Y.; Lindroth, Anders; Liu, Heping; Marchesini, Luca Belelli; Montagnani, Leonardo; Pita, Gabriel; Rodeghiero, Mirco; Rodrigues, Abel; Starr, Gregory; Stoy, Paul C.

    2011-12-01

    Basal ecosystem respiration rate (BR), the ecosystem respiration rate at a given temperature, is a common and important parameter in empirical models for quantifying ecosystem respiration (ER) globally. Numerous studies have indicated that BR varies in space. However, many empirical ER models still use a global constant BR, largely due to the lack of a functional description for BR. In this study, we redefined BR to be the ecosystem respiration rate at the mean annual temperature. To test the validity of this concept, we conducted a synthesis analysis using 276 site-years of eddy covariance data, from 79 research sites located at latitudes ranging from ~3°S to ~70°N. Results showed that the mean annual ER rate closely matches the ER rate at the mean annual temperature. Incorporation of site-specific BR into a global ER model substantially improved simulated ER compared to an invariant BR at all sites. These results confirm that ER at the mean annual temperature can be considered as BR in empirical models. A strong correlation was found between mean annual ER and mean annual gross primary production (GPP). Consequently, GPP, which is typically more accurately modeled, can be used to estimate BR. A light use efficiency GPP model (i.e., EC-LUE) was applied to estimate global GPP, BR and ER with input data from MERRA (Modern Era Retrospective-Analysis for Research and Applications) and MODIS (Moderate resolution Imaging Spectroradiometer). The global ER was 103 Pg C yr-1, with the highest respiration rate over tropical forests and the lowest value in dry and high-latitude areas.
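
    The abstract does not state the functional form of the ER model, so the sketch below assumes a common Q10-type curve purely to make the redefinition concrete: BR is read off as the fitted ER at the site's mean annual temperature.

```python
# Hedged sketch: fit ER(T) = BR * Q10**((T - T_mean)/10) to synthetic
# eddy-covariance-like data; BR is then ER at the mean annual temperature.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(9)
temp = rng.normal(10.0, 8.0, 365)        # daily temperature, deg C (synthetic)
t_mean = temp.mean()
er_obs = 2.0 * 2.1 ** ((temp - t_mean) / 10) + rng.normal(0, 0.1, 365)

def er_model(t, br, q10):
    return br * q10 ** ((t - t_mean) / 10.0)

(br, q10), _ = curve_fit(er_model, temp, er_obs, p0=[1.0, 2.0])
print(br, q10)                           # br ~ 2.0, the ER at T = t_mean
```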

  16. Towards a universal model for carbon dioxide uptake by plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Han; Prentice, I. Colin; Keenan, Trevor F.

    Gross primary production (GPP) - the uptake of carbon dioxide (CO2) by leaves, and its conversion to sugars by photosynthesis - is the basis for life on land. Earth System Models (ESMs) incorporating the interactions of land ecosystems and climate are used to predict the future of the terrestrial sink for anthropogenic CO2. ESMs require accurate representation of GPP. However, current ESMs disagree on how GPP responds to environmental variations, suggesting a need for a more robust theoretical framework for modelling. Here we focus on a key quantity for GPP, the ratio of leaf internal to external CO2 (χ). χ is tightly regulated and depends on environmental conditions, but is represented empirically and incompletely in today's models. We show that a simple evolutionary optimality hypothesis predicts specific quantitative dependencies of χ on temperature, vapour pressure deficit and elevation; and that these same dependencies emerge from an independent analysis of empirical χ values, derived from a worldwide dataset of >3,500 leaf stable carbon isotope measurements. A single global equation embodying these relationships then unifies the empirical light-use efficiency model with the standard model of C3 photosynthesis, and successfully predicts GPP measured at eddy-covariance flux sites. This success is notable given the equation's simplicity and broad applicability across biomes and plant functional types. Finally, it provides a theoretical underpinning for the analysis of plant functional coordination across species and emergent properties of ecosystems, and a potential basis for the reformulation of the controls of GPP in next-generation ESMs.

  17. Redefinition and global estimation of basal ecosystem respiration rate

    USGS Publications Warehouse

    Yuan, W.; Luo, Y.; Li, X.; Liu, S.; Yu, G.; Zhou, T.; Bahn, M.; Black, A.; Desai, A.R.; Cescatti, A.; Marcolla, B.; Jacobs, C.; Chen, J.; Aurela, M.; Bernhofer, C.; Gielen, B.; Bohrer, G.; Cook, D.R.; Dragoni, D.; Dunn, A.L.; Gianelle, D.; Grünwald, T.; Ibrom, A.; Leclerc, M.Y.; Lindroth, A.; Liu, H.; Marchesini, L.B.; Montagnani, L.; Pita, G.; Rodeghiero, M.; Rodrigues, A.; Starr, G.; Stoy, Paul C.

    2011-01-01

    Basal ecosystem respiration rate (BR), the ecosystem respiration rate at a given temperature, is a common and important parameter in empirical models for quantifying ecosystem respiration (ER) globally. Numerous studies have indicated that BR varies in space. However, many empirical ER models still use a global constant BR, largely due to the lack of a functional description for BR. In this study, we redefined BR to be the ecosystem respiration rate at the mean annual temperature. To test the validity of this concept, we conducted a synthesis analysis using 276 site-years of eddy covariance data, from 79 research sites located at latitudes ranging from ∼3°S to ∼70°N. Results showed that the mean annual ER rate closely matches the ER rate at the mean annual temperature. Incorporation of site-specific BR into a global ER model substantially improved simulated ER compared to an invariant BR at all sites. These results confirm that ER at the mean annual temperature can be considered as BR in empirical models. A strong correlation was found between mean annual ER and mean annual gross primary production (GPP). Consequently, GPP, which is typically more accurately modeled, can be used to estimate BR. A light use efficiency GPP model (i.e., EC-LUE) was applied to estimate global GPP, BR and ER with input data from MERRA (Modern Era Retrospective-Analysis for Research and Applications) and MODIS (Moderate resolution Imaging Spectroradiometer). The global ER was 103 Pg C yr−1, with the highest respiration rate over tropical forests and the lowest value in dry and high-latitude areas.

  18. Gibbs Sampler-Based λ-Dynamics and Rao-Blackwell Estimator for Alchemical Free Energy Calculation.

    PubMed

    Ding, Xinqiang; Vilseck, Jonah Z; Hayes, Ryan L; Brooks, Charles L

    2017-06-13

    λ-dynamics is a generalized ensemble method for alchemical free energy calculations. In traditional λ-dynamics, the alchemical switch variable λ is treated as a continuous variable ranging from 0 to 1 and an empirical estimator is utilized to approximate the free energy. In the present article, we describe an alternative formulation of λ-dynamics that utilizes the Gibbs sampler framework, which we call Gibbs sampler-based λ-dynamics (GSLD). GSLD, like traditional λ-dynamics, can be readily extended to calculate free energy differences between multiple ligands in one simulation. We also introduce a new free energy estimator, the Rao-Blackwell estimator (RBE), for use in conjunction with GSLD. Compared with the current empirical estimator, the advantage of RBE is that RBE is an unbiased estimator and its variance is usually smaller than the current empirical estimator. We also show that the multistate Bennett acceptance ratio equation or the unbinned weighted histogram analysis method equation can be derived using the RBE. We illustrate the use and performance of this new free energy computational framework by application to a simple harmonic system as well as relevant calculations of small molecule relative free energies of solvation and binding to a protein receptor. Our findings demonstrate consistent and improved performance compared with conventional alchemical free energy methods.
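
    The GSLD algorithm itself is beyond a short fragment, but the variance advantage of Rao-Blackwellization is easy to see on a toy problem: averaging conditional expectations instead of raw indicator samples gives an unbiased estimator with smaller variance. Everything below is a toy illustration, not the paper's method.

```python
# Toy Rao-Blackwellization: estimate p = E[Z] with theta ~ Beta(2, 5) and
# Z | theta ~ Bernoulli(theta). The raw estimator averages indicators Z;
# the Rao-Blackwell estimator averages E[Z | theta] = theta, which by the
# law of total variance can only reduce the estimator's variance.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
theta = rng.beta(2, 5, size=n)
z = rng.random(n) < theta

print(z.mean())        # raw indicator average, noisier
print(theta.mean())    # Rao-Blackwell average; both near 2/7 ~ 0.286
```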

  19. A Universal Threshold for the Assessment of Load and Output Residuals of Strain-Gage Balance Data

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2017-01-01

    A new universal residual threshold for the detection of load and gage output residual outliers of wind tunnel strain-gage balance data was developed. The threshold works with both the Iterative and Non-Iterative Methods that are used in the aerospace testing community to analyze and process balance data. It also supports all known load and gage output formats that are traditionally used to describe balance data. The threshold's definition is based on an empirical electrical constant. First, the constant is used to construct a threshold for the assessment of gage output residuals. Then, the related threshold for the assessment of load residuals is obtained by multiplying the empirical electrical constant with the sum of the absolute values of all first partial derivatives of a given load component. The empirical constant equals 2.5 microV/V for the assessment of balance calibration or check load data residuals. A value of 0.5 microV/V is recommended for the evaluation of repeat point residuals because, by design, the calculation of these residuals removes errors that are associated with the regression analysis of the data itself. Data from a calibration of a six-component force balance are used to illustrate the application of the new threshold definitions to real-world balance calibration data.
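
    As with the earlier threshold record, the construction is simple enough to state in code; the derivative values below are hypothetical placeholders.

```python
# Universal thresholds following the abstract: output residuals are judged
# against an empirical constant (2.5 or 0.5 microV/V); the load residual
# threshold multiplies that constant by the sum of the absolute first
# partial derivatives of the load with respect to the gage outputs.
def output_threshold(repeat_points=False):
    return 0.5 if repeat_points else 2.5                  # microV/V

def load_residual_threshold(partials, repeat_points=False):
    return output_threshold(repeat_points) * sum(abs(p) for p in partials)

partials = [120.0, 4.2, 0.8]      # d(load)/d(output_i), hypothetical units
print(load_residual_threshold(partials))                  # 2.5 * 125.0 = 312.5
```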

  20. 3-D CFD Simulation and Validation of Oxygen-Rich Hydrocarbon Combustion in a Gas-Centered Swirl Coaxial Injector using a Flamelet-Based Approach

    NASA Technical Reports Server (NTRS)

    Richardson, Brian; Kenny, Jeremy

    2015-01-01

    Injector design is a critical part of the development of a rocket Thrust Chamber Assembly (TCA). Proper detailed injector design can maximize propulsion efficiency while minimizing the potential for failures in the combustion chamber. Traditional design and analysis methods for hydrocarbon-fuel injector elements are based heavily on empirical data and models developed from heritage hardware tests. Using this limited set of data produces challenges when trying to design a new propulsion system where the operating conditions may greatly differ from heritage applications. Time-accurate, Three-Dimensional (3-D) Computational Fluid Dynamics (CFD) modeling of combusting flows inside of injectors has long been a goal of the fluid analysis group at Marshall Space Flight Center (MSFC) and the larger CFD modeling community. CFD simulation can provide insight into the design and function of an injector that cannot be obtained easily through testing or empirical comparisons to existing hardware. However, the traditional finite-rate chemistry modeling approach utilized to simulate combusting flows for complex fuels, such as Rocket Propellant-2 (RP-2), is prohibitively expensive and time consuming even with a large amount of computational resources. MSFC has been working, in partnership with Streamline Numerics, Inc., to develop a computationally efficient, flamelet-based approach for modeling complex combusting flow applications. In this work, a flamelet modeling approach is used to simulate time-accurate, 3-D, combusting flow inside a single Gas Centered Swirl Coaxial (GCSC) injector using the flow solver, Loci-STREAM. CFD simulations were performed for several different injector geometries. Results of the CFD analysis helped guide the design of the injector from an initial concept to a tested prototype. The results of the CFD analysis are compared to data gathered from several hot-fire, single element injector tests performed in the Air Force Research Lab EC-1 test facility located at Edwards Air Force Base.

  1. Posttraumatic Stress Disorder and Intimate Relationship Problems: A Meta-Analysis

    ERIC Educational Resources Information Center

    Taft, Casey T.; Watkins, Laura E.; Stafford, Jane; Street, Amy E.; Monson, Candice M.

    2011-01-01

    Objective: The authors conducted a meta-analysis of empirical studies investigating associations between indices of posttraumatic stress disorder (PTSD) and intimate relationship problems to empirically synthesize this literature. Method: A literature search using PsycINFO, Medline, Published International Literature on Traumatic Stress (PILOTS),…

  2. The Theory of Value-Based Payment Incentives and Their Application to Health Care.

    PubMed

    Conrad, Douglas A

    2015-12-01

    To present the implications of agency theory in microeconomics, augmented by behavioral economics, for different methods of value-based payment in health care; and to derive a set of future research questions and policy recommendations based on that conceptual analysis. Original literature of agency theory, and secondarily behavioral economics, combined with applied research and empirical evidence on the application of those principles to value-based payment. Conceptual analysis and targeted review of theoretical research and empirical literature relevant to value-based payment in health care. Agency theory and secondarily behavioral economics have powerful implications for the design of value-based payment in health care. To achieve improved value - better patient experience, clinical quality, health outcomes, and lower costs of care - high-powered incentives should directly target improved care processes, enhance patient experience, and create achievable benchmarks for improved outcomes. Differing forms of value-based payment (e.g., shared savings and risk, reference pricing, capitation, and bundled payment), coupled with adjunct incentives for quality and efficiency, can be tailored to different market conditions and organizational settings. Payment contracts that are "incentive compatible" - which directly encourage better care and reduced cost, mitigate gaming, and selectively induce clinically efficient providers to participate - will focus differentially on evidence-based care processes, will right-size and structure incentives to avoid crowd-out of providers' intrinsic motivation, and will align patient incentives with value. Future research should address the details of putting these and related principles into practice; further, by deploying these insights in payment design, policy makers will improve health care value for patients and purchasers. © Health Research and Educational Trust.

  3. Analyzing Empirical Notions of Suffering: Advancing Youth Dialogue and Education

    ERIC Educational Resources Information Center

    Baring, Rito V.

    2010-01-01

    This article explores the possibilities of advancing youth dialogue and education among the Filipino youth using empirical notions of students on suffering. Examining empirical data, this analysis exposes uncharted notions of suffering and shows relevant meanings that underscore the plausible trappings of youth dialogue and its benefits on…

  4. Empirical Data Collection and Analysis Using Camtasia and Transana

    ERIC Educational Resources Information Center

    Thorsteinsson, Gisli; Page, Tom

    2009-01-01

    One of the possible techniques for collecting empirical data is video recordings of a computer screen with specific screen capture software. This method for collecting empirical data shows how students use the BSCWII (Be Smart Cooperate Worldwide--a web based collaboration/groupware environment) to coordinate their work and collaborate in…

  5. Valuing Informal Arguments and Empirical Investigations during Collective Argumentation

    ERIC Educational Resources Information Center

    Yopp, David A.

    2012-01-01

    Considerable literature has documented both the pros and cons of students' use of empirical evidence during proving activities. This article presents an analysis of a classroom episode involving in-service middle school, high school, and college teachers that demonstrates that learners need not be steered away from empirical investigations during…

  6. Activity Theory in Empirical Higher Education Research: Choices, Uses and Values

    ERIC Educational Resources Information Center

    Bligh, Brett; Flood, Michelle

    2017-01-01

    This paper contributes to discussion of theory application in higher education research. We examine 59 empirical research papers from specialist journals that use a particular theory: activity theory. We scrutinise stated reasons for choosing the theory, functions played by the theory, and how the theory is valorised. We find that the theory is…

  7. Empirically Guided Coordination of Multiple Evidence-Based Treatments: An Illustration of Relevance Mapping in Children's Mental Health Services

    ERIC Educational Resources Information Center

    Chorpita, Bruce F.; Bernstein, Adam; Daleiden, Eric L.

    2011-01-01

    Objective: Despite substantial progress in the development and identification of psychosocial evidence-based treatments (EBTs) in mental health, there is minimal empirical guidance for selecting an optimal "set" of EBTs maximally applicable and generalizable to a chosen service sample. Relevance mapping is a proposed methodology that…

  8. Educational Virtual Environments: A Ten-Year Review of Empirical Research (1999-2009)

    ERIC Educational Resources Information Center

    Mikropoulos, Tassos A.; Natsis, Antonis

    2011-01-01

    This study is a ten-year critical review of empirical research on the educational applications of Virtual Reality (VR). Results show that although the majority of the 53 reviewed articles refer to science and mathematics, researchers from social sciences also seem to appreciate the educational value of VR and incorporate their learning goals in…

  9. Monitoring of Qualifications and Employment in Austria: An Empirical Approach Based on the Labour Force Survey

    ERIC Educational Resources Information Center

    Lassnigg, Lorenz; Vogtenhuber, Stefan

    2011-01-01

    The empirical approach referred to in this article describes the relationship between education and training (ET) supply and employment in Austria; the use of the new ISCED (International Standard Classification of Education) fields of study variable makes this approach applicable abroad. The purpose is to explore a system that produces timely…

  10. Modeling Traffic on the Web Graph

    NASA Astrophysics Data System (ADS)

    Meiss, Mark R.; Gonçalves, Bruno; Ramasco, José J.; Flammini, Alessandro; Menczer, Filippo

    Analysis of aggregate and individual Web requests shows that PageRank is a poor predictor of traffic. We use empirical data to characterize properties of Web traffic not reproduced by Markovian models, including both aggregate statistics such as page and link traffic, and individual statistics such as entropy and session size. As no current model reconciles all of these observations, we present an agent-based model that explains them through realistic browsing behaviors: (1) revisiting bookmarked pages; (2) backtracking; and (3) seeking out novel pages of topical interest. The resulting model can reproduce the behaviors we observe in empirical data, especially heterogeneous session lengths, reconciling the narrowly focused browsing patterns of individual users with the extreme variance in aggregate traffic measurements. We can thereby identify a few salient features that are necessary and sufficient to interpret Web traffic data. Beyond the descriptive and explanatory power of our model, these results may lead to improvements in Web applications such as search and crawling.
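
    A minimal agent-based surfer capturing the three behaviors named above might look like the following; the toy graph, probabilities, and bookmarking rule are illustrative assumptions rather than the authors' calibrated model.

```python
# Toy surfer with bookmark revisits, backtracking, and novelty seeking.
import random

def browse_session(graph, p_bookmark=0.15, p_back=0.25, max_clicks=50):
    page = random.choice(list(graph))
    bookmarks, history, visits = [page], [page], [page]
    for _ in range(max_clicks):
        r = random.random()
        if r < p_bookmark:                        # revisit a bookmarked page
            page = random.choice(bookmarks)
        elif r < p_bookmark + p_back and len(history) > 1:
            history.pop()                         # back button
            page = history[-1]
            visits.append(page)
            continue
        else:                                     # prefer novel linked pages
            links = graph[page]
            novel = [q for q in links if q not in visits]
            page = random.choice(novel or links)
            if random.random() < 0.1:             # occasionally bookmark
                bookmarks.append(page)
        history.append(page)
        visits.append(page)
    return visits

web = {0: [1, 2], 1: [2, 3], 2: [0, 3, 4], 3: [4], 4: [0]}
print(browse_session(web)[:10])
```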

  11. Empirical spatial econometric modelling of small scale neighbourhood

    NASA Astrophysics Data System (ADS)

    Gerkman, Linda

    2012-07-01

    The aim of this paper is to model small scale neighbourhood in a house price model by implementing the newest methodology in spatial econometrics. A common problem when modelling house prices is that in practice it is seldom possible to obtain all the desired variables. Variables capturing small scale neighbourhood conditions are especially hard to find. If important explanatory variables are missing from the model, and the omitted variables are spatially autocorrelated and correlated with the explanatory variables included in the model, it can be shown that a spatial Durbin model is motivated. In the empirical application, on new house price data from Helsinki, Finland, we find motivation for a spatial Durbin model, estimate the model and interpret the estimates for the summary measures of impacts. The analysis shows that the model structure makes it possible to capture small scale neighbourhood effects when we know that they exist but lack proper variables to measure them.
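
    For readers unfamiliar with the specification, a spatial Durbin model takes the form y = rho*W*y + X*beta + W*X*theta + e, so a unit change in x propagates through the spatial multiplier (I - rho*W)^(-1). The simulation below, with an invented ring-neighbour weight matrix and coefficients, illustrates the resulting average total impact.

```python
# Spatial Durbin model sketch, simulated via its reduced form.
# W and all coefficient values are invented for illustration.
import numpy as np

rng = np.random.default_rng(10)
n = 200
W = np.zeros((n, n))
for i in range(n):                       # row-normalized ring of neighbours
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

rho, beta, theta = 0.4, 1.5, 0.6
x = rng.normal(size=n)
y = np.linalg.solve(np.eye(n) - rho * W,
                    beta * x + theta * (W @ x) + rng.normal(0, 0.2, n))

# Average total impact of x on y: row-sum of (I - rho*W)^(-1)(beta*I + theta*W)
M = np.linalg.solve(np.eye(n) - rho * W, beta * np.eye(n) + theta * W)
print(M.sum() / n)                       # ~ (beta + theta)/(1 - rho) = 3.5
```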

  12. Molecular modeling and dynamics simulations of PNP from Streptococcus agalactiae.

    PubMed

    Caceres, Rafael Andrade; Saraiva Timmers, Luis Fernando; Dias, Raquel; Basso, Luiz Augusto; Santos, Diogenes Santiago; de Azevedo, Walter Filgueira

    2008-05-01

    This work describes for the first time a structural model of purine nucleoside phosphorylase from Streptococcus agalactiae (SaPNP). PNP catalyzes the cleavage of N-ribosidic bonds of purine ribonucleosides and 2-deoxyribonucleosides in the presence of inorganic orthophosphate as a second substrate. This enzyme is a potential target for the development of antibacterial drugs. We modeled the complexes of SaPNP with 15 different ligands in order to determine the structural basis for the specificity of these ligands against SaPNP. The application of a novel empirical scoring function to estimate the affinity of a ligand for a protein was able to identify the ligands with high affinity for PNPs. Analysis of the molecular dynamics trajectory for SaPNP indicates that the functionally important motifs have a very stable structure. This new structural model, together with a novel empirical scoring function, opens the possibility to explore a larger library of compounds in order to identify new inhibitors of PNPs in virtual screening projects.

  13. Fluorescence background removal method for biological Raman spectroscopy based on empirical mode decomposition.

    PubMed

    Leon-Bejarano, Maritza; Dorantes-Mendez, Guadalupe; Ramirez-Elias, Miguel; Mendez, Martin O; Alba, Alfonso; Rodriguez-Leyva, Ildefonso; Jimenez, M

    2016-08-01

    Raman spectroscopy of biological tissue presents fluorescence background, an undesirable effect that generates false Raman intensities. This paper proposes the application of the Empirical Mode Decomposition (EMD) method to baseline correction. EMD is a suitable approach since it is an adaptive signal processing method for nonlinear and non-stationary signal analysis that, unlike polynomial methods, does not require parameter selection. EMD performance was assessed using synthetic Raman spectra with different signal-to-noise ratios (SNR). The correlation coefficient between the synthetic Raman spectra and those recovered after EMD denoising was higher than 0.92. Additionally, twenty Raman spectra from skin were used to evaluate EMD performance, and the results were compared with the Vancouver Raman algorithm (VRA). The comparison resulted in a mean square error (MSE) of 0.001554. The high correlation coefficient using synthetic spectra and the low MSE in the comparison between EMD and VRA suggest that EMD could be an effective method to remove fluorescence background in biological Raman spectra.
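
    A minimal sketch of the baseline-removal idea, assuming the third-party PyEMD package (installable as EMD-signal): decompose the spectrum into intrinsic mode functions, treat the slowest mode plus the residue as the fluorescence background, and subtract. Which modes to assign to the baseline is a heuristic choice here, not a prescription from the paper.

```python
import numpy as np
from PyEMD import EMD   # third-party package, pip install EMD-signal

x = np.linspace(0, 1, 1000)
raman = np.exp(-((x - 0.4) / 0.01) ** 2)       # synthetic Raman peak
baseline = 2.0 * x + 0.5 * x ** 2              # slow fluorescence background
signal = raman + baseline + np.random.default_rng(2).normal(0, 0.01, x.size)

emd = EMD()
emd.emd(signal)
imfs, residue = emd.get_imfs_and_residue()
background = imfs[-1] + residue                # slowest mode + residue as baseline
corrected = signal - background
print(np.corrcoef(corrected, raman)[0, 1])     # close to 1 if removal worked
```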

  14. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  15. Nonlinear mode decomposition: A noise-robust, adaptive decomposition method

    NASA Astrophysics Data System (ADS)

    Iatsenko, Dmytro; McClintock, Peter V. E.; Stefanovska, Aneta

    2015-09-01

    The signals emanating from complex systems are usually composed of a mixture of different oscillations which, for a reliable analysis, should be separated from each other and from the inevitable background of noise. Here we introduce an adaptive decomposition tool—nonlinear mode decomposition (NMD)—which decomposes a given signal into a set of physically meaningful oscillations for any wave form, simultaneously removing the noise. NMD is based on the powerful combination of time-frequency analysis techniques—which, together with the adaptive choice of their parameters, make it extremely noise robust—and surrogate data tests used to identify interdependent oscillations and to distinguish deterministic from random activity. We illustrate the application of NMD to both simulated and real signals and demonstrate its qualitative and quantitative superiority over other approaches, such as (ensemble) empirical mode decomposition, Karhunen-Loève expansion, and independent component analysis. We point out that NMD is likely to be applicable and useful in many different areas of research, such as geophysics, finance, and the life sciences. The necessary MATLAB codes for running NMD are freely available for download.

  16. Exploring Methodologies and Indicators for Cross-disciplinary Applications

    NASA Astrophysics Data System (ADS)

    Bernknopf, R.; Pearlman, J.

    2015-12-01

    Assessing the impact and benefit of geospatial information is a multidisciplinary task that draws on social, economic and environmental knowledge to formulate indicators and methods. There are use cases that couple the social sciences, including economics, psychology and sociology, with geospatial information. Benefit-cost analysis is an empirical approach that uses money as an indicator for decision making. It is a traditional basis for a use case and has been applied to geospatial information and other areas. A newer use case that applies indicators is meta-regression analysis, which is used to evaluate transfers of socioeconomic benefits from different geographic regions within a unifying statistical approach. In this technique, qualitative and quantitative variables are indicators, which provide a weighted average of value for the nonmarket good or resource over a large region. The expected willingness to pay for the nonmarket good can then be applied to a specific region. A third use case is the application of decision support systems and tools, which have been used for forecasting agricultural prices and for analysis of hazard policies. However, new methods are needed for integrating these disciplines into use cases, an avenue to instruct the development of operational applications of geospatial information. Experience in one case may not be broadly transferable to other uses and applications if multiple disciplines are involved. To move forward, more use cases are needed, especially applications in the private sector. Applications are being examined across a multidisciplinary community for good examples that would be instructive in meeting the challenge. This presentation will look at the results of an investigation into directions in the broader application of use cases to teach the methodologies and use of indicators that have applications across fields of interest.

  17. The application of language-game theory to the analysis of science learning: Developing an interpretive classroom-level learning framework

    NASA Astrophysics Data System (ADS)

    Ahmadibasir, Mohammad

    In this study an interpretive learning framework that aims to measure learning on the classroom level is introduced. In order to develop and evaluate the value of the framework, a theoretical/empirical study is designed. The researcher attempted to illustrate how the proposed framework provides insights on the problem of classroom-level learning. The framework is developed by construction of connections between the current literature on science learning and Wittgenstein's language-game theory. In this framework learning is defined as change of classroom language-game or discourse. In the proposed framework, learning is measured by analysis of classroom discourse. The empirical explanation power of the framework is evaluated by applying the framework in the analysis of learning in a fifth-grade science classroom. The researcher attempted to analyze how students' colloquial discourse changed to a discourse that bears more resemblance to science discourse. The results of the empirical part of the investigation are presented in three parts: first, the gap between what students did and what they were supposed to do was reported. The gap showed that students during the classroom inquiry wanted to do simple comparisons by direct observation, while they were supposed to do tool-assisted observation and procedural manipulation for a complete comparison. Second, it was illustrated that the first attempt to connect the colloquial to science discourse was done by what was immediately intelligible for students and then the teacher negotiated with students in order to help them to connect the old to the new language-game more purposefully. The researcher suggested that these two events in the science classroom are critical in discourse change. Third, it was illustrated that through the academic year, the way that students did the act of comparison was improved and by the end of the year more accurate causal inferences were observable in classroom communication. At the end of the study, the researcher illustrates that the application of the proposed framework resulted in an improved version of the framework. The improved version of the proposed framework is more connected to the topic of science learning, and is able to measure the change of discourse in higher resolution.

  18. Study on Influencing Factor Analysis and Application of Consumer Mobile Commerce Acceptance

    NASA Astrophysics Data System (ADS)

    Li, Gaoguang; Lv, Tingjie

    Mobile commerce (MC) refers to e-commerce activities carried out using a mobile device such as a phone or PDA. With new technology, MC will grow rapidly in the near future. At present, what factors make consumers accept MC and which MC applications are acceptable to consumers are two hot issues both for MC providers and for MC researchers. This study presents a proposed MC acceptance model that integrates perceived playfulness, perceived risk and cost into the TAM to study which factors affect consumer MC acceptance. The proposed model includes five variables, namely perceived risk, cost, perceived usefulness, perceived playfulness and perceived ease of use. The analytic hierarchy process (AHP) is then used to calculate the weights of the criteria involved in the proposed model. Finally, the study utilizes the fuzzy comprehensive evaluation method to evaluate the likelihood that MC applications will be accepted, and an MC application is empirically tested using data collected from a survey of MC consumers.
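
    As a small illustration of the AHP step, criterion weights can be taken from the principal eigenvector of a pairwise comparison matrix; the 3x3 comparison values below are invented (the study used five criteria).

```python
import numpy as np

# pairwise comparisons: usefulness vs ease of use vs perceived risk
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 0.5, 1.0]])
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()                 # normalized AHP weights, sum to 1
print(w)                        # largest weight on the first criterion
```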

  19. An Analysis of Social Justice Research in School Psychology

    ERIC Educational Resources Information Center

    Graybill, Emily; Baker, Courtney N.; Cloth, Allison H.; Fisher, Sycarah; Nastasi, Bonnie K.

    2018-01-01

    The purpose of the current content analysis was to build upon previous empirical research both within school psychology and in other subdisciplines of psychology to refine the operationalized definition of social justice within school psychology research. Operationalizing the definition and substantiating it within the empirical literature is a…

  20. University Student Satisfaction: An Empirical Analysis

    ERIC Educational Resources Information Center

    Clemes, Michael D.; Gan, Christopher E. C.; Kao, Tzu-Hui

    2008-01-01

    The purpose of this research is to gain an empirical understanding of students' overall satisfaction with their academic university experiences. A hierarchal model is used as a framework for this analysis. Fifteen hypotheses are formulated and tested, in order to identify the dimensions of service quality as perceived by university students, to…

  1. Determinants of Crime in Virginia: An Empirical Analysis

    ERIC Educational Resources Information Center

    Ali, Abdiweli M.; Peek, Willam

    2009-01-01

    This paper is an empirical analysis of the determinants of crime in Virginia. Over a dozen explanatory variables that current literature suggests as important determinants of crime are collected. The data is from 1970 to 2000. These include economic, fiscal, demographic, political, and social variables. The regression results indicate that crime…

  2. What is heartburn worth? A cost-utility analysis of management strategies.

    PubMed

    Heudebert, G R; Centor, R M; Klapow, J C; Marks, R; Johnson, L; Wilcox, C M

    2000-03-01

    To determine the best treatment strategy for the management of patients presenting with symptoms consistent with uncomplicated heartburn. We performed a cost-utility analysis of 4 alternatives: empirical proton pump inhibitor, empirical histamine2-receptor antagonist, and diagnostic strategies consisting of either esophagogastroduodenoscopy (EGD) or an upper gastrointestinal series before treatment. The time horizon of the model was 1 year. The base case analysis assumed a cohort of otherwise healthy 45-year-old individuals in a primary care practice. Empirical treatment with a proton pump inhibitor was projected to provide the greatest quality-adjusted survival for the cohort. Empirical treatment with a histamine2-receptor antagonist was projected to be the least costly of the alternatives. The marginal cost-effectiveness of using a proton pump inhibitor over a histamine2-receptor antagonist was approximately $10,400 per quality-adjusted life year (QALY) gained in the base case analysis and was less than $50,000 per QALY as long as the utility for heartburn was less than 0.95. Both diagnostic strategies were dominated by the proton pump inhibitor alternative. Empirical treatment seems to be the optimal initial management strategy for patients with heartburn, but the choice between a proton pump inhibitor or histamine2-receptor antagonist depends on the impact of heartburn on quality of life.
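
    The marginal (incremental) cost-effectiveness ratio quoted above is extra cost divided by extra QALYs when moving from one strategy to the other; the inputs below are placeholders chosen only so the ratio lands near the reported $10,400 per QALY, not figures from the study.

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    # incremental cost-effectiveness ratio, $ per QALY gained
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# hypothetical one-year costs and utilities (PPI vs H2RA)
print(icer(cost_new=1240.0, qaly_new=0.91, cost_old=1136.0, qaly_old=0.90))
# -> ~10,400 per QALY gained
```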

  3. Parameterization of aquatic ecosystem functioning and its natural variation: Hierarchical Bayesian modelling of plankton food web dynamics

    NASA Astrophysics Data System (ADS)

    Norros, Veera; Laine, Marko; Lignell, Risto; Thingstad, Frede

    2017-10-01

    Methods for extracting empirically and theoretically sound parameter values are urgently needed in aquatic ecosystem modelling to describe key flows and their variation in the system. Here, we compare three Bayesian formulations for mechanistic model parameterization that differ in their assumptions about the variation in parameter values between various datasets: 1) global analysis - no variation, 2) separate analysis - independent variation and 3) hierarchical analysis - variation arising from a shared distribution defined by hyperparameters. We tested these methods, using computer-generated and empirical data, coupled with simplified and reasonably realistic plankton food web models, respectively. While all methods were adequate, the simulated example demonstrated that a well-designed hierarchical analysis can result in the most accurate and precise parameter estimates and predictions, due to its ability to combine information across datasets. However, our results also highlighted sensitivity to hyperparameter prior distributions as an important caveat of hierarchical analysis. In the more complex empirical example, hierarchical analysis was able to combine precise identification of parameter values with reasonably good predictive performance, although the ranking of the methods was less straightforward. We conclude that hierarchical Bayesian analysis is a promising tool for identifying key ecosystem-functioning parameters and their variation from empirical datasets.
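
    A minimal sketch of the hierarchical formulation, written with the PyMC library (our choice; the authors' implementation may differ): dataset-level parameters are drawn from a shared hyper-distribution, so forcing the spread to zero recovers the global analysis while wide independent priors recover the separate analysis.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(3)
true_theta = rng.normal(1.0, 0.3, size=4)                  # one value per dataset
data = [rng.normal(t, 0.5, size=30) for t in true_theta]   # synthetic observations

with pm.Model():
    mu = pm.Normal("mu", 0.0, 10.0)        # hyperparameter: shared mean
    sigma = pm.HalfNormal("sigma", 1.0)    # hyperparameter: between-dataset spread
    theta = pm.Normal("theta", mu, sigma, shape=len(data))
    for i, y in enumerate(data):
        pm.Normal(f"y{i}", theta[i], 0.5, observed=y)
    idata = pm.sample(1000, tune=1000, progressbar=False)

print(idata.posterior["theta"].mean(dim=("chain", "draw")).values)
```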

  4. 75 FR 21287 - Empire Pipeline Inc.; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-23

    ..., Inc. (EPI), 6363 Main Street, Williamsville, New York 14221, filed in Docket No. CP10-136-000, an... utilizing EPI's existing cross-border facilities. EPI proposes no new facilities in its application. The...

  5. Impact of Inadequate Empirical Therapy on the Mortality of Patients with Bloodstream Infections: a Propensity Score-Based Analysis

    PubMed Central

    Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad

    2012-01-01

    The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
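
    The propensity score machinery used above can be sketched in a few lines: model the probability of receiving inadequate therapy from covariates, then pair each treated episode with the nearest-propensity control before comparing mortality. The synthetic covariates and 1:1 greedy matching rule are illustrative simplifications.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 800
X = rng.normal(size=(n, 3))                 # e.g. age, Pitt score, Charlson index
treated = rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))            # inadequate therapy
died = rng.random(n) < 1 / (1 + np.exp(-(0.5 * treated + X[:, 1])))

ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
controls = np.where(~treated)[0]
matches = [controls[np.argmin(np.abs(ps[controls] - ps[i]))]
           for i in np.where(treated)[0]]   # nearest control per treated case

print("treated mortality:", died[treated].mean())
print("matched control mortality:", died[np.array(matches)].mean())
```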

  6. Tunnel Detection Using Seismic Methods

    NASA Astrophysics Data System (ADS)

    Miller, R.; Park, C. B.; Xia, J.; Ivanov, J.; Steeples, D. W.; Ryden, N.; Ballard, R. F.; Llopis, J. L.; Anderson, T. S.; Moran, M. L.; Ketcham, S. A.

    2006-05-01

    Surface seismic methods have shown great promise for use in detecting clandestine tunnels in areas where unauthorized movement beneath secure boundaries has been or is a matter of concern for authorities. Unauthorized infiltration beneath national borders and into or out of secure facilities is possible at many sites by tunneling. Developments in acquisition, processing, and analysis techniques using multi-channel seismic imaging have opened the door to a vast number of near-surface applications including anomaly detection and delineation, specifically tunnels. Body waves have great potential based on modeling and very preliminary empirical studies trying to capitalize on diffracted energy. A primary limitation of all seismic energy is the natural attenuation of high-frequency energy by earth materials and the difficulty in transmitting a high-amplitude source pulse with a broad spectrum above 500 Hz into the earth. Surface waves have shown great potential since the development of multi-channel analysis methods (e.g., MASW). Both shear-wave velocity and backscatter energy from surface waves have been shown through modeling and empirical studies to have great promise in detecting the presence of anomalies, such as tunnels. Success in developing and evaluating various seismic approaches for detecting tunnels relies on investigations at known tunnel locations, in a variety of geologic settings, employing a wide range of seismic methods, and targeting a range of uniquely different tunnel geometries, characteristics, and host lithologies. Body-wave research at the Moffat tunnels in Winter Park, Colorado, provided well-defined diffraction-looking events that correlated with the subsurface location of the tunnel complex. Natural voids related to karst have been studied in Kansas, Oklahoma, Alabama, and Florida using shear-wave velocity imaging techniques based on the MASW approach. Manmade tunnels, culverts, and crawl spaces have been the target of multi-modal analysis in Kansas and California. Clandestine tunnels used for illegal entry into the U.S. from Mexico were studied at two different sites along the southern border of California. All these studies represent the empirical basis for suggesting that surface seismic methods have a significant role to play in tunnel detection and that methods are under development, and very nearly at hand, that will provide an effective tool in appraising and maintaining perimeter security. As broadband sources, gravity-coupled towed spreads, and automated analysis software continue to make advancements, so does the applicability of routine deployment of seismic imaging systems that can be operated by technicians with interpretation aids for nearly real-time target selection. Key to making these systems commercial is the development of enhanced imaging techniques in geologically noisy areas and highly variable surface terrain.

  7. Robust and transferable quantification of NMR spectral quality using IROC analysis

    NASA Astrophysics Data System (ADS)

    Zambrello, Matthew A.; Maciejewski, Mark W.; Schuyler, Adam D.; Weatherby, Gerard; Hoch, Jeffrey C.

    2017-12-01

    Non-Fourier methods are increasingly utilized in NMR spectroscopy because of their ability to handle nonuniformly-sampled data. However, non-Fourier methods present unique challenges due to their nonlinearity, which can produce nonrandom noise and render conventional metrics for spectral quality such as signal-to-noise ratio unreliable. The lack of robust and transferable metrics (i.e. applicable to methods exhibiting different nonlinearities) has hampered comparison of non-Fourier methods and nonuniform sampling schemes, preventing the identification of best practices. We describe a novel method, in situ receiver operating characteristic analysis (IROC), for characterizing spectral quality based on the Receiver Operating Characteristic curve. IROC utilizes synthetic signals added to empirical data as "ground truth", and provides several robust scalar-valued metrics for spectral quality. This approach avoids problems posed by nonlinear spectral estimates, and provides a versatile quantitative means of characterizing many aspects of spectral quality. We demonstrate applications to parameter optimization in Fourier and non-Fourier spectral estimation, critical comparison of different methods for spectrum analysis, and optimization of nonuniform sampling schemes. The approach will accelerate the discovery of optimal approaches to nonuniform sampling experiment design and non-Fourier spectrum analysis for multidimensional NMR.
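
    The core of the IROC idea, stripped to essentials, is that injected synthetic signals supply known truth against which a ROC curve can be traced; the sketch below does this for a toy "spectrum". Peak shapes, counts, and the AUC summary are illustrative choices, not the authors' exact metrics.

```python
import numpy as np

rng = np.random.default_rng(8)
spectrum = rng.normal(0, 1.0, 2048)              # stand-in noise floor
truth = np.zeros(2048, dtype=bool)
for pos in rng.choice(2048, 20, replace=False):  # inject 20 synthetic signals
    spectrum[pos] += 5.0
    truth[pos] = True

order = np.argsort(spectrum)[::-1]               # sweep threshold high to low
tpr = np.cumsum(truth[order]) / truth.sum()
fpr = np.cumsum(~truth[order]) / (~truth).sum()
print(f"AUC = {np.trapz(tpr, fpr):.3f}")         # near 1 for clean detection
```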

  8. Mid-infrared absorption spectroscopy using quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Haibach, Fred; Erlich, Adam; Deutsch, Erik

    2011-06-01

    Block Engineering has developed an absorption spectroscopy system based on widely tunable Quantum Cascade Lasers (QCL). The QCL spectrometer rapidly cycles through a user-selected range in the mid-infrared spectrum, between 6 to 12 μm (1667 to 833 cm-1), to detect and identify substances on surfaces based on their absorption characteristics from a standoff distance of up to 2 feet with an eye-safe laser. It can also analyze vapors and liquids in a single device. For military applications, the QCL spectrometer has demonstrated trace explosive, chemical warfare agent (CWA), and toxic industrial chemical (TIC) detection and analysis. The QCL's higher power density enables measurements from diffuse and highly absorbing materials and substrates. Other advantages over Fourier Transform Infrared (FTIR) spectroscopy include portability, ruggedness, rapid analysis, and the ability to function from a distance through free space or a fiber optic probe. This paper will discuss the basic technology behind the system and the empirical data on various safety and security applications.

  9. edgeR: a Bioconductor package for differential expression analysis of digital gene expression data.

    PubMed

    Robinson, Mark D; McCarthy, Davis J; Smyth, Gordon K

    2010-01-01

    It is expected that emerging digital gene expression (DGE) technologies will overtake microarray technologies in the near future for many functional genomics applications. One of the fundamental data analysis tasks, especially for gene expression studies, involves determining whether there is evidence that counts for a transcript or exon are significantly different across experimental conditions. edgeR is a Bioconductor software package for examining differential expression of replicated count data. An overdispersed Poisson model is used to account for both biological and technical variability. Empirical Bayes methods are used to moderate the degree of overdispersion across transcripts, improving the reliability of inference. The methodology can be used even with the most minimal levels of replication, provided at least one phenotype or experimental condition is replicated. The software may have other applications beyond sequencing data, such as proteome peptide count data. The package is freely available under the LGPL licence from the Bioconductor web site (http://bioconductor.org).
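
    edgeR itself is an R/Bioconductor package; the fragment below only sketches, in Python, the underlying overdispersed Poisson idea: counts follow a negative binomial whose variance exceeds the mean by a dispersion term, var = mu + phi*mu^2. The parameter values are invented.

```python
from scipy import stats

mu, phi = 100.0, 0.2          # mean count and dispersion
r = 1.0 / phi                 # negative binomial size parameter
p = r / (r + mu)              # success probability giving mean mu
nb = stats.nbinom(r, p)

print(nb.mean(), nb.var())    # 100.0 and 100 + 0.2*100**2 = 2100.0
print(nb.sf(150))             # probability of observing more than 150 reads
```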

  10. Fractal scaling in bottlenose dolphin (Tursiops truncatus) echolocation: A case study

    NASA Astrophysics Data System (ADS)

    Perisho, Shaun T.; Kelty-Stephen, Damian G.; Hajnal, Alen; Houser, Dorian; Kuczaj, Stan A., II

    2016-02-01

    Fractal scaling patterns, which entail a power-law relationship between magnitude of fluctuations in a variable and the scale at which the variable is measured, have been found in many aspects of human behavior. These findings have led to advances in behavioral models (e.g. providing empirical support for cascade-driven theories of cognition) and have had practical medical applications (e.g. providing new methods for early diagnosis of medical conditions). In the present paper, fractal analysis is used to investigate whether similar fractal scaling patterns exist in inter-click interval and peak-peak amplitude measurements of bottlenose dolphin click trains. Several echolocation recordings taken from two male bottlenose dolphins were analyzed using Detrended Fluctuation Analysis and Higuchi's (1988) method for determination of fractal dimension. Both animals were found to exhibit fractal scaling patterns near what is consistent with persistent long range correlations. These findings suggest that recent advances in human cognition and medicine may have important parallel applications to echolocation as well.
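
    Detrended fluctuation analysis, one of the two methods used above, fits compactly in code: integrate the demeaned series, detrend it in windows of increasing size, and read the scaling exponent off the log-log slope. Window sizes and the white-noise test signal are arbitrary choices.

```python
import numpy as np

def dfa(x, scales):
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    fluct = []
    for s in scales:
        rms = []
        for k in range(len(y) // s):
            seg = y[k * s:(k + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            rms.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(rms)))
    # scaling exponent alpha: slope of log F(s) vs log s
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

x = np.random.default_rng(5).normal(size=4096)
print(dfa(x, scales=[16, 32, 64, 128, 256]))       # ~0.5 for white noise
```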

  11. Higher order statistical moment application for solar PV potential analysis

    NASA Astrophysics Data System (ADS)

    Basri, Mohd Juhari Mat; Abdullah, Samizee; Azrulhisham, Engku Ahmad; Harun, Khairulezuan

    2016-10-01

    Solar photovoltaic energy could serve as an alternative to fossil fuels, which are being depleted and contribute to global warming. However, this renewable resource is highly variable and intermittent, so knowledge of the energy potential of a site is very important before building a solar photovoltaic power generation system there. Here, the application of a higher-order statistical moment model is analyzed using data collected from a 5 MW grid-connected photovoltaic system. Because the skewness and kurtosis of the AC power and solar irradiance distributions of the solar farm change dynamically, the Pearson system, in which a probability distribution is selected by matching its theoretical moments to the empirical moments of the data, is well suited for this purpose. Taking advantage of the Pearson system implementation in MATLAB, software has been developed to support data processing, distribution fitting, and potential analysis for future projection of the amount of available AC power and solar irradiance.
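    The moment-matching step can be sketched as follows; SciPy exposes only the type III member of the Pearson system (MATLAB's pearsrnd covers the full family), so this is an illustrative approximation, and the variable name power is assumed.

```python
import numpy as np
from scipy import stats

def fit_pearson3_by_moments(power):
    """Freeze a Pearson type III distribution whose first three moments
    match the sample mean, standard deviation, and skewness."""
    m = np.mean(power)
    s = np.std(power, ddof=1)
    g = stats.skew(power)
    return stats.pearson3(skew=g, loc=m, scale=s)

# e.g. probability that AC power exceeds a threshold p_thr:
# dist = fit_pearson3_by_moments(ac_power); 1 - dist.cdf(p_thr)
```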

  12. Tolerant compressed sensing with partially coherent sensing matrices

    NASA Astrophysics Data System (ADS)

    Birnbaum, Tobias; Eldar, Yonina C.; Needell, Deanna

    2017-08-01

    Most compressed sensing (CS) theory to date focuses on incoherent sensing, that is, on sensing matrices whose columns are highly uncorrelated. However, sensing systems with naturally occurring correlations arise in many applications, such as signal detection, motion detection, and radar. Moreover, in these applications it is often not necessary to know the support of the signal exactly; instead, small errors in the support and signal are tolerable. Despite the abundance of work utilizing incoherent sensing matrices, for this type of tolerant recovery we suggest that coherence is actually beneficial. We promote the use of coherent sampling when tolerant support recovery is acceptable, and demonstrate its advantages empirically. In addition, we provide a first step towards theoretical analysis by considering a specific reconstruction method for selected signal classes.
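    A small sketch of the tolerant-recovery idea under an assumed coherent matrix, using orthogonal matching pursuit as a stand-in reconstruction method (the paper's specific method is not reproduced here); the correlation construction and the tolerance of two indices are illustrative choices.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 64, 256, 5
base = rng.standard_normal((n, m))
A = base + 0.9 * np.roll(base, 1, axis=1)   # correlated neighbouring columns
A /= np.linalg.norm(A, axis=0)

support = rng.choice(m, size=k, replace=False)
x = np.zeros(m)
x[support] = rng.standard_normal(k)
y = A @ x

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False).fit(A, y)
est = np.flatnonzero(omp.coef_)
# "Tolerant" criterion: an estimate counts if within 2 indices of the truth
hits = sum(any(abs(int(e) - int(s)) <= 2 for s in support) for e in est)
print(f"{hits}/{k} support elements recovered within tolerance")
```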

  13. Reflective equilibrium and empirical data: third person moral experiences in empirical medical ethics.

    PubMed

    De Vries, Martine; Van Leeuwen, Evert

    2010-11-01

    In ethics, the use of empirical data has become more and more popular, leading to a distinct form of applied ethics, namely empirical ethics. This 'empirical turn' is especially visible in bioethics. There are various ways of combining empirical research and ethical reflection. In this paper we discuss the use of empirical data in a special form of Reflective Equilibrium (RE), namely the Network Model with Third Person Moral Experiences. In this model, the empirical data consist of the moral experiences of people in a practice. Although inclusion of these moral experiences in this specific model of RE can be well defended, their use in the application of the model still raises important questions. What precisely are moral experiences? How to determine relevance of experiences, in other words: should there be a selection of the moral experiences that are eventually used in the RE? How much weight should the empirical data have in the RE? And the key question: can the use of RE by empirical ethicists really produce answers to practical moral questions? In this paper we start to answer the above questions by giving examples taken from our research project on understanding the norm of informed consent in the field of pediatric oncology. We especially emphasize that incorporation of empirical data in a network model can reduce the risk of self-justification and bias and can increase the credibility of the RE reached. © 2009 Blackwell Publishing Ltd.

  14. A New Method for Nonlinear and Nonstationary Time Series Analysis and Its Application to the Earthquake and Building Response Records

    NASA Technical Reports Server (NTRS)

    Huang, Norden E.

    1999-01-01

    A new method for analyzing nonlinear and nonstationary data has been developed. The key part of the method is the Empirical Mode Decomposition method, with which any complicated data set can be decomposed into a finite and often small number of Intrinsic Mode Functions (IMF). An IMF is defined as any function having the same number of zero-crossings and extrema, and also having symmetric envelopes defined by the local maxima and minima, respectively. The IMF also admits a well-behaved Hilbert transform. This decomposition method is adaptive and, therefore, highly efficient. Since the decomposition is based on the local characteristic time scale of the data, it is applicable to nonlinear and nonstationary processes. With the Hilbert transform, the Intrinsic Mode Functions yield instantaneous frequencies as functions of time that give sharp identifications of imbedded structures. The final presentation of the results is an energy-frequency-time distribution, designated as the Hilbert Spectrum. An example of the application of this method to earthquake and building response records is given. The results indicate that low-frequency components totally missed by Fourier analysis are clearly identified by the new method. Comparisons with wavelet and windowed Fourier analysis show the new method offers much better temporal and frequency resolutions.
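    A minimal sketch of the two core ingredients, one sifting pass and the Hilbert-derived instantaneous frequency; boundary handling, the stopping criterion, and the iteration to a full set of IMFs are omitted, so this is an outline of the technique rather than the author's implementation.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import find_peaks, hilbert

def sift_once(x, t):
    """One sifting pass: subtract the mean of cubic-spline envelopes
    through the local maxima and minima (boundary effects ignored;
    assumes the signal has several extrema)."""
    x, t = np.asarray(x, float), np.asarray(t, float)
    peaks, _ = find_peaks(x)
    troughs, _ = find_peaks(-x)
    upper = CubicSpline(t[peaks], x[peaks])(t)
    lower = CubicSpline(t[troughs], x[troughs])(t)
    return x - 0.5 * (upper + lower)

def instantaneous_frequency(imf, dt):
    """Hilbert-derived instantaneous frequency of one IMF (Hz)."""
    phase = np.unwrap(np.angle(hilbert(imf)))
    return np.gradient(phase, dt) / (2.0 * np.pi)
```

    In the full method the sifting pass is iterated until the candidate satisfies the IMF conditions; the accepted IMF is then subtracted and the process repeats on the remainder.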

  15. An update on the "empirical turn" in bioethics: analysis of empirical research in nine bioethics journals.

    PubMed

    Wangmo, Tenzin; Hauri, Sirin; Gennet, Eloise; Anane-Sarpong, Evelyn; Provoost, Veerle; Elger, Bernice S

    2018-02-07

    A review of literature published a decade ago noted a significant increase in empirical papers across nine bioethics journals. This study provides an update on the presence of empirical papers in the same nine journals. It first evaluates whether the empirical trend is continuing as noted in the previous study, and second, how it is changing, that is, what are the characteristics of the empirical works published in these nine bioethics journals. A review of the same nine journals (Bioethics; Journal of Medical Ethics; Journal of Clinical Ethics; Nursing Ethics; Cambridge Quarterly of Healthcare Ethics; Hastings Center Report; Theoretical Medicine and Bioethics; Christian Bioethics; and Kennedy Institute of Ethics Journal) was conducted for a 12-year period from 2004 to 2015. Data obtained were analysed descriptively and using a non-parametric Chi-square test. Of the total number of original papers (N = 5567) published in the nine bioethics journals, 18.1% (n = 1007) collected and analysed empirical data. Journal of Medical Ethics and Nursing Ethics led the empirical publications, accounting for 89.4% of all empirical papers. The former published significantly more quantitative papers than qualitative, whereas the latter published more qualitative papers. Our analysis reveals no significant difference (χ² = 2.857; p = 0.091) between the proportion of empirical papers published in 2004-2009 and 2010-2015. However, the increasing empirical trend has continued in these journals, with the proportion of empirical papers rising from 14.9% in 2004 to 17.8% in 2015. This study presents the current state of affairs regarding empirical research published in nine bioethics journals. In the quarter century of data that is available about the nine bioethics journals studied in two reviews, the proportion of empirical publications continues to increase, signifying a trend towards empirical research in bioethics. The growing volume is mainly attributable to two journals: Journal of Medical Ethics and Nursing Ethics. This descriptive study further maps the still-developing field of empirical research in bioethics. Additional studies are needed to completely map the nature and extent of empirical research in bioethics to inform the ongoing debate about the value of empirical research for bioethics.

  16. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    ERIC Educational Resources Information Center

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  17. GetReal in network meta-analysis: a review of the methodology.

    PubMed

    Efthimiou, Orestis; Debray, Thomas P A; van Valkenhoef, Gert; Trelle, Sven; Panayidou, Klea; Moons, Karel G M; Reitsma, Johannes B; Shang, Aijing; Salanti, Georgia

    2016-09-01

    Pairwise meta-analysis is an established statistical tool for synthesizing evidence from multiple trials, but it is informative only about the relative efficacy of two specific interventions. The usefulness of pairwise meta-analysis is thus limited in real-life medical practice, where many competing interventions may be available for a certain condition and studies informing some of the pairwise comparisons may be lacking. This commonly encountered scenario has led to the development of network meta-analysis (NMA). In the last decade, several applications, methodological developments, and empirical studies in NMA have been published, and the area is thriving as its relevance to public health is increasingly recognized. This article presents a review of the relevant literature on NMA methodology aiming to pinpoint the developments that have appeared in the field. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Ultrahigh-Dimensional Multiclass Linear Discriminant Analysis by Pairwise Sure Independence Screening

    PubMed Central

    Pan, Rui; Wang, Hansheng; Li, Runze

    2016-01-01

    This paper is concerned with the problem of feature screening for multi-class linear discriminant analysis in an ultrahigh-dimensional setting. We allow the number of classes to be relatively large. As a result, the total number of relevant features is larger than usual. This makes the related classification problem much more challenging than the conventional one, where the number of classes is small (very often two). To solve the problem, we propose a novel pairwise sure independence screening method for linear discriminant analysis with an ultrahigh dimensional predictor. The proposed procedure is directly applicable to the situation with many classes. We further prove that the proposed method is screening consistent. Simulation studies are conducted to assess the finite sample performance of the new procedure. We also demonstrate the proposed methodology via an empirical analysis of a real life example on handwritten Chinese character recognition. PMID:28127109
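    A toy version of the screening idea, ranking features by their largest standardized pairwise mean difference across class pairs; this illustrates the pairwise principle but is not the paper's exact statistic.

```python
import numpy as np
from itertools import combinations

def pairwise_screening_scores(X, y):
    """Score each feature by its largest standardized mean difference
    over all class pairs; keep the top-ranked features for LDA."""
    X, y = np.asarray(X, float), np.asarray(y)
    scores = np.zeros(X.shape[1])
    for a, b in combinations(np.unique(y), 2):
        Xa, Xb = X[y == a], X[y == b]
        se = np.sqrt(Xa.var(axis=0, ddof=1) / len(Xa)
                     + Xb.var(axis=0, ddof=1) / len(Xb))
        t = np.abs(Xa.mean(axis=0) - Xb.mean(axis=0)) / (se + 1e-12)
        scores = np.maximum(scores, t)   # worst-case pair dominates
    return scores
```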

  19. Automotive Gas Turbine Power System-Performance Analysis Code

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    1997-01-01

    An open cycle gas turbine numerical modelling code suitable for thermodynamic performance analysis (i.e., thermal efficiency, specific fuel consumption, cycle state points, working fluid flowrates, etc.) of automotive and aircraft powerplant applications has been generated at the NASA Lewis Research Center's Power Technology Division. This code can be made available to automotive gas turbine preliminary design efforts, either in its present version or, assuming that resources can be obtained to incorporate empirical models for component weight and packaging volume, in a later version that includes the weight-volume estimator feature. The paper contains a brief discussion of the capabilities of the presently operational version of the code, including a listing of input and output parameters and actual sample output listings.

  20. Wavelet analysis for wind fields estimation.

    PubMed

    Leite, Gladeston C; Ushizima, Daniela M; Medeiros, Fátima N S; de Lima, Gilson G

    2010-01-01

    Wind field analysis from synthetic aperture radar images allows the estimation of wind direction and speed based on image descriptors. In this paper, we propose a framework to automate wind direction retrieval based on wavelet decomposition associated with spectral processing. We extend existing undecimated wavelet transform approaches by including the à trous transform with a B₃ spline scaling function, in addition to other wavelet bases such as Gabor and Mexican hat. The purpose is to extract more reliable directional information when wind speed values range from 5 to 10 m s⁻¹. Using C-band empirical models associated with the estimated directional information, we calculate local wind speed values and compare our results with QuikSCAT scatterometer data. The proposed approach has potential application in the evaluation of oil spills and wind farms.

  1. Image quality enhancement for skin cancer optical diagnostics

    NASA Astrophysics Data System (ADS)

    Bliznuks, Dmitrijs; Kuzmina, Ilona; Bolocko, Katrina; Lihachev, Alexey

    2017-12-01

    This research presents image quality analysis and enhancement proposals in the biophotonics area. The sources of image problems are reviewed and analyzed, focusing on those with the greatest impact on a specific biophotonic task: skin cancer diagnostics. The results point out that the main problem for skin cancer analysis is uneven skin illumination. Since illumination problems often cannot be prevented, the paper proposes an image post-processing algorithm based on low-frequency filtering. Practical results show improved diagnostic results after applying the proposed filter; moreover, the filter does not reduce diagnostic quality for images without illumination defects. The current filtering algorithm requires empirical tuning of the filter parameters. Further work is needed to test the algorithm in other biophotonic applications and to propose automatic filter parameter selection.

  2. Interdependence and contagion among industry-level US credit markets: An application of wavelet and VMD based copula approaches

    NASA Astrophysics Data System (ADS)

    Shahzad, Syed Jawad Hussain; Nor, Safwan Mohd; Kumar, Ronald Ravinesh; Mensi, Walid

    2017-01-01

    This study examines the interdependence and contagion among US industry-level credit markets. We use daily data for 11 industries from 17 December 2007 to 31 December 2014 for time-frequency analysis, namely wavelet squared coherence. The empirical analysis reveals that the Basic Materials (Utilities) industry credit market has the highest (lowest) interdependence with other industries. The Basic Materials credit market passes cyclical effects to all other industries. "Shift-contagion", as defined by Forbes and Rigobon (2002), is examined using elliptical and Archimedean copulas on the short-run decomposed series obtained through Variational Mode Decomposition (VMD). The contagion effects between US industry-level credit markets mainly occurred during the global financial crisis of 2007-08.

  3. A Tool for Empirical Forecasting of Major Flares, Coronal Mass Ejections, and Solar Particle Events from a Proxy of Active-Region Free Magnetic Energy

    NASA Technical Reports Server (NTRS)

    Barghouty, A. F.; Falconer, D. A.; Adams, J. H., Jr.

    2010-01-01

    This presentation describes a new forecasting tool developed for, and currently being tested by, NASA's Space Radiation Analysis Group (SRAG) at JSC, which is responsible for the monitoring and forecasting of radiation exposure levels of astronauts. The new software tool is designed for the empirical forecasting of M- and X-class flares, coronal mass ejections, and solar energetic particle events. Its algorithm is based on an empirical relationship between the rates of the various types of events and a proxy of the active region's free magnetic energy, determined from a data set of approx. 40,000 active-region magnetograms from approx. 1,300 active regions observed by SOHO/MDI that have known histories of flare, coronal mass ejection, and solar energetic particle event production. The new tool automatically extracts each strong-field magnetic area from an MDI full-disk magnetogram, identifies each as an NOAA active region, and measures a proxy of the active region's free magnetic energy from the extracted magnetogram. For each active region, the empirical relationship is then used to convert the free magnetic energy proxy into an expected event rate. The expected event rate in turn can be readily converted into the probability that the active region will produce such an event in a given forward time window. Descriptions of the datasets, algorithm, and software, in addition to sample applications and a validation test, are presented. Further development and transition of the new tool in anticipation of SDO/HMI is briefly discussed.
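    The rate-to-probability conversion mentioned above is consistent with assuming Poisson-distributed events; the abstract does not state the exact formula, so the following is a hedged sketch.

```python
import math

def event_probability(rate_per_day, window_days):
    """P(at least one event in the forward window), assuming events
    arrive as a Poisson process with the given expected rate."""
    return 1.0 - math.exp(-rate_per_day * window_days)

# e.g. an active region with 0.3 expected flares/day over a 2-day window:
# event_probability(0.3, 2.0) -> ~0.45
```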

  4. High throughput film dosimetry in homogeneous and heterogeneous media for a small animal irradiator

    PubMed Central

    Wack, L.; Ngwa, W.; Tryggestad, E.; Tsiamas, P.; Berbeco, R.; Ng, S.K.; Hesser, J.

    2013-01-01

    Purpose We have established a high-throughput Gafchromic film dosimetry protocol for narrow kilo-voltage beams in homogeneous and heterogeneous media for small-animal radiotherapy applications. The kV beam characterization is based on extensive Gafchromic film dosimetry data acquired in homogeneous and heterogeneous media. An empirical model is used for parameterization of depth and off-axis dependence of measured data. Methods We have modified previously published methods of film dosimetry to suit the specific tasks of the study. Unlike film protocols used in previous studies, our protocol employs simultaneous multichannel scanning and analysis of up to nine Gafchromic films per scan. A scanner and background correction were implemented to improve accuracy of the measurements. Measurements were taken in homogeneous and inhomogeneous phantoms at 220 kVp and a field size of 5 × 5 mm2. The results were compared against Monte Carlo simulations. Results Dose differences caused by variations in background signal were effectively removed by the corrections applied. Measurements in homogeneous phantoms were used to empirically characterize beam data in homogeneous and heterogeneous media. Film measurements in inhomogeneous phantoms and their empirical parameterization differed by about 2%–3%. The model differed from MC by about 1% (water, lung) to 7% (bone). Good agreement was found for measured and modelled off-axis ratios. Conclusions EBT2 films are a valuable tool for characterization of narrow kV beams, though care must be taken to eliminate disturbances caused by varying background signals. The usefulness of the empirical beam model in interpretation and parameterization of film data was demonstrated. PMID:23510532

  5. Regionally Adaptable Ground Motion Prediction Equation (GMPE) from Empirical Models of Fourier and Duration of Ground Motion

    NASA Astrophysics Data System (ADS)

    Bora, Sanjay; Scherbaum, Frank; Kuehn, Nicolas; Stafford, Peter; Edwards, Benjamin

    2016-04-01

    The current practice of deriving empirical ground motion prediction equations (GMPEs) involves using ground motions recorded at multiple sites. However, in applications such as site-specific (e.g., critical facility) hazard analysis, ground motions obtained from GMPEs need to be adjusted/corrected to the particular site or site condition under investigation. This study presents a complete framework for developing a response spectral GMPE within which the issue of adjustment of ground motions is addressed in a manner consistent with the linear system framework. The present approach is a two-step process in which the first step consists of deriving two separate empirical models, one for Fourier amplitude spectra (FAS) and the other for a random vibration theory (RVT) optimized duration (D_rvto) of ground motion. In the second step the two models are combined within the RVT framework to obtain full response spectral amplitudes. Additionally, the framework involves a stochastic-model-based extrapolation of individual Fourier spectra to extend the usable frequency limit of the empirically derived FAS model. The stochastic model parameters were determined by inverting the Fourier spectral data using an approach similar to the one described in Edwards and Faeh (2013). Comparison of median predicted response spectra from the present approach with those from other regional GMPEs indicates that the present approach can also be used as a stand-alone model. The dataset used for the presented analysis is a subset of the recently compiled database RESORCE-2012 covering Europe, the Middle East, and the Mediterranean region.

  6. Estimating connectivity in marine populations: an empirical evaluation of assignment tests and parentage analysis under different gene flow scenarios.

    PubMed

    Saenz-Agudelo, P; Jones, G P; Thorrold, S R; Planes, S

    2009-04-01

    The application of spatially explicit models of population dynamics to fisheries management and the design of marine reserve networks has been limited by a lack of empirical estimates of larval dispersal. Here we compared assignment tests and parentage analysis for examining larval retention and connectivity under two different gene flow scenarios using panda clownfish (Amphiprion polymnus) in Papua New Guinea. A metapopulation of panda clownfish in Bootless Bay with little or no genetic differentiation among five spatially discrete locations separated by 2-6 km provided the high gene flow scenario. The low gene flow scenario compared the Bootless Bay metapopulation with a genetically distinct population (F_ST = 0.1) located at Schumann Island, New Britain, 1500 km to the northeast. We used assignment tests and parentage analysis based on microsatellite DNA data to identify natal origins of 177 juveniles in Bootless Bay and 73 juveniles at Schumann Island. At low rates of gene flow, assignment tests correctly classified juveniles to their source population. On the other hand, parentage analysis led to an overestimate of self-recruitment within the two populations due to the significant deviation from panmixia when both populations were pooled. At high gene flow (within Bootless Bay), assignment tests underestimated self-recruitment and connectivity among subpopulations, and grossly overestimated self-recruitment within the overall metapopulation. However, the assignment tests did identify immigrants from distant (genetically distinct) populations. Parentage analysis clearly provided the most accurate estimates of connectivity in situations of high gene flow.

  7. An Empirical Study on the Application of Cooperative Learning to Comprehensive English Classes in a Chinese Independent College

    ERIC Educational Resources Information Center

    Meng, Ji

    2017-01-01

    This research investigated a comparison between the effect of cooperative learning and lecture teaching on Comprehensive English classes in a Chinese Independent College. An empirical study for two semesters was carried out in the forms of pretest, posttest, questionnaire and interviews. While control class was taught in the conventional way,…

  8. Effects of Work Environment on Transfer of Training: Empirical Evidence from Master of Business Administration Programs in Vietnam

    ERIC Educational Resources Information Center

    Pham, Nga T. P.; Segers, Mien S. R.; Gijselaers, Wim H.

    2013-01-01

    Practical application of newly gained knowledge and skills, also referred to as transfer of training, is an issue of great concern in training issues generally and in Master of Business Administration (MBA) programs particularly. This empirical study examined the influence of the trainees' work environment on their transfer of training, taking…

  9. Some special features of Wigner’s mass formula for nuclei

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nurmukhamedov, A. M., E-mail: fattah52@mail.ru

    2014-12-15

    Experimental data on anomalous values of the empirical function b(A) in Wigner’s mass formula are presented, the application of Student’s t criterion in experimentally proving the restoration of Wigner’s SU(4) symmetry in nuclei is validated, and a physical interpretation of the basic parameter of the empirical function a(A) in Wigner’s mass formula is given.

  10. An Empirical Method for Deriving Grade Equivalence for University Entrance Qualifications: An Application to A Levels and the International Baccalaureate

    ERIC Educational Resources Information Center

    Green, Francis; Vignoles, Anna

    2012-01-01

    We present a method to compare different qualifications for entry to higher education by studying students' subsequent performance. Using this method for students holding either the International Baccalaureate (IB) or A-levels gaining their degrees in 2010, we estimate an "empirical" equivalence scale between IB grade points and UCAS…

  11. Test/semi-empirical analysis of a carbon/epoxy fabric stiffened panel

    NASA Technical Reports Server (NTRS)

    Spier, E. E.; Anderson, J. A.

    1990-01-01

    The purpose of this work-in-progress is to present a semi-empirical analysis method developed to predict the buckling and crippling loads of carbon/epoxy fabric blade-stiffened panels in compression. This is a hand analysis method composed of well-known, accepted techniques, logical engineering judgements, and experimental data that results in conservative solutions. In order to verify this method, a stiffened panel was fabricated and tested. Both the test and analysis results are presented.

  12. Concept Analysis of Spirituality: An Evolutionary Approach.

    PubMed

    Weathers, Elizabeth; McCarthy, Geraldine; Coffey, Alice

    2016-04-01

    The aim of this article is to clarify the concept of spirituality for future nursing research. Previous concept analyses of spirituality have mostly reviewed the conceptual literature with little consideration of the empirical literature. The literature reviewed in prior concept analyses extends from 1972 to 2005, with no analysis conducted in the past 9 years. Rodgers' evolutionary framework was used to review both the theoretical and empirical literature pertaining to spirituality. Evolutionary concept analysis is a formal method of philosophical inquiry, in which papers are analyzed to identify attributes, antecedents, and consequences of the concept. Empirical and conceptual literature. Three defining attributes of spirituality were identified: connectedness, transcendence, and meaning in life. A conceptual definition of spirituality was proposed based on the findings. Also, four antecedents and five primary consequences of spirituality were identified. Spirituality is a complex concept. This concept analysis adds some clarification by proposing a definition of spirituality that is underpinned by both conceptual and empirical research. Furthermore, exemplars of spirituality, based on prior qualitative research, are presented to support the findings. Hence, the findings of this analysis could guide future nursing research on spirituality. © 2015 Wiley Periodicals, Inc.

  13. A Conversation Analysis-Informed Test of L2 Aural Pragmatic Comprehension

    ERIC Educational Resources Information Center

    Walters, F. Scott

    2009-01-01

    Speech act theory-based, second language pragmatics testing (SLPT) raises test-validation issues owing to a lack of correspondence with empirical conversational data. On the assumption that conversation analysis (CA) provides a more accurate account of language use, it is suggested that CA serve as a more empirically valid basis for SLPT…

  14. Globalization, Development and International Migration: A Cross-National Analysis of Less-Developed Countries, 1970-2000

    ERIC Educational Resources Information Center

    Sanderson, Matthew R.; Kentor, Jeffrey D.

    2009-01-01

    It is widely argued that globalization and economic development are associated with international migration. However, these relationships have not been tested empirically. We use a cross-national empirical analysis to assess the impact of global and national factors on international migration from less-developed countries. An interdisciplinary…

  15. Competences in Romanian Higher Education--An Empirical Investigation for the Business Sector

    ERIC Educational Resources Information Center

    Deaconu, Adela; Nistor, Cristina Silvia

    2017-01-01

    This research study particularizes the general descriptions of the European Qualifications Framework for Lifelong Learning, as compiled and developed within the Romanian qualification framework, to the business and economics field in general and to the property economic analysis and valuation field in particular. By means of an empirical analysis,…

  16. Threats and Aggression Directed at Soccer Referees: An Empirical Phenomenological Psychological Study

    ERIC Educational Resources Information Center

    Friman, Margareta; Nyberg, Claes; Norlander, Torsten

    2004-01-01

    A descriptive qualitative analysis of in-depth interviews involving seven provincial Soccer Association referees was carried out in order to find out how referees experience threats and aggression directed to soccer referees. The Empirical Phenomenological Psychological method (EPP-method) was used. The analysis resulted in thirty categories which…

  17. A Social Judgment Analysis of Information Source Preference Profiles: An Exploratory Study to Empirically Represent Media Selection Patterns.

    ERIC Educational Resources Information Center

    Stefl-Mabry, Joette

    2003-01-01

    Describes a study that empirically identified individual preferences profiles to understand information-seeking behavior among professional groups for six selected information sources. Highlights include Social Judgment Analysis; the development of the survey used, a copy of which is appended; hypotheses tested; results of multiple regression…

  18. Critical Access Hospitals and Retail Activity: An Empirical Analysis in Oklahoma

    ERIC Educational Resources Information Center

    Brooks, Lara; Whitacre, Brian E.

    2011-01-01

    Purpose: This paper takes an empirical approach to determining the effect that a critical access hospital (CAH) has on local retail activity. Previous research on the relationship between hospitals and economic development has primarily focused on single-case, multiplier-oriented analysis. However, as the efficacy of federal and state-level rural…

  19. Steering the Ship through Uncertain Waters: Empirical Analysis and the Future of Evangelical Higher Education

    ERIC Educational Resources Information Center

    Rine, P. Jesse; Guthrie, David S.

    2016-01-01

    Leaders of evangelical Christian colleges must navigate a challenging environment shaped by public concern about college costs and educational quality, federal inclinations toward increased regulation, and lingering fallout from the Great Recession. Proceeding from the premise that empirical analysis empowers institutional actors to lead well in…

  20. Deriving Multidimensional Poverty Indicators: Methodological Issues and an Empirical Analysis for Italy

    ERIC Educational Resources Information Center

    Coromaldi, Manuela; Zoli, Mariangela

    2012-01-01

    Theoretical and empirical studies have recently adopted a multidimensional concept of poverty. There is considerable debate about the most appropriate degree of multidimensionality to retain in the analysis. In this work we add to the received literature in two ways. First, we derive indicators of multiple deprivation by applying a particular…

  1. Extended Analysis of Empirical Citations with Skinner's "Verbal Behavior": 1984-2004

    ERIC Educational Resources Information Center

    Dixon, Mark R.; Small, Stacey L.; Rosales, Rocio

    2007-01-01

    The present paper comments on and extends the citation analysis of verbal operant publications based on Skinner's "Verbal Behavior" (1957) by Dymond, O'Hora, Whelan, and O'Donovan (2006). Variations in population parameters were evaluated for only those studies that Dymond et al. categorized as empirical. Preliminary results indicate that the…

  2. On the prediction of auto-rotational characteristics of light airplane fuselages

    NASA Technical Reports Server (NTRS)

    Pamadi, B. N.; Taylor, L. W., Jr.

    1984-01-01

    A semi-empirical theory is presented for the estimation of aerodynamic forces and moments acting on a steadily rotating (spinning) airplane fuselage, with a particular emphasis on the prediction of its auto-rotational behavior. This approach is based on an extension of the available analytical methods for high angle of attack and side-slip and then coupling this procedure with strip theory for application to a rotating airplane fuselage. The analysis is applied to the fuselage of a light general aviation airplane and the results are shown to be in fair agreement with experimental data.

  3. Application of Information-Theoretic Data Mining Techniques in a National Ambulatory Practice Outcomes Research Network

    PubMed Central

    Wright, Adam; Ricciardi, Thomas N.; Zwick, Martin

    2005-01-01

    The Medical Quality Improvement Consortium data warehouse contains de-identified data on more than 3.6 million patients including their problem lists, test results, procedures and medication lists. This study uses reconstructability analysis, an information-theoretic data mining technique, on the MQIC data warehouse to empirically identify risk factors for various complications of diabetes including myocardial infarction and microalbuminuria. The risk factors identified match those risk factors identified in the literature, demonstrating the utility of the MQIC data warehouse for outcomes research, and RA as a technique for mining clinical data warehouses. PMID:16779156

  4. Irregularity, volatility, risk, and financial market time series

    PubMed Central

    Pincus, Steve; Kalman, Rudolf E.

    2004-01-01

    The need to assess subtle, potentially exploitable changes in serial structure is paramount in the analysis of financial data. Herein, we demonstrate the utility of approximate entropy (ApEn), a model-independent measure of sequential irregularity, toward this goal, by several distinct applications. We consider both empirical data and models, including composite indices (Standard and Poor's 500 and Hang Seng), individual stock prices, the random-walk hypothesis, and the Black–Scholes and fractional Brownian motion models. Notably, ApEn appears to be a potentially useful marker of system stability, with rapid increases possibly foreshadowing significant changes in a financial variable. PMID:15358860
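    For reference, a compact implementation of ApEn following Pincus's original formulation; the defaults m = 2 and r = 0.2·std are common literature choices, not values taken from this paper.

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """ApEn(m, r) = phi(m) - phi(m+1), self-matches included."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)   # a common literature default

    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])   # template vectors
        # Chebyshev distances between all pairs of templates
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return np.mean(np.log(np.mean(d <= r, axis=1)))

    return phi(m) - phi(m + 1)
```

    Higher ApEn indicates greater sequential irregularity; the paper's use as a stability marker rests on tracking rapid increases over time.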

  5. Welfare Gains from Financial Liberalization

    PubMed Central

    Townsend, Robert M.; Ueda, Kenichi

    2010-01-01

    Financial liberalization has been a controversial issue, as empirical evidence for growth enhancing effects is mixed. Here, we find sizable welfare gains from liberalization (cost to repression), though the gain in economic growth is ambiguous. We take the view that financial liberalization is a government policy that alters the path of financial deepening, while financial deepening is endogenously chosen by agents given a policy and occurs in transition towards a distant steady state. This history-dependent view necessitates the use of simulation analysis based on a growth model. Our application is a specific episode: Thailand from 1976 to 1996. PMID:20806055

  6. Application of clustering for customer segmentation in private banking

    NASA Astrophysics Data System (ADS)

    Yang, Xuan; Chen, Jin; Hao, Pengpeng; Wang, Yanbo J.

    2015-07-01

    With fierce competition in the banking industry, more and more banks have realised that accurate customer segmentation is of fundamental importance, especially for the identification of high-value customers. To address this problem, we collected real data on private banking customers of a commercial bank in China and conducted an empirical analysis by applying the K-means clustering technique. To determine the value of K, we propose a mechanism that meets both academic requirements and practical needs. Through K-means clustering, we successfully segmented the customers into three categories, and the features of each group are illustrated in detail.
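    A sketch of such a segmentation pipeline; the silhouette-based choice of K is one plausible mechanism (the paper does not specify its rule), and the feature matrix is assumed to hold numeric customer attributes.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

def segment_customers(features, k_range=range(2, 8)):
    """Standardize, pick K by silhouette score, return cluster labels."""
    X = StandardScaler().fit_transform(np.asarray(features, float))

    def labels_for(k):
        return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

    best_k = max(k_range, key=lambda k: silhouette_score(X, labels_for(k)))
    return labels_for(best_k)
```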

  7. [Methodology of determination of the time of death and outlooks for the further development].

    PubMed

    Novikov, P I; Vlasov, A Iu; Shved, E F; Natsentov, E O; Korshunov, N V; Belykh, S A

    2004-01-01

    A methodological analysis of diagnosing the prescription of death coming (PDC, i.e., the time elapsed since death) is described in the paper. Key philosophical fundamentals for further, novel, and more effective methods of PDC determination are elucidated. The main requirements applicable to postmortem diagnosis are defined. Different methods of modeling the postmortem process are demonstrated using the example of cadaver cooling, i.e., in real time, by analogue computer systems, and by mathematical modeling. The traditional empirical approach and the adaptive approach to modeling postmortem processes for PDC diagnosis are comparatively analyzed. A variety of promising directions for further related research is outlined.

  8. Low speed airfoil design and analysis

    NASA Technical Reports Server (NTRS)

    Eppler, R.; Somers, D. M.

    1979-01-01

    A low speed airfoil design and analysis program was developed which contains several unique features. In the design mode, the velocity distribution is not specified for one but many different angles of attack. Several iteration options are included which allow the trailing edge angle to be specified while other parameters are iterated. For airfoil analysis, a panel method is available which uses third-order panels having parabolic vorticity distributions. The flow condition is satisfied at the end points of the panels. Both sharp and blunt trailing edges can be analyzed. The integral boundary layer method with its laminar separation bubble analog, empirical transition criterion, and precise turbulent boundary layer equations compares very favorably with other methods, both integral and finite difference. Comparisons with experiment for several airfoils over a very wide Reynolds number range are discussed. Applications to high lift airfoil design are also demonstrated.

  9. Detecting and characterizing high-frequency oscillations in epilepsy: a case study of big data analysis

    NASA Astrophysics Data System (ADS)

    Huang, Liang; Ni, Xuan; Ditto, William L.; Spano, Mark; Carney, Paul R.; Lai, Ying-Cheng

    2017-01-01

    We develop a framework to uncover and analyse dynamical anomalies from massive, nonlinear and non-stationary time series data. The framework consists of three steps: preprocessing of massive datasets to eliminate erroneous data segments, application of the empirical mode decomposition and Hilbert transform paradigm to obtain the fundamental components embedded in the time series at distinct time scales, and statistical/scaling analysis of the components. As a case study, we apply our framework to detecting and characterizing high-frequency oscillations (HFOs) from a big database of rat electroencephalogram recordings. We find a striking phenomenon: HFOs exhibit on-off intermittency that can be quantified by algebraic scaling laws. Our framework can be generalized to big data-related problems in other fields such as large-scale sensor data and seismic data analysis.

  10. Aircraft directional stability and vertical tail design: A review of semi-empirical methods

    NASA Astrophysics Data System (ADS)

    Ciliberti, Danilo; Della Vecchia, Pierluigi; Nicolosi, Fabrizio; De Marco, Agostino

    2017-11-01

    Aircraft directional stability and control are related to vertical tail design. The safety, performance, and flight qualities of an aircraft also depend on a correct empennage sizing. Specifically, the vertical tail is responsible for the aircraft yaw stability and control. If these characteristics are not well balanced, the entire aircraft design may fail. Stability and control are often evaluated, especially in the preliminary design phase, with semi-empirical methods, which are based on the results of experimental investigations performed in the past decades, and occasionally are merged with data provided by theoretical assumptions. This paper reviews the standard semi-empirical methods usually applied in the estimation of airplane directional stability derivatives in preliminary design, highlighting the advantages and drawbacks of these approaches that were developed from wind tunnel tests performed mainly on fighter airplane configurations of the first decades of the past century, and discussing their applicability on current transport aircraft configurations. Recent investigations made by the authors have shown the limit of these methods, proving the existence of aerodynamic interference effects in sideslip conditions which are not adequately considered in classical formulations. The article continues with a concise review of the numerical methods for aerodynamics and their applicability in aircraft design, highlighting how Reynolds-Averaged Navier-Stokes (RANS) solvers are well-suited to attain reliable results in attached flow conditions, with reasonable computational times. From the results of RANS simulations on a modular model of a representative regional turboprop airplane layout, the authors have developed a modern method to evaluate the vertical tail and fuselage contributions to aircraft directional stability. The investigation on the modular model has permitted an effective analysis of the aerodynamic interference effects by moving, changing, and expanding the available airplane components. Wind tunnel tests over a wide range of airplane configurations have been used to validate the numerical approach. The comparison between the proposed method and the standard semi-empirical methods available in literature proves the reliability of the innovative approach, according to the available experimental data collected in the wind tunnel test campaign.

  11. Artifact removal from EEG data with empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Grubov, Vadim V.; Runnova, Anastasiya E.; Efremova, Tatyana Yu.; Hramov, Alexander E.

    2017-03-01

    In this paper we propose a novel method for dealing with the physiological artifacts caused by intensive activity of facial and neck muscles and other movements in experimental human EEG recordings. The method is based on analysis of EEG signals with empirical mode decomposition (Hilbert-Huang transform). We introduce the mathematical algorithm of the method with the following steps: empirical mode decomposition of the EEG signal, selection of the empirical modes containing artifacts, removal of those modes, and reconstruction of the initial EEG signal. We test the method by filtering experimental human EEG signals contaminated with movement artifacts and show its high efficiency.
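    The four steps map onto code roughly as follows; the PyEMD package (pip install EMD-signal) is an assumed stand-in for any EMD routine, and identifying which IMFs carry artifacts is left to inspection or a separate criterion.

```python
import numpy as np
from PyEMD import EMD   # pip install EMD-signal

def remove_artifact_modes(eeg, artifact_idx):
    """Decompose, drop the IMFs flagged as artifacts, reconstruct.
    Depending on the EMD routine, the last component returned may be
    the residue/trend and should normally be kept."""
    imfs = EMD().emd(np.asarray(eeg, dtype=float))          # step 1
    keep = [i for i in range(len(imfs)) if i not in set(artifact_idx)]
    return imfs[keep].sum(axis=0)                            # steps 3 and 4
```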

  12. Evaluating the evidence base for relational frame theory: a citation analysis.

    PubMed

    Dymond, Simon; May, Richard J; Munnelly, Anita; Hoon, Alice E

    2010-01-01

    Relational frame theory (RFT) is a contemporary behavior-analytic account of language and cognition. Since it was first outlined in 1985, RFT has generated considerable controversy and debate, and several claims have been made concerning its evidence base. The present study sought to evaluate the evidence base for RFT by undertaking a citation analysis and by categorizing all articles that cited RFT-related search terms. A total of 174 articles were identified between 1991 and 2008, 62 (36%) of which were empirical and 112 (64%) were nonempirical articles. Further analyses revealed that 42 (68%) of the empirical articles were classified as empirical RFT and 20 (32%) as empirical other, whereas 27 (24%) of the nonempirical articles were assigned to the nonempirical reviews category and 85 (76%) to the nonempirical conceptual category. In addition, the present findings show that the majority of empirical research on RFT has been conducted with typically developing adult populations, on the relational frame of sameness, and has tended to be published in either The Psychological Record or the Journal of the Experimental Analysis of Behavior. Overall, RFT has made a substantial contribution to the literature in a relatively short period of time.

  13. The application of finite element analysis in the skull biomechanics and dentistry.

    PubMed

    Prado, Felippe Bevilacqua; Rossi, Ana Cláudia; Freire, Alexandre Rodrigues; Ferreira Caria, Paulo Henrique

    2014-01-01

    Empirical concepts describe the direction of masticatory stress dissipation in the skull. Scientific evidence on the trajectories and magnitude of stress dissipation can help in the diagnosis of masticatory alterations and the planning of oral rehabilitation in the different areas of Dentistry. Finite Element Analysis (FEA) is a tool that can reproduce complex structures with the irregular geometries of natural and artificial tissues of the human body, because it uses mathematical functions that enable the understanding of craniofacial biomechanics. The aim of this study was to review the literature on the advantages and limitations of FEA in the study of skull biomechanics and Dentistry. The keywords of the selected original research articles were: finite element analysis, biomechanics, skull, Dentistry, teeth, and implant. The literature review was performed in the PUBMED, MEDLINE, and SCOPUS databases. The selected books and articles were published between 1928 and 2010. FEA is an assessment tool whose application in different areas of Dentistry has gradually increased over the past 10 years, but its application in the analysis of skull biomechanics is scarce. The main advantages of FEA are its realistic mode of approach and the possibility of basing results on the analysis of a single model. On the other hand, the main limitation of FEA studies is the lack of anatomical detail in the modeling phase of the craniofacial structures and the lack of information about material properties.

  14. Usability evaluation of mobile applications using ISO 9241 and ISO 25062 standards.

    PubMed

    Moumane, Karima; Idri, Ali; Abran, Alain

    2016-01-01

    This paper presents an empirical study based on a set of measures to evaluate the usability of mobile applications running on different mobile operating systems, including Android, iOS, and Symbian. The aim is to empirically evaluate a framework that we have developed on the use of the Software Quality Standard ISO 9126 in mobile environments, especially its usability characteristic. To do that, 32 users participated in the experiment, and we used the ISO 25062 and ISO 9241 standards for objective measures, working with two widely used mobile applications: Google Apps and Google Maps. The QUIS 7.0 questionnaire was used to collect measures assessing the users' level of satisfaction when using these two mobile applications. By analyzing the results we highlight a set of mobile usability issues, related to the hardware as well as the software, that need to be taken into account by designers and developers in order to improve the usability of mobile applications.

  15. Holo-analysis.

    PubMed

    Rosen, G D

    2006-06-01

    Meta-analysis is a vague descriptor used to encompass very diverse methods of data collection analysis, ranging from simple averages to more complex statistical methods. Holo-analysis is a fully comprehensive statistical analysis of all available data and all available variables in a specified topic, with results expressed in a holistic factual empirical model. The objectives and applications of holo-analysis include software production for prediction of responses with confidence limits, translation of research conditions to praxis (field) circumstances, exposure of key missing variables, discovery of theoretically unpredictable variables and interactions, and planning future research. Holo-analyses are cited as examples of the effects on broiler feed intake and live weight gain of exogenous phytases, which account for 70% of variation in responses in terms of 20 highly significant chronological, dietary, environmental, genetic, managemental, and nutrient variables. Even better future accountancy of variation will be facilitated if and when authors of papers routinely provide key data for currently neglected variables, such as temperatures, complete feed formulations, and mortalities.

  16. When is hub gene selection better than standard meta-analysis?

    PubMed

    Langfelder, Peter; Mischel, Paul S; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to gene expression data and presents novel R functions for carrying out consensus network analysis, network-based screening, and meta-analysis.

  17. Beyond integrating social sciences: Reflecting on the place of life sciences in empirical bioethics methodologies.

    PubMed

    Mertz, Marcel; Schildmann, Jan

    2018-06-01

    Empirical bioethics is commonly understood as integrating empirical research with normative-ethical research in order to address an ethical issue. Methodological analyses in empirical bioethics mainly focus on the integration of socio-empirical sciences (e.g., sociology or psychology) and normative ethics. But while there are numerous multidisciplinary research projects combining life sciences and normative ethics, there is little explicit methodological reflection on how to integrate the two fields, or on the goals and rationales of such interdisciplinary cooperation. In this paper we review some drivers of the tendency of empirical bioethics methodologies to focus on the collaboration of normative ethics with the social sciences in particular. Subsequently, we argue that the ends of empirical bioethics, not the empirical methods, are decisive for the question of which empirical disciplines can contribute to empirical bioethics in a meaningful way. Using already existing types of research integration as a springboard, five possible types of research which encompass life sciences and normative analysis illustrate how such cooperation can be conceptualized from a methodological perspective within empirical bioethics. We conclude with a reflection on the limitations and challenges of empirical bioethics research that integrates life sciences.

  18. Reconsideration at Field Scale of the Relationship between Hydraulic Conductivity and Porosity: The Case of a Sandy Aquifer in South Italy

    PubMed Central

    2014-01-01

    To describe flow or transport phenomena in porous media, relations between aquifer hydraulic conductivity and effective porosity can prove useful, avoiding the need to perform expensive and time-consuming measurements. Practical applications generally require the determination of these parameters at field scale, while most of the empirical and semiempirical formulas based on grain size analysis, which allow determination of the hydraulic conductivity from the porosity, are related to the laboratory scale and thus are not representative of the aquifer volumes to which one refers. Therefore, following the grain size distribution methodology, a new experimental relation between hydraulic conductivity and effective porosity, representative of aquifer volumes at field scale, is given for a confined aquifer. The experimental values used to determine this law were obtained for both parameters using only field measurement methods. The experimental results, although strictly valid only for the investigated aquifer, can give useful suggestions for other alluvial aquifers with analogous grain-size distribution characteristics. Limited to the investigated range, a useful comparison with the best-known empirical formulas based on grain size analysis was carried out. The experimental data also allowed investigation of the existence of scaling behaviour for both parameters considered. PMID:25180202
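    For context, one of the classic lab-scale, grain-size-based formulas the paper compares against can be sketched as follows; the coefficient follows common literature values and is not the paper's field-scale relation.

```python
G_OVER_NU = 9.81 / 1.0e-6   # g/nu for water at ~20 degC, in 1/(m*s)

def kozeny_carman_K(n, d10_m, C=8.3e-3):
    """Kozeny-Carman-type estimate K = C * (g/nu) * n^3/(1-n)^2 * d10^2,
    with K in m/s, effective porosity n (dimensionless) and effective
    grain diameter d10 in metres; order-of-magnitude only."""
    return C * G_OVER_NU * n**3 / (1.0 - n)**2 * d10_m**2

# e.g. medium sand, n = 0.30, d10 = 0.2 mm:
# kozeny_carman_K(0.30, 2.0e-4) -> ~1.8e-4 m/s
```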

  19. Multimodal Pressure-Flow Analysis: Application of Hilbert Huang Transform in Cerebral Blood Flow Regulation

    NASA Astrophysics Data System (ADS)

    Lo, Men-Tzung; Hu, Kun; Liu, Yanhui; Peng, C.-K.; Novak, Vera

    2008-12-01

    Quantification of nonlinear interactions between two nonstationary signals presents a computational challenge in different research fields, especially for assessments of physiological systems. Traditional approaches that are based on theories of stationary signals cannot resolve nonstationarity-related issues and, thus, cannot reliably assess nonlinear interactions in physiological systems. In this review we discuss a technique called the multimodal pressure-flow (MMPF) method, which utilizes the Hilbert-Huang transformation to quantify the interaction between nonstationary cerebral blood flow velocity (BFV) and blood pressure (BP) for the assessment of dynamic cerebral autoregulation (CA). CA is an important mechanism responsible for controlling cerebral blood flow in response to fluctuations in systemic BP within a few heartbeats. The MMPF analysis adaptively decomposes BP and BFV signals into multiple empirical modes so that the fluctuations caused by a specific physiologic process can be represented in a corresponding empirical mode. Using this technique, we showed that dynamic CA can be characterized by specific phase delays between the decomposed BP and BFV oscillations, and that the phase shifts are significantly reduced in hypertensive, diabetic, and stroke subjects with impaired CA. Additionally, the new technique can reliably assess CA using both induced BP/BFV oscillations during clinical tests and spontaneous BP/BFV fluctuations during resting conditions.
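    The phase-delay computation at the heart of MMPF can be sketched as follows, assuming matched BP and BFV modes have already been extracted by EMD; the interpretation comment reflects the abstract's finding that impaired autoregulation shows reduced shifts.

```python
import numpy as np
from scipy.signal import hilbert

def mean_phase_shift_deg(bp_mode, bfv_mode):
    """Mean instantaneous BFV-BP phase difference in degrees; values
    near zero suggest passive transmission (impaired autoregulation)."""
    dphi = np.angle(hilbert(bfv_mode)) - np.angle(hilbert(bp_mode))
    dphi = np.angle(np.exp(1j * dphi))     # wrap to (-pi, pi]
    return np.degrees(dphi.mean())
```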

  20. Direction dependence analysis: A framework to test the direction of effects in linear models with an implementation in SPSS.

    PubMed

    Wiedermann, Wolfgang; Li, Xintong

    2018-04-16

    In nonexperimental data, at least three possible explanations exist for the association of two variables x and y: (1) x is the cause of y, (2) y is the cause of x, or (3) an unmeasured confounder is present. Statistical tests that identify which of the three explanatory models fits best would be a useful adjunct to the use of theory alone. The present article introduces one such statistical method, direction dependence analysis (DDA), which assesses the relative plausibility of the three explanatory models on the basis of higher-moment information about the variables (i.e., skewness and kurtosis). DDA involves the evaluation of three properties of the data: (1) the observed distributions of the variables, (2) the residual distributions of the competing models, and (3) the independence properties of the predictors and residuals of the competing models. When the observed variables are nonnormally distributed, we show that DDA components can be used to uniquely identify each explanatory model. Statistical inference methods for model selection are presented, and macros to implement DDA in SPSS are provided. An empirical example is given to illustrate the approach. Conceptual and empirical considerations are discussed for best-practice applications in psychological data, and sample size recommendations based on previous simulation studies are provided.
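    One ingredient of DDA, comparing the residual distributions of the two competing linear models, can be sketched as follows; the full procedure (observed distributions, independence properties, inference methods, SPSS macros) is not reproduced here.

```python
import numpy as np
from scipy import stats

def residual_skewness(x, y):
    """Skewness of residuals under the x -> y and y -> x linear models;
    when the true predictor is nonnormal, the correctly specified
    direction tends to leave residuals with smaller |skewness|."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    res_xy = y - np.polyval(np.polyfit(x, y, 1), x)   # model x -> y
    res_yx = x - np.polyval(np.polyfit(y, x, 1), y)   # model y -> x
    return stats.skew(res_xy), stats.skew(res_yx)
```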

  1. [Intervening in the neural basis of one's personality: a practice-oriented ethical analysis of neuropharmacology and deep-brain stimulation].

    PubMed

    Synofzik, M

    2007-12-01

    Through the rapid progress in neuropharmacology, it may soon become possible to effectively improve our cognitive capacities and emotional states by easily applicable means. Moreover, deep-brain stimulation may offer an effective therapeutic option for those neurological and psychiatric diseases which cannot yet be sufficiently treated by pharmacological measures. So far, however, both the benefit and the harm of these techniques are only insufficiently understood by neuroscience, and detailed ethical analyses are still missing. In this article, ethical criteria and the most recent empirical evidence are systematically brought together for the first time. This analysis shows that it is irrelevant for an ethical evaluation whether a drug or a brain-machine interface is categorized as "enhancement" or "treatment" or whether it changes "human nature". The only decisive criteria are whether the intervention (1) benefits the patient, (2) does not harm the patient, and (3) is desired by the patient. However, current empirical data in both fields, neuropharmacology and deep-brain stimulation, are still too sparse to adequately evaluate these criteria. Moreover, the focus in both fields has been misdirected by neglecting the distinction between "benefit" and "efficacy": in past years, research and clinical practice have focused only on physiological effects, not on the actual benefit to the patient.

  2. Application of the Hilbert-Huang Transform to Financial Data

    NASA Technical Reports Server (NTRS)

    Huang, Norden

    2005-01-01

    A paper discusses the application of the Hilbert-Huang transform (HHT) method to time-series financial-market data. The method was described, variously without and with the HHT name, in several prior NASA Tech Briefs articles and supporting documents. To recapitulate: The method is especially suitable for analyzing time-series data that represent nonstationary and nonlinear phenomena including physical phenomena and, in the present case, financial-market processes. The method involves the empirical mode decomposition (EMD), in which a complicated signal is decomposed into a finite number of functions, called "intrinsic mode functions" (IMFs), that admit well-behaved Hilbert transforms. The HHT consists of the combination of EMD and Hilbert spectral analysis. The local energies and the instantaneous frequencies derived from the IMFs through Hilbert transforms can be used to construct an energy-frequency-time distribution, denoted a Hilbert spectrum. The instant paper begins with a discussion of prior approaches to quantification of market volatility, summarizes the HHT method, then describes the application of the method in performing time-frequency analysis of mortgage-market data from the years 1972 through 2000. Filtering by use of the EMD is shown to be useful for quantifying market volatility.
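
    A minimal HHT sketch, assuming the third-party PyEMD package (published on PyPI as EMD-signal) for the EMD step and SciPy for the Hilbert transforms; the input series is synthetic rather than mortgage-market data:

    ```python
    import numpy as np
    from scipy.signal import hilbert
    from PyEMD import EMD

    t = np.linspace(0, 1, 1000)
    s = np.sin(2*np.pi*5*t) + 0.5*np.sin(2*np.pi*40*t)  # toy series

    imfs = EMD().emd(s)  # empirical mode decomposition into IMFs

    fs = 1.0 / (t[1] - t[0])
    for k, imf in enumerate(imfs):
        analytic = hilbert(imf)
        amp = np.abs(analytic)  # local energy envelope
        inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2*np.pi)
        print(f"IMF {k}: mean inst. frequency = {inst_freq.mean():.1f} Hz, "
              f"mean amplitude = {amp.mean():.2f}")
    ```

    The amplitude and instantaneous-frequency pairs per IMF are exactly the ingredients of the energy-frequency-time (Hilbert spectrum) view described above.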

  3. [Effects of intercropping Chinese milk vetch on functional characteristics of soil microbial community in rape rhizosphere].

    PubMed

    Zhou, Quan; Wang, Long Chang; Xing, Yi; Ma, Shu Min; Zhang, Xiao Duan; Chen, Jiao; Shi, Chao

    2018-03-01

    The application of green manure faces serious problems in the purple soil region of southwest China. With the aim of exploring the potential application of green manure, we examined the functional characteristics of the soil microbial community in a system of Chinese milk vetch intercropped with rape. The innovations lie in applying Chinese milk vetch in the drylands of southwest China and in providing empirical data for establishing a new rape planting pattern. Results showed that intercropping with Chinese milk vetch decreased the carbon resource use efficiency of the microbial community in the rape rhizosphere, especially the utilization of carbohydrates. At the same time, the Shannon index, Simpson index, and richness were reduced, but the evenness index was increased by intercropping. Results from cluster analysis and principal component analysis suggest that the soil microbial community composition differed significantly between monocropping and intercropping. Carbohydrates, amino acids and carboxylic acids were the sensitive carbon sources for differentiating the changes in the microbial community induced by monocropping and intercropping. Intercropping Chinese milk vetch could decrease the functional activity, change the community composition, and reduce the diversity of the soil microbial community in the rape rhizosphere.
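
    The diversity measures reported here have standard forms (assuming the common Shannon, Gini-Simpson, and Pielou definitions, which the abstract does not spell out); a short sketch from a hypothetical vector of carbon-source utilization counts:

    ```python
    import numpy as np

    def shannon(counts):
        p = np.asarray(counts, float)
        p = p[p > 0] / p.sum()
        return -np.sum(p * np.log(p))          # Shannon H'

    def simpson(counts):
        p = np.asarray(counts, float) / np.sum(counts)
        return 1.0 - np.sum(p**2)              # Gini-Simpson form

    def evenness(counts):
        return shannon(counts) / np.log(np.count_nonzero(counts))  # Pielou J

    plate = [120, 80, 40, 10, 5]  # hypothetical utilization counts
    print(shannon(plate), simpson(plate), evenness(plate))
    ```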

  4. FIM measurement properties and Rasch model details.

    PubMed

    Wright, B D; Linacre, J M; Smith, R M; Heinemann, A W; Granger, C V

    1997-12-01

    To summarize, we take issue with the criticisms of Dickson & Köhler for two main reasons: 1. Rasch analysis provides a model from which to approach the analysis of the FIM, an ordinal scale, as an interval scale. The existence of examples of items or individuals which do not fit the model does not disprove the overall efficacy of the model; and 2. the principal components analysis of FIM motor items as presented by Dickson & Köhler tends to undermine rather than support their argument. Their own analyses produce a single major factor explaining between 58.5 and 67.1% of the variance, depending upon the sample, with secondary factors explaining much less variance. Finally, analysis of item response, or latent trait, is a powerful method for understanding the meaning of a measure. However, it presumes that item scores are accurate. Another concern is that Dickson & Köhler do not address the issue of reliability of scoring the FIM items on which they report, a critical point in comparing results. The Uniform Data System for Medical Rehabilitation (UDSMR) expends extensive effort in the training of clinicians of subscribing facilities to score items accurately. This is followed up with a credentialing process. Phase 1 involves the testing of individual clinicians who are submitting data to determine if they have achieved mastery over the use of the FIM instrument. Phase 2 involves examining the data for outlying values. When Dickson & Köhler investigate more carefully the application of the Rasch model to their FIM data, they will discover that the results presented in their paper support rather than contradict their application of the Rasch model! This paper is typical of supposed refutations of Rasch model applications. Dickson & Köhler will find that idiosyncrasies in their data and misunderstandings of the Rasch model are the only basis for a claim to have disproven the relevance of the model to FIM data. The Rasch model is a mathematical theorem (like Pythagoras') and so cannot be disproven by empirical data once it has been deduced on theoretical grounds. Sometimes empirical data are not suitable for construction of a measure. When this happens, the routine fit statistics indicate the unsuitable segments of the data. Most FIM data do conform closely enough to the Rasch model to support generalizable linear measures. Science can advance!
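
    For readers unfamiliar with the model at issue: under the Rasch model, the probability of success depends only on the difference between person ability and item difficulty on a common logit scale, which is what justifies treating calibrated scores as interval-level. A minimal sketch:

    ```python
    import numpy as np

    def rasch_prob(theta, b):
        """Rasch model: probability of success for a person with
        ability theta on an item with difficulty b (both in logits)."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    # Equal ability-difficulty gaps imply equal log-odds changes,
    # the interval property at the heart of the debate above
    for theta in (-1.0, 0.0, 1.0):
        print(theta, rasch_prob(theta, b=0.0))
    ```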

  5. Sediment yield estimation in mountain catchments of the Camastra reservoir, southern Italy: a comparison among different empirical methods

    NASA Astrophysics Data System (ADS)

    Lazzari, Maurizio; Danese, Maria; Gioia, Dario; Piccarreta, Marco

    2013-04-01

    Sedimentary budget estimation is an important topic for both the scientific community and society, because it is crucial to understanding the dynamics of orogenic belts as well as many practical problems, such as soil conservation and sediment accumulation in reservoirs. Estimations of sediment yield or denudation rates in southern-central Italy are generally obtained by simple empirical relationships based on statistical regression between geomorphic parameters of the drainage network and the measured suspended sediment yield at the outlet of several drainage basins, or through the use of models based on sediment delivery ratios or on soil loss equations. In this work, we perform a study of catchment dynamics and an estimation of sediment yield for several mountain catchments of the central-western sector of the Basilicata region, southern Italy. Sediment yield estimation has been obtained through both an indirect estimation of suspended sediment yield based on the Tu index (mean annual suspended sediment yield, Ciccacci et al., 1980) and the application of the RUSLE (Renard et al., 1997) and USPED (Mitasova et al., 1996) empirical methods. The preliminary results indicate a marked difference between the RUSLE and USPED methods and the estimation based on the Tu index; a critical analysis of the results has been carried out, considering also the present-day spatial distribution of erosion, transport and depositional processes in relation to the maps obtained from the application of these different empirical methods. The studied catchments drain an artificial reservoir (the Camastra dam), for which a detailed evaluation of the amount of historical sediment storage has been collected. Sediment yield estimates obtained by means of the empirical methods have been compared and checked against historical data of sediment accumulation measured in the artificial reservoir of the Camastra dam. The validation of such estimations of sediment yield at the scale of large catchments using sediment storage in reservoirs provides a good opportunity: i) to test the reliability of the empirical methods used to estimate the sediment yield; ii) to investigate the catchment dynamics and their spatial and temporal evolution in terms of erosion, transport and deposition. References Ciccacci S., Fredi F., Lupia Palmieri E., Pugliese F., 1980. Contributo dell'analisi geomorfica quantitativa alla valutazione dell'entità dell'erosione nei bacini fluviali. Bollettino della Società Geologica Italiana 99: 455-516. Mitasova H., Hofierka J., Zlocha M., Iverson L.R., 1996. Modeling topographic potential for erosion and deposition using GIS. International Journal of Geographical Information Systems 10: 629-641. Renard K.G., Foster G.R., Weesies G.A., McCool D.K., Yoder D.C., 1997. Predicting soil erosion by water: a guide to conservation planning with the Revised Universal Soil Loss Equation (RUSLE), USDA-ARS, Agricultural Handbook No. 703.
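
    The RUSLE estimate referenced here is a product of empirical factors, A = R * K * LS * C * P; a sketch with illustrative values (not Camastra-specific):

    ```python
    def rusle(R, K, LS, C, P):
        """Mean annual soil loss A (t/ha/yr) from the Revised Universal
        Soil Loss Equation (Renard et al., 1997)."""
        return R * K * LS * C * P

    # Illustrative values only, not taken from the studied catchments:
    A = rusle(R=1500.0,  # rainfall erosivity (MJ mm / (ha h yr))
              K=0.03,    # soil erodibility
              LS=2.5,    # slope length-steepness factor
              C=0.12,    # cover-management factor
              P=1.0)     # support practice factor
    print(f"soil loss = {A:.1f} t/ha/yr")
    ```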

  6. Multicultural Counseling Competencies Research: A 20-Year Content Analysis

    ERIC Educational Resources Information Center

    Worthington, Roger L.; Soth-McNett, Angela M.; Moreno, Matthew V.

    2007-01-01

    The authors conducted a 20-year content analysis of the entire field of empirical research on the multicultural counseling competencies (D. W. Sue et al., 1982). They conducted an exhaustive search for empirical research articles using PsycINFO, as well as complete reviews of the past 20 years of several journals (e.g., Journal of Counseling…

  7. Understanding the Dynamics of MOOC Discussion Forums with Simulation Investigation for Empirical Network Analysis (SIENA)

    ERIC Educational Resources Information Center

    Zhang, Jingjing; Skryabin, Maxim; Song, Xiongwei

    2016-01-01

    This study attempts to make inferences about the mechanisms that drive network change over time. It adopts simulation investigation for empirical network analysis to examine the patterns and evolution of relationships formed in the context of a massive open online course (MOOC) discussion forum. Four network effects--"homophily,"…

  8. Functions of Research in Radical Behaviorism for the Further Development of Behavior Analysis

    ERIC Educational Resources Information Center

    Leigland, Sam

    2010-01-01

    The experimental analysis of behavior began as an inductively oriented, empirically based scientific field. As the field grew, its distinctive system of science--radical behaviorism--grew with it. The continuing growth of the empirical base of the field has been accompanied by the growth of the literature on radical behaviorism and its…

  9. Interest Rates and Coupon Bonds in Quantum Finance

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.

    2009-09-01

    1. Synopsis; 2. Interest rates and coupon bonds; 3. Options and option theory; 4. Interest rate and coupon bond options; 5. Quantum field theory of bond forward interest rates; 6. Libor Market Model of interest rates; 7. Empirical analysis of forward interest rates; 8. Libor Market Model of interest rate options; 9. Numeraires for bond forward interest rates; 10. Empirical analysis of interest rate caps; 11. Coupon bond European and Asian options; 12. Empirical analysis of interest rate swaptions; 13. Correlation of coupon bond options; 14. Hedging interest rate options; 15. Interest rate Hamiltonian and option theory; 16. American options for coupon bonds and interest rates; 17. Hamiltonian derivation of coupon bond options; Appendixes; Glossaries; List of symbols; Reference; Index.

  10. Exploring connectivity with large-scale Granger causality on resting-state functional MRI.

    PubMed

    DSouza, Adora M; Abidin, Anas Z; Leistritz, Lutz; Wismüller, Axel

    2017-08-01

    Large-scale Granger causality (lsGC) is a recently developed, resting-state functional MRI (fMRI) connectivity analysis approach that estimates multivariate voxel-resolution connectivity. Unlike most commonly used multivariate approaches, which establish coarse-resolution connectivity by aggregating voxel time-series to avoid an underdetermined problem, lsGC estimates voxel-resolution, fine-grained connectivity by incorporating an embedded dimension reduction. We investigate the application of lsGC on realistic fMRI simulations, modeling smoothing of neuronal activity by the hemodynamic response function and repetition time (TR), and on empirical resting-state fMRI data. Subsequently, functional subnetworks are extracted from lsGC connectivity measures for both datasets and validated quantitatively. We also provide guidelines for selecting the lsGC free parameters. Results indicate that lsGC reliably recovers the underlying network structure with an area under the receiver operator characteristic curve (AUC) of 0.93 at TR=1.5s for a 10-min session of fMRI simulations. Furthermore, subnetworks of closely interacting modules are recovered from the aforementioned lsGC networks. Results on empirical resting-state fMRI data demonstrate recovery of visual and motor cortex in close agreement with spatial maps obtained from (i) a visuo-motor fMRI stimulation task-sequence (Accuracy=0.76) and (ii) independent component analysis (ICA) of resting-state fMRI (Accuracy=0.86). Compared with the conventional Granger causality approach (AUC=0.75), lsGC produces better network recovery on fMRI simulations. Furthermore, the conventional approach cannot recover functional subnetworks from empirical fMRI data, since quantifying voxel-resolution connectivity is not possible for it as a consequence of the underdetermined problem. Functional network recovery from fMRI data suggests that lsGC gives useful insight into connectivity patterns from resting-state fMRI at a multivariate voxel-resolution. Copyright © 2017 Elsevier B.V. All rights reserved.
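
    For contrast with lsGC, the conventional pairwise Granger-causality baseline the paper compares against is easy to sketch with statsmodels; the two-channel data here are synthetic, and none of the lsGC dimension-reduction machinery is reproduced:

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)
    n = 500
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(1, n):              # y is driven by lagged x
        y[t] = 0.6 * x[t-1] + 0.1 * rng.normal()

    # Tests whether the second column Granger-causes the first
    res = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
    fstat, pval, *_ = res[1][0]['ssr_ftest']   # lag-1 F test
    print(f"x -> y: F = {fstat:.1f}, p = {pval:.3g}")
    ```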

  11. Application of Factor Analysis on the Financial Ratios of Indian Cement Industry and Validation of the Results by Cluster Analysis

    NASA Astrophysics Data System (ADS)

    De, Anupam; Bandyopadhyay, Gautam; Chakraborty, B. N.

    2010-10-01

    Financial ratio analysis is an important and commonly used tool in analyzing the financial health of a firm. Quite a large number of financial ratios, which can be categorized into different groups, are used for this analysis. However, to reduce the number of ratios used for financial analysis and to regroup them on the basis of empirical evidence, the Factor Analysis technique has been used successfully by different researchers during the last three decades. In this study, Factor Analysis has been applied to the audited financial data of Indian cement companies over a period of 10 years. The sample companies are listed on the stock exchanges of India (BSE and NSE). Factor Analysis, conducted over 44 variables (financial ratios) grouped in 7 categories, resulted in 11 underlying categories (factors). Each factor was named in an appropriate manner considering the factor loadings and constituent variables (ratios). Representative ratios were identified for each such factor. To validate the results of the Factor Analysis and to reach a final conclusion regarding the representative ratios, Cluster Analysis was performed.
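
    The two-stage procedure (factor extraction over a standardized ratio matrix, then clustering for validation) can be sketched with scikit-learn; the data matrix below is a random placeholder, not the audited ratios of the study:

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import FactorAnalysis
    from sklearn.cluster import KMeans

    X = np.random.default_rng(0).normal(size=(30, 44))  # firms x ratios

    Z = StandardScaler().fit_transform(X)
    fa = FactorAnalysis(n_components=11, random_state=0).fit(Z)
    loadings = fa.components_.T             # (44 ratios x 11 factors)
    rep = np.abs(loadings).argmax(axis=0)   # highest-loading ratio per factor
    print("representative ratio index per factor:", rep)

    # Cluster the factor scores to cross-check the grouping
    scores = fa.transform(Z)
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
    print(labels)
    ```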

  12. Oseltamivir Treatment for Children with Influenza-Like Illness in China: A Cost-Effectiveness Analysis.

    PubMed

    Shen, Kunling; Xiong, Tengbin; Tan, Seng Chuen; Wu, Jiuhong

    2016-01-01

    Influenza is a common viral respiratory infection that causes epidemics and pandemics in the human population. Oseltamivir is a neuraminidase inhibitor, a newer class of antiviral therapy for influenza. Although its efficacy and safety have been established, there is uncertainty regarding whether influenza-like illness (ILI) in children is best managed by oseltamivir at the onset of illness, and its cost-effectiveness in children has not been studied in China. The objective was to evaluate the cost-effectiveness of post rapid influenza diagnostic test (RIDT) treatment with oseltamivir and empiric treatment with oseltamivir, compared with no antiviral therapy, against influenza for children with ILI. We developed a decision-analytic model based on previously published evidence to simulate and evaluate the 1-year potential clinical and economic outcomes associated with three management strategies for children presenting with symptoms of influenza. Model inputs were derived from the literature and expert opinion on clinical practice and research in China. Outcome measures included costs and quality-adjusted life years (QALYs). All the interventions were compared using incremental cost-effectiveness ratios (ICERs). In the base case analysis, empiric treatment with oseltamivir consistently produced the greatest gains in QALYs. When compared with no antiviral therapy, the empiric treatment with oseltamivir strategy is very cost-effective, with an ICER of RMB 4,438. When compared with post RIDT treatment with oseltamivir, the empiric treatment with oseltamivir strategy is dominant. Probabilistic sensitivity analysis projected a 100% probability that empiric oseltamivir treatment would be considered a very cost-effective strategy compared to no antiviral therapy, according to the WHO recommendations for cost-effectiveness thresholds. The same was concluded with 99% probability for empiric oseltamivir treatment compared to post RIDT treatment with oseltamivir. In the current Chinese health system setting, our modelling-based simulation analysis suggests that empiric treatment with oseltamivir is a cost-saving and very cost-effective strategy for managing children with ILI.
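
    The ICER arithmetic behind such comparisons is simple to sketch; the numbers below are hypothetical, not the study's RMB estimates:

    ```python
    def icer(cost_new, qaly_new, cost_ref, qaly_ref):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        return (cost_new - cost_ref) / (qaly_new - qaly_ref)

    # Hypothetical: empiric oseltamivir vs no antiviral therapy
    print(icer(cost_new=950.0, qaly_new=0.92,
               cost_ref=800.0, qaly_ref=0.89))   # 5000.0 per QALY
    # "Dominant" means the strategy costs less AND yields more QALYs,
    # so no ratio needs to be computed.
    ```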

  13. Application-Driven Educational Game to Assist Young Children in Learning English Vocabulary

    ERIC Educational Resources Information Center

    Chen, Zhi-Hong; Lee, Shu-Yu

    2018-01-01

    This paper describes the development of an educational game, named My-Pet-Shop, to enhance young children's learning of English vocabulary. The educational game is underpinned by an application-driven model, which consists of three components: application scenario, subject learning, and learning regulation. An empirical study is further conducted…

  14. The roles of the analogy with natural selection in B.F. Skinner's philosophy.

    PubMed

    Smith, Terry L

    2018-02-17

    Beginning in the 1950s, B.F. Skinner made increasing reference to an analogy between operant conditioning and natural selection. This analogy is the basis of an argument that, in contrast to Skinner's other critiques of cognitive science, is neither epistemological nor pragmatic. Instead, it is based on the claim that ontogenetic adaptation is due to a special mode of causation he called "selection by consequences." He argued that this mode of causation conflicts with explanations that attribute action to an autonomous agent with reasons for acting. This argument dismisses ordinary explanations of action, and has implications not only for cognitive science but also for morals. Skinner cited the latter implications to counter objections to the application of behavior analysis to the reform of society and its institutions. Skinner's critique, however, rests upon empirical assumptions that have been criticized by other behavior analysts. Although for Skinner the major role of the analogy was to propose an empirical thesis, it also can play a metaphysical role-namely, to demonstrate the possibility of ontogenetic adaptation without reference to agents who have reasons for acting. These two roles, empirical and metaphysical, are the mirror image of the empirical and metaphysical roles of the computer analogy for cognitive science. That analogy also can be (and has been) interpreted as an empirical thesis. Its empirical implications, however, have been difficult to confirm. It also, however, has played a metaphysical role-namely, to demonstrate the possibility that a physical process could perform logical operations on states having propositional content. Neither analogy provides a well-confirmed, general answer to the question of how to explain the process of ontogenetic adaptation. But together they show there are two metaphysically coherent, but conflicting, answers to this question. Depending upon one's epistemology, the analogy with natural selection may provide a useful point of departure for a strategy of research. Such a pragmatic grounding for a research strategy does not, however, provide sufficient reason to abandon for purposes of ethics the concept of persons as autonomous agents. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Empirical and pragmatic adequacy of grounded theory: Advancing nurse empowerment theory for nurses' practice.

    PubMed

    Udod, Sonia A; Racine, Louise

    2017-12-01

    Drawing on the findings of a grounded theory study that explored how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in the area of nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership because of the dominance of quantitative approaches to research. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy. Grounded theory is a relevant qualitative approach to use in leadership research as it provides a fine and detailed analysis of the processes underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of grounded theory as developed by Corbin and Strauss as a method for analysing and solving problems in nurses' practice is provided. This paper provides evidence to support the empirical and pragmatic adequacy of grounded theory methodology. Although the application of the ontological, epistemological and methodological assumptions of grounded theory is challenging, this methodology is useful to address real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research. Grounded theory is anchored in the reality of practice. The strength of grounded theory is to provide results that can be readily applied to clinical practice and policy as they arise from problems that affect practice and that are meaningful to nurses. © 2017 John Wiley & Sons Ltd.

  16. Multivariate Qst–Fst Comparisons: A Neutrality Test for the Evolution of the G Matrix in Structured Populations

    PubMed Central

    Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme

    2008-01-01

    Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
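
    The neutral expectation D = [2Fst/(1 − Fst)]G can be checked informally by comparing the elementwise ratio of the two matrices with the expected proportionality coefficient; a toy two-trait sketch with made-up covariance matrices (the paper's actual test uses CPC analysis with a Bartlett adjustment):

    ```python
    import numpy as np

    Fst = 0.10
    G = np.array([[1.0, 0.3],
                  [0.3, 0.5]])     # within-population covariance (G)
    D = np.array([[0.24, 0.07],
                  [0.07, 0.12]])   # among-population covariance (D)

    c_neutral = 2 * Fst / (1 - Fst)  # expected proportionality coefficient
    ratio = D / G                    # roughly constant if D and G proportional
    print("neutral c =", round(c_neutral, 3))
    print("observed elementwise D/G:")
    print(np.round(ratio, 3))
    ```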

  17. Application of Metacognitive Strategy to Primary Listening Teaching

    NASA Astrophysics Data System (ADS)

    Zheng, Jie

    2017-12-01

    It is of vital importance that our students be taught to listen effectively and critically. This essay focuses on metacognitive strategies in listening and reports an empirical study of the application of metacognitive strategy to primary listening teaching.

  18. Temperature dependence of Er³⁺ ionoluminescence and photoluminescence in Gd₂O₃:Bi nanopowder.

    PubMed

    Boruc, Zuzanna; Gawlik, Grzegorz; Fetliński, Bartosz; Kaczkan, Marcin; Malinowski, Michał

    2014-06-01

    Ionoluminescence (IL) and photoluminescence (PL) of trivalent erbium ions (Er³⁺) in a Gd₂O₃ nanopowder host activated with Bi³⁺ ions have been studied in order to establish the link between changes in the luminescence spectra and the temperature of the sample material. IL measurements were performed with a 100 keV H₂⁺ ion beam bombarding the target material for a few seconds, while PL spectra were collected for temperatures ranging from 20 °C to 700 °C. The PL data were used as a reference in determining the temperature corresponding to the IL spectra. The collected data enabled the definition of an empirical formula based on the Boltzmann distribution, which allows the temperature to be determined with a maximum sensitivity of 9.7 × 10⁻³ °C⁻¹. The analysis of the Er³⁺ energy level structure, in terms of the tendency of the system to stay in thermal equilibrium, explained the different behaviors of the line intensities. This work led to the conclusion that temperature changes during ion excitation can be readily determined from separately collected PL spectra. The final result, an empirical formula describing the dependence of the fluorescence intensity ratio on temperature, suggests applying the method to temperature control during processes such as ion implantation and some nuclear applications.
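
    The Boltzmann-type dependence underlying such fluorescence-intensity-ratio thermometry is easy to state and invert; the constants below are hypothetical placeholders, not the fitted values of this study:

    ```python
    import numpy as np

    kB = 8.617e-5        # Boltzmann constant, eV/K
    A, dE = 8.0, 0.070   # pre-exponential factor and level gap (eV), hypothetical

    def ratio(T_kelvin):
        # Intensity ratio of two thermally coupled levels
        return A * np.exp(-dE / (kB * T_kelvin))

    def temperature(R):
        # Inverse of the Boltzmann law: read temperature off the ratio
        return dE / (kB * np.log(A / R))

    T = 600.0
    print(temperature(ratio(T)))  # recovers 600.0 K
    ```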

  19. Evaluation of Large-scale Data to Detect Irregularity in Payment for Medical Services. An Extended Use of Benford's Law.

    PubMed

    Park, Junghyun A; Kim, Minki; Yoon, Seokjoon

    2016-05-17

    Sophisticated anti-fraud systems for the healthcare sector have been built based on several statistical methods. Although existing methods have been developed to detect fraud in the healthcare sector, these algorithms consume considerable time and cost, and lack a theoretical basis for handling large-scale data. Based on mathematical theory, this study proposes a new approach to using Benford's Law: we closely examined individual-level data to identify specific fees for in-depth analysis. We extended the mathematical theory to demonstrate the manner in which large-scale data conform to Benford's Law. Then, we empirically tested its applicability using actual large-scale healthcare data from Korea's Health Insurance Review and Assessment (HIRA) National Patient Sample (NPS). For Benford's Law, we used the mean absolute deviation (MAD) statistic to test the large-scale data. We conducted our study on 32 diseases, comprising 25 representative diseases and 7 DRG-regulated diseases. We performed an empirical test on the 25 diseases, showing the applicability of Benford's Law to large-scale data in the healthcare industry. For the seven DRG-regulated diseases, we examined the individual-level data to identify specific fees for in-depth analysis. Among the eight categories of medical costs, we assessed the strength of certain irregularities based on the details of each DRG-regulated disease. Using the degree of abnormality, we propose priority actions to be taken by government health departments and private insurance institutions to bring unnecessary medical expenses under control. However, detecting deviations from Benford's Law at conventional significance levels requires relatively high contamination ratios.
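
    A minimal first-digit Benford check using the MAD statistic, on synthetic fee-like amounts; the exact conformity thresholds applied in the study are not reproduced here:

    ```python
    import numpy as np

    def first_digits(amounts):
        a = np.abs(np.asarray(amounts, float))
        a = a[a > 0]
        return (a / 10 ** np.floor(np.log10(a))).astype(int)  # 1..9

    def benford_mad(amounts):
        d = first_digits(amounts)
        observed = np.bincount(d, minlength=10)[1:10] / d.size
        expected = np.log10(1 + 1 / np.arange(1, 10))  # Benford proportions
        return np.mean(np.abs(observed - expected))

    # Lognormal amounts conform closely to Benford's Law
    fees = np.random.default_rng(0).lognormal(mean=8, sigma=1.2, size=100000)
    print(f"MAD = {benford_mad(fees):.4f}")  # small MAD -> close conformity
    ```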

  20. A new concept in seismic landslide hazard analysis for practical application

    NASA Astrophysics Data System (ADS)

    Lee, Chyi-Tyi

    2017-04-01

    A seismic landslide hazard model can be constructed using a deterministic approach (Jibson et al., 2000) or a statistical approach (Lee, 2014). Both approaches yield the spatial probability of landsliding under an earthquake of a given return period. In the statistical approach, our recent study found that there are common patterns among different landslide susceptibility models of the same region. The common susceptibility reflects the relative stability of slopes in a region; higher susceptibility indicates lower stability. Using the common susceptibility together with an earthquake event landslide inventory and a map of topographically corrected Arias intensity, we can build a relationship among probability of failure, Arias intensity and susceptibility. This relationship can immediately be used to construct a seismic landslide hazard map for the region in which the empirical relationship was built. If the common susceptibility model is further normalized and the empirical relationship built with the normalized susceptibility, then the relationship may be applied in practice to other regions with similar tectonic environments and climate conditions. This is useful when a region has no existing earthquake-induced landslide data with which to train the susceptibility model and build the relationship. It is worth mentioning that a rain-induced landslide susceptibility model shows a common pattern similar to that of earthquake-induced landslide susceptibility in the same region, and can be used to build the relationship with an earthquake event landslide inventory and a map of Arias intensity. These points will be introduced with examples in the meeting.
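
    One plausible way to encode the stated relationship among probability of failure, Arias intensity, and susceptibility is a logistic fit to an event inventory; this sketch uses synthetic data and an assumed functional form, not the author's actual fitting procedure:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000
    Ia = rng.uniform(0.1, 3.0, n)   # topographically corrected Arias intensity
    S = rng.uniform(0.0, 1.0, n)    # normalized common susceptibility
    logit = -6 + 1.5 * Ia + 4.0 * S  # synthetic ground truth
    failed = rng.random(n) < 1 / (1 + np.exp(-logit))

    model = LogisticRegression().fit(np.column_stack([Ia, S]), failed)
    # Hazard-map query: probability of failure for a given (Ia, S) cell
    print(model.predict_proba([[1.0, 0.8]])[0, 1])
    ```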

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Streets, W.E.

    As the need for rapid and more accurate determinations of gamma-emitting radionuclides in environmental and mixed waste samples grows, there is continued interest in the development of theoretical tools to eliminate the need for some laboratory analyses and to enhance the quality of information from necessary analyses. In gamma spectrometry the use of theoretical self-absorption coefficients (SACs) can eliminate the need to determine the SAC empirically by counting a known source through each sample. This empirical approach requires extra counting time and introduces another source of counting error, which must be included in the calculation of results. The empirical determination of SACs is routinely used when the nuclides of interest are specified; theoretical determination of the SAC can enhance the information for the analysis of true unknowns, where there may be no prior knowledge about radionuclides present in a sample. Determination of an exact SAC does require knowledge about the total composition of a sample. In support of the Department of Energy's (DOE) Environmental Survey Program, the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory developed theoretical self-absorption models to estimate SACs for the determination of non-specified radionuclides in samples of unknown, widely-varying compositions. Subsequently, another SAC model, in a different counting geometry and for specified nuclides, was developed for another application. These two models are now used routinely for the determination of gamma-emitting radionuclides in a wide variety of environmental and mixed waste samples.

  2. Assessment of municipal solid waste settlement models based on field-scale data analysis.

    PubMed

    Bareither, Christopher A; Kwak, Seungbok

    2015-08-01

    An evaluation of municipal solid waste (MSW) settlement model performance and applicability was conducted based on analysis of two field-scale datasets: (1) Yolo and (2) the Deer Track Bioreactor Experiment (DTBE). Twelve MSW settlement models were considered that included a range of compression behavior (i.e., immediate compression, mechanical creep, and biocompression) and a range of total (2-22) and optimized (2-7) model parameters. A multi-layer immediate settlement analysis developed for Yolo provides a framework to estimate initial waste thickness and waste thickness at the end of immediate compression. Model application to the Yolo test cells (conventional and bioreactor landfills) via least squares optimization yielded high coefficients of determination for all settlement models (R² > 0.83). However, empirical models (i.e., power creep, logarithmic, and hyperbolic models) are not recommended for use in MSW settlement modeling due to potentially non-representative long-term MSW behavior, the limited physical significance of model parameters, and the settlement data required for model parameterization. Settlement models that combine mechanical creep and biocompression into a single mathematical function constrain time-dependent settlement to a single process with finite magnitude, which limits model applicability. Overall, all evaluated models that couple multiple compression processes (immediate, creep, and biocompression) provided accurate representations of both the Yolo and DTBE datasets. A model presented in Gourc et al. (2010) included the lowest number of total and optimized model parameters and yielded high statistical performance for all model applications (R² ≥ 0.97). Copyright © 2015 Elsevier Ltd. All rights reserved.
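
    As an illustration of why such empirical models are easy to parameterize yet risky to extrapolate, the power-creep model S(t) = a * t**b can be fitted to observed settlement with SciPy; the data points below are hypothetical:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def power_creep(t, a, b):
        return a * t**b   # empirical power-creep settlement model

    t_days = np.array([30, 90, 180, 365, 730, 1460], float)
    settle = np.array([0.12, 0.22, 0.31, 0.45, 0.60, 0.78])  # m, hypothetical

    (a, b), _ = curve_fit(power_creep, t_days, settle, p0=(0.01, 0.5))
    # The fitted curve extrapolates without bound, one reason the paper
    # cautions against empirical models for long-term MSW behavior
    print(f"a = {a:.4f}, b = {b:.3f}; S(10 yr) = {power_creep(3650, a, b):.2f} m")
    ```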

  3. Treatment of specific phobias with Eye Movement Desensitization and Reprocessing (EMDR): protocol, empirical status, and conceptual issues.

    PubMed

    De Jongh, A; Ten Broeke, E; Renssen, M R

    1999-01-01

    This paper considers the current empirical status of Eye Movement Desensitization and Reprocessing (EMDR) as a treatment method for specific phobias, along with some conceptual and practical issues in relation to its use. Both uncontrolled and controlled studies on the application of EMDR to specific phobias demonstrate that EMDR can produce significant improvements within a limited number of sessions. With regard to the treatment of childhood spider phobia, EMDR has been found to be more effective than a placebo control condition, but less effective than exposure in vivo. The empirical support for EMDR with specific phobias is still meagre; therefore, one should remain cautious. However, given that there is insufficient research to validate any method for complex or trauma-related phobias, that EMDR is a time-limited procedure, and that it can be used in cases for which an exposure in vivo approach is difficult to administer, the application of EMDR to specific phobias merits further clinical and research attention.

  4. Empirical and targeted therapy of candidemia with fluconazole versus echinocandins: a propensity score-derived analysis of a population-based, multicentre prospective cohort.

    PubMed

    López-Cortés, L E; Almirante, B; Cuenca-Estrella, M; Garnacho-Montero, J; Padilla, B; Puig-Asensio, M; Ruiz-Camps, I; Rodríguez-Baño, J

    2016-08-01

    We compared the clinical efficacy of fluconazole and echinocandins in the treatment of candidemia in real practice. The CANDIPOP study is a prospective, population-based cohort study on candidemia carried out between May 2010 and April 2011 in 29 Spanish hospitals. Using strict inclusion criteria, we separately compared the impact of empirical and targeted therapy with fluconazole or echinocandins on 30-day mortality. Cox regression, including a propensity score (PS) for receiving echinocandins, stratified analysis on the PS quartiles and PS-based matched analyses, were performed. The empirical and targeted therapy cohorts comprised 316 and 421 cases, respectively; 30-day mortality was 18.7% with fluconazole and 33.9% with echinocandins (p 0.02) in the empirical therapy group and 19.8% with fluconazole and 27.7% with echinocandins (p 0.06) in the targeted therapy group. Multivariate Cox regression analysis including PS showed that empirical therapy with fluconazole was associated with better prognosis (adjusted hazard ratio 0.38; 95% confidence interval 0.17-0.81; p 0.01); no differences were found within each PS quartile or in cases matched according to PS. Targeted therapy with fluconazole did not show a significant association with mortality in the Cox regression analysis (adjusted hazard ratio 0.77; 95% confidence interval 0.41-1.46; p 0.63), in the PS quartiles or in PS-matched cases. The results were similar among patients with severe sepsis and septic shock. Empirical or targeted treatment with fluconazole was not associated with increased 30-day mortality compared to echinocandins among adults with candidemia. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
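
    The propensity-score adjustment used in such cohorts can be sketched in a few lines: model the probability of receiving echinocandins from baseline covariates, then stratify the outcome comparison on PS quartiles. All variables below are random placeholders, not the CANDIPOP data:

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 700
    df = pd.DataFrame({
        "age": rng.normal(62, 15, n),
        "septic_shock": rng.integers(0, 2, n),
        "echinocandin": rng.integers(0, 2, n),   # treatment received
        "died_30d": rng.integers(0, 2, n),       # outcome
    })

    covars = df[["age", "septic_shock"]]
    ps = LogisticRegression().fit(covars, df["echinocandin"])
    df["ps_quartile"] = pd.qcut(ps.predict_proba(covars)[:, 1], 4, labels=False)

    # Compare mortality by treatment within each PS quartile
    print(df.groupby(["ps_quartile", "echinocandin"])["died_30d"].mean())
    ```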

  5. Virtual reality as a human factors design analysis tool: Macro-ergonomic application validation and assessment of the Space Station Freedom payload control area

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P.

    1994-01-01

    A virtual reality (VR) Applications Program has been under development at MSFC since 1989. Its objectives are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training, and science training. A variety of activities are under way within many of these areas. One ongoing macro-ergonomic application of VR relates to the design of the Space Station Freedom Payload Control Area (PCA), the control room from which onboard payload operations are managed. Several preliminary conceptual PCA layouts have been developed and modeled in VR. Various managers and potential end users have virtually 'entered' these rooms and provided valuable feedback. Before VR can be used with confidence in a particular application, it must be validated, or calibrated, for that class of applications. Two associated validation studies for macro-ergonomic applications are under way to help characterize possible distortions or filtering of relevant perceptions in a virtual world. In both studies, existing control rooms and their virtual counterparts will be empirically compared using distance and heading estimations to objects and subjective assessments. Approaches and findings of the PCA activities and details of the studies are presented.

  6. Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.

    PubMed

    Xie, Yanmei; Zhang, Biao

    2017-04-20

    Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and Nutrition Examination Survey (NHANES).
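
    The core computation in empirical likelihood, the inner Lagrange-multiplier solve, can be shown for the simplest case of a mean. This is a much-reduced sketch of the estimating-equation machinery the paper combines, using synthetic data and a plain Newton iteration:

    ```python
    import numpy as np

    def neg2_log_el_ratio(x, mu, iters=50):
        """-2 log empirical likelihood ratio for H0: E[X] = mu (Owen-style).
        Weights are w_i = 1 / (n * (1 + lam * (x_i - mu)))."""
        z = np.asarray(x, float) - mu
        lam = 0.0
        for _ in range(iters):            # Newton steps for the multiplier
            denom = 1.0 + lam * z
            grad = np.sum(z / denom)      # estimating equation in lam
            hess = -np.sum(z**2 / denom**2)
            lam -= grad / hess
        return 2.0 * np.sum(np.log1p(lam * z))

    x = np.random.default_rng(0).normal(loc=1.0, size=200)
    print(neg2_log_el_ratio(x, mu=1.0))  # near 0 under H0; ~chi2_1 asymptotics
    print(neg2_log_el_ratio(x, mu=1.5))  # large when mu is wrong
    ```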

  7. Refined discrete and empirical horizontal gradients in VLBI analysis

    NASA Astrophysics Data System (ADS)

    Landskron, Daniel; Böhm, Johannes

    2018-02-01

    Missing or incorrect consideration of azimuthal asymmetry of troposphere delays is a considerable error source in space geodetic techniques such as Global Navigation Satellite Systems (GNSS) or Very Long Baseline Interferometry (VLBI). So-called horizontal troposphere gradients are generally utilized for modeling such azimuthal variations and are particularly required for observations at low elevation angles. Apart from estimating the gradients within the data analysis, which has become common practice in space geodetic techniques, there is also the possibility to determine the gradients beforehand from data sources other than the actual observations. Using ray-tracing through Numerical Weather Models (NWMs), we determined discrete gradient values referred to as GRAD for VLBI observations, based on the standard gradient model by Chen and Herring (J Geophys Res 102(B9):20489-20502, 1997, https://doi.org/10.1029/97JB01739) and also for new, higher-order gradient models. These gradients are produced on the same data basis as the Vienna Mapping Functions 3 (VMF3) (Landskron and Böhm in J Geod, 2017, https://doi.org/10.1007/s00190-017-1066-2), so they can also be regarded as the VMF3 gradients, being fully consistent with each other. From VLBI analyses with the Vienna VLBI and Satellite Software (VieVS), it becomes evident that baseline length repeatabilities (BLRs) are improved on average by 5% when using the a priori gradients GRAD instead of estimating the gradients. The reason for this improvement is that gradient estimation yields poor results for VLBI sessions with a small number of observations, while the GRAD a priori gradients are unaffected by this. We also developed a new empirical gradient model applicable at any time and location on Earth, which is included in the Global Pressure and Temperature 3 (GPT3) model. Although able to describe only the systematic component of azimuthal asymmetry and no short-term variations at all, even these empirical a priori gradients slightly reduce (improve) the BLRs with respect to the estimation of gradients. In general, this paper shows that a priori horizontal gradients are more important for VLBI analysis than previously assumed, as both the discrete model GRAD and the empirical model GPT3 are able to refine and improve the results.
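
    The standard gradient model by Chen and Herring referenced here has a compact closed form: delay(a, e) = m_g(e) * (G_N cos a + G_E sin a), with the gradient mapping function m_g(e) = 1 / (sin e tan e + C). A sketch with placeholder gradient values (the constant C below is the commonly quoted hydrostatic value, an assumption rather than a number taken from this paper):

    ```python
    import numpy as np

    def gradient_delay(az_deg, el_deg, Gn_mm, Ge_mm, C=0.0032):
        a, e = np.radians(az_deg), np.radians(el_deg)
        m_g = 1.0 / (np.sin(e) * np.tan(e) + C)  # gradient mapping function
        return m_g * (Gn_mm * np.cos(a) + Ge_mm * np.sin(a))

    # A 1 mm north gradient observed toward north at 5 degrees elevation
    print(f"{gradient_delay(0.0, 5.0, Gn_mm=1.0, Ge_mm=0.0):.1f} mm slant delay")
    ```

    The mapping function blows up at low elevations, which is why the abstract stresses that gradients matter most for low-elevation observations.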

  8. Philosophy and the front line of science.

    PubMed

    Pernu, Tuomas K

    2008-03-01

    According to one traditional view, empirical science is necessarily preceded by philosophical analysis. Yet the relevance of philosophy is often doubted by those engaged in empirical sciences. I argue that these doubts can be substantiated by two theoretical problems that the traditional conception of philosophy is bound to face. First, there is a strong normative etiology to philosophical problems, theories, and notions that is difficult to reconcile with descriptive empirical study. Second, conceptual analysis (a role that is typically assigned to philosophy) seems to lose its object of study if it is granted that terms do not have purely conceptual meanings detached from their actual use in empirical sciences. These problems are particularly acute for current naturalistic philosophy of science. I suggest a more concrete integration of philosophy and the sciences as a possible way of making philosophy of science have more impact.

  9. Multimodal Pressure Flow Analysis: Application of Hilbert Huang Transform in Cerebral Blood Flow Regulation

    PubMed Central

    Lo, Men-Tzung; Hu, Kun; Liu, Yanhui; Peng, C.-K.; Novak, Vera

    2008-01-01

    Quantification of nonlinear interactions between two nonstationary signals presents a computational challenge in different research fields, especially for assessments of physiological systems. Traditional approaches that are based on theories of stationary signals cannot resolve nonstationarity-related issues and, thus, cannot reliably assess nonlinear interactions in physiological systems. In this review we discuss a new technique, the multi-modal pressure-flow (MMPF) method, which utilizes Hilbert-Huang transformation to quantify dynamic cerebral autoregulation (CA) by studying the interaction between nonstationary cerebral blood flow velocity (BFV) and blood pressure (BP). CA is an important mechanism responsible for controlling cerebral blood flow in response to fluctuations in systemic BP within a few heart-beats. The influence of CA is traditionally assessed from the relationship between the well-pronounced systemic BP and BFV oscillations induced by clinical tests. Reliable noninvasive assessment of dynamic CA, however, remains a challenge in clinical and diagnostic medicine. In this brief review we: 1) present an overview of transfer function analysis (TFA), which is traditionally used to quantify CA; 2) describe the MMPF method and its modifications; 3) introduce a newly developed automatic algorithm and engineering aspects of the improved MMPF method; and 4) review clinical applications of MMPF and its sensitivity for detection of CA abnormalities in clinical studies. The MMPF analysis decomposes complex nonstationary BP and BFV signals into multiple empirical modes adaptively so that the fluctuations caused by a specific physiologic process can be represented in a corresponding empirical mode. Using this technique, we recently showed that dynamic CA can be characterized by specific phase delays between the decomposed BP and BFV oscillations, and that the phase shifts are significantly reduced in hypertensive, diabetic, and stroke subjects with impaired CA. In addition, the new technique enables reliable assessment of CA using both data collected during clinical tests and spontaneous BP/BFV fluctuations during baseline resting conditions. PMID:18725996

  10. How GPs value guidelines applied to patients with multimorbidity: a qualitative study.

    PubMed

    Luijks, Hilde; Lucassen, Peter; van Weel, Chris; Loeffen, Maartje; Lagro-Janssen, Antoine; Schermer, Tjard

    2015-10-26

    To explore and describe the value general practitioners (GPs) attribute to medical guidelines when they are applied to patients with multimorbidity, and to describe which benefits GPs experience from guideline adherence in these patients. We also aimed to identify limitations of guideline adherence in patients with multimorbidity, as perceived by GPs, and to describe their empirical solutions for managing these obstacles. Focus group study with purposive sampling of participants. Focus groups were guided by an experienced moderator who used an interview guide. Interviews were transcribed verbatim. Data analysis was performed by two researchers using the constant comparison analysis technique, and field notes were used in the analysis. Data collection proceeded until saturation was reached. Primary care, eastern part of The Netherlands. Dutch GPs, heterogeneous in age, sex and academic involvement. 25 GPs participated in five focus groups. GPs valued the guidance that guidelines provide, but experienced shortcomings when they were applied to patients with multimorbidity. Taking these patients' personal circumstances into account was regarded as important, but it was impeded by a consistent focus on guideline adherence. Preventative measures were considered less appropriate in (elderly) patients with multimorbidity. Moreover, the applicability of guidelines to patients with multimorbidity was questioned. GPs' extensive practical experience with managing multimorbidity resulted in several empirical solutions, for example, using their 'common sense' to respond to the perceived shortcomings. GPs applying guidelines to patients with multimorbidity integrate patient-specific factors in their medical decisions, aiming for patient-centred solutions. Such integration of clinical experience and best evidence is required to practise evidence-based medicine. More flexibility in pay-for-performance systems is needed to facilitate this integration. Several improvements in guideline reporting are necessary to enhance the applicability of guidelines to patients with multimorbidity. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  11. A cost-effectiveness analysis of "test" versus "treat" patients hospitalized with suspected influenza in Hong Kong.

    PubMed

    You, Joyce H S; Chan, Eva S K; Leung, Maggie Y K; Ip, Margaret; Lee, Nelson L S

    2012-01-01

    Seasonal and 2009 H1N1 influenza viruses may cause severe disease and result in excess hospitalization and mortality in older and younger adults, respectively. Early antiviral treatment may improve clinical outcomes. We examined the potential outcomes and costs of test-guided versus empirical treatment in patients hospitalized for suspected influenza in Hong Kong. We designed a decision tree to simulate potential outcomes of four management strategies in adults hospitalized for severe respiratory infection suspected of influenza: "immunofluorescence-assay" (IFA) or "polymerase-chain-reaction" (PCR)-guided oseltamivir treatment, "empirical treatment plus PCR" and "empirical treatment alone". Model inputs were derived from the literature. The average prevalence (11%) of influenza in 2010-2011 (58% being 2009 H1N1) among cases of respiratory infection was used in the base-case analysis. The primary outcome simulated was the incremental cost per quality-adjusted life-year (QALY) gained (ICER) from the perspective of Hong Kong healthcare providers. In the base-case analysis, "empirical treatment alone" was shown to be the most cost-effective strategy and dominated the other three options. Sensitivity analyses showed that "PCR-guided treatment" would dominate "empirical treatment alone" when the daily cost of oseltamivir exceeded USD 18, or when influenza prevalence was <2.5% and the predominant circulating viruses were not 2009 H1N1. Using USD 50,000 as the threshold of willingness-to-pay, "empirical treatment alone" and "PCR-guided treatment" were cost-effective 97% and 3% of the time, respectively, in 10,000 Monte-Carlo simulations. During influenza epidemics, empirical antiviral treatment appears to be a cost-effective strategy for managing patients hospitalized with severe respiratory infection suspected of influenza, from the perspective of healthcare providers in Hong Kong.

  12. Quantum optimization for training support vector machines.

    PubMed

    Anguita, Davide; Ridella, Sandro; Rivieccio, Fabio; Zunino, Rodolfo

    2003-01-01

    Refined concepts, such as Rademacher estimates of model complexity and nonlinear criteria for weighting empirical classification errors, represent recent and promising approaches to characterizing the generalization ability of Support Vector Machines (SVMs). The advantages of those techniques lie in both improving the SVM representation ability and yielding tighter generalization bounds. On the other hand, they often make Quadratic-Programming algorithms no longer applicable, so SVM training cannot benefit from efficient, specialized optimization techniques. The paper considers the application of Quantum Computing to solve the problem of effective SVM training, especially in the case of digital implementations. The presented research compares the behavioral aspects of conventional and enhanced SVMs; experiments on both synthetic and real-world problems support the theoretical analysis. At the same time, the related differences between Quadratic-Programming and Quantum-based optimization techniques are considered.

  13. Do Farm Advisory Services Improve Adoption of Rural Development Policies? An Empirical Analysis in GI Areas

    ERIC Educational Resources Information Center

    De Rosa, Marcello; Bartoli, Luca

    2017-01-01

    Purpose: The aim of the paper is to evaluate how advisory services stimulate the adoption of rural development policies (RDP) aiming at value creation. Design/methodology/approach: By linking the use of agricultural extension services (AES) to policies for value creation, we will put forward an empirical analysis in Italy, with the aim of…

  14. Reflections on the Human Terrain System During the First 4 Years

    DTIC Science & Technology

    2011-09-01

    contracted social science research and analysis capability in both Iraq and Afghanistan to conduct empirical qualitative and quantitative research to ... problematic. All research products in the public domain (including ethnographies produced by academic anthropologists) are accessible by intelligence...

  15. Accuracy of Revised and Traditional Parallel Analyses for Assessing Dimensionality with Binary Data

    ERIC Educational Resources Information Center

    Green, Samuel B.; Redell, Nickalus; Thompson, Marilyn S.; Levy, Roy

    2016-01-01

    Parallel analysis (PA) is a useful empirical tool for assessing the number of factors in exploratory factor analysis. On conceptual and empirical grounds, we argue for a revision to PA that makes it more consistent with hypothesis testing. Using Monte Carlo methods, we evaluated the relative accuracy of the revised PA (R-PA) and traditional PA…
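
    A sketch of traditional PA for continuous data (the revised R-PA and the binary-data adjustments discussed in the article are not reproduced here): retain factors whose sample eigenvalues exceed the mean eigenvalues of random data of the same dimensions. Data below are synthetic, with one common factor planted:

    ```python
    import numpy as np

    def parallel_analysis(X, n_sims=200, seed=0):
        rng = np.random.default_rng(seed)
        n, p = X.shape
        # Observed correlation-matrix eigenvalues, descending
        obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
        rand = np.empty((n_sims, p))
        for s in range(n_sims):
            R = rng.normal(size=(n, p))   # random data, same shape
            rand[s] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
        return int(np.sum(obs > rand.mean(axis=0)))

    X = np.random.default_rng(1).normal(size=(300, 8))
    f = np.random.default_rng(2).normal(size=300)    # one common factor
    X[:, :3] += np.outer(f, [1.0, 1.0, 1.0])
    print("factors to retain:", parallel_analysis(X))
    ```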

  16. Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers

    PubMed Central

    García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta

    2016-01-01

    The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine. PMID:28773653
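
    As a rough illustration of the classification step described above, here is a minimal scikit-learn sketch that trains a support vector classifier on (critical span, rock mass rating) case histories; the data points and labels are invented placeholders, not the paper's database.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical case histories: [critical span (m), rock mass rating (RMR)].
X = np.array([[4, 70], [6, 75], [10, 55], [14, 60], [18, 40], [22, 45]])
y = np.array([0, 0, 1, 1, 2, 2])   # 0 stable, 1 potentially unstable, 2 unstable

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(clf.predict([[12, 50]]))     # classify a proposed span/RMR design point
```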

  18. What Is Heartburn Worth?

    PubMed Central

    Heudebert, Gustavo R; Centor, Robert M; Klapow, Joshua C; Marks, Robert; Johnson, Lawrence; Wilcox, C Mel

    2000-01-01

    OBJECTIVE To determine the best treatment strategy for the management of patients presenting with symptoms consistent with uncomplicated heartburn. METHODS We performed a cost-utility analysis of 4 alternatives: empirical proton pump inhibitor, empirical histamine2-receptor antagonist, and diagnostic strategies consisting of either esophagogastroduodenoscopy (EGD) or an upper gastrointestinal series before treatment. The time horizon of the model was 1 year. The base-case analysis assumed a cohort of otherwise healthy 45-year-old individuals in a primary care practice. MAIN RESULTS Empirical treatment with a proton pump inhibitor was projected to provide the greatest quality-adjusted survival for the cohort. Empirical treatment with a histamine2-receptor antagonist was projected to be the least costly of the alternatives. The marginal cost-effectiveness of using a proton pump inhibitor over a histamine2-receptor antagonist was approximately $10,400 per quality-adjusted life year (QALY) gained in the base-case analysis and was less than $50,000 per QALY as long as the utility for heartburn was less than 0.95. Both diagnostic strategies were dominated by the proton pump inhibitor alternative. CONCLUSIONS Empirical treatment seems to be the optimal initial management strategy for patients with heartburn, but the choice between a proton pump inhibitor and a histamine2-receptor antagonist depends on the impact of heartburn on quality of life. PMID:10718898

  19. Some problems with social cognition models: a pragmatic and conceptual analysis.

    PubMed

    Ogden, Jane

    2003-07-01

    Empirical articles published between 1997 and 2001 from 4 health psychology journals that tested or applied 1 or more social cognition models (theory of reasoned action, theory of planned behavior, health belief model, and protection motivation theory; N = 47) were scrutinized for their pragmatic and conceptual basis. In terms of their pragmatic basis, these 4 models were useful for guiding research. The analysis of their conceptual basis was less positive. First, these models do not enable the generation of hypotheses because their constructs are unspecific; they therefore cannot be tested. Second, they focus on analytic truths rather than synthetic ones, and the conclusions resulting from their application are often true by definition rather than by observation. Finally, they may create and change both cognitions and behavior rather than describe them.

  20. Factors influencing health information system adoption in American hospitals.

    PubMed

    Wang, Bill B; Wan, Thomas T H; Burke, Darrell E; Bazzoli, Gloria J; Lin, Blossom Y J

    2005-01-01

    To study the number of health information systems (HISs), applicable to administrative, clinical, and executive decision support functionalities, adopted by acute care hospitals, and to examine how hospital market, organizational, and financial factors influence HIS adoption. A cross-sectional analysis was performed with 1441 hospitals selected from metropolitan statistical areas in the United States. Multiple data sources were merged. Six hypotheses were empirically tested by multiple regression analysis. HIS adoption was influenced by hospital market, organizational, and financial factors. Larger, system-affiliated, and for-profit hospitals with more preferred provider organization contracts are more likely to adopt managerial information systems than their counterparts. Operating revenue is positively associated with HIS adoption. The study concludes that hospital organizational and financial factors influence hospitals' strategic adoption of clinical, administrative, and managerial information systems.

  1. Wavelet Analysis for Wind Fields Estimation

    PubMed Central

    Leite, Gladeston C.; Ushizima, Daniela M.; Medeiros, Fátima N. S.; de Lima, Gilson G.

    2010-01-01

    Wind field analysis from synthetic aperture radar images allows the estimation of wind direction and speed based on image descriptors. In this paper, we propose a framework to automate wind direction retrieval based on wavelet decomposition associated with spectral processing. We extend existing undecimated wavelet transform approaches by including the à trous transform with a B3 spline scaling function, in addition to other wavelet bases such as the Gabor and Mexican-hat wavelets. The purpose is to extract more reliable directional information when wind speed values range from 5 to 10 m/s. Using C-band empirical models, associated with the estimated directional information, we calculate local wind speed values and compare our results with QuikSCAT scatterometer data. The proposed approach has potential application in the evaluation of oil spills and wind farms. PMID:22219699

  2. Using partial site aggregation to reduce bias in random utility travel cost models

    NASA Astrophysics Data System (ADS)

    Lupi, Frank; Feather, Peter M.

    1998-12-01

    We propose a "partial aggregation" strategy for defining the recreation sites that enter choice sets in random utility models. Under the proposal, the most popular sites and sites that will be the subject of policy analysis enter choice sets as individual sites while remaining sites are aggregated into groups of similar sites. The scheme balances the desire to include all potential substitute sites in the choice sets with practical data and modeling constraints. Unlike fully aggregate models, our analysis and empirical applications suggest that the partial aggregation approach reasonably approximates the results of a disaggregate model. The partial aggregation approach offers all of the data and computational advantages of models with aggregate sites but does not suffer from the same degree of bias as fully aggregate models.

  3. Labyrinth Seal Analysis. Volume 3. Analytical and Experimental Development of a Design Model for Labyrinth Seals

    DTIC Science & Technology

    1986-01-01

    the information that has been determined experimentally. The Labyrinth Seal Analysis program was, therefore, directed to the development of an...labyrinth seal performance, the program included the development of an improved empirical design model to provide the calculation of the flow... program. Phase I was directed to the analytical development of both an analysis model and an improved empirical design model. Supporting rig tests

  4. Empirical model for calculating vapor-liquid equilibrium and associated phase enthalpy for the CO2-O2-Kr-Xe system for application to the KALC process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glass, R. W.; Gilliam, T. M.; Fowler, V. L.

    An empirical model is presented for vapor-liquid equilibria and enthalpy for the CO2-O2 system. In the model, krypton and xenon in very low concentrations are combined with the CO2-O2 system, thereby representing the total system of primary interest in the High-Temperature Gas-Cooled Reactor program for removing krypton from off-gas generated during the reprocessing of spent fuel. Selected properties of the individual and combined components being considered are presented in the form of tables and empirical equations.

  5. Candidate genetic pathways for attention-deficit/hyperactivity disorder (ADHD) show association to hyperactive/impulsive symptoms in children with ADHD.

    PubMed

    Bralten, Janita; Franke, Barbara; Waldman, Irwin; Rommelse, Nanda; Hartman, Catharina; Asherson, Philip; Banaschewski, Tobias; Ebstein, Richard P; Gill, Michael; Miranda, Ana; Oades, Robert D; Roeyers, Herbert; Rothenberger, Aribert; Sergeant, Joseph A; Oosterlaan, Jaap; Sonuga-Barke, Edmund; Steinhausen, Hans-Christoph; Faraone, Stephen V; Buitelaar, Jan K; Arias-Vásquez, Alejandro

    2013-11-01

    Because multiple genes with small effect sizes are assumed to play a role in attention-deficit/hyperactivity disorder (ADHD) etiology, considering multiple variants within the same analysis likely increases the total explained phenotypic variance, thereby boosting the power of genetic studies. This study investigated whether pathway-based analysis could bring scientists closer to unraveling the biology of ADHD. A pathway was defined as a predefined selection of genes based on a well-established database or literature data. Common genetic variants in pathways involved in dopamine/norepinephrine and serotonin neurotransmission and in genes involved in neuritic outgrowth were investigated in cases from the International Multicentre ADHD Genetics (IMAGE) study. Multivariable analysis was performed to combine the effects of single genetic variants within the pathway genes. Phenotypes were DSM-IV symptom counts for inattention and hyperactivity/impulsivity (n = 871) and symptom severity measured with the Conners Parent (n = 930) and Teacher (n = 916) Rating Scales. Summing the genetic effects of common genetic variants within the pathways showed a significant association with hyperactive/impulsive symptoms (empirical p = .007) but not with inattentive symptoms (empirical p = .73). Analysis of parent-rated Conners hyperactive/impulsive symptom scores validated this result (empirical p = .0018). Teacher-rated Conners scores were not associated. Post hoc analyses showed a significant contribution of all pathways to the hyperactive/impulsive symptom domain (dopamine/norepinephrine, empirical p = .0004; serotonin, empirical p = .0149; neuritic outgrowth, empirical p = .0452). The present analysis shows an association between common variants in 3 genetic pathways and the hyperactive/impulsive component of ADHD. This study demonstrates that pathway-based association analyses, using quantitative measurements of ADHD symptom domains, can increase the power of genetic analyses to identify biological risk factors involved in this disorder. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.

  6. Flood Change Assessment and Attribution in Austrian alpine Basins

    NASA Astrophysics Data System (ADS)

    Claps, Pierluigi; Allamano, Paola; Como, Anastasia; Viglione, Alberto

    2016-04-01

    The present paper aims to investigate the sensitivity of flood peaks to global warming in Austrian alpine basins. A group of 97 Austrian watersheds, with areas ranging from 14 to 6000 km2 and with average elevations ranging from 1000 to 2900 m a.s.l., has been considered. Annual maximum floods are available for the basins from 1890 to 2007 with two densities of observation: in a first period, until 1950, an average of 42 concurrent flood-peak records is available, while from 1951 to 2007 the density of observation increases to an average of 85 contemporary peaks. This information is very important with reference to the statistical tool used for the empirical assessment of change over time, namely linear quantile regression. Application of this tool to the data set unveils trends in extreme events, confirmed by statistical testing, for the 0.75 and 0.95 empirical quantiles. All applications are made with specific discharge (discharge/area) values. Similarly to a previous approach, multiple quantile regressions have also been applied, confirming the presence of trends even when the possible interference of specific discharge with morphoclimatic parameters (i.e., mean elevation and catchment area) is accounted for. Application of the geomorphoclimatic model by Allamano et al. (2009) makes it possible to assess to what extent the empirically observed increases in air temperature and annual rainfall can justify the attribution of the changes detected by the empirical statistical tools. A comparison with data from Swiss alpine basins treated in a previous paper is finally undertaken.
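
    As a rough illustration of the trend-detection step, the sketch below fits linear quantile regressions at the 0.75 and 0.95 quantiles to a synthetic series of annual maximum specific discharges; the data-generating numbers are invented, and the statsmodels quantreg API is assumed to be available.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic annual maximum specific discharges (m^3 s^-1 km^-2), 1890-2007.
rng = np.random.default_rng(0)
years = np.arange(1890, 2008)
q = 0.5 + 0.002 * (years - 1890) + rng.gumbel(0.0, 0.15, years.size)
df = pd.DataFrame({"year": years, "q": q})

# Linear quantile regression at the upper quantiles used in the study.
for tau in (0.75, 0.95):
    fit = smf.quantreg("q ~ year", df).fit(q=tau)
    print(f"tau={tau}: slope = {fit.params['year']:.4f} per year")
```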

  7. Modeling the risk of water pollution by pesticides from imbalanced data.

    PubMed

    Trajanov, Aneta; Kuzmanovski, Vladimir; Real, Benoit; Perreau, Jonathan Marks; Džeroski, Sašo; Debeljak, Marko

    2018-04-30

    The pollution of ground and surface waters with pesticides is a serious ecological issue that requires adequate treatment. Most of the existing water pollution models are mechanistic mathematical models. While they have made a significant contribution to understanding the transfer processes, they are difficult to validate because of their complexity, the subjectivity of their user-supplied parameterization, and the scarcity of empirical data. In addition, the data describing water pollution with pesticides are, in most cases, very imbalanced. This is due to strict regulations on pesticide application, which lead to only a few pollution events. In this study, we propose the use of data mining to build models for assessing the risk of water pollution by pesticides in field-drained outflow water. Unlike the mechanistic models, the models generated by data mining are based on easily obtainable empirical data, and their parameterization is not influenced by the subjectivity of ecological modelers. We used empirical data from field trials at the La Jaillière experimental site in France and applied the random forests algorithm to build predictive models that predict "risky" and "not-risky" pesticide application events. To address the problem of imbalanced classes in the data, cost-sensitive learning and different measures of predictive performance were used. Despite the high imbalance between risky and not-risky application events, we managed to build models that make reliable predictions. The proposed modeling approach can be easily applied to other ecological modeling problems where empirical data have highly imbalanced classes.
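
    The cost-sensitive learning step can be approximated in scikit-learn by reweighting classes in a random forest, as in the minimal sketch below; the synthetic data stand in for the imbalanced application-event records, and the settings are illustrative, not the study's configuration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the imbalanced data: ~5% "risky" events.
X, y = make_classification(n_samples=2000, n_features=12, weights=[0.95],
                           random_state=0)

# class_weight="balanced" penalizes misclassifying the rare class more
# heavily -- one simple form of cost-sensitive learning.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             random_state=0)

# Balanced accuracy is a more informative metric than raw accuracy here.
print(cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy").mean())
```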

  8. Systematic approach to developing empirical interatomic potentials for III-N semiconductors

    NASA Astrophysics Data System (ADS)

    Ito, Tomonori; Akiyama, Toru; Nakamura, Kohji

    2016-05-01

    A systematic approach to the derivation of empirical interatomic potentials is developed for III-N semiconductors with the aid of ab initio calculations. The parameter values of an empirical potential based on the bond order potential are determined by reproducing the cohesive energy differences among 3-fold coordinated hexagonal, 4-fold coordinated zinc blende, wurtzite, and 6-fold coordinated rocksalt structures in BN, AlN, GaN, and InN. The bond order p is successfully introduced as a function of the coordination number Z in the form p = a exp(-b Z^n) for Z ≤ 4 and p = (4/Z)^α for Z ≥ 4 in the empirical interatomic potential. Moreover, the energy difference between wurtzite and zinc blende structures can be successfully evaluated by considering interaction beyond the second-nearest neighbors as a function of ionicity. This approach is feasible for developing empirical interatomic potentials applicable to systems containing poorly coordinated atoms at surfaces and interfaces, including nanostructures.
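
    A direct transcription of the quoted piecewise bond-order form follows; the parameter values are placeholders (in practice a, b, n, and α would be fitted, so that the two branches agree at Z = 4).

```python
import numpy as np

def bond_order(Z, a, b, n, alpha):
    """Piecewise bond order p(Z) as quoted in the abstract; the parameters
    here are illustrative placeholders, not fitted values."""
    Z = np.asarray(Z, dtype=float)
    return np.where(Z <= 4, a * np.exp(-b * Z**n), (4.0 / Z) ** alpha)

print(bond_order([2, 3, 4, 6], a=1.2, b=0.1, n=1.5, alpha=0.8))
```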

  9. Thermodynamics of Oligonucleotide Duplex Melting

    ERIC Educational Resources Information Center

    Schreiber-Gosche, Sherrie; Edwards, Robert A.

    2009-01-01

    Melting temperatures of oligonucleotides are useful for a number of molecular biology applications, such as the polymerase chain reaction (PCR). Although melting temperatures are often calculated with simplistic empirical equations, application of thermodynamics provides more accurate melting temperatures and an opportunity for students to apply…
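
    As a concrete contrast to the thermodynamic (nearest-neighbor) approach the article advocates, here is one of the simplistic empirical rules it alludes to, the Wallace rule for short oligos; a minimal sketch, not the article's method.

```python
def tm_wallace(seq):
    """Wallace rule, a simple empirical estimate for short oligos:
    Tm = 2(A+T) + 4(G+C) degrees Celsius."""
    s = seq.upper()
    return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

print(tm_wallace("AGCTTGCAA"))  # rough Tm (deg C) for a short primer
```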

  10. Benefit-cost analysis of addiction treatment: methodological guidelines and empirical application using the DATCAP and ASI.

    PubMed

    French, Michael T; Salomé, Helena J; Sindelar, Jody L; McLellan, A Thomas

    2002-04-01

    To provide detailed methodological guidelines for using the Drug Abuse Treatment Cost Analysis Program (DATCAP) and Addiction Severity Index (ASI) in a benefit-cost analysis of addiction treatment. A representative benefit-cost analysis of three outpatient programs was conducted to demonstrate the feasibility and value of the methodological guidelines. Procedures are outlined for using resource use and cost data collected with the DATCAP. Techniques are described for converting outcome measures from the ASI to economic (dollar) benefits of treatment. Finally, principles are advanced for conducting a benefit-cost analysis and a sensitivity analysis of the estimates. The DATCAP was administered at three outpatient drug-free programs in Philadelphia, PA, for 2 consecutive fiscal years (1996 and 1997). The ASI was administered to a sample of 178 treatment clients at treatment entry and at 7 months postadmission. The DATCAP and ASI appear to have significant potential for contributing to an economic evaluation of addiction treatment. The benefit-cost analysis and subsequent sensitivity analysis all showed that total economic benefit was greater than total economic cost at the three outpatient programs, but this representative application is meant to stimulate future economic research rather than to justify treatment per se. This study used previously validated, research-proven instruments and methods to perform a practical benefit-cost analysis of real-world treatment programs. The study demonstrates one way to combine economic and clinical data and offers a methodological foundation for future economic evaluations of addiction treatment.

  11. Accuracy Analysis of a Box-wing Theoretical SRP Model

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoya; Hu, Xiaogong; Zhao, Qunhe; Guo, Rui

    2016-07-01

    For the BeiDou navigation satellite system (BDS), a high-accuracy solar radiation pressure (SRP) model is necessary for high-precision applications, especially as the global BDS constellation is completed, and the accuracy of the BDS broadcast ephemeris also needs to be improved. We therefore established a box-wing theoretical SRP model with fine structure, including a conical shadow factor for the Earth and Moon. We verified this SRP model using the GPS Block IIF satellites, with data from the PRN 1, 24, 25, and 27 satellites. The results show that the physical SRP model yields higher accuracy for precise orbit determination (POD) and orbit prediction of the GPS IIF satellites than the Bern empirical model, with a 3D orbit RMS of about 20 centimeters. POD accuracy is similar for the two models, but prediction accuracy with the physical SRP model is more than doubled. We tested 1-day, 3-day, and 7-day orbit predictions; the longer the prediction arc, the more significant the improvement. Orbit prediction accuracies with the physical SRP model for 1-day, 3-day, and 7-day arcs are 0.4 m, 2.0 m, and 10.0 m respectively, versus 0.9 m, 5.5 m, and 30 m with the Bern empirical model. We then applied this approach to BDS, derived an SRP model for the BeiDou satellites, and tested the model with one month of BeiDou data. Initial results show that the model is good but needs more data for verification and improvement: the orbit residual RMS is similar to that of our empirical force model, which estimates forces only in the along-track and cross-track directions plus a y-bias, but the orbit overlap and SLR evaluations show some improvement, and the remaining empirical force is reduced significantly for the present BeiDou constellation.

  12. Adaptive Filtration of Physiological Artifacts in EEG Signals in Humans Using Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Grubov, V. V.; Runnova, A. E.; Hramov, A. E.

    2018-05-01

    A new method is proposed for adaptive filtering of experimental EEG signals in humans and for the removal of different physiological artifacts. The algorithm includes empirical mode decomposition of the EEG, determination of the number of empirical modes to be considered, analysis of the empirical modes and search for the modes that contain artifacts, removal of these modes, and reconstruction of the EEG signal. The method was tested on experimental human EEG signals and demonstrated high efficiency in the removal of different types of physiological EEG artifacts.
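
    A minimal sketch of the EMD-based filtering loop, assuming the third-party PyEMD package ("EMD-signal") and a crude low-frequency criterion for flagging artifact modes; the actual method's mode-selection rule is more involved.

```python
import numpy as np
from PyEMD import EMD  # assumes the PyEMD ("EMD-signal") package is installed

# Synthetic "EEG": a 10 Hz rhythm plus a slow drift standing in for an artifact.
t = np.linspace(0, 10, 2000)
signal = np.sin(2 * np.pi * 10 * t) + 2.0 * np.sin(2 * np.pi * 0.2 * t)

imfs = EMD()(signal)   # decompose into intrinsic mode functions (IMFs)

# Crude, illustrative artifact criterion: drop modes whose dominant
# spectral peak lies below 1 Hz, then reconstruct from the remaining modes.
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
keep = [imf for imf in imfs
        if freqs[np.argmax(np.abs(np.fft.rfft(imf)))] >= 1.0]
cleaned = np.sum(keep, axis=0)
print(len(imfs), "modes,", len(keep), "kept")
```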

  13. SPIN or LURCH : a Comparative Assessment of Model Checking and Stochastic Search for Temporal Properties in Procedural Code

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Owens, David; Menzies, Tim

    2004-01-01

    The difficulty of testing large systems, such as the one on board a NASA robotic remote explorer (RRE) vehicle, is fundamentally a search issue: the problem of searching the global state space representing all possible behaviors has yet to be solved, even after many decades of work. Randomized algorithms have been known to outperform their deterministic counterparts for search problems representing a wide range of applications. In the case study presented here, the LURCH randomized algorithm proved adequate to the task of testing a NASA RRE vehicle. LURCH found all the errors found by an earlier analysis with a more complete method (SPIN). Our empirical results are that LURCH can scale to much larger models than standard model checkers like SMV and SPIN. Further, the LURCH analysis was simpler than the SPIN analysis. The simplicity and scalability of LURCH are two compelling reasons for experimenting further with this tool.

  14. Joint multifractal analysis based on the partition function approach: analytical analysis, numerical simulation and empirical application

    NASA Astrophysics Data System (ADS)

    Xie, Wen-Jie; Jiang, Zhi-Qiang; Gu, Gao-Feng; Xiong, Xiong; Zhou, Wei-Xing

    2015-10-01

    Many complex systems generate multifractal time series which are long-range cross-correlated. Numerous methods have been proposed to characterize the multifractal nature of these long-range cross correlations. However, several important issues about these methods are not well understood and most methods consider only one moment order. We study the joint multifractal analysis based on partition function with two moment orders, which was initially invented to investigate fluid fields, and derive analytically several important properties. We apply the method numerically to binomial measures with multifractal cross correlations and bivariate fractional Brownian motions without multifractal cross correlations. For binomial multifractal measures, the explicit expressions of mass function, singularity strength and multifractal spectrum of the cross correlations are derived, which agree excellently with the numerical results. We also apply the method to stock market indexes and unveil intriguing multifractality in the cross correlations of index volatilities.
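
    The core of the partition-function approach can be sketched in a few lines: normalize the two series into measures, sum the q1- and q2-weighted box masses over a range of scales, and read the joint scaling exponent off a log-log fit. The sketch below is a simplified illustration with uniform random measures, not the authors' implementation.

```python
import numpy as np

def joint_partition_exponent(x, y, q1, q2, scales):
    """Slope of log chi(q1, q2, s) vs log s, where
    chi = sum over boxes of mu1(box)^q1 * mu2(box)^q2."""
    x = np.abs(x) / np.abs(x).sum()          # normalize into measures
    y = np.abs(y) / np.abs(y).sum()
    logs, logchi = [], []
    for s in scales:
        nbox = x.size // s
        m1 = x[:nbox * s].reshape(nbox, s).sum(axis=1)   # box masses
        m2 = y[:nbox * s].reshape(nbox, s).sum(axis=1)
        logs.append(np.log(s))
        logchi.append(np.log(np.sum(m1**q1 * m2**q2)))
    return np.polyfit(logs, logchi, 1)[0]    # joint scaling exponent

rng = np.random.default_rng(0)
x, y = rng.random(4096), rng.random(4096)
print(joint_partition_exponent(x, y, q1=2, q2=2, scales=[4, 8, 16, 32, 64]))
```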

  15. Qualitative fusion technique based on information poor system and its application to factor analysis for vibration of rolling bearings

    NASA Astrophysics Data System (ADS)

    Xia, Xintao; Wang, Zhongyu

    2008-10-01

    For some methods of stability analysis of a system using statistics, it is difficult to resolve the problems of unknown probability distribution and small sample size. Therefore, a novel method is proposed in this paper to resolve these problems. The method is independent of probability distribution and is useful for small-sample systems. After rearrangement of the original data series, the order difference and two polynomial membership functions are introduced to estimate the true value, the lower bound, and the upper bound of the system using fuzzy-set theory. Then the empirical distribution function is investigated to ensure a confidence level above 95%, and the degree of similarity is presented to evaluate the stability of the system. Computer simulation cases investigate stable systems with various probability distributions, unstable systems with linear and periodic systematic errors, and some mixed systems. The analysis method for system stability is thereby validated.

  16. Empirical analysis and modeling of manual turnpike tollbooths in China

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2017-03-01

    To deal with low levels of service satisfaction at the tollbooths of many turnpikes in China, we conduct an empirical study and use a queueing model to investigate performance measures. In this paper, we collect archived data from six tollbooths of a turnpike in China. An empirical analysis of the vehicles' time-dependent arrival process and the collectors' time-dependent service times is conducted. It shows that the vehicle arrival process follows a non-homogeneous Poisson process while the collector service time follows a log-normal distribution. Further, we model the process of collecting tolls at tollbooths with a MAP/PH/1/FCFS queue for mathematical tractability and present some numerical examples.
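
    The two empirical findings translate directly into a simulation recipe: draw arrivals from a non-homogeneous Poisson process (here via Lewis-Shedler thinning) and service times from a log-normal distribution. All rate and distribution parameters below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def nhpp_thinning(rate_fn, t_end, rate_max):
    """Simulate a non-homogeneous Poisson arrival process by thinning."""
    t, arrivals = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > t_end:
            return np.array(arrivals)
        if rng.random() < rate_fn(t) / rate_max:
            arrivals.append(t)

# Illustrative time-of-day arrival rate (vehicles/min) with a midday peak:
def rate(t):
    return 2.0 + 1.5 * np.sin(np.pi * t / 720.0)

arrivals = nhpp_thinning(rate, t_end=1440.0, rate_max=3.5)

# Log-normal service times, matching the distributional finding (parameters
# hypothetical), in seconds:
service = rng.lognormal(mean=2.3, sigma=0.4, size=arrivals.size)
print(arrivals.size, "arrivals; mean service", round(service.mean(), 1), "s")
```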

  17. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    NASA Astrophysics Data System (ADS)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the underlying factors contributing to the default rate of informal lending. The paper adopts snowball-sampling interviews to collect data and uses a logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how informal lending differs from commercial loans. Factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit-scoring indices. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate as to what this may imply for the roles of the formal and informal financial sectors.
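
    A minimal sketch of the modeling step, fitting a logistic regression of default on hypothetical borrower covariates with statsmodels; the variables and coefficients are invented, not the study's.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical borrower-level data: x1 = loan size, x2 = kinship tie (0/1).
rng = np.random.default_rng(0)
n = 400
X = np.column_stack([rng.normal(10.0, 3.0, n), rng.integers(0, 2, n)])
true_logit = -2.0 + 0.15 * X[:, 0] - 1.0 * X[:, 1]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(int)

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(model.params)   # fitted coefficients on the hypothetical default factors
```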

  18. Data analysis using a combination of independent component analysis and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Lin, Shih-Lin; Tung, Pi-Cheng; Huang, Norden E.

    2009-06-01

    A combination of independent component analysis and empirical mode decomposition (ICA-EMD) is proposed in this paper to analyze low signal-to-noise ratio data. The advantages of the ICA-EMD combination are these: ICA needs few sensory clues to separate the original source from unwanted noise, and EMD can effectively separate the data into its constituent parts. The case studies reported here involve original sources contaminated by white Gaussian noise. The simulation results show that the ICA-EMD combination is an effective data analysis tool.
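
    The ICA half of the pipeline can be sketched with scikit-learn's FastICA, as below; pairing it with an EMD stage (e.g. via the PyEMD package) would complete the ICA-EMD combination. The mixing setup is synthetic and illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two synthetic sources, linearly mixed and contaminated with Gaussian noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
S = np.column_stack([np.sin(2 * np.pi * 5 * t),
                     np.sign(np.sin(2 * np.pi * 9 * t))])
X = S @ np.array([[1.0, 0.5], [0.4, 1.0]]) \
    + 0.2 * rng.standard_normal((t.size, 2))

sources = FastICA(n_components=2, random_state=0).fit_transform(X)
print(sources.shape)   # estimated sources; an EMD stage could then decompose
                       # each recovered source into its intrinsic modes
```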

  19. Investigation of short cavity CRDS noise terms by optical correlation

    NASA Astrophysics Data System (ADS)

    Griffin, Steven T.; Fathi, Jason

    2013-05-01

    Cavity Ring Down Spectroscopy (CRDS) has been identified as having significant potential for Department of Defense security and sensing applications. Significant factors in the development of new sensor architectures are portability, robustness, and economy. A significant factor in new CRDS sensor architectures is cavity length. Prior publications have examined the role of cavity length in the sensing modality, both from the standpoint of system design and the identification of potential difficulties presented by novel approaches. Two noise terms of interest here have been designated turbulence-like and speckle-like in prior publications, where theoretical and some empirical data were presented. This presentation addresses the automation of the experimental apparatus, new data analysis, and implications regarding the significance of the two noise terms. This is accomplished through an analog-to-digital conversion (ADC) of the output of a custom-designed optical correlator. Details of the unique application of the developed instrument and implications for short-cavity (portable) CRDS applications are presented.

  20. Fine structure of spectral properties for random correlation matrices: An application to financial markets

    NASA Astrophysics Data System (ADS)

    Livan, Giacomo; Alfarano, Simone; Scalas, Enrico

    2011-07-01

    We study some properties of eigenvalue spectra of financial correlation matrices. In particular, we investigate the nature of the large eigenvalue bulks which are observed empirically, and which have often been regarded as a consequence of the supposedly large amount of noise contained in financial data. We challenge this common knowledge by acting on the empirical correlation matrices of two data sets with a filtering procedure which highlights some of the cluster structure they contain, and we analyze the consequences of such filtering on eigenvalue spectra. We show that empirically observed eigenvalue bulks emerge as superpositions of smaller structures, which in turn emerge as a consequence of cross correlations between stocks. We interpret and corroborate these findings in terms of factor models, and we compare empirical spectra to those predicted by random matrix theory for such models.
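
    The standard benchmark behind this kind of analysis is the Marchenko-Pastur law for pure-noise correlation matrices; the sketch below computes the noise-bulk bounds and counts eigenvalues escaping them, using simulated i.i.d. "returns" rather than the paper's data sets.

```python
import numpy as np

# Eigenvalues of a random correlation matrix vs. Marchenko-Pastur bounds.
rng = np.random.default_rng(0)
T, N = 2000, 200                       # observations x stocks (hypothetical)
R = rng.standard_normal((T, N))        # i.i.d. returns: pure-noise benchmark
C = np.corrcoef(R, rowvar=False)
eig = np.linalg.eigvalsh(C)

q = N / T
lam_min, lam_max = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
print(f"MP noise bulk: [{lam_min:.3f}, {lam_max:.3f}]")
print("eigenvalues outside the bulk:",
      int(np.sum((eig < lam_min) | (eig > lam_max))))
```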

  1. The Past, Present and Future of Geodemographic Research in the United States and United Kingdom

    PubMed Central

    Singleton, Alexander D.; Spielman, Seth E.

    2014-01-01

    This article presents an extensive comparative review of the emergence and application of geodemographics in both the United States and United Kingdom, situating them as an extension of earlier empirically driven models of urban socio-spatial structure. The empirical and theoretical basis for this generalization technique is also considered. Findings demonstrate critical differences in both the application and development of geodemographics between the United States and United Kingdom resulting from their diverging histories, variable data economies, and availability of academic or free classifications. Finally, current methodological research is reviewed, linking this discussion prospectively to the changing spatial data economy in both the United States and United Kingdom. PMID:25484455

  2. Application of Stein and related parametric empirical Bayes estimators to the nuclear plant reliability data system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, J.R.; Heger, A.S.; Koen, B.V.

    1984-04-01

    This report is the result of a preliminary feasibility study of the applicability of Stein and related parametric empirical Bayes (PEB) estimators to the Nuclear Plant Reliability Data System (NPRDS). A new estimator is derived for the means of several independent Poisson distributions with different sampling times. This estimator is applied to data from NPRDS in an attempt to improve failure rate estimation. Theoretical and Monte Carlo results indicate that the new PEB estimator can perform significantly better than the standard maximum likelihood estimator if the estimation of the individual means can be combined through the loss function or through a parametric class of prior distributions.
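
    A generic parametric-empirical-Bayes shrinkage of Poisson failure rates toward the pooled rate, in the spirit of the report's estimator; the exact derivation with unequal sampling times differs, and the method-of-moments weighting below is a common textbook choice shown here as an assumption.

```python
import numpy as np

def peb_poisson_shrinkage(counts, times):
    """Shrink per-unit Poisson rate MLEs toward the pooled rate; a sketch,
    not the report's exact estimator."""
    counts = np.asarray(counts, float)
    times = np.asarray(times, float)
    mle = counts / times                       # individual failure-rate MLEs
    pooled = counts.sum() / times.sum()        # grand (prior) mean rate
    # Method-of-moments weight: shrink more when between-unit variance is
    # small relative to sampling noise.
    sampling_var = pooled / times
    between_var = max(np.var(mle) - sampling_var.mean(), 0.0)
    w = between_var / (between_var + sampling_var)
    return w * mle + (1 - w) * pooled

# Hypothetical failure counts over unequal observation times (hours):
print(peb_poisson_shrinkage([3, 0, 7, 2], [100.0, 80.0, 150.0, 60.0]))
```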

  3. Use of a Game-Like Application on a Mobile Device to Improve Accuracy in Conjugating Spanish Verbs

    ERIC Educational Resources Information Center

    Castañeda, Daniel A.; Cho, Moon-Heum

    2016-01-01

    Interest in using mobile applications to enhance students' learning in Spanish classrooms runs high; however, little empirical research about their effects has been conducted. Using intentionally designed classroom activities to promote meaningful learning with a mobile application, we investigated the extent to which students of Spanish as a…

  4. Aerial spray technology: possibilities and limitations for control of pear thrips

    Treesearch

    Karl Mierzejewski

    1991-01-01

    The feasibility of using aerial application as a means of managing a pear thrips infestation in maple forest stands is examined, based on existing knowledge of forest aerial application acquired from theoretical and empirical studies. Specific strategies by which aerial application should be performed and potential problem areas are discussed. Two new tools, aircraft...

  5. Advancing the detection of steady-state visual evoked potentials in brain-computer interfaces.

    PubMed

    Abu-Alqumsan, Mohammad; Peer, Angelika

    2016-06-01

    Spatial filtering has proved to be a powerful pre-processing step in detection of steady-state visual evoked potentials and boosted typical detection rates both in offline analysis and online SSVEP-based brain-computer interface applications. State-of-the-art detection methods and the spatial filters used thereby share many common foundations as they all build upon the second order statistics of the acquired Electroencephalographic (EEG) data, that is, its spatial autocovariance and cross-covariance with what is assumed to be a pure SSVEP response. The present study aims at highlighting the similarities and differences between these methods. We consider the canonical correlation analysis (CCA) method as a basis for the theoretical and empirical (with real EEG data) analysis of the state-of-the-art detection methods and the spatial filters used thereby. We build upon the findings of this analysis and prior research and propose a new detection method (CVARS) that combines the power of the canonical variates and that of the autoregressive spectral analysis in estimating the signal and noise power levels. We found that the multivariate synchronization index method and the maximum contrast combination method are variations of the CCA method. All three methods were found to provide relatively unreliable detections in low signal-to-noise ratio (SNR) regimes. CVARS and the minimum energy combination methods were found to provide better estimates for different SNR levels. Our theoretical and empirical results demonstrate that the proposed CVARS method outperforms other state-of-the-art detection methods when used in an unsupervised fashion. Furthermore, when used in a supervised fashion, a linear classifier learned from a short training session is able to estimate the hidden user intention, including the idle state (when the user is not attending to any stimulus), rapidly, accurately and reliably.
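
    The baseline CCA detection step described above is easy to sketch: correlate the multichannel EEG with sine/cosine references at each candidate stimulus frequency and pick the frequency with the largest canonical correlation. The snippet below uses scikit-learn's CCA on synthetic data; it illustrates the standard CCA method, not the proposed CVARS estimator.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def ssvep_cca_score(eeg, fs, f_stim, n_harm=2):
    """Canonical correlation between EEG (samples x channels) and sine/cosine
    references at a stimulus frequency and its harmonics."""
    t = np.arange(eeg.shape[0]) / fs
    refs = np.column_stack([f(2 * np.pi * h * f_stim * t)
                            for h in range(1, n_harm + 1)
                            for f in (np.sin, np.cos)])
    u, v = CCA(n_components=1).fit_transform(eeg, refs)
    return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

# Synthetic 4-channel EEG containing a 12 Hz SSVEP response plus noise:
rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 2, 1 / fs)
eeg = np.column_stack([np.sin(2 * np.pi * 12 * t)] * 4) \
    + rng.standard_normal((t.size, 4))

# Detection: the candidate frequency with the largest canonical correlation.
print(max((ssvep_cca_score(eeg, fs, f), f) for f in (8.0, 10.0, 12.0, 15.0)))
```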

  6. Data envelopment analysis in service quality evaluation: an empirical study

    NASA Astrophysics Data System (ADS)

    Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid

    2015-09-01

    Service quality is often conceptualized as the comparison between service expectations and actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the proposed method. A large number of studies have used DEA as a benchmarking tool to measure service quality, but these models do not propose a coherent performance evaluation construct and consequently fail to deliver guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.

  7. Reducing numerical costs for core wide nuclear reactor CFD simulations by the Coarse-Grid-CFD

    NASA Astrophysics Data System (ADS)

    Viellieber, Mathias; Class, Andreas G.

    2013-11-01

    Traditionally, complete nuclear reactor core simulations are performed with subchannel analysis codes that rely on experimental and empirical input. The Coarse-Grid-CFD (CGCFD) intends to replace the experimental or empirical input with CFD data. The reactor core consists of repetitive flow patterns, allowing the general approach of creating a parametrized model for one segment and composing many of those to obtain the entire reactor simulation. The method is based on a detailed, well-resolved CFD simulation of one representative segment. From this simulation we extract so-called parametrized volumetric forces which close an otherwise strongly under-resolved, coarsely meshed model of a complete reactor setup. While this formulation accounts for forces created internally in the fluid, other effects, e.g. obstruction and flow deviation through spacers and wire wraps, still need to be accounted for if the geometric details are not represented in the coarse mesh. These are modelled with an Anisotropic Porosity Formulation (APF). This work focuses on the application of the CGCFD to a complete reactor core setup and the accomplishment of the parametrization of the volumetric forces.

  8. Development and Implementation of an Empirical Ionosphere Variability Model

    NASA Technical Reports Server (NTRS)

    Minow, Joesph I.; Almond, Deborah (Technical Monitor)

    2002-01-01

    Spacecraft designers and operations support personnel involved in space environment analysis for low Earth orbit missions require ionospheric specification and forecast models that provide not only average ionospheric plasma parameters for a given set of geophysical conditions but the statistical variations about the mean as well. This presentation describes the development of a prototype empirical model intended for use with the International Reference Ionosphere (IRI) to provide ionospheric Ne and Te variability. We first describe the database of on-orbit observations from a variety of spacecraft and ground based radars over a wide range of latitudes and altitudes used to obtain estimates of the environment variability. Next, comparison of the observations with the IRI model provide estimates of the deviations from the average model as well as the range of possible values that may correspond to a given IRI output. Options for implementation of the statistical variations in software that can be run with the IRI model are described. Finally, we provide example applications including thrust estimates for tethered satellites and specification of sunrise Ne, Te conditions required to support spacecraft charging issues for satellites with high voltage solar arrays.

  9. A computational efficient modelling of laminar separation bubbles

    NASA Technical Reports Server (NTRS)

    Dini, Paolo; Maughmer, Mark D.

    1990-01-01

    In predicting the aerodynamic characteristics of airfoils operating at low Reynolds numbers, it is often important to account for the effects of laminar (transitional) separation bubbles. Previous approaches to the modelling of this viscous phenomenon range from fast but sometimes unreliable empirical correlations for the length of the bubble and the associated increase in momentum thickness, to more accurate but significantly slower displacement-thickness iteration methods employing inverse boundary-layer formulations in the separated regions. Since the penalty in computational time associated with the more general methods is unacceptable for airfoil design applications, use of an accurate yet computationally efficient model is highly desirable. To this end, a semi-empirical bubble model was developed and incorporated into the Eppler and Somers airfoil design and analysis program. The generality and the efficiency was achieved by successfully approximating the local viscous/inviscid interaction, the transition location, and the turbulent reattachment process within the framework of an integral boundary-layer method. Comparisons of the predicted aerodynamic characteristics with experimental measurements for several airfoils show excellent and consistent agreement for Reynolds numbers from 2,000,000 down to 100,000.

  10. Science in a New Mode: Good Old (Theoretical) Science Versus Brave New (Commodified) Knowledge Production?

    NASA Astrophysics Data System (ADS)

    Knuuttila, Tarja

    2013-10-01

    The present transformation of the university system is conceptualized in terms of such terminologies as "Mode-2 knowledge production" and the "entrepreneurial university." What is remarkable about these analyses is how closely they link the generally accepted requirement of more socially relevant knowledge to the commercialization of university research. This paper critically examines the Mode-1/Mode-2 distinction through a combination of philosophical and empirical analysis. It argues that, from the perspective of actual scientific practice, this Mode-1/Mode-2 distinction and the related transition thesis do not stand closer scrutiny. Theoretical "Mode-1" science shares "Mode-2" features in being also problem-oriented, interventive and transdisciplinary. On the other hand, the empirical case on language technology demonstrates that even in "Mode-2"-like research, undertaken in the "context of application," scientists make a distinction between more difficult scientific problems and those that are considered more applied or commercial. Moreover, the case shows that the need to make such distinctions may even become more acute due to the compromises imposed by the commercialization of research.

  11. Combined Exact-Repeat and Geodetic Mission Altimetry for High-Resolution Empirical Tide Mapping

    NASA Astrophysics Data System (ADS)

    Zaron, E. D.

    2014-12-01

    The configuration of present and historical exact-repeat mission (ERM) altimeter ground tracks determines the maximum resolution of empirical tidal maps obtained with ERM data. Although the mode-1 baroclinic tide is resolvable at mid-latitudes in the open ocean, the ability to detect baroclinic and barotropic tides near islands and complex coastlines is limited, in part, by ERM track density. In order to obtain higher resolution maps, the possibility of combining ERM and geodetic mission (GM) altimetry is considered, using a combination of spatial thin-plate splines and temporal harmonic analysis. Given the present spatial and temporal distribution of GM missions, it is found that GM data can contribute to resolving tidal features smaller than 75 km, provided the signal amplitude is greater than about 1 cm. Uncertainties in the mean sea surface and environmental corrections are significant components of the GM error budget, and methods for data selection and along-track filtering are still being refined. Application to two regions, Monterey Bay and Luzon Strait, finds evidence for complex tidal fields in agreement with independent observations and modeling studies.
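
    The temporal harmonic-analysis half of the method amounts to a least-squares fit of sinusoids at known tidal frequencies; a minimal single-constituent (M2) sketch on synthetic, irregularly sampled heights follows. The spatial thin-plate-spline part is omitted, and all amplitudes are invented.

```python
import numpy as np

M2_PERIOD_H = 12.4206012   # M2 tidal period, hours

# Synthetic, irregularly sampled sea-surface heights over one year:
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 24 * 365, 3000))          # sample times (hours)
omega = 2 * np.pi / M2_PERIOD_H
ssh = 0.03 * np.cos(omega * t - 1.0) + 0.01 * rng.standard_normal(t.size)

# Least-squares harmonic fit: ssh ~ a*cos(wt) + b*sin(wt) + mean.
A = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones_like(t)])
(a, b, _), *_ = np.linalg.lstsq(A, ssh, rcond=None)
print(f"M2 amplitude ~ {np.hypot(a, b):.3f} m, "
      f"phase ~ {np.arctan2(b, a):.2f} rad")
```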

  12. An Empirical Non-TNT Approach to Launch Vehicle Explosion Modeling

    NASA Technical Reports Server (NTRS)

    Blackwood, James M.; Skinner, Troy; Richardson, Erin H.; Bangham, Michal E.

    2015-01-01

    In an effort to increase crew survivability from catastrophic explosions of Launch Vehicles (LV), a study was conducted to determine the best method for predicting LV explosion environments in the near field. After reviewing such methods as TNT equivalence, Vapor Cloud Explosion (VCE) theory, and Computational Fluid Dynamics (CFD), it was determined that the best approach for this study was to assemble all available empirical data from full scale launch vehicle explosion tests and accidents. Approximately 25 accidents or full-scale tests were found that had some amount of measured blast wave, thermal, or fragment explosion environment characteristics. Blast wave overpressure was found to be much lower in the near field than predicted by most TNT equivalence methods. Additionally, fragments tended to be larger, fewer, and slower than expected if the driving force was from a high explosive type event. In light of these discoveries, a simple model for cryogenic rocket explosions is presented. Predictions from this model encompass all known applicable full scale launch vehicle explosion data. Finally, a brief description of on-going analysis and testing to further refine the launch vehicle explosion environment is discussed.

  13. Investigation of KDP crystal surface based on an improved bidimensional empirical mode decomposition method

    NASA Astrophysics Data System (ADS)

    Lu, Lei; Yan, Jihong; Chen, Wanqun; An, Shi

    2018-03-01

    This paper proposes a novel spatial-frequency analysis method for the investigation of potassium dihydrogen phosphate (KDP) crystal surfaces based on an improved bidimensional empirical mode decomposition (BEMD) method. Aiming to eliminate the end effects of the BEMD method and improve the intrinsic mode functions (IMFs) for efficient identification of texture features, a denoising process was embedded in the sifting iteration of the BEMD method. By removing redundant information in the decomposed sub-components of the KDP crystal surface, the middle spatial frequencies of the cutting and feeding processes were identified. A comparative study with the power spectral density method, the two-dimensional wavelet transform (2D-WT), and the traditional BEMD method demonstrated that the method developed in this paper can efficiently extract texture features and reveal the gradient development of the KDP crystal surface. Furthermore, the proposed method is a self-adaptive, data-driven technique requiring no prior knowledge, which overcomes shortcomings of the 2D-WT model such as parameter selection. Additionally, the proposed method is a promising tool for online monitoring and optimal control of precision machining processes.

  14. A biased review of biases in Twitter studies on political collective action

    NASA Astrophysics Data System (ADS)

    Cihon, Peter; Yasseri, Taha

    2016-08-01

    In recent years researchers have gravitated to Twitter and other social media platforms as fertile ground for empirical analysis of social phenomena. Social media provides researchers access to trace data of interactions and discourse that once went unrecorded in the offline world. Researchers have sought to use these data to explain social phenomena both particular to social media and applicable to the broader social world. This paper offers a minireview of Twitter-based research on political crowd behaviour. This literature offers insight into particular social phenomena on Twitter, but often fails to use standardized methods that permit interpretation beyond individual studies. Moreover, the literature fails to ground methodologies and results in social or political theory, divorcing empirical research from the theory needed to interpret it. Rather, investigations focus primarily on methodological innovations for social media analyses, but these too often fail to sufficiently demonstrate the validity of such methodologies. This minireview considers a small number of selected papers; we analyse their (often lack of) theoretical approaches, review their methodological innovations, and offer suggestions as to the relevance of their results for political scientists and sociologists.

  15. Distinguishing perceived competence and self-efficacy: an example from exercise.

    PubMed

    Rodgers, Wendy M; Markland, David; Selzler, Anne-Marie; Murray, Terra C; Wilson, Philip M

    2014-12-01

    This article examined the conceptual and statistical distinction between perceived competence and self-efficacy. Although they are frequently used interchangeably, it is possible that distinguishing them might assist researchers in better understanding their roles in developing enduring adaptive behavior patterns. Perceived competence is conceived in the theoretical framework of self-determination theory and self-efficacy is conceived in the theoretical framework of social-cognitive theory. The purpose of this study was to empirically distinguish perceived competence from self-efficacy for exercise. Two studies evaluated the independence of perceived competence and self-efficacy in the context of exercise. Using 2 extant instruments with validity and reliability evidence in exercise contexts, the distinctiveness of the 2 constructs was assessed in 2 separate samples (n = 357 middle-aged sedentary adults; n = 247 undergraduate students). Confirmatory factor analysis supported the conceptual and empirical distinction of the 2 constructs. This study supports the conceptual and statistical distinction of perceived competence from perceived self-efficacy. Applications of these results provide a rationale for more precise future theorizing regarding their respective roles in supporting initiation and maintenance of health behaviors.

  16. Bivariate empirical mode decomposition for ECG-based biometric identification with emotional data.

    PubMed

    Ferdinando, Hany; Seppanen, Tapio; Alasaarela, Esko

    2017-07-01

    Emotions modulate ECG signals such that they might affect ECG-based biometric identification in real-life applications. This motivates the search for feature extraction methods on which the emotional state of the subjects has minimal impact. This paper evaluates feature extraction based on bivariate empirical mode decomposition (BEMD) for biometric identification when emotion is considered. Using ECG signals from the Mahnob-HCI database for affect recognition, the features were statistical distributions of the dominant frequency after applying BEMD analysis to the ECG signals. The achieved accuracy was 99.5% with high consistency, using a kNN classifier with 10-fold cross-validation to identify 26 subjects when the emotional states of the subjects were ignored. When the emotional states of the subjects were considered, the proposed method also delivered high accuracy, around 99.4%. We conclude that the proposed method offers emotion-independent features for ECG-based biometric identification. The proposed method needs further evaluation with other classifiers and with variation in ECG signals, e.g. normal ECG vs. ECG with arrhythmias, ECG from various ages, and ECG from other affective databases.

  17. Combining quantitative trait loci analysis with physiological models to predict genotype-specific transpiration rates.

    PubMed

    Reuning, Gretchen A; Bauerle, William L; Mullen, Jack L; McKay, John K

    2015-04-01

    Transpiration is controlled by evaporative demand and stomatal conductance (gs), and there can be substantial genetic variation in gs. A key parameter in empirical models of transpiration is minimum stomatal conductance (g0), a trait that can be measured and has a large effect on gs and transpiration. In Arabidopsis thaliana, g0 exhibits both environmental and genetic variation, and quantitative trait loci (QTL) have been mapped. We used this information to create a genetically parameterized empirical model to predict the transpiration of genotypes. For the parental lines, this worked well. However, in a recombinant inbred population, the predictions proved less accurate. When based only upon their genotype at a single g0 QTL, genotypes were less distinct than our model predicted. Follow-up experiments indicated that both genotype-by-environment interaction and polygenic inheritance complicate the incorporation of genetic effects into physiological models. The use of ecophysiological or 'crop' models for predicting transpiration of novel genetic lines will benefit from incorporating further knowledge of the genetic control and degree of independence of core traits/parameters underlying gs variation. © 2014 John Wiley & Sons Ltd.

  18. Review essay: empires, ancient and modern.

    PubMed

    Hall, John A

    2011-09-01

    This essay draws attention to two books on empires by historians which deserve the attention of sociologists. Bang's model of the workings of the Roman economy powerfully demonstrates the tributary nature of pre-industrial empires. Darwin's analysis concentrates on modern overseas empires, wholly different in character as they involved the transportation of consumption items for the many rather than luxury goods for the few. Darwin is especially good at describing the conditions of existence of late nineteenth-century empires, noting that their demise was caused most of all by the failure of balance-of-power politics in Europe. Concluding thoughts are offered about the USA. © London School of Economics and Political Science 2011.

  19. Dealing with noise and physiological artifacts in human EEG recordings: empirical mode methods

    NASA Astrophysics Data System (ADS)

    Runnova, Anastasiya E.; Grubov, Vadim V.; Khramova, Marina V.; Hramov, Alexander E.

    2017-04-01

    In this paper we propose a new method for removing noise and physiological artifacts in human EEG recordings based on empirical mode decomposition (the Hilbert-Huang transform). As physiological artifacts we consider specific oscillatory patterns that cause problems during EEG analysis and can be detected with additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). We introduce the algorithm of the proposed method, with steps including empirical mode decomposition of the EEG signal, selection of the empirical modes containing artifacts, removal of these modes, and reconstruction of the initial EEG signal. We show the efficiency of the method on the example of filtering a human EEG signal to remove eye-movement artifacts.

  20. Strengthening population health interventions: developing the CollaboraKTion Framework for Community-Based Knowledge Translation.

    PubMed

    Jenkins, Emily K; Kothari, Anita; Bungay, Vicky; Johnson, Joy L; Oliffe, John L

    2016-08-30

    Much of the research and theorising in the knowledge translation (KT) field has focused on clinical settings, providing little guidance to those working in community settings. In this study, we build on previous research in community-based KT by detailing the theory-driven and empirically informed CollaboraKTion framework. A case study design and ethnographic methods were utilised to gain an in-depth understanding of the processes for conducting a community-based KT study as a means of distilling the CollaboraKTion framework. Drawing on extensive field notes describing fieldwork observations and interactions, as well as evidence from the participatory research and KT literature, we detail the processes and steps undertaken in this community-based KT study, their rationale, and the challenges encountered. In an effort to build upon existing knowledge, Kitson and colleagues' co-KT framework, which provides guidance for conducting KT aimed at addressing population-level health, was applied as a coding structure to inform the current analysis. This approach was selected because it (1) supported the application of an existing community-based KT framework to empirical data and (2) provided an opportunity to contribute to the theory and practice gaps in the community-based KT literature through an inductively derived empirical example. Analysis revealed that community-based KT is an iterative process comprising five overarching processes: (1) contacting and connecting; (2) deepening understandings; (3) adapting and applying the knowledge base; (4) supporting and evaluating continued action; and (5) transitioning and embedding, as well as several key elements within each of these processes (e.g. building on existing knowledge, establishing partnerships). These empirically informed theory advancements in the KT and participatory research traditions are summarised in the CollaboraKTion framework. We suggest that community-based KT researchers place less emphasis on enhancing uptake of specific interventions and focus on collaboratively identifying and creating changes to the contextual factors that influence health outcomes. The CollaboraKTion framework can be used to guide the development, implementation and evaluation of contextually relevant, evidence-informed initiatives aimed at improving population health, while providing a foundation to leverage future research and practice in this emergent KT area.

  1. Overexcitabilities: Empirical Studies and Application

    ERIC Educational Resources Information Center

    Chang, Hsin-Jen; Kuo, Ching-Chih

    2013-01-01

    Ever since Dr. Dabrowski proposed his theory of positive disintegration, several studies focusing on overexcitabilities (OEs) have been performed. This study reviewed previous findings on overexcitabilities and their application, focusing in particular on studies in Taiwan. Since 2001, a series of studies related to overexcitabilities has been…

  2. Consensus in the Wasserstein Metric Space of Probability Measures

    DTIC Science & Technology

    2015-07-01

    …in this direction, potential applications/uses for the Wasserstein barycentre (itself) have been considered previously in a number of fields… one is interested in more general empirical input measures. Applications in machine learning and Bayesian statistics have also made use of the Wasserstein…
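
    The surviving fragment concerns barycentres of empirical measures, so a small illustration may help. The sketch below relies on the standard fact that, in one dimension, the Wasserstein-2 barycentre of empirical measures with equally many atoms is obtained by averaging order statistics; the samples are synthetic, and scipy's one-dimensional (W1) distance is used only as a sanity check.

```python
# One-dimensional illustration: the quantile (sorted-sample) average is the
# Wasserstein-2 barycentre of empirical measures with equal numbers of atoms.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
samples = [rng.normal(loc=m, scale=1.0, size=500) for m in (-2.0, 0.0, 2.0)]

# Averaging sorted samples = averaging the empirical quantile functions.
barycentre = np.mean([np.sort(s) for s in samples], axis=0)

for s in samples:                               # each input measure sits at a
    print(wasserstein_distance(s, barycentre))  # similar distance from the consensus
```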

  3. Opportunities for research in aerothermodynamics

    NASA Technical Reports Server (NTRS)

    Graham, R. W.

    1983-01-01

    "Aerothermodynamics" involves the disciplines of chemistry, thermodynamics, fluid mechanics, and heat transfer, which have collaborative importance in propulsion systems. There are growing opportunities for the further application of these disciplines to improve the methodology for the design of advanced gas turbines, particularly the combustor and turbine. Current design procedures follow empirical or cut-and-try guidelines. The tremendous advances in computational analysis and in instrumentation techniques hold promise for research answers to complex physical processes that are currently not well understood. The transfer of basic research understanding to engineering design should result in shorter, less expensive development commitments for engines. The status of and anticipated opportunities in research topics relevant to combustors and turbines are reviewed.

  4. Operations analysis (study 2.1): Shuttle upper stage software requirements

    NASA Technical Reports Server (NTRS)

    Wolfe, R. R.

    1974-01-01

    An investigation of software costs related to space shuttle upper stage operations, with emphasis on the additional costs attributable to space servicing, was conducted. The questions and problem areas include the following: (1) the key parameters involved with software costs; (2) historical data for extrapolation of future costs; (3) elements of the basic software development effort that are applicable to servicing functions; (4) the effect of multiple servicing on the complexity of the operation; and (5) the significance of recurring software costs. The results address these questions and provide a foundation for estimating software costs based on the costs of similar programs and a series of empirical factors.

  5. A theoretical model for smoking prevention studies in preteen children.

    PubMed

    McGahee, T W; Kemp, V; Tingen, M

    2000-01-01

    The age of onset of smoking is in continual decline, with the prime age of tobacco use initiation being 12-14 years. A weakness of the limited research conducted on smoking prevention programs designed for preteen children (ages 10-12) is the lack of a well-defined theoretical basis. A theoretical perspective is needed in order to make a meaningful transition from empirical analysis to the application of knowledge. Bandura's Social Cognitive Theory (1977, 1986), the Theory of Reasoned Action (Ajzen & Fishbein, 1980), and other literature linking various concepts to smoking behaviors in preteens were used to develop a model that may be useful for smoking prevention studies in preteen children.

  6. Feature Screening for Ultrahigh Dimensional Categorical Data with Applications.

    PubMed

    Huang, Danyang; Li, Runze; Wang, Hansheng

    2014-01-01

    Ultrahigh dimensional data with both categorical responses and categorical covariates are frequently encountered in the analysis of big data, for which feature screening has become an indispensable statistical tool. We propose a Pearson chi-square based feature screening procedure for a categorical response with ultrahigh dimensional categorical covariates. The proposed procedure can be directly applied to the detection of important interaction effects. We further show that the proposed procedure possesses the screening consistency property in the terminology of Fan and Lv (2008). We investigate the finite-sample performance of the proposed procedure by Monte Carlo simulation studies and illustrate the proposed method with two empirical datasets.
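
    As a hedged sketch (not the authors' code), marginal Pearson chi-square screening can be written in a few lines: compute the chi-square statistic of each covariate's contingency table against the response and keep the d largest. The data, dimensions, and cutoff below are placeholders.

```python
# Marginal Pearson chi-square screening on synthetic categorical data.
import numpy as np
from scipy.stats import chi2_contingency

def chi2_stat(x, y):
    """Pearson chi-square statistic of one categorical covariate x vs response y."""
    table = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):
        table[xi, yi] += 1
    return chi2_contingency(table)[0]

rng = np.random.default_rng(1)
n, p, d = 200, 2000, 50                                   # n cases, p covariates, keep top d
X = rng.integers(0, 3, size=(n, p))                       # covariates coded 0/1/2
y = ((X[:, 0] == 2) | (rng.random(n) < 0.1)).astype(int)  # response tied to covariate 0

stats = np.array([chi2_stat(X[:, j], y) for j in range(p)])
screened = np.argsort(stats)[::-1][:d]                    # rank and retain d covariates
print(0 in screened)                                      # the informative feature survives
```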

  7. Dehydration of detomidine hydrochloride monohydrate.

    PubMed

    Veldre, K; Actiņš, A; Jaunbergs, J

    2011-10-09

    The thermodynamic stability of detomidine hydrochloride monohydrate has been evaluated on the basis of phase transition kinetics in the solid state. A method free of empirical models was used for the treatment of kinetic data and compared to several known solid-state kinetic data processing methods. Phase transitions were monitored by powder X-ray diffraction (PXRD) and thermal analysis. Full PXRD profiles were used for determining the phase content instead of single-reflection intensity measurements, in order to minimize the influence of particle texture. We compared the applicability of isothermal and nonisothermal methods in our investigation of detomidine hydrochloride monohydrate dehydration. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Genetic algorithm based adaptive neural network ensemble and its application in predicting carbon flux

    USGS Publications Warehouse

    Xue, Y.; Liu, S.; Hu, Y.; Yang, J.; Chen, Q.

    2007-01-01

    To improve the accuracy in prediction, Genetic Algorithm based Adaptive Neural Network Ensemble (GA-ANNE) is presented. Intersections are allowed between different training sets based on the fuzzy clustering analysis, which ensures the diversity as well as the accuracy of individual Neural Networks (NNs). Moreover, to improve the accuracy of the adaptive weights of individual NNs, GA is used to optimize the cluster centers. Empirical results in predicting carbon flux of Duke Forest reveal that GA-ANNE can predict the carbon flux more accurately than Radial Basis Function Neural Network (RBFNN), Bagging NN ensemble, and ANNE. © 2007 IEEE.

  9. Crossover transition in the fluctuation of Internet

    NASA Astrophysics Data System (ADS)

    Qian, Jiang-Hai

    2018-06-01

    The inconsistent fluctuation behavior of the Internet predicted by preferential attachment (PA) and by Gibrat's law requires empirical investigation of the actual system. By using the interval-tunable Gibrat's law statistics, we find that the actual fluctuation, characterized by the conditional standard deviation of the degree growth rate, changes with the interval length and displays a crossover transition from PA type to Gibrat's law type, which has not yet been captured by any previous model. We characterize the transition dynamics quantitatively and determine the range of applicability of PA and Gibrat's law. The correlation analysis indicates the crossover transition may be attributed to the accumulative correlation between the internal links.
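
    A minimal sketch of the statistic itself may clarify the approach: for a tunable interval length tau, compute the standard deviation of the log degree growth rate conditioned on the initial degree. The degree matrix below is synthetic multiplicative (Gibrat-like) growth, used purely for illustration.

```python
# Interval-tunable Gibrat statistic on a synthetic degree history.
# Under pure Gibrat growth sigma is flat in k0; PA-type dynamics would
# instead give sigma decaying roughly like k0**-0.5.
import numpy as np

rng = np.random.default_rng(2)
T, N = 60, 5000
k = np.full((T, N), 10.0)
for t in range(1, T):                          # multiplicative growth process
    k[t] = k[t - 1] * np.exp(rng.normal(0, 0.05, size=N))

def conditional_sigma(k, tau, bins=10):
    """sigma(k0 | tau): std of the log growth rate over interval tau, binned by k0."""
    k0, k1 = k[:-tau].ravel(), k[tau:].ravel()
    rate = np.log(k1 / k0)
    edges = np.quantile(k0, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(k0, edges) - 1, 0, bins - 1)
    return np.array([rate[idx == b].std() for b in range(bins)])

print(conditional_sigma(k, tau=1))    # short interval
print(conditional_sigma(k, tau=20))   # long interval: compare the k0-dependence
```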

  10. Numerical analysis of the effect of the kind of activating agent and the impregnation ratio on the parameters of the microporous structure of the active carbons

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, Mirosław

    2015-09-01

    The paper presents the results of research on the application of the LBET class adsorption models with the fast multivariant identification procedure as a tool for analysing the microporous structure of active carbons obtained by chemical activation using potassium and sodium hydroxides as activators. The proposed technique of fast multivariant fitting of the LBET class models to the empirical adsorption data was employed in particular to evaluate the impact of the activator used and the impregnation ratio on the resulting microporous structure of the carbonaceous adsorbents.

  11. Semantic Web and Contextual Information: Semantic Network Analysis of Online Journalistic Texts

    NASA Astrophysics Data System (ADS)

    Lim, Yon Soo

    This study examines why contextual information is important for actualizing the idea of the semantic web, based on a case study of a socio-political issue in South Korea. For this study, semantic network analyses were conducted on 62 English-language blog posts and 101 news stories on the web. The results indicated differences in the meaning structures between blog posts and professional journalism, as well as between conservative journalism and progressive journalism. From the results, this study ascertains the empirical validity of current concerns about the practical application of the new web technology, and discusses how the semantic web should be developed.

  12. A generalization of random matrix theory and its application to statistical physics.

    PubMed

    Wang, Duan; Zhang, Xin; Horvatic, Davor; Podobnik, Boris; Eugene Stanley, H

    2017-02-01

    To study the statistical structure of cross-correlations in empirical data, we generalize random matrix theory and propose a new method of cross-correlation analysis, known as autoregressive random matrix theory (ARRMT). ARRMT takes into account the influence of auto-correlations in the study of cross-correlations in multiple time series. We first analytically and numerically determine how auto-correlations affect the eigenvalue distribution of the correlation matrix. Then we introduce ARRMT with a detailed procedure of how to implement the method. Finally, we illustrate the method using two examples taken from inflation rates and air pressure data for 95 US cities.
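
    The sketch below is not ARRMT itself, only an illustration of the effect the method corrects for: mutually independent AR(1) series inflate the largest eigenvalue of the cross-correlation matrix relative to the i.i.d. (Marchenko-Pastur) case. The sizes and AR coefficient are arbitrary choices.

```python
# Effect of auto-correlation on correlation-matrix eigenvalues.
import numpy as np

rng = np.random.default_rng(3)
N, T = 100, 400

def correlation_eigs(phi):
    """Eigenvalues of the correlation matrix of N independent AR(1) series."""
    x = np.zeros((N, T))
    eps = rng.normal(size=(N, T))
    for t in range(1, T):
        x[:, t] = phi * x[:, t - 1] + eps[:, t]
    return np.linalg.eigvalsh(np.corrcoef(x))

print(correlation_eigs(0.0).max())  # i.i.d.: near the Marchenko-Pastur edge
print(correlation_eigs(0.7).max())  # autocorrelated: clearly larger spread
```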

  13. Time-dependent seismic hazard analysis for the Greater Tehran and surrounding areas

    NASA Astrophysics Data System (ADS)

    Jalalalhosseini, Seyed Mostafa; Zafarani, Hamid; Zare, Mehdi

    2018-01-01

    This study presents a time-dependent approach to seismic hazard in Tehran and surrounding areas. Hazard is evaluated by combining background seismic activity with larger earthquakes that may emanate from fault segments. Using available historical and paleoseismological data or empirical relations, the recurrence time and maximum magnitude of characteristic earthquakes for the major faults have been explored. The Brownian passage time (BPT) distribution has been used to calculate an equivalent fictitious seismicity rate for the major faults in the region. To include ground motion uncertainty, a logic tree and five ground motion prediction equations have been selected based on their applicability in the region. Finally, hazard maps are presented.
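
    As a hedged sketch of the renewal calculation (with made-up numbers, not values for any Tehran fault): the BPT distribution is an inverse Gaussian, and matching a mean recurrence m and aperiodicity alpha to scipy's invgauss parameterisation (mean = mu*scale, variance = mu**3*scale**2) gives mu = alpha**2 and scale = m/alpha**2.

```python
# BPT renewal model: conditional probability of an event in the next window.
from scipy.stats import invgauss

m, alpha = 300.0, 0.5                 # mean recurrence (yr) and aperiodicity (assumed)
bpt = invgauss(mu=alpha**2, scale=m / alpha**2)

t_elapsed, horizon = 250.0, 50.0      # years since the last event, forecast window
p = (bpt.cdf(t_elapsed + horizon) - bpt.cdf(t_elapsed)) / bpt.sf(t_elapsed)
print(f"P(event within {horizon:.0f} yr | quiet for {t_elapsed:.0f} yr) = {p:.3f}")
```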

  14. Piezoelectric line moment actuator for active radiation control from light-weight structures

    NASA Astrophysics Data System (ADS)

    Jandak, Vojtech; Svec, Petr; Jiricek, Ondrej; Brothanek, Marek

    2017-11-01

    This article outlines the design of a piezoelectric line moment actuator used for active structural acoustic control. The actuator produces a dynamic bending moment in the controlled structure, resulting from the inertial forces that arise when the attached piezoelectric stripe actuators oscillate. The article provides the detailed theoretical analysis necessary for the practical realization of these actuators, including considerations concerning their placement, a crucial factor in overall system performance. Approximate formulas describing the dependence of the moment amplitude on the frequency and the required electric voltage are derived. Recommendations applicable to the system's design, based on both theoretical and empirical results, are provided.

  15. RF Frequency Oscillations in the Early Stages of Vacuum Arc Collapse

    NASA Technical Reports Server (NTRS)

    Griffin, Steven T.; Thio, Y. C. Francis

    2003-01-01

    RF oscillations may be produced in a typical capacitive charging/discharging pulsed power system. These oscillations may be benign, parasitic, destructive, or crucial to energy deposition. In some applications, proper damping of the oscillations may be critical to proper plasma formation. Because the energy deposited into the plasma is a function of plasma and circuit conditions, the entire plasma/circuit system needs to be considered as a unit. To accomplish this, the initiation of the plasma is modeled as a time-varying, nonlinear element in a circuit analysis model. The predicted spectra are compared to empirical power density spectra, including those obtained from vacuum arcs.
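
    A toy version of this modelling idea, under loose assumptions rather than the paper's actual model: treat the arc as a collapsing, time-varying resistance in a series RLC discharge and read the predicted spectrum off the simulated current. All component values and the resistance law are placeholders.

```python
# Series RLC discharge with a time-varying arc resistance; toy parameters.
import numpy as np
from scipy.integrate import solve_ivp

L_, C, V0 = 1e-6, 1e-6, 5e3           # inductance (H), capacitance (F), charge voltage (V)
R0, Rf, tau = 10.0, 0.05, 2e-6        # arc resistance collapses from R0 toward Rf

def rhs(t, y):
    q, i = y
    R = Rf + (R0 - Rf) * np.exp(-t / tau)   # collapsing arc resistance
    return [i, -(R * i + q / C) / L_]

sol = solve_ivp(rhs, (0, 2e-4), [C * V0, 0.0], max_step=1e-8)
t_grid = np.linspace(0, 2e-4, 4096)
i = np.interp(t_grid, sol.t, sol.y[1])
spectrum = np.abs(np.fft.rfft(i)) ** 2       # predicted power density spectrum
freqs = np.fft.rfftfreq(t_grid.size, d=t_grid[1])
print(freqs[spectrum.argmax()])              # ringing near 1/(2*pi*sqrt(L*C)) ~ 159 kHz
```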

  16. Discriminative components of data.

    PubMed

    Peltonen, Jaakko; Kaski, Samuel

    2005-01-01

    A simple probabilistic model is introduced to generalize classical linear discriminant analysis (LDA) in finding components that are informative of, or relevant for, data classes. The components maximize the predictability of the class distribution, which is asymptotically equivalent to 1) maximizing mutual information with the classes, and 2) finding principal components in the so-called learning or Fisher metrics. The Fisher metric measures only distances that are relevant to the classes, that is, distances that cause changes in the class distribution. The components have applications in data exploration, visualization, and dimensionality reduction. In empirical experiments, the method outperformed a Renyi entropy-based alternative, in addition to more classical methods, while having essentially equivalent computational cost.

  17. The Ideologies of American Social Critics: An Empirical Test of Kadushin's Theory

    ERIC Educational Resources Information Center

    Simon, David R.

    1977-01-01

    Examines Kadushin's earlier empirical efforts to determine the leading social critics and organizations of social criticism in America and investigates his theory through content analysis of leading journals of social criticism. (MH)

  18. Behavioral economics: areas of cooperative research between economics and applied behavioral analysis

    PubMed Central

    Kagel, John H.; Winkler, Robin C.

    1972-01-01

    The current research methods of behavioral economics are characterized by inadequate empirical foundations. Psychologists involved in the experimental analysis of behavior with their research strategies and their experimental technology, particularly that of the Token Economy, can assist in providing empirical foundations for behavioral economics. Cooperative research between economists and psychologists to this end should be immediately fruitful and mutually beneficial. PMID:16795356

  19. Empirical Assessment of Effect of Publication Bias on a Meta-Analysis of Validity Studies on University Matriculation Examinations in Nigeria

    ERIC Educational Resources Information Center

    Adeyemo, Emily Oluseyi

    2012-01-01

    This study examined the impact of publication bias on a meta-analysis of empirical studies on the validity of University Matriculation Examinations in Nigeria, with a view to determining the level of difference between published and unpublished articles. Specifically, the design was ex post facto, a causal-comparative design. The sample size consisted…

  20. The Credibility Revolution in Empirical Economics: How Better Research Design is Taking the Con out of Econometrics. NBER Working Paper No. 15794

    ERIC Educational Resources Information Center

    Angrist, Joshua; Pischke, Jorn-Steffen

    2010-01-01

    This essay reviews progress in empirical economics since Leamer's (1983) critique. Leamer highlighted the benefits of sensitivity analysis, a procedure in which researchers show how their results change with changes in specification or functional form. Sensitivity analysis has had a salutary but not a revolutionary effect on econometric practice…

  1. No complexity–stability relationship in empirical ecosystems

    PubMed Central

    Jacquet, Claire; Moritz, Charlotte; Morissette, Lyne; Legagneux, Pierre; Massol, François; Archambault, Philippe; Gravel, Dominique

    2016-01-01

    Understanding the mechanisms responsible for the stability and persistence of ecosystems is one of the greatest challenges in ecology. Robert May showed that, contrary to intuition, complex randomly built ecosystems are less likely to be stable than simpler ones. Few attempts have been made to test May's prediction empirically, and the actual complexity–stability relationship in natural ecosystems remains unknown. Here we perform a stability analysis of 116 quantitative food webs sampled worldwide. We find that classic descriptors of complexity (species richness, connectance and interaction strength) are not associated with stability in empirical food webs. Further analysis reveals that a correlation between the effects of predators on prey and those of prey on predators, combined with a high frequency of weak interactions, stabilizes food web dynamics relative to the random expectation. We conclude that empirical food webs have several non-random properties contributing to the absence of a complexity–stability relationship. PMID:27553393
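
    A sketch of the random-expectation side of such an analysis: draw a community (Jacobian) matrix with species richness S, connectance C, and interaction-strength scale sigma, and test stability through its leading eigenvalue, per May's criterion. Parameter values are illustrative.

```python
# May-style stability test of a random community matrix.
import numpy as np

rng = np.random.default_rng(4)
S, C, sigma = 116, 0.1, 0.5          # richness, connectance, strength scale

A = rng.normal(0, sigma, size=(S, S)) * (rng.random((S, S)) < C)
np.fill_diagonal(A, -1.0)            # self-regulation on the diagonal

stable = np.linalg.eigvals(A).real.max() < 0
print(stable, sigma * np.sqrt(S * C))  # May: stability expected when sigma*sqrt(S*C) < 1
```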

  2. Discovery of Empirical Components by Information Theory

    DTIC Science & Technology

    2016-08-10

    AFRL-AFOSR-VA-TR-2016-0289. Discovery of Empirical Components by Information Theory. Amit Singer, Trustees of Princeton University. Dates covered: 15 Feb 2013 to 14 Feb 2016. …they draw not only from traditional linear-algebra-based numerical analysis or approximation theory, but also from information theory and graph theory…

  3. Applied Virtual Reality Research and Applications at NASA/Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P.

    1995-01-01

    A Virtual Reality (VR) applications program has been under development at NASA/Marshall Space Flight Center (MSFC) since 1989. The objectives of the MSFC VR Applications Program are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training and science training. Before this technology can be utilized with confidence in these applications, it must be validated for each particular class of application. That is, the precision and reliability with which it maps onto real settings and scenarios, representative of a class, must be calculated and assessed. The approach of the MSFC VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems. Specific validation studies for selected classes of applications have been completed or are currently underway. These include macro-ergonomic "control-room class" design analysis, Spacelab stowage reconfiguration training, a full-body micro-gravity functional reach simulator, and a gross anatomy teaching simulator. This paper describes the MSFC VR Applications Program and the validation studies.

  4. Clinical characteristics of ceftriaxone plus metronidazole in complicated intra-abdominal infection

    PubMed Central

    2015-01-01

    Purpose: Empirical antibiotics are the first step in the treatment of complicated intra-abdominal infection (c-IAI), such as secondary peritonitis. Empirical antibiotic regimens are very diverse. The ceftriaxone plus metronidazole regimen (CMR) is one of the empirical antibiotic regimens used in the treatment of c-IAI. However, although CMR is widely used, its success, failure, and efficacy remain poorly understood. This retrospective study was conducted to compare the clinical efficacy of this regimen in c-IAI according to clinical characteristics. Methods: The subjects were patients in this hospital who were diagnosed with secondary peritonitis between 2009 and 2013. Retrospective analysis was performed based on records made after surgery regarding clinical characteristics including albumin level, blood pressure, pulse rate, respiration rate, smoking, age, sex, body mass index, hemoglobin, coexisting disease, leukocytosis, and APACHE (acute physiology and chronic health evaluation) II score. Results: A total of 114 patients were enrolled. In univariate analysis, the success or failure of CMR showed significant association with low preoperative albumin, old age, and preoperative tachycardia. In multivariate analysis, low albumin and preoperative tachycardia were significant. Conclusion: An additional antibiotic treatment plan appears necessary in patients with low albumin and tachycardia when the empirical antibiotic regimen for c-IAI is CMR. Well-designed prospective randomized clinical studies are also needed to evaluate the appropriateness of CMR and to choose a proper empirical antibiotic regimen among the many regimens available for c-IAI in our country. PMID:26131444

  5. Diagnostic Value of PCR Analysis of Bacteria and Fungi from Blood in Empiric-Therapy-Resistant Febrile Neutropenia

    PubMed Central

    Nakamura, Akiko; Sugimoto, Yuka; Ohishi, Kohshi; Sugawara, Yumiko; Fujieda, Atsushi; Monma, Fumihiko; Suzuki, Kei; Masuya, Masahiro; Nakase, Kazunori; Matsushima, Yoshiko; Wada, Hideo; Katayama, Naoyuki; Nobori, Tsutomu

    2010-01-01

    This study aimed to assess the clinical utility of PCR for the analysis of bacteria and fungi from blood for the management of febrile neutropenic patients with hematologic malignancies. Using a PCR system able to detect a broad range of bacteria and fungi, we conducted a prospective pilot study of periodic analyses of blood from patients following intensive chemotherapy. When fever occurred, it was treated with empirical antibiotic therapy, basically without knowledge of the PCR results. In 23 febrile episodes during the neutropenic period, bacteria were detected by PCR in 11 cases, while the same species were identified by blood culture in 3 cases. In 10 out of 11 PCR-positive cases, fever could be managed by empirical therapy. In the empirical-therapy-resistant case, the identification of Stenotrophomonas maltophilia by PCR led to improvement of fever. No fungi were detected by PCR in febrile cases, while Aspergillus fumigatus was detected in one afebrile patient, several days before a clinical diagnosis was made. In subsequent sporadic PCR analyses in 15 cases of febrile neutropenia, bacteria were detected by both PCR and blood culture in 7 cases and by PCR alone in 6. Fungi were not detected. While fever was improved by empirical therapy in 12 out of the 13 PCR-positive cases, the identification of Pseudomonas aeruginosa by PCR in one therapy-resistant case contributed to the successful treatment of persistent fever. Our results indicate that PCR analysis of bacteria from blood provides essential information for managing empirical-therapy-resistant febrile neutropenia. PMID:20392911

  6. Concept Analysis: Health-Promoting Behaviors Related to Human Papilloma Virus (HPV) Infection.

    PubMed

    McCutcheon, Tonna; Schaar, Gina; Parker, Karen L

    2015-01-01

    The concept of health-promoting behaviors incorporates ideas presented in the Ottawa Charter of Public Health and the nursing-based Health Promotion Model. Despite the fact that the concept of health-promoting behaviors has a nursing influence, literature suggests nursing has inadequately developed and used this concept within nursing practice. A further review of the literature regarding health-promoting behaviors and the human papilloma virus suggests a distinct gap in the nursing literature. This article presents a concept analysis of health-promoting behaviors related to the human papilloma virus in order to encourage the application of the concept in nursing practice, promote continued nursing research regarding this concept, and further expand the application of health-promoting behaviors to other situations and populations within the nursing discipline. Attributes of health-promoting behaviors are presented and include empowerment, participation, community, and a positive concept of health. Antecedents, consequences, and empirical referents are also presented, as are model, borderline, and contrary cases to help clarify the concept. Recommendations for human papilloma virus health-promoting behaviors within nursing practice are also provided. © 2014 Wiley Periodicals, Inc.

  7. Applications of species accumulation curves in large-scale biological data analysis.

    PubMed

    Deng, Chao; Daley, Timothy; Smith, Andrew D

    2015-09-01

    The species accumulation curve, or collector's curve, of a population gives the expected number of observed species or distinct classes as a function of sampling effort. Species accumulation curves allow researchers to assess and compare diversity across populations or to evaluate the benefits of additional sampling. Traditional applications have focused on ecological populations but emerging large-scale applications, for example in DNA sequencing, are orders of magnitude larger and present new challenges. We developed a method to estimate accumulation curves for predicting the complexity of DNA sequencing libraries. This method uses rational function approximations to a classical non-parametric empirical Bayes estimator due to Good and Toulmin [Biometrika, 1956, 43, 45-63]. Here we demonstrate how the same approach can be highly effective in other large-scale applications involving biological data sets. These include estimating microbial species richness, immune repertoire size, and k-mer diversity for genome assembly applications. We show how the method can be modified to address populations containing an effectively infinite number of species where saturation cannot practically be attained. We also introduce a flexible suite of tools implemented as an R package that make these methods broadly accessible.
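
    A sketch of the classical Good-Toulmin estimator that the rational-function approach stabilises: from the counts n_j of species observed exactly j times, the expected number of new species in a sample enlarged by a factor t is the alternating sum of t^j n_j. The data below are synthetic; the divergence of this power series for t > 1 is precisely what motivates the rational approximation.

```python
# Good-Toulmin extrapolation from frequency-of-frequencies counts.
import numpy as np

rng = np.random.default_rng(5)
population = rng.zipf(2.0, size=200_000) % 10_000   # skewed synthetic species labels
sample = rng.choice(population, size=5_000)

counts = np.bincount(np.bincount(sample))           # counts[j] = n_j, species seen j times

def good_toulmin(counts, t):
    """Expected number of unseen species gained with t-fold extra sampling."""
    j = np.arange(1, len(counts))
    return np.sum((-1.0) ** (j + 1) * t ** j * counts[1:])

print(good_toulmin(counts, 0.5))   # 50% more sampling
print(good_toulmin(counts, 1.0))   # doubling the sample; beyond t = 1 the series diverges
```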

  8. Applications of species accumulation curves in large-scale biological data analysis

    PubMed Central

    Deng, Chao; Daley, Timothy; Smith, Andrew D

    2016-01-01

    The species accumulation curve, or collector’s curve, of a population gives the expected number of observed species or distinct classes as a function of sampling effort. Species accumulation curves allow researchers to assess and compare diversity across populations or to evaluate the benefits of additional sampling. Traditional applications have focused on ecological populations but emerging large-scale applications, for example in DNA sequencing, are orders of magnitude larger and present new challenges. We developed a method to estimate accumulation curves for predicting the complexity of DNA sequencing libraries. This method uses rational function approximations to a classical non-parametric empirical Bayes estimator due to Good and Toulmin [Biometrika, 1956, 43, 45–63]. Here we demonstrate how the same approach can be highly effective in other large-scale applications involving biological data sets. These include estimating microbial species richness, immune repertoire size, and k-mer diversity for genome assembly applications. We show how the method can be modified to address populations containing an effectively infinite number of species where saturation cannot practically be attained. We also introduce a flexible suite of tools implemented as an R package that make these methods broadly accessible. PMID:27252899

  9. SU-F-BRE-14: Uncertainty Analysis for Dose Measurements Using OSLD NanoDots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kry, S; Alvarez, P; Stingo, F

    2014-06-15

    Purpose: Optically stimulated luminescent dosimeters (OSLDs) are increasingly popular dosimeters for research and clinical applications. They are also used by the Radiological Physics Center (RPC) for remote auditing of machine output. In this work we robustly calculated the reproducibility and uncertainty of the OSLD nanoDot. Methods: For the RPC dose calculation, raw readings are corrected for depletion, element sensitivity, fading, linearity, and energy. System calibration is determined for the experimental OSLD irradiated at different institutions by using OSLD irradiated by the RPC under reference conditions (i.e., standards): 1 Gy in a Cobalt beam. The intra-dot and inter-dot reproducibilities (coefficient of variation) were determined from the history of RPC readings of these standards. The standard deviation of the corrected OSLD signal was then calculated analytically using a recursive formalism that did not rely on the normality assumption of the underlying uncertainties, or on any type of mathematical approximation. This analytical uncertainty was compared to that empirically estimated from >45,000 RPC beam audits. Results: The intra-dot variability was found to be 0.59%, with only a small variation between readers. Inter-dot variability was found to be 0.85%. The uncertainty in each of the individual correction factors was empirically determined. When the raw counts from each OSLD were adjusted for the appropriate correction factors, the analytically determined coefficient of variation was 1.8% over a range of institutional irradiation conditions that are seen at the RPC. This is reasonably consistent with the empirical observations of the RPC, where the coefficient of variation of the measured beam outputs is 1.6% (photons) and 1.9% (electrons). Conclusion: OSLD nanoDots provide sufficiently good precision for a wide range of applications, including the RPC remote monitoring program for megavoltage beams. This work was supported by PHS grant CA10953 awarded by the NIH (DHHS).

  10. Structural Equations and Causal Explanations: Some Challenges for Causal SEM

    ERIC Educational Resources Information Center

    Markus, Keith A.

    2010-01-01

    One common application of structural equation modeling (SEM) involves expressing and empirically investigating causal explanations. Nonetheless, several aspects of causal explanation that have an impact on behavioral science methodology remain poorly understood. It remains unclear whether applications of SEM should attempt to provide complete…

  11. An Integrated Tone Mapping for High Dynamic Range Image Visualization

    NASA Astrophysics Data System (ADS)

    Liang, Lei; Pan, Jeng-Shyang; Zhuang, Yongjun

    2018-01-01

    There are two types of tone mapping operators for high dynamic range (HDR) image visualization. HDR images mapped by perceptual operators have a strong sense of realism but lose local detail. Empirical operators can maximize the local detail information of an HDR image, but their realism is not strong. A common tone mapping operator suitable for all applications is not available. This paper proposes a novel integrated tone mapping framework which can achieve conversion between empirical operators and perceptual operators. In this framework, the empirical operator is rendered based on an improved saliency map, which simulates the visual attention mechanism of the human eye in natural scenes. The results of objective evaluation prove the effectiveness of the proposed solution.

  12. Vast Portfolio Selection with Gross-exposure Constraints*

    PubMed Central

    Fan, Jianqing; Zhang, Jingjin; Yu, Ke

    2012-01-01

    We introduce large portfolio selection using gross-exposure constraints. We show that, with the gross-exposure constraint, the empirically selected optimal portfolios based on estimated covariance matrices have performance similar to that of the theoretical optimal ones, and there is no error accumulation effect from the estimation of vast covariance matrices. This gives theoretical justification to the empirical results in Jagannathan and Ma (2003). We also show that the no-short-sale portfolio can be improved by allowing some short positions. Applications to portfolio selection, tracking, and improvement are also addressed. The utility of our new approach is illustrated by simulation and empirical studies on the 100 Fama-French industrial portfolios and 600 stocks randomly selected from the Russell 3000. PMID:23293404
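
    A hedged sketch of the underlying optimisation problem (not the authors' estimator): minimise portfolio variance subject to full investment and a gross-exposure cap c, where c = 1 recovers the no-short-sale portfolio and larger c permits limited shorting. The covariance matrix is synthetic and cvxpy is an assumed dependency.

```python
# Minimum-variance portfolio under a gross-exposure constraint ||w||_1 <= c.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(6)
n = 100
B = rng.normal(size=(n, 3))
Sigma = B @ B.T + np.diag(rng.uniform(0.5, 1.5, n))   # factor-style covariance

def min_variance(c):
    w = cp.Variable(n)
    prob = cp.Problem(cp.Minimize(cp.quad_form(w, Sigma)),
                      [cp.sum(w) == 1, cp.norm(w, 1) <= c])
    prob.solve()
    return prob.value

for c in (1.0, 1.6, 2.0):            # relaxing the cap from the no-short-sale case
    print(c, min_variance(c))        # risk should fall as some shorting is allowed
```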

  13. Bias analysis to improve monitoring an HIV epidemic and its response: approach and application to a survey of female sex workers in Iran.

    PubMed

    Mirzazadeh, Ali; Mansournia, Mohammad-Ali; Nedjat, Saharnaz; Navadeh, Soodabeh; McFarland, Willi; Haghdoost, Ali Akbar; Mohammad, Kazem

    2013-10-01

    We present probabilistic and Bayesian techniques to correct for bias in categorical and numerical measures and empirically apply them to a recent survey of female sex workers (FSW) conducted in Iran. We used bias parameters from a previous validation study to correct estimates of behaviours reported by FSW. Monte-Carlo Sensitivity Analysis and Bayesian bias analysis produced point and simulation intervals (SI). The apparent and corrected prevalence differed by a minimum of 1% for the number of 'non-condom use sexual acts' (36.8% vs 35.8%) to a maximum of 33% for 'ever associated with a venue to sell sex' (35.5% vs 68.0%). The negative predictive value of the questionnaire for 'history of STI' and 'ever associated with a venue to sell sex' was 36.3% (95% SI 4.2% to 69.1%) and 46.9% (95% SI 6.3% to 79.1%), respectively. Bias-adjusted numerical measures of behaviours increased by 0.1 year for 'age at first sex act for money' to 1.5 for 'number of sexual contacts in last 7 days'. The 'true' estimates of most behaviours are considerably higher than those reported and the related SIs are wider than conventional CIs. Our analysis indicates the need for and applicability of bias analysis in surveys, particularly in stigmatised settings.
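
    A generic sketch of the Monte-Carlo sensitivity-analysis step (the apparent prevalence echoes a figure quoted above, but the Beta priors are made up, not the validation-study parameters): draw sensitivity and specificity from priors and back-correct the apparent prevalence of a misclassified binary measure.

```python
# Monte-Carlo bias correction of an apparent prevalence under misclassification.
import numpy as np

rng = np.random.default_rng(7)
p_apparent, n_sims = 0.355, 100_000

se = rng.beta(40, 10, n_sims)        # prior on sensitivity (assumed)
sp = rng.beta(60, 5, n_sims)         # prior on specificity (assumed)

p_true = (p_apparent + sp - 1) / (se + sp - 1)    # standard misclassification correction
p_true = p_true[(p_true > 0) & (p_true < 1)]      # keep admissible draws

lo, mid, hi = np.percentile(p_true, [2.5, 50, 97.5])
print(f"corrected prevalence {mid:.3f} (95% SI {lo:.3f} to {hi:.3f})")
```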

  14. The validation of peer review through research impact measures and the implications for funding strategies.

    PubMed

    Gallo, Stephen A; Carpenter, Afton S; Irwin, David; McPartland, Caitlin D; Travis, Joseph; Reynders, Sofie; Thompson, Lisa A; Glisson, Scott R

    2014-01-01

    There is a paucity of data in the literature concerning the validation of the grant application peer review process, which is used to help direct billions of dollars in research funds. Ultimately, this validation will hinge upon empirical data relating the output of funded projects to the predictions implicit in the overall scientific merit scores from the peer review of submitted applications. In an effort to address this need, the American Institute of Biological Sciences (AIBS) conducted a retrospective analysis of peer review data of 2,063 applications submitted to a particular research program and the bibliometric output of the resultant 227 funded projects over an 8-year period. Peer review scores associated with applications were found to be moderately correlated with the total time-adjusted citation output of funded projects, although a high degree of variability existed in the data. Analysis over time revealed that as average annual scores of all applications (both funded and unfunded) submitted to this program improved with time, the average annual citation output per application increased. Citation impact did not correlate with the amount of funds awarded per application or with the total annual programmatic budget. However, the number of funded applications per year was found to correlate well with total annual citation impact, suggesting that improving funding success rates by reducing the size of awards may be an efficient strategy to optimize the scientific impact of research program portfolios. This strategy must be weighed against the need for a balanced research portfolio and the inherent high costs of some areas of research. The relationship observed between peer review scores and bibliometric output lays the groundwork for establishing a model system for future prospective testing of the validity of peer review formats and procedures.

  15. The Validation of Peer Review through Research Impact Measures and the Implications for Funding Strategies

    PubMed Central

    Gallo, Stephen A.; Carpenter, Afton S.; Irwin, David; McPartland, Caitlin D.; Travis, Joseph; Reynders, Sofie; Thompson, Lisa A.; Glisson, Scott R.

    2014-01-01

    There is a paucity of data in the literature concerning the validation of the grant application peer review process, which is used to help direct billions of dollars in research funds. Ultimately, this validation will hinge upon empirical data relating the output of funded projects to the predictions implicit in the overall scientific merit scores from the peer review of submitted applications. In an effort to address this need, the American Institute of Biological Sciences (AIBS) conducted a retrospective analysis of peer review data of 2,063 applications submitted to a particular research program and the bibliometric output of the resultant 227 funded projects over an 8-year period. Peer review scores associated with applications were found to be moderately correlated with the total time-adjusted citation output of funded projects, although a high degree of variability existed in the data. Analysis over time revealed that as average annual scores of all applications (both funded and unfunded) submitted to this program improved with time, the average annual citation output per application increased. Citation impact did not correlate with the amount of funds awarded per application or with the total annual programmatic budget. However, the number of funded applications per year was found to correlate well with total annual citation impact, suggesting that improving funding success rates by reducing the size of awards may be an efficient strategy to optimize the scientific impact of research program portfolios. This strategy must be weighed against the need for a balanced research portfolio and the inherent high costs of some areas of research. The relationship observed between peer review scores and bibliometric output lays the groundwork for establishing a model system for future prospective testing of the validity of peer review formats and procedures. PMID:25184367

  16. Sparsity guided empirical wavelet transform for fault diagnosis of rolling element bearings

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Zhao, Yang; Yi, Cai; Tsui, Kwok-Leung; Lin, Jianhui

    2018-02-01

    Rolling element bearings are widely used in various industrial machines, such as electric motors, generators, pumps, gearboxes, railway axles, turbines, and helicopter transmissions. Fault diagnosis of rolling element bearings is beneficial to preventing unexpected accidents and reducing economic loss. In recent years, many bearing fault detection methods have been developed. Recently, a new adaptive signal processing method called empirical wavelet transform has attracted much attention from readers and engineers, and its applications to bearing fault diagnosis have been reported. The main problem of empirical wavelet transform is that the Fourier segments it requires are strongly dependent on the local maxima of the amplitudes of the Fourier spectrum of a signal, which implies that the Fourier segments are not always reliable and effective if the Fourier spectrum of the signal is complicated and overwhelmed by heavy noise and other strong vibration components. In this paper, sparsity guided empirical wavelet transform is proposed to automatically establish the Fourier segments required in empirical wavelet transform for fault diagnosis of rolling element bearings. Industrial bearing fault signals caused by single and multiple railway axle bearing defects are used to verify the effectiveness of the proposed sparsity guided empirical wavelet transform. Results show that the proposed method can automatically discover the Fourier segments required in empirical wavelet transform and reveal single and multiple railway axle bearing defects. Besides, comparisons with three popular signal processing methods, including ensemble empirical mode decomposition, the fast kurtogram and the fast spectral correlation, are conducted to highlight the superiority of the proposed method.
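
    For context, here is a sketch of the conventional segmentation step the paper improves on: locate local maxima of the amplitude spectrum and place segment boundaries at midpoints between adjacent peaks, so that each segment receives one empirical wavelet. The toy signal and the peak-selection settings are illustrative only.

```python
# Maxima-based Fourier segmentation, the step that sparsity guidance replaces.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(8)
fs = 10_000
t = np.arange(0, 1, 1 / fs)
x = (np.sin(2 * np.pi * 80 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)
     + 0.3 * rng.normal(size=t.size))                    # toy two-component signal

spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, 1 / fs)

peaks, _ = find_peaks(spec, height=0.2 * spec.max())     # dominant spectral maxima
boundaries = (freqs[peaks][1:] + freqs[peaks][:-1]) / 2  # midpoint segment edges
print(freqs[peaks], boundaries)
```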

  17. SMART (Sports Medicine and Rehabilitation Team) Centers: An Empirical Analysis

    DTIC Science & Technology

    2007-04-01

    In an era of finite health care resources, increased military operational tempo, and smaller expeditionary fighting forces, the US Navy has developed SMART (Sports Medicine and Rehabilitation Team) Centers…

  18. Increasing Genome Sampling and Improving SNP Genotyping for Genotyping-by-Sequencing with New Combinations of Restriction Enzymes.

    PubMed

    Fu, Yong-Bi; Peterson, Gregory W; Dong, Yibo

    2016-04-07

    Genotyping-by-sequencing (GBS) has emerged as a useful genomic approach for exploring genome-wide genetic variation. However, GBS commonly samples a genome unevenly and can generate a substantial amount of missing data. These technical features limit the power of various GBS-based genetic and genomic analyses. Here we present software called IgCoverage for in silico evaluation of genomic coverage through GBS with an individual restriction enzyme or a pair of restriction enzymes on a sequenced genome, and report a new set of 21 restriction enzyme combinations that can be applied to enhance GBS applications. These enzyme combinations were developed through an application of IgCoverage on 22 plant, animal, and fungus species with sequenced genomes, and some of them were empirically evaluated with different runs of Illumina MiSeq sequencing in 12 plant species. The in silico analysis of the 22 organisms revealed up to eight times more genome coverage for the new combinations, which pair four- or five-cutter restriction enzymes, than for the commonly used enzyme combination PstI + MspI. The empirical evaluation of the new enzyme combination HinfI + HpyCH4IV in 12 plant species showed 1.7-6 times more genome coverage than PstI + MspI, and 2.3 times more genome coverage in dicots than in monocots. Also, SNP genotyping in 12 Arabidopsis and 12 rice plants revealed that HinfI + HpyCH4IV generated 7 and 1.3 times more SNPs (with 0-16.7% missing observations) than PstI + MspI, respectively. These findings demonstrate that these novel enzyme combinations can be utilized to increase genome sampling and improve SNP genotyping in various GBS applications. Copyright © 2016 Fu et al.
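
    A hedged sketch of the in silico evaluation idea (not the IgCoverage tool itself), assuming Biopython's Restriction module exposes both enzymes under these names: count the double-digest fragments produced by a candidate enzyme pair on a placeholder random sequence; more fragments in the sequenceable size range implies more of the genome sampled.

```python
# In silico double digest of a placeholder sequence with a candidate enzyme pair.
import numpy as np
from Bio.Seq import Seq
from Bio.Restriction import HinfI, HpyCH4IV  # assumed available in Bio.Restriction

rng = np.random.default_rng(9)
genome = Seq("".join(rng.choice(list("ACGT"), size=100_000)))

cuts = sorted(HinfI.search(genome) + HpyCH4IV.search(genome))  # cut positions
fragments = np.diff([0] + cuts + [len(genome)])

# GBS reads start at fragment ends, so more suitably sized fragments means
# more genome sampling; the 100-400 bp size window is an assumption.
size_selected = ((fragments >= 100) & (fragments <= 400)).sum()
print(len(fragments), size_selected)
```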

  19. Relational frame theory: A new paradigm for the analysis of social behavior

    PubMed Central

    Roche, Bryan; Barnes-Holmes, Yvonne; Barnes-Holmes, Dermot; Stewart, Ian; O'Hora, Denis

    2002-01-01

    Recent developments in the analysis of derived relational responding, under the rubric of relational frame theory, have brought several complex language and cognitive phenomena within the empirical reach of the experimental analysis of behavior. The current paper provides an outline of relational frame theory as a new approach to the analysis of language, cognition, and complex behavior more generally. Relational frame theory, it is argued, also provides a suitable paradigm for the analysis of a wide variety of social behavior that is mediated by language. Recent empirical evidence and theoretical interpretations are provided in support of the relational frame approach to social behavior. PMID:22478379

  20. Applications systems verification and transfer project. Volume 1: Operational applications of satellite snow cover observations: Executive summary. [usefulness of satellite snow-cover data for water yield prediction

    NASA Technical Reports Server (NTRS)

    Rango, A.

    1981-01-01

    Both LANDSAT and NOAA satellite data were used to improve snowmelt runoff forecasts. When the satellite snow cover data were tested in both empirical seasonal runoff estimation and short-term modeling approaches, a definite potential for reducing forecast error was evident. A cost-benefit analysis run in conjunction with the snow mapping indicated a $36.5 million annual benefit accruing from a one percent improvement in forecast accuracy using the snow cover data for the western United States. The annual cost of employing the system would be $505,000. The snow mapping has proven that satellite snow cover data can be used to reduce snowmelt runoff forecast error in a cost-effective manner once all operational satellite data are available within 72 hours after acquisition. Executive summaries of the individual snow mapping projects are presented.
