DOT National Transportation Integrated Search
2017-02-08
The study re-evaluates distress prediction models using the Mechanistic-Empirical Pavement Design Guide (MEPDG) and expands the sensitivity analysis to a wide range of pavement structures and soils. In addition, an extensive validation analysis of th...
[Mobbing: a meta-analysis and integrative model of its antecedents and consequences].
Topa Cantisano, Gabriela; Depolo, Marco; Morales Domínguez, J Francisco
2007-02-01
Although mobbing has been extensively studied, empirical research has not led to firm conclusions regarding its antecedents and consequences, both at personal and organizational levels. An extensive literature search yielded 86 empirical studies with 93 samples. The correlation matrix obtained through meta-analytic techniques was used to test a structural equation model. Results supported hypotheses regarding organizational environmental factors as main predictors of mobbing.
Topa Cantisano, Gabriela; Morales Domínguez, J F; Depolo, Marco
2008-05-01
Although sexual harassment has been extensively studied, empirical research has not led to firm conclusions about its antecedents and consequences, both at the personal and organizational level. An extensive literature search yielded 42 empirical studies with 60 samples. The correlation matrix obtained through meta-analytic techniques was used to test a structural equation model. Results supported the hypotheses regarding organizational environmental factors as main predictors of harassment.
Gender Differences in Access to Extension Services and Agricultural Productivity
ERIC Educational Resources Information Center
Ragasa, Catherine; Berhane, Guush; Tadesse, Fanaye; Taffesse, Alemayehu Seyoum
2013-01-01
Purpose: This article contributes new empirical evidence and nuanced analysis on the gender difference in access to extension services and how this translates to observed differences in technology adoption and agricultural productivity. Approach: It looks at the case of Ethiopia, where substantial investments in the extension system have been…
ERIC Educational Resources Information Center
De Rosa, Marcello; Bartoli, Luca
2017-01-01
Purpose: The aim of the paper is to evaluate how advisory services stimulate the adoption of rural development policies (RDP) aiming at value creation. Design/methodology/approach: By linking the use of agricultural extension services (AES) to policies for value creation, we will put forward an empirical analysis in Italy, with the aim of…
A License to Produce? Farmer Interpretations of the New Food Security Agenda
ERIC Educational Resources Information Center
Fish, Rob; Lobley, Matt; Winter, Michael
2013-01-01
Drawing on the findings of empirical research conducted in the South West of England, this paper explores how farmers make sense of re-emerging imperatives for "food security" in UK policy and political discourse. The analysis presented is based on two types of empirical inquiry. First, an extensive survey of 1543 farmers, exploring the…
ERIC Educational Resources Information Center
Forster, Greg
2008-01-01
The impact of Florida's "A+" accountability program, which until 2006 included a voucher program for chronically failing schools, on public school performance has been extensively studied. The results have consistently shown a positive effect on academic outcomes in Florida public schools. However, no empirical research has been done on…
ERIC Educational Resources Information Center
Schmid, Richard F.; Bernard, Robert M.; Borokhovski, Eugene; Tamim, Rana; Abrami, Philip C.; Wade, C. Anne; Surkes, Michael A.; Lowerison, Gretchen
2009-01-01
This paper reports the findings of a Stage I meta-analysis exploring the achievement effects of computer-based technology use in higher education classrooms (non-distance education). An extensive literature search revealed more than 6,000 potentially relevant primary empirical studies. Analysis of a representative sample of 231 studies (k = 310)…
Modeling the effects of study abroad programs on college students
Alvin H. Yu; Garry E. Chick; Duarte B. Morais; Chung-Hsien Lin
2009-01-01
This study explored the possibility of modeling the effects of a study abroad program on students from a university in the northeastern United States. A program effect model was proposed after conducting an extensive literature review and empirically examining a sample of 265 participants in 2005. Exploratory factor analysis (EFA), confirmatory factor analysis (CFA),...
Quantitative genetic versions of Hamilton's rule with empirical applications
McGlothlin, Joel W.; Wolf, Jason B.; Brodie, Edmund D.; Moore, Allen J.
2014-01-01
Hamilton's theory of inclusive fitness revolutionized our understanding of the evolution of social interactions. Surprisingly, an incorporation of Hamilton's perspective into the quantitative genetic theory of phenotypic evolution has been slow, despite the popularity of quantitative genetics in evolutionary studies. Here, we discuss several versions of Hamilton's rule for social evolution from a quantitative genetic perspective, emphasizing its utility in empirical applications. Although evolutionary quantitative genetics offers methods to measure each of the critical parameters of Hamilton's rule, empirical work has lagged behind theory. In particular, we lack studies of selection on altruistic traits in the wild. Fitness costs and benefits of altruism can be estimated using a simple extension of phenotypic selection analysis that incorporates the traits of social interactants. We also discuss the importance of considering the genetic influence of the social environment, or indirect genetic effects (IGEs), in the context of Hamilton's rule. Research in social evolution has generated an extensive body of empirical work focusing—with good reason—almost solely on relatedness. We argue that quantifying the roles of social and non-social components of selection and IGEs, in addition to relatedness, is now timely and should provide unique additional insights into social evolution. PMID:24686930
Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers
García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta
2016-01-01
The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine. PMID:28773653
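As a minimal sketch of the classification step described above, the snippet below trains a support vector machine on hypothetical (critical span, RMR) case histories using scikit-learn; it illustrates the technique only and is not the authors' code or database.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical case histories: [critical span (m), rock mass rating (RMR)]
X = np.array([[4.0, 70], [6.0, 75], [8.0, 60], [12.0, 45], [20.0, 40]])
y = np.array([0, 0, 1, 2, 2])  # 0 stable, 1 potentially unstable, 2 unstable

# Scale features, then fit an RBF-kernel SVM classifier
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X, y)

# Stability class predicted for a new span/RMR design point
print(clf.predict([[10.0, 55]]))
```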
Dimensions of Early Speech Sound Disorders: A Factor Analytic Study
ERIC Educational Resources Information Center
Lewis, Barbara A.; Freebairn, Lisa A.; Hansen, Amy J.; Stein, Catherine M.; Shriberg, Lawrence D.; Iyengar, Sudha K.; Taylor, H. Gerry
2006-01-01
The goal of this study was to classify children with speech sound disorders (SSD) empirically, using factor analytic techniques. Participants were 3-7-year olds enrolled in speech/language therapy (N=185). Factor analysis of an extensive battery of speech and language measures provided support for two distinct factors, representing the skill…
Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model
ERIC Educational Resources Information Center
Sandaire, Johnny
2009-01-01
A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision making in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…
School Reforms, Principal Leadership, and Teacher Resistance: Evidence from Korea
ERIC Educational Resources Information Center
Park, Joo-Ho; Jeong, Dong Wook
2013-01-01
Many countries design and implement school change with a focus on the fundamental reconfiguration in the structures of schooling. In this article, we examined the relationship between principal leadership and teacher resistance to school reforms driven by external interveners. For an empirical analysis, we took advantage of extensive data derived…
Mixture Distribution Latent State-Trait Analysis: Basic Ideas and Applications
ERIC Educational Resources Information Center
Courvoisier, Delphine S.; Eid, Michael; Nussbeck, Fridtjof W.
2007-01-01
Extensions of latent state-trait models for continuous observed variables to mixture latent state-trait models with and without covariates of change are presented that can separate individuals differing in their occasion-specific variability. An empirical application to the repeated measurement of mood states (N = 501) revealed that a model with 2…
NASA Astrophysics Data System (ADS)
Levy, M. C.; Thompson, S. E.; Cohn, A.
2014-12-01
Land use/cover change (LUCC) has occurred extensively in the Brazilian Amazon rainforest-savanna transition. Agricultural development-driven LUCC at regional scales can alter surface energy budgets, evapotranspiration (ET) and rainfall; these hydroclimatic changes impact streamflows, and thus hydropower. To date, there is only limited empirical understanding of these complex land-water-energy nexus dynamics, yet understanding is important to developing countries where both agriculture and hydropower are expanding and intensifying. To observe these changes and their interconnections, we synthesize a novel combination of ground network, remotely sensed, and empirically modeled data for LUCC, rainfall, flows, and hydropower potential. We connect the extensive temporal and spatial trends in LUCC occurring from 2000-2012 (and thus observable in the satellite record) to long-term historical flow records and run-of-river hydropower generation potential estimates. Changes in hydrologic condition are observed in terms of dry and wet season moments, extremes, and flow duration curves. Run-of-river hydropower generation potential is modeled at basin gauge points using equation models parameterized with literature-based low-head turbine efficiencies, and simple algorithms establishing optimal head and capacity from elevation and flows, respectively. Regression analyses are used to demonstrate a preliminary causal analysis of LUCC impacts to flow and energy, and discuss extension of the analysis to ungauged basins. The results are transferable to tropical and transitional forest regions worldwide where simultaneous agricultural and hydropower development potentially compete for coupled components of regional water cycles, and where policy makers and planners require an understanding of LUCC impacts to hydroclimate-dependent industries and ecosystems.
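For orientation, run-of-river generation potential at a gauge point reduces to the standard hydropower relation P = η ρ g Q H. The sketch below applies it with a placeholder low-head turbine efficiency and example numbers; it does not reproduce the study's actual parameterization.

```python
# Standard hydropower relation; the 0.85 efficiency is a placeholder taken
# as a typical literature value for low-head turbines, not the study's fit.
RHO_WATER = 1000.0  # water density, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def run_of_river_power(flow_m3s, head_m, efficiency=0.85):
    """Hydropower potential in watts: P = eta * rho * g * Q * H."""
    return efficiency * RHO_WATER * G * flow_m3s * head_m

# e.g. 15 m^3/s through a 6 m head yields roughly 0.75 MW
print(run_of_river_power(15.0, 6.0) / 1e6, "MW")
```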
Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme
2008-01-01
Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
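Written in display form, the neutral expectation underlying the dual test reads:

```latex
\[
  \mathbf{D} \;=\; \frac{2\,F_{ST}}{1 - F_{ST}}\,\mathbf{G},
\]
```

so neutrality implies both that the among-population matrix D and the within-population matrix G are proportional, and that the proportionality coefficient equals 2F_ST/(1 - F_ST); rejecting either condition is evidence of selection.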
Conducting a Multivocal Thematic Synthesis on an Extensive Body of Literature
ERIC Educational Resources Information Center
Befus, Madelaine
2016-01-01
This paper will provide a methodology and progress report from a multivocal thematic synthesis being conducted on an extensive, diverse body of empirical studies. The study data includes a corpus of peer-reviewed empirical literature sharing a common reference published in English between 2000 and 2014. In this study, data to be synthesized share…
Rosen, Baruch; Tepper, Yotam; Bar-Oz, Guy
2018-01-01
Metric data of 6th century CE pigeons from the Negev Desert, Israel, are employed to test competing hypotheses on flock management strategies: that directed selection for size or shape took place under intensive management; or, alternatively, that stabilizing selection was a stronger determinant of size and shape under extensive management conditions. The results of the analysis support the second hypothesis by demonstrating that the Byzantine Negev pigeons were like wild pigeon (Columba livia) in shape, albeit small-sized. The inferred extensive management system is then discussed in the context of pigeon domestication and human micro-ecologies in marginal regions. PMID:29561880
Empirical factors and structure transference: Returning to the London account
NASA Astrophysics Data System (ADS)
Bueno, Otávio; French, Steven; Ladyman, James
2012-05-01
We offer a framework to represent the roles of empirical and theoretical factors in theory construction, and examine a case study to illustrate how the framework can be used to illuminate central features of scientific reasoning. The case study provides an extension of French and Ladyman's (1997) analysis of Fritz and Heinz London's model of superconductivity to accommodate the role of the analogy between superconductivity and diamagnetic phenomena in the development of the model between 1935 and 1937. We focus on this case since it allows us to separate the roles of empirical and theoretical factors, and so provides an example of the utility of the approach that we have adopted. We conclude the paper by drawing on the particular framework here developed to address a range of concerns.
An Empirical Approach to Analysis of Similarities between Software Failure Regions
1991-09-01
cycle costs after the software has been marketed (Alberts, 1976). Unfortunately, extensive software testing is frequently necessary in spite of...incidence is primarily syntactic. This mixing of semantic and syntactic forms in the same analysis could lead to some distortion, especially since the...of formulae to improve readability or to indicate precedence of operations. * All definitions within 'Condition I' of a failure region are assumed to
ERIC Educational Resources Information Center
Desjardins, Richard; Ederer, Peer
2015-01-01
This article explores the relative importance of different socio-demographic and practice-oriented factors that are related to proficiency in problem solving in technology-rich environments (PSTREs) and by extension may be related to complex problem solving (CPS). The empirical analysis focuses on the proficiency measurements of PSTRE made…
The Effect of Automobile Safety on Vehicle Type Choice: An Empirical Study.
ERIC Educational Resources Information Center
McCarthy, Patrick S.
An analysis was made of the extent to which the safety characteristics of new vehicles affect consumer purchase decisions. Using an extensive data set that combines vehicle data collected by the Automobile Club of Southern California Target Car Program with the responses from a national household survey of new car buyers, a statistical model of…
An Econometric Examination of the Behavioral Perspective Model in the Context of Norwegian Retailing
ERIC Educational Resources Information Center
Sigurdsson, Valdimar; Kahamseh, Saeed; Gunnarsson, Didrik; Larsen, Nils Magne; Foxall, Gordon R.
2013-01-01
The behavioral perspective model's (BPM; Foxall, 1990) retailing literature is built on extensive empirical research and techniques that were originally refined in choice experiments in behavioral economics and behavior analysis, and then tested mostly on British consumer panel data. We test the BPM in the context of Norwegian retailing. This…
Theoretical and Empirical Analysis of a Spatial EA Parallel Boosting Algorithm.
Kamath, Uday; Domeniconi, Carlotta; De Jong, Kenneth
2018-01-01
Many real-world problems involve massive amounts of data. Under these circumstances learning algorithms often become prohibitively expensive, making scalability a pressing issue to be addressed. A common approach is to perform sampling to reduce the size of the dataset and enable efficient learning. Alternatively, one customizes learning algorithms to achieve scalability. In either case, the key challenge is to obtain algorithmic efficiency without compromising the quality of the results. In this article we discuss a meta-learning algorithm (PSBML) that combines concepts from spatially structured evolutionary algorithms (SSEAs) with concepts from ensemble and boosting methodologies to achieve the desired scalability property. We present both theoretical and empirical analyses which show that PSBML preserves a critical property of boosting, specifically, convergence to a distribution centered around the margin. We then present additional empirical analyses showing that this meta-level algorithm provides a general and effective framework that can be used in combination with a variety of learning classifiers. We perform extensive experiments to investigate the trade-off achieved between scalability and accuracy, and robustness to noise, on both synthetic and real-world data. These empirical results corroborate our theoretical analysis, and demonstrate the potential of PSBML in achieving scalability without sacrificing accuracy.
ERIC Educational Resources Information Center
Matson, Johnny L.; Kozlowski, Alison M.; Worley, Julie A.; Shoemaker, Mary E.; Sipes, Megan; Horovitz, Max
2011-01-01
An extensive literature on the causes of challenging behaviors has been developed, primarily in the applied behavior analysis literature. One hundred and seventy-three empirical studies were reviewed where functional assessment serves as the primary method of identifying these causes. Most of the studies were able to identify a clear function or…
Artistry and Analysis: Student Experiences of UK Practice-Based Doctorates in Art and Design
ERIC Educational Resources Information Center
Collinson, Jacquelyn Allen
2005-01-01
During the last decade, doctoral education has been the focus of much international academic attention. This period has also witnessed the rapid growth of practice-based research degrees in art and design in the UK. To date, however, there has been no extensive empirical research on the subjective experiences of students undertaking this form of…
An Empirical Analysis of the Navy Junior Reserve Officer Training Corps (NJROTC)
1987-12-01
...following: 1. The overall total, 2. The row totals, 3. The column totals. "Supertables" were also extensively used. A supertable is essentially a collection
Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques
2012-09-01
The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, the use of an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang's EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean-envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Several 2-D extensions of the EMD method have recently been proposed; despite these efforts, existing 2-D versions of EMD perform poorly and are very time-consuming. In this paper, an extension of the PDE-based approach to the 2-D space is therefore described in detail. The approach has been applied to both signal and image decomposition, and the obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data, with results provided for image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
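For orientation, the sketch below shows one step of the classical algorithmic sifting that the PDE-based approach replaces: interpolate upper and lower envelopes through the extrema and subtract their mean. It is a generic 1-D illustration only, not the authors' PDE implementation.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift_once(t, x):
    """One classical sifting step: subtract the mean of the cubic-spline
    envelopes through the local maxima and minima. Spline extrapolation
    beyond the outermost extrema is one source of the endpoint effect."""
    maxima = argrelextrema(x, np.greater)[0]
    minima = argrelextrema(x, np.less)[0]
    upper = CubicSpline(t[maxima], x[maxima])(t)
    lower = CubicSpline(t[minima], x[minima])(t)
    return x - 0.5 * (upper + lower)

t = np.linspace(0, 1, 1000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
h = sift_once(t, x)  # first candidate intrinsic mode function
```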
ERIC Educational Resources Information Center
Sanchez-Franco, Manuel J.; Martinez-Lopez, Francisco J.; Martin-Velicia, Felix A.
2009-01-01
Our research specifically focuses on the effects of the national cultural background of educators on the acceptance and usage of ICT, particularly the Web as an extensive and expanding information base that provides the ultimate in resource-rich learning. Most research has used North Americans as subjects. For this reason, we interviewed…
Military Suicide Research Consortium: Extension to New Opportunities and Challenges
2017-04-01
Abnormal Psychology. 56. Tucker, R., Michaels, M., Rogers, M., Wingate, L., & Joiner, T. (2016). Construct validity of a proposed new diagnostic entity...analysis with implications for understanding suicidal behavior. Journal of Abnormal Psychology, 123, 835-840. 2. Anestis, M., Soberay, K., Gutierrez, P...predictions of the interpersonal-psychological theory of suicidal behavior: Empirical tests in two samples of young adults. Journal of Abnormal
ERIC Educational Resources Information Center
Tilak, Jandhyala B. G.
An extensive survey of empirical research on education as related to poverty, growth, and income distribution is presented, with the focus on 21 developing nations. The study uses the latest available data on alternative measures of income distribution, income shares of various population groups by income classes, and poverty ratios. The analysis…
The Effects of Extensive Reading on Reading Comprehension, Reading Rate, and Vocabulary Acquisition
ERIC Educational Resources Information Center
Suk, Namhee
2017-01-01
Several empirical studies and syntheses of extensive reading have concluded that extensive reading has positive impacts on language learning in second- and foreign-language settings. However, many of the studies contained methodological or curricular limitations, raising questions about the asserted positive effects of extensive reading. The…
Semi-Empirical Prediction of Aircraft Low-Speed Aerodynamic Characteristics
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2015-01-01
This paper lays out a comprehensive methodology for computing a low-speed, high-lift polar, without requiring additional details about the aircraft design beyond what is typically available at the conceptual design stage. Introducing low-order, physics-based aerodynamic analyses allows the methodology to be more applicable to unconventional aircraft concepts than traditional, fully-empirical methods. The methodology uses empirical relationships for flap lift effectiveness, chord extension, drag-coefficient increment and maximum lift coefficient of various types of flap systems as a function of flap deflection, and combines these increments with the characteristics of the unflapped airfoils. Once the aerodynamic characteristics of the flapped sections are known, a vortex-lattice analysis calculates the three-dimensional lift, drag and moment coefficients of the whole aircraft configuration. This paper details the results of two validation cases: a supercritical airfoil model with several types of flaps; and a 12-foot, full-span aircraft model with slats and double-slotted flaps.
NASA Astrophysics Data System (ADS)
Sword-Daniels, V. L.; Rossetto, T.; Wilson, T. M.; Sargeant, S.
2015-05-01
The essential services that support urban living are complex and interdependent, and their disruption in disasters directly affects society. Yet there are few empirical studies to inform our understanding of the vulnerabilities and resilience of complex infrastructure systems in disasters. This research takes a systems thinking approach to explore the dynamic behaviour of a network of essential services, in the presence and absence of volcanic ashfall hazards in Montserrat, West Indies. Adopting a case study methodology and qualitative methods to gather empirical data, we centre the study on the healthcare system and its interconnected network of essential services. We identify different types of relationship between sectors and develop a new interdependence classification system for analysis. Relationships are further categorised by hazard conditions, for use in extensive risk contexts. During heightened volcanic activity, relationships between systems transform in both number and type: connections increase across the network by 41%, and adapt to increase cooperation and information sharing. Interconnections add capacities to the network, increasing the resilience of prioritised sectors. This in-depth and context-specific approach provides a new methodology for studying the dynamics of infrastructure interdependence in an extensive risk context, and can be adapted for use in other hazard contexts.
A study of multiplex data bus techniques for the space shuttle
NASA Technical Reports Server (NTRS)
Kearney, R. J.; Kalange, M. A.
1972-01-01
A comprehensive technology base for the design of a multiplexed data bus subsystem is provided. Extensive analyses, both analytical and empirical, were performed. Subjects covered are classified under the following headings: requirements identification and analysis; transmission media studies; signal design and detection studies; synchronization, timing, and control studies; user-subsystem interface studies; operational reliability analyses; design of candidate data bus configurations; and evaluation of candidate data bus designs.
Hemakom, Apit; Goverdovsky, Valentin; Looney, David; Mandic, Danilo P
2016-04-13
An extension to multivariate empirical mode decomposition (MEMD), termed adaptive-projection intrinsically transformed MEMD (APIT-MEMD), is proposed to cater for power imbalances and inter-channel correlations in real-world multichannel data. It is shown that the APIT-MEMD exhibits similar or better performance than MEMD for a large number of projection vectors, whereas it outperforms MEMD for the critical case of a small number of projection vectors within the sifting algorithm. We also employ the noise-assisted APIT-MEMD within our proposed intrinsic multiscale analysis framework and illustrate the advantages of such an approach in notoriously noise-dominated cooperative brain-computer interface (BCI) based on the steady-state visual evoked potentials and the P300 responses. Finally, we show that for a joint cognitive BCI task, the proposed intrinsic multiscale analysis framework improves system performance in terms of the information transfer rate. © 2016 The Author(s).
Terrorism as a process: a critical review of Moghaddam's "Staircase to Terrorism".
Lygre, Ragnhild B; Eid, Jarle; Larsson, Gerry; Ranstorp, Magnus
2011-12-01
This study reviews empirical evidence for Moghaddam's model "Staircase to Terrorism," which portrays terrorism as a process of six consecutive steps culminating in terrorism. An extensive literature search, where 2,564 publications on terrorism were screened, resulted in 38 articles which were subject to further analysis. The results showed that while most of the theories and processes linked to Moghaddam's model are supported by empirical evidence, the proposed transitions between the different steps are not. These results may question the validity of a linear stepwise model and may suggest that a combination of mechanisms/factors could combine in different ways to produce terrorism. © 2011 The Authors. Scandinavian Journal of Psychology © 2011 The Scandinavian Psychological Associations.
Ivory, James D; Williams, Dmitri; Martins, Nicole; Consalvo, Mia
2009-08-01
Although violent video game content and its effects have been examined extensively by empirical research, verbal aggression in the form of profanity has received less attention. Building on preliminary findings from previous studies, an extensive content analysis of profanity in video games was conducted using a sample of the 150 top-selling video games across all popular game platforms (including home consoles, portable consoles, and personal computers). The frequency of profanity, both in general and across three profanity categories, was measured and compared to games' ratings, sales, and platforms. Generally, profanity was found in about one in five games and appeared primarily in games rated for teenagers or above. Games containing profanity, however, tended to contain it frequently. Profanity was not found to be related to games' sales or platforms.
Empirical Questionnaire Methods for Fund-Raising Campaign Preparedness in Extension
ERIC Educational Resources Information Center
Comley Adams, Catherine; Butler, Douglass A.
2017-01-01
Amid waning public financial support for Extension program offerings, highly strategic and professional fund-raising practices are necessary for gaining momentum among private philanthropists and closing the fiscal gap. University of Missouri Extension conducted a precampaign survey that invited feedback from stakeholders to inform Extension…
Correspondence Analysis-Theory and Application in Management Accounting Research
NASA Astrophysics Data System (ADS)
Duller, Christine
2010-09-01
Correspondence analysis is an exploratory data-analytic technique used to identify systematic relations between categorical variables. It is related to principal component analysis, and its results provide information on the structure of categorical variables similar to that given by a principal component analysis for metric variables. Classical correspondence analysis handles two categorical variables, whereas multiple correspondence analysis is an extension to more than two variables. After an introductory overview of the idea and its implementation in standard software packages (PASW, SAS, R), an example from recent research is presented, which deals with strategic management accounting in family and non-family enterprises in Austria, where 70% to 80% of all enterprises can be classified as family firms. Although there is a growing body of literature focusing on various management issues in family firms, the state of the art of strategic management accounting in family firms remains an empirically under-researched subject. In the relevant literature, only the (empirically untested) hypothesis can be found that family firms tend to have less formalized management accounting systems than non-family enterprises. A correspondence analysis will help to identify the underlying structure responsible for differences in strategic management accounting.
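As a sketch of the mechanics, simple correspondence analysis can be computed from the SVD of the standardized residuals of a contingency table. The counts below are hypothetical, not the Austrian survey data.

```python
import numpy as np

N = np.array([[20.0, 5.0], [10.0, 15.0], [5.0, 25.0]])  # hypothetical counts
P = N / N.sum()                       # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)   # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
U, sing, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = (U * sing) / np.sqrt(r)[:, None]    # principal row coordinates
col_coords = (Vt.T * sing) / np.sqrt(c)[:, None]  # principal column coordinates
inertia = sing**2                                 # inertia per dimension
```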
Technology in Gifted Education: A Review of Best Practices and Empirical Research
ERIC Educational Resources Information Center
Periathiruvadi, Sita; Rinn, Anne N.
2013-01-01
The article aims to explore the progress of technology use in gifted education and highlight the best practices and empirical research in this area. The literature on the use of technology with gifted students and their teachers has been extensive, with articles on best practices, but the empirical research in this area is still emerging. With the…
Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang
2011-02-01
This study aims at developing a set of appropriate performance evaluation indices, based mainly on the balanced scorecard (BSC), for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature review and consultation with experts who have practical experience in extension education, adequate performance evaluation indices were selected; the decision making trial and evaluation laboratory (DEMATEL) and the analytic network process (ANP) were then applied to establish, respectively, the causality between the four BSC perspectives and the relative weights between evaluation indices. Based on these results, an empirical analysis of the performance evaluation of the extension education centers of three universities in Taoyuan County, Taiwan, is illustrated by applying VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). The analysis indicates that "Learning and growth" is the most influential factor and affects the other three perspectives. In addition, the "Internal process" and "Financial" perspectives play important roles in the performance evaluation of extension education centers. The top three key performance indices are "After-sales service", "Turnover volume", and "Net income". The proposed evaluation model can serve as a reference for extension education centers in universities to prioritize improvements on the key performance indices after performing VIKOR analyses. 2010 Elsevier Ltd. All rights reserved.
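For readers unfamiliar with VIKOR, the following is a minimal sketch of its ranking step in the standard formulation. The scores and weights are placeholders, not the DEMATEL/ANP-derived weights of the study.

```python
import numpy as np

F = np.array([[0.70, 0.60, 0.80],   # hypothetical scores: 3 centers x 3 indices
              [0.55, 0.75, 0.65],
              [0.60, 0.50, 0.90]])
w = np.array([0.5, 0.3, 0.2])       # placeholder criterion weights
v = 0.5                             # weight of the "group utility" strategy

f_best, f_worst = F.max(axis=0), F.min(axis=0)  # benefit-type criteria
d = w * (f_best - F) / (f_best - f_worst)       # weighted normalized distances
S, R = d.sum(axis=1), d.max(axis=1)             # group utility, individual regret
Q = v * (S - S.min()) / (S.max() - S.min()) + \
    (1 - v) * (R - R.min()) / (R.max() - R.min())
print(np.argsort(Q))  # compromise ranking, best alternative first
```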
Morselli, Davide; Passini, Stefano
2015-11-01
In Crimes of obedience, Kelman and Hamilton argue that societies can be protected from the degeneration of authority only when citizenship is based on a strong values orientation. This reference to values may be the weakest point in their theory because they do not explicitly define these values. Nevertheless, their empirical findings suggest that the authors are referring to specific democratic principles and universal values (e.g., equality, fairness, harmlessness). In this article, a composite index known as the value-oriented citizenship (VOC) index is introduced and empirically analysed. The results confirm that the VOC index discriminates between people who relate to authority based on values rather than based on their role or on rules in general. The article discusses the utility of the VOC index to develop Kelman and Hamilton's framework further empirically as well as its implications for the analysis of the relationship between individuals and authority. Copyright © 2015 Elsevier Inc. All rights reserved.
Sorption and reemission of formaldehyde by gypsum wallboard. Report for June 1990-August 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, J.C.S.
1993-01-01
The paper gives results of an analysis of the sorption and desorption of formaldehyde by unpainted wallboard, using a mass transfer model based on the Langmuir sorption isotherm. The sorption and desorption rate constants are determined by short-term experimental data. Long-term sorption and desorption curves are developed by the mass transfer model without any adjustable parameters. Compared with other empirically developed models, the mass transfer model has more extensive applicability and provides an elucidation of the sorption and desorption mechanism that empirical models cannot. The mass transfer model is also more feasible and accurate than empirical models for applications such as scale-up and exposure assessment. For a typical indoor environment, the model predicts that gypsum wallboard is a much stronger sink for formaldehyde than for other indoor air pollutants such as tetrachloroethylene and ethylbenzene. The strong sink effects are reflected by the high equilibrium capacity and slow decay of the desorption curve.
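A minimal sketch of a Langmuir-type sink model coupled to a well-mixed room mass balance, in the generic form the abstract describes; the rate constants and room parameters below are hypothetical, not the report's fitted values.

```python
import numpy as np
from scipy.integrate import solve_ivp

V, Q, A = 30.0, 15.0, 50.0   # room volume (m^3), airflow (m^3/h), wall area (m^2)
ka, kd, M = 0.5, 0.01, 50.0  # sorption rate (m/h), desorption rate (1/h), capacity (mg/m^2)
C_in = 0.1                   # supply-air formaldehyde concentration (mg/m^3)

def rhs(t, y):
    C, m = y                                   # air conc., sorbed mass per area
    sorption = ka * C * (1 - m / M) - kd * m   # Langmuir sorption flux (mg/m^2/h)
    dC = (Q * (C_in - C) - A * sorption) / V   # well-mixed room mass balance
    return [dC, sorption]

sol = solve_ivp(rhs, (0.0, 100.0), [0.0, 0.0], dense_output=True)
```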
A glacier runoff extension to the Precipitation Runoff Modeling System
A. E. Van Beusekom; R. J. Viger
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...
NASA Technical Reports Server (NTRS)
Devenport, William J.; Glegg, Stewart A. L.
1995-01-01
This report summarizes accomplishments and progress for the period ending April 1995. Much of the work during this period has concentrated on preparation for an analysis of data produced by an extensive wind tunnel test. Time has also been spent further developing an empirical theory to account for the effects of blade-vortex interaction upon the circulation distribution of the vortex and on preliminary measurements aimed at controlling the vortex core size.
Heat transfer correlations for multilayer insulation systems
NASA Astrophysics Data System (ADS)
Krishnaprakas, C. K.; Badari Narayana, K.; Dutta, Pradip
2000-01-01
Multilayer insulation (MLI) blankets are extensively used in spacecraft as lightweight thermal protection systems. Heat transfer analysis of MLI is sometimes too complex to use in practical design applications. Hence, for practical engineering design purposes, it is necessary to have simpler procedures to evaluate the heat transfer rate through MLI. In this paper, four different empirical models for heat transfer are evaluated by fitting against experimentally observed heat flux through MLI blankets of various configurations, and the results are discussed.
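To illustrate the kind of empirical fitting the paper performs, the sketch below fits a generic two-term (radiation-like plus conduction-like) correlation to measured heat flux with SciPy. The functional form and data are illustrative assumptions, not one of the paper's four models.

```python
import numpy as np
from scipy.optimize import curve_fit

def mli_flux(X, a, b):
    """Generic empirical MLI correlation: radiation-like + conduction-like
    terms, both decreasing with the number of layers N."""
    Th, Tc, N = X  # hot/cold boundary temperatures (K), layer count
    return a * (Th**4 - Tc**4) / N + b * (Th - Tc) / N

# Hypothetical measurements for several blanket configurations
Th = np.array([300.0, 300.0, 250.0, 350.0])
Tc = np.array([80.0, 100.0, 80.0, 90.0])
N = np.array([20, 30, 20, 40])
q_meas = np.array([1.8, 1.1, 1.0, 1.4])  # observed heat flux, W/m^2

params, _ = curve_fit(mli_flux, (Th, Tc, N), q_meas)
```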
Bayesian Group Bridge for Bi-level Variable Selection.
Mallick, Himel; Yi, Nengjun
2017-06-01
A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.
Spectrum of bacteremia in posthematopoietic stem cell transplant patients from an Indian center.
Ghafur, A; Devarajan, V; Raj, R; Easow, J; Raja, T
2016-01-01
Despite the relatively low prevalence of Gram-positive bacteremic infections in Indian oncology patients, glycopeptides are extensively used for empirical management of febrile neutropenia. Our aim was to analyze the spectrum of bacteremia in posthematopoietic stem cell transplant (HSCT) recipients in our center and make a recommendation on glycopeptide use in this patient population. Retrospective analysis of bacteremic data from HSCT recipients in a tertiary care oncology and transplant center from South India, between 2011 and 2013. In 217 patients, 52 bacteremic episodes were identified. The majority of the isolates were Gram-negatives (88.4%) with very few Gram-positives (7.69%). Glycopeptides need not be included in the empirical antibiotic regimen in post-HSCT settings with very low Gram-positive infection rates.
Multilevel corporate environmental responsibility.
Karassin, Orr; Bar-Haim, Aviad
2016-12-01
The multilevel empirical study of the antecedents of corporate social responsibility (CSR) has been identified as "the first knowledge gap" in CSR research. Based on an extensive literature review, the present study outlines a conceptual multilevel model of CSR, then designs and empirically validates an operational multilevel model of the principal driving factors affecting corporate environmental responsibility (CER), as a measure of CSR. Both conceptual and operational models incorporate three levels of analysis: institutional, organizational, and individual. The multilevel nature of the design allows for the assessment of the relative importance of the levels and of their components in the achievement of CER. Unweighted least squares (ULS) regression analysis reveals that the institutional-level variables have medium relationships with CER, some variables having a negative effect. The organizational level is revealed as having strong and positive significant relationships with CER, with organizational culture and managers' attitudes and behaviors as significant driving forces. The study demonstrates the importance of multilevel analysis in improving the understanding of CSR drivers, relative to single level models, even if the significance of specific drivers and levels may vary by context. Copyright © 2016 Elsevier Ltd. All rights reserved.
Dong, Fengxia; Mitchell, Paul D; Colquhoun, Jed
2015-01-01
Measuring farm sustainability performance is a crucial component for improving agricultural sustainability. While extensive assessments and indicators exist that reflect the different facets of agricultural sustainability, because of the relatively large number of measures and interactions among them, a composite indicator that integrates and aggregates over all variables is particularly useful. This paper describes and empirically evaluates a method for constructing a composite sustainability indicator that individually scores and ranks farm sustainability performance. The method first uses non-negative polychoric principal component analysis to reduce the number of variables, to remove correlation among variables and to transform categorical variables to continuous variables. Next the method applies common-weight data envelope analysis to these principal components to individually score each farm. The method solves weights endogenously and allows identifying important practices in sustainability evaluation. An empirical application to Wisconsin cranberry farms finds heterogeneity in sustainability practice adoption, implying that some farms could adopt relevant practices to improve the overall sustainability performance of the industry. Copyright © 2014 Elsevier Ltd. All rights reserved.
Harmonic analysis of electrified railway based on improved HHT
NASA Astrophysics Data System (ADS)
Wang, Feng
2018-04-01
In this paper, the causes and harms of harmonics in electric locomotive electrical systems are first studied and analyzed. Based on the characteristics of the harmonics in the electrical system, the Hilbert-Huang transform (HHT) method is introduced. Building on an in-depth analysis of the empirical mode decomposition method and the Hilbert transform, the causes of, and solutions to, the endpoint effect and the modal aliasing problem in the HHT method are explored. To mitigate the endpoint effect, this paper uses a point-symmetric extension method to extend the collected data; to address the modal aliasing problem, it uses a high-frequency harmonic assistant method to preprocess the signal and gives an empirical formula for the high-frequency auxiliary harmonic. Finally, combining the suppression of the endpoint effect and of modal aliasing, an improved HHT method is proposed and simulated in MATLAB. The simulation results show that the improved HHT is effective for the electric locomotive power supply system.
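A minimal sketch of the point-symmetric endpoint extension idea mentioned above (an odd mirror reflection about each endpoint); the padding length is an arbitrary choice here, and the extension would be cropped after analysis.

```python
import numpy as np

def point_symmetric_extend(x, pad):
    """Extend x by reflecting it point-symmetrically (odd mirror) about
    each endpoint; assumes pad < len(x) - 1."""
    left = 2 * x[0] - x[pad:0:-1]         # odd reflection about x[0]
    right = 2 * x[-1] - x[-2:-pad - 2:-1]  # odd reflection about x[-1]
    return np.concatenate([left, x, right])

x = np.sin(np.linspace(0, 3, 50))
x_ext = point_symmetric_extend(x, pad=10)  # analyze, then crop the padding
```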
ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.
Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro
2018-01-01
The study of reaction times and their underlying cognitive processes is an important field in psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point usually overlooked in the literature in this area). The analysis allows one to identify outliers in the empirical datasets and to judge whether data trimming is needed and at which points it should be done.
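For orientation, an ex-Gaussian maximum-likelihood fit can be reproduced with SciPy's exponnorm distribution (an equivalent parameterization with K = τ/σ); the sketch below uses SciPy rather than assuming any ExGUtils-specific API, and the simulated data are illustrative.

```python
import numpy as np
from scipy.stats import exponnorm

rng = np.random.default_rng(0)
# Simulated reaction times: Gaussian(mu, sigma) + Exponential(tau), in ms
mu, sigma, tau = 400.0, 40.0, 100.0
rts = rng.normal(mu, sigma, 500) + rng.exponential(tau, 500)

K, loc, scale = exponnorm.fit(rts)  # maximum-likelihood estimates
mu_hat, sigma_hat, tau_hat = loc, scale, K * scale
print(mu_hat, sigma_hat, tau_hat)
```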
Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B
2006-08-01
Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
Does the U.S. exercise contagion on Italy? A theoretical model and empirical evidence
NASA Astrophysics Data System (ADS)
Cerqueti, Roy; Fenga, Livio; Ventura, Marco
2018-06-01
This paper deals with the theme of contagion in financial markets. To this end, we develop a model based on Mixed Poisson Processes to describe the abnormal returns of the financial markets of two considered countries. In so doing, the article defines the theoretical conditions to be satisfied in order to state that one of them - the so-called leader - exercises contagion on the others - the followers. Specifically, we employ an invariant probabilistic result stating that a suitable transformation of a Mixed Poisson Process is still a Mixed Poisson Process. The theoretical claim is validated by implementing an extensive simulation analysis grounded on empirical data. The countries considered are the U.S. (as the leader) and Italy (as the follower) and the period under scrutiny is very large, ranging from 1970 to 2014.
Powerlaw: a Python package for analysis of heavy-tailed distributions.
Alstott, Jeff; Bullmore, Ed; Plenz, Dietmar
2014-01-01
Power laws are theoretically interesting probability distributions that are also frequently used to describe empirical data. In recent years, effective statistical methods for fitting power laws have been developed, but appropriate use of these techniques requires significant programming and statistical insight. In order to greatly decrease the barriers to using good statistical methods for fitting power law distributions, we developed the powerlaw Python package. This software package provides easy commands for basic fitting and statistical analysis of distributions. Notably, it also seeks to support a variety of user needs by being exhaustive in the options available to the user. The source code is publicly available and easily extensible.
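Basic usage of the package, following the interface described in the paper; the input data here are synthetic Pareto draws for illustration.

```python
import numpy as np
import powerlaw

# Synthetic heavy-tailed data via inverse-CDF sampling (alpha ~ 2.5)
data = (1 - np.random.random(10000)) ** (-1 / 1.5)

fit = powerlaw.Fit(data)
print(fit.power_law.alpha, fit.power_law.xmin)  # fitted exponent and cutoff

# Likelihood-ratio comparison against a lognormal alternative
R, p = fit.distribution_compare('power_law', 'lognormal')
```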
Quantifying patterns of research interest evolution
NASA Astrophysics Data System (ADS)
Jia, Tao; Wang, Dashun; Szymanski, Boleslaw
Changing and shifting research interest is an integral part of a scientific career. Despite extensive investigations of various factors that influence a scientist's choice of research topics, quantitative assessments of mechanisms that give rise to macroscopic patterns characterizing research interest evolution of individual scientists remain limited. Here we perform a large-scale analysis of extensive publication records, finding that research interest change follows a reproducible pattern characterized by an exponential distribution. We identify three fundamental features responsible for the observed exponential distribution, which arise from a subtle interplay between exploitation and exploration in research interest evolution. We develop a random walk based model, which adequately reproduces our empirical observations. Our study presents one of the first quantitative analyses of macroscopic patterns governing research interest change, documenting a high degree of regularity underlying scientific research and individual careers.
Liao, J. G.; Mcmurry, Timothy; Berg, Arthur
2014-01-01
Empirical Bayes methods have been extensively used for microarray data analysis by modeling the large number of unknown parameters as random effects. Empirical Bayes allows borrowing information across genes and can automatically adjust for multiple testing and selection bias. However, the standard empirical Bayes model can perform poorly if the assumed working prior deviates from the true prior. This paper proposes a new rank-conditioned inference in which the shrinkage and confidence intervals are based on the distribution of the error conditioned on rank of the data. Our approach is in contrast to a Bayesian posterior, which conditions on the data themselves. The new method is almost as efficient as standard Bayesian methods when the working prior is close to the true prior, and it is much more robust when the working prior is not close. In addition, it allows a more accurate (but also more complex) non-parametric estimate of the prior to be easily incorporated, resulting in improved inference. The new method’s prior robustness is demonstrated via simulation experiments. Application to a breast cancer gene expression microarray dataset is presented. Our R package rank.Shrinkage provides a ready-to-use implementation of the proposed methodology. PMID:23934072
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
Transition mixing study empirical model report
NASA Technical Reports Server (NTRS)
Srinivasan, R.; White, C.
1988-01-01
The empirical model developed in the NASA Dilution Jet Mixing Program has been extended to include the curvature effects of transition liners. This extension is based on the results of a 3-D numerical model generated under this contract. The empirical model results agree well with the numerical model results for all test cases evaluated, although the empirical model shows faster mixing rates than the numerical model. Both models show drift of the jets toward the inner wall of a turning duct. The structure of the jets from the inner wall does not exhibit the familiar kidney-shaped structures observed for the outer-wall jets or for jets injected into rectangular ducts.
The Past, Present and Future of Geodemographic Research in the United States and United Kingdom
Singleton, Alexander D.; Spielman, Seth E.
2014-01-01
This article presents an extensive comparative review of the emergence and application of geodemographics in both the United States and United Kingdom, situating them as an extension of earlier empirically driven models of urban socio-spatial structure. The empirical and theoretical basis for this generalization technique is also considered. Findings demonstrate critical differences in both the application and development of geodemographics between the United States and United Kingdom resulting from their diverging histories, variable data economies, and availability of academic or free classifications. Finally, current methodological research is reviewed, linking this discussion prospectively to the changing spatial data economy in both the United States and United Kingdom. PMID:25484455
Hemakom, Apit; Powezka, Katarzyna; Goverdovsky, Valentin; Jaffer, Usman; Mandic, Danilo P
2017-12-01
A highly localized data-association measure, termed intrinsic synchrosqueezing transform (ISC), is proposed for the analysis of coupled nonlinear and non-stationary multivariate signals. This is achieved based on a combination of noise-assisted multivariate empirical mode decomposition and short-time Fourier transform-based univariate and multivariate synchrosqueezing transforms. It is shown that the ISC outperforms six other combinations of algorithms in estimating degrees of synchrony in synthetic linear and nonlinear bivariate signals. Its advantage is further illustrated in the precise identification of the synchronized respiratory and heart rate variability frequencies among a subset of bass singers of a professional choir, where it distinctly exhibits better performance than the continuous wavelet transform-based ISC. We also introduce an extension to the intrinsic phase synchrony (IPS) measure, referred to as nested intrinsic phase synchrony (N-IPS), for the empirical quantification of physically meaningful and straightforward-to-interpret trends in phase synchrony. The N-IPS is employed to reveal physically meaningful variations in the levels of cooperation in choir singing and performing a surgical procedure. Both the proposed techniques successfully reveal degrees of synchronization of the physiological signals in two different aspects: (i) precise localization of synchrony in time and frequency (ISC), and (ii) large-scale analysis for the empirical quantification of physically meaningful trends in synchrony (N-IPS).
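A stripped-down sketch of the phase-synchrony idea underlying IPS-style measures is given below: a plain phase-locking value via the Hilbert transform on raw signals, without the multivariate EMD stage the paper uses; signals and parameters are hypothetical:

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / 4)            # 60 s sampled at 4 Hz
# Two traces sharing a ~0.25 Hz rhythm (a stand-in for a common breathing
# rate in respiration and HRV signals) plus independent noise.
x = np.sin(2 * np.pi * 0.25 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 0.25 * t + 0.8) + 0.5 * rng.standard_normal(t.size)

phase_x = np.angle(hilbert(x))         # instantaneous phases
phase_y = np.angle(hilbert(y))

# Phase-locking value: modulus of the mean phase-difference phasor,
# near 1 for locked phases and near 0 for independent ones.
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(plv)
```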
Raeven, Vivian M; Spoorenberg, Simone M C; Boersma, Wim G; van de Garde, Ewoudt M W; Cannegieter, Suzanne C; Voorn, G P Paul; Bos, Willem Jan W; van Steenbergen, Jim E
2016-06-17
Microorganisms causing community-acquired pneumonia (CAP) can be categorised as viral, typical, or atypical (Legionella species, Coxiella burnetii, Mycoplasma pneumoniae, and Chlamydia species). Extensive microbiological testing to identify the causative microorganism is not routinely recommended, and empiric treatment does not always cover atypical pathogens. In order to optimize epidemiologic knowledge of CAP and to improve empiric antibiotic choice, we investigated whether atypical microorganisms are associated with a particular season or with the patient characteristics age, gender, or chronic obstructive pulmonary disease (COPD). A data analysis was performed on databases from four prospective studies, all of which included adult patients hospitalised with CAP in the Netherlands (N = 980). All studies performed extensive microbiological testing. A main causative agent was identified in 565/980 (57.7 %) patients. Of these, 117 (20.7 %) were atypical microorganisms. This percentage was 40.4 % (57/141) during the non-respiratory season (week 20 to week 39, early May to early October), and 67.2 % (41/61) for patients under the age of 60 during this season. Factors associated with atypical causative agents were: CAP acquired in the non-respiratory season (odds ratio (OR) 4.3, 95 % CI 2.68-6.84), age <60 years (OR 2.9, 95 % CI 1.83-4.66), male gender (OR 1.7, 95 % CI 1.06-2.71) and absence of COPD (OR 0.2, 95 % CI 0.12-0.52). Atypical causative agents in CAP are thus associated with the non-respiratory season, age <60 years, male gender and absence of COPD. Therefore, to maximise its yield, extensive microbiological testing should be considered in patients <60 years old who are admitted with CAP from early May to early October. NCT00471640 , NCT00170196 (numbers of original studies).
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-15
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PF10-5-000] Empire Pipeline, Inc.; Notice of Intent To Prepare an Environmental Assessment for the Planned Tioga County Extension Project, Request for Comments on Environmental Issues, and Notice of Public Scoping Meeting April 7, 2010. The staff of the Federal Energy...
ERIC Educational Resources Information Center
Liu, Xun
2010-01-01
This study extended the technology acceptance model and empirically tested the new model with wikis, a new type of educational technology. Based on social cognitive theory and the theory of planned behavior, three new variables, wiki self-efficacy, online posting anxiety, and perceived behavioral control, were added to the original technology…
Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W; Müller, Klaus-Robert; Lemm, Steven
2013-01-01
Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study for the US, European, and Hong Kong stock market we show that our proposed method leads to improved portfolio allocation.
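The systematic spectral error referenced here is easy to reproduce. The following sketch (dimensions hypothetical; the DVA algorithm itself is not reproduced) shows how finite samples spread the eigenvalues of a covariance matrix whose true spectrum is perfectly flat:

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 100, 250                    # dimensions vs. observations

# True covariance is the identity, so every population eigenvalue equals 1.
X = rng.standard_normal((n, p))
S = X.T @ X / n                    # sample covariance
eig = np.linalg.eigvalsh(S)

# Finite samples systematically spread the spectrum: the largest sample
# eigenvalues overshoot 1 and the smallest undershoot it, even though
# every true eigenvalue is exactly 1.
print(eig.min(), eig.max())        # roughly (1 - sqrt(p/n))^2 and (1 + sqrt(p/n))^2
```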
An Extension of the Partial Credit Model with an Application to the Measurement of Change.
ERIC Educational Resources Information Center
Fischer, Gerhard H.; Ponocny, Ivo
1994-01-01
An extension to the partial credit model, the linear partial credit model, is considered under the assumption of a certain linear decomposition of the item × category parameters into basic parameters. A conditional maximum likelihood algorithm for estimating basic parameters is presented and illustrated with simulation and an empirical study. (SLD)
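For orientation, the standard partial credit model gives the probability that person v responds in category x of item i, and the linear extension decomposes the item × category parameters into basic parameters with known weights; both are sketched below in conventional notation (the weights q_ijk and basic parameters η_k are written generically, not with the paper's specific design):

```latex
P(X_{vi} = x) =
  \frac{\exp\Big( \sum_{j=0}^{x} (\theta_v - \delta_{ij}) \Big)}
       {\sum_{h=0}^{m_i} \exp\Big( \sum_{j=0}^{h} (\theta_v - \delta_{ij}) \Big)},
\qquad
\delta_{ij} = \sum_{k} q_{ijk}\, \eta_k,
```

where θ_v is the person parameter, δ_ij are the item × category parameters (with the j = 0 term fixed at zero), and m_i is the highest category of item i.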
More than associations: an ideomotor perspective on mirror neurons.
Brass, Marcel; Muhle-Karbe, Paul S
2014-04-01
In this commentary, we propose an extension of the associative approach of mirror neurons, namely, ideomotor theory. Ideomotor theory assumes that actions are controlled by anticipatory representations of their sensory consequences. As we outline below, this extension is necessary to clarify a number of empirical observations that are difficult to explain from a purely associative perspective.
On the prediction of auto-rotational characteristics of light airplane fuselages
NASA Technical Reports Server (NTRS)
Pamadi, B. N.; Taylor, L. W., Jr.
1984-01-01
A semi-empirical theory is presented for the estimation of the aerodynamic forces and moments acting on a steadily rotating (spinning) airplane fuselage, with particular emphasis on the prediction of its auto-rotational behavior. The approach extends the available analytical methods to high angles of attack and sideslip, and couples this procedure with strip theory for application to a rotating airplane fuselage. The analysis is applied to the fuselage of a light general aviation airplane, and the results are shown to be in fair agreement with experimental data.
Two Strategies for Qualitative Content Analysis: An Intramethod Approach to Triangulation.
Renz, Susan M; Carrington, Jane M; Badger, Terry A
2018-04-01
The overarching aim of qualitative research is to gain an understanding of certain social phenomena. Qualitative research involves the studied use and collection of empirical materials, all to describe moments and meanings in individuals' lives. Data derived from these various materials require a form of analysis of the content, focusing on written or spoken language as communication, to provide context and understanding of the message. Qualitative research often involves the collection of data through extensive interviews, note taking, and tape recording. These methods are time- and labor-intensive. With the advances in computerized text analysis software, the practice of combining methods to analyze qualitative data can assist the researcher in making large data sets more manageable and enhance the trustworthiness of the results. This article will describe a novel process of combining two methods of qualitative data analysis, or intramethod triangulation, as a means to provide a deeper analysis of text.
Correcting for population structure and kinship using the linear mixed model: theory and extensions.
Hoffman, Gabriel E
2013-01-01
Population structure and kinship are widespread confounding factors in genome-wide association studies (GWAS). It has been standard practice to include principal components of the genotypes in a regression model in order to account for population structure. More recently, the linear mixed model (LMM) has emerged as a powerful method for simultaneously accounting for population structure and kinship. The statistical theory underlying the differences in empirical performance between modeling principal components as fixed versus random effects has not been thoroughly examined. We undertake an analysis to formalize the relationship between these widely used methods and elucidate the statistical properties of each. Moreover, we introduce a new statistic, effective degrees of freedom, that serves as a metric of model complexity, and a novel low-rank linear mixed model (LRLMM) to learn the dimensionality of the correction for population structure and kinship, and we assess its performance through simulations. A comparison of the results of the LRLMM and a standard LMM analysis applied to GWAS data from the Multi-Ethnic Study of Atherosclerosis (MESA) illustrates how our theoretical results translate into empirical properties of the mixed model. Finally, the analysis demonstrates the ability of the LRLMM to substantially boost the strength of an association for HDL cholesterol in Europeans.
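A minimal sketch of the fixed-effects variant discussed here (principal components as covariates in least squares, on simulated two-population genotypes) follows; all values are hypothetical, and the LMM/LRLMM random-effects machinery is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 500, 200                        # individuals, SNPs

# Two ancestral populations with different allele frequencies create structure.
pop = rng.integers(0, 2, n)
freq = np.where(pop[:, None] == 0, 0.2, 0.6)
G = rng.binomial(2, freq)              # genotype matrix (n x m)

# Phenotype driven by ancestry only: a purely confounded signal.
y = 1.0 * pop + rng.standard_normal(n)

# Principal components of standardized genotypes capture the structure.
Z = (G - G.mean(0)) / (G.std(0) + 1e-9)
pcs = np.linalg.svd(Z, full_matrices=False)[0][:, :2]

# Test one SNP with and without PC covariates via least squares.
snp = Z[:, 0]
for X in (np.column_stack([np.ones(n), snp]),
          np.column_stack([np.ones(n), snp, pcs])):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    print(beta[1])                     # SNP effect shrinks once PCs are included
```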
ERIC Educational Resources Information Center
Oladele, O. I.; Adekoya, A. E.
2006-01-01
This paper examines the implications of farmers' propensity to discontinue the adoption of agricultural technologies in southwestern Nigeria. This is predicated on the fact that extension education process should be proactive in addressing farmers in order to sustain the adoption process. Empirical studies looking at diffusion processes from an…
ERIC Educational Resources Information Center
Sanga, Camilius; Mlozi, Malongo; Haug, Ruth; Tumbo, Siza
2016-01-01
The ubiquitous nature of mobile phones offers a novel environment where farmers can learn informally anywhere, anytime and at any location. This is an innovative way to address some of the weaknesses of conventional agricultural extension services. Few empirical studies have reported on the development of mobile phone applications to support blended…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slunge, Daniel, E-mail: daniel.slunge@economics.gu.se; Tran, Trang Thi Huyen, E-mail: trang2k@yahoo.com
Building on new institutional theory, this paper develops an analytical framework for analyzing constraints to the institutionalization of strategic environmental assessment (SEA) at four different institutional levels. The framework is tested in an empirical analysis of the environmental assessment system in Vietnam, which is a frontrunner among developing countries regarding the introduction and use of SEA. Building on interviews with Vietnamese and international experts, as well as an extensive literature review, we identify institutional constraints which challenge the effective use of SEA in Vietnam. We conclude that commonly identified constraints, such as inadequate training, technical guidelines, baseline data and financial resources, are strongly linked to constraints at higher institutional levels, such as incentives to not share information between ministries and severe restrictions on access to information and public participation. Without a thorough understanding of these institutional constraints, there is a risk that attempts to improve the use of SEA are misdirected. Thus, a careful institutional analysis should guide efforts to introduce and improve the use of SEA in Vietnam and other developing countries. The analytical framework for analyzing constraints to institutionalization of SEA presented in this paper represents a systematic effort in this direction. Highlights: • A framework for analyzing constraints to institutionalizing SEA is developed. • Empirical analysis of the strategic environmental assessment system in Vietnam. • Constraints in the action arena linked to deeper institutional constraints. • Institutional analysis needed prior to introducing SEA in developing countries.
Suppressing disease spreading by using information diffusion on multiplex networks.
Wang, Wei; Liu, Quan-Hui; Cai, Shi-Min; Tang, Ming; Braunstein, Lidia A; Stanley, H Eugene
2016-07-06
Although there is always an interplay between the dynamics of information diffusion and disease spreading, empirical research on the systemic coevolution mechanisms connecting these two spreading dynamics is still lacking. Here we investigate the coevolution mechanisms and dynamics between information and disease spreading by utilizing real data and a proposed spreading model on a multiplex network. Our empirical analysis finds asymmetrical interactions between the information and disease spreading dynamics. Our results, obtained from both the theoretical framework and extensive stochastic numerical simulations, suggest that an information outbreak can be triggered in a communication network by its own spreading dynamics or by a disease outbreak on a contact network, but that the disease threshold is not affected by information spreading. Our key finding is that there is an optimal information transmission rate that markedly suppresses the disease spreading. We find that the time evolution of the dynamics in the proposed model qualitatively agrees with the real-world spreading processes at the optimal information transmission rate.
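A compact discrete-time sketch of this kind of coupled two-layer dynamics is shown below: awareness spreads on a communication layer, SIR disease on a contact layer, and awareness reduces susceptibility. Topologies and rates are hypothetical; this is a generic awareness-disease coupling, not the authors' calibrated model:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(4)
N = 2000
contact = nx.erdos_renyi_graph(N, 0.004, seed=1)   # disease (contact) layer
comm = nx.erdos_renyi_graph(N, 0.008, seed=2)      # information layer

beta, gamma = 0.08, 0.10    # infection / recovery probabilities per step
lam, delta = 0.15, 0.05     # information transmission / forgetting probabilities
protect = 0.3               # infectivity multiplier for aware susceptibles

S, I, R = 0, 1, 2
state = np.zeros(N, dtype=int)
state[rng.choice(N, 5, replace=False)] = I
aware = np.zeros(N, dtype=bool)
aware[state == I] = True

for _ in range(100):
    new_state, new_aware = state.copy(), aware.copy()
    for v in range(N):
        if state[v] == S:
            for u in contact.neighbors(v):
                if state[u] == I and rng.random() < beta * (protect if aware[v] else 1.0):
                    new_state[v] = I
                    new_aware[v] = True      # falling ill makes a node aware
                    break
        elif state[v] == I and rng.random() < gamma:
            new_state[v] = R
        if not aware[v]:
            if any(aware[u] and rng.random() < lam for u in comm.neighbors(v)):
                new_aware[v] = True
        elif state[v] != I and rng.random() < delta:
            new_aware[v] = False             # aware, healthy nodes may forget
    state, aware = new_state, new_aware

print((state == R).sum() / N)                # final epidemic size
```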
Experiments in dilution jet mixing effects of multiple rows and non-circular orifices
NASA Technical Reports Server (NTRS)
Holdeman, J. D.; Srinivasan, R.; Coleman, E. B.; Meyers, G. D.; White, C. D.
1985-01-01
Experimental and empirical model results are presented that extend previous studies of the mixing of single-sided and opposed rows of jets in a confined duct flow to include effects of non-circular orifices and double rows of jets. Analysis of the mean temperature data obtained in this investigation showed that the effects of orifice shape and double rows are significant only in the region close to the injection plane, provided that the orifices are symmetric with respect to the main flow direction. The penetration and mixing of jets from 45-degree slanted slots is slightly less than that from equivalent-area symmetric orifices. The penetration from 2-dimensional slots is similar to that from equivalent-area closely-spaced rows of holes, but the mixing is slower for the 2-D slots. Calculated mean temperature profiles downstream of jets from non-circular and double rows of orifices, made using an extension developed for a previous empirical model, are shown to be in good agreement with the measured distributions.
Salloch, Sabine; Schildmann, Jan; Vollmann, Jochen
2012-04-13
The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such linkage between the empirical research and the normative analysis. In the first part of this paper, we will outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) The complete lack of normative analysis, and (2) cryptonormativity and a missing account with regard to the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration will be presented and how these concepts may contribute to improve the linkage between normative and empirical aspects of empirical research in medical ethics will be demonstrated. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis.
Spencer, James Herbert
2013-04-01
The literature on development has focused on the concept of transition in understanding the emergent challenges facing poor but rapidly developing countries. Scholars have focused extensively on the health and urban transitions associated with this change and, in particular, its use for understanding emerging infectious diseases. However, few have developed explicit empirical measures to quantify the extent to which a transitions focus is useful for theory, policy, and practice. Using open source data on avian influenza in 2004 and 2005 and the Vietnam Census of Population and Housing, this paper introduces the Kuznets curve as a tool for empirically estimating transition and disease. Findings suggest that the Kuznets curve is a viable tool for empirically assessing the role of transitional dynamics in the emergence of new infectious diseases.
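A minimal sketch of the Kuznets-curve test implied here: fit a quadratic in a development index and check for an inverted U. The data below are synthetic placeholders, not the avian influenza or census datasets:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300
dev = rng.uniform(0, 10, n)                      # development / transition index
# Hypothetical inverted-U risk: rises early in the transition, falls later.
risk = 2.0 * dev - 0.2 * dev**2 + rng.normal(0, 1.5, n)

# Kuznets-style test: fit a quadratic and check the squared term is negative.
b2, b1, b0 = np.polyfit(dev, risk, 2)
print(b2 < 0)                  # True -> inverted-U (Kuznets) shape
print(-b1 / (2 * b2))          # turning point of the transition
```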
Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I
2015-11-03
We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis using continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression of each protein is based on a standardized Z-statistic derived from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known Empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with a computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
Research on Liquidity Risk Evaluation of Chinese A-Shares Market Based on Extension Theory
NASA Astrophysics Data System (ADS)
Bai-Qing, Sun; Peng-Xiang, Liu; Lin, Zhang; Yan-Ge, Li
This research defines the liquidity risk of the stock market in terms of matter-element and affair-element theory, establishes an indicator system for the forewarning of liquidity risks, and designs an early-warning model and process using the extension set method, the extension dependent function, and a comprehensive evaluation model. The paper then studies the A-shares market empirically using data on index 1A0001; the results show that the model can well describe the liquidity risk of China's A-share market. Finally, corresponding policy recommendations are given.
NASA Astrophysics Data System (ADS)
Xu, M., III; Liu, X.
2017-12-01
In the past 60 years, both the runoff and the sediment load in the Yellow River Basin have shown significant decreasing trends owing to the influences of human activities and climate change. Quantifying the impact of each factor (e.g. precipitation, sediment-trapping dams, pasture, terraces, etc.) on the runoff and sediment load is among the key issues in guiding the implementation of water and soil conservation measures and in predicting future trends. Hundreds of methods have been developed for studying the runoff and sediment load in the Yellow River Basin. Generally, these methods can be classified into empirical methods and physically based models. The empirical methods, including the hydrological method, the soil and water conservation method, etc., are widely used in Yellow River management engineering. These methods generally apply statistical analyses, such as regression analysis, to build empirical relationships between the main characteristic variables in a river basin. The elasticity method extensively used in hydrological research can also be classified as an empirical method, as it can be shown mathematically to be equivalent to the hydrological method. Physically based models mainly include conceptual models and distributed models. The conceptual models are usually lumped models (e.g. the SYMHD model) and can be regarded as a transition between empirical models and distributed models. The literature shows that fewer studies have applied distributed models than empirical models, as the runoff and sediment loads simulated by distributed models (e.g. the Digital Yellow Integrated Model, the Geomorphology-Based Hydrological Model) are usually less satisfactory owing to the intensive human activities in the Yellow River Basin. Therefore, this study primarily summarizes the empirical models applied in the Yellow River Basin and theoretically analyzes the main causes of the significantly different results obtained with different empirical research methods. Besides, we put forward an assessment framework for research methods on runoff and sediment load variations in the Yellow River Basin from the points of view of input data, model structure and result output. The assessment framework was then applied to the Huangfuchuan River.
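As a sketch of the elasticity method named above: the precipitation elasticity of runoff, ε = (dQ/Q)/(dP/P), can be estimated as the slope of log runoff on log precipitation. The series below is synthetic, with an elasticity of about 2 built in:

```python
import numpy as np

rng = np.random.default_rng(6)
years = 60
P = rng.normal(450, 80, years).clip(min=100)     # annual precipitation (mm)
# Hypothetical runoff with elasticity ~2 plus multiplicative noise (Q ~ P^2).
Q = 1e-4 * P**2 * np.exp(rng.normal(0, 0.1, years))

# Elasticity as the slope of log Q on log P.
eps = np.polyfit(np.log(P), np.log(Q), 1)[0]
print(eps)   # close to 2 for this synthetic series
```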
Understanding medication compliance and persistence from an economics perspective.
Elliott, Rachel A; Shinogle, Judith A; Peele, Pamela; Bhosle, Monali; Hughes, Dyfrig A
2008-01-01
An increased understanding of the reasons for noncompliance and lack of persistence with prescribed medication is an important step to improve treatment effectiveness, and thus patient health. Explanations have been attempted from epidemiological, sociological, and psychological perspectives. Economic models (utility maximization, time preferences, health capital, bilateral bargaining, stated preference, and prospect theory) may contribute to the understanding of medication-taking behavior. Economic models are applied to medication noncompliance. Traditional consumer choice models under a budget constraint do apply to medication-taking behavior in that increased prices cause decreased utilization. Nevertheless, empiric evidence suggests that budget constraints are not the only factor affecting consumer choice around medicines. Examination of time preference models suggests that the intuitive association between time preference and medication compliance has not been investigated extensively, and has not been proven empirically. The health capital model has theoretical relevance, but has not been applied to compliance. Bilateral bargaining may present an alternative model to concordance of the patient-prescriber relationship, taking account of game-playing by either party. Nevertheless, there is limited empiric evidence to test its usefulness. Stated preference methods have been applied most extensively to medicines use. Evidence suggests that patients' preferences are consistently affected by side effects, and that preferences change over time, with age and experience. Prospect theory attempts to explain how new information changes risk perceptions and associated behavior but has not been applied empirically to medication use. Economic models of behavior may contribute to the understanding of medication use, but more empiric work is needed to assess their applicability.
Estimating Mixture of Gaussian Processes by Kernel Smoothing
Huang, Mian; Li, Runze; Wang, Hansheng; Yao, Weixin
2014-01-01
When the functional data are not homogeneous, e.g., there exist multiple classes of functional curves in the dataset, traditional estimation methods may fail. In this paper, we propose a new estimation procedure for the Mixture of Gaussian Processes, to incorporate both functional and inhomogeneous properties of the data. Our method can be viewed as a natural extension of high-dimensional normal mixtures. However, the key difference is that smoothed structures are imposed for both the mean and covariance functions. The model is shown to be identifiable, and can be estimated efficiently by a combination of the ideas from EM algorithm, kernel regression, and functional principal component analysis. Our methodology is empirically justified by Monte Carlo simulations and illustrated by an analysis of a supermarket dataset. PMID:24976675
Time-frequency analysis: mathematical analysis of the empirical mode decomposition.
DOT National Transportation Integrated Search
2009-01-01
Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...
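A minimal sketch of running EMD on a nonstationary signal, assuming the third-party PyEMD implementation (distributed as the EMD-signal package) is available:

```python
import numpy as np
from PyEMD import EMD   # assumes the third-party EMD-signal package

t = np.linspace(0, 1, 1000)
# Nonstationary test signal: a chirp-like tone plus a slow oscillation.
s = np.sin(2 * np.pi * (5 + 15 * t) * t) + 0.5 * np.sin(2 * np.pi * 2 * t)

imfs = EMD().emd(s)      # sift into intrinsic mode functions (IMFs)
print(imfs.shape)        # (number of extracted modes, len(s))
```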
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lombardo, N.J.; Marseille, T.J.; White, M.D.
TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes include conduction in solid structures, convection across fluid-solid boundaries, and radiation between interacting surfaces. Metal-water reaction kinetics are modeled with empirical relationships to predict the oxidation rates of steam-exposed Zircaloy and uranium metal. The metal-water oxidation models are parabolic in form with an Arrhenius temperature dependence. Uranium oxidation begins when fuel cladding failure occurs; Zircaloy oxidation occurs continuously at temperatures above 1300 °F when metal and steam are available. From the metal-water reactions, the hydrogen generation rate, total hydrogen release, and temporal and spatial distribution of oxide formations are computed. Consumption of steam from the oxidation reactions and the effect of hydrogen on the coolant properties are modeled for independent coolant flow channels. Fission product release from exposed uranium metal in Zircaloy-clad fuel is modeled using empirical time and temperature relationships that consider the release to be subject to oxidation and volatilization/diffusion ("bake-out") release mechanisms. Release of the volatile species iodine (I), tellurium (Te), cesium (Cs), ruthenium (Ru), strontium (Sr), zirconium (Zr), cerium (Ce), and barium (Ba) from uranium metal fuel may be modeled.
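The parabolic rate law with Arrhenius temperature dependence described here can be sketched as follows; K0 and Q are illustrative placeholders, not TRUMP-BD's fitted values:

```python
import numpy as np

# Parabolic oxidation with Arrhenius temperature dependence:
#   d(w^2)/dt = K0 * exp(-Q / (R * T)),  w = oxide mass gain per unit area.
K0, Q, R = 3.0e2, 1.7e5, 8.314   # rate constant, activation energy (J/mol), gas constant

def oxide_growth(T_of_t, t_end, steps=10_000):
    """Integrate the parabolic rate law over a (possibly varying) temperature history."""
    dt = t_end / steps
    w2 = 0.0
    for i in range(steps):
        w2 += K0 * np.exp(-Q / (R * T_of_t(i * dt))) * dt
    return np.sqrt(w2)

# Mass gain after a one-hour linear heat-up from 800 K to 1800 K.
print(oxide_growth(lambda t: 800.0 + 1000.0 * t / 3600.0, 3600.0))
```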
Progress Toward Efficient Laminar Flow Analysis and Design
NASA Technical Reports Server (NTRS)
Campbell, Richard L.; Campbell, Matthew L.; Streit, Thomas
2011-01-01
A multi-fidelity system of computer codes for the analysis and design of vehicles having extensive areas of laminar flow is under development at the NASA Langley Research Center. The overall approach consists of the loose coupling of a flow solver, a transition prediction method and a design module using shell scripts, along with interface modules to prepare the input for each method. This approach allows the user to select the flow solver and transition prediction module, as well as run mode for each code, based on the fidelity most compatible with the problem and available resources. The design module can be any method that designs to a specified target pressure distribution. In addition to the interface modules, two new components have been developed: 1) an efficient, empirical transition prediction module (MATTC) that provides n-factor growth distributions without requiring boundary layer information; and 2) an automated target pressure generation code (ATPG) that develops a target pressure distribution that meets a variety of flow and geometry constraints. The ATPG code also includes empirical estimates of several drag components to allow the optimization of the target pressure distribution. The current system has been developed for the design of subsonic and transonic airfoils and wings, but may be extendable to other speed ranges and components. Several analysis and design examples are included to demonstrate the current capabilities of the system.
An empirical potential for simulating vacancy clusters in tungsten.
Mason, D R; Nguyen-Manh, D; Becquart, C S
2017-12-20
We present an empirical interatomic potential for tungsten, particularly well suited for simulations of vacancy-type defects. We compare energies and structures of vacancy clusters generated with the empirical potential with an extensive new database of values computed using density functional theory, and show that the new potential predicts low-energy defect structures and formation energies with high accuracy. A significant difference from other popular embedded-atom empirical potentials for tungsten is the correct prediction of surface energies. Interstitial properties and short-range pairwise behaviour remain similar to those of the Ackland-Thetford potential on which it is based, making this potential well suited to simulations of microstructural evolution following irradiation damage cascades. Using atomistic kinetic Monte Carlo simulations, we predict vacancy cluster dissociation in the range 1100-1300 K, the temperature range generally associated with stage IV recovery.
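For orientation, potentials of this embedded-atom family write the energy contribution of atom i in the standard EAM form (the specific embedding, density and pair functions fitted in the paper are not reproduced here):

```latex
E_i = F\!\left( \sum_{j \neq i} \rho(r_{ij}) \right)
      + \frac{1}{2} \sum_{j \neq i} \phi(r_{ij}),
```

where F is the embedding function, ρ the electron-density contribution from a neighbour at distance r_ij, and φ the short-range pair term; surface-energy behaviour of the kind noted above is strongly influenced by the shape of F.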
Network accessibility & the evolution of urban employment.
DOT National Transportation Integrated Search
2011-06-01
This research examines the impact of accessibility on the growth of employment centers in the Los Angeles Region between 1980 and 2000. There is extensive empirical documentation of polycentricity, the presence of multiple concentrations of em...
Darwin and Evolutionary Psychology
ERIC Educational Resources Information Center
Ghiselin, Michael T.
1973-01-01
Darwin's views on various psychological behaviors were significant. Basing his conclusions on empirical research, he wrote extensively on the phylogeny of behavior, emotional expression, sexual selection, instincts, evolution of morals, ontogeny of behavior, and genetics of behavior. (PS)
Disorders without borders: current and future directions in the meta-structure of mental disorders.
Carragher, Natacha; Krueger, Robert F; Eaton, Nicholas R; Slade, Tim
2015-03-01
Classification is the cornerstone of clinical diagnostic practice and research. However, the extant psychiatric classification systems are not well supported by research evidence. In particular, extensive comorbidity among putatively distinct disorders flags an urgent need for fundamental changes in how we conceptualize psychopathology. Over the past decade, research has coalesced on an empirically based model that suggests many common mental disorders are structured according to two correlated latent dimensions: internalizing and externalizing. We review and discuss the development of a dimensional-spectrum model which organizes mental disorders in an empirically based manner. We also touch upon changes in the DSM-5 and put forward recommendations for future research endeavors. Our review highlights substantial empirical support for the empirically based internalizing-externalizing model of psychopathology, which provides a parsimonious means of addressing comorbidity. As future research goals, we suggest that the field would benefit from: expanding the meta-structure of psychopathology to include additional disorders, development of empirically based thresholds, inclusion of a developmental perspective, and intertwining genomic and neuroscience dimensions with the empirical structure of psychopathology.
Modeling thermal sensation in a Mediterranean climate—a comparison of linear and ordinal models
NASA Astrophysics Data System (ADS)
Pantavou, Katerina; Lykoudis, Spyridon
2014-08-01
A simple thermo-physiological model of outdoor thermal sensation, adjusted with psychological factors, is developed with the aim of predicting thermal sensation in Mediterranean climates. Microclimatic measurements, together with interviews on personal and psychological conditions, were carried out in a square, a street canyon and a coastal location of the greater urban area of Athens, Greece. Multiple linear and ordinal regression were applied to estimate thermal sensation using either all of the recorded parameters or specific, empirically selected subsets, producing so-called extensive and empirical models, respectively. Meteorological, thermo-physiological and overall models (the latter also considering psychological factors) were developed. Predictions improved when personal and psychological factors were taken into account, as compared with the purely meteorological models. The model based on ordinal regression reproduced extreme values of the thermal sensation vote more adequately than the linear regression one, while the empirical model produced satisfactory results relative to the extensive model. The effects of adaptation and expectation on the thermal sensation vote were introduced in the models by means of exposure time, season, and preferences related to air temperature and irradiation. The assessment of thermal sensation could be a useful criterion in decision making regarding public health, outdoor space planning and tourism.
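A minimal sketch of the linear-versus-ordinal comparison on synthetic votes follows; it assumes statsmodels' OrderedModel for the proportional-odds fit, and the single predictor and scale coding are hypothetical:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(7)
n = 400
air_t = rng.uniform(5, 35, n)            # air temperature (deg C)
latent = 0.25 * air_t + rng.logistic(size=n)
# Thermal sensation vote on an ordinal scale (-2 .. +2, coded 0 .. 4 here).
tsv = np.digitize(latent, [2, 4, 6, 8])

# Linear model treats the vote as continuous and can overshoot the scale ends.
lin = sm.OLS(tsv, sm.add_constant(air_t)).fit()

# Ordinal (proportional-odds) model respects the ordered categories.
ordm = OrderedModel(tsv, air_t[:, None], distr='logit').fit(method='bfgs', disp=False)
print(lin.params, ordm.params[:1])
```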
Conceptual analyses of extensible booms to support a solar sail
NASA Technical Reports Server (NTRS)
Crawford, R. F.; Benton, M. D.
1977-01-01
Extensible booms which could function as the diagonal spars and central mast of an 800 meter square, non-rotating Solar Sailing Vehicle were conceptually designed and analyzed. The boom design concept that was investigated is an extensible lattice boom which is stowed and deployed by elastically coiling and uncoiling its continuous longerons. The seven different free-span lengths in each spar which would minimize the total weights of the spars and mast were determined. Boom weights were calculated by using a semi-empirical formulation which related the overall weight of a boom to the weight of its longerons.
Renovating the Pyramid of Needs: Contemporary Extensions Built Upon Ancient Foundations.
Kenrick, Douglas T; Griskevicius, Vladas; Neuberg, Steven L; Schaller, Mark
2010-05-01
Maslow's pyramid of human needs, proposed in 1943, has been one of the most cognitively contagious ideas in the behavioral sciences. Anticipating later evolutionary views of human motivation and cognition, Maslow viewed human motives as based in innate and universal predispositions. We revisit the idea of a motivational hierarchy in light of theoretical developments at the interface of evolutionary biology, anthropology, and psychology. After considering motives at three different levels of analysis, we argue that the basic foundational structure of the pyramid is worth preserving, but that it should be buttressed with a few architectural extensions. By adding a contemporary design feature, connections between fundamental motives and immediate situational threats and opportunities should be highlighted. By incorporating a classical element, these connections can be strengthened by anchoring the hierarchy of human motives more firmly in the bedrock of modern evolutionary theory. We propose a renovated hierarchy of fundamental motives that serves as both an integrative framework and a generative foundation for future empirical research. © The Author(s) 2010.
Modified linear predictive coding approach for moving target tracking by Doppler radar
NASA Astrophysics Data System (ADS)
Ding, Yipeng; Lin, Xiaoyi; Sun, Ke-Hui; Xu, Xue-Mei; Liu, Xi-Yao
2016-07-01
Doppler radar is a cost-effective tool for moving target tracking, which can support a wide range of civilian and military applications. A modified linear predictive coding (LPC) approach is proposed to increase the target localization accuracy of Doppler radar. Based on time-frequency analysis of the received echo, the proposed approach first estimates the noise statistics in real time and constructs an adaptive filter to suppress the noise interference. Then, a linear predictive model is applied to extend the available data, which helps improve the resolution of the target localization result. Compared with the traditional LPC method, which decides the extension data length empirically, the proposed approach develops an error array to evaluate the prediction accuracy and thus adjust the optimum extension data length intelligently. Finally, the prediction error array is superimposed on the predictor output to correct the prediction error. A series of experiments is conducted to illustrate the validity and performance of the proposed techniques.
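A baseline sketch of the LPC extension step follows: covariance-method coefficients fitted by least squares, then recursive extrapolation. The helper names and parameters are hypothetical, and the paper's adaptive filtering and error-array correction are not reproduced:

```python
import numpy as np

def lpc_coeffs(x, order):
    # Least-squares fit of x[t] ~ sum_k a[k] * x[t-k] (covariance-method LPC).
    rows = [x[t - order:t][::-1] for t in range(order, len(x))]
    A, b = np.array(rows), x[order:]
    return np.linalg.lstsq(A, b, rcond=None)[0]

def lpc_extend(x, order, n_ahead):
    # Extrapolate by feeding predictions back into the predictor.
    a = lpc_coeffs(x, order)
    out = list(x)
    for _ in range(n_ahead):
        out.append(float(np.dot(a, out[-order:][::-1])))
    return np.array(out[len(x):])

t = np.arange(256)
sig = np.sin(2 * np.pi * 0.02 * t)        # stand-in for a slow Doppler echo
pred = lpc_extend(sig, order=12, n_ahead=64)
truth = np.sin(2 * np.pi * 0.02 * np.arange(256, 320))
print(np.max(np.abs(pred - truth)))       # small extrapolation error
```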
Villar, S E J; Edwards, H G M
2005-05-01
Seventy-five specimens from thirty fragments of Roman villa wall-paintings from sites in Burgos, Castilla y León, Spain, have been analysed by Raman spectroscopy. This is the first time that a Raman spectroscopic study of Roman wall-paintings from Spain has been reported. The extensive range of tonalities and colour compositions contrasts with the results found in other provinces of the Roman Empire, for example Romano-British villas. Calcite, aragonite, haematite, caput mortuum, cinnabar, limonite, goethite, cuprorivaite, lazurite, green earth, carbon and verdigris have been found as pigments. Some mineral mixtures with different tonalities have been made using strategies different from those more usually found. Of particular interest is the assignment of the Tarna mine as the origin of the cinnabar used to obtain the red colour in some of the specimens analysed here. The wide range of colours, tonalities and minerals found in some of the sites studied in this work is suggestive of a high social status for the community.
Marino, Nicholas Dos Anjos Cristiano; Romero, Gustavo Quevedo; Farjalla, Vinicius Fortes
2018-03-01
Ecologists have extensively investigated the effect of warming on consumer-resource interactions, with experiments revealing that warming can strengthen, weaken or have no net effect on top-down control of resources. These experiments have inspired a body of theoretical work to explain the variation in the effect of warming on top-down control. However, there has been no quantitative attempt to reconcile theory with outcomes from empirical studies. To address the gap between theory and experiment, we performed a meta-analysis to examine the combined effect of experimental warming and top-down control on resource biomass and determined potential sources of variation across experiments. We show that differences in experimental outcomes are related to systematic variation in the geographical distribution of studies. Specifically, warming strengthened top-down control when experiments were conducted in colder regions, but had the opposite effect in warmer regions. Furthermore, we found that differences in the thermoregulation strategy of the consumer and openness of experimental arenas to dispersal can contribute to some deviation from the overall geographical pattern. These results reconcile empirical findings and support the expectation of geographical variation in the response of consumer-resource interactions to warming. © 2018 John Wiley & Sons Ltd/CNRS.
Analysis of transitional separation bubbles on infinite swept wings
NASA Technical Reports Server (NTRS)
Davis, R. L.; Carter, J. E.
1986-01-01
A previously developed two-dimensional local inviscid-viscous interaction technique for the analysis of airfoil transitional separation bubbles, ALESEP (Airfoil Leading Edge Separation), has been extended for the calculation of transitional separation bubbles over infinite swept wings. As part of this effort, Roberts' empirical correlation, which is interpreted as a separated-flow empirical extension of Mack's stability theory for attached flows, has been incorporated into the ALESEP procedure for the prediction of the transition location within the separation bubble. In addition, the viscous procedure used in the ALESEP technique has been modified to allow for wall suction. A series of two-dimensional calculations is presented as verification of the prediction capability of the interaction technique with Roberts' transition model. Numerical tests have shown that this two-dimensional natural transition correlation may also be applied to transitional separation bubbles over infinite swept wings. Results of the interaction procedure are compared with Horton's detailed experimental data for separated flow over a swept plate, which demonstrates the accuracy of the present technique. Wall suction has been applied to a similar interaction calculation to demonstrate its effect on the separation bubble. The principal conclusion of this paper is that the prediction of transitional separation bubbles over two-dimensional or infinite swept geometries is now possible using the present interacting boundary-layer approach.
Pocock, Nicola S; Phua, Kai Hong
2011-05-04
Medical tourism is a growing phenomenon with policy implications for health systems, particularly of destination countries. Private actors and governments in Southeast Asia are promoting the medical tourist industry, but the potential impact on health systems, particularly in terms of equity in access and availability for local consumers, is unclear. This article presents a conceptual framework that outlines the policy implications of medical tourism's growth for health systems, drawing on the cases of Thailand, Singapore and Malaysia, three regional hubs for medical tourism, via an extensive review of academic and grey literature. Variables for further analysis of the potential impact of medical tourism on health systems are also identified. The framework can provide a basis for empirical, in-country studies weighing the benefits and disadvantages of medical tourism for health systems. The policy implications described are of particular relevance for policymakers and industry practitioners in other Southeast Asian countries with similar health systems where governments have expressed interest in facilitating the growth of the medical tourist industry. This article calls for a universal definition of medical tourism and medical tourists to be enunciated, as well as concerted data collection efforts to be undertaken, prior to any meaningful empirical analysis of medical tourism's impact on health systems.
The psychobiological theory of temperament and character: comment on Farmer and Goldberg (2008).
Cloninger, C Robert
2008-09-01
The revised Temperament and Character Inventory (TCI-R) is the third stage of development of a widely used multiscale personality inventory that began with the Tridimensional Personality Questionnaire (TPQ) and then the Temperament and Character Inventory (TCI). The author describes the third stage of the psychobiological theory of temperament and character; empirical tests of its predictions from genetics, neurobiology, psychosocial development, and clinical studies; and empirical findings that stimulated incremental changes in theory and test construction. Linear factor analysis is an inadequate method for evaluating the nonlinear and dynamical nature of the intrapsychic processes that influence human personality. Traits derived by factor analysis under the doubtful assumption of linearity are actually heterogeneous composites of rational and emotional processes that differ fundamentally in their underlying brain processes. The predictions of the psychobiological theory are strongly validated by extensive data from genetics, neurobiology, longitudinal studies of development, and clinical assessment. The distinction between temperament and character allows the TCI and TCI-R to outperform other popular personality inventories in distinguishing individuals with personality disorders from others and in describing the developmental path to well-being in terms of dynamical processes within the individual that are useful for both research and clinical practice. (c) 2008 APA, all rights reserved.
High throughput film dosimetry in homogeneous and heterogeneous media for a small animal irradiator
Wack, L.; Ngwa, W.; Tryggestad, E.; Tsiamas, P.; Berbeco, R.; Ng, S.K.; Hesser, J.
2013-01-01
Purpose We have established a high-throughput Gafchromic film dosimetry protocol for narrow kilovoltage beams in homogeneous and heterogeneous media for small-animal radiotherapy applications. The kV beam characterization is based on extensive Gafchromic film dosimetry data acquired in homogeneous and heterogeneous media. An empirical model is used for parameterization of the depth and off-axis dependence of the measured data. Methods We have modified previously published methods of film dosimetry to suit the specific tasks of the study. Unlike film protocols used in previous studies, our protocol employs simultaneous multichannel scanning and analysis of up to nine Gafchromic films per scan. A scanner and background correction were implemented to improve the accuracy of the measurements. Measurements were taken in homogeneous and inhomogeneous phantoms at 220 kVp and a field size of 5 × 5 mm². The results were compared against Monte Carlo (MC) simulations. Results Dose differences caused by variations in background signal were effectively removed by the corrections applied. Measurements in homogeneous phantoms were used to empirically characterize beam data in homogeneous and heterogeneous media. Film measurements in inhomogeneous phantoms and their empirical parameterization differed by about 2%-3%. The model differed from MC by about 1% (water, lung) to 7% (bone). Good agreement was found for measured and modelled off-axis ratios. Conclusions EBT2 films are a valuable tool for characterization of narrow kV beams, though care must be taken to eliminate disturbances caused by varying background signals. The usefulness of the empirical beam model in interpretation and parameterization of film data was demonstrated. PMID:23510532
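The abstract does not give the functional form of the empirical beam model, so the following is only a minimal sketch of the parameterization step: fitting an assumed exponential depth-dose curve to film-derived points with scipy. The function form, data points, and starting values are all hypothetical, not the authors' actual model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical empirical parameterization: exponential depth dependence.
# This is an illustrative form only, not the study's parameterization.
def depth_dose(z_mm, d0, mu):
    """Relative dose versus depth z (mm): simple exponential attenuation."""
    return d0 * np.exp(-mu * z_mm)

# Example film-derived depth-dose points (made-up numbers for illustration).
z = np.array([0.0, 2.0, 5.0, 10.0, 15.0, 20.0])
dose = np.array([1.00, 0.93, 0.82, 0.66, 0.54, 0.44])

popt, pcov = curve_fit(depth_dose, z, dose, p0=[1.0, 0.04])
d0_fit, mu_fit = popt
print(f"fitted D0 = {d0_fit:.3f}, mu = {mu_fit:.4f} /mm")
```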
Symbiotic empirical ethics: a practical methodology.
Frith, Lucy
2012-05-01
Like any discipline, bioethics is a developing field of academic inquiry; and recent trends in scholarship have been towards more engagement with empirical research. This 'empirical turn' has provoked extensive debate over how such 'descriptive' research carried out in the social sciences contributes to the distinctively normative aspect of bioethics. This paper will address this issue by developing a practical research methodology for the inclusion of data from social science studies into ethical deliberation. This methodology will be based on a naturalistic conception of ethical theory that sees practice as informing theory just as theory informs practice - the two are symbiotically related. From this engagement with practice, the ways that such theories need to be extended and developed can be determined. This is a practical methodology for integrating theory and practice that can be used in empirical studies, one that uses ethical theory both to explore the data and to draw normative conclusions. © 2010 Blackwell Publishing Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foight, Dillon R.; Slane, Patrick O.; Güver, Tolga
We present a comprehensive study of interstellar X-ray extinction using the extensive Chandra supernova remnant (SNR) archive and use our results to refine the empirical relation between the hydrogen column density and optical extinction. In our analysis, we make use of the large, uniform data sample to assess various systematic uncertainties in the measurement of the interstellar X-ray absorption. Specifically, we address systematic uncertainties that originate from (i) the emission models used to fit SNR spectra; (ii) the spatial variations within individual remnants; (iii) the physical conditions of the remnant such as composition, temperature, and non-equilibrium regions; and (iv) the model used for the absorption of X-rays in the interstellar medium. Using a Bayesian framework to quantify these systematic uncertainties, and combining the resulting hydrogen column density measurements with the measurements of optical extinction toward the same remnants, we find the empirical relation N_H = (2.87 ± 0.12) × 10^21 A_V cm^-2, which is significantly higher than the previous measurements.
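Given the refined relation quoted above, converting an optical extinction to a hydrogen column density is a one-line calculation. The sketch below simply encodes the published slope and propagates its quoted 1-sigma uncertainty; the function name and example input are illustrative.

```python
# Convert optical extinction A_V (mag) to hydrogen column density N_H
# using the empirical relation quoted above:
#   N_H = (2.87 +/- 0.12) x 10^21 A_V cm^-2
def nh_from_av(a_v, slope=2.87e21, slope_err=0.12e21):
    nh = slope * a_v
    return nh, slope_err * a_v  # value and 1-sigma uncertainty from the slope

nh, err = nh_from_av(1.5)
print(f"A_V = 1.5 mag -> N_H = {nh:.2e} +/- {err:.2e} cm^-2")
```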
Rossi, Fabrizio; Barth, James R; Cebula, Richard J
2018-06-01
The data presented in this article are related to the research article entitled "Do shareholder coalitions affect agency costs? Evidence from Italian-listed companies", Research in International Business and Finance, Forthcoming (Rossi et al., 2018) [1]. The study presents an empirical analysis using an extensive balanced panel dataset of 163 Italian listed companies for the period 2002-2013, a sample yielding 1956 firm-year observations. The sample consists primarily of manufacturing firms, but also includes some service enterprises. However, all financial firms and regulated utilities are excluded. We collected data on ownership structure for the entire study period. Information was acquired from the Consob website and the individual company reports on corporate governance. Data on firm-level indicators (debt-to-capital ratio, firm size, and age of the firm) for all companies in the sample were collected from Datastream, Bloomberg, and Calepino dell'Azionista, as well as obtained manually from the financial statements of the individual companies being studied. Our dataset contains several measures of ownership structure for Italian listed companies.
Distinct timing mechanisms produce discrete and continuous movements.
Huys, Raoul; Studenka, Breanna E; Rheaume, Nicole L; Zelaznik, Howard N; Jirsa, Viktor K
2008-04-25
The differentiation of discrete and continuous movement is one of the pillars of motor behavior classification. Discrete movements have a definite beginning and end, whereas continuous movements do not have such discriminable end points. In the past decade there has been vigorous debate over whether this classification implies different control processes. To date, this debate has been empirically based. Here, we present an unambiguous non-empirical classification, based on theorems in dynamical systems theory, that sets discrete and continuous movements apart. Through computational simulations of representative modes of each class and topological analysis of the flow in state space, we show that distinct control mechanisms underwrite discrete and fast rhythmic movements. In particular, we demonstrate that discrete movements require a time keeper while fast rhythmic movements do not. We validate our computational findings experimentally using a behavioral paradigm in which human participants performed finger flexion-extension movements at various movement paces and under different instructions. Our results demonstrate that the human motor system employs different timing control mechanisms (presumably via differential recruitment of neural subsystems) to accomplish varying behavioral functions such as speed constraints.
Taking Innovation To Scale In Primary Care Practices: The Functions Of Health Care Extension.
Ono, Sarah S; Crabtree, Benjamin F; Hemler, Jennifer R; Balasubramanian, Bijal A; Edwards, Samuel T; Green, Larry A; Kaufman, Arthur; Solberg, Leif I; Miller, William L; Woodson, Tanisha Tate; Sweeney, Shannon M; Cohen, Deborah J
2018-02-01
Health care extension is an approach to providing external support to primary care practices with the aim of diffusing innovation. EvidenceNOW was launched to rapidly disseminate and implement evidence-based guidelines for cardiovascular preventive care in the primary care setting. Seven regional grantee cooperatives provided the foundational elements of health care extension-technological and quality improvement support, practice capacity building, and linking with community resources-to more than two hundred primary care practices in each region. This article describes how the cooperatives varied in their approaches to extension and provides early empirical evidence that health care extension is a feasible and potentially useful approach for providing quality improvement support to primary care practices. With investment, health care extension may be an effective platform for federal and state quality improvement efforts to create economies of scale and provide practices with more robust and coordinated support services.
MPI Runtime Error Detection with MUST: Advances in Deadlock Detection
Hilbrich, Tobias; Protze, Joachim; Schulz, Martin; ...
2013-01-01
The widely used Message Passing Interface (MPI) is complex and rich. As a result, application developers require automated tools to avoid and to detect MPI programming errors. We present the Marmot Umpire Scalable Tool (MUST) that detects such errors with significantly increased scalability. We present improvements to our graph-based deadlock detection approach for MPI, which cover future MPI extensions. Our enhancements also check complex MPI constructs that no previous graph-based detection approach handled correctly. Finally, we present optimizations for the processing of MPI operations that reduce runtime deadlock detection overheads. Existing approaches often require O(p) analysis time per MPI operation, for p processes. We empirically observe that our improvements lead to sub-linear or better analysis time per operation for a wide range of real world applications.
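MUST targets compiled C/C++/Fortran MPI codes; purely for brevity, the classic kind of error such runtime checkers flag is sketched below in Python with mpi4py. Both ranks block in a receive before either sends, so neither call can ever complete. This is an illustrative deadlock pattern, not an example from the paper.

```python
# Run with: mpiexec -n 2 python deadlock.py
# Classic head-to-head deadlock: both ranks block in recv before sending.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
peer = 1 - rank  # assumes exactly two ranks

# Both processes wait to receive first, so neither ever reaches send().
msg = comm.recv(source=peer)               # blocks forever on both ranks
comm.send(f"hello from rank {rank}", dest=peer)
print(rank, msg)
```

A graph-based detector represents each blocked operation as a node in a wait-for graph and reports the cycle (rank 0 waits on rank 1, rank 1 waits on rank 0) instead of letting the job hang.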
NASA Astrophysics Data System (ADS)
Tiwari, Harinarayan; Sharma, Nayan
2017-05-01
This paper focuses on the need for turbulence measurement, instruments reliable enough to capture turbulence, different turbulence parameters, and advanced methodologies that can decompose turbulence structures at different levels near hydraulic structures. Small-scale turbulence research has valid prospects in open channel flow. The relevance of the study is amplified because any hydraulic structure introduced into the channel disturbs the natural flow and creates discontinuity. To recover this discontinuity, the piano key weir (PKW) might be used with sloped keys. The scarcity of empirical results in the vicinity of the PKW necessitates extensive laboratory experiments with fair and reliable instrumentation techniques. Using principal component analysis, the acoustic Doppler velocimeter was established to be best suited, within certain limitations. Wavelet analysis is proposed as a better way to decompose the underlying turbulence structure.
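As a rough illustration of the proposed wavelet decomposition, the sketch below runs a multilevel discrete wavelet transform on a synthetic velocity record using PyWavelets. The sampling rate, signal, wavelet choice ('db4') and decomposition depth are all assumptions for illustration, not the study's settings.

```python
import numpy as np
import pywt

# Synthetic velocity-fluctuation record standing in for ADV data
# (sampling rate and signal content are illustrative only).
fs = 200.0                                  # Hz, a typical ADV rate
t = np.arange(0, 60, 1 / fs)
u = np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)

# Multilevel discrete wavelet decomposition separates coarse (large-eddy)
# and fine (small-scale) contributions to the fluctuating velocity.
coeffs = pywt.wavedec(u, 'db4', level=5)
for lvl, c in enumerate(coeffs):
    label = "approximation" if lvl == 0 else f"detail level {lvl}"
    print(f"{label}: energy = {np.sum(c**2):.1f}")
```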
Gui, Jiang; Moore, Jason H.; Williams, Scott M.; Andrews, Peter; Hillege, Hans L.; van der Harst, Pim; Navis, Gerjan; Van Gilst, Wiek H.; Asselbergs, Folkert W.; Gilbert-Diamond, Diane
2013-01-01
We present an extension of the two-class multifactor dimensionality reduction (MDR) algorithm that enables detection and characterization of epistatic SNP-SNP interactions in the context of a quantitative trait. The proposed Quantitative MDR (QMDR) method handles continuous data by modifying MDR’s constructive induction algorithm to use a T-test. QMDR replaces the balanced accuracy metric with a T-test statistic as the score to determine the best interaction model. We used a simulation to identify the empirical distribution of QMDR’s testing score. We then applied QMDR to genetic data from the ongoing prospective Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. PMID:23805232
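A minimal sketch of the scoring idea described above, with the t-test standing in for MDR's balanced accuracy: cells of the two-SNP genotype table are pooled into high/low groups by comparing each cell's mean trait with the overall mean, and the pooled split is scored with a t-statistic. The data and coding scheme are simulated for illustration; this omits QMDR's cross-validation and permutation steps.

```python
import numpy as np
from itertools import product
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n = 500
snp1 = rng.integers(0, 3, n)   # genotypes coded 0/1/2
snp2 = rng.integers(0, 3, n)
trait = rng.normal(size=n) + 0.5 * ((snp1 == 2) & (snp2 == 0))  # planted epistasis

# Label each two-SNP genotype cell "high" if its mean trait exceeds the
# overall mean, then score the pooled high/low split with a T-test.
overall = trait.mean()
high = np.zeros(n, dtype=bool)
for g1, g2 in product(range(3), range(3)):
    cell = (snp1 == g1) & (snp2 == g2)
    if cell.any() and trait[cell].mean() > overall:
        high[cell] = True

t_stat, _ = ttest_ind(trait[high], trait[~high])
print(f"QMDR-style score for (snp1, snp2): |t| = {abs(t_stat):.2f}")
```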
Length of hospitalization and outcome of commitment and recommitment hearings.
Parry, C D; Turkheimer, E
1992-01-01
Despite extensive legislative reformulation of civil commitment procedures, empirical studies have shown that civil commitment hearings continue to be largely nonadversarial. The authors observed all civil commitment hearings during a three-month period at a large state hospital in Virginia and examined the characteristics of patients and the actions of attorneys, clinical examiners, and judges as a function of the length of time the patient had been in the hospital. The analysis revealed that as the length of a patient's hospitalization increased, the hearings became shorter and less adversarial; patients tended to show fewer signs of acute psychiatric illness and more signs of chronic schizophrenia. The implications of these findings for civil commitment policy are discussed.
"Fuzziness" in the celular interactome: a historical perspective.
Welch, G Rickey
2012-01-01
Some historical background is given for appreciating the impact of the empirical construct known as the cellular protein-protein interactome, a seemingly de novo entity that has arisen of late within the context of postgenomic systems biology. The approach here builds on a generalized principle of "fuzziness" in protein behavior, proposed by Tompa and Fuxreiter (1). Recent controversies in the analysis and interpretation of the interactome studies are rationalized historically under the auspices of this concept. There is an extensive literature on protein-protein interactions, dating to the mid-1900s, which may help clarify the "fuzziness" in the interactome picture and, also, provide a basis for understanding the physiological importance of protein-protein interactions in vivo.
Intergenerational Ties in Context: Grandparents Caring for Grandchildren in China
Chen, Feinian; Liu, Guangya; Mair, Christine A.
2012-01-01
Guided by theories and empirical research on intergenerational relationships, we examine the phenomenon of grandparents caring for grandchildren in contemporary China. Using a longitudinal dataset (China Health and Nutrition Survey), we document a high level of structural and functional solidarity in grandparent-grandchildren relationships. Intergenerational solidarity is indicated by a high rate of coresidence between grandchildren and grandparents, a sizable number of skipped-generation households (no parent present), extensive childcare involvement by non-coresidential grandparents, and a large amount of care provided by coresidential grandparents. Multivariate analysis further suggests that grandparents’ childcare load is adaptive to familial needs, as reflected by the characteristics of the household, household members, and work activities of the mothers. PMID:22544978
An optimal control strategy for two-dimensional motion camouflage with non-holonomic constraints.
Rañó, Iñaki
2012-07-01
Motion camouflage is a stealth behaviour observed both in hoverflies and in dragonflies. Existing controllers for mimicking motion camouflage generate this behaviour on an empirical basis or without considering the kinematic motion restrictions present in animal trajectories. This study summarises our formal contributions to solving the generation of motion camouflage as a non-linear optimal control problem. The dynamics of the system capture the kinematic restrictions on the motion of the agents, while the performance index ensures camouflage trajectories. An extensive set of simulations supports the technique, and a novel analysis of the obtained trajectories contributes to our understanding of possible mechanisms for obtaining sensor-based motion camouflage, for instance, in mobile robots.
A comparison of advanced overlay technologies
NASA Astrophysics Data System (ADS)
Dasari, Prasad; Smith, Nigel; Goelzer, Gary; Liu, Zhuan; Li, Jie; Tan, Asher; Koh, Chin Hwee
2010-03-01
The extension of optical lithography to 22nm and beyond by Double Patterning Technology is often challenged by CDU and overlay control. With reduced overlay measurement error budgets in the sub-nm range, relying on traditional Total Measurement Uncertainty (TMU) estimates alone is no longer sufficient. In this paper we report scatterometry overlay measurement data from a set of twelve test wafers, using four different target designs. The TMU of these measurements is under 0.4nm, within the process control requirements for the 22nm node. Comparing the measurement differences between DBO targets (using empirical and model-based analysis) and with image-based overlay data indicates the presence of systematic and random measurement errors that exceed the TMU estimate.
NASA Astrophysics Data System (ADS)
Elliott, R. M.; Gibson, R. A.; Carson, T. B.; Marasco, D. E.; Culligan, P. J.; McGillis, W. R.
2016-07-01
Green roofs have been utilized for urban stormwater management due to their ability to capture rainwater locally. Studies of the most common type, extensive green roofs, have demonstrated that green roofs can retain significant amounts of stormwater, but have also shown variation in seasonal performance. The purpose of this study is to determine how time of year impacts the hydrologic performance of extensive green roofs, considering the covariates of antecedent dry weather period (ADWP), potential evapotranspiration (ET0) and storm event size. To do this, nearly four years of monitoring data from two full-scale extensive green roofs (with differing substrate depths of 100 mm and 31 mm) are analyzed. The annual performance is then modeled using a common empirical relationship between rainfall and green roof runoff, with the addition of Julian day in one approach, ET0 in another, and both ADWP and ET0 in a third approach. Together the monitoring and modeling results confirm that stormwater retention is highest in warmer months, that the green roofs retain more rainfall with longer ADWPs, and that the seasonal variations in behavior are more pronounced for the roof with the thinner media than for the roof with the deeper media. Overall, the ability of seasonal accounting to improve stormwater retention modeling is demonstrated; modification of the empirical model to include ADWP and ET0 improves the model R² from 0.944 to 0.975 for the thinner roof, and from 0.866 to 0.870 for the deeper roof. Furthermore, estimating the runoff with the empirical approach was shown to be more accurate than using a water balance model, with model R² of 0.944 and 0.866 compared to 0.975 and 0.866 for the thinner and deeper roof, respectively. This finding is attributed to the difficulty of accurately parameterizing the water balance model.
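The exact form of the study's empirical model is not given in the abstract; the sketch below shows the generic idea of regressing per-event runoff on rainfall plus the ADWP and ET0 covariates, using ordinary least squares on synthetic data. Coefficients, units, and data are all hypothetical.

```python
import numpy as np

# Illustrative seasonal retention model: per-event runoff as a linear
# function of rainfall, antecedent dry weather period (ADWP), and
# potential evapotranspiration (ET0). All numbers are made up.
rng = np.random.default_rng(1)
rain = rng.uniform(2, 50, 200)          # event rainfall (mm)
adwp = rng.uniform(0.5, 14, 200)        # days since last event
et0 = rng.uniform(0.5, 5, 200)          # mm/day
runoff = np.clip(0.8 * rain - 1.2 * adwp - 2.0 * et0
                 + rng.normal(0, 2, 200), 0, None)

X = np.column_stack([np.ones_like(rain), rain, adwp, et0])
beta, *_ = np.linalg.lstsq(X, runoff, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((runoff - pred) ** 2) / np.sum((runoff - runoff.mean()) ** 2)
print("coefficients:", np.round(beta, 2), " R^2 =", round(r2, 3))
```

Comparing the fit with and without the ADWP and ET0 columns reproduces, in miniature, the R² improvement the study reports from seasonal accounting.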
Hayden, Eric J
2016-08-15
RNA molecules provide a realistic but tractable model of a genotype to phenotype relationship. This relationship has been extensively investigated computationally using secondary structure prediction algorithms. Enzymatic RNA molecules, or ribozymes, offer access to genotypic and phenotypic information in the laboratory. Advancements in high-throughput sequencing technologies have enabled the analysis of sequences in the lab that now rivals what can be accomplished computationally. This has motivated a resurgence of in vitro selection experiments and opened new doors for the analysis of the distribution of RNA functions in genotype space. A body of computational experiments has investigated the persistence of specific RNA structures despite changes in the primary sequence, and how this mutational robustness can promote adaptations. This article summarizes recent approaches that were designed to investigate the role of mutational robustness during the evolution of RNA molecules in the laboratory, and presents theoretical motivations, experimental methods and approaches to data analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
Modified Distribution-Free Goodness-of-Fit Test Statistic.
Chun, So Yeon; Browne, Michael W; Shapiro, Alexander
2018-03-01
Covariance structure analysis and its structural equation modeling extensions have become one of the most widely used methodologies in social sciences such as psychology, education, and economics. An important issue in such analysis is to assess the goodness of fit of a model under analysis. One of the most popular test statistics used in covariance structure analysis is the asymptotically distribution-free (ADF) test statistic introduced by Browne (Br J Math Stat Psychol 37:62-83, 1984). The ADF statistic can be used to test models without any specific distribution assumption (e.g., multivariate normal distribution) of the observed data. Despite its advantage, it has been shown in various empirical studies that unless sample sizes are extremely large, this ADF statistic could perform very poorly in practice. In this paper, we provide a theoretical explanation for this phenomenon and further propose a modified test statistic that improves the performance in samples of realistic size. The proposed statistic deals with the possible ill-conditioning of the involved large-scale covariance matrices.
Theory of earthquakes interevent times applied to financial markets
NASA Astrophysics Data System (ADS)
Jagielski, Maciej; Kutner, Ryszard; Sornette, Didier
2017-10-01
We analyze the probability density function (PDF) of waiting times between financial loss exceedances. The empirical PDFs are fitted with the self-excited Hawkes conditional Poisson process with a long power law memory kernel. The Hawkes process is the simplest extension of the Poisson process that takes into account how past events influence the occurrence of future events. By analyzing the empirical data for 15 different financial assets, we show that the formalism of the Hawkes process used for earthquakes can successfully model the PDF of interevent times between successive market losses.
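As a sketch of the model named above, the following simulates a self-excited Hawkes process with a power-law memory kernel via Ogata's thinning algorithm and reports the empirical interevent times. The parameter values and kernel normalization are illustrative assumptions, not the fitted values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
mu, K, c, eps = 0.2, 0.1, 0.5, 0.3   # illustrative Hawkes parameters

def intensity(t, events):
    """Conditional intensity: baseline plus power-law decaying excitation."""
    past = events[events < t]
    return mu + np.sum(K / (t - past + c) ** (1.0 + eps))

# Ogata's thinning algorithm: propose with an upper-bounding rate, accept
# with probability intensity/bound. The kernel decays between events, so
# the current intensity (plus the maximal jump K/c^(1+eps)) is a valid bound.
events, t, T = np.array([]), 0.0, 2000.0
while t < T:
    lam_bar = intensity(t, events) + K / c ** (1.0 + eps)
    t += rng.exponential(1.0 / lam_bar)
    if t < T and rng.uniform() < intensity(t, events) / lam_bar:
        events = np.append(events, t)

waits = np.diff(events)
print(f"{events.size} events; mean interevent time = {waits.mean():.2f}")
```

A histogram of `waits` gives the empirical interevent-time PDF that, in the paper's application, is compared against the Hawkes prediction for market loss exceedances.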
Shear in high strength concrete bridge girders : technical report.
DOT National Transportation Integrated Search
2013-04-01
Prestressed Concrete (PC) I-girders are used extensively as the primary superstructure components in Texas highway bridges. A simple semi-empirical equation was developed at the University of Houston (UH) to predict the shear strength of PC I-girde...
NASA Technical Reports Server (NTRS)
Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)
2000-01-01
This report describes work performed on Contract NAS3-27720 AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.
Mathematical Modelling as a Professional Task
ERIC Educational Resources Information Center
Frejd, Peter; Bergsten, Christer
2016-01-01
Educational research literature on mathematical modelling is extensive. However, not much attention has been paid to empirical investigations of its scholarly knowledge from the perspective of didactic transposition processes. This paper reports from an interview study of mathematical modelling activities involving nine professional model…
Equal Work, Unequal Pay: Gender Discrimination within Work-Similar Occupations.
ERIC Educational Resources Information Center
Kemp, Alice Abel; Beck, E. M.
1986-01-01
Describes an empirical method to identify work-similar occupations using selected measures from the Dictionary of Occupational Titles. Examines male-female earnings differences within a group of work-similar occupations and finds that discrimination against females is extensive. (Author/CH)
Goldstein, Naomi E. S.; Kemp, Kathleen A.; Leff, Stephen S.; Lochman, John E.
2014-01-01
The use of manual-based interventions tends to improve client outcomes and promote replicability. With an increasingly strong link between funding and the use of empirically supported prevention and intervention programs, manual development and adaptation have become research priorities. As a result, researchers and scholars have generated guidelines for developing manuals from scratch, but there are no extant guidelines for adapting empirically supported, manualized prevention and intervention programs for use with new populations. Thus, this article proposes step-by-step guidelines for the manual adaptation process. It also describes two adaptations of an extensively researched anger management intervention to exemplify how an empirically supported program was systematically and efficiently adapted to achieve similar outcomes with vastly different populations in unique settings. PMID:25110403
Asymptotics of empirical eigenstructure for high dimensional spiked covariance.
Wang, Weichen; Fan, Jianqing
2017-06-01
We derive the asymptotic distributions of the spiked eigenvalues and eigenvectors under a generalized and unified asymptotic regime, which takes into account the magnitude of spiked eigenvalues, sample size, and dimensionality. This regime allows high dimensionality and diverging eigenvalues and provides new insights into the roles that the leading eigenvalues, sample size, and dimensionality play in principal component analysis. Our results are a natural extension of those in Paul (2007) to a more general setting and solve the rates of convergence problems in Shen et al. (2013). They also reveal the biases of estimating leading eigenvalues and eigenvectors by using principal component analysis, and lead to a new covariance estimator for the approximate factor model, called shrinkage principal orthogonal complement thresholding (S-POET), that corrects the biases. Our results are successfully applied to outstanding problems in estimation of risks of large portfolios and false discovery proportions for dependent test statistics and are illustrated by simulation studies.
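The eigenvalue bias the paper corrects is easy to see numerically: in the high-dimensional regime the leading sample eigenvalue of a spiked covariance overshoots the truth by roughly p/n. The sketch below demonstrates this with a single planted spike; dimensions and spike size are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, spike = 1000, 100, 50.0      # dimension much larger than sample size

# Population covariance: identity plus one spiked direction u with
# eigenvalue `spike`; rows of X then have covariance I + (spike-1) u u^T.
u = np.zeros(p); u[0] = 1.0
X = rng.normal(size=(n, p)) + np.sqrt(spike - 1.0) * rng.normal(size=(n, 1)) * u

sample_eigs = np.linalg.eigvalsh(X.T @ X / n)
print("true leading eigenvalue  :", spike)
print("sample leading eigenvalue:", round(sample_eigs[-1], 1))
# The sample eigenvalue is inflated by roughly p/n = 10 relative to the
# truth; this is the bias that shrinkage estimators such as S-POET correct.
```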
MOVING BEYOND COLOR: THE CASE FOR MULTISPECTRAL IMAGING IN BRIGHTFIELD PATHOLOGY.
Cukierski, William J; Qi, Xin; Foran, David J
2009-01-01
A multispectral camera is capable of imaging a histologic slide at narrow bandwidths over the range of the visible spectrum. While several uses for multispectral imaging (MSI) have been demonstrated in pathology [1, 2], there is no unified consensus over when and how MSI might benefit automated analysis [3, 4]. In this work, we use a linear-algebra framework to investigate the relationship between the spectral image and its standard-image counterpart. The multispectral "cube" is treated as an extension of a traditional image in a high-dimensional color space. The concept of metamers is introduced and used to derive regions of the visible spectrum where MSI may provide an advantage. Furthermore, histological stains which are amenable to analysis by MSI are reported. We show the Commission internationale de l'éclairage (CIE) 1931 transformation from spectrum to color is non-neighborhood preserving. Empirical results are demonstrated on multispectral images of peripheral blood smears.
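The metamer argument above is linear algebra: projecting a 31-band spectrum down to 3 tristimulus values leaves a 28-dimensional null space of spectral differences that are invisible to a color sensor. The sketch below demonstrates this with a stand-in projection matrix; real CIE 1931 color-matching function values would be loaded from tabulated data rather than generated randomly.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(3)

# Stand-in for colour-matching functions sampled at 31 wavelengths
# (e.g. 400-700 nm in 10 nm steps): a 3 x 31 projection matrix.
M = np.abs(rng.normal(size=(3, 31)))

spectrum = np.abs(rng.normal(size=31))        # some reflectance spectrum
N = null_space(M)                             # 28-dimensional metamer space
metamer = spectrum + 0.1 * N[:, 0]            # physically different spectrum

print(np.allclose(M @ spectrum, M @ metamer))  # True: identical tristimulus
# A multispectral camera samples many more than three bands, so it can
# distinguish stains that are metameric to an RGB sensor.
```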
Vu, Duy; Lomi, Alessandro; Mascia, Daniele; Pallotti, Francesca
2017-06-30
The main objective of this paper is to introduce and illustrate relational event models, a new class of statistical models for the analysis of time-stamped data with complex temporal and relational dependencies. We outline the main differences between recently proposed relational event models and more conventional network models based on the graph-theoretic formalism typically adopted in empirical studies of social networks. Our main contribution involves the definition and implementation of a marked point process extension of currently available models. According to this approach, the sequence of events of interest is decomposed into two components: (a) event time and (b) event destination. This decomposition transforms the problem of selection of event destination in relational event models into a conditional multinomial logistic regression problem. The main advantages of this formulation are the possibility of controlling for the effect of event-specific data and a significant reduction in the estimation time of currently available relational event models. We demonstrate the empirical value of the model in an analysis of interhospital patient transfers within a regional community of health care organizations. We conclude with a discussion of how the models we presented help to overcome some of the limitations of statistical models for networks that are currently available. Copyright © 2017 John Wiley & Sons, Ltd.
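A minimal sketch of the destination-choice component described above, written as a conditional multinomial logit: each candidate receiver gets a linear score from event-specific covariates, and the realized receiver's probability is the softmax of those scores. The covariates and coefficients are invented for illustration; the paper's actual specification and estimation machinery are richer.

```python
import numpy as np

def destination_nll(beta, Z_events, chosen):
    """Negative log-likelihood of receiver choices over a set of events.
    Z_events: list of (n_candidates x n_covariates) arrays, one per event,
              e.g. past transfer volume and distance to each hospital.
    chosen:   index of the realized receiver for each event."""
    nll = 0.0
    for Z, j in zip(Z_events, chosen):
        scores = Z @ beta
        scores -= scores.max()                # numerical stability
        nll -= scores[j] - np.log(np.exp(scores).sum())
    return nll

# Tiny worked example: one event, three candidate receivers, two covariates.
Z = [np.array([[1.0, 0.2], [0.0, 1.5], [0.5, 0.3]])]
print(destination_nll(np.array([0.8, -0.4]), Z, chosen=[0]))
```

Minimizing this objective (e.g. with scipy.optimize.minimize) recovers the covariate effects on destination choice, which is exactly the reduction to conditional logistic regression the abstract describes.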
A Divergence Statistics Extension to VTK for Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille
This report follows the series of previous documents ([PT08, BPRT09b, PT09, BPT09, PT10, PB13]), where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
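The report does not specify here which divergence measures the engine implements, so the following is only a generic sketch of the idea: binning an observed sample and computing its Kullback-Leibler divergence from a theoretical "ideal" distribution over the same bins.

```python
import numpy as np
from scipy.stats import norm

# Discrepancy between an observed empirical distribution and a theoretical
# one, measured here with the Kullback-Leibler divergence over shared bins.
rng = np.random.default_rng(7)
sample = rng.normal(0.1, 1.1, 10_000)        # "observed" data

edges = np.linspace(-5, 5, 41)
emp, _ = np.histogram(sample, bins=edges)
emp = emp / emp.sum()                        # empirical bin masses

theo = np.diff(norm.cdf(edges))              # ideal N(0,1) bin masses
theo = theo / theo.sum()

mask = emp > 0                               # skip empty bins (log 0)
kl = np.sum(emp[mask] * np.log(emp[mask] / theo[mask]))
print(f"KL(empirical || N(0,1)) = {kl:.4f}")
```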
Multifractal Cross Wavelet Analysis
NASA Astrophysics Data System (ADS)
Jiang, Zhi-Qiang; Gao, Xing-Lu; Zhou, Wei-Xing; Stanley, H. Eugene
Complex systems are composed of mutually interacting components and the output values of these components usually exhibit long-range cross-correlations. Using wavelet analysis, we propose a method of characterizing the joint multifractal nature of these long-range cross correlations, a method we call multifractal cross wavelet analysis (MFXWT). We assess the performance of the MFXWT method by performing extensive numerical experiments on the dual binomial measures with multifractal cross correlations and the bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. For binomial multifractal measures, we find the empirical joint multifractality of MFXWT to be in approximate agreement with the theoretical formula. For bFBMs, MFXWT may provide spurious multifractality because of the wide spanning range of the multifractal spectrum. We also apply the MFXWT method to stock market indices, and in pairs of index returns and volatilities we find an intriguing joint multifractal behavior. The tests on surrogate series also reveal that the cross correlation behavior, particularly the cross correlation with zero lag, is the main origin of cross multifractality.
Concept analysis: nurse-to-nurse lateral violence.
Embree, Jennifer L; White, Ann H
2010-01-01
The purpose of this paper is to examine the concept of nurse-to-nurse lateral violence (LV). A review of the published literature shows that LV among nurses is significant and results in social, psychological, and physical consequences, negative patient and nursing outcomes, and damaged relationships. An extensive review of literature through Health Source, Cumulative Index to Nursing and Allied Health Literature (CINAHL), ProQuest Health, and Medical Complete was used to determine agreement and disagreement across disciplines and emerging trends. This concept analysis demonstrates that nurse-to-nurse LV is nurse-to-nurse aggression with overtly or covertly directed dissatisfaction toward another. Origins include role issues, oppression, strict hierarchy, disenfranchising work practices, low self-esteem, perceived powerlessness, anger, and circuits of power. The result of this analysis provides guidance for further conceptual and empirical research as well as for clinical practice. Organizations must learn how to eliminate antecedents and provide nurses with skills and techniques to eradicate LV in order to improve the nursing work environment, patient care outcomes, and nurse retention.
Performance bounds for modal analysis using sparse linear arrays
NASA Astrophysics Data System (ADS)
Li, Yuanxin; Pezeshki, Ali; Scharf, Louis L.; Chi, Yuejie
2017-05-01
We study the performance of modal analysis using sparse linear arrays (SLAs), such as nested and co-prime arrays, in both first-order and second-order measurement models. We treat SLAs as constructed from a subset of sensors in a dense uniform linear array (ULA), and characterize the performance loss of SLAs with respect to the ULA due to using far fewer sensors. In particular, we claim that, given the same aperture, in order to achieve comparable performance in terms of the Cramér-Rao bound (CRB) for modal analysis, SLAs require more snapshots: approximately the number of snapshots used by the ULA times the compression ratio in the number of sensors. This is shown analytically for the case with one undamped mode, as well as empirically via extensive numerical experiments for more complex scenarios. Moreover, the misspecified CRB proposed by Richmond and Horowitz is also studied, where SLAs suffer more performance loss than their ULA counterpart.
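The snapshot rule of thumb stated above is simple enough to encode directly; the sketch below does nothing more than restate it as arithmetic, with an invented 64-to-16 sensor thinning as the example.

```python
# Rule of thumb stated above: to match a dense ULA's Cramer-Rao bound at the
# same aperture, a sparse array needs roughly (compression ratio) times more
# snapshots, where the ratio is in the number of sensors.
def required_snapshots(ula_snapshots, n_ula_sensors, n_sla_sensors):
    compression = n_ula_sensors / n_sla_sensors
    return ula_snapshots * compression

# e.g. a 64-sensor ULA aperture thinned to a 16-sensor sparse array:
print(required_snapshots(100, 64, 16))   # -> 400.0 snapshots
```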
Comparative analysis of two discretizations of Ricci curvature for complex networks.
Samal, Areejit; Sreejith, R P; Gu, Jiao; Liu, Shiping; Saucan, Emil; Jost, Jürgen
2018-06-05
We have performed an empirical comparison of two distinct notions of discrete Ricci curvature for graphs or networks, namely, the Forman-Ricci curvature and Ollivier-Ricci curvature. Importantly, these two discretizations of the Ricci curvature were developed based on different properties of the classical smooth notion, and thus, the two notions shed light on different aspects of network structure and behavior. Nevertheless, our extensive computational analysis in a wide range of both model and real-world networks shows that the two discretizations of Ricci curvature are highly correlated in many networks. Moreover, we show that if one considers the augmented Forman-Ricci curvature which also accounts for the two-dimensional simplicial complexes arising in graphs, the observed correlation between the two discretizations is even higher, especially, in real networks. Besides the potential theoretical implications of these observations, the close relationship between the two discretizations has practical implications whereby Forman-Ricci curvature can be employed in place of Ollivier-Ricci curvature for faster computation in larger real-world networks whenever coarse analysis suffices.
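For unweighted graphs without higher-order faces, the combinatorial Forman-Ricci curvature of an edge reduces to a degree formula, which is why it is so much cheaper than the Ollivier version (which solves an optimal-transport problem per edge). A minimal sketch on a standard benchmark graph, assuming this reduced unweighted form:

```python
import networkx as nx

# Combinatorial Forman-Ricci curvature of an edge (u, v) in an unweighted
# graph without 2-cells reduces to F(u, v) = 4 - deg(u) - deg(v).
def forman_curvature(G):
    return {(u, v): 4 - G.degree(u) - G.degree(v) for u, v in G.edges()}

G = nx.karate_club_graph()
F = forman_curvature(G)
print("most negative edges:", sorted(F.items(), key=lambda kv: kv[1])[:3])
# Highly negative edges connect hubs; the paper's observation is that such
# Forman values correlate strongly with the costlier Ollivier-Ricci values.
```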
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta-Schubert, N.; Reyes, M. A.; Tamez, V. A.
2009-04-20
Alpha decay is one of the two main decay modes of the heaviest nuclei (SHE), and constitutes one of the dominant decay modes of highly neutron-deficient medium-mass nuclei ('exotics'). Thus identifying and characterizing the alpha decay chains forms a crucial part of the identification of SHE. We report the extension of the previously developed method for the detailed and systematic investigation of the reliability of the three main extant analytical formulae for alpha decay half-lives: the generalized liquid drop model based formula of Royer et al. (FR), the Sobiczewski-modified semi-empirical Viola-Seaborg formula (VSS) and the recent phenomenological formula of Sobiczewski and Parkhomenko (SP).
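Of the three formulae named, the Viola-Seaborg systematics is the simplest to sketch: a linear function of proton number Z and the alpha Q-value. The parameter values below are the commonly quoted Sobiczewski set; treat them (and the benchmark Q-value) as assumptions to verify against the literature before use.

```python
import math

# Viola-Seaborg systematics for alpha-decay half-lives:
#   log10 T_1/2(s) = (a*Z + b) / sqrt(Q_alpha) + c*Z + d + h_log
# a, b, c, d: commonly quoted Sobiczewski parameters (assumed here);
# h_log is the odd-nucleon hindrance term, 0 for even-even nuclei.
A_VS, B_VS, C_VS, D_VS = 1.66175, -8.5166, -0.20228, -33.9069

def log10_half_life(Z, Q_alpha_mev, h_log=0.0):
    return (A_VS * Z + B_VS) / math.sqrt(Q_alpha_mev) + C_VS * Z + D_VS + h_log

# e.g. 212Po (Z=84, Q_alpha ~ 8.95 MeV), an even-even benchmark with
# T_1/2 ~ 0.3 microseconds; the formula lands within about an order of magnitude.
print(f"log10(T1/2 / s) ~ {log10_half_life(84, 8.95):.2f}")
```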
Malik, Salim S; Lythgoe, Mark P; McPhail, Mark; Monahan, Kevin J
2017-11-30
Around 5% of colorectal cancers are due to mutations within DNA mismatch repair genes, resulting in Lynch syndrome (LS). These mutations have a high penetrance, with early onset of colorectal cancer at a mean age of 45 years. The mainstay of surgical management is either a segmental or extensive colectomy. Currently there is no unified agreement as to which management strategy is superior, due to the limited conclusive empirical evidence available. A systematic review and meta-analysis was conducted to evaluate the risk of metachronous colorectal cancer (MCC) and mortality in LS following segmental and extensive colectomy. A systematic review of the PubMed database was conducted. Studies were included/excluded based on pre-specified criteria. To assess the risk of MCC and mortality attributed to segmental or extensive colectomies, relative risks (RR) and corresponding 95% confidence intervals (CI) were calculated. Publication bias was investigated using funnel plots. Data about mortality, as well as patient ascertainment [Amsterdam criteria (AC), germline mutation (GM)], were also extracted. Statistical analysis was conducted using the R program (version 3.2.3). The literature search identified 85 studies. After further analysis, ten studies were eligible for inclusion in data synthesis. Pooled data identified 1389 patients followed up for a mean of 100.7 months with a mean age of onset of 45.5 years. A total of 1119 patients underwent segmental colectomies, with an absolute risk of MCC in this group of 22.4% at the end of follow-up. The 270 patients who had extensive colectomies had an MCC absolute risk of 4.7% (0% in those with a panproctocolectomy). Segmental colectomy was significantly associated with an increased relative risk of MCC (RR = 5.12; 95% CI 2.88-9.11), although no significant association with mortality was identified (RR = 1.65; 95% CI 0.90-3.02). There was no statistically significant difference in the risk of MCC between AC and GM cohorts (p = 0.5, Chi-squared test). In LS, segmental colectomy results in a significantly increased risk of developing MCC. Although the choice of segmental or extensive colectomy has no statistically significant impact on mortality, the choice of initial surgical management can affect a patient's requirement for further surgery. An extensive colectomy can result in a decreased need for further surgery, reduced hospital stays, and lower associated costs. The significant difference in the risk of MCC following segmental or extensive colectomies should be discussed with patients when deciding appropriate management. An individualised approach should be utilised, taking into account the patient's age, co-morbidities and genotype. In order to determine likely germline-specific effects, or a difference in survival, larger and more comprehensive studies are required.
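For intuition, the relative risk and its confidence interval can be recomputed crudely from the pooled counts quoted above. Note this naive calculation ignores study-level weighting, so it will not reproduce the reported meta-analytic RR of 5.12 exactly; the event counts are back-calculated from the quoted percentages.

```python
import math

# Crude relative risk of metachronous CRC, segmental vs. extensive colectomy,
# from the pooled counts quoted above (illustrative check, not the weighted
# meta-analysis): 22.4% of 1119 vs. 4.7% of 270.
a, n1 = round(0.224 * 1119), 1119    # events / total, segmental
c, n2 = round(0.047 * 270), 270      # events / total, extensive

rr = (a / n1) / (c / n2)
se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)      # SE of log RR
lo, hi = (math.exp(math.log(rr) + s * 1.96 * se) for s in (-1, 1))
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```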
Welch, Vivian; Jull, J; Petkovic, J; Armstrong, R; Boyer, Y; Cuervo, L G; Edwards, Sjl; Lydiatt, A; Gough, D; Grimshaw, J; Kristjansson, E; Mbuagbaw, L; McGowan, J; Moher, D; Pantoja, T; Petticrew, M; Pottie, K; Rader, T; Shea, B; Taljaard, M; Waters, E; Weijer, C; Wells, G A; White, H; Whitehead, M; Tugwell, P
2015-10-21
Health equity concerns the absence of avoidable and unfair differences in health. Randomized controlled trials (RCTs) can provide evidence about the impact of an intervention on health equity for specific disadvantaged populations or in general populations; this is important for equity-focused decision-making. Previous work has identified a lack of adequate reporting guidelines for assessing health equity in RCTs. The objective of this study is to develop guidelines to improve the reporting of health equity considerations in RCTs, as an extension of the Consolidated Standards of Reporting Trials (CONSORT). A six-phase study using integrated knowledge translation governed by a study executive and advisory board will assemble empirical evidence to inform the CONSORT-equity extension. To create the guideline, the following steps are proposed: (1) develop a conceptual framework for identifying "equity-relevant trials," (2) assess empirical evidence regarding reporting of equity-relevant trials, (3) consult with global methods and content experts on how to improve reporting of health equity in RCTs, (4) collect broad feedback and prioritize items needed to improve reporting of health equity in RCTs, (5) establish consensus on the CONSORT-equity extension: the guideline for equity-relevant trials, and (6) broadly disseminate and implement the CONSORT-equity extension. This work will be relevant to a broad range of RCTs addressing questions of effectiveness for strategies to improve practice and policy in the areas of social determinants of health, clinical care, health systems, public health, and international development, where health and/or access to health care is a primary outcome. The outcomes include a reporting guideline (CONSORT-equity extension) for equity-relevant RCTs and a knowledge translation strategy to broadly encourage its uptake and use by journal editors, authors, and funding agencies.
[The Antonine plague: a global pestilence in the 2nd century AD].
Sáez, Andrés
2016-04-01
The Antonine plague was the first plague to affect the Western world globally. It affected all aspects of life in the Roman Empire: the economy, politics, religion and culture. Specialists set the mortality rate at about 10% of the population. Moreover, the cultural and territorial unity of the Roman Empire helped to spread the plague, much as could occur in our society during a similar pandemic. In conclusion, it is argued that the epidemic was global both in its geographical extension and in its effects on the population.
NASA Technical Reports Server (NTRS)
Ellis, Stephen R.; Adelstein, Bernard D.; Yeom, Kiwon
2013-01-01
The Misalignment Effect Function (MEF) describes the decrement in manual performance associated with a rotation between operators' visual display frame of reference and that of their manual control. It has now been empirically determined for rotation axes oblique to canonical body axes and is compared with the MEF previously measured for rotations about canonical axes. A targeting rule, called the Secant Rule, based on these earlier measurements is derived from a hypothetical process and shown to describe some of the data from three previous experiments. It explains the motion trajectories determined for rotations less than 65° in purely kinematic terms, without the need to appeal to a mental rotation process. Further analysis of this rule in three dimensions, applied to oblique rotation axes, leads to the somewhat surprising expectation that the difficulty posed by rotational misalignment should increase as the required movement becomes shorter. This prediction is confirmed. The geometry underlying this rule also suggests analytic extensions for predicting more generally the difficulty of making movements in arbitrary directions subject to arbitrary misalignments.
Cerebral cartography and connectomics
Sporns, Olaf
2015-01-01
Cerebral cartography and connectomics pursue similar goals in attempting to create maps that can inform our understanding of the structural and functional organization of the cortex. Connectome maps explicitly aim at representing the brain as a complex network, a collection of nodes and their interconnecting edges. This article reflects on some of the challenges that currently arise in the intersection of cerebral cartography and connectomics. Principal challenges concern the temporal dynamics of functional brain connectivity, the definition of areal parcellations and their hierarchical organization into large-scale networks, the extension of whole-brain connectivity to cellular-scale networks, and the mapping of structure/function relations in empirical recordings and computational models. Successfully addressing these challenges will require extensions of methods and tools from network science to the mapping and analysis of human brain connectivity data. The emerging view that the brain is more than a collection of areas, but is fundamentally operating as a complex networked system, will continue to drive the creation of ever more detailed and multi-modal network maps as tools for on-going exploration and discovery in human connectomics. PMID:25823870
2012-03-01
EMPIRICAL ANALYSIS OF OPTICAL ATTENUATOR PERFORMANCE IN QUANTUM KEY DISTRIBUTION SYSTEMS USING A... (AFIT/GCS/ENG/12-01; distribution is unlimited). ...challenging as the complexity of actual implementation specifics are considered. Two components common to most quantum key distribution...
Fostering Effective Leadership in Foreign Contexts through Study of Cultural Values
ERIC Educational Resources Information Center
Schenck, Andrew D.
2016-01-01
While leadership styles have been extensively examined, cultural biases implicit within research methodologies often preclude application of results in foreign contexts. To more holistically comprehend the impact of culture on leadership, belief systems were empirically correlated to both transactional and transformational tendencies in public…
Theorizing and Researching Levels of Processing in Self-Regulated Learning
ERIC Educational Resources Information Center
Winne, Philip H.
2018-01-01
Background: Deep versus surface knowledge is widely discussed by educational practitioners. A corresponding construct, levels of processing, has received extensive theoretical and empirical attention in learning science and psychology. In both arenas, lower levels of information and shallower levels of processing are predicted and generally…
Identity Texts and Academic Achievement: Connecting the Dots in Multilingual School Contexts
ERIC Educational Resources Information Center
Cummins, Jim; Hu, Shirley; Markus, Paula; Kristiina Montero, M.
2015-01-01
The construct of "identity text" conjoins notions of identity affirmation and literacy engagement as equally relevant to addressing causes of underachievement among low socioeconomic status, multilingual, and marginalized group students. Despite extensive empirical evidence supporting the impact on academic achievement of both identity…
Study Guide in Health Economics.
ERIC Educational Resources Information Center
Dawson, George; Jablon, Bert
Prepared to assist students at Empire State College in developing learning contracts for the study of the economics of health care delivery, this study guide discusses various aspects of the topic, suggests student projects, and provides an extensive bibliography. First, introductory material discusses the relationship of economics to health care…
Improving Marine Ecosystem Models with Biochemical Tracers
NASA Astrophysics Data System (ADS)
Pethybridge, Heidi R.; Choy, C. Anela; Polovina, Jeffrey J.; Fulton, Elizabeth A.
2018-01-01
Empirical data on food web dynamics and predator-prey interactions underpin ecosystem models, which are increasingly used to support strategic management of marine resources. These data have traditionally derived from stomach content analysis, but new and complementary forms of ecological data are increasingly available from biochemical tracer techniques. Extensive opportunities exist to improve the empirical robustness of ecosystem models through the incorporation of biochemical tracer data and derived indices, an area that is rapidly expanding because of advances in analytical developments and sophisticated statistical techniques. Here, we explore the trophic information required by ecosystem model frameworks (species, individual, and size based) and match them to the most commonly used biochemical tracers (bulk tissue and compound-specific stable isotopes, fatty acids, and trace elements). Key quantitative parameters derived from biochemical tracers include estimates of diet composition, niche width, and trophic position. Biochemical tracers also provide powerful insight into the spatial and temporal variability of food web structure and the characterization of dominant basal and microbial food web groups. A major challenge in incorporating biochemical tracer data into ecosystem models is scale and data type mismatches, which can be overcome with greater knowledge exchange and numerical approaches that transform, integrate, and visualize data.
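One of the tracer-derived parameters named above, trophic position from bulk nitrogen isotopes, follows a standard enrichment relation (widely attributed to Post's formulation). The sketch below encodes it; the enrichment factor, baseline level, and input values are commonly assumed defaults, not values from this paper.

```python
# Trophic position (TP) from bulk nitrogen stable isotopes, using the
# standard relation TP = lambda + (d15N_consumer - d15N_base) / TEF,
# with a commonly assumed trophic enrichment factor (TEF) of ~3.4 permil
# and lambda = 2 for a primary-consumer baseline. Inputs are illustrative.
def trophic_position(d15n_consumer, d15n_base, tef=3.4, lam=2.0):
    return lam + (d15n_consumer - d15n_base) / tef

print(round(trophic_position(d15n_consumer=14.2, d15n_base=6.0), 2))  # ~4.41
```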
Jenkins, Emily K; Kothari, Anita; Bungay, Vicky; Johnson, Joy L; Oliffe, John L
2016-08-30
Much of the research and theorising in the knowledge translation (KT) field has focused on clinical settings, providing little guidance to those working in community settings. In this study, we build on previous research in community-based KT by detailing the theory-driven and empirically informed CollaboraKTion framework. A case study design and ethnographic methods were utilised to gain an in-depth understanding of the processes for conducting a community-based KT study as a means of distilling the CollaboraKTion framework. Drawing on extensive field notes describing fieldwork observations and interactions, as well as evidence from the participatory research and KT literature, we detail the processes and steps undertaken in this community-based KT study, their rationale, and the challenges encountered. In an effort to build upon existing knowledge, Kitson and colleagues' co-KT framework, which provides guidance for conducting KT aimed at addressing population-level health, was applied as a coding structure to inform the current analysis. This approach was selected because it (1) supported the application of an existing community-based KT framework to empirical data and (2) provided an opportunity to contribute to the theory and practice gaps in the community-based KT literature through an inductively derived empirical example. Analysis revealed that community-based KT is an iterative process that can be viewed as comprising five overarching processes: (1) contacting and connecting; (2) deepening understandings; (3) adapting and applying the knowledge base; (4) supporting and evaluating continued action; and (5) transitioning and embedding, as well as several key elements within each of these processes (e.g. building on existing knowledge, establishing partnerships). These empirically informed theory advancements in the KT and participatory research traditions are summarised in the CollaboraKTion framework. We suggest that community-based KT researchers place less emphasis on enhancing uptake of specific interventions and focus on collaboratively identifying and creating changes to the contextual factors that influence health outcomes. The CollaboraKTion framework can be used to guide the development, implementation and evaluation of contextually relevant, evidence-informed initiatives aimed at improving population health, while providing a foundation to leverage future research and practice in this emergent KT area.
AN EMPIRICAL FORMULA FOR THE DISTRIBUTION FUNCTION OF A THIN EXPONENTIAL DISC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Sanjib; Bland-Hawthorn, Joss
2013-08-20
An empirical formula for a Shu distribution function that reproduces a thin disc with exponential surface density to good accuracy is presented. The formula has two free parameters that specify the functional form of the velocity dispersion. Conventionally, this requires the use of an iterative algorithm to produce the correct solution, which is computationally taxing for applications like Markov Chain Monte Carlo model fitting. The formula has been shown to work for flat, rising, and falling rotation curves. Application of this methodology to one of the Dehnen distribution functions is also shown. Finally, an extension of this formula to reproduce velocity dispersion profiles that are an exponential function of radius is also presented. Our empirical formula should greatly aid the efficient comparison of disc models with large stellar surveys or N-body simulations.
Wodarski, John S; Feit, Marvin D
2011-01-01
The problematic behaviors of teenagers and the subsequent negative consequences are extensive and well documented: unwanted pregnancy, substance abuse, violent behavior, depression, and social and psychological consequences of unemployment. In this article, the authors review an approach that uses a cooperative learning, empirically based intervention that employs peers as teachers. This intervention of choice is Teams-Games-Tournaments (TGT), a paradigm backed by five decades of empirical support. The application of TGT in preventive health programs incorporates elements in common with other prevention programs that are based on a public health orientation and constitute the essential components of health education, that is, skills training and practice in applying skills. The TGT intervention supports the idea that children and adolescents from various socioeconomic classes, between the ages of 8 and 18 and in classrooms or groups ranging in size from 4 to 17 members, can work together for one another. TGT has been applied successfully in such diverse areas as adolescent development, sexuality education, psychoactive substance abuse education, anger control, coping with depression and suicide, nutrition, comprehensive employment preparation, and family intervention. This article reviews the extensive research on TGT using examples of successful projects in substance abuse, violence, and nutrition. Issues are raised that relate to the implementation of preventive health strategies for adolescents, including cognitive aspects, social and family networks, and intervention components.
Inducing Fuzzy Models for Student Classification
ERIC Educational Resources Information Center
Nykanen, Ossi
2006-01-01
We report an approach for implementing predictive fuzzy systems that manage capturing both the imprecision of the empirically induced classifications and the imprecision of the intuitive linguistic expressions via the extensive use of fuzzy sets. From end-users' point of view, the approach enables encapsulating the technical details of the…
Literacy, Competence and Meaning-Making: A Human Sciences Approach
ERIC Educational Resources Information Center
Nikolajeva, Maria
2010-01-01
This semiotically informed article problematizes the concept of literacy as an aesthetic activity rather than reading skills and offers strategies for assessing young readers' understanding of fictional texts. Although not based on empirical research, the essay refers to and theorizes from extensive field studies of children's responses to…
Empirical Evaluation of Directional-Dependence Tests
ERIC Educational Resources Information Center
Thoemmes, Felix
2015-01-01
Testing of directional dependence is a method to infer causal direction that recently has attracted some attention. Previous examples by e.g. von Eye and DeShon (2012a) and extensive simulation studies by Pornprasertmanit and Little (2012) have demonstrated that under specific assumptions, directional-dependence tests can recover the true causal…
Rehabilitation Counselor Work Environment: Examining Congruence with Prototypic Work Personality
ERIC Educational Resources Information Center
Zanskas, Stephen; Strohmer, Douglas C.
2010-01-01
The profession of rehabilitation counseling has undergone extensive empirical study. Absent from this research has been a theoretical basis for describing and understanding the profession and its associated work environment. The focus of this study was to further our understanding of the nature of the rehabilitation counselor's work environment…
Child Psychotherapy Dropout: An Empirical Research Review
ERIC Educational Resources Information Center
Deakin, Elisabeth; Gastaud, Marina; Nunes, Maria Lucia Tiellet
2012-01-01
This study aims to discuss the most recent data about child psychotherapy dropout, especially child psychoanalytical psychotherapy. The authors also try to offer some possible alternatives to prevent such a phenomenon. The definition of "child psychotherapy dropout" is extensively discussed. The goal has been to attempt to create a standardised…
Information Sharing in the Field of Design Research
ERIC Educational Resources Information Center
Pilerot, Ola
2015-01-01
Introduction: This paper reports on an extensive research project which aimed at exploring information sharing activities in a scholarly context. The paper presents and synthesises findings from a literature review and three qualitative case studies. The empirical setting is a geographically distributed Nordic network of design scholars. Method:…
Initial Development and Validation of the Global Citizenship Scale
ERIC Educational Resources Information Center
Morais, Duarte B.; Ogden, Anthony C.
2011-01-01
The purpose of this article is to report on the initial development of a theoretically grounded and empirically validated scale to measure global citizenship. The methodology employed is multi-faceted, including two expert face validity trials, extensive exploratory and confirmatory factor analyses with multiple datasets, and a series of three…
Relationship between Defenses, Personality, and Affect during a Stress Task in Normal Adolescents
ERIC Educational Resources Information Center
Steiner, Hans; Erickson, Sarah J.; MacLean, Peggy; Medic, Sanja; Plattner, Belinda; Koopman, Cheryl
2007-01-01
Objective: Although there are extensive data on the relationship between personality and stress reactivity in adults, there is little comparable empirical research with adolescents. This study examines the simultaneous relationships between long term functioning (personality, defenses) and observed stress reactivity (affect) in adolescents.…
ERIC Educational Resources Information Center
Huberman, Bernardo A.; Loch, Christoph H.; Onculer, Ayse
2004-01-01
The striving for status has long been recognized in sociology and economics. Extensive theoretical arguments and empirical evidence propose that people view status as a sign of competence and pursue it as a means to achieve power and resources. A small literature, however, based on arguments from biology and evolutionary psychology, proposes that…
Visual resources and the public: an empirical approach
Rachel Kaplan
1979-01-01
Visual resource management systems incorporate many assumptions about how people see the landscape. While these assumptions are not articulated, they nonetheless affect the decision process. Problems inherent in some of these assumptions are examined. Extensive research based on people's preference ratings of different settings provides insight into people's...
High-resolution, spatially extensive climate grids can be useful in regional hydrologic applications. However, in regions where precipitation is dominated by snow, snowmelt models are often used to account for timing and magnitude of water delivery. We developed an empirical, non...
Olson's "Cognitive Development": A Commentary.
ERIC Educational Resources Information Center
Follettie, Joseph F.
This report is a review of Olson's "Cognitive Development." Unlike a typical book review it does not compare and contrast the author's theoretical framework and methodological practices with those of others in the field, but rather it extensively describes and critiques the reported empirical work. The reasons given for this approach are that…
Assessing the Value of E-Learning Systems
ERIC Educational Resources Information Center
Levy, Yair
2006-01-01
"Assessing the Value of E-Learning Systems" provides an extensive literature review pulling theories from the field of information systems, psychology and cognitive sciences, distance and online learning, as well as marketing and decision sciences. This book provides empirical evidence for the power of measuring value in the context of e-learning…
The Status of Projective Techniques: Or, "Wishing Won't Make It Go Away."
ERIC Educational Resources Information Center
Piotrowski, Chris
The predicted decline in usefulness and emphasis of projective techniques was analyzed from several different perspectives including the academic community, members of the American Psychological Association (APA) Division 12, internship centers, the applied clinical setting, and private practitioners. In addition, an extensive review of empirical,…
NASA Astrophysics Data System (ADS)
Fioretti, Guido
2007-02-01
The production function maps the inputs of a firm or a productive system onto its outputs. This article expounds generalizations of the production function that include state variables, organizational structures and increasing returns to scale. These extensions are needed in order to explain the regularities of the empirical distributions of certain economic variables.
ERIC Educational Resources Information Center
Barrow, Robin
2004-01-01
Recent empirical research into the brain, while reinforcing the view that we are extensively "programmed", does not refute the idea of a distinctive human mind. The human mind is primarily a product of the human capacity for a distinctive kind of language. Human language is thus what gives us our consciousness, reasoning capacity and autonomy. To…
Empirical Validation and Application of the Computing Attitudes Survey
ERIC Educational Resources Information Center
Dorn, Brian; Elliott Tew, Allison
2015-01-01
Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…
Creativity and Flow in Musical Composition: An Empirical Investigation
ERIC Educational Resources Information Center
MacDonald, Raymond; Byrne, Charles; Carlton, Lana
2006-01-01
Although an extensive literature exists on creativity and music, there is a lack of published research investigating possible links between musical creativity and Csikszentmihalyi's concept of flow or optimal experience. This article examines a group composition task to study the relationships between creativity, flow and the quality of the…
Atay, Christina; Conway, Erin R.; Angus, Daniel; Wiles, Janet; Baker, Rosemary; Chenery, Helen J.
2015-01-01
The progressive neuropathology involved in dementia frequently causes a gradual decline in communication skills. Communication partners who are unaware of the specific communication problems faced by people with dementia (PWD) can inadvertently challenge their conversation partner, leading to distress and a reduced flow of information between speakers. Previous research has produced an extensive literature base recommending strategies to facilitate conversational engagement in dementia. However, empirical evidence for the beneficial effects of these strategies on conversational dynamics is sparse. This study uses a time-efficient computational discourse analysis tool called Discursis to examine the link between specific communication behaviours and content-based conversational engagement in 20 conversations between PWD living in residential aged-care facilities and care staff members. Conversations analysed here were baseline conversations recorded before staff members underwent communication training. Care staff members spontaneously exhibited a wide range of facilitative and non-facilitative communication behaviours, which were coded for analysis of conversation dynamics within these baseline conversations. A hybrid approach combining manual coding and automated Discursis metric analysis provides two sets of novel insights. Firstly, this study revealed nine communication behaviours that, if used by the care staff member in a given turn, significantly increased the appearance of subsequent content-based engagement in the conversation by PWD. Secondly, the current findings reveal alignment between human- and computer-generated labelling of communication behaviour for 8 out of the total 22 behaviours under investigation. The approach demonstrated in this study provides an empirical procedure for the detailed evaluation of content-based conversational engagement associated with specific communication behaviours. PMID:26658135
Linking knowledge and action through mental models of sustainable agriculture.
Hoffman, Matthew; Lubell, Mark; Hillis, Vicken
2014-09-09
Linking knowledge to action requires understanding how decision-makers conceptualize sustainability. This paper empirically analyzes farmer "mental models" of sustainability from three winegrape-growing regions of California where local extension programs have focused on sustainable agriculture. The mental models are represented as networks where sustainability concepts are nodes, and links are established when a farmer mentions two concepts in their stated definition of sustainability. The results suggest that winegrape grower mental models of sustainability are hierarchically structured, relatively similar across regions, and strongly linked to participation in extension programs and adoption of sustainable farm practices. We discuss the implications of our findings for the debate over the meaning of sustainability, and the role of local extension programs in managing knowledge systems.
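The network construction described here is a concept co-occurrence graph. As a rough illustration of the idea (not the authors' own code; the data and concept names below are hypothetical), the following sketch builds such a network with networkx from definitions already reduced to sets of concepts:

    from itertools import combinations
    import networkx as nx

    # Each set holds the sustainability concepts one farmer mentioned.
    definitions = [
        {"soil health", "profitability", "water use"},
        {"soil health", "biodiversity"},
        {"profitability", "soil health", "biodiversity"},
    ]

    G = nx.Graph()
    for concepts in definitions:
        for a, b in combinations(sorted(concepts), 2):
            # A link is established when a farmer mentions two concepts together;
            # the weight counts how many farmers co-mention the pair.
            if G.has_edge(a, b):
                G[a][b]["weight"] += 1
            else:
                G.add_edge(a, b, weight=1)

    # Degree centrality as a rough proxy for how core a concept is to the
    # shared mental model; hierarchical structure can then be probed with
    # community detection on the weighted graph.
    print(nx.degree_centrality(G))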
From conscious thought to automatic action: A simulation account of action planning.
Martiny-Huenger, Torsten; Martiny, Sarah E; Parks-Stamm, Elizabeth J; Pfeiffer, Elisa; Gollwitzer, Peter M
2017-10-01
We provide a theoretical framework and empirical evidence for how verbally planning an action creates direct perception-action links and behavioral automaticity. We argue that planning actions in an if (situation)-then (action) format induces sensorimotor simulations (i.e., activity patterns reenacting the event in the sensory and motor brain areas) of the anticipated situation and the intended action. Due to their temporal overlap, these activity patterns become linked. Whenever the previously simulated situation is encountered, the previously simulated action is partially reactivated through spreading activation and thus more likely to be executed. In 4 experiments (N = 363), we investigated the relation between specific if-then action plans worded to activate simulations of elbow flexion versus extension movements and actual elbow flexion versus extension movements in a subsequent, ostensibly unrelated categorization task. As expected, linking a critical stimulus to intended actions that implied elbow flexion movements (e.g., grabbing it for consumption) subsequently facilitated elbow flexion movements upon encountering the critical stimulus. However, linking a critical stimulus to actions that implied elbow extension movements (e.g., pointing at it) subsequently facilitated elbow extension movements upon encountering the critical stimulus. Thus, minor differences (i.e., exchanging the words "point at" with "grab") in verbally formulated action plans (i.e., conscious thought) had systematic consequences on subsequent actions. The question of how conscious thought can induce stimulus-triggered action is illuminated by the provided theoretical framework and the respective empirical evidence, facilitating the understanding of behavioral automaticity and human agency. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Sexual selection and mate choice.
Andersson, Malte; Simmons, Leigh W
2006-06-01
The past two decades have seen extensive growth of sexual selection research. Theoretical and empirical work has clarified many components of pre- and postcopulatory sexual selection, such as aggressive competition, mate choice, sperm utilization and sexual conflict. Genetic mechanisms of mate choice evolution have been less amenable to empirical testing, but molecular genetic analyses can now be used for incisive experimentation. Here, we highlight some of the currently debated areas in pre- and postcopulatory sexual selection. We identify where new techniques can help estimate the relative roles of the various selection mechanisms that might work together in the evolution of mating preferences and attractive traits, and in sperm-egg interactions.
Reconstructing the world trade multiplex: The role of intensive and extensive biases
NASA Astrophysics Data System (ADS)
Mastrandrea, Rossana; Squartini, Tiziano; Fagiolo, Giorgio; Garlaschelli, Diego
2014-12-01
In economic and financial networks, the strength of each node always has an important economic meaning, such as the size of supply and demand, import and export, or financial exposure. Constructing null models of networks matching the observed strengths of all nodes is crucial in order to either detect interesting deviations of an empirical network from economically meaningful benchmarks or reconstruct the most likely structure of an economic network when the latter is unknown. However, several studies have proved that real economic networks and multiplexes topologically differ from configurations inferred only from node strengths. Here we provide a detailed analysis of the world trade multiplex by comparing it to an enhanced null model that simultaneously reproduces the strength and the degree of each node. We study several temporal snapshots and almost 100 layers (commodity classes) of the multiplex and find that the observed properties are systematically well reproduced by our model. Our formalism allows us to introduce the (static) concepts of extensive and intensive bias, defined as a measurable tendency of the network to prefer either the formation of extra links or the reinforcement of link weights, with respect to a reference case where only strengths are enforced. Our findings complement the existing economic literature on (dynamic) intensive and extensive trade margins. More generally, they show that real-world multiplexes can be strongly shaped by layer-specific local constraints.
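The mismatch the enhanced model corrects can be seen against the classical strength-only benchmark, under which the expected weight between nodes i and j is s_i s_j / W_tot. The toy numpy sketch below (hypothetical adjacency data; treating expected weights as Poisson means is one common convention, assumed here) contrasts the degrees implied by that benchmark with the observed ones:

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.poisson(0.5, size=(6, 6)).astype(float)
    W = np.triu(W, 1) + np.triu(W, 1).T          # symmetric trade layer, zero diagonal

    s = W.sum(axis=1)                            # node strengths
    k_obs = (W > 0).sum(axis=1)                  # observed degrees
    W_tot = s.sum()

    W_exp = np.outer(s, s) / W_tot               # strength-only expected weights
    np.fill_diagonal(W_exp, 0.0)
    # Expected degree under the benchmark: P(link) = 1 - exp(-<w_ij>).
    k_exp = (1.0 - np.exp(-W_exp)).sum(axis=1)

    print("observed degrees: ", k_obs)
    print("benchmark degrees:", np.round(k_exp, 2))

A systematic gap between the observed and benchmark degrees is the kind of topological deviation that motivates enforcing strengths and degrees jointly.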
An analysis of empirical estimates of sexual aggression victimization and perpetration.
Spitzberg, B H
1999-01-01
Estimates of prevalence for several categories of sexual coercion, including rape and attempted rape, were statistically aggregated across 120 studies, involving over 100,000 subjects. According to the data, almost 13% of women and over 3% of men have been raped, and almost 5% of men claim to have perpetrated rape. In contrast, about 25% of women and men claim to have been sexually coerced and to have perpetrated sexual coercion. In general, the mediating variables examined--population type, decade, date of publication, and type of operationalization--were not consistently related to rates of victimization or perpetration. Nevertheless, the extensive variation among study estimates strongly suggests the possibility of systematic sources of variation that have yet to be identified. Further analyses are called for to disentangle such sources.
Semantic memory: a feature-based analysis and new norms for Italian.
Montefinese, Maria; Ambrosini, Ettore; Fairfield, Beth; Mammarella, Nicola
2013-06-01
Semantic norms for properties produced by native speakers are valuable tools for researchers interested in the structure of semantic memory and in category-specific semantic deficits in individuals following brain damage. The aims of this study were threefold. First, we sought to extend existing semantic norms by adopting an empirical approach to category (Exp. 1) and concept (Exp. 2) selection, in order to obtain a more representative set of semantic memory features. Second, we extensively outlined a new set of semantic production norms collected from Italian native speakers for 120 artifactual and natural basic-level concepts, using numerous measures and statistics following a feature-listing task (Exp. 3b). Finally, we aimed to create a new publicly accessible database, since only a few existing databases are publicly available online.
Structural Patterns in Empirical Research Articles: A Cross-Disciplinary Study
ERIC Educational Resources Information Center
Lin, Ling; Evans, Stephen
2012-01-01
This paper presents an analysis of the major generic structures of empirical research articles (RAs), with a particular focus on disciplinary variation and the relationship between the adjacent sections in the introductory and concluding parts. The findings were derived from a close "manual" analysis of 433 recent empirical RAs from high-impact…
[Legionnaire's disease with predominant liver involvement].
Magro Molina, A; Plaza Poquet, V; Giner Galvañ, V
2002-04-01
Like other pneumonias due to atypical agents, pneumonia caused by Legionella pneumophila has no pathognomonic clinical features, although fever and non-productive cough are almost constant, and diarrhea and changes in mental status are common, as are hyponatremia and moderate transient hypertransaminasemia. Severe systemic involvement after hematogenous dissemination, similar to that described for typical bacterial pneumonias, distinguishes it from other atypical agents and carries high mortality in the absence of appropriate treatment. Etiological diagnosis is difficult and is normally achieved late in the course of the infection. Because of these diagnostic difficulties and the potential mortality in predisposed patients, empirical antibiotic therapy has been extensively recommended. We present a patient with critical community-acquired pneumonia due to Legionella pneumophila serogroup 1, with liver alteration as the main manifestation and a good response to empirical antibiotic therapy with clarithromycin and rifampin. We recommend the empirical use of such therapy in pneumonias without a microbiological diagnosis and with an unfavourable clinical course.
ERIC Educational Resources Information Center
Halloran, Roberta Kathryn
2011-01-01
Self-regulation, executive function and working memory are areas of cognitive processing that have been studied extensively. Although many studies have examined the constructs, there is limited empirical support suggesting a formal link between the three cognitive processes and their prediction of academic achievement. Thus, the present study…
Sample Size Estimation in Cluster Randomized Educational Trials: An Empirical Bayes Approach
ERIC Educational Resources Information Center
Rotondi, Michael A.; Donner, Allan
2009-01-01
The educational field has now accumulated an extensive literature reporting on values of the intraclass correlation coefficient, a parameter essential to determining the required size of a planned cluster randomized trial. We propose here a simple simulation-based approach including all relevant information that can facilitate this task. An…
English as a Lingua Franca in Europe: An Empirical Perspective
ERIC Educational Resources Information Center
Breiteneder, Angelika
2009-01-01
In 2008, the need for intra-European communication has long exceeded the limits set by language barriers. As a result, English acts extensively as a lingua franca among Europeans with different mother tongues, particularly so in the professional domains of education, business, international relations and scientific research. Yet, despite its…
Group Performance in Information Systems Project Groups: An Empirical Study
ERIC Educational Resources Information Center
Bahli, Bouchaib; Buyukkurt, Meral Demirbag
2005-01-01
The importance of teamwork in Information Systems Development (ISD) practice and education has been acknowledged but not studied extensively to date. This paper tests a model of how groups participating in ISD projects perform and examines the relationships between some antecedents of this performance based on group research theory well…
Training Older Adults about Alzheimer's Disease--Impact on Knowledge and Fear
ERIC Educational Resources Information Center
Scerri, Anthony; Scerri, Charles
2017-01-01
Although the impact of Alzheimer's disease training programs directed to informal and formal caregivers has been extensively studied, programs for older adults who do not have the disease are relatively few. Moreover, increased knowledge increases fear of the disease, even though there is little empirical evidence to support this. This study…
Lessons Learned from Instructional Design Theory: An Application in Management Education
ERIC Educational Resources Information Center
Burke, Lisa A.
2007-01-01
Given that many doctoral programs do not provide extensive training on how to present course information in the classroom, the current paper looks to educational psychology theory and research for guidance. Richard Mayer and others' copious empirical work on effective and ineffective instructional design, along with relevant research findings in…
Sex Ratios, Economic Power, and Women's Roles: A Theoretical Extension and Empirical Test.
ERIC Educational Resources Information Center
South, Scott J.
1988-01-01
Tested hypotheses concerning sex ratios, women's roles, and economic power with data from 111 countries. Found undersupply of women positively associated with proportion of women who marry and fertility rate; inversely associated with women's average age at marriage, literacy rate, and divorce rate. Suggests women's economic power may counteract…
ERIC Educational Resources Information Center
Dee, Thomas; Penner, Emily
2016-01-01
An extensive theoretical and qualitative literature stresses the promise of instructional practices and content aligned with the cultural experiences of minority students. Ethnic studies courses provide a growing but controversial example of such "culturally relevant pedagogy." However, the empirical evidence on the effectiveness of…
ERIC Educational Resources Information Center
St. Pierre, Tena L.; Kaltreider, D. Lynne
2004-01-01
Despite availability of empirically supported school-based substance abuse prevention programs, adoption and implementation fidelity of such programs appear to be low. A replicated case study was conducted to investigate school adoption and implementation processes of the EXSELS model (Project ALERT delivered by program leaders through Cooperative…
ERIC Educational Resources Information Center
Cousans, Fran; Patterson, Fiona; Edwards, Helena; Walker, Kim; McLachlan, John C.; Good, David
2017-01-01
Although there is extensive evidence confirming the predictive validity of situational judgement tests (SJTs) in medical education, there remains a shortage of evidence for their predictive validity for performance of postgraduate trainees in their first role in clinical practice. Moreover, to date few researchers have empirically examined the…
Learning in the Liminal Space: A Semiotic Approach to Threshold Concepts
ERIC Educational Resources Information Center
Land, Ray; Rattray, Julie; Vivian, Peter
2014-01-01
The threshold concepts approach to student learning and curriculum design now informs an empirical research base comprising over 170 disciplinary and professional contexts. It draws extensively on the notion of troublesomeness in a "liminal" space of learning. The latter is a transformative state in the process of learning in which there…
The net economic value of wilderness
J. Michael Bowker; J.E. Harvard; John C. Bergstrom; H. Ken Cordell; Donald B.K. English; John B. Loomis
2005-01-01
The purpose of this chapter is to inventory and assess what is currently known about the economic or "dollar" values accruing to Americans from the National Wilderness Preservation System. This chapter identifies the benefits of Wilderness and the economic value of these benefits through an extensive review of published conceptual and empirical literature. It...
A Unified Approach to Measurement Error and Missing Data: Details and Extensions
ERIC Educational Resources Information Center
Blackwell, Matthew; Honaker, James; King, Gary
2017-01-01
We extend a unified and easy-to-use approach to measurement error and missing data. In our companion article, Blackwell, Honaker, and King give an intuitive overview of the new technique, along with practical suggestions and empirical applications. Here, we offer more precise technical details, more sophisticated measurement error model…
Task Oriented Tools for Information Retrieval
ERIC Educational Resources Information Center
Yang, Peilin
2017-01-01
Information Retrieval (IR) is one of the most evolving research fields and has drawn extensive attention in recent years. Because of its empirical nature, the advance of the IR field is closely related to the development of various toolkits. While the traditional IR toolkit mainly provides a platform to evaluate the effectiveness of retrieval…
Implicit Theories of Ability in Physical Education: Current Issues and Future Directions
ERIC Educational Resources Information Center
Warburton, Victoria Emily; Spray, Christopher Mark
2017-01-01
Purpose: In light of the extensive empirical evidence that implicit theories have important motivational consequences for young people across a range of educational settings we seek to provide a summary of, and personal reflection on, implicit theory research and practice in physical education (PE). Overview: We first provide an introduction to…
Expert Panel Reviews of Research Centers: The Site Visit Process
ERIC Educational Resources Information Center
Lawrenz, Frances; Thao, Mao; Johnson, Kelli
2012-01-01
Site visits are used extensively in a variety of settings within the evaluation community. They are especially common in making summative value decisions about the quality and worth of research programs/centers. However, there has been little empirical research and guidance about how to appropriately conduct evaluative site visits of research…
Characteristics of Academically Excellent Business Studies Students in a Post-1992 University
ERIC Educational Resources Information Center
Bennett, Roger; Barkensjo, Anna
2005-01-01
In contrast to the extensive investigation of the characteristics of students who fail or perform badly in "new" universities, research into the factors associated with academic excellence within post-1992 institutions has been sparse. This empirical study examined the profile of a sample of 81 high-flying business studies undergraduates…
ERIC Educational Resources Information Center
Brasiel, Sarah; Martin, Taylor; Jeong, Soojeong; Yuan, Min
2016-01-01
An extensive body of research has demonstrated that the use in a K-12 classroom of technology, such as the Internet, computers, and software programs, enhances the learning of mathematics (Cheung & Slavin, 2013; Cohen & Hollebrands, 2011). In particular, growing empirical evidence supports that certain types of technology, such as…
The Construction of the Self: A Developmental Perspective.
ERIC Educational Resources Information Center
Harter, Susan
Drawing upon extensive theoretical knowledge and decades of empirical research, this book traces changes in the structure and content of self-representations from early childhood through late adolescence. Chapter 1 includes a discussion of the self as subject (I-self) and object (Me-self) and describes the historical roots of contemporary issues…
Building More Solid Bridges between Buddhism and Western Psychology
ERIC Educational Resources Information Center
Sugamura, Genji; Haruki, Yutaka; Koshikawa, Fusako
2007-01-01
Introducing the ways of cultivating mental balance, B. A. Wallace and S. L. Shapiro attempted to build bridges between Buddhism and psychology. Their systematic categorization of Buddhist teachings and extensive review of empirical support from Western psychology are valuable for future study. However, it remains a matter of concern that some more…
Systems for Instructional Improvement: Creating Coherence from the Classroom to the District Office
ERIC Educational Resources Information Center
Cobb, Paul; Jackson, Kara; Henrick, Erin; Smith, Thomas M.
2018-01-01
In "Systems for Instructional Improvement," Paul Cobb and his colleagues draw on their extensive research to propose a series of specific, empirically grounded recommendations that together constitute a theory of action for advancing instruction at scale. The authors outline the elements of a coherent instructional system; describe…
A Social Psychological Exploration of Power Motivation Among Disadvantaged Workers.
ERIC Educational Resources Information Center
Levitin, Teresa Ellen
An extensive review of the literature on the social psychology of social power led to the conclusion that the area contains many unrelated, noncumulative theoretical and empirical works. Three conceptual distinctions were introduced to facilitate the systematic study of social power. Effectance motivation was used to describe the joint, often…
Can Multifactor Models of Teaching Improve Teacher Effectiveness Measures?
ERIC Educational Resources Information Center
Lazarev, Valeriy; Newman, Denis
2014-01-01
NCLB waiver requirements have led to development of teacher evaluation systems, in which student growth is a significant component. Recent empirical research has been focusing on metrics of student growth--value-added scores in particular--and their relationship to other metrics. An extensive set of recent teacher-evaluation studies conducted by…
A Comparison of Flexible Prompt Fading and Constant Time Delay for Five Children with Autism
ERIC Educational Resources Information Center
Soluaga, Doris; Leaf, Justin B.; Taubman, Mitchell; McEachin, John; Leaf, Ron
2008-01-01
Given the increasing rates of autism, identifying prompting procedures that can assist in the development of more optimal learning opportunities for this population is critical. Extensive empirical research exists supporting the effectiveness of various prompting strategies. Constant time delay (CTD) is a highly implemented prompting procedure…
Parenting Behaviour among Parents of Children with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Lambrechts, Greet; Van Leeuwen, Karla; Boonen, Hannah; Maes, Bea; Noens, Ilse
2011-01-01
Contrary to the extensive amount of empirical findings about parental perceptions, parenting cognitions, and coping in families with a child with autism spectrum disorder (ASD), research about parenting itself is very scarce. A first goal of this study was to examine the factor structure and internal consistency of two scales to measure parenting…
Application of LSP Texts in Translator Training
ERIC Educational Resources Information Center
Ilynska, Larisa; Smirnova, Tatjana; Platonova, Marina
2017-01-01
The paper presents discussion of the results of extensive empirical research into efficient methods of educating and training translators of LSP (language for special purposes) texts. The methodology is based on using popular LSP texts in the respective fields as one of the main media for translator training. The aim of the paper is to investigate…
Causal Responsibility and Counterfactuals
Lagnado, David A; Gerstenberg, Tobias; Zultan, Ro'i
2013-01-01
How do people attribute responsibility in situations where the contributions of multiple agents combine to produce a joint outcome? The prevalence of over-determination in such cases makes this a difficult problem for counterfactual theories of causal responsibility. In this article, we explore a general framework for assigning responsibility in multiple agent contexts. We draw on the structural model account of actual causation (e.g., Halpern & Pearl, 2005) and its extension to responsibility judgments (Chockler & Halpern, 2004). We review the main theoretical and empirical issues that arise from this literature and propose a novel model of intuitive judgments of responsibility. This model is a function of both pivotality (whether an agent made a difference to the outcome) and criticality (how important the agent is perceived to be for the outcome, before any actions are taken). The model explains empirical results from previous studies and is supported by a new experiment that manipulates both pivotality and criticality. We also discuss possible extensions of this model to deal with a broader range of causal situations. Overall, our approach emphasizes the close interrelations between causality, counterfactuals, and responsibility attributions. PMID:23855451
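The pivotality ingredient builds on the Chockler and Halpern (2004) degree of responsibility, 1/(k + 1), where k is the minimal number of changes to other agents' actions needed to make a given agent pivotal for the outcome. A toy sketch for majority voting (a deliberate simplification of the general structural-model definition, not the authors' experimental code):

    def responsibility(votes, agent):
        """Degree of responsibility of `agent` for the vote passing (toy version)."""
        n, n_yes = len(votes), sum(votes)
        majority = n // 2 + 1
        if n_yes < majority or votes[agent] == 0:
            return 0.0   # outcome did not occur, or agent is not a cause of it
        k = n_yes - majority   # yes-votes that must flip before agent is pivotal
        return 1.0 / (k + 1)

    print([responsibility([1, 1, 1, 0, 0], i) for i in range(5)])  # 3-2: each yes-voter fully responsible
    print([responsibility([1, 1, 1, 1, 1], i) for i in range(5)])  # 5-0: responsibility diluted to 1/3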
NASA Astrophysics Data System (ADS)
Hargrove, W. W.; Hoffman, F. M.; Kumar, J.; Spruce, J.; Norman, S. P.
2013-12-01
Here we present diverse examples where empirical mining and statistical analysis of large data sets have already been shown to be useful for a wide variety of practical decision-making problems within the realm of large-scale ecology. Because a full understanding and appreciation of particular ecological phenomena are possible only after hypothesis-directed research regarding the existence and nature of each process, some ecologists may feel that purely empirical data harvesting represents a less-than-satisfactory approach. Restricting ourselves exclusively to process-driven approaches, however, may actually slow progress, particularly for more complex or subtle ecological processes. We may not be able to afford the delays caused by such directed approaches. Rather than attempting to formulate and ask every relevant question correctly, empirical methods allow trends, relationships and associations to emerge freely from the data themselves, unencumbered by a priori theories, ideas and prejudices that have been imposed upon them. Although they cannot directly demonstrate causality, empirical methods can be extremely efficient at uncovering strong correlations with intermediate "linking" variables. In practice, these correlative structures and linking variables, once identified, may provide sufficient predictive power to be useful themselves. Such correlation "shadows" of causation can be harnessed by, e.g., Bayesian Belief Nets, which bias ecological management decisions, made with incomplete information, toward favorable outcomes. Empirical data-harvesting also generates a myriad of testable hypotheses regarding processes, some of which may even be correct. Statistical regionalizations based on quantitative multivariate similarity have lent insights into carbon eddy-flux direction and magnitude, wildfire biophysical conditions, phenological ecoregions useful for vegetation type mapping and monitoring, forest disease risk maps (e.g., sudden oak death), global aquatic ecoregion risk maps for aquatic invasives, and forest vertical structure ecoregions (e.g., using extensive LiDAR data sets). Multivariate Spatio-Temporal Clustering, which quantitatively places alternative future conditions on a common footing with present conditions, allows prediction of present and future shifts in tree species ranges, given alternative climatic change forecasts. ForWarn, a forest disturbance detection and monitoring system mining 12 years of national 8-day MODIS phenology data, has been operating since 2010, producing national maps every 8 days showing many kinds of potential forest disturbances. Forest resource managers can view disturbance maps via a web-based viewer, and alerts are issued when particular forest disturbances are seen. Regression-based decadal trend analysis showing long-term forest thrive and decline areas, and individual-based, brute-force supercomputing to map potential movement corridors and migration routes across landscapes, will also be discussed. As significant ecological changes occur with increasing rapidity, such empirical data-mining approaches may be the most efficient means to help land managers find the best, most-actionable policies and decision strategies.
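As one concrete flavor of these regionalizations, the sketch below clusters map cells by multivariate environmental similarity, with plain k-means standing in for the Multivariate Spatio-Temporal Clustering algorithm named above; the per-cell variables are synthetic:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(42)
    n_cells = 10_000
    # Hypothetical per-cell layers: temperature, precipitation, elevation, NDVI.
    X = np.column_stack([
        rng.normal(15, 8, n_cells),
        rng.gamma(2.0, 400.0, n_cells),
        rng.gamma(2.0, 300.0, n_cells),
        rng.beta(2, 2, n_cells),
    ])

    X_std = StandardScaler().fit_transform(X)   # put all layers on equal footing
    labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X_std)
    # Reshaped to the map grid, `labels` is the ecoregion map; passing
    # future-climate cells through the same fitted pipeline assigns them to
    # "present" regions, which underlies analog-based range-shift forecasts.
    print(np.bincount(labels))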
Event time analysis of longitudinal neuroimage data.
Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce
2014-08-15
This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.
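A hedged two-step sketch of this style of analysis, with statsmodels for the linear mixed effects stage and lifelines for the extended Cox stage (the file name and column names are hypothetical, and the details of the published pipeline differ):

    import pandas as pd
    import statsmodels.formula.api as smf
    from lifelines import CoxTimeVaryingFitter

    # Columns: id, time (years from baseline), thickness, age, event (0/1).
    df = pd.read_csv("longitudinal_thickness.csv")

    # Step 1: an LME with a random intercept and slope per subject smooths
    # the serial thickness measurements.
    lme = smf.mixedlm("thickness ~ time + age", df, groups=df["id"],
                      re_formula="~time").fit()
    df["thickness_hat"] = lme.fittedvalues

    # Step 2: extended Cox regression on (start, stop] intervals; each visit
    # opens an interval, and the event flag attaches to the interval in which
    # the clinical event (e.g., conversion) occurs.
    df = df.sort_values(["id", "time"])
    df["start"] = df["time"]
    df["stop"] = df.groupby("id")["time"].shift(-1)
    df["event"] = df.groupby("id")["event"].shift(-1)
    df = df.dropna(subset=["stop"])

    ctv = CoxTimeVaryingFitter()
    ctv.fit(df[["id", "start", "stop", "event", "thickness_hat", "age"]],
            id_col="id", event_col="event", start_col="start", stop_col="stop")
    ctv.print_summary()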
Rostami, Javad; Chen, Jingming; Tse, Peter W.
2017-01-01
Ultrasonic guided waves have been extensively applied for non-destructive testing of plate-like structures particularly pipes in past two decades. In this regard, if a structure has a simple geometry, obtained guided waves’ signals are easy to explain. However, any small degree of complexity in the geometry such as contacting with other materials may cause an extra amount of complication in the interpretation of guided wave signals. The problem deepens if defects have irregular shapes such as natural corrosion. Signal processing techniques that have been proposed for guided wave signals’ analysis are generally good for simple signals obtained in a highly controlled experimental environment. In fact, guided wave signals in a real situation such as the existence of natural corrosion in wall-covered pipes are much more complicated. Considering pipes in residential buildings that pass through concrete walls, in this paper we introduced Smooth Empirical Mode Decomposition (SEMD) to efficiently separate overlapped guided waves. As empirical mode decomposition (EMD) which is a good candidate for analyzing non-stationary signals, suffers from some shortcomings, wavelet transform was adopted in the sifting stage of EMD to improve its outcome in SEMD. However, selection of mother wavelet that suits best for our purpose plays an important role. Since in guided wave inspection, the incident waves are well known and are usually tone-burst signals, we tailored a complex tone-burst signal to be used as our mother wavelet. In the sifting stage of EMD, wavelet de-noising was applied to eliminate unwanted frequency components from each IMF. SEMD greatly enhances the performance of EMD in guided wave analysis for highly contaminated signals. In our experiment on concrete covered pipes with natural corrosion, this method not only separates the concrete wall indication clearly in time domain signal, a natural corrosion with complex geometry that was hidden and located inside the concrete section was successfully exposed. PMID:28178220
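A simplified sketch of the SEMD idea: decompose the signal with EMD, then wavelet-denoise each intrinsic mode function (IMF). In the paper the denoising happens inside the sifting stage with a tailored tone-burst mother wavelet; here, as an approximation, an off-the-shelf Daubechies wavelet is applied to the finished IMFs (PyEMD and PyWavelets are assumed available, and the test signal is synthetic):

    import numpy as np
    import pywt
    from PyEMD import EMD

    def denoise_imf(imf, wavelet="db8", level=4):
        coeffs = pywt.wavedec(imf, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise scale estimate
        thr = sigma * np.sqrt(2 * np.log(len(imf)))         # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[:len(imf)]

    t = np.linspace(0.0, 1e-3, 4096)
    burst = np.sin(2 * np.pi * 6e4 * t) * np.exp(-((t - 3e-4) ** 2) / 1e-9)
    noisy = burst + 0.3 * np.random.default_rng(1).standard_normal(t.size)

    imfs = EMD().emd(noisy)
    smoothed = sum(denoise_imf(imf) for imf in imfs)   # reassembled clean signal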
Staver, A Carla; Archibald, Sally; Levin, Simon
2011-05-01
Savannas are known as ecosystems with tree cover below climate-defined equilibrium values. However, a predictive framework for understanding constraints on tree cover is lacking. We present (a) a spatially extensive analysis of tree cover and fire distribution in sub-Saharan Africa, and (b) a model, based on empirical results, demonstrating that savanna and forest may be alternative stable states in parts of Africa, with implications for understanding savanna distributions. Tree cover does not increase continuously with rainfall, but rather is constrained to low (<50%, "savanna") or high tree cover (>75%, "forest"). Intermediate tree cover rarely occurs. Fire, which prevents trees from establishing, differentiates high and low tree cover, especially in areas with rainfall between 1000 mm and 2000 mm. Fire is less important at low rainfall (<1000 mm), where rainfall limits tree cover, and at high rainfall (>2000 mm), where fire is rare. This pattern suggests that complex interactions between climate and disturbance produce emergent alternative states in tree cover. The relationship between tree cover and fire was incorporated into a dynamic model including grass, savanna tree saplings, and savanna trees. Only recruitment from sapling to adult tree varied depending on the amount of grass in the system. Based on our empirical analysis and previous work, fires spread only at tree cover of 40% or less, producing a sigmoidal fire probability distribution as a function of grass cover and therefore a sigmoidal sapling to tree recruitment function. This model demonstrates that, given relatively conservative and empirically supported assumptions about the establishment of trees in savannas, alternative stable states for the same set of environmental conditions (i.e., model parameters) are possible via a fire feedback mechanism. Integrating alternative stable state dynamics into models of biome distributions could improve our ability to predict changes in biome distributions and in carbon storage under climate and global change scenarios.
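A toy re-implementation of the grass (G) / sapling (S) / tree (T) dynamics sketched above, with a sigmoidal, fire-mediated sapling-to-tree recruitment rate that shuts down when grass (and hence fire) is abundant; all parameter values are illustrative, not those of the original paper:

    import numpy as np
    from scipy.integrate import solve_ivp

    def omega(G, w0=0.9, w1=0.01, theta=0.4, s=0.05):
        # Recruitment is high when grass cover is low (fire cannot spread)
        # and collapses above the fire-spread threshold around G = theta.
        return w1 + (w0 - w1) / (1.0 + np.exp((G - theta) / s))

    def rhs(t, y, beta=1.2, mu=0.15, nu=0.05):
        G, S, T = y
        dG = mu * S + nu * T - beta * G * T
        dS = beta * G * T - omega(G) * S - mu * S
        dT = omega(G) * S - nu * T
        return [dG, dS, dT]          # note G + S + T is conserved

    for G0 in (0.9, 0.3):            # grassy start vs. tree-dominated start
        sol = solve_ivp(rhs, (0.0, 2000.0), [G0, 0.05, 0.95 - G0], rtol=1e-8)
        print(f"G0={G0}: final tree cover ~ {sol.y[2, -1]:.2f}")
    # Two different endpoints under identical parameters illustrate the
    # savanna/forest bistability produced by the fire feedback.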
Asymmetric multiscale detrended fluctuation analysis of California electricity spot price
NASA Astrophysics Data System (ADS)
Fan, Qingju
2016-01-01
In this paper, we develop a new method called asymmetric multiscale detrended fluctuation analysis, an extension of asymmetric detrended fluctuation analysis (A-DFA) that can assess the asymmetric correlation properties of a series over a variable scale range. We investigate the asymmetric correlations in the California 1999-2000 power market after filtering out some periodic trends by empirical mode decomposition (EMD). Our findings show the coexistence of symmetric and asymmetric correlations in the price series of 1999 and strong asymmetric correlations in 2000. Moreover, we detect subtle correlation properties of the upward and downward price series for most larger scale intervals in 2000. Meanwhile, the fluctuations of Δα(s) (asymmetry) and |Δα(s)| (absolute asymmetry) are more significant in 2000 than in 1999 for larger scale intervals, while they have similar characteristics for smaller scale intervals. We conclude that the strong asymmetry and the different correlation properties of the upward and downward price series for larger scale intervals in 2000 have important implications for the collapse of the California power market, and our findings shed new light on the underlying mechanisms of power prices.
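At a single scale s, the asymmetric step can be sketched as follows: each window is labelled "up" or "down" by the sign of a linear trend fitted to the original series, and detrended fluctuations of the profile are averaged separately over the two classes. Repeating this over many scales and regressing log F± on log s yields the exponents whose difference Δα(s) is tracked above; the price series here is a synthetic random walk:

    import numpy as np

    def asymmetric_dfa(x, s):
        y = np.cumsum(x - x.mean())                 # the DFA profile
        t = np.arange(s)
        f_up, f_down = [], []
        for w in range(len(x) // s):
            seg_x = x[w * s:(w + 1) * s]            # direction from the raw series
            seg_y = y[w * s:(w + 1) * s]
            slope = np.polyfit(t, seg_x, 1)[0]
            resid = seg_y - np.polyval(np.polyfit(t, seg_y, 1), t)
            (f_up if slope > 0 else f_down).append(np.mean(resid ** 2))
        return np.sqrt(np.mean(f_up)), np.sqrt(np.mean(f_down))

    prices = np.cumsum(np.random.default_rng(3).standard_normal(4096))
    print(asymmetric_dfa(prices, 64))   # near-equal values -> symmetric correlations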
NASA Astrophysics Data System (ADS)
Joevivek, V.; Chandrasekar, N.; Saravanan, S.; Anandakumar, H.; Thanushkodi, K.; Suguna, N.; Jaya, J.
2018-06-01
Investigation of a beach and its wave conditions is essential for understanding the physical processes along a coast. This study examines the spatial and temporal correlations between beach and nearshore processes along the extensive sandy beach of the Nagapattinam coast, southeast peninsular India. The data collection includes beach profiles, wave data, and intertidal sediment samples over 2 years, from January 2011 to January 2013. The field data revealed significant variability in beach and wave morphology during the northeast (NE) and southwest (SW) monsoons. However, the beach was stabilized by the reworking of sediment distribution during the calm period. The changes in grain sorting and longshore sediment transport serve as clear evidence of the sediment migration that persisted between the foreshore and nearshore regions. Empirical Orthogonal Function (EOF) analysis and Canonical Correlation Analysis (CCA) were utilized to investigate the spatial and temporal linkages between beach and nearshore parameters. The outcome of the multivariate analysis revealed that seasonal variations in the wave climate tend to influence the bar-berm sediment transition observed along the coast.
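The EOF stage amounts to a singular value decomposition of the (time x location) anomaly matrix of profile elevations: the leading right singular vectors are the dominant spatial patterns of erosion and accretion, and the left singular vectors their temporal amplitudes. A minimal sketch with synthetic survey data:

    import numpy as np

    rng = np.random.default_rng(7)
    Z = rng.standard_normal((24, 50))        # 24 surveys x 50 cross-shore points
    A = Z - Z.mean(axis=0)                   # remove the time-mean beach profile

    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)      # variance fraction per EOF mode
    eofs = Vt                                # spatial patterns (one row per mode)
    pcs = U * s                              # temporal amplitudes

    print("variance explained by first 3 EOFs:", np.round(explained[:3], 3))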
Work, malaise, and well-being in Spanish and Latin-American doctors
Ochoa, Paola; Blanch, Josep M
2016-01-01
OBJECTIVE: To analyze the relations between the meanings of working and doctors' levels of work-related well-being in the context of their working conditions. METHOD: The research combined the qualitative methodology of textual analysis with the quantitative methodology of correspondence factor analysis. A convenience, intentional, stratified sample of 305 Spanish and Latin American doctors completed an extensive questionnaire on the research topics. RESULTS: The general meaning of working for the group located in the malaise quartile included perceptions of discomfort, frustration, and exhaustion. By contrast, those showing higher levels of well-being, located in the opposite quartile, associated their working experience with good conditions and the development of their professional and personal competences. CONCLUSIONS: The study provides empirical evidence of the relationship between contextual factors and the meanings of working for participants with higher levels of malaise, and of the importance granted to both intrinsic and extrinsic factors by those who scored highest on well-being. PMID:27191157
Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J
2015-01-01
A generalized linear modeling framework for the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times, which are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature arise as special cases of the B-GLIRT framework under restrictions on the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using several real data examples, including data on personality and cognitive ability.
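One concrete instance of such a framework, written out under assumed (not source-given) parametric choices: a two-parameter logistic measurement model for the responses, a linear factor model for the log response times, and a linear cross-relation f linking ability to the speed factor. Replacing f(θ_p) = ρθ_p by ρ₁θ_p + ρ₂θ_p² would give a curvilinear cross-relation of the kind described for personality tests:

    \begin{align}
      \Pr(X_{pi}=1 \mid \theta_p)
        &= \frac{1}{1+\exp\{-\alpha_i(\theta_p-\beta_i)\}}
        && \text{(responses)}\\
      \ln T_{pi}
        &= \nu_i - \lambda_i \tau_p + \varepsilon_{pi},
        \qquad \varepsilon_{pi}\sim\mathcal{N}(0,\sigma_i^2)
        && \text{(response times)}\\
      \tau_p
        &= f(\theta_p) + \delta_p,
        \qquad f(\theta_p)=\rho\,\theta_p
        && \text{(cross-relation)}
    \end{align}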
The evolution of speech: a comparative review.
Fitch
2000-07-01
The evolution of speech can be studied independently of the evolution of language, with the advantage that most aspects of speech acoustics, physiology and neural control are shared with animals, and thus open to empirical investigation. At least two changes were necessary prerequisites for modern human speech abilities: (1) modification of vocal tract morphology, and (2) development of vocal imitative ability. Despite an extensive literature, attempts to pinpoint the timing of these changes using fossil data have proven inconclusive. However, recent comparative data from nonhuman primates have shed light on the ancestral use of formants (a crucial cue in human speech) to identify individuals and gauge body size. Second, comparative analysis of the diverse vertebrates that have evolved vocal imitation (humans, cetaceans, seals and birds) provides several distinct, testable hypotheses about the adaptive function of vocal mimicry. These developments suggest that, for understanding the evolution of speech, comparative analysis of living species provides a viable alternative to fossil data. However, the neural basis for vocal mimicry and for mimesis in general remains unknown.
NASA Astrophysics Data System (ADS)
Lange, Rense
2015-02-01
An extension of concurrent validity is proposed that uses qualitative data for the purpose of validating quantitative measures. The approach relies on Latent Semantic Analysis (LSA), which places verbal (written) statements in a high-dimensional semantic space. Using data from a medical/psychiatric domain as a case study - Near Death Experiences, or NDE - we established concurrent validity by connecting NDErs' qualitative (written) experiential accounts with their locations on a Rasch-scalable measure of NDE intensity. Concurrent validity received strong empirical support, since the variance in the Rasch measures could be predicted reliably from the coordinates of the accounts in the LSA-derived semantic space (R2 = 0.33). These coordinates also predicted NDErs' age with considerable precision (R2 = 0.25). Both estimates are probably artificially low due to the small available data samples (n = 588). It appears that Rasch scalability of NDE intensity is a prerequisite for these findings, as each intensity level is associated (at least probabilistically) with a well-defined pattern of item endorsements.
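The validation logic can be sketched compactly: embed the written accounts in an LSA space (TF-IDF weighting followed by truncated SVD) and regress the Rasch intensity measures on the semantic coordinates. The texts and scores below are placeholders; with realistic sample sizes, a cross-validated R² is the honest analogue of the reported 0.33:

    import numpy as np
    from sklearn.decomposition import TruncatedSVD
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LinearRegression

    accounts = ["I floated above my body and felt peace",
                "a tunnel of light",
                "I saw my whole life replayed before me",
                "nothing but darkness"]
    rasch_scores = np.array([2.1, 1.4, 2.7, -0.5])   # placeholder Rasch measures

    X_tfidf = TfidfVectorizer().fit_transform(accounts)
    X_lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(X_tfidf)

    model = LinearRegression().fit(X_lsa, rasch_scores)
    print("in-sample R^2:", model.score(X_lsa, rasch_scores))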
A Method for Automated Detection of Usability Problems from Client User Interface Events
Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.
2005-01-01
Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher-order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121
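A toy sketch of step (3): replay time-stamped interface events, group them into subgoals, and flag a usability problem when the time spent on a subgoal exceeds an expected bound scaled by the think-aloud correction factor. The schema, thresholds, and factor value here are hypothetical, not those of the published system:

    from dataclasses import dataclass

    @dataclass
    class Event:
        t: float        # seconds since session start
        subgoal: str    # higher-order subgoal this event belongs to

    TAU_CORRECTION = 1.3    # thinking aloud slows users down; scale bounds up
    EXPECTED = {"select-slide": 8.0, "annotate-region": 20.0}

    def flag_problems(events):
        spans = {}
        for e in events:
            lo, hi = spans.get(e.subgoal, (e.t, e.t))
            spans[e.subgoal] = (min(lo, e.t), max(hi, e.t))
        return [g for g, (lo, hi) in spans.items()
                if hi - lo > EXPECTED.get(g, float("inf")) * TAU_CORRECTION]

    log = [Event(0.0, "select-slide"), Event(14.0, "select-slide"),
           Event(15.0, "annotate-region"), Event(30.0, "annotate-region")]
    print(flag_problems(log))   # -> ['select-slide']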
MOVING BEYOND COLOR: THE CASE FOR MULTISPECTRAL IMAGING IN BRIGHTFIELD PATHOLOGY
Cukierski, William J.; Qi, Xin; Foran, David J.
2009-01-01
A multispectral camera is capable of imaging a histologic slide at narrow bandwidths over the range of the visible spectrum. While several uses for multispectral imaging (MSI) have been demonstrated in pathology [1, 2], there is no unified consensus over when and how MSI might benefit automated analysis [3, 4]. In this work, we use a linear-algebra framework to investigate the relationship between the spectral image and its standard-image counterpart. The multispectral “cube” is treated as an extension of a traditional image in a high-dimensional color space. The concept of metamers is introduced and used to derive regions of the visible spectrum where MSI may provide an advantage. Furthermore, histological stains which are amenable to analysis by MSI are reported. We show the Commission internationale de l’éclairage (CIE) 1931 transformation from spectrum to color is non-neighborhood preserving. Empirical results are demonstrated on multispectral images of peripheral blood smears. PMID:19997528
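The metamer argument is linear algebra: color formation is a map XYZ = A s from a high-dimensional spectrum s to three numbers, so any spectral difference lying in the null space of A (a "metameric black") is invisible to a color camera but visible to a multispectral one. In the demonstration below, Gaussian curves stand in for the actual CIE color matching functions:

    import numpy as np

    wl = np.linspace(400, 700, 31)                 # 10 nm bands
    g = lambda mu, sd: np.exp(-0.5 * ((wl - mu) / sd) ** 2)
    A = np.vstack([g(600, 40), g(550, 40), g(450, 30)])   # stand-in CMFs (3 x 31)

    s1 = g(560, 60) + 0.2                          # a smooth "stain" spectrum
    # Project a random perturbation onto the null space of A.
    rng = np.random.default_rng(5)
    d = rng.standard_normal(wl.size)
    d -= np.linalg.pinv(A) @ (A @ d)               # now A @ d is ~0
    s2 = s1 + 0.1 * d / np.abs(d).max()            # different spectrum, same color

    print("color difference:   ", np.linalg.norm(A @ s1 - A @ s2))  # ~0
    print("spectral difference:", np.linalg.norm(s1 - s2))          # clearly > 0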
NASA Astrophysics Data System (ADS)
Alves, L. G. A.; Ribeiro, H. V.; Lenzi, E. K.; Mendes, R. S.
2014-09-01
We report on the existing connection between power-law distributions and allometries. As first reported in Gomez-Lievano et al. (2012) for the relationship between homicides and population, when these urban indicators present asymptotic power-law distributions, they can also display specific allometries among themselves. Here, we present an extensive characterization of this connection for all possible pairs of relationships among twelve urban indicators of Brazilian cities (such as child labor, illiteracy, income, sanitation and unemployment). Our analysis reveals that all of our urban indicators are asymptotically distributed as power laws and that the proposed connection also holds for our data when the allometric relationship displays sufficient correlation. We have also found that not all allometric relationships are independent: they can be understood as a consequence of the allometric relationship between each urban indicator and the population size. We further show that the residual fluctuations surrounding the allometries are characterized by an almost constant variance and log-normal distributions.
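The allometric fit itself is a log-log regression: an indicator Y scaling with population N as Y ≈ A N^b becomes linear after taking logarithms, and the residuals around the fit are what exhibit the near-constant variance and log-normality reported above. A short sketch with synthetic city data:

    import numpy as np

    rng = np.random.default_rng(11)
    N = 10 ** rng.uniform(4, 7, 500)                           # city populations
    Y = 0.002 * N ** 1.15 * np.exp(rng.normal(0, 0.3, 500))    # synthetic indicator

    b, logA = np.polyfit(np.log10(N), np.log10(Y), 1)
    resid = np.log10(Y) - (logA + b * np.log10(N))

    print(f"scaling exponent b = {b:.3f}")             # ~1.15, i.e. superlinear
    print(f"residual std       = {resid.std():.3f}")   # normal in log space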
Cerebral cartography and connectomics.
Sporns, Olaf
2015-05-19
Cerebral cartography and connectomics pursue similar goals in attempting to create maps that can inform our understanding of the structural and functional organization of the cortex. Connectome maps explicitly aim at representing the brain as a complex network, a collection of nodes and their interconnecting edges. This article reflects on some of the challenges that currently arise in the intersection of cerebral cartography and connectomics. Principal challenges concern the temporal dynamics of functional brain connectivity, the definition of areal parcellations and their hierarchical organization into large-scale networks, the extension of whole-brain connectivity to cellular-scale networks, and the mapping of structure/function relations in empirical recordings and computational models. Successfully addressing these challenges will require extensions of methods and tools from network science to the mapping and analysis of human brain connectivity data. The emerging view that the brain is more than a collection of areas and fundamentally operates as a complex networked system will continue to drive the creation of ever more detailed and multi-modal network maps as tools for ongoing exploration and discovery in human connectomics. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Ternès, Nils; Rotolo, Federico; Michiels, Stefan
2016-07-10
Correct selection of prognostic biomarkers among multiple candidates is becoming increasingly challenging as the dimensionality of biological data becomes higher. Therefore, minimizing the false discovery rate (FDR) is of primary importance, while a low false negative rate (FNR) is a complementary measure. The lasso is a popular selection method in Cox regression, but its results depend heavily on the penalty parameter λ. Usually, λ is chosen using maximum cross-validated log-likelihood (max-cvl). However, this method often has a very high FDR. We review methods for a more conservative choice of λ. We propose an empirical extension of the cvl by adding a penalization term, which trades off between the goodness-of-fit and the parsimony of the model, leading to the selection of fewer biomarkers and, as we show, to a reduction of the FDR without a large increase in the FNR. We conducted a simulation study considering null and moderately sparse alternative scenarios and compared our approach with the standard lasso and 10 other competitors: Akaike information criterion (AIC), corrected AIC, Bayesian information criterion (BIC), extended BIC, Hannan and Quinn information criterion (HQIC), risk information criterion (RIC), one-standard-error rule, adaptive lasso, stability selection, and percentile lasso. Our extension achieved the best compromise across all the scenarios between a reduction of the FDR and a limited rise in the FNR, followed by the AIC, the RIC, and the adaptive lasso, which performed well in some settings. We illustrate the methods using gene expression data of 523 breast cancer patients. In conclusion, we propose to apply our extension to the lasso whenever a stringent FDR with a limited FNR is targeted. Copyright © 2016 John Wiley & Sons, Ltd.
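The proposed λ rule can be sketched generically: over a grid of λ values, given the cross-validated log-likelihood cvl(λ) and the number of selected biomarkers df(λ), choose the λ maximizing a penalized criterion rather than cvl alone. The penalty form κ·df(λ) below is an illustrative assumption; the paper's exact penalization term is not reproduced in the abstract.

```python
import numpy as np

def choose_lambda(lambdas, cvl, df, kappa=1.0):
    """Pick lambda maximizing cvl(lambda) - kappa * df(lambda).

    lambdas, cvl, df: 1-D arrays over the penalty grid, where df is the
    number of biomarkers with nonzero coefficients at each lambda.
    kappa=0 recovers the usual max-cvl rule; larger kappa trades
    goodness-of-fit for parsimony (fewer biomarkers, lower FDR).
    """
    lambdas, cvl, df = map(np.asarray, (lambdas, cvl, df))
    return lambdas[np.argmax(cvl - kappa * df)]

# Toy grid: max-cvl picks a dense model, the penalized rule a sparser one.
lam = np.array([0.01, 0.05, 0.1, 0.2, 0.5])
cvl = np.array([-100.0, -99.0, -98.5, -99.5, -103.0])
df  = np.array([40, 20, 10, 4, 0])
print(choose_lambda(lam, cvl, df, kappa=0.0))   # 0.1  (max-cvl)
print(choose_lambda(lam, cvl, df, kappa=0.5))   # 0.2  (penalized, sparser)
```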
A discrete element method-based approach to predict the breakage of coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, Varun; Sun, Xin; Xu, Wei
Pulverization is an essential pre-combustion technique employed for solid fuels, such as coal, to reduce particle sizes. Smaller particles ensure rapid and complete combustion, leading to low carbon emissions. Traditionally, the resulting particle size distributions from pulverizers have been determined by empirical or semi-empirical approaches that rely on extensive data gathered over several decades during operations or experiments, with limited predictive capabilities for new coals and processes. Our work presents a Discrete Element Method (DEM)-based computational approach to model coal particle breakage with experimentally characterized coal physical properties. We also examined the effect of select operating parameters on the breakage behavior of coal particles.
A discrete element method-based approach to predict the breakage of coal
Gupta, Varun; Sun, Xin; Xu, Wei; ...
2017-08-05
Pulverization is an essential pre-combustion technique employed for solid fuels, such as coal, to reduce particle sizes. Smaller particles ensure rapid and complete combustion, leading to low carbon emissions. Traditionally, the resulting particle size distributions from pulverizers have been determined by empirical or semi-empirical approaches that rely on extensive data gathered over several decades during operations or experiments, with limited predictive capabilities for new coals and processes. Our work presents a Discrete Element Method (DEM)-based computational approach to model coal particle breakage with experimentally characterized coal physical properties. We also examined the effect of select operating parameters on the breakage behavior of coal particles.
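DEM comminution models commonly decide, contact by contact, whether a particle breaks by comparing a characteristic contact stress against a size-dependent strength. The sketch below shows only that generic criterion with invented parameter values; it is not the authors' model, whose contact laws and calibrated coal properties are not given in the abstract.

```python
import numpy as np

def breaks(force_N, diameter_m, sigma0=1e6, d0=0.01, m=0.5):
    """Toy breakage criterion: characteristic stress F/d^2 exceeds a
    strength that decreases with particle size (Weibull-like scaling).
    All parameter values are illustrative assumptions."""
    stress = force_N / diameter_m ** 2
    strength = sigma0 * (diameter_m / d0) ** (-m)
    return stress > strength

rng = np.random.default_rng(0)
d = rng.uniform(0.002, 0.02, 10_000)            # parent particle sizes, m
F = rng.lognormal(np.log(50), 1.0, 10_000)      # impact contact forces, N
print(f"fraction broken this pass: {breaks(F, d).mean():.2f}")
```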
Scarbecz, Mark
2004-11-01
Despite some important differences, relationships among dental team members bear striking similarities to marital relationships. Empirical research on marital interaction can be useful in enhancing relationships among dental team members. As with marriage, it is unrealistic to expect that conflict and differences of opinion will never occur among dental team members. However, a set of principles derived from extensive, empirical, behavioral science research on marital interaction can provide dental teams with strategies for strengthening working relationships and managing conflict. Benefits of using these principles may include a reduction in employee turnover, improvements in efficiency and productivity, and the creation of an environment that helps attract and retain patients.
Delay and Probability Discounting in Humans: An Overview
ERIC Educational Resources Information Center
McKerchar, Todd L.; Renda, C. Renee
2012-01-01
The purpose of this review is to introduce the reader to the concepts of delay and probability discounting as well as the major empirical findings to emerge from research with humans on these concepts. First, we review a seminal discounting study by Rachlin, Raineri, and Cross (1991) as well as an influential extension of this study by Madden,…
ERIC Educational Resources Information Center
Horner, Robert H.; Kincaid, Donald; Sugai, George; Lewis, Timothy; Eber, Lucille; Barrett, Susan; Dickey, Celeste Rossetto; Richter, Mary; Sullivan, Erin; Boezio, Cyndi; Algozzine, Bob; Reynolds, Heather; Johnson, Nanci
2014-01-01
Scaling of evidence-based practices in education has received extensive discussion but little empirical evaluation. We present here a descriptive summary of the experience from seven states with a history of implementing and scaling School-Wide Positive Behavioral Interventions and Supports (SWPBIS) over the past decade. Each state has been…
Leadership Matters: Teachers' Roles in School Decision Making and School Performance
ERIC Educational Resources Information Center
Ingersoll, Richard M.; Sirinides, Philip; Dougherty, Patrick
2018-01-01
Given the prominence of both instructional leadership and teacher leadership in the realms of school reform and policy, not surprisingly, both have also been the focus of extensive empirical research. But there have been limits to this research. It is, for example, unclear which of the many key elements of instructional leadership are more, or…
ERIC Educational Resources Information Center
Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David
2007-01-01
Most universities have invested in extensive infrastructure in the form of computer laboratories and computer kiosks. However, is this investment justified when it is suggested that students work predominantly from home using their own computers? This paper provides an empirical study investigating how students at a regional multi-campus…
ERIC Educational Resources Information Center
Lin, Yu-Wei; Zini, Enrico
2008-01-01
This empirical paper shows how free/libre open source software (FLOSS) contributes to mutual and collaborative learning in an educational environment. Unlike proprietary software, FLOSS allows extensive customisation of software to support the needs of local users better. This also allows users to participate more proactively in the development…
Putting Children Front and Center: Building Coordinated Social Policy for America's Children
ERIC Educational Resources Information Center
Zaff, Jonathan F.; Smerdon, Becky
2009-01-01
In this article, we argue that policymakers in America should reference a coherent, comprehensive, and child-centered framework for children. That is, based on an extensive review of the empirical literature on the first two decades of life, we conclude that policies should address the needs of young people throughout the first two decades of…
ERIC Educational Resources Information Center
Nesset, Valerie
2015-01-01
Introduction: As part of a larger study in 2006 of the information-seeking behaviour of third-grade students in Montreal, Quebec, Canada, a model of their information-seeking behaviour was developed. To further improve the model, an extensive examination of the literature into information-seeking behaviour and information literacy was conducted…
Future Orientation, School Contexts, and Problem Behaviors: A Multilevel Study
ERIC Educational Resources Information Center
Chen, Pan; Vazsonyi, Alexander T.
2013-01-01
The association between future orientation and problem behaviors has received extensive empirical attention; however, previous work has not considered school contextual influences on this link. Using a sample of N = 9,163 9th to 12th graders (51.0% females) from N = 85 high schools of the National Longitudinal Study of Adolescent Health, the…
ERIC Educational Resources Information Center
Lim, Lois; Oei, Adam C.
2015-01-01
Despite the widespread use of Orton-Gillingham (OG) based approaches to dyslexia remediation, empirical support documenting its effectiveness is lacking. Recently, Chia and Houghton demonstrated the effectiveness of the OG approach for remediation of dyslexia in Singapore. As a conceptual replication and extension of that research, we report…
An Empirical Study on Behavioural Intention to Reuse E-Learning Systems in Rural China
ERIC Educational Resources Information Center
Li, Yan; Duan, Yanqing; Fu, Zetian; Alford, Philip
2012-01-01
The learner's acceptance of e-learning systems has received extensive attention in prior studies, but how their experience of using e-learning systems impacts on their behavioural intention to reuse those systems has attracted limited research. As the applications of e-learning are still gaining momentum in developing countries, such as China,…
What We Are Learning about How the Brain Learns-Implications for the Use of Video in the Classroom.
ERIC Educational Resources Information Center
Davidson, Tom; McKenzie, Barbara K.
2000-01-01
Describes empirical research in the fields of neurology and cognitive science that is being conducted to determine how and why the brain learns. Explains ways that video is compatible with how the brain learns and suggests it should be used more extensively by teachers and library media specialists. (LRW)
Worldwide Ocean Optics Database (WOOD)
2002-09-30
Computed results will be derived through empirical algorithms (e.g., beam attenuation estimated from diffuse attenuation and backscatter data), and error estimates will be provided for the computed results. The database holds optical properties including diffuse attenuation, beam attenuation, and scattering. Data from ONR-funded bio-optical cruises will be given priority for loading.
ERIC Educational Resources Information Center
Hofman, Roelande H.; de Boom, Jan; Meeuwisse, Marieke; Hofman, W. H. Adriaan
2013-01-01
Despite the extensive literature on educational innovations, only limited empirical research is available on the impact of innovations on student achievement. In this article, the following research questions will be answered: What form do innovations in secondary education take, are there types of innovative schools, and what effect do…
An Empirical Look at Recipient Benefits Associated with a University-Issued Student Leadership Award
ERIC Educational Resources Information Center
Adams, Robyn L.
2012-01-01
Within academia there is an elaborate and extensive system of awards for both students and faculty (Frey, 2006). Although the majority of student-based awards are for outstanding leadership and related accomplishments, there has been virtually no research on the impact of receiving such a leadership award (Frey, 2006). Due to the conspicuous…
The Educational Use of Social Annotation Tools in Higher Education: A Literature Review
ERIC Educational Resources Information Center
Novak, Elena; Razzouk, Rim; Johnson, Tristan E.
2012-01-01
This paper presents a literature review of empirical research related to the use and effect of online social annotation (SA) tools in higher education settings. SA technology is an emerging educational technology that has not yet been extensively used and examined in education. As such, the research focusing on this technology is still very…
ERIC Educational Resources Information Center
Cheung, Ronnie; Vogel, Doug
2013-01-01
Collaborative technologies support group work in project-based environments. In this study, we enhance the technology acceptance model to explain the factors that influence the acceptance of Google Applications for collaborative learning. The enhanced model was empirically evaluated using survey data collected from 136 students enrolled in a…
Equality of Opportunity and Equality of Outcome
ERIC Educational Resources Information Center
Kodelja, Zdenko
2016-01-01
The report on the findings of extensive empirical research on equality of educational opportunities carried out in the United States on a very large sample of public schools by Coleman and his colleagues has had a major impact on education policy and has given rise to a large amount of research and various interpretations. However, as some…
An Automated Individual Feedback and Marking System: An Empirical Study
ERIC Educational Resources Information Center
Barker, Trevor
2011-01-01
The recent National Students Survey showed that feedback to students was an ongoing problem in Higher Education. This paper reports on the extension of our past research into the provision of automated feedback for objective testing. In the research presented here, the system has been further developed for marking practical and essay questions and…
ERIC Educational Resources Information Center
Inoue, Chihiro
2016-01-01
The constructs of complexity, accuracy and fluency (CAF) have been used extensively to investigate learner performance on second language tasks. However, a serious concern is that the variables used to measure these constructs are sometimes used conventionally without any empirical justification. It is crucial for researchers to understand how…
Strategic Planning, Recasts, Noticing, and L2 Development
ERIC Educational Resources Information Center
Hama, Mika
2012-01-01
Since the mid-1990s, the link between recasts and L2 development has been extensively tested, and the results from those studies have largely demonstrated that recasts have a positive effect on L2 learning. With this firm support from previous empirical evidence, studies have begun to focus on how recasts assist learning and under what conditions…
ERIC Educational Resources Information Center
Rimpiläinen, Sanna
2015-01-01
What do different research methods and approaches "do" in practice? The article seeks to discuss this point by drawing upon socio-material research approaches and empirical examples taken from the early stages of an extensive case study on an interdisciplinary project between two multidisciplinary fields of study, education and computer…
Modeling wildland fire propagation with level set methods
V. Mallet; D.E Keyes; F.E. Fendell
2009-01-01
Level set methods are versatile and extensible techniques for general front tracking problems, including the practically important problem of predicting the advance of a fire front across expanses of surface vegetation. Given a rule, empirical or otherwise, to specify the rate of advance of an infinitesimal segment of fire front arc normal to itself (i.e., given the...
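A minimal sketch of the level set idea on a grid, assuming a constant normal rate of spread R (real fuel, wind, and slope dependence would replace the constant): the fire front is the zero contour of φ, advanced by ∂φ/∂t + R|∇φ| = 0 with a first-order Godunov upwind scheme.

```python
import numpy as np

n, h, R, dt = 128, 1.0, 1.0, 0.4           # grid size, spacing, spread rate, CFL-safe step
x = np.arange(n) * h
X, Y = np.meshgrid(x, x)
phi = np.hypot(X - 40, Y - 40) - 5.0        # signed distance to an ignition circle

def step(phi):
    # One-sided differences; Godunov upwinding for an outward-moving front.
    dxm = (phi - np.roll(phi, 1, 1)) / h
    dxp = (np.roll(phi, -1, 1) - phi) / h
    dym = (phi - np.roll(phi, 1, 0)) / h
    dyp = (np.roll(phi, -1, 0) - phi) / h
    grad = np.sqrt(np.maximum(dxm, 0) ** 2 + np.minimum(dxp, 0) ** 2
                 + np.maximum(dym, 0) ** 2 + np.minimum(dyp, 0) ** 2)
    return phi - dt * R * grad

for _ in range(50):
    phi = step(phi)
burned = phi < 0                             # region behind the fire front
print(f"burned fraction of domain: {burned.mean():.3f}")
```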
The Role of Key Qualifications in the Transition from Vocational Education to Work
ERIC Educational Resources Information Center
van Zolingen, S. J.
2002-01-01
This study presents a new definition of key qualifications related to occupations based on an extensive literature search. The empirical aspect of this study describes a Delphi study focused on policy where a number of key qualifications were operationalized for three selected jobs: commercial employee at a bank, claims assessor or acceptor at an…
Negotiating complementary and alternative medicine use in primary care visits with older patients
Koenig, Christopher J.; Ho, Evelyn Y.; Yadegar, Vivien; Tarn, Derjung M.
2013-01-01
Objective To empirically investigate the ways in which patients and providers discuss Complementary and Alternative Medicine (CAM) treatment in primary care visits. Methods Audio recordings from visits between 256 adult patients aged 50 years and older and 28 primary care physicians were transcribed and analyzed using discourse analysis, an empirical sociolinguistic methodology focusing on how language is used to negotiate meaning. Results Discussion about CAM occurred 128 times in 82 of 256 visits (32.0%). The most frequently discussed CAM modalities were non-vitamin, non-mineral supplements and massage. Three physician–patient interactions were analyzed turn-by-turn to demonstrate negotiations about CAM use. Patients raised CAM discussions to seek physician expertise about treatments, and physicians adopted a range of responses along a continuum that included encouragement, neutrality, and discouragement. Despite differential knowledge about CAM treatments, physicians helped patients assess the risks and benefits of CAM treatments and made recommendations based on patient preferences for treatment. Conclusion Regardless of a physician's stance or knowledge about CAM, she or he can help patients negotiate CAM treatment decisions. Practice implications Providers do not have to possess extensive knowledge about specific CAM treatments to have meaningful discussions with patients and to give patients a framework for evaluating CAM treatment use. PMID:22483672
Negotiating complementary and alternative medicine use in primary care visits with older patients.
Koenig, Christopher J; Ho, Evelyn Y; Yadegar, Vivien; Tarn, Derjung M
2012-12-01
To empirically investigate the ways in which patients and providers discuss Complementary and Alternative Medicine (CAM) treatment in primary care visits. Audio recordings from visits between 256 adult patients aged 50 years and older and 28 primary care physicians were transcribed and analyzed using discourse analysis, an empirical sociolinguistic methodology focusing on how language is used to negotiate meaning. Discussion about CAM occurred 128 times in 82 of 256 visits (32.0%). The most frequently discussed CAM modalities were non-vitamin, non-mineral supplements and massage. Three physician-patient interactions were analyzed turn-by-turn to demonstrate negotiations about CAM use. Patients raised CAM discussions to seek physician expertise about treatments, and physicians adopted a range of responses along a continuum that included encouragement, neutrality, and discouragement. Despite differential knowledge about CAM treatments, physicians helped patients assess the risks and benefits of CAM treatments and made recommendations based on patient preferences for treatment. Regardless of a physician's stance or knowledge about CAM, she or he can help patients negotiate CAM treatment decisions. Providers do not have to possess extensive knowledge about specific CAM treatments to have meaningful discussions with patients and to give patients a framework for evaluating CAM treatment use. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
A Study Space Analysis and Narrative Review of Trauma-Informed Mediators of Dating Violence.
Cascardi, Michele; Jouriles, Ernest N
2018-07-01
Research linking child maltreatment and dating violence in adolescence and emerging adulthood has proliferated in the past two decades; however, the precise mechanisms by which these experiences are related remain elusive. A trauma-informed perspective suggests four particularly promising mediators: maladaptive attachment, emotion regulation difficulties, emotional distress, and hostility. The current article characterizes the status of the empirical literature examining these four mediators using a study space analysis and a narrative review of existing research. An extensive literature search identified 42 papers (44 studies) that met the following criteria: (1) at least one measure of child maltreatment (emotional, physical, sexual, neglect, or exposure to intimate partner violence); (2) a measure of one of the four mediator variables; (3) a measure of dating violence perpetration or victimization; and (4) a sample of adolescents or young adults. The study space analysis suggested several important observations about the research on this topic, including a dearth of studies examining hostility as a mediator and little research using prospective designs or clinical samples. There are also limitations with the conceptualization and measurement of dating violence, child maltreatment, and some of the mediator variables. In addition, few studies examined more than one mediator variable in the same study. The narrative review suggested that maladaptive attachment (specifically insecure attachment styles), emotion regulation difficulties (specifically regulation of the emotion of anger), and emotional distress construed broadly represent promising mediators of the association between child maltreatment and dating violence, but conclusions about mediation must remain tentative given the state of the literature. The discussion offers recommendations for improved theoretical and empirical rigor to advance future research on mechanisms linking child maltreatment and dating violence.
How GPs value guidelines applied to patients with multimorbidity: a qualitative study
Luijks, Hilde; Lucassen, Peter; van Weel, Chris; Loeffen, Maartje; Lagro-Janssen, Antoine; Schermer, Tjard
2015-01-01
Objectives To explore and describe the value general practitioners (GPs) attribute to medical guidelines when they are applied to patients with multimorbidity, and to describe which benefits GPs experience from guideline adherence in these patients. Also, we aimed to identify limitations from guideline adherence in patients with multimorbidity, as perceived by GPs, and to describe their empirical solutions to manage these obstacles. Design Focus group study with purposive sampling of participants. Focus groups were guided by an experienced moderator who used an interview guide. Interviews were transcribed verbatim. Data analysis was performed by two researchers using the constant comparison analysis technique and field notes were used in the analysis. Data collection proceeded until saturation was reached. Setting Primary care, eastern part of The Netherlands. Participants Dutch GPs, heterogeneous in age, sex and academic involvement. Results 25 GPs participated in five focus groups. GPs valued the guidance that guidelines provide, but experienced shortcomings when they were applied to patients with multimorbidity. Taking these patients’ personal circumstances into account was regarded as important, but it was impeded by a consistent focus on guideline adherence. Preventative measures were considered less appropriate in (elderly) patients with multimorbidity. Moreover, the applicability of guidelines in patients with multimorbidity was questioned. GPs’ extensive practical experience with managing multimorbidity resulted in several empirical solutions, for example, using their ‘common sense’ to respond to the perceived shortcomings. Conclusions GPs applying guidelines for patients with multimorbidity integrate patient-specific factors in their medical decisions, aiming for patient-centred solutions. Such integration of clinical experience and best evidence is required to practise evidence-based medicine. More flexibility in pay-for-performance systems is needed to facilitate this integration. Several improvements in guideline reporting are necessary to enhance the applicability of guidelines in patients with multimorbidity. PMID:26503382
Six sigma critical success factors in manufacturing industries
NASA Astrophysics Data System (ADS)
Mustafa, Zainol; Jamaluddin, Z.
2017-04-01
The success of Six Sigma implementation is known to depend on a number of contributing factors. The purpose of this paper is to explore Six Sigma critical success factors (CSFs) in the context of Malaysian manufacturing organizations. Although Six Sigma success factors have been abundantly researched in the global context, in this paper a maiden attempt is made to identify, through an extensive literature review, the CSFs for Six Sigma implementation, followed by their validation using primary data collected from Malaysian manufacturing companies. A total of 33 indicators were compiled through the literature review and then grouped into 6 contributing factors. These contributing success factors were then validated through an empirical study of selected Malaysian manufacturing companies at various stages of implementing the Six Sigma process improvement methodology. There has been an overemphasis on the role and commitment of management in the success of a Six Sigma program. While that role is undoubtedly important, certain other factors play an equally important part in ensuring that Six Sigma programs succeed. The factor analysis of CSFs for the Malaysian manufacturing organizations selected in this study shows that the top factor is a composite one, combining the ability of project teams to apply process management to quality initiatives with training in proper problem-solving analysis. The CSFs extracted through the factor analysis could provide a basis for manufacturing organizations embarking on the Six Sigma journey to look beyond management involvement alone, developing an integrated framework of the other factors outlined here and giving them appropriate priority and focus.
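The grouping of survey indicators into latent success factors is typically done with exploratory factor analysis. A minimal sketch with scikit-learn on synthetic survey responses, matching the study's dimensions (33 items, 6 factors); the actual survey data are not available here.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_resp, n_items, n_factors = 200, 33, 6

# Synthetic Likert-style responses driven by 6 latent factors.
loadings = rng.normal(0, 1, (n_items, n_factors))
scores = rng.normal(0, 1, (n_resp, n_factors))
X = scores @ loadings.T + rng.normal(0, 0.5, (n_resp, n_items))

fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
fa.fit(X)
# Items loading most strongly on each extracted factor:
for k in range(n_factors):
    top = np.argsort(-np.abs(fa.components_[k]))[:3]
    print(f"factor {k}: strongest items {top.tolist()}")
```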
Analysis of copper and brass coins of the early Roman Empire.
Carter, G F
1966-01-14
X-ray fluorescence analysis of 14 copper and brass coins of the early Roman Empire shows differences in composition between coins minted in Rome and in France. Concentrations of tin, lead, and antimony are nearly always less than in coins minted before 29 B.C. or after 54 A.D. Older coins were not melted to make copper coins of the early empire.
Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D
2013-01-01
Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, the Significance Analysis of Microarrays method's fold-change criteria are problematic, and can critically alter the conclusion of a study, as a result of compositional changes of the control data set in the analysis. We propose a novel approach, combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold-change threshold since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between each approach are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next generation sequencing RNA-seq data analysis.
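The two ingredients named in the abstract can be sketched together: an empirical Bayes moderated statistic (per-gene variance shrunk toward a pooled prior, in the spirit of Smyth's method) whose null distribution comes from resampling group labels rather than a normality assumption. This is only the general pattern, not the authors' specified procedure.

```python
import numpy as np

def moderated_t(x, y, s0_sq, d0=4.0):
    """Empirical Bayes moderated t per gene: shrink variances toward s0_sq."""
    nx, ny = x.shape[1], y.shape[1]
    sp_sq = ((nx - 1) * x.var(1, ddof=1) + (ny - 1) * y.var(1, ddof=1)) / (nx + ny - 2)
    s_tilde = (d0 * s0_sq + (nx + ny - 2) * sp_sq) / (d0 + nx + ny - 2)
    return (x.mean(1) - y.mean(1)) / np.sqrt(s_tilde * (1 / nx + 1 / ny))

rng = np.random.default_rng(0)
G, nx, ny = 1000, 5, 5
data = rng.standard_t(df=3, size=(G, nx + ny))     # heavy-tailed, non-normal
data[:50, :nx] += 2.0                              # 50 truly differential genes

s0 = np.median(data.var(1, ddof=1))                # pooled prior variance
t_obs = moderated_t(data[:, :nx], data[:, nx:], s0)

# Null distribution by resampling (permuting) group labels: no normality assumed.
null = np.concatenate([
    moderated_t(data[:, p[:nx]], data[:, p[nx:]], s0)
    for p in (rng.permutation(nx + ny) for _ in range(200))
])
null = np.sort(np.abs(null))
pvals = 1.0 - np.searchsorted(null, np.abs(t_obs)) / null.size
print(f"genes called at p < 0.01: {(pvals < 0.01).sum()}")
```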
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smyth, Padhraic
2013-07-22
This is the final report for a DOE-funded research project describing the outcome of research on non-homogeneous hidden Markov models (NHMMs) and coupled ocean-atmosphere (O-A) intermediate-complexity models (ICMs) to identify the potentially predictable modes of climate variability, and to investigate their impacts on the regional scale. The main results consist of extensive development of the hidden Markov models for rainfall simulation and downscaling specifically within the non-stationary climate change context together with the development of parallelized software; application of NHMMs to downscaling of rainfall projections over India; identification and analysis of decadal climate signals in data and models; and studies of climate variability in terms of the dynamics of atmospheric flow regimes.
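Non-homogeneous HMMs are not covered by off-the-shelf libraries such as hmmlearn. The defining idea, sketched below with a toy two-state (dry/wet) rainfall model, is that the transition probabilities are modulated at each time step by exogenous predictors (here a single synthetic circulation index) through a logistic link. The model form and all parameters are illustrative assumptions, not the project's actual NHMM.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 365
index = np.sin(2 * np.pi * np.arange(T) / 365)    # synthetic atmospheric predictor

def p_wet_given(prev_state, z, b0=-1.0, b1=2.0, persist=1.5):
    """Logistic (non-homogeneous) transition to 'wet': depends on the
    predictor z at time t and on persistence of the previous state."""
    logit = b0 + b1 * z + persist * prev_state
    return 1.0 / (1.0 + np.exp(-logit))

state = np.zeros(T, dtype=int)                     # 0 = dry, 1 = wet
rain = np.zeros(T)
for t in range(1, T):
    state[t] = rng.random() < p_wet_given(state[t - 1], index[t])
    if state[t]:
        rain[t] = rng.gamma(shape=2.0, scale=3.0)  # wet-day rainfall amount
print(f"wet-day fraction: {state.mean():.2f}, mean daily rain: {rain.mean():.2f} mm")
```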
Beyond upgrading typologies - In search of a better deal for honey value chains in Brazil.
Figueiredo Junior, Hugo S de; Meuwissen, Miranda P M; van der Lans, Ivo A; Oude Lansink, Alfons G J M
2017-01-01
Selection of value chain strategies by development practitioners and value chain participants themselves has been restricted to preset types of upgrading. This paper argues for an extension of the range of strategy solutions to value chains. An empirical application identifies successful strategies for honey value chains in Brazil for 2015-2020. Strategy and performance indicators were selected using the value chain Structure-Conduct-Performance (SCP) framework. Experts' opinion was elicited in a Delphi for business scenarios, and adaptive conjoint analysis was used to identify strategies for increasing production growth and local value-added. This study identifies important strategies beyond upgrading typologies, and finds that important strategies differ by performance goal and scenario. The value chain SCP allows searching for promising strategies towards performance-the "better deal"-in an integrated way.
Beyond upgrading typologies – In search of a better deal for honey value chains in Brazil
Meuwissen, Miranda P. M.; van der Lans, Ivo A.; Oude Lansink, Alfons G. J. M.
2017-01-01
Selection of value chain strategies by development practitioners and value chain participants themselves has been restricted to preset types of upgrading. This paper argues for an extension of the range of strategy solutions to value chains. An empirical application identifies successful strategies for honey value chains in Brazil for 2015–2020. Strategy and performance indicators were selected using the value chain Structure-Conduct-Performance (SCP) framework. Experts’ opinion was elicited in a Delphi for business scenarios, and adaptive conjoint analysis was used to identify strategies for increasing production growth and local value-added. This study identifies important strategies beyond upgrading typologies, and finds that important strategies differ by performance goal and scenario. The value chain SCP allows searching for promising strategies towards performance–the “better deal”–in an integrated way. PMID:28742804
Empirically derived pain-patient MMPI subgroups: prediction of treatment outcome.
Moore, J E; Armentrout, D P; Parker, J C; Kivlahan, D R
1986-02-01
Fifty-seven male chronic pain patients admitted to an inpatient multimodal pain treatment program at a Midwestern Veterans Administration hospital completed the MMPI, Profile of Mood States (POMS), Tennessee Self-Concept Scale (TSCS), Rathus Assertiveness Schedule (RAS), activity diaries, and an extensive pain questionnaire. All patients were assessed both before and after treatment, and most also were assessed 2-5 months prior to treatment. No significant changes occurred during the baseline period, but significant improvements were evident at posttreatment on most variables: MMPI, POMS, TSCS, RAS, pain severity, sexual functioning, and activity diaries. MMPI subgroup membership, based on a hierarchical cluster analysis in a larger sample, was not predictive of differential treatment outcome. Possible reasons for comparable treatment gains among these subgroups, which previously have been shown to differ on many psychological and behavioral factors, are discussed.
An Index and Test of Linear Moderated Mediation.
Hayes, Andrew F
2015-01-01
I describe a test of linear moderated mediation in path analysis based on an interval estimate of the parameter of a function linking the indirect effect to values of a moderator, a parameter that I call the index of moderated mediation. This test can be used for models that integrate moderation and mediation in which the relationship between the indirect effect and the moderator is estimated as linear, including many of the models described by Edwards and Lambert (2007) and Preacher, Rucker, and Hayes (2007) as well as extensions of these models to processes involving multiple mediators operating in parallel or in serial. Generalization of the method to latent variable models is straightforward. Three empirical examples describe the computation of the index and the test, and its implementation is illustrated using Mplus and the PROCESS macro for SPSS and SAS.
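For the simplest first-stage moderated mediation model, M = a0 + a1·X + a2·W + a3·XW and Y = b0 + c·X + b1·M + b2·W, the indirect effect of X at moderator value W is (a1 + a3·W)·b1, a linear function of W whose slope a3·b1 is the index of moderated mediation; a bootstrap interval for that slope provides the test. A minimal sketch on synthetic data (Mplus and PROCESS are the implementations named in the abstract; this is not their code):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
X, W = rng.normal(size=n), rng.normal(size=n)
M = 0.5 * X + 0.2 * W + 0.4 * X * W + rng.normal(size=n)   # true a3 = 0.4
Y = 0.3 * X + 0.6 * M + 0.1 * W + rng.normal(size=n)       # true b1 = 0.6

def index_of_modmed(idx):
    x, w, m, y = X[idx], W[idx], M[idx], Y[idx]
    Zm = np.column_stack([np.ones_like(x), x, w, x * w])
    a = np.linalg.lstsq(Zm, m, rcond=None)[0]               # a0, a1, a2, a3
    Zy = np.column_stack([np.ones_like(x), x, m, w])
    b = np.linalg.lstsq(Zy, y, rcond=None)[0]               # b0, c, b1, b2
    return a[3] * b[2]                                      # index = a3 * b1

boot = [index_of_modmed(rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"index = {index_of_modmed(np.arange(n)):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

If the interval excludes zero, the indirect effect depends linearly on the moderator, which is exactly what the test is probing.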
Growth Type and Functional Trajectories: An Empirical Study of Urban Expansion in Nanjing, China
Yuan, Feng
2016-01-01
Drawing upon the Landsat satellite images of Nanjing from 1985, 1995, 2001, 2007, and 2013, this paper integrates the convex hull analysis and common edge analysis at double scales, and develops a comprehensive matrix analysis to distinguish the different types of urban land expansion. The results show that Nanjing experienced rapid urban expansion, dominated by a mix of residential and manufacturing land from 1985 to 2013, which in turn has promoted Nanjing’s shift from a compact mononuclear city to a polycentric one. Spatial patterns of three specific types of growth, namely infilling, extension, and enclave were quite different in four consecutive periods. These patterns result primarily from the existing topographic constraints, as well as government-oriented urban planning and policies. By intersecting the function maps, we also reveal the functional evolution of newly-developed urban land. Moreover, both self-enhancing and mutual promotion of the newly developed functions are surveyed over the last decade. Our study confirms that the integration of a multi-scale method and multi-perspective analysis, such as the spatiotemporal patterns and functional evolution, helps us to better understand the rapid urban growth in China. PMID:26845155
Data Mining for Anomaly Detection
NASA Technical Reports Server (NTRS)
Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj
2013-01-01
The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to the current state-of-the-art Aircraft Diagnostic and Maintenance System (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations, where there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, the detection and analysis is split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in the contexts of supervised and unsupervised learning. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and CiDM distance measure provides the best results for capturing relevant properties of time series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.
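Compression-based approximations to Kolmogorov complexity follow a standard pattern. The report's DZIP/CiDM combination is not a stock library, so the sketch below substitutes zlib and the closely related normalized compression distance, NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)), to compare two serialized time-series segments. Treat it as the general pattern, not the VIPR implementation.

```python
import zlib

def C(b: bytes) -> int:
    """Approximate Kolmogorov complexity by compressed length."""
    return len(zlib.compress(b, level=9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for similar, near 1 for unrelated."""
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy flight-segment-like byte sequences: two from the same regime, one anomalous.
a = bytes(100 + (i % 7) for i in range(2000))
b = bytes(100 + ((i + 3) % 7) for i in range(2000))
c = bytes((i * i) % 251 for i in range(2000))
print(f"ncd(a, b) = {ncd(a, b):.2f}")   # small: same nominal regime
print(f"ncd(a, c) = {ncd(a, c):.2f}")   # large: candidate anomaly
```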
An operational GLS model for hydrologic regression
Tasker, Gary D.; Stedinger, J.R.
1989-01-01
Recent Monte Carlo studies have documented the value of generalized least squares (GLS) procedures to estimate empirical relationships between streamflow statistics and physiographic basin characteristics. This paper presents a number of extensions of the GLS method that deal with realities and complexities of regional hydrologic data sets that were not addressed in the simulation studies. These extensions include: (1) a more realistic model of the underlying model errors; (2) smoothed estimates of cross correlation of flows; (3) procedures for including historical flow data; (4) diagnostic statistics describing leverage and influence for GLS regression; and (5) the formulation of a mathematical program for evaluating future gaging activities. © 1989.
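The GLS estimator underlying such regional regressions is β̂ = (XᵀΛ⁻¹X)⁻¹XᵀΛ⁻¹y, where Λ combines model error with sampling error, including cross correlation of concurrent flows. A minimal sketch with statsmodels on synthetic basin data; the covariance structure below is a placeholder, not the paper's operational model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 40                                            # gaged basins
area = rng.lognormal(4, 1, n)                     # drainage area, km^2
X = sm.add_constant(np.log(area))                 # log-linear regression design
beta_true = np.array([1.0, 0.8])

# Placeholder error covariance: diagonal model error plus correlated
# sampling error decaying with "distance" between basins.
Lam = 0.2 * np.eye(n) + 0.05 * np.exp(
    -np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 5)
y = X @ beta_true + rng.multivariate_normal(np.zeros(n), Lam)

gls = sm.GLS(y, X, sigma=Lam).fit()               # Lam supplied as error covariance
ols = sm.OLS(y, X).fit()
print("GLS:", gls.params, "OLS:", ols.params)
```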
An Investigation of Document Partitions.
ERIC Educational Resources Information Center
Shaw, W. M., Jr.
1986-01-01
Empirical significance of document partitions is investigated as a function of index term-weight and similarity thresholds. Results show the same empirically preferred partitions can be detected by two independent strategies: an analysis of cluster-based retrieval and an analysis of regularities in the underlying structure of the document…
Regional Morphology Empirical Analysis Package (RMAP): Orthogonal Function Analysis, Background and Examples
2007-10-01
ERDC TN-SWWRP-07-9. Only reference fragments of this record survive extraction; they cite a 1984 paper, "Complex principal component analysis: Theory and examples" (Journal of Climate and Applied Meteorology 23: 1660-1673), Hotelling (1933), and Von Storch and Navarra (1995), "Analysis of climate variability: Applications of statistical techniques" (Berlin).
Dai, Hongying; Wu, Guodong; Wu, Michael; Zhi, Degui
2016-01-01
Next-generation sequencing data pose a severe curse of dimensionality, complicating traditional "single marker-single trait" analysis. We propose a two-stage combined p-value method for pathway analysis. The first stage is at the gene level, where we integrate effects within a gene using the Sequence Kernel Association Test (SKAT). The second stage is at the pathway level, where we perform a correlated Lancaster procedure to detect joint effects from multiple genes within a pathway. We show that the Lancaster procedure is optimal in Bahadur efficiency among all combined p-value methods. The Bahadur efficiency, [Formula: see text], compares sample sizes among different statistical tests when signals become sparse in sequencing data, i.e., ε → 0. The optimal Bahadur efficiency ensures that the Lancaster procedure asymptotically requires a minimal sample size to detect sparse signals ([Formula: see text]). The Lancaster procedure can also be applied to meta-analysis. Extensive empirical assessments of exome sequencing data show that the proposed method outperforms Gene Set Enrichment Analysis (GSEA). We applied the competitive Lancaster procedure to meta-analysis data generated by the Global Lipids Genetics Consortium to identify pathways significantly associated with high-density lipoprotein cholesterol, low-density lipoprotein cholesterol, triglycerides, and total cholesterol.
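Lancaster's procedure generalizes Fisher's method by giving gene i a weight w_i (Fisher is the special case w_i = 2 for all i): each p-value is transformed with the inverse chi-square CDF with w_i degrees of freedom and the transforms are summed, T = Σ_i F⁻¹_{χ²(w_i)}(1 − p_i), which is χ² with Σ w_i degrees of freedom under independence. The sketch below shows only the independent case; the correlation adjustment of the paper's correlated Lancaster procedure is omitted.

```python
import numpy as np
from scipy.stats import chi2

def lancaster(pvals, weights):
    """Combine independent p-values with Lancaster's weighted chi-square method."""
    pvals, weights = np.asarray(pvals), np.asarray(weights)
    T = chi2.ppf(1.0 - pvals, df=weights).sum()
    return chi2.sf(T, df=weights.sum())          # combined pathway-level p-value

# Gene-level SKAT p-values in one pathway, weighted e.g. by gene size.
p_genes = [0.04, 0.20, 0.01, 0.55]
w_genes = [2.0, 2.0, 4.0, 2.0]
print(f"Lancaster pathway p = {lancaster(p_genes, w_genes):.4f}")
print(f"Fisher (all w = 2)  = {lancaster(p_genes, [2.0] * 4):.4f}")
```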
Nonlinear dynamical modes of climate variability: from curves to manifolds
NASA Astrophysics Data System (ADS)
Gavrilov, Andrey; Mukhin, Dmitry; Loskutov, Evgeny; Feigin, Alexander
2016-04-01
The necessity of efficient dimensionality reduction methods capturing dynamical properties of the system from observed data is evident. A recent study shows that nonlinear dynamical mode (NDM) expansion is able to solve this problem and provide adequate phase variables in climate data analysis [1]. A single NDM is a logical extension of a linear spatio-temporal structure (like an empirical orthogonal function pattern): it is constructed as a nonlinear transformation of a hidden scalar time series to the space of observed variables, i.e., a projection of the observed dataset onto a nonlinear curve. Both the hidden time series and the parameters of the curve are learned simultaneously using a Bayesian approach. The only prior information about the hidden signal is the assumption of its smoothness. The optimal nonlinearity degree and smoothness are found using the Bayesian evidence technique. In this work we extend this approach and look for vector hidden signals instead of scalar ones, with the same smoothness restriction. As a result we resolve multidimensional manifolds instead of sums of curves. The dimension of the hidden manifold is also optimized using Bayesian evidence. The efficiency of the extension is demonstrated on model examples. Results of application to climate data are demonstrated and discussed. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS). 1. Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510
Toward a Principled Sampling Theory for Quasi-Orders
Ünlü, Ali; Schrepp, Martin
2016-01-01
Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601
Toward a Principled Sampling Theory for Quasi-Orders.
Ünlü, Ali; Schrepp, Martin
2016-01-01
Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, on even up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.
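The inner-level correction step can be illustrated with the most naive repair: taking the reflexive-transitive (Warshall) closure of a randomly perturbed relation. As the abstract stresses, closure-based repairs introduce exactly the sampling biases the authors' three correction algorithms are designed to remove, so this sketch shows the problem setup rather than their solution.

```python
import numpy as np

def transitive_closure(R):
    """Warshall closure: smallest transitive relation containing R."""
    R = R.copy()
    for k in range(len(R)):
        R |= np.outer(R[:, k], R[k, :])   # add i->j whenever i->k and k->j
    return R

rng = np.random.default_rng(0)
n_items = 6
R = rng.random((n_items, n_items)) < 0.25   # random binary relation
np.fill_diagonal(R, True)                   # enforce reflexivity
Q = transitive_closure(R)                   # now a quasi-order

# The bias at issue: many distinct random relations R collapse to the same
# closure Q, so closures are NOT uniformly distributed over all quasi-orders.
print(Q.astype(int))
```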
NASA Technical Reports Server (NTRS)
Bergrun, N. R.
1951-01-01
An empirical method for the determination of the area, rate, and distribution of water-drop impingement on airfoils of arbitrary section is presented. The procedure represents an initial step toward the development of a method which is generally applicable in the design of thermal ice-prevention equipment for airplane wing and tail surfaces. Results given by the proposed empirical method are expected to be sufficiently accurate for the purpose of heated-wing design, and can be obtained from a few numerical computations once the velocity distribution over the airfoil has been determined. The empirical method presented for incompressible flow is based on results of extensive water-drop trajectory computations for five airfoil cases which consisted of 15-percent-thick airfoils encompassing a moderate lift-coefficient range. The differential equations pertaining to the paths of the drops were solved by a differential analyzer. The method developed for incompressible flow is extended to the calculation of area and rate of impingement on straight wings in subsonic compressible flow to indicate the probable effects of compressibility for airfoils at low subsonic Mach numbers.
Critical Realism and Empirical Bioethics: A Methodological Exposition.
McKeown, Alex
2017-09-01
This paper shows how critical realism can be used to integrate empirical data and philosophical analysis within 'empirical bioethics'. The term empirical bioethics, whilst appearing oxymoronic, simply refers to an interdisciplinary approach to the resolution of practical ethical issues within the biological and life sciences, integrating social scientific, empirical data with philosophical analysis. It seeks to achieve a balanced form of ethical deliberation that is both logically rigorous and sensitive to context, to generate normative conclusions that are practically applicable to the problem, challenge, or dilemma. Since it incorporates both philosophical and social scientific components, empirical bioethics is a field that is consistent with the use of critical realism as a research methodology. The integration of philosophical and social scientific approaches to ethics has been beset with difficulties, not least because of the irreducibly normative, rather than descriptive, nature of ethical analysis and the contested relation between fact and value. However, given that facts about states of affairs inform potential courses of action and their consequences, there is a need to overcome these difficulties and successfully integrate data with theory. Previous approaches have been formulated to overcome obstacles in combining philosophical and social scientific perspectives in bioethical analysis; however each has shortcomings. As a mature interdisciplinary approach critical realism is well suited to empirical bioethics, although it has hitherto not been widely used. Here I show how it can be applied to this kind of research and explain how it represents an improvement on previous approaches.
The ancient city of Rome, its empire, and the spread of tuberculosis in Europe.
Eddy, Jared J
2015-06-01
The formation of the Roman Empire constituted an unprecedented joining of Mediterranean and European lands and peoples, centering on the capital of Rome. During the late Roman Republic and early Roman Empire (ca. 200 B.C. to ca. 200 A.D.) urbanization and population growth led to conditions favorable to the spread of tuberculosis throughout Italy and especially within Rome itself. Trade and military expansion would have acted as vehicles for the further extension of tuberculosis to the provinces via direct transmission from Italian-born Romans to the native populations. However, an alternative explanation may better explain the increase in the number of archeological cases of tuberculosis with the start of the Roman era. A literature review of Roman-era cases and their locations suggests that the development of an urban, Roman way of life resulted in significant increases in prevalence in regions where tuberculosis had previously been endemic only at a low level. Copyright © 2015 Elsevier Ltd. All rights reserved.
Modern Experience in City Combat
1987-03-01
Extensive media coverage of Beirut served to erode both domestic and international support. Surprise. As in military operations on other terrain... social reasons which constrain military actions to some degree. At the same time an empirical study has little difficulty in distinguishing in most... [Only fragments of this record survive extraction; the remaining citation fragments reference a Bureau of Applied Social Research, Columbia University (1953) report, "Korean urbanization: Past development and future potentials."]
Does Grade Inflation Affect the Credibility of Grades? Evidence from US Law School Admissions
ERIC Educational Resources Information Center
Wongsurawat, Winai
2009-01-01
While the nature and causes of university grade inflation have been extensively studied, little empirical research on the consequence of this phenomenon is currently available. The present study uses data for 48 US law schools to analyze admission decisions in 1995, 2000, and 2007, a period during which university grade inflation appears to have…
ERIC Educational Resources Information Center
Taylor, Lauren J.; Maybery, Murray T.; Wray, John; Ravine, David; Hunt, Anna; Whitehouse, Andrew J. O.
2013-01-01
Extensive empirical evidence indicates that the lesser variant of Autism Spectrum Disorders (ASD) involves a communication impairment that is similar to, but milder than, the deficit in clinical ASD. This research explored the relationship between the broader autism phenotype (BAP) among parents, an index of genetic liability for ASD, and proband…
The Role of Status in Producing Depressed Entitlement in Women's and Men's Pay Allocations
ERIC Educational Resources Information Center
Hogue, Mary; Yoder, Janice D.
2003-01-01
Extensive empirical evidence confirms a depressed entitlement effect wherein women pay themselves less than men for comparable work and believe the allocation fair. The present study tests the hypothesis that status subordination linked to being female underlies at least some of this effect. A 2 x 3 design crossed 180 undergraduates' gender with a…
ERIC Educational Resources Information Center
Polit, Denise; And Others
To expand the use of women in nontraditional industrial careers, the U.S. Air Force examined the questions of recruiting, selecting, and training women for traditionally male blue collar work. An extensive review of the literature revealed that little empirical data on the effectiveness of various administrative policies had been collected. The…
ERIC Educational Resources Information Center
Gilmore, Linda; Cuskelly, Monica; Browning, Melissa
2015-01-01
The main purpose of the current study was to provide empirical evidence to support or refute assumptions of phenotypic deficits in motivation for children with Down syndrome (DS). Children with moderate intellectual disability (MID) associated with etiologies other than DS were recruited in an extension of a previous study that involved children…
Anthony H. Conner; Melissa S. Reeves
2001-01-01
Computational chemistry methods can be used to explore the theoretical chemistry behind reactive systems, to compare the relative chemical reactivity of different systems, and, by extension, to predict the reactivity of new systems. Ongoing research has focused on the reactivity of a wide variety of phenolic compounds with formaldehyde using semi-empirical and ab...
ERIC Educational Resources Information Center
Shupe, Ellen I.; Pung, Stephanie K.
2011-01-01
Although issues related to the role of librarians have long been discussed in the literature on academic librarianship, there has been little attempt to incorporate the extensive psychological theory and research on role-related issues. In the current article we review the empirical literature on the role of librarians, with a particular focus on…
ERIC Educational Resources Information Center
Mak, Jennifer Y.; Cheung, Siu-Yin; King, Carina C.; Lam, Eddie T. C.
2016-01-01
There have been extensive studies of local residents' perception and reaction to the impacts of mega events. However, there is limited empirical research on the social impacts that shape foreign attitudes toward the host country. The purpose of this study was to develop and validate the Olympic Games Attitude Scale (OGAS) to examine viewers'…
ERIC Educational Resources Information Center
Lloyd, Eva; Edmonds, Casey; Downs, Celony; Crutchley, Rebecca; Paffard, Fran
2017-01-01
The acquisition of everyday scientific concepts by 3-6-year-old children attending early childhood institutions has been widely studied. In contrast, research on science learning processes among younger children is less extensive. This paper reports on findings from an exploratory empirical study undertaken in a "stay and play" service…
ERIC Educational Resources Information Center
Dymond, Simon; Alonso-Alvarez, Benigno
2010-01-01
In a recent article, Schlinger (2008) marked the 50th anniversary of the publication of Skinner's "Verbal Behavior" (1957) by considering its impact on the field of behaviorism and research on verbal behavior. In the present article, we comment on Schlinger's conclusions regarding the impact of the book and highlight the extensions and…
NASA Technical Reports Server (NTRS)
Morris, Carl N.
1987-01-01
Motivated by the LANDSAT problem of estimating the probability of crop or geological types based on multi-channel satellite imagery data, Morris and Kostal (1983), Hill, Hinkley, Kostal, and Morris (1984), and Morris, Hinkley, and Johnston (1985) developed an empirical Bayes approach to this problem. Here, researchers return to those developments, making certain improvements and extensions, but restricting attention to the binary case of only two attributes.
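For the binary case, the generic empirical Bayes pattern is to fit a Beta prior across regions and shrink each region's raw attribute rate toward the pooled estimate. The sketch below is a standard method-of-moments beta-binomial illustration on synthetic data, not the cited authors' procedure, whose details are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_pix = 30, 50

# Synthetic: each region has its own true probability of the attribute "crop".
p_true = rng.beta(4, 6, n_regions)
counts = rng.binomial(n_pix, p_true)

# Method-of-moments empirical Bayes fit of a Beta(a, b) prior across regions.
phat = counts / n_pix
mu, var = phat.mean(), phat.var(ddof=1)
ab_sum = mu * (1 - mu) / max(var - mu * (1 - mu) / n_pix, 1e-9) - 1
a, b = mu * ab_sum, (1 - mu) * ab_sum

# Posterior-mean estimates shrink each region's raw rate toward the pool.
p_eb = (counts + a) / (n_pix + a + b)
print(f"raw MSE: {((phat - p_true) ** 2).mean():.4f}, "
      f"EB MSE: {((p_eb - p_true) ** 2).mean():.4f}")
```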
ERIC Educational Resources Information Center
Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Martin, Toby L.; Julio, Flavia
2012-01-01
Mario Bunge is one of the most prolific philosophers of our time. Over the past sixty years he has written extensively about semantics, ontology, epistemology, philosophy of science and ethics. Bunge has been interested in the philosophical and methodological implications of modern psychology and more specifically in the philosophies of the…
Good Practices for Learning to Recognize Actions Using FV and VLAD.
Wu, Jianxin; Zhang, Yu; Lin, Weiyao
2016-12-01
High dimensional representations such as Fisher vectors (FV) and vectors of locally aggregated descriptors (VLAD) have shown state-of-the-art accuracy for action recognition in videos. The high dimensionality, on the other hand, also causes computational difficulties when scaling up to large-scale video data. This paper makes three lines of contributions to learning to recognize actions using high dimensional representations. First, we reviewed several existing techniques that improve upon FV or VLAD in image classification, and performed extensive empirical evaluations to assess their applicability for action recognition. Our analyses of these empirical results show that normality and bimodality are essential to achieve high accuracy. Second, we proposed a new pooling strategy for VLAD and three simple, efficient, and effective transformations for both FV and VLAD. Both proposed methods have shown higher accuracy than the original FV/VLAD method in extensive evaluations. Third, we proposed and evaluated new feature selection and compression methods for the FV and VLAD representations. This strategy uses only 4% of the storage of the original representation, but achieves comparable or even higher accuracy. Based on these contributions, we recommend a set of good practices for action recognition in videos for practitioners in this field.
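Two of the good practices named here, making the representation closer to normal and less bursty, are commonly realized with signed-square-root power normalization followed by L2 normalization. A minimal sketch of that standard transform (the paper's own proposed transformations are not reproduced in the abstract):

```python
import numpy as np

def power_l2_normalize(v, alpha=0.5):
    """Signed power normalization then L2: reduces burstiness in FV/VLAD."""
    v = np.sign(v) * np.abs(v) ** alpha
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

rng = np.random.default_rng(0)
# Sparse, heavy-tailed vector mimicking an FV/VLAD encoding.
fv = rng.laplace(size=8192) * rng.binomial(1, 0.1, 8192)
fv_hat = power_l2_normalize(fv)

def kurtosis(x):
    return ((x - x.mean()) ** 4).mean() / x.var() ** 2

print(f"kurtosis before: {kurtosis(fv):.1f}")   # far from normal (3.0)
print(f"kurtosis after:  {kurtosis(fv_hat):.1f}")  # closer to normal
```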
[The neurodynamic core of consciousness and neural Darwinism].
Ibáñez, A
In recent decades, the scientific study of consciousness has become one of the greatest challenges of the cognitive neurosciences and of contemporary science. Gerald Edelman's theory of consciousness is one of the most promising and controversial perspectives. The theory stands out for its approach to topics usually rejected by other neurophysiological theories of consciousness, such as the neurophysiological explanation of qualia. The goal of this paper is to review the dynamic core theory of consciousness: we present the main features of the theory, analyze its explanatory strategies and empirical extensions, and offer some critical considerations on the possibility of a neuroscientific study of qualia. The central and additional theoretical components are analyzed, with emphasis on their ontological, restrictive and explanatory assumptions. The properties of conscious phenomena and their cerebral correlates, as advanced by the theory, are described, and its experiments and empirical extensions are examined. The explanatory strategies of the theory rest on a conceptual isomorphism between the phenomenological properties and the neurophysiological and mathematical measures. Some criticisms can be raised about the limitations of the dynamic core theory, especially regarding its account of the so-called 'hard problem' of consciousness, or qualia.
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario-based or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models, rather than on the empirical attenuation relationships used in PSHA to determine ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where sufficient tsunami runup data are available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
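For the region-wide case, the Monte Carlo logic can be sketched as follows; the source model, scaling relation, and event rate are placeholder assumptions standing in for the numerical propagation modelling the paper describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_runup(n_events):
    # Hypothetical source model: magnitudes from a truncated
    # Gutenberg-Richter law; runup from a placeholder scaling relation
    # (a numerical propagation model would be used in practice).
    u = rng.random(n_events)
    b, m_min, m_max = 1.0, 7.0, 9.0
    mags = m_min - np.log10(1 - u * (1 - 10 ** (-b * (m_max - m_min)))) / b
    return 10 ** (0.6 * (mags - 7.0)) * rng.lognormal(0.0, 0.5, n_events)

annual_rate = 0.1                 # assumed rate of tsunamigenic events per year
runups = simulate_runup(100_000)
levels = np.linspace(0.5, 10.0, 20)
# hazard curve: annual rate of exceeding each runup level
for h in levels:
    rate = annual_rate * (runups > h).mean()
    print(f"runup > {h:4.1f} m : {rate:.2e} /yr")
```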
Sensitivity Analysis of Empirical Results on Civil War Onset
ERIC Educational Resources Information Center
Hegre, Havard; Sambanis, Nicholas
2006-01-01
In the literature on civil war onset, several empirical results are not robust or replicable across studies. Studies use different definitions of civil war and analyze different time periods, so readers cannot easily determine if differences in empirical results are due to those factors or if most empirical results are just not robust. The authors…
Pieterse, Arwen H; de Vries, Marieke
2013-09-01
Background: Increasingly, patient decision aids and values clarification methods (VCMs) are being developed to support patients in making preference-sensitive health-care decisions. Many VCMs encourage extensive deliberation about options, without solid theoretical or empirical evidence showing that deliberation is advantageous. Research suggests that simple, fast and frugal heuristic decision strategies sometimes result in better judgments and decisions. Durand et al. have developed two fast and frugal heuristic-based VCMs. Objective: To critically analyse the suitability of the 'take the best' (TTB) and 'tallying' fast and frugal heuristics in the context of patient decision making. Strategy: Analysis of the structural similarities between the environments in which the TTB and tallying heuristics have been proven successful and the context of patient decision making, and of the potential of these heuristic decision processes to support patient decision making. Conclusion: The specific nature of patient preference-sensitive decision making does not seem to resemble environments in which the TTB and tallying heuristics have proven successful. Encouraging patients to consider less rather than more relevant information potentially even deteriorates their values clarification process. Values clarification methods promoting the use of more intuitive decision strategies may sometimes be more effective. Nevertheless, we strongly recommend further theoretical thinking about the expected value of such heuristics and of other more intuitive decision strategies in this context, as well as empirical assessments of the mechanisms by which inducing such decision strategies may impact the quality and outcome of values clarification. © 2011 John Wiley & Sons Ltd.
On the Accuracy of Probabilistic Buckling Load Prediction
NASA Technical Reports Server (NTRS)
Arbocz, Johann; Starnes, James H.; Nemeth, Michael P.
2001-01-01
The buckling strength of thin-walled stiffened or unstiffened, metallic or composite shells is of major concern in aeronautical and space applications. The difficulty of predicting the behavior of axially compressed thin-walled cylindrical shells continues to worry design engineers as we enter the third millennium. Thanks to extensive research programs in the late sixties and early seventies, and to the contributions of many eminent scientists, it is known that buckling strength calculations are affected by uncertainties in the definition of the parameters of the problem, such as the definition of loads, material properties, geometric variables and edge support conditions, and by the accuracy of the engineering models and analysis tools used in the design phase. The NASA design criteria monographs from the late sixties account for these design uncertainties by the use of a lump-sum safety factor. This so-called 'empirical knockdown factor gamma' usually results in overly conservative designs. Recently, new reliability-based probabilistic design procedures for buckling-critical imperfect shells have been proposed. They essentially consist of a stochastic approach that introduces an improved 'scientific knockdown factor lambda(sub a)', which is not as conservative as the traditional empirical one. In order to incorporate probabilistic methods into a high-fidelity analysis approach, one must be able to assess the accuracy of the various steps that must be executed to complete a reliability calculation. In the present paper, the effect of the size of the experimental input sample on the predicted value of the scientific knockdown factor lambda(sub a), calculated by the First-Order, Second-Moment method, is investigated.
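A minimal sketch of a First-Order, Second-Moment style estimate of a knockdown factor from a small experimental sample is given below; the sample size, target reliability index, and distribution parameters are all hypothetical, not values from the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical experimental buckling loads, normalized by the
# perfect-shell prediction (the random knockdown Lambda).
rng = np.random.default_rng(1)
sample = rng.normal(loc=0.7, scale=0.08, size=12)   # small test sample

mu, sigma = sample.mean(), sample.std(ddof=1)
beta_target = 3.0    # assumed target reliability index
# FOSM-style estimate of the "scientific" knockdown factor: the load
# level whose reliability index equals beta_target
lambda_a = mu - beta_target * sigma
print(f"mean={mu:.3f}, std={sigma:.3f}, lambda_a={lambda_a:.3f}")
print(f"implied failure probability ~ {stats.norm.cdf(-beta_target):.1e}")
```

Re-running this with different sample sizes shows how strongly the estimated lambda(sub a) depends on the size of the experimental input sample, which is the question the paper investigates.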
Clearance of the cervical spine in clinically unevaluable trauma patients.
Halpern, Casey H; Milby, Andrew H; Guo, Wensheng; Schuster, James M; Gracias, Vicente H; Stein, Sherman C
2010-08-15
Meta-analytic cost-effectiveness analysis. Our goal was to compare the results of different management strategies for trauma patients in whom the cervical spine was not clinically evaluable due to impaired consciousness, endotracheal intubation, or painful distracting injuries. We performed a structured literature review related to cervical spine trauma, radiographic clearance techniques (plain radiography, flexion/extension, CT, and MRI), and complications associated with semirigid collar use. Meta-analytic techniques were used to pool data from multiple sources to calculate pooled mean estimates of the sensitivities and specificities of imaging techniques for cervical spinal clearance, and of the rates of complications from various clearance strategies and from empirical use of semirigid collars. A decision analysis model was used to compare outcomes and costs among these strategies. Slightly more than 7.5% of patients who are clinically unevaluable have cervical spine injuries, and 42% of these injuries are associated with spinal instability. The sensitivity of plain radiography or fluoroscopy for spinal clearance was 57% (95% CI: 57%-60%). Sensitivities for CT and MRI alone were 83% (82%-84%) and 87% (84%-89%), respectively. Complications associated with collar use ranged from 1.3% (2 days) to 7.1% (10 days) but were usually minor and short-lived. Quadriplegia resulting from spinal instability missed by a clearance test had enormous impacts on longevity, quality of life, and costs. These impacts overshadowed the effects of prolonged collar application, even when the incidence of quadriplegia was extremely low. As currently used, neuroimaging studies for cervical spinal clearance in clinically unevaluable patients are not cost-effective compared with empirical immobilization in a semirigid collar.
NASA Astrophysics Data System (ADS)
Stewart, Gavin B.; Pullin, Andrew S.; Tyler, Claire
2007-11-01
Bracken (Pteridium aquilinum) is a major problem for livestock-based extensive agriculture, conservation, recreation, and game management globally. It is an invasive species, often achieving dominance to the detriment of other species. Control is essential to maintain plant communities such as grassland and lowland heath, or if extensive grazing by domestic stock, particularly sheep, is to be viable on upland margins. Bracken is managed primarily by herbicide application or cutting, but other techniques including rolling, burning, and grazing are also utilized. Here we evaluate the evidence regarding the effectiveness of asulam for the control of bracken. Thirteen studies provided data for meta-analyses, which demonstrate that application of the herbicide asulam reduces bracken abundance. Subgroup analyses indicate that the number of treatments had an important impact, with multiple follow-up treatments more effective than one or two treatments. Management practices should reflect the requirement for repeated follow-up. There is insufficient available experimental evidence for quantitative analysis of the effectiveness of other management interventions; for cutting, and for comparisons of cutting with asulam application, this results from a lack of reporting in the relevant papers. Systematic searching and meta-analytical synthesis have effectively demonstrated the limits of current knowledge based on recorded empirical evidence, and they strengthen the call for more rigorous monitoring of bracken control techniques. The lack of experimental evidence on the effectiveness of management such as rolling or grazing with hardy cattle breeds contrasts with the widespread acceptance of their use through dissemination of experience.
Stochastic Geometric Network Models for Groups of Functional and Structural Connectomes
Friedman, Eric J.; Landsberg, Adam S.; Owen, Julia P.; Li, Yi-Ou; Mukherjee, Pratik
2014-01-01
Structural and functional connectomes are emerging as important instruments in the study of normal brain function and in the development of new biomarkers for a variety of brain disorders. In contrast to single-network studies that presently dominate the (non-connectome) network literature, connectome analyses typically examine groups of empirical networks and then compare these against standard (stochastic) network models. Current practice in connectome studies is to employ stochastic network models derived from social science and engineering contexts as the basis for the comparison. However, these are not necessarily best suited for the analysis of connectomes, which often contain groups of very closely related networks, such as occurs with a set of controls or a set of patients with a specific disorder. This paper studies important extensions of standard stochastic models that make them better adapted for analysis of connectomes, and develops new statistical fitting methodologies that account for inter-subject variations. The extensions explicitly incorporate geometric information about a network based on distances and inter/intra hemispherical asymmetries (to supplement ordinary degree-distribution information), and utilize a stochastic choice of networks' density levels (for fixed threshold networks) to better capture the variance in average connectivity among subjects. The new statistical tools introduced here allow one to compare groups of networks by matching both their average characteristics and the variations among them. A notable finding is that connectomes have high “smallworldness” beyond that arising from geometric and degree considerations alone. PMID:25067815
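A hedged sketch of the kind of geometry-aware stochastic model described is given below, with a stochastic density level to mimic inter-subject variation in average connectivity; the exponential distance kernel and all parameters are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

rng = np.random.default_rng(2)

def geometric_random_network(coords, alpha=2.0, density_mu=0.1, density_sd=0.02):
    """Sample one network: edge probability decays with inter-node
    distance, and the overall density level is itself drawn at random
    to capture variance in average connectivity among subjects."""
    n = len(coords)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    weights = np.exp(-alpha * d)
    np.fill_diagonal(weights, 0.0)
    target_density = max(rng.normal(density_mu, density_sd), 0.01)
    # scale weights so the expected density matches the sampled level
    p = weights * (target_density * n * (n - 1) / weights.sum())
    adj = (rng.random((n, n)) < np.clip(p, 0, 1)).astype(int)
    adj = np.triu(adj, 1)
    return adj + adj.T          # symmetric, undirected network

coords = rng.random((64, 3))    # hypothetical node positions
nets = [geometric_random_network(coords) for _ in range(20)]
print("mean density:", np.mean([a.sum() / (64 * 63) for a in nets]))
```

Comparing observed connectomes against an ensemble generated this way separates small-world structure that is explained by geometry and density from structure that is not, which is the comparison the paper's finding rests on.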
NASA Astrophysics Data System (ADS)
Scheichl, B.; Kluwick, A.
2013-11-01
The classical analysis of turbulent boundary layers in the limit of large Reynolds number Re is characterised by an asymptotically small velocity defect with respect to the external irrotational flow. As an extension of the classical theory, it is shown in the present work that the defect may become moderately large and, in the most general case, independent of Re but still remain small compared to the external streamwise velocity for non-zero pressure gradient boundary layers. That wake-type flow turns out to be characterised by large values of the Rotta-Clauser parameter, serving as an appropriate measure for the defect and hence as a second perturbation parameter besides Re. Most important, it is demonstrated that also this case can be addressed by rigorous asymptotic analysis, which is essentially independent of the choice of a specific Reynolds stress closure. As a salient result of this procedure, transition from the classical small defect to a pronounced wake flow is found to be accompanied by quasi-equilibrium flow, described by a distinguished limit that involves the wall shear stress. This situation is associated with double-valued solutions of the boundary layer equations and an unconventional weak Re-dependence of the external bulk flow—a phenomenon seen to agree well with previous semi-empirical studies and early experimental observations. Numerical computations of the boundary layer flow for various values of Re reproduce these analytical findings with satisfactory agreement.
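For context, the Rotta-Clauser parameter mentioned above is conventionally defined as (a standard definition, not taken from this paper):

$$\beta = \frac{\delta^{*}}{\tau_{w}}\,\frac{\mathrm{d}p_{e}}{\mathrm{d}x},$$

where $\delta^{*}$ is the displacement thickness, $\tau_{w}$ the wall shear stress, and $\mathrm{d}p_{e}/\mathrm{d}x$ the external pressure gradient; large $\beta$ corresponds to a large velocity defect, which is why it can serve as the second perturbation parameter besides Re.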
Infliximab-Related Infusion Reactions: Systematic Review
Ron, Yulia; Kivity, Shmuel; Ben-Horin, Shomron; Israeli, Eran; Fraser, Gerald M.; Dotan, Iris; Chowers, Yehuda; Confino-Cohen, Ronit; Weiss, Batia
2015-01-01
Objective: Administration of infliximab is associated with a well-recognised risk of infusion reactions. The lack of a mechanism-based rationale for their prevention, and the absence of adequate and well-controlled studies, have led to the use of diverse empirical administration protocols. The aim of this study is to perform a systematic review of the evidence behind the strategies for preventing infusion reactions to infliximab, and for controlling the reactions once they occur. Methods: We conducted an extensive search of the MEDLINE [PubMed] electronic database for reports that communicate various aspects of infusion reactions to infliximab in IBD patients. Results: We examined full texts of 105 potentially eligible articles. No randomised controlled trials that pre-defined infusion reactions as a primary outcome were found. Three RCTs evaluated infusion reactions as a secondary outcome; another four RCTs included infusion reactions in the safety evaluation analysis; and 62 additional studies focused on various aspects of mechanisms, risk, primary and secondary preventive measures, and management algorithms. Seven studies were added by a manual search of the reference lists of the relevant articles. A total of 76 original studies were included in the quantitative analysis of the existing strategies. Conclusions: There is still a paucity of systematic and controlled data on the risk, prevention, and management of infusion reactions to infliximab. We present working algorithms based on a systematic and extensive review of the available data. More randomised controlled trials are needed in order to investigate the efficacy of the proposed preventive and management algorithms. PMID:26092578
Directional Migration of Recirculating Lymphocytes through Lymph Nodes via Random Walks
Thomas, Niclas; Matejovicova, Lenka; Srikusalanukul, Wichat; Shawe-Taylor, John; Chain, Benny
2012-01-01
Naive T lymphocytes exhibit extensive antigen-independent recirculation between blood and lymph nodes, where they may encounter dendritic cells carrying cognate antigen. We examine how long different T cells may spend in an individual lymph node by examining data from long-term cannulation of the blood and efferent lymphatics of a single lymph node in the sheep. We determine empirically the distribution of transit times of migrating T cells by applying the Least Absolute Shrinkage and Selection Operator (LASSO), or regularised L1 regression, to fit experimental data describing the proportion of labelled infused cells in blood and efferent lymphatics over time. The optimal inferred solution reveals a distribution with high variance and strong skew. The modal transit time is typically between 10 and 20 hours, but a significant number of cells spend more than 70 hours before exiting. We complement the empirical machine-learning-based approach by modelling lymphocyte passage through the lymph node in silico. On the basis of previous two-photon analysis of lymphocyte movement, we optimised distributions which describe the transit times (first passage times) of discrete one-dimensional and continuous (Brownian) three-dimensional random walks with drift. The optimal fit is obtained when drift is small, i.e. the ratio of probabilities of migrating forward and backward within the node is close to one. These distributions are qualitatively similar to the inferred empirical distribution, with high variance and strong skew. In contrast, an optimised normal distribution of transit times (symmetrical around the mean) fitted the data poorly. The results demonstrate that the rapid recirculation of lymphocytes observed at a macro level is compatible with predominantly randomised movement within lymph nodes, and with significant probabilities of long transit times. We discuss how this pattern of migration may contribute to facilitating interactions between low-frequency T cells and antigen-presenting cells carrying cognate antigen. PMID:23028891
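The random-walk component can be illustrated with a minimal simulation of first passage times for a discrete 1-D walk with small drift; the node length, step probability, and boundary treatment are hypothetical choices, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(3)

def first_passage_times(n_cells, max_steps, p_forward=0.52, length=30):
    """Simulate 1-D random walks with small drift and record the first
    time each walker reaches position `length` (an idealized transit
    through the node); entry boundary at 0 is reflecting."""
    times = []
    for _ in range(n_cells):
        pos, t = 0, 0
        while pos < length and t < max_steps:
            pos += 1 if rng.random() < p_forward else -1
            pos = max(pos, 0)
            t += 1
        if pos >= length:
            times.append(t)
    return np.array(times)

t = first_passage_times(2000, 100_000)
# first-passage distributions of weakly drifting walks are skewed:
# the mean sits well above the median
print(f"median={np.median(t):.0f} steps, mean={t.mean():.0f} steps")
```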
NASA Astrophysics Data System (ADS)
Chen, Yi-Feng; Atal, Kiran; Xie, Sheng-Quan; Liu, Quan
2017-08-01
Objective. Accurate and efficient detection of steady-state visual evoked potentials (SSVEP) in the electroencephalogram (EEG) is essential for the related brain-computer interface (BCI) applications. Approach. Although canonical correlation analysis (CCA) has been applied extensively and successfully to SSVEP recognition, the spontaneous EEG activities and artifacts that often occur during data recording can deteriorate the recognition performance. Therefore, it is meaningful to extract a few frequency sub-bands of interest to avoid or reduce the influence of unrelated brain activity and artifacts. This paper presents an improved method to detect the frequency component associated with SSVEP using multivariate empirical mode decomposition (MEMD) and CCA (MEMD-CCA). EEG signals from nine healthy volunteers were recorded to evaluate the performance of the proposed method for SSVEP recognition. Main results. We compared our method with CCA and the temporally local multivariate synchronization index (TMSI). The results suggest that MEMD-CCA achieved significantly higher accuracy than standard CCA and TMSI. It gave improvements of 1.34%, 3.11%, 3.33%, 10.45%, 15.78%, 18.45%, 15.00% and 14.22% on average over CCA at time windows from 0.5 s to 5 s, and of 0.55%, 1.56%, 7.78%, 14.67%, 13.67%, 7.33% and 7.78% over TMSI from 0.75 s to 5 s. The method also outperformed the filter-based decomposition (FB), empirical mode decomposition (EMD) and wavelet decomposition (WT) based CCA for SSVEP recognition. Significance. The results demonstrate the ability of the proposed MEMD-CCA to improve the performance of SSVEP-based BCI.
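For reference, the standard CCA baseline (not MEMD-CCA itself) can be sketched with scikit-learn; the sampling rate, epoch, and candidate frequencies below are placeholders.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def ssvep_cca_score(eeg, freq, fs, n_harmonics=2):
    """Canonical correlation between multichannel EEG (samples x channels)
    and sine/cosine reference signals at a candidate stimulus frequency."""
    t = np.arange(eeg.shape[0]) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs += [np.sin(2 * np.pi * h * freq * t),
                 np.cos(2 * np.pi * h * freq * t)]
    Y = np.column_stack(refs)
    u, v = CCA(n_components=1).fit_transform(eeg, Y)
    return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

# classify an epoch as the frequency with the largest canonical correlation
fs, candidates = 250.0, [8.0, 10.0, 12.0, 15.0]
eeg = np.random.randn(int(2 * fs), 8)     # placeholder 2-s, 8-channel epoch
best = max(candidates, key=lambda f: ssvep_cca_score(eeg, f, fs))
print("detected frequency:", best)
```

The MEMD variant would first decompose the multichannel EEG into common oscillatory modes and apply this scoring only to the sub-bands of interest.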
NASA Astrophysics Data System (ADS)
Van doninck, Jasper; Tuomisto, Hanna
2017-06-01
Biodiversity mapping in extensive tropical forest areas poses a major challenge for the interpretation of Landsat images, because floristically clearly distinct forest types may show little difference in reflectance. In such cases, the effects of the bidirectional reflectance distribution function (BRDF) can be sufficiently strong to cause erroneous image interpretation and classification. Since the opening of the Landsat archive in 2008, several BRDF normalization methods for Landsat have been developed. The simplest of these consist of an empirical view-angle normalization, whereas more complex approaches apply the semi-empirical Ross-Li BRDF model and the MODIS MCD43-series of products to normalize directional Landsat reflectance to standard view and solar angles. Here we quantify the effect of surface anisotropy on Landsat TM/ETM+ images over old-growth Amazonian forests and evaluate five angular normalization approaches. Even for the narrow swath of the Landsat sensors, we observed directional effects in all spectral bands. The normalization methods based on removing the surface reflectance gradient as observed in each image were adequate to normalize TM/ETM+ imagery to nadir viewing, but were less suitable for multitemporal analysis when the solar vector varied strongly among images. Approaches based on the MODIS BRDF model parameters successfully reduced directional effects in the visible bands, but removed only half of the systematic errors in the infrared bands. The best results were obtained when the semi-empirical BRDF model was calibrated using pairs of Landsat observations. This method produces a single set of BRDF parameters, which can then be used to operationally normalize Landsat TM/ETM+ imagery over Amazonian forests to nadir viewing and a standard solar configuration.
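The simplest approach the passage mentions, an empirical view-angle normalization, can be sketched as follows; the arrays are synthetic, and in practice the gradient would be fit to forest pixels only.

```python
import numpy as np

def normalize_to_nadir(reflectance, view_zenith):
    """Empirical view-angle normalization: fit reflectance as a linear
    function of view zenith angle across the scene and remove the fitted
    gradient, so every pixel refers to nadir viewing (angle = 0)."""
    r, vz = reflectance.ravel(), view_zenith.ravel()
    ok = np.isfinite(r) & np.isfinite(vz)
    slope, _ = np.polyfit(vz[ok], r[ok], 1)
    return reflectance - slope * view_zenith

# hypothetical band with a weak cross-track gradient (Landsat-like +-7.5 deg)
vza = np.tile(np.linspace(-7.5, 7.5, 100), (100, 1))
refl = 0.25 + 0.001 * vza + 0.01 * np.random.rand(100, 100)
print("std before:", refl.std(), "std after:", normalize_to_nadir(refl, vza).std())
```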
A discrete element method-based approach to predict the breakage of coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, Varun; Sun, Xin; Xu, Wei
Pulverization is an essential pre-combustion technique employed for solid fuels, such as coal, to reduce particle sizes. Smaller particles ensure rapid and complete combustion, leading to low carbon emissions. Traditionally, the resulting particle size distributions from pulverizers have been informed by empirical or semi-empirical approaches that rely on extensive data gathered over several decades during operations or experiments. However, the predictive capabilities for new coals and processes are limited. This work presents a Discrete Element Method based computational framework to predict the particle size distribution resulting from the breakage of coal particles, characterized by the coal's physical properties. The effect of certain operating parameters on the breakage behavior of coal particles is also examined.
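A full DEM simulation is beyond a short sketch, but the empirical population-balance alternative the passage contrasts it with can be illustrated; the selection and breakage values below are hypothetical, not fitted to any coal.

```python
import numpy as np

n = 5                                      # size classes, coarse -> fine
feed = np.array([0.5, 0.3, 0.1, 0.07, 0.03])

# Hypothetical selection function (fraction of each class broken per
# pass) and lower-triangular breakage matrix B[i, j]: the fraction of
# broken class-j mass reporting to finer class i.
S = np.array([0.6, 0.5, 0.4, 0.3, 0.0])
B = np.zeros((n, n))
for j in range(n - 1):
    frac = np.ones(n - j - 1) / (n - j - 1)   # uniform redistribution
    B[j + 1:, j] = frac

def one_pass(x):
    broken = S * x
    return x - broken + B @ broken            # mass balance per breakage event

x = feed.copy()
for _ in range(3):                            # three breakage passes
    x = one_pass(x)
print("product PSD:", np.round(x, 3), " total mass:", x.sum())
```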
Terror management theory applied clinically: implications for existential-integrative psychotherapy.
Lewis, Adam M
2014-01-01
Existential psychotherapy and Terror Management Theory (TMT) offer explanations for the potential psychological effects of death awareness, although their respective literature bases differ in clarity, research, and implications for treating psychopathology. Existential therapy is often opaque to many therapists, in part due to the lack of consensus on what constitutes its practice, the limited published practical examples, and the few empirical studies examining its efficacy. By contrast, TMT has an extensive empirical literature base, both within social psychology and spanning multiple disciplines, although it has previously been unexplored within clinical and counseling psychology. This article explores the implications of a proposed TMT-integrated existential therapy (TIE), bridging the gap between disciplines in order to meet the needs of the aging population and the current challenges facing existential therapists.
Empirical mode decomposition-based facial pose estimation inside video sequences
NASA Astrophysics Data System (ADS)
Qing, Chunmei; Jiang, Jianmin; Yang, Zhijing
2010-03-01
We describe a new pose-estimation algorithm that integrates the strengths of both empirical mode decomposition (EMD) and mutual information. While mutual information is exploited to measure the similarity between facial images in order to estimate poses, EMD is exploited to decompose input facial images into a number of intrinsic mode function (IMF) components, which redistribute the effects of noise, expression changes, and illumination variations in such a way that, when the input facial image is described by the selected IMF components, all of these negative effects can be minimized. Extensive experiments were carried out in comparison with existing representative techniques, and the results show that the proposed algorithm achieves better pose-estimation performance with robustness to noise corruption, illumination variation, and facial expressions.
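The mutual-information component can be sketched with a joint-histogram estimate; the bin count is an arbitrary choice.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information between two images from their joint gray-level
    histogram; higher values indicate greater similarity between images."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p = joint / joint.sum()                  # joint probability table
    px = p.sum(axis=1, keepdims=True)        # marginal of image A
    py = p.sum(axis=0, keepdims=True)        # marginal of image B
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(4)
a = rng.random((64, 64))
print(mutual_information(a, a))              # maximal for identical images
print(mutual_information(a, rng.random((64, 64))))  # near zero if independent
```

In the pose-estimation setting, this score would be computed between the IMF-based description of an input face and reference images at known poses, with the best-scoring pose selected.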
Filtration of human EEG recordings from physiological artifacts with empirical mode method
NASA Astrophysics Data System (ADS)
Grubov, Vadim V.; Runnova, Anastasiya E.; Khramova, Marina V.
2017-03-01
In this paper we propose a new method for dealing with noise and physiological artifacts in experimental human EEG recordings. The method is based on analysis of EEG signals with empirical mode decomposition (the Hilbert-Huang transform). We consider noise and physiological artifacts on the EEG as specific oscillatory patterns that cause problems during EEG analysis and that can be detected with the help of additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). The algorithm of the method takes the following steps: empirical mode decomposition of the EEG signal, identification of the empirical modes containing artifacts, removal of those modes, and reconstruction of the initial EEG signal. We test the method by filtering experimental human EEG signals from eye-movement artifacts and show its high efficiency.
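A sketch of the four-step algorithm is given below, assuming the third-party PyEMD package for the decomposition and a simple correlation test against the auxiliary channel; the correlation threshold is an illustrative assumption.

```python
import numpy as np
from PyEMD import EMD   # third-party package, assumed installed

def remove_artifact_modes(eeg, reference, threshold=0.3):
    """Decompose one EEG channel into intrinsic mode functions (IMFs),
    drop the IMFs that correlate with a simultaneously recorded artifact
    channel (e.g. EOG), and reconstruct the cleaned signal."""
    imfs = EMD().emd(eeg)
    keep = [imf for imf in imfs
            if abs(np.corrcoef(imf, reference)[0, 1]) < threshold]
    return np.sum(keep, axis=0) if keep else np.zeros_like(eeg)

fs = 250
t = np.arange(10 * fs) / fs
eog = np.sign(np.sin(2 * np.pi * 0.3 * t))   # crude blink-like reference signal
eeg = np.sin(2 * np.pi * 10 * t) + 2.0 * eog + 0.1 * np.random.randn(t.size)
clean = remove_artifact_modes(eeg, eog)       # alpha rhythm survives, blinks do not
```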
ERIC Educational Resources Information Center
van der Molen, Hugo H.
1984-01-01
Describes a study designed to demonstrate that child pedestrian training objectives may be identified systematically through various task analysis methods, making use of different types of empirical information. Early approaches to analysis of pedestrian tasks are reviewed, and an outline of the Traffic Research Centre's pedestrian task analysis…
Parricide: An Empirical Analysis of 24 Years of U.S. Data
ERIC Educational Resources Information Center
Heide, Kathleen M.; Petee, Thomas A.
2007-01-01
Empirical analysis of homicides in which children have killed parents has been limited. The most comprehensive statistical analysis involving parents as victims was undertaken by Heide and used Supplementary Homicide Report (SHR) data for the 10-year period 1977 to 1986. This article provides an updated examination of characteristics of victims,…
Development of Alabama traffic factors for use in mechanistic-empirical pavement design.
DOT National Transportation Integrated Search
2015-02-01
The pavement engineering community is moving toward design practices that use mechanistic-empirical (M-E) approaches to the design and analysis of pavement structures. This effort is embodied in the Mechanistic-Empirical Pavement Design Guide (MEPD...
The role of empirical Bayes methodology as a leading principle in modern medical statistics.
van Houwelingen, Hans C
2014-11-01
This paper reviews and discusses the role of Empirical Bayes methodology in medical statistics in the last 50 years. It gives some background on the origin of the empirical Bayes approach and its link with the famous Stein estimator. The paper describes the application in four important areas in medical statistics: disease mapping, health care monitoring, meta-analysis, and multiple testing. It ends with a warning that the application of the outcome of an empirical Bayes analysis to the individual "subjects" is a delicate matter that should be handled with prudence and care. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
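The link to the Stein estimator can be made concrete with a minimal sketch (simulated data, shrinking toward zero); the positive-part variant shown is standard, and all numbers are illustrative.

```python
import numpy as np

def james_stein(x, sigma2=1.0):
    """Positive-part James-Stein shrinkage of p >= 3 independent normal
    means toward zero: the canonical example behind empirical Bayes."""
    p = x.size
    shrink = max(0.0, 1.0 - (p - 2) * sigma2 / np.sum(x ** 2))
    return shrink * x

rng = np.random.default_rng(5)
theta = rng.normal(0, 1, 50)           # true means
x = theta + rng.normal(0, 1, 50)       # one noisy observation per mean
mle_loss = np.sum((x - theta) ** 2)
js_loss = np.sum((james_stein(x) - theta) ** 2)
print(f"MLE loss {mle_loss:.1f} vs James-Stein loss {js_loss:.1f}")
```

The closing warning of the paper applies directly here: the shrunken values estimate the ensemble well, but reading any single shrunken value as the truth about that individual "subject" requires care.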
NASA Astrophysics Data System (ADS)
Varotsos, C. A.; Efstathiou, M. N.
2018-03-01
In this paper we investigate the evolution of the energy emitted by CO2 and NO from the Earth's thermosphere on a global scale, using both observational and empirically derived data. We first analyze the daily power observations of CO2 and NO received from the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) equipment on the NASA Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite for the entire period 2002-2016. We then perform the same analysis on the empirical daily power emitted by CO2 and NO that was derived recently from the infrared energy budget of the thermosphere during 1947-2016. The tool used for the analysis of the observational and empirical datasets is detrended fluctuation analysis, applied in order to investigate whether the power emitted by CO2 and by NO from the thermosphere exhibits power-law behavior. The results obtained from both observational and empirical data do not support power-law behavior. This conclusion reveals that the empirically derived data are characterized by the same intrinsic properties as the observational ones, thus supporting their reliability.
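A minimal sketch of detrended fluctuation analysis is given below (first-order detrending; the window sizes and the white-noise test series are illustrative). A straight line of log F against log s indicates power-law scaling; curvature or instability of the slope argues against it, as reported here.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: root-mean-square fluctuation F(s)
    of the integrated, per-window linearly detrended profile, for each
    window size s."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

x = np.random.randn(4096)                    # white noise: expect alpha ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
print(f"estimated scaling exponent alpha = {alpha:.2f}")
```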
Frempong, Samuel N; Sutton, Andrew J; Davenport, Clare; Barton, Pelham
2018-02-01
There is little specific guidance on the implementation of cost-effectiveness modelling at the early stage of test development. The aim of this study was to review the literature in this field to examine the methodologies and tools that have been employed to date. Areas Covered: A systematic review to identify relevant studies in established literature databases. Five studies were identified and included for narrative synthesis. These studies revealed that there is no consistent approach in this growing field. The perspective of patients and the potential for value of information (VOI) to provide information on the value of future research is often overlooked. Test accuracy is an essential consideration, with most studies having described and included all possible test results in their analysis, and conducted extensive sensitivity analyses on important parameters. Headroom analysis was considered in some instances but at the early development stage (not the concept stage). Expert commentary: The techniques available to modellers that can demonstrate the value of conducting further research and product development (i.e. VOI analysis, headroom analysis) should be better utilized. There is the need for concerted efforts to develop rigorous methodology in this growing field to maximize the value and quality of such analysis.
NASA Astrophysics Data System (ADS)
Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos
2014-05-01
The context of Non-Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory, and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent, so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction, with consistently determined hypocentres and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP, and that this relationship has attributes of universality, as it holds for a broad range of spatial, temporal and magnitude scales. Provided that the multivariate empirical frequency distributions are based on a sufficient number of observations as an empirical lower limit, the results are stable and consistent with established knowledge, irrespective of the magnitude and spatio-temporal range of the earthquake catalogue, or of operations pertaining to re-sampling, bootstrapping or re-arrangement of the catalogue. It is also demonstrated that the expression of the regional active tectonic grain may comprise a mixture of processes significantly dependent on Δd. The analysis of the size (energy) distribution of earthquakes yielded results consistent with a correlated sub-extensive system; the results are also consistent with conventional determinations of Frequency-Magnitude distributions. The analysis of interevent times determined the existence of sub-extensivity and near-field interaction (correlation) in the complete catalogue of Greek and western Turkish seismicity (mixed background earthquake activity and aftershock processes), as well as in the pure background process (declustered catalogue). This could be attributed to the joint effect of near-field interaction between neighbouring earthquakes or seismic areas and interaction within aftershock sequences. The background process appears to be moderately to weakly correlated in the far field. Formal random temporal processes have not been detected.
A general conclusion supported by the above observations is that aftershock sequences may be an integral part of the seismogenetic process, as they appear to partake in long-range interaction. A formal explanation of this effect is pending, but it may involve delayed remote triggering of seismic activity by (transient or static) stress transfer from the main shocks and large aftershocks, and/or the cascading effects already discussed by Marsan and Lengliné (2008). In this view, the effect weakens when aftershocks are removed because aftershocks are the link between the main shocks and their remote offshoots. Overall, the above results compare well with those for Northern California, which have shown that the expression of seismicity there is generally consistent with non-extensive (sub-extensive) thermodynamics. Acknowledgments: This work was supported by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project "Integrated understanding of Seismicity, using innovative methodologies of Fracture Mechanics along with Earthquake and Non-Extensive Statistical Physics - Application to the geodynamic system of the Hellenic Arc - SEISMO FEAR HELLARC". References: Tzanis, A., Vallianatos, F. and Efstathiou, A., 2013. Multidimensional earthquake frequency distributions consistent with Non-Extensive Statistical Physics: the interdependence of magnitude, interevent time and interevent distance in North California. Bulletin of the Geological Society of Greece, XLVII, Proceedings of the 13th International Congress, Chania, September 2013. Tzanis, A., Vallianatos, F. and Efstathiou, A., 2013. Generalized multidimensional earthquake frequency distributions consistent with Non-Extensive Statistical Physics: an appraisal of the universality in the interdependence of magnitude, interevent time and interevent distance. Geophysical Research Abstracts, 15, EGU2013-628, EGU General Assembly 2013. Marsan, D. and Lengliné, O., 2008. Extending earthquakes' reach through cascading. Science, 319, 1076; doi:10.1126/science.1148783. On-line Bulletin, http://www.isc.ac.uk, International Seismological Centre, Thatcham, United Kingdom, 2011.
The Ethics of Human Life Extension: The Second Argument from Evolution.
Gyngell, Chris
2015-12-01
One argument that is sometimes made against pursuing radical forms of human life extension is that such interventions will make the species less evolvable, which would be morally undesirable. In this article, I discuss the empirical and evaluative claims of this argument. I argue that radical increases in life expectancy could, in principle, reduce the evolutionary potential of human populations through both biological and cultural mechanisms. I further argue that if life extension did reduce the evolvability of the species, this will be undesirable for three reasons: (1) it may increase the species' susceptibility to extinction risks, (2) it may adversely affect institutions and practices that promote well-being, and (3) it may impede moral progress. © The Author 2015. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
An Empirical Bayes Approach to Mantel-Haenszel DIF Analysis.
ERIC Educational Resources Information Center
Zwick, Rebecca; Thayer, Dorothy T.; Lewis, Charles
1999-01-01
Developed an empirical Bayes enhancement to Mantel-Haenszel (MH) analysis of differential item functioning (DIF) in which it is assumed that the MH statistics are normally distributed and that the prior distribution of underlying DIF parameters is also normal. (Author/SLD)
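A minimal sketch of the normal-normal empirical Bayes shrinkage idea, applied to simulated MH DIF statistics, is given below; the method-of-moments prior estimate and all numbers are illustrative, not the authors' exact procedure.

```python
import numpy as np

def eb_shrink(mh_stats, se):
    """Empirical Bayes (normal-normal) shrinkage of Mantel-Haenszel DIF
    statistics: estimate the prior mean and variance from the observed
    statistics, then return the posterior mean for each item."""
    mu = np.mean(mh_stats)
    # method-of-moments prior variance: observed variance minus noise
    tau2 = max(np.var(mh_stats) - np.mean(se ** 2), 0.0)
    w = tau2 / (tau2 + se ** 2)       # weight on the observed statistic
    return w * mh_stats + (1 - w) * mu

rng = np.random.default_rng(6)
true_dif = rng.normal(0, 0.5, 40)             # per-item DIF parameters
se = np.full(40, 0.4)                         # sampling SE of each MH statistic
observed = true_dif + rng.normal(0, se)
print(np.round(eb_shrink(observed, se)[:5], 2))   # pulled toward the common mean
```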
ERIC Educational Resources Information Center
Downey, James P.; Kher, Hemant V.
2015-01-01
Technology training in the classroom is critical in preparing students for upper level classes as well as professional careers, especially in fields such as technology. One of the key enablers to this process is computer self-efficacy (CSE), which has an extensive stream of empirical research. Despite this, one of the missing pieces is how CSE…
ERIC Educational Resources Information Center
Crisp, Nicola Elinor
2013-01-01
While some African American students perform as well as or better than their White peers on standardized tests, African Americans as a group attain lower scores on standardized tests than their White peers. This phenomenon has been addressed extensively in educational research. However, not much empirical research has been conducted to investigate…
ERIC Educational Resources Information Center
Mukala, Patrick; Cerone, Antonio; Turini, Franco
2017-01-01
Free/Libre Open Source Software (FLOSS) environments are increasingly dubbed learning environments where practical software engineering skills can be acquired. Numerous studies have extensively investigated how knowledge is acquired in these environments through a collaborative learning model that defines a learning process. Such a learning…
ERIC Educational Resources Information Center
Kampourakis, Kostas
2016-01-01
Teaching about nature of science (NOS) is considered as an important goal of science education in various countries. Extensive empirical research about how some aspects of NOS can be effectively taught is also available. The most widely adopted conceptualization of NOS is based on a small number of general aspects of NOS, which fall into two…
ERIC Educational Resources Information Center
Guth, Jessica
2008-01-01
This paper, based on extensive empirical work with Polish and Bulgarian scientists in Germany and the UK, examines the impact of the EU enlargement including the free movement of persons provisions on the mobility of scientists from Eastern to Western Europe. It focuses on early career researchers and particularly PhD candidates and begins by…
ERIC Educational Resources Information Center
Ghaffarzadegan, Navid; Stewart, Thomas R.
2011-01-01
Elwin, Juslin, Olsson, and Enkvist (2007) and Henriksson, Elwin, and Juslin (2010) offered the constructivist coding hypothesis to describe how people code the outcomes of their decisions when availability of feedback is conditional on the decision. They provided empirical evidence only for the 0.5 base rate condition. This commentary argues that…
ERIC Educational Resources Information Center
De Grip, Andries; Sauermann, Jan
2013-01-01
Although the transfer of on-the-job training to the workplace belongs to the realm of educational research, it is also highly related to labour economics. In the economic literature, the transfer of training is based on the theoretical framework of human capital theory and has been extensively analysed empirically in econometric studies that take…
Li, Yang; Fu, Hua; Zhao, Fang; Luo, Jianfeng; Kawachi, Ichiro
2013-09-01
The effect of individual educational attainment on health has been extensively documented in western countries, whereas empirical evidence of education spillover effects in marital dyads is scarce and inconsistent. A total of 2764 individuals (or 1382 marital dyads) were surveyed in the Shanghai Healthy City Project 2008. Logistic regression models were used for analysis, and all analyses were stratified by gender. Significant protective associations were observed in univariate models linking general health status to the individual's own educational attainment and to their partner's educational level. After controlling for presence of chronic conditions, lifestyle factors, and social support, these associations were attenuated. The authors found a gender difference in the association of spouse's educational attainment with self-rated health. The influence of education on health may be partly mediated by lifestyle and other factors.
Local Linear Regression for Data with AR Errors.
Li, Runze; Li, Yan
2009-07-01
In many statistical applications, data are collected over time and are likely correlated. In this paper, we investigate how to incorporate the correlation information into local linear regression. Under the assumption that the error process is an autoregressive process, a new estimation procedure is proposed for nonparametric regression by using the local linear regression method and profile least squares techniques. We further propose the SCAD-penalized profile least squares method to determine the order of the autoregressive process. Extensive Monte Carlo simulation studies are conducted to examine the finite-sample performance of the proposed procedure and to compare it with the existing one. Our empirical studies show that the newly proposed procedures can dramatically improve the accuracy of naive local linear regression with a working-independence error structure. We illustrate the proposed methodology by an analysis of a real data set.
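For reference, a minimal sketch of the naive local linear estimator that the paper improves upon is given below (Gaussian kernel, working-independence); the AR(1) error simulation and bandwidth are illustrative.

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Local linear regression estimate of m(x0) with a Gaussian kernel
    of bandwidth h, ignoring the AR error structure that the paper's
    profile least squares method would exploit."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    # weighted least squares; the intercept is the fitted value at x0
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]

rng = np.random.default_rng(7)
x = np.sort(rng.random(300))
e = np.zeros(300)
for t in range(1, 300):                     # AR(1) errors with rho = 0.5
    e[t] = 0.5 * e[t - 1] + 0.1 * rng.normal()
y = np.sin(2 * np.pi * x) + e
print([round(local_linear(x, y, g, h=0.08), 2)
       for g in np.linspace(0.05, 0.95, 5)])
```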
A Multilevel Multiset Time-Series Model for Describing Complex Developmental Processes
Ma, Xin; Shen, Jianping
2017-01-01
The authors sought to develop an analytical platform where multiple sets of time series can be examined simultaneously. This multivariate platform capable of testing interaction effects among multiple sets of time series can be very useful in empirical research. The authors demonstrated that the multilevel framework can readily accommodate this analytical capacity. Given their intention to use the multilevel multiset time-series model to pursue complicated research purposes, their resulting model is relatively simple to specify, to run, and to interpret. These advantages make the adoption of their model relatively effortless as long as researchers have the basic knowledge and skills in working with multilevel growth modeling. With multiple potential extensions of their model, the establishment of this analytical platform for analysis of multiple sets of time series can inspire researchers to pursue far more advanced research designs to address complex developmental processes in reality. PMID:29881094
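A hedged sketch of the general idea, using the mixed-effects API in statsmodels as a stand-in: the variable names, the simulated data, and the simple random-intercept specification with a cross-set interaction are assumptions for illustration, not the authors' exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated repeated measures: outcome y over time for each subject,
# plus a second time-varying series z whose interaction with time is
# of interest (two "sets" of time series examined simultaneously).
rng = np.random.default_rng(8)
rows = []
for subj in range(100):
    u = rng.normal(0, 1)                     # subject-level random intercept
    for t in range(6):
        z = rng.normal(0, 1)
        y = 1.0 + 0.5 * t + 0.3 * z + 0.1 * t * z + u + rng.normal(0, 1)
        rows.append((subj, t, z, y))
df = pd.DataFrame(rows, columns=["subj", "time", "z", "y"])

# Multilevel growth model: occasions nested within subjects, with the
# time:z term testing the interaction between the two series
model = smf.mixedlm("y ~ time * z", df, groups=df["subj"])
print(model.fit().summary())
```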
Oberg, Tomas
2004-01-01
Halogenated aliphatic compounds have many technical uses, but substances within this group are also ubiquitous environmental pollutants that can affect the ozone layer and contribute to global warming. The establishment of quantitative structure-property relationships is of interest not only to fill in gaps in the available database but also to validate experimental data already acquired. The three-dimensional structures of 240 compounds were modeled with molecular mechanics prior to the generation of empirical descriptors. Two bilinear projection methods, principal component analysis (PCA) and partial-least-squares regression (PLSR), were used to identify outliers. PLSR was subsequently used to build a multivariate calibration model by extracting the latent variables that describe most of the covariation between the molecular structure and the boiling point. Boiling points were also estimated with an extension of the group contribution method of Stein and Brown.
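A minimal sketch of the PLSR calibration step with scikit-learn follows; the stand-in descriptor matrix, the synthetic boiling points, and the choice of four latent variables are all assumptions for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in data: rows are compounds, columns are empirical
# molecular descriptors; y is the normal boiling point in kelvin.
rng = np.random.default_rng(9)
X = rng.normal(size=(240, 30))
y = X[:, :5] @ np.array([20.0, 15.0, 10.0, 5.0, 2.0]) + 400 \
    + rng.normal(0, 5, 240)

# PLSR extracts latent variables that capture the covariation between
# descriptors and boiling point, as described in the abstract
pls = PLSRegression(n_components=4)
r2 = cross_val_score(pls, X, y, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", np.round(r2, 2))
```

Cross-validated fit statistics of this kind, together with PCA score plots, are also what flag outlying compounds whose experimental values deserve re-examination.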
DQE analysis for CCD imaging arrays
NASA Astrophysics Data System (ADS)
Shaw, Rodney
1997-05-01
By consideration of the statistical interaction between exposure quanta and the mechanisms of image detection, the signal-to-noise limitations of a variety of image acquisition technologies are now well understood. However, in spite of the growing fields of application for CCD imaging arrays and the obvious advantages of their multi-level mode of quantum detection, only limited and largely empirical approaches have been made to quantify these advantages on an absolute basis. Here an extension is made of a previous model for noise-free sequential photon-counting to the more general case involving both count-noise and arbitrary separation functions between count levels. This allows a basic model to be developed for the DQE associated with devices which approximate to the CCD mode of operation, and conclusions to be made concerning the roles of the separation function and count-noise in defining the departure from the ideal photon counter.
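For reference, the detective quantum efficiency compares output to input signal-to-noise ratios (the standard definition, not specific to this paper), with DQE = 1 for an ideal photon counter:

$$\mathrm{DQE} = \frac{\mathrm{SNR}_{\mathrm{out}}^{2}}{\mathrm{SNR}_{\mathrm{in}}^{2}} \le 1.$$

The separation function and count-noise discussed above determine how far a real CCD-like detector falls below this ideal limit.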
Norlander, Torsten; Nordén, Tommy
2015-01-01
The aim of the present article was to discuss the commentary by van Veldhuizen, Delespaul and Mulder (2015) regarding the review by Nordén and Norlander (2014), which was based on five empirical articles about Flexible Assertive Community Treatment (FACT). Van Veldhuizen et al. agree that there is insufficient evidence for the effectiveness of FACT. However, they avoid a discussion of the lack of positive results despite extensive research over several years, and an analysis of why FACT did not fare better is therefore missing. According to FACT it is an advantage that one single team spans the entire chain of care and rehabilitation, but no evidence is given for this opinion. Instead, there may be difficulties for the staff in shifting between psychiatric care and psychiatric rehabilitation, and clients may not want to encounter the same professional team during all phases of care and rehabilitation.
Crash and rebound of indigenous populations in lowland South America
NASA Astrophysics Data System (ADS)
Hamilton, Marcus J.; Walker, Robert S.; Kesler, Dylan C.
2014-04-01
Lowland South America has long been a battle-ground between European colonization and indigenous survival. Initial waves of European colonization brought disease epidemics, slavery, and violence that had catastrophic impacts on indigenous cultures. In this paper we focus on the demography of 238 surviving populations in Brazil. We use longitudinal censuses from all known indigenous Brazilian societies to quantify three demographic metrics: 1) effects of European contact on indigenous populations; 2) empirical estimates of minimum viable population sizes; and 3) estimates of post-contact population growth rates. We use this information to conduct population viability analysis (PVA). Our results show that all surviving populations suffered extensive mortality during, and shortly after, contact. However, most surviving populations exhibit positive growth rates within the first decade post-contact. Our findings paint a positive demographic outlook for these indigenous populations, though long-term survival remains subject to powerful externalities, including politics, economics, and the pervasive illegal exploitation of indigenous lands.
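A toy population viability analysis in the spirit described, stochastic exponential growth with environmental variation, is sketched below; the growth rate, its variance, the quasi-extinction floor, and the horizon are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(10)

def extinction_probability(n0, r_mean, r_sd, years=100, runs=5000, floor=2):
    """Monte Carlo PVA: a population is counted as extinct once it
    falls below a quasi-extinction floor within the time horizon."""
    extinct = 0
    for _ in range(runs):
        n = float(n0)
        for _ in range(years):
            n *= np.exp(rng.normal(r_mean, r_sd))   # stochastic growth year
            if n < floor:
                extinct += 1
                break
    return extinct / runs

# hypothetical post-contact parameters: small positive mean growth,
# substantial environmental variance
for n0 in (10, 50, 250):
    print(n0, extinction_probability(n0, r_mean=0.02, r_sd=0.15))
```

Runs of this kind, across a range of starting sizes, are what yield empirical minimum viable population estimates.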
Anchoring effect on first passage process in Taiwan financial market
NASA Astrophysics Data System (ADS)
Liu, Hsing; Liao, Chi-Yo; Ko, Jing-Yuan; Lih, Jiann-Shing
2017-07-01
Empirical analysis of the price fluctuations of financial markets has received extensive attention, because a substantial amount of financial market data has been collected and because of advances in data-mining techniques. Price fluctuation trends can help investors to make informed trading decisions, but such decisions may also be affected by a psychological factor: the anchoring effect. This study explores the intraday price time series of Taiwan futures and applies a diffusion model and quantitative methods to analyze the relationship between the anchoring effect and price fluctuations during the first passage process. Our results indicate that power-law scaling and anomalous diffusion in stock price fluctuations are related to the anchoring effect. Moreover, microscopic price fluctuations before the switching point in the first passage process correspond with long-term price fluctuations of Taiwan's stock market. We find that microscopic trends could provide useful information for understanding macroscopic trends in stock markets.
Climate Change: Modeling the Human Response
NASA Astrophysics Data System (ADS)
Oppenheimer, M.; Hsiang, S. M.; Kopp, R. E.
2012-12-01
Integrated assessment models have historically relied on forward modeling including, where possible, process-based representations to project climate change impacts. Some recent impact studies incorporate the effects of human responses to initial physical impacts, such as adaptation in agricultural systems, migration in response to drought, and climate-related changes in worker productivity. Sometimes the human response ameliorates the initial physical impacts, sometimes it aggravates it, and sometimes it displaces it onto others. In these arenas, understanding of underlying socioeconomic mechanisms is extremely limited. Consequently, for some sectors where sufficient data has accumulated, empirically based statistical models of human responses to past climate variability and change have been used to infer response sensitivities which may apply under certain conditions to future impacts, allowing a broad extension of integrated assessment into the realm of human adaptation. We discuss the insights gained from and limitations of such modeling for benefit-cost analysis of climate change.
2015-01-01
The recent availability of high-frequency data has permitted more efficient ways of computing volatility. However, estimation of volatility from asset price observations is challenging because observed high-frequency data are generally affected by microstructure noise. We address this issue by using the Fourier estimator of instantaneous volatility introduced in Malliavin and Mancino (2002). We prove a central limit theorem for this estimator with optimal rate and asymptotic variance. An extensive simulation study shows the accuracy of the spot volatility estimates obtained using the Fourier estimator and its robustness even in the presence of different microstructure noise specifications. An empirical analysis on high-frequency data (U.S. S&P 500 and FIB 30 indices) illustrates how the Fourier spot volatility estimates can be successfully used to study intraday variations of volatility and to predict intraday Value at Risk. PMID:26421617
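A minimal sketch of the integrated-variance (zeroth Fourier coefficient) version of the Malliavin-Mancino estimator follows; the full spot-volatility reconstruction adds a Fejér-weighted resummation over frequencies that is not shown here, and the simulated path is a placeholder.

```python
import numpy as np

def fourier_integrated_variance(log_prices, N):
    """Malliavin-Mancino Fourier estimator of integrated variance:
    Fourier coefficients of the log-price increments on [0, 2*pi],
    combined up to cutoff N."""
    n = len(log_prices) - 1
    t = np.linspace(0.0, 2 * np.pi, n + 1)[:-1]   # left endpoints
    dp = np.diff(log_prices)
    ks = np.arange(-N, N + 1)
    c = np.array([(np.exp(-1j * k * t) * dp).sum() / (2 * np.pi) for k in ks])
    return (4 * np.pi ** 2 / (2 * N + 1)) * np.sum(np.abs(c) ** 2)

rng = np.random.default_rng(11)
n, sigma = 23_400, 0.2                    # one day of 1-second returns
dW = rng.normal(0, np.sqrt(1 / n), n)
p = np.insert(np.cumsum(sigma * dW), 0, 0.0)
print("true IV:", sigma ** 2,
      "estimate:", round(fourier_integrated_variance(p, N=200), 4))
```

Keeping the cutoff N well below the Nyquist frequency is what gives the estimator its robustness to microstructure noise, which concentrates at the highest frequencies.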
A Critical Analysis and Applied Intersectionality Framework with Intercultural Queer Couples.
Chan, Christian D; Erby, Adrienne N
2018-01-01
Intercultural queer couples are growing at an extensive rate in the United States, exemplifying diversity across multiple dimensions (e.g., race, ethnicity, sexuality, affectional identity, gender identity) while experiencing multiple converging forms of oppression (e.g., racism, heterosexism, genderism). Given the dearth of conceptual and empirical literature that unifies both dimensions related to intercultural and queer, applied practices and research contend with a unilateral approach focusing exclusively on either intercultural or queer couples. Intersectionality theory has revolutionized critical scholarship to determine overlapping forms of oppression, decenter hegemonic structures of power relations and social contexts, and enact a social justice agenda. This article addresses the following aims: (1) an overview of the gaps eliciting unilateral approaches to intercultural queer couples; (2) an illustration of intersectionality's theoretical underpinnings as a critical approach; and (3) applications for insights in practices and research with intercultural queer couples.
Excise tax avoidance: the case of state cigarette taxes.
DeCicca, Philip; Kenkel, Donald; Liu, Feng
2013-12-01
We conduct an applied welfare economics analysis of cigarette tax avoidance. We develop an extension of the standard formula for the optimal Pigouvian corrective tax to incorporate the possibility that consumers avoid the tax by making purchases in nearby lower tax jurisdictions. To provide a key parameter for our formula, we estimate a structural endogenous switching regression model of border-crossing and cigarette prices. In illustrative calculations, we find that for many states, after taking into account tax avoidance the optimal tax is at least 20% smaller than the standard Pigouvian tax that simply internalizes external costs. Our empirical estimate that tax avoidance strongly responds to the price differential is the main reason for this result. We also use our results to examine the benefits of replacing avoidable state excise taxes with a harder-to-avoid federal excise tax on cigarettes. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Tinio, Pablo P. L.
2017-07-01
The Vienna Integrated Model of Art Perception (VIMAP; [5]) is the most comprehensive model of the art experience to date. The model incorporates bottom-up and top-down cognitive processes and accounts for different outcomes of the art experience, such as aesthetic evaluations, emotions, and physiological and neurological responses to art. In introducing the model, Pelowski et al. also offer hypotheses that are amenable to empirical testing. These features make the VIMAP an ambitious model that attempts to explain how meaningful, complex, and profound aspects of the art experience come about, which is a significant extension of previous models of the art experience (e.g., [1-3,10]) and gives the VIMAP good explanatory power.
Le Pichon, Céline; Tales, Évelyne; Belliard, Jérôme; Torgersen, Christian E.
2017-01-01
Spatially intensive sampling by electrofishing is proposed as a method for quantifying spatial variation in fish assemblages at multiple scales along extensive stream sections in headwater catchments. We used this method to sample fish species at 10-m² points spaced every 20 m throughout 5 km of a headwater stream in France. The spatially intensive sampling design provided information at a spatial resolution and extent that enabled exploration of spatial heterogeneity in fish assemblage structure and aquatic habitat at multiple scales with empirical variograms and wavelet analysis. These analyses were effective for detecting scales of periodicity, trends, and discontinuities in the distribution of species in relation to tributary junctions and obstacles to fish movement. This approach to sampling riverine fishes may be useful in fisheries research and management for evaluating stream fish responses to natural and altered habitats and for identifying sites for potential restoration.
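As a concrete illustration of the variogram side of such an analysis, the sketch below computes an empirical semivariogram for abundance counts on the 20-m sampling lattice described here; the fish counts are synthetic placeholders.

```python
import numpy as np

def empirical_variogram(x, z, lags, tol):
    """Empirical semivariogram of abundance z at stream positions x:
    gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs separated by about h."""
    d = np.abs(x[:, None] - x[None, :])          # pairwise distances (m)
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2    # half squared differences
    gamma = np.empty(len(lags))
    for i, h in enumerate(lags):
        mask = (d > h - tol) & (d <= h + tol)
        gamma[i] = sq[mask].mean() if mask.any() else np.nan
    return gamma

# Points every 20 m along 5 km of stream, as in the sampling design
x = np.arange(0, 5000, 20.0)
z = np.random.poisson(3.0, size=x.size)          # placeholder fish counts
gamma = empirical_variogram(x, z, lags=np.arange(20, 1000, 20.0), tol=10.0)
```

A rising then flattening gamma(h) indicates the distance at which assemblage structure decorrelates; periodic bumps point to the kinds of periodicity the study detects near tributary junctions.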
The halo current in ASDEX Upgrade
NASA Astrophysics Data System (ADS)
Pautasso, G.; Giannone, L.; Gruber, O.; Herrmann, A.; Maraschek, M.; Schuhbeck, K. H.; ASDEX Upgrade Team
2011-04-01
Due to the complexity of the phenomena involved, a self-consistent physical model for the prediction of the halo current is not available. Therefore the ITER specifications of the spatial distribution and evolution of the halo current rely on empirical assumptions. This paper presents the results of an extensive analysis of the halo current measured in ASDEX Upgrade, with particular emphasis on the evolution of the halo region, on the magnitude and time history of the halo current, and on the structure and duration of its toroidal and poloidal asymmetries. The effective length of the poloidal path of the halo current in the vessel is found to be rather insensitive to plasma parameters. Large values of the toroidally averaged halo current are observed in both vertical displacement events and centred disruptions but last only a small fraction of the current quench; they typically coincide with a large but short-lived MHD event.
Snow and Ice Mask for the MODIS Aerosol Products
NASA Technical Reports Server (NTRS)
Li, Rong-Rong; Remer, Lorraine; Kaufman, Yoram J.; Mattoo, Shana; Gao, Bo-Cai; Vermote, Eric
2005-01-01
The atmospheric products have been derived operationally from multichannel imaging data collected with the Moderate Resolution Imaging Spectroradiometers (MODIS) on board the NASA Terra and Aqua spacecraft. Preliminary validations of the products were previously reported. Through analysis of a more extensive time series of MODIS aerosol products (Collection 4), we have found that the aerosol products over land areas are slightly contaminated by snow and ice during the springtime snow-melting season. We have developed an empirical technique using MODIS near-IR channels centered near 0.86 and 1.24 µm and a thermal emission channel near 11 µm to mask out these snow-contaminated pixels over land. Improved aerosol retrievals over land have been obtained. Sample results from application of the technique to MODIS data acquired over North America, northern Europe, and northeastern Asia are presented. The technique has been implemented into the MODIS Collection 5 operational algorithm for retrieving aerosols over land from MODIS data.
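A minimal sketch of this kind of band-ratio test is given below. The abstract specifies the channels (0.86, 1.24 and 11 µm) but not the operational thresholds, so the threshold values and the function name here are illustrative assumptions, not the Collection 5 algorithm.

```python
import numpy as np

def snow_ice_mask(r086, r124, bt11, ndsi_thresh=0.1, bt_thresh=285.0,
                  refl_thresh=0.3):
    """Snow/ice test in the spirit of the abstract: snow is much darker at
    1.24 um than at 0.86 um and cold in the 11 um channel. All thresholds
    here are illustrative placeholders, not the operational MODIS values."""
    ratio = (r086 - r124) / (r086 + r124)   # large over snow/ice
    return (ratio > ndsi_thresh) & (bt11 < bt_thresh) & (r086 > refl_thresh)
```

Applied pixel-wise to the reflectance and brightness-temperature fields, the boolean mask flags snow-contaminated land pixels before the aerosol retrieval.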
Solidarity in contemporary bioethics--towards a new approach.
Prainsack, Barbara; Buyx, Alena
2012-09-01
This paper, which is based on an extensive analysis of the literature, gives a brief overview of the main ways in which solidarity has been employed in bioethical writings in the last two decades. As the vagueness of the term has been one of the main targets of critique, we propose a new approach to defining solidarity, identifying it primarily as a practice enacted at the interpersonal, communal, and contractual/legal levels. Our three-tier model of solidarity can also help to explain the way in which crises of solidarity can occur, notably when formal solidaristic arrangements continue to exist despite 'lower tiers' of solidarity practices at inter-personal and communal levels having 'broken away'. We hope that this contribution to the growing debate on the potential for the value of solidarity to help tackle issues in bioethics and beyond will stimulate further discussion involving both conceptual and empirically informed perspectives. © 2012 Blackwell Publishing Ltd.
ERIC Educational Resources Information Center
Lee, Barbara A.
1990-01-01
Questions assumptions by Schoenfeld and Zirkel in a study reviewing gender discrimination cases against institutions of higher education. Critiques the methodology used in that study, cautions about the overall utility of "outcomes analysis," and reports more promising routes of empirical legal research. (15 references) (MLF)
Exploring Advertising in Higher Education: An Empirical Analysis in North America, Europe, and Japan
ERIC Educational Resources Information Center
Papadimitriou, Antigoni; Blanco Ramírez, Gerardo
2015-01-01
This empirical study explores higher education advertising campaigns displayed in five world cities: Boston, New York, Oslo, Tokyo, and Toronto. The study follows a mixed-methods research design relying on content analysis and multimodal semiotic analysis and employs a conceptual framework based on the knowledge triangle of education, research,…
NASA Astrophysics Data System (ADS)
Nightingale, James; Wang, Qi; Grecos, Christos; Goma, Sergio
2014-02-01
High Efficiency Video Coding (HEVC), the latest video compression standard (also known as H.265), can deliver video streams of comparable quality to the current H.264 Advanced Video Coding (H.264/AVC) standard with a 50% reduction in bandwidth. Research into SHVC, the scalable extension to the HEVC standard, is still in its infancy. One important area for investigation is whether, given the greater compression ratio of HEVC (and SHVC), the loss of packets containing video content will have a greater impact on the quality of delivered video than is the case with H.264/AVC or its scalable extension H.264/SVC. In this work we empirically evaluate the layer-based, in-network adaptation of video streams encoded using SHVC in situations where dynamically changing bandwidths and datagram loss ratios require the real-time adaptation of video streams. Through extensive experimentation, we establish a comprehensive set of benchmarks for SHVC-based high-definition video streaming in loss-prone network environments such as those commonly found in mobile networks. Among other results, we highlight that packet losses of only 1% can lead to a substantial reduction in PSNR of over 3 dB and to error propagation across more than 130 pictures following the one in which the loss occurred. This is among the earliest studies in this area to report benchmark evaluation results for the effects of datagram loss on SHVC picture quality, and it offers empirical and analytical insights into SHVC adaptation to lossy, mobile networking conditions.
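For reference, the quality metric quoted above can be computed as follows; a 3 dB PSNR drop corresponds to roughly a doubling of the mean squared error between reference and degraded frames.

```python
import numpy as np

def psnr(reference, degraded, peak=255.0):
    """Peak signal-to-noise ratio (dB) between a reference frame and a
    degraded frame, both as arrays with pixel values in [0, peak]."""
    ref = np.asarray(reference, dtype=np.float64)
    deg = np.asarray(degraded, dtype=np.float64)
    mse = np.mean((ref - deg) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```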
Vajda, E G; Skedros, J G; Bloebaum, R D
1998-10-01
Backscattered electron (BSE) imaging has proven to be a useful method for analyzing the mineral distribution in microscopic regions of bone. However, an accepted method of standardization has not been developed, limiting the utility of BSE imaging for truly quantitative analysis. Previous work has suggested that BSE images can be standardized by energy-dispersive x-ray spectrometry (EDX). Unfortunately, EDX-standardized BSE images tend to underestimate the mineral content of bone when compared with traditional ash measurements. The goal of this study is to investigate the nature of the deficit between EDX-standardized BSE images and ash measurements. A series of analytical standards, ashed bone specimens, and unembedded bone specimens were investigated to determine the source of the deficit previously reported. The primary source of error was found to be inaccurate ZAF corrections to account for the organic phase of the bone matrix. Conductive coatings, methylmethacrylate embedding media, and minor elemental constituents in bone mineral introduced negligible errors. It is suggested that the errors would remain constant and an empirical correction could be used to account for the deficit. However, extensive preliminary testing of the analysis equipment is essential.
NASA Astrophysics Data System (ADS)
Sarma, Pullela K.; Srinivas, Vadapalli; Rao, Vedula Dharma; Kumar, Ayyagari Kiran
2011-12-01
The present investigation summarizes detailed experimental studies with standard lubricants of commercial quality known as Racer-4 of Hindustan Petroleum Corporation (India) dispersed with different mass concentrations of nanoparticles of Cu and TiO2. The test bench is fabricated with a four-stroke Hero-Honda motorbike hydraulically loaded at the rear wheel with proper instrumentation to record the fuel consumption, the load on the rear wheel, and the linear velocity. The whole range of data obtained on a stationary bike is subjected to regression analysis to arrive at various relationships expressing fuel consumption as a function of brake power, linear velocity, and percentage mass concentration of nanoparticles in the lubricant. The empirical relation correlates with the observed data with reasonable accuracy. Further, extension of the analysis by developing a mathematical model has revealed a definite improvement in brake thermal efficiency, which ultimately affects the fuel economy by diminishing frictional power in the system with the introduction of nanoparticles into the lubricant. The performance of the engine seems to be better with the nano Cu-Racer-4 combination than with nano TiO2.
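The regression step described above can be sketched as an ordinary least-squares fit of fuel consumption on brake power, velocity, and nanoparticle concentration. The data below are synthetic stand-ins and the linear functional form is an assumption for illustration; the study reports only that empirical relations were fitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
bp   = rng.uniform(1.0, 5.0, n)          # brake power, kW (synthetic)
vel  = rng.uniform(20.0, 60.0, n)        # linear velocity, km/h (synthetic)
conc = rng.choice([0.0, 0.25, 0.5], n)   # nanoparticle mass concentration, %
fuel = 0.2 + 0.15 * bp + 0.002 * vel - 0.05 * conc + rng.normal(0, 0.01, n)

# Least-squares fit of fuel = a + b*bp + c*vel + d*conc
X = np.column_stack([np.ones(n), bp, vel, conc])
coef, *_ = np.linalg.lstsq(X, fuel, rcond=None)
resid = fuel - X @ coef
r2 = 1 - resid.var() / fuel.var()        # goodness of fit
```

A negative fitted coefficient on the concentration term would correspond to the reported fuel-economy improvement with nanoparticle-dispersed lubricant.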
2011-01-01
The present investigation summarizes detailed experimental studies with standard lubricants of commercial quality known as Racer-4 of Hindustan Petroleum Corporation (India) dispersed with different mass concentrations of nanoparticles of Cu and TiO2. The test bench is fabricated with a four-stroke Hero-Honda motorbike hydraulically loaded at the rear wheel with proper instrumentation to record the fuel consumption, the load on the rear wheel, and the linear velocity. The whole range of data obtained on a stationary bike is subjected to regression analysis to arrive at various relationships expressing fuel consumption as a function of brake power, linear velocity, and percentage mass concentration of nanoparticles in the lubricant. The empirical relation correlates with the observed data with reasonable accuracy. Further, extension of the analysis by developing a mathematical model has revealed a definite improvement in brake thermal efficiency, which ultimately affects the fuel economy by diminishing frictional power in the system with the introduction of nanoparticles into the lubricant. The performance of the engine seems to be better with the nano Cu-Racer-4 combination than with nano TiO2. PMID:21711765
Sarma, Pullela K; Srinivas, Vadapalli; Rao, Vedula Dharma; Kumar, Ayyagari Kiran
2011-03-17
The present investigation summarizes detailed experimental studies with standard lubricants of commercial quality known as Racer-4 of Hindustan Petroleum Corporation (India) dispersed with different mass concentrations of nanoparticles of Cu and TiO2. The test bench is fabricated with a four-stroke Hero-Honda motorbike hydraulically loaded at the rear wheel with proper instrumentation to record the fuel consumption, the load on the rear wheel, and the linear velocity. The whole range of data obtained on a stationary bike is subjected to regression analysis to arrive at various relationships expressing fuel consumption as a function of brake power, linear velocity, and percentage mass concentration of nanoparticles in the lubricant. The empirical relation correlates with the observed data with reasonable accuracy. Further, extension of the analysis by developing a mathematical model has revealed a definite improvement in brake thermal efficiency, which ultimately affects the fuel economy by diminishing frictional power in the system with the introduction of nanoparticles into the lubricant. The performance of the engine seems to be better with the nano Cu-Racer-4 combination than with nano TiO2.
Determinants of linear judgment: a meta-analysis of lens model studies.
Karelaia, Natalia; Hogarth, Robin M
2008-05-01
The mathematical representation of E. Brunswik's (1952) lens model has been used extensively to study human judgment and provides a unique opportunity to conduct a meta-analysis of studies that covers roughly 5 decades. Specifically, the authors analyzed statistics of the "lens model equation" (L. R. Tucker, 1964) associated with 249 different task environments obtained from 86 articles. On average, fairly high levels of judgmental achievement were found, and people were seen to be capable of achieving similar levels of cognitive performance in noisy and predictable environments. Further, the effects of task characteristics that influence judgment (numbers and types of cues, inter-cue redundancy, function forms and cue weights in the ecology, laboratory versus field studies, and experience with the task) were identified and estimated. A detailed analysis of learning studies revealed that the most effective form of feedback was information about the task. The authors also analyzed empirically under what conditions the application of bootstrapping--or replacing judges by their linear models--is advantageous. Finally, the authors note shortcomings of the kinds of studies conducted to date, limitations in the lens model methodology, and possibilities for future research. (c) 2008 APA, all rights reserved.
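For readers unfamiliar with it, the lens model equation (Tucker, 1964) underlying these statistics is commonly written as

$$ r_a \;=\; G\,R_s R_e \;+\; C\,\sqrt{1-R_s^{2}}\,\sqrt{1-R_e^{2}} $$

where r_a is judgmental achievement (the judge-criterion correlation), R_s the consistency of the judge's linear model, R_e the linear predictability of the environment, G the correlation between the predictions of the two linear models, and C the correlation of their residuals.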
NASA Astrophysics Data System (ADS)
Cañon-Tapia, Edgardo; Mendoza-Borunda, Ramón
2014-06-01
The distribution of volcanic features is ultimately controlled by processes taking place beneath the surface of a planet. For this reason, characterization of volcano distribution at a global scale can be used to obtain insights concerning dynamic aspects of planetary interiors. Until now, studies of this type have focused on volcanic features of a specific type, or have concentrated on relatively small regions. In this paper (the first of a series of three papers) we describe the distribution of volcanic features observed over the entire surface of the Earth, combining an extensive database of submarine and subaerial volcanoes. The analysis is based on spatial density contours obtained with the Fisher kernel. Based on an empirical approach that makes no a priori assumptions concerning the number of modes that should characterize the density distribution of volcanism, we identified the most significant modes. Using those modes as a basis, the relevant distance for the formation of clusters of volcanoes is constrained to be on the order of 100 to 200 km. In addition, it is noted that the most significant modes lead to the identification of clusters that outline the most important tectonic margins on Earth without the need to make any ad hoc assumptions. Consequently, we suggest that this method has the potential to yield insights about the probable occurrence of tectonic features within other planets.
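A minimal sketch of Fisher-kernel density estimation on the sphere is given below; the concentration parameter kappa and the helper names are illustrative. The kernel's angular scale is roughly 1/sqrt(kappa) radians, so cluster scales of 100-200 km on Earth correspond to large kappa.

```python
import numpy as np

def lonlat_to_xyz(lon_deg, lat_deg):
    """Convert geographic coordinates to unit vectors on the sphere."""
    lon, lat = np.radians(lon_deg), np.radians(lat_deg)
    return np.column_stack([np.cos(lat) * np.cos(lon),
                            np.cos(lat) * np.sin(lon),
                            np.sin(lat)])

def fisher_kde(points, queries, kappa):
    """Fisher (von Mises-Fisher) kernel density on the unit sphere.
    points, queries: (n, 3) and (m, 3) arrays of unit vectors. Note that
    sinh overflows for kappa >~ 700; use log-domain arithmetic beyond that."""
    c = kappa / (4 * np.pi * np.sinh(kappa))        # vMF normalizing constant
    dots = np.clip(queries @ points.T, -1.0, 1.0)   # cosine of angular distance
    return c * np.exp(kappa * dots).mean(axis=1)
```

Contouring the resulting density over a global grid of query points is the kind of operation from which the paper's significant modes are extracted.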
Analysis of dystonic tremor in musicians using empirical mode decomposition.
Lee, A; Schoonderwaldt, E; Chadde, M; Altenmüller, E
2015-01-01
Test the hypotheses that tremor amplitude in musicians with task-specific dystonia is higher at the affected finger (dystonic tremor, DT) or the adjacent finger (tremor associated with dystonia, TAD) than (1) in matched fingers of healthy musicians and non-musicians and (2) in the unaffected and non-adjacent fingers of the affected side within patients. We measured 21 patients, 21 healthy musicians and 24 non-musicians. Participants exerted a flexion-extension movement. Instantaneous frequency and amplitude values were obtained with empirical mode decomposition and a Hilbert transform, allowing tremor amplitudes to be compared throughout the movement at various frequency ranges. We did not find a significant difference in tremor amplitude between patients and controls for either DT or TAD. Nor did tremor amplitude differ in the within-patient comparisons. Both hypotheses were rejected; apparently neither DT nor TAD occurs in musician's dystonia of the fingers. This is the first study assessing DT and TAD in musician's dystonia. Our finding suggests that even though musician's dystonia is an excellent model for maladaptive plasticity due to excessive practice, it does not seem to provide a good model for DT. Rather, it seems that musician's dystonia may manifest itself either as dystonic cramping without tremor or as task-specific tremor without overt dystonic cramping. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
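The signal-processing chain described here, empirical mode decomposition followed by a Hilbert transform, can be sketched as below, assuming the PyEMD package for the decomposition; variable names and the sampling-rate handling are illustrative, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD  # assumes the PyEMD package (pip install EMD-signal)

def instantaneous_amp_freq(signal, fs):
    """Decompose a flexion-extension trace into intrinsic mode functions,
    then extract each IMF's instantaneous amplitude and frequency via the
    Hilbert transform."""
    imfs = EMD().emd(np.asarray(signal, dtype=float))  # (n_imfs, n_samples)
    analytic = hilbert(imfs)                           # along the last axis
    amp = np.abs(analytic)                             # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic), axis=-1)
    freq = np.diff(phase, axis=-1) * fs / (2 * np.pi)  # instantaneous frequency
    return imfs, amp, freq
```

Selecting the IMFs whose instantaneous frequency falls in a tremor band then allows amplitude comparisons across fingers and groups, as in the study.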
Knight, Rod
2016-05-01
The field of population and public health ethics (PPHE) has yet to fully embrace the generation of evidence as an important project. This article reviews the philosophical debates related to the 'empirical turn' in clinical bioethics, and critically analyses how PPHE has engaged, and can engage, with the philosophical implications of generating empirical data within the task of normative inquiry. A set of five conceptual and theoretical issues pertaining to population health that are unresolved and could potentially benefit from empirical PPHE approaches to normative inquiry is discussed. Each issue differs from traditional empirical bioethical approaches in that it emphasizes (1) concerns related to the population, (2) 'upstream' policy-relevant health interventions - within and outside of the health care system - and (3) the prevention of illness and disease. Within each theoretical issue, a conceptual example from population and public health approaches to HIV prevention and health promotion is interrogated. Based on the review and critical analysis, this article concludes that empirical-normative approaches to population and public health ethics would be most usefully pursued as an iterative project (rather than a linear one), in which the normative informs the empirical questions to be asked and new empirical evidence constantly directs conceptualizations of what constitutes morally robust public health practices. Finally, a conceptualization of an empirical population and public health ethics is advanced in order to open up new interdisciplinary 'spaces', in which empirical and normative approaches to ethical inquiry are transparently (and ethically) integrated. © The Author(s) 2015.
A New Look at the Eclipse Timing Variation Diagram Analysis of Selected 3-body W UMa Systems
NASA Astrophysics Data System (ADS)
Christopoulou, P.-E.; Papageorgiou, A.
2015-07-01
The light travel time effect produced by the presence of tertiary components can reveal much about the origin and evolution of over-contact binaries. Monitoring of W UMa systems over the last decade and/or the use of publicly available photometric surveys (NSVS, ASAS, etc.) has uncovered or suggested the presence of many unseen companions, which calls for an in-depth investigation of the parameters derived from cyclic period variations in order to confirm or reject the assumption of hidden companion(s). Progress in the analysis of eclipse timing variations is summarized here from both the empirical and the theoretical points of view, and a more extensive investigation of the proposed orbital parameters of third bodies is proposed. The code we have developed for this purpose, implemented in Python, performs heuristic scanning of parameter space with parameter perturbation and establishes realistic uncertainties from the least-squares fitting. A computational example is given for TZ Boo, a W UMa system with a spectroscopically detected third component. Future options to be implemented include MCMC and bootstrapping.
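For context, the light-travel-time term usually fitted to eclipse timing residuals takes the standard (Irwin-type) form

$$ (O-C)_{\mathrm{LTTE}} \;=\; \frac{a_{12}\sin i}{c}\left[\frac{1-e^{2}}{1+e\cos\nu}\,\sin(\nu+\omega)+e\sin\omega\right] $$

where a_{12} sin i is the projected semi-major axis of the binary's orbit about the system barycentre, e the orbital eccentricity, ω the argument of periastron, ν the true anomaly, and c the speed of light. This is the generic textbook form, not necessarily the exact parameterization used in the authors' code.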
Cohen, Michael X; Gulbinaite, Rasa
2017-02-15
Steady-state evoked potentials (SSEPs) are rhythmic brain responses to rhythmic sensory stimulation, and are often used to study perceptual and attentional processes. We present a data analysis method for maximizing the signal-to-noise ratio of the narrow-band steady-state response in the frequency and time-frequency domains. The method, termed rhythmic entrainment source separation (RESS), is based on denoising source separation approaches that take advantage of the simultaneous but differential projection of neural activity onto multiple electrodes or sensors. Our approach is a combination and extension of existing multivariate source separation methods. We demonstrate that RESS performs well on both simulated and empirical data, and outperforms conventional SSEP analysis methods based on selecting electrodes with the strongest SSEP response, as well as several other linear spatial filters. We also discuss the potential confound of overfitting, whereby the filter captures noise in the absence of a signal. Matlab scripts are available to replicate and extend our simulations and methods. We conclude with some practical advice for optimizing SSEP data analyses and interpreting the results. Copyright © 2016 Elsevier Inc. All rights reserved.
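At its core, this family of source-separation methods reduces to a generalized eigendecomposition that contrasts a "signal" covariance (data narrow-band filtered at the stimulation frequency) against a "reference" covariance (e.g., from neighbouring frequencies). The sketch below shows only that step; the shrinkage level and names are illustrative, and the full RESS recipe involves additional choices (filter design, component selection).

```python
import numpy as np
from scipy.linalg import eigh

def ged_spatial_filter(cov_signal, cov_reference, shrink=0.01):
    """Spatial filter maximizing narrow-band power relative to a reference:
    solve S w = lambda R w and keep the leading generalized eigenvector."""
    n = len(cov_reference)
    # Mild shrinkage regularization keeps the reference covariance invertible
    r = cov_reference + shrink * np.mean(np.diag(cov_reference)) * np.eye(n)
    evals, evecs = eigh(cov_signal, r)   # eigenvalues in ascending order
    w = evecs[:, -1]                     # eigenvector with the largest eigenvalue
    return w / np.linalg.norm(w)

# For data of shape (channels, samples): component = w @ data
```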
Characteristics of German hospitals adopting health IT systems - results from an empirical study.
Liebe, Jan-David; Egbert, Nicole; Frey, Andreas; Hübner, Ursula
2011-01-01
Hospital characteristics that facilitate IT adoption have been described extensively in the literature, but with conflicting results. The aim of this study therefore is to draw a set of the most important variables from previous studies and include them in a combined analysis to test their contribution as single factors and their interactions. The total number of IT systems installed and the number of clinical IT systems in the hospital were used as criterion variables. Data from a national survey of German hospitals served as the basis. Based on a stepwise multiple regression analysis, four variables were found to significantly explain the degree of IT adoption (60% explained variance): 1) hospital size, 2) IT department, 3) reference customer and 4) ownership (private vs. public). Our results replicate previous findings with regard to hospital size and ownership. In addition, our study emphasizes the importance of a reliable internal structure for IT projects (the existence of an IT department) and a culture of testing and installing the most recent IT products (being a reference customer). None of the interactions between factors was significant.
HuH-7 reference genome profile: complex karyotype composed of massive loss of heterozygosity.
Kasai, Fumio; Hirayama, Noriko; Ozawa, Midori; Satoh, Motonobu; Kohara, Arihiro
2018-05-17
Human cell lines represent a valuable resource as in vitro experimental models. A hepatoma cell line, HuH-7 (JCRB0403), has been used extensively in various research fields, and a number of studies using this line have been published continuously since it was established in 1982. However, an accurate genome profile, which could serve as a reliable reference, has not been available. In this study, we performed M-FISH, SNP microarray and amplicon sequencing to characterize the cell line. Single-cell analysis of metaphases revealed a high level of heterogeneity, with a mode of 60 chromosomes. Cytogenetic results demonstrated chromosome abnormalities involving every chromosome, in addition to a massive loss of heterozygosity accounting for 55.3% of the genome, consistent with the homozygous variants seen in the sequence analysis. We provide empirical data showing that the HuH-7 cell line is composed of highly heterogeneous cell populations, suggesting that, besides cell line authentication, the quality of cell lines needs to be taken into consideration in the future use of tumor cell lines.
Analysis of Dark Current in BRITE Nanosatellite CCD Sensors
Popowicz, Adam
2018-01-01
The BRIght Target Explorer (BRITE) is a pioneering nanosatellite mission dedicated to photometric observations of the brightest stars in the sky. The BRITE charge-coupled device (CCD) sensors are poorly shielded against the extensive flux of energetic particles that constantly induce defects in the silicon lattice. In this paper we investigate the temporal evolution of dark-current generation in the BRITE CCDs over almost four years after launch. Utilizing several steps of image processing and normalization of the results, we obtained useful information about the progress of thermal activity in the sensors. The outcomes show a clear and consistent linear increase of induced damage, despite the fact that only about 0.14% of CCD pixels were probed. By analyzing the temperature dependence of the dark current, we identified the observed defects as phosphorus-vacancy (PV) pairs, which are common in proton-irradiated CCD matrices. Moreover, the Meyer-Neldel empirical rule was confirmed in our dark current data, yielding E_MN = 24.8 meV for proton-induced PV defects. PMID:29415471
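For context, the Meyer-Neldel (compensation) rule referenced here links the Arrhenius prefactor of the dark current to its activation energy:

$$ D \;=\; D_{0}\,e^{-E_a/k_B T}, \qquad \ln D_{0} \;=\; \ln D_{00} \;+\; \frac{E_a}{E_{MN}} $$

so that a single characteristic energy E_MN (24.8 meV for the proton-induced PV defects reported here) compensates the spread of activation energies across pixels; the notation D_00 is generic, not the paper's.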
Klein, Stanley B
2016-01-01
Following the seminal work of Ingvar (1985. "Memory for the future": An essay on the temporal organization of conscious awareness. Human Neurobiology, 4, 127-136), Suddendorf (1994. The discovery of the fourth dimension: Mental time travel and human evolution. Master's thesis. University of Waikato, Hamilton, New Zealand), and Tulving (1985. Memory and consciousness. Canadian Psychology/Psychologie Canadienne, 26, 1-12), exploration of the ability to anticipate and prepare for future contingencies that cannot be known with certainty has grown into a thriving research enterprise. A fundamental tenet of this line of inquiry is that future-oriented mental time travel, in most of its presentations, is underwritten by a property or an extension of episodic recollection. However, a careful conceptual analysis of exactly how episodic memory functions in this capacity has yet to be undertaken. In this paper I conduct such an analysis. Based on conceptual, phenomenological, and empirical considerations, I conclude that the autonoetic component of episodic memory, not episodic memory per se, is the causally determinative factor enabling an individual to project him or herself into a personal future.
Geomorphological approach in karstic domain: importance of underground water in the Jura mountains.
NASA Astrophysics Data System (ADS)
Rabin, Mickael; Sue, Christian; Champagnac, Jean Daniel; Bichet, Vincent; Carry, Nicolas; Eichenberger, Urs; Mudry, Jacques; Valla, Pierre
2014-05-01
The Jura mountain belt is the north-westernmost and one of the most recent expressions of the Alpine orogeny (i.e. Mio-Pliocene times). The Jura has been well studied from a structural point of view, but still remains the source of scientific debate, especially regarding its current and recent tectonic activity [Laubscher, 1992; Burkhard and Sommaruga, 1998]. It is deemed to be still in a shortening state, according to levelling data [Jouanne et al., 1998] and neotectonic observations [Madritsch et al., 2010]. However, the few GPS data available for the Jura do not show evidence of shortening, but rather a low-magnitude extension parallel to the arc [Walpersdorf et al., 2006]. Moreover, the traditionally accepted assumption of a collisional activity of the Jura raises the question of its geodynamic origin. The Western Alps are themselves in a post-collisional regime and characterized by a noticeable isostasy-related extension, due to the interaction between buoyancy forces and external dynamics [Sue et al., 2007]. Quantitative morphotectonic approaches have been increasingly used in active mountain belts to infer relationships between climate and tectonics in landscape evolution [Whipple, 2009]. In this study, we apply morphometric tools to calcareous bedrock in a slowly deforming mountain belt. In particular, we use watershed metrics and associated river profile analysis to quantify the degree and nature of the equilibrium between tectonic forcing and fluvial erosion [Kirby and Whipple, 2001]. Indeed, long-term river profile evolution is controlled by climatic and tectonic forcing through the following expression [Whipple and Tucker, 1999]: S = (U/K)^(1/n) A^(-m/n) (with U: uplift rate; K: empirical erodibility factor, a function of hydrological and geological settings; A: drained area; m, n: empirical parameters). We present here a systematic analysis of river profiles applied to the main drainage system of the Jura. The objective is to assess to what extent this powerful landscape-analysis tool is applicable to limestone bedrock settings where groundwater flow can be an important component of the hydrological system. First results show that river slopes and knickpoints are poorly controlled by lithological variation within the Jura mountains. Quantitative analyses reveal abnormal longitudinal profiles, which are controlled by either tectonic and/or karstic processes. Evaluating the contributions of tectonics and karst to the destabilization of river profiles is challenging and remains unresolved. However, these morphometric signals seem to be in accordance with the presence of active N-S to NW-SE strike-slip faults, controlling both surface runoff and groundwater flow.
DOT National Transportation Integrated Search
2015-02-01
This TechBrief describes evaluating the use of the Modern-Era Retrospective Analysis for Research and Applications (MERRA) product as an alternative climatic data source for the Mechanistic-Empirical Pavement Design Guide (MEPDG) and other transporta...
Posttraumatic Stress Disorder and Intimate Relationship Problems: A Meta-Analysis
ERIC Educational Resources Information Center
Taft, Casey T.; Watkins, Laura E.; Stafford, Jane; Street, Amy E.; Monson, Candice M.
2011-01-01
Objective: The authors conducted a meta-analysis of empirical studies investigating associations between indices of posttraumatic stress disorder (PTSD) and intimate relationship problems to empirically synthesize this literature. Method: A literature search using PsycINFO, Medline, Published International Literature on Traumatic Stress (PILOTS),…
2012-01-01
Background: Decision-making in healthcare is complex. Research on coverage decision-making has focused on comparative studies for several countries, statistical analyses for single decision-makers, the decision outcome and appraisal criteria. Accounting for decision processes extends the complexity, as they are multidimensional and process elements need to be regarded as latent constructs (composites) that are not observed directly. The objective of this study was to present a practical application of partial least square path modelling (PLS-PM) to evaluate how it offers a method for empirical analysis of decision-making in healthcare. Methods: Empirical approaches that applied PLS-PM to decision-making in healthcare were identified through a systematic literature search. PLS-PM was used as an estimation technique for a structural equation model that specified hypotheses between the components of decision processes and the reasonableness of decision-making in terms of medical, economic and other ethical criteria. The model was estimated for a sample of 55 coverage decisions on the extension of newborn screening programmes in Europe. Results were evaluated by standard reliability and validity measures for PLS-PM. Results: After modification by dropping two indicators that showed poor measures in the measurement models' quality assessment and were not meaningful for newborn screening, the structural equation model estimation produced plausible results. The presence of three influences was supported: the links between both stakeholder participation or transparency and the reasonableness of decision-making; and the effect of transparency on the degree of scientific rigour of assessment. Reliable and valid measurement models were obtained to describe the composites of 'transparency', 'participation', 'scientific rigour' and 'reasonableness'. Conclusions: The structural equation model was among the first applications of PLS-PM to coverage decision-making. It allowed testing of hypotheses in situations where there are links between several non-observable constructs. PLS-PM was compatible in accounting for the complexity of coverage decisions to obtain a more realistic perspective for empirical analysis. The model specification can be used for hypothesis testing by using larger sample sizes and for data in the full domain of health technologies. PMID:22856325
Fischer, Katharina E
2012-08-02
Decision-making in healthcare is complex. Research on coverage decision-making has focused on comparative studies for several countries, statistical analyses for single decision-makers, the decision outcome and appraisal criteria. Accounting for decision processes extends the complexity, as they are multidimensional and process elements need to be regarded as latent constructs (composites) that are not observed directly. The objective of this study was to present a practical application of partial least square path modelling (PLS-PM) to evaluate how it offers a method for empirical analysis of decision-making in healthcare. Empirical approaches that applied PLS-PM to decision-making in healthcare were identified through a systematic literature search. PLS-PM was used as an estimation technique for a structural equation model that specified hypotheses between the components of decision processes and the reasonableness of decision-making in terms of medical, economic and other ethical criteria. The model was estimated for a sample of 55 coverage decisions on the extension of newborn screening programmes in Europe. Results were evaluated by standard reliability and validity measures for PLS-PM. After modification by dropping two indicators that showed poor measures in the measurement models' quality assessment and were not meaningful for newborn screening, the structural equation model estimation produced plausible results. The presence of three influences was supported: the links between both stakeholder participation or transparency and the reasonableness of decision-making; and the effect of transparency on the degree of scientific rigour of assessment. Reliable and valid measurement models were obtained to describe the composites of 'transparency', 'participation', 'scientific rigour' and 'reasonableness'. The structural equation model was among the first applications of PLS-PM to coverage decision-making. It allowed testing of hypotheses in situations where there are links between several non-observable constructs. PLS-PM was compatible in accounting for the complexity of coverage decisions to obtain a more realistic perspective for empirical analysis. The model specification can be used for hypothesis testing by using larger sample sizes and for data in the full domain of health technologies.
Analyzing Empirical Notions of Suffering: Advancing Youth Dialogue and Education
ERIC Educational Resources Information Center
Baring, Rito V.
2010-01-01
This article explores the possibilities of advancing youth dialogue and education among the Filipino youth using empirical notions of students on suffering. Examining empirical data, this analysis exposes uncharted notions of suffering and shows relevant meanings that underscore the plausible trappings of youth dialogue and its benefits on…
Empirical Data Collection and Analysis Using Camtasia and Transana
ERIC Educational Resources Information Center
Thorsteinsson, Gisli; Page, Tom
2009-01-01
One of the possible techniques for collecting empirical data is video recordings of a computer screen with specific screen capture software. This method for collecting empirical data shows how students use the BSCWII (Be Smart Cooperate Worldwide--a web based collaboration/groupware environment) to coordinate their work and collaborate in…
Valuing Informal Arguments and Empirical Investigations during Collective Argumentation
ERIC Educational Resources Information Center
Yopp, David A.
2012-01-01
Considerable literature has documented both the pros and cons of students' use of empirical evidence during proving activities. This article presents an analysis of a classroom episode involving in-service middle school, high school, and college teachers that demonstrates that learners need not be steered away from empirical investigations during…
Estimation of Vulnerability Functions for Debris Flows Using Different Intensity Parameters
NASA Astrophysics Data System (ADS)
Akbas, S. O.; Blahut, J.; Luna, B. Q.; Sterlacchini, S.
2009-04-01
In landslide risk research, the majority of past studies have focused on hazard analysis, with only a few targeting the concept of vulnerability. When debris flows are considered, there is no consensus or even modest agreement on a generalized methodology for estimating the physical vulnerability of affected buildings. Very few quantitative relationships have been proposed between intensities and vulnerability values. More importantly, in most of the existing relationships, information on process intensity is often missing or only described semi-quantitatively. However, robust assessment of vulnerabilities, along with the associated uncertainties, is of utmost importance from a quantitative risk analysis point of view. On the morning of 13th July 2008, after more than two days of intense rainfall, several debris and mud flows were released in central Valtellina, an Italian alpine valley in the Lombardy Region. One of the largest muddy debris flows occurred in Selvetta, a hamlet in the municipality of Colorina. The result was the complete destruction of two buildings and damage of varying severity to eight others. The authors had the chance to gather detailed information about the event by conducting extensive fieldwork and interviews with local inhabitants, civil protection teams, and officials. In addition to the data gathered from the field studies, the main characteristics of the debris flow were estimated using numerical and empirical approaches. The extensive data obtained from the Selvetta event gave an opportunity to develop three separate empirical vulnerability curves, which are functions of deposition height, debris flow velocity, and pressure, respectively. Deposition heights were directly obtained from field surveys, whereas the velocity and pressure values were back-calculated using the finite difference program FLO-2D. The vulnerability was defined as the ratio between the monetary loss and the reconstruction value. The monetary losses were obtained from official RASDA documents, which were compiled for claim purposes. For each building, the approximate reconstruction value was calculated according to the building type and size, using the official data given in the Housing Prices Index prepared by the Engineers and Architects of Milan. The resulting vulnerability curves were compared with those in the literature and among themselves. Specific recommendations are given regarding the most suitable parameter for characterizing the intensity of debris flows within the context of physical vulnerability.
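Fitting such a vulnerability curve can be sketched as below. The functional form (a Weibull-type cumulative curve bounded in [0, 1]) and the sample points are illustrative assumptions; the paper derives its curves from the surveyed losses at Selvetta.

```python
import numpy as np
from scipy.optimize import curve_fit

def vuln(h, a, b):
    """Monotone vulnerability curve V(h) in [0, 1] versus intensity h
    (e.g., deposition height in m). Form assumed for illustration only."""
    return 1.0 - np.exp(-(h / a) ** b)

# Hypothetical (height, loss / reconstruction-value) pairs from a damage survey
h = np.array([0.3, 0.6, 0.9, 1.2, 1.8, 2.5])
v = np.array([0.05, 0.12, 0.25, 0.42, 0.70, 0.95])
(a, b), _ = curve_fit(vuln, h, v, p0=[1.5, 2.0])
```

The same fit can be repeated with back-calculated velocity or impact pressure as the intensity variable, which is how the three alternative curves in the study arise.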
An Analysis of Social Justice Research in School Psychology
ERIC Educational Resources Information Center
Graybill, Emily; Baker, Courtney N.; Cloth, Allison H.; Fisher, Sycarah; Nastasi, Bonnie K.
2018-01-01
The purpose of the current content analysis was to build upon previous empirical research both within school psychology and in other subdisciplines of psychology to refine the operationalized definition of social justice within school psychology research. Operationalizing the definition and substantiating it within the empirical literature is a…
University Student Satisfaction: An Empirical Analysis
ERIC Educational Resources Information Center
Clemes, Michael D.; Gan, Christopher E. C.; Kao, Tzu-Hui
2008-01-01
The purpose of this research is to gain an empirical understanding of students' overall satisfaction with their academic university experiences. A hierarchal model is used as a framework for this analysis. Fifteen hypotheses are formulated and tested, in order to identify the dimensions of service quality as perceived by university students, to…
Determinants of Crime in Virginia: An Empirical Analysis
ERIC Educational Resources Information Center
Ali, Abdiweli M.; Peek, Willam
2009-01-01
This paper is an empirical analysis of the determinants of crime in Virginia. Over a dozen explanatory variables that current literature suggests as important determinants of crime are collected. The data is from 1970 to 2000. These include economic, fiscal, demographic, political, and social variables. The regression results indicate that crime…
What is heartburn worth? A cost-utility analysis of management strategies.
Heudebert, G R; Centor, R M; Klapow, J C; Marks, R; Johnson, L; Wilcox, C M
2000-03-01
To determine the best treatment strategy for the management of patients presenting with symptoms consistent with uncomplicated heartburn, we performed a cost-utility analysis of 4 alternatives: empirical proton pump inhibitor, empirical histamine-2 receptor antagonist, and diagnostic strategies consisting of either esophagogastroduodenoscopy (EGD) or an upper gastrointestinal series before treatment. The time horizon of the model was 1 year. The base case analysis assumed a cohort of otherwise healthy 45-year-old individuals in a primary care practice. Empirical treatment with a proton pump inhibitor was projected to provide the greatest quality-adjusted survival for the cohort. Empirical treatment with a histamine-2 receptor antagonist was projected to be the least costly of the alternatives. The marginal cost-effectiveness of using a proton pump inhibitor over a histamine-2 receptor antagonist was approximately $10,400 per quality-adjusted life-year (QALY) gained in the base case analysis and remained less than $50,000 per QALY as long as the utility for heartburn was less than 0.95. Both diagnostic strategies were dominated by the proton pump inhibitor alternative. Empirical treatment seems to be the optimal initial management strategy for patients with heartburn, but the choice between a proton pump inhibitor and a histamine-2 receptor antagonist depends on the impact of heartburn on quality of life.
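The headline figure is an incremental cost-effectiveness ratio computed in the usual way,

$$ \mathrm{ICER} \;=\; \frac{C_{\mathrm{PPI}}-C_{\mathrm{H2RA}}}{E_{\mathrm{PPI}}-E_{\mathrm{H2RA}}} \;\approx\; \$10{,}400 \text{ per QALY gained,} $$

i.e., the extra cost of the proton pump inhibitor (PPI) strategy divided by the extra quality-adjusted life-years it yields over the histamine-2 receptor antagonist (H2RA) strategy.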
ERIC Educational Resources Information Center
Gansemer, Lawrence P.; Bealer, Robert C.
Using data generated from the records of 460 rural-reared Pennsylvania males contacted initially as sophomores in 1947 and again in 1957 and 1971, an effort was made to replicate the tradition of path analytic, causal modeling of status attainment in American society and to assess the empirical efficacy of certain family input variables not…
Acoustic Scattering by Near-Surface Inhomogeneities in Porous Media
1990-02-21
surfaces [8]. Recently, this empirical model has been replaced by a more rigorous microstructural model [9]. Here, the acoustical characteristics of ... boundaries. A discussion of how ground acoustic characteristics are modelled then follows, with the chapter being concluded by a brief summary. 3.1 ... of ground acoustic characteristics, with particular emphasis on the four-parameter model of Attenborough, which will be used extensively later.
ERIC Educational Resources Information Center
Desjardins, Richard
2013-01-01
This study considers the extensive critique of the impact of the "market" or "neoliberal" model on learning and its outcomes in the light of alternative models. The purpose is to consider the potential impacts of the market on learning and its outcomes and to contextualise critique by considering alternative coordination…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sundararaman, Ravishankar; Gunceler, Deniz; Arias, T. A.
2014-10-07
Continuum solvation models enable efficient first-principles calculations of chemical reactions in solution, but require extensive parametrization and fitting for each solvent and class of solute systems. Here, we examine the assumptions of continuum solvation models in detail and replace empirical terms with physical models in order to construct a minimally empirical solvation model. Specifically, we derive solvent radii from the nonlocal dielectric response of the solvent from ab initio calculations, construct a closed-form and parameter-free weighted-density approximation for the free energy of cavity formation, and employ a pair-potential approximation for the dispersion energy. We show that the resulting model with a single solvent-independent parameter, the electron density threshold (n_c), and a single solvent-dependent parameter, the dispersion scale factor (s_6), reproduces solvation energies of organic molecules in water, chloroform, and carbon tetrachloride with RMS errors of 1.1, 0.6 and 0.5 kcal/mol, respectively. We additionally show that fitting the solvent-dependent s_6 parameter to the solvation energy of a single non-polar molecule does not substantially increase these errors. Parametrization of this model for other solvents, therefore, requires minimal effort and is possible without extensive databases of experimental solvation free energies.
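Schematically, the model assembles the solvation free energy from the three pieces described above,

$$ \Delta G_{\mathrm{solv}} \;=\; \Delta G_{\mathrm{elec}}(n_c) \;+\; \Delta G_{\mathrm{cav}} \;+\; s_6\,\Delta G_{\mathrm{disp}}, $$

with the electrostatic cavity set by the electron-density threshold n_c, a parameter-free cavity-formation term, and a dispersion term scaled by s_6. This decomposition is a reading of the abstract, not necessarily the paper's exact notation.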
NASA Astrophysics Data System (ADS)
Monteys, Xavier; Harris, Paul; Caloca, Silvia
2014-05-01
The coastal shallow-water zone can be a challenging and expensive environment in which to acquire bathymetry and other oceanographic data using traditional survey methods. Dangers and limited swath coverage make some of these areas unfeasible to survey using ship-borne systems, and turbidity can preclude marine LIDAR. As a result, an extensive part of the coastline worldwide remains completely unmapped. Satellite EO multispectral data, after processing, allow timely, cost-efficient and quality-controlled information to be used for planning, monitoring, and regulating coastal environments. They have the potential to deliver repetitive derivation of medium-resolution bathymetry, coastal water properties and seafloor characteristics in shallow waters. Over the last 30 years, satellite passive imaging methods for bathymetry extraction, implementing analytical or empirical approaches, have had limited success in predicting water depths. Different wavelengths of solar light penetrate the water column to varying depths. They can provide acceptable results up to 20 m but become less accurate in deeper waters. The study area is located in the inner part of Dublin Bay, on the east coast of Ireland. The region investigated is a C-shaped inlet covering an area 10 km long and 5 km wide, with water depths ranging from 0 to 10 m. The methodology employed in this research uses a ratio of reflectances from SPOT 5 satellite bands, differing from standard linear transform algorithms. High-accuracy water depths were derived using multibeam data. The final empirical model uses spatially weighted geographical tools to retrieve predicted depths. The results of this paper confirm that SPOT satellite scenes are suitable for predicting depths using empirical models in very shallow embayments. Spatial regression models show better adjustment in the predictions than non-spatial models. The spatial regression equation used provides realistic results down to 6 m below the water surface, with reliable and error-controlled depths. Bathymetric extraction approaches involving satellite imagery are regarded as a fast, successful and economically advantageous solution for automatic water depth calculation in shallow and complex environments.
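A simple global version of such a band-ratio predictor is the well-known log-ratio transform (Stumpf et al., 2003), sketched below for two bands; the study itself goes further by fitting spatially weighted (local) regressions, and the names and the constant n here are illustrative assumptions.

```python
import numpy as np

def calibrate_ratio_model(b1, b2, depth, n=1000.0):
    """Fit depth = m1 * ln(n*b1)/ln(n*b2) - m0 by least squares against
    multibeam depths. b1, b2: reflectances in two bands (for SPOT 5,
    e.g., green and red); n is a fixed scaling keeping both logs positive."""
    x = np.log(n * b1) / np.log(n * b2)
    A = np.column_stack([x, -np.ones_like(x)])
    (m1, m0), *_ = np.linalg.lstsq(A, depth, rcond=None)
    return m1, m0

def predict_depth(b1, b2, m1, m0, n=1000.0):
    """Apply the calibrated log-ratio model to new pixels."""
    return m1 * (np.log(n * b1) / np.log(n * b2)) - m0
```

The ratio falls as water deepens because the more strongly attenuated band loses signal faster, which is what the calibrated slope m1 converts into metres of depth.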
NASA Astrophysics Data System (ADS)
Norros, Veera; Laine, Marko; Lignell, Risto; Thingstad, Frede
2017-10-01
Methods for extracting empirically and theoretically sound parameter values are urgently needed in aquatic ecosystem modelling to describe key flows and their variation in the system. Here, we compare three Bayesian formulations for mechanistic model parameterization that differ in their assumptions about the variation in parameter values between various datasets: 1) global analysis - no variation, 2) separate analysis - independent variation and 3) hierarchical analysis - variation arising from a shared distribution defined by hyperparameters. We tested these methods, using computer-generated and empirical data, coupled with simplified and reasonably realistic plankton food web models, respectively. While all methods were adequate, the simulated example demonstrated that a well-designed hierarchical analysis can result in the most accurate and precise parameter estimates and predictions, due to its ability to combine information across datasets. However, our results also highlighted sensitivity to hyperparameter prior distributions as an important caveat of hierarchical analysis. In the more complex empirical example, hierarchical analysis was able to combine precise identification of parameter values with reasonably good predictive performance, although the ranking of the methods was less straightforward. We conclude that hierarchical Bayesian analysis is a promising tool for identifying key ecosystem-functioning parameters and their variation from empirical datasets.
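In symbols, with datasets d = 1..D, the three formulations compared here differ only in where the parameters θ_d come from:

$$ \text{global: } \theta_d = \theta; \qquad \text{separate: } \theta_d \sim p(\theta) \text{ independently}; \qquad \text{hierarchical: } \theta_d \sim p(\theta \mid \eta),\; \eta \sim p(\eta) $$

The hierarchical case borrows strength across datasets through the shared hyperparameters η, which is why it can combine precise parameter identification with good prediction, and also why its results are sensitive to the priors placed on η, the caveat noted in the abstract.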
Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad
2012-01-01
The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort study including all BSI episodes in adult patients was conducted in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
Geophysical evaluation of sandstone aquifers in the Reconcavo-Tucano Basin, Bahia -- Brazil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lima, O.A.L. de
1993-11-01
The upper clastic sediments in the Reconcavo-Tucano basin comprise a multilayer aquifer system of Jurassic age. Its groundwater is normally fresh down to depths of more than 1,000 m. Locally, however, there are zones producing high salinity or sulfur geothermal water. Analysis of electrical logs of more than 150 wells enabled the identification of the most typical sedimentary structures and the gross geometries for the sandstone units in selected areas of the basin. Based on this information, the thick sands are interpreted as coalescent point bars and the shales as flood plain deposits of a large fluvial environment. The resistivity logs and core laboratory data are combined to develop empirical equations relating aquifer porosity and permeability to log-derived parameters such as formation factor and cementation exponent. Temperature logs of 15 wells were useful to quantify the water leakage through semiconfining shales. The groundwater quality was inferred from spontaneous potential (SP) log deflections under control of chemical analysis of water samples. An empirical chart is developed that relates the SP-derived water resistivity to the true water resistivity within the formations. The patterns of salinity variation with depth inferred from SP logs were helpful in identifying subsurface flows along major fault zones, where extensive mixing of water is taking place. A total of 49 vertical Schlumberger resistivity soundings aid in defining aquifer structures and in extrapolating the log-derived results. Transition zones between fresh and saline waters have also been detected based on a combination of logging and surface sounding data. Ionic filtering by water leakage across regional shales, local convection and mixing along major faults and hydrodynamic dispersion away from lateral permeability contrasts are the main mechanisms controlling the observed distributions of salinity and temperature within the basin.
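Although the abstract does not name it, the standard starting point for such log-derived relations is Archie's law,

$$ F \;=\; \frac{R_o}{R_w} \;=\; a\,\phi^{-m}, $$

where F is the formation factor, R_o the resistivity of the fully water-saturated rock, R_w the water resistivity, φ the porosity, m the cementation exponent, and a an empirical constant; the basin-specific equations mentioned here would be calibrations of this kind against core data.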
NASA Astrophysics Data System (ADS)
Dreger, D. S.; Boyd, O. S.; Taira, T.; Gritto, R.
2017-12-01
Enhanced Geothermal System (EGS) resource development requires knowledge of subsurface physical parameters to quantify the evolution of fracture networks. Spatio-temporal source properties of induced seismicity, including source dimension, rupture area, slip, rupture speed, and slip velocity, are of interest at The Geysers geothermal field, northern California, to map the coseismic fracture density of the EGS swarm. In this investigation we extend our previous finite-source analysis of selected M>4 earthquakes to examine source properties of smaller-magnitude seismicity located in the Northwest Geysers Enhanced Geothermal System (EGS) demonstration project. Moment rate time histories of the source are found using empirical Green's function (eGf) deconvolution with the method of Mori (1993) as implemented by Dreger et al. (2007). The moment rate functions (MRFs) from data recorded by the Lawrence Berkeley National Laboratory (LBNL) short-period geophone network are inverted for finite-source parameters including the spatial distribution of fault slip, rupture velocity, and the orientation of the causative fault plane. The results show complexity in the MRFs of the studied earthquakes. Thus far the estimated rupture area and the magnitude-area trend of the smaller-magnitude Geysers seismicity are found to agree with the empirical relationships of Wells and Coppersmith (1994) and Leonard (2010), which were developed for much larger M>5.5 earthquakes worldwide, indicating self-similar behavior extending to M2 earthquakes. We will present finite-source inversion results for the micro-earthquakes, attempting to extend the analysis to sub-Mw events, and demonstrate their magnitude-area scaling. The extension of the scaling laws will then enable the mapping of coseismic fracture density of the EGS swarm in the Northwest Geysers based on catalog moment magnitude estimates.
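Generic eGf deconvolution by spectral division can be sketched as follows; this is a textbook water-level variant for illustration, not the specific Mori (1993) algorithm used in the study.

```python
import numpy as np

def egf_deconvolve(main, egf, dt, wl=0.01):
    """Recover a relative moment-rate function by water-level spectral
    division of a mainshock record by a collocated, smaller eGf record.

    main, egf : waveforms sampled at interval dt (s)
    wl        : water level as a fraction of the peak eGf spectral power
    """
    n = 1 << int(np.ceil(np.log2(max(len(main), len(egf)) * 2)))  # zero-pad
    M, G = np.fft.rfft(main, n), np.fft.rfft(egf, n)
    denom = np.maximum(np.abs(G) ** 2, wl * np.max(np.abs(G) ** 2))
    mrf = np.fft.irfft(M * np.conj(G) / denom, n)
    return np.arange(n) * dt, mrf
```

The water level stabilizes the division where the eGf spectrum is weak; the pulse width and shape of the resulting MRF are what the finite-source inversion then interprets in terms of rupture area and slip.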
Pasaniuc, Bogdan; Sankararaman, Sriram; Torgerson, Dara G.; Gignoux, Christopher; Zaitlen, Noah; Eng, Celeste; Rodriguez-Cintron, William; Chapela, Rocio; Ford, Jean G.; Avila, Pedro C.; Rodriguez-Santana, Jose; Chen, Gary K.; Le Marchand, Loic; Henderson, Brian; Reich, David; Haiman, Christopher A.; Gonzàlez Burchard, Esteban; Halperin, Eran
2013-01-01
Motivation: Local ancestry analysis of genotype data from recently admixed populations (e.g. Latinos, African Americans) provides key insights into population history and disease genetics. Although methods for local ancestry inference have been extensively validated in simulations (under many unrealistic assumptions), no empirical study of local ancestry accuracy in Latinos exists to date. Hence, interpreting findings that rely on local ancestry in Latinos is challenging. Results: Here, we use 489 nuclear families from the mainland USA, Puerto Rico and Mexico in conjunction with 3204 unrelated Latinos from the Multiethnic Cohort study to provide the first empirical characterization of local ancestry inference accuracy in Latinos. Our approach for identifying errors does not rely on simulations but on the observation that local ancestry in families follows Mendelian inheritance. We measure the rate of local ancestry assignments that lead to Mendelian inconsistencies in local ancestry in trios (MILANC), which provides a lower bound on errors in the local ancestry estimates. We show that MILANC rates observed in simulations underestimate the rate observed in real data, and that MILANC varies substantially across the genome. Second, across a wide range of methods, we observe that loci with large deviations in local ancestry also show enrichment in MILANC rates. Therefore, local ancestry estimates at such loci should be interpreted with caution. Finally, we reconstruct ancestral haplotype panels to be used as reference panels in local ancestry inference and show that ancestry inference is significantly improved by incorporating these reference panels. Availability and implementation: We provide the reconstructed reference panels together with the maps of MILANC rates as a public resource for researchers analyzing local ancestry in Latinos at http://bogdanlab.pathology.ucla.edu. Contact: bpasaniuc@mednet.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23572411
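A minimal sketch of the Mendelian-consistency check behind the MILANC rate, assuming local ancestry at a locus is summarised as a dosage (0, 1 or 2 copies of a given ancestral population); the helper names are hypothetical:

```python
def transmissible(parent_dosage):
    """Copies of the ancestry (0 or 1) that a parent can pass on one
    haplotype, given that parent's local ancestry dosage at the locus."""
    return {0: {0}, 1: {0, 1}, 2: {1}}[parent_dosage]

def mendelian_consistent(child, mother, father):
    """True if the child's dosage can be formed from one haplotype of each
    parent; loci where this fails contribute to the MILANC rate."""
    return any(a + b == child
               for a in transmissible(mother)
               for b in transmissible(father))

# mother = 2, father = 0: the child must carry exactly one copy.
assert not mendelian_consistent(child=0, mother=2, father=0)
assert mendelian_consistent(child=1, mother=2, father=0)
```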
Wangmo, Tenzin; Hauri, Sirin; Gennet, Eloise; Anane-Sarpong, Evelyn; Provoost, Veerle; Elger, Bernice S
2018-02-07
A review of literature published a decade ago noted a significant increase in empirical papers across nine bioethics journals. This study provides an update on the presence of empirical papers in the same nine journals. It first evaluates whether the empirical trend is continuing as noted in the previous study, and second, how it is changing, that is, what are the characteristics of the empirical works published in these nine bioethics journals. A review of the same nine journals (Bioethics; Journal of Medical Ethics; Journal of Clinical Ethics; Nursing Ethics; Cambridge Quarterly of Healthcare Ethics; Hastings Center Report; Theoretical Medicine and Bioethics; Christian Bioethics; and Kennedy Institute of Ethics Journal) was conducted for a 12-year period from 2004 to 2015. Data obtained were analysed descriptively and using a non-parametric Chi-square test. Of the total number of original papers (N = 5567) published in the nine bioethics journals, 18.1% (n = 1007) collected and analysed empirical data. Journal of Medical Ethics and Nursing Ethics led the empirical publications, accounting for 89.4% of all empirical papers. The former published significantly more quantitative papers than qualitative, whereas the latter published more qualitative papers. Our analysis reveals no significant difference (χ2 = 2.857; p = 0.091) between the proportion of empirical papers published in 2004-2009 and 2010-2015. However, the increasing empirical trend has continued in these journals, with the proportion of empirical papers increasing from 14.9% in 2004 to 17.8% in 2015. This study presents the current state of affairs regarding empirical research published in nine bioethics journals. In the quarter century of data that is available about the nine bioethics journals studied in two reviews, the proportion of empirical publications continues to increase, signifying a trend towards empirical research in bioethics. The growing volume is mainly attributable to two journals: Journal of Medical Ethics and Nursing Ethics. This descriptive study further maps the still developing field of empirical research in bioethics. Additional studies are needed to completely map the nature and extent of empirical research in bioethics to inform the ongoing debate about the value of empirical research for bioethics.
Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis
ERIC Educational Resources Information Center
Ball, Richard; Medeiros, Norm
2012-01-01
This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…
NASA Astrophysics Data System (ADS)
Ahmed, Rounaq; Srinivasa Pai, P.; Sriram, N. S.; Bhat, Vasudeva
2018-02-01
Vibration analysis has been used extensively in the recent past for gear fault diagnosis. The vibration signals extracted are usually contaminated with noise, which may lead to wrong interpretation of results. Denoising the extracted vibration signals aids fault diagnosis by giving meaningful results. The Wavelet Transform (WT) increases the signal-to-noise ratio (SNR), reduces the root mean square error (RMSE), and is effective in denoising gear vibration signals. The extracted signals have to be denoised with a properly selected denoising scheme in order to prevent the loss of signal information along with the noise. This work demonstrates the effectiveness of Principal Component Analysis (PCA) in denoising gear vibration signals. In this regard, three selected wavelet-based denoising schemes, namely PCA, Empirical Mode Decomposition (EMD) and NeighCoeff (NC), have been compared with Adaptive Threshold (AT), an extensively used wavelet-based denoising scheme for gear vibration signals. The vibration signals acquired from a customized gear test rig were denoised by the four schemes mentioned above. The fault identification capability, as well as the SNR, kurtosis and RMSE, of the four denoising schemes have been compared. Features extracted from the denoised signals have been used to train and test artificial neural network (ANN) models. The performances of the four denoising schemes have been evaluated based on the performance of the ANN models, and the best denoising scheme has been identified from the classification accuracy results. PCA proved effective in all these regards and is identified as the best denoising scheme.
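As a sketch of the PCA idea (not the authors' exact wavelet-domain pipeline), the signal can be cut into frames and reconstructed from the leading principal components only, so that low-variance directions carrying broadband noise are discarded. Frame length and component count below are illustrative:

```python
import numpy as np

def pca_denoise(signal, frame=64, n_components=4):
    """Rank-reduction denoising: reshape a 1-D vibration signal into
    frames, keep the leading principal components, reconstruct."""
    n_frames = len(signal) // frame
    X = signal[:n_frames * frame].reshape(n_frames, frame)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    s[n_components:] = 0.0                 # discard the noise subspace
    return (U @ np.diag(s) @ Vt + mean).ravel()

# Synthetic gear-mesh tone buried in noise, for illustration.
t = np.arange(4096)
x = np.sin(2 * np.pi * t / 64) + 0.5 * np.random.randn(t.size)
clean = pca_denoise(x)
```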
Power independent EMG based gesture recognition for robotics.
Li, Ling; Looney, David; Park, Cheolsoo; Rehman, Naveed U; Mandic, Danilo P
2011-01-01
A novel method for detecting muscle contraction is presented. This method is further developed for identifying four different gestures to facilitate a hand gesture controlled robot system. It is achieved based on surface Electromyograph (EMG) measurements of groups of arm muscles. The cross-information is preserved through a simultaneous processing of EMG channels using a recent multivariate extension of Empirical Mode Decomposition (EMD). Next, phase synchrony measures are employed to make the system robust to different power levels due to electrode placements and impedances. The multiple pairwise muscle synchronies are used as features of a discrete gesture space comprising four gestures (flexion, extension, pronation, supination). Simulations on real-time robot control illustrate the enhanced accuracy and robustness of the proposed methodology.
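A common amplitude-independent synchrony measure is the phase-locking value computed from Hilbert-transform phases; this sketch shows the idea for one channel pair (the paper applies such measures to the modes of a multivariate EMD, which is not reproduced here):

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase synchrony between two EMG channels: the mean resultant length
    of the instantaneous phase difference (1 = perfectly locked, 0 = none).
    Amplitudes are discarded, which makes the measure power independent."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))
```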
Test/semi-empirical analysis of a carbon/epoxy fabric stiffened panel
NASA Technical Reports Server (NTRS)
Spier, E. E.; Anderson, J. A.
1990-01-01
The purpose of this work-in-progress is to present a semi-empirical analysis method developed to predict the buckling and crippling loads of carbon/epoxy fabric blade-stiffened panels in compression. This is a hand analysis method comprised of well known, accepted techniques, logical engineering judgements, and experimental data that results in conservative solutions. In order to verify this method, a stiffened panel was fabricated and tested. Both the test and analysis results are presented.
Concept Analysis of Spirituality: An Evolutionary Approach.
Weathers, Elizabeth; McCarthy, Geraldine; Coffey, Alice
2016-04-01
The aim of this article is to clarify the concept of spirituality for future nursing research. Previous concept analyses of spirituality have mostly reviewed the conceptual literature with little consideration of the empirical literature. The literature reviewed in prior concept analyses extends from 1972 to 2005, with no analysis conducted in the past 9 years. Rodgers' evolutionary framework was used to review both the theoretical and empirical literature pertaining to spirituality. Evolutionary concept analysis is a formal method of philosophical inquiry, in which papers are analyzed to identify attributes, antecedents, and consequences of the concept. Empirical and conceptual literature. Three defining attributes of spirituality were identified: connectedness, transcendence, and meaning in life. A conceptual definition of spirituality was proposed based on the findings. Also, four antecedents and five primary consequences of spirituality were identified. Spirituality is a complex concept. This concept analysis adds some clarification by proposing a definition of spirituality that is underpinned by both conceptual and empirical research. Furthermore, exemplars of spirituality, based on prior qualitative research, are presented to support the findings. Hence, the findings of this analysis could guide future nursing research on spirituality. © 2015 Wiley Periodicals, Inc.
A Conversation Analysis-Informed Test of L2 Aural Pragmatic Comprehension
ERIC Educational Resources Information Center
Walters, F. Scott
2009-01-01
Speech act theory-based, second language pragmatics testing (SLPT) raises test-validation issues owing to a lack of correspondence with empirical conversational data. On the assumption that conversation analysis (CA) provides a more accurate account of language use, it is suggested that CA serve as a more empirically valid basis for SLPT…
ERIC Educational Resources Information Center
Sanderson, Matthew R.; Kentor, Jeffrey D.
2009-01-01
It is widely argued that globalization and economic development are associated with international migration. However, these relationships have not been tested empirically. We use a cross-national empirical analysis to assess the impact of global and national factors on international migration from less-developed countries. An interdisciplinary…
Competences in Romanian Higher Education--An Empirical Investigation for the Business Sector
ERIC Educational Resources Information Center
Deaconu, Adela; Nistor, Cristina Silvia
2017-01-01
This research study particularizes the general descriptions of the European Qualifications Framework for Lifelong Learning, as compiled and developed within the Romanian qualification framework, to the business and economics field in general and to the property economic analysis and valuation field in particular. By means of an empirical analysis,…
ERIC Educational Resources Information Center
Friman, Margareta; Nyberg, Claes; Norlander, Torsten
2004-01-01
A descriptive qualitative analysis of in-depth interviews involving seven provincial Soccer Association referees was carried out in order to find out how referees experience threats and aggression directed to soccer referees. The Empirical Phenomenological Psychological method (EPP-method) was used. The analysis resulted in thirty categories which…
ERIC Educational Resources Information Center
Stefl-Mabry, Joette
2003-01-01
Describes a study that empirically identified individual preferences profiles to understand information-seeking behavior among professional groups for six selected information sources. Highlights include Social Judgment Analysis; the development of the survey used, a copy of which is appended; hypotheses tested; results of multiple regression…
Critical Access Hospitals and Retail Activity: An Empirical Analysis in Oklahoma
ERIC Educational Resources Information Center
Brooks, Lara; Whitacre, Brian E.
2011-01-01
Purpose: This paper takes an empirical approach to determining the effect that a critical access hospital (CAH) has on local retail activity. Previous research on the relationship between hospitals and economic development has primarily focused on single-case, multiplier-oriented analysis. However, as the efficacy of federal and state-level rural…
ERIC Educational Resources Information Center
Rine, P. Jesse; Guthrie, David S.
2016-01-01
Leaders of evangelical Christian colleges must navigate a challenging environment shaped by public concern about college costs and educational quality, federal inclinations toward increased regulation, and lingering fallout from the Great Recession. Proceeding from the premise that empirical analysis empowers institutional actors to lead well in…
ERIC Educational Resources Information Center
Coromaldi, Manuela; Zoli, Mariangela
2012-01-01
Theoretical and empirical studies have recently adopted a multidimensional concept of poverty. There is considerable debate about the most appropriate degree of multidimensionality to retain in the analysis. In this work we add to the received literature in two ways. First, we derive indicators of multiple deprivation by applying a particular…
Extended Analysis of Empirical Citations with Skinner's "Verbal Behavior": 1984-2004
ERIC Educational Resources Information Center
Dixon, Mark R.; Small, Stacey L.; Rosales, Rocio
2007-01-01
The present paper comments on and extends the citation analysis of verbal operant publications based on Skinner's "Verbal Behavior" (1957) by Dymond, O'Hora, Whelan, and O'Donovan (2006). Variations in population parameters were evaluated for only those studies that Dymond et al. categorized as empirical. Preliminary results indicate that the…
Artifact removal from EEG data with empirical mode decomposition
NASA Astrophysics Data System (ADS)
Grubov, Vadim V.; Runnova, Anastasiya E.; Efremova, Tatyana Yu.; Hramov, Alexander E.
2017-03-01
In this paper we propose a novel method for dealing with the physiological artifacts caused by intensive activity of facial and neck muscles and other movements in experimental human EEG recordings. The method is based on analysis of EEG signals with empirical mode decomposition (Hilbert-Huang transform). We introduce the mathematical algorithm of the method with the following steps: empirical mode decomposition of the EEG signal, identification of the empirical modes containing artifacts, removal of those modes, and reconstruction of the initial EEG signal. We test the method by filtering experimental human EEG signals for movement artifacts and show its high efficiency.
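A minimal sketch of the described pipeline, assuming the PyEMD package (EMD-signal) for the decomposition step; how the artifact-carrying modes are chosen is a separate step, here left to the caller:

```python
import numpy as np
from PyEMD import EMD  # assumed dependency: pip install EMD-signal

def remove_artifact_modes(eeg, artifact_idx):
    """Decompose one EEG channel into intrinsic mode functions, drop the
    modes judged to carry muscle/movement artifacts, and reconstruct the
    signal from the remaining modes (plus residue, if returned)."""
    imfs = EMD()(np.asarray(eeg, dtype=float))   # rows = modes
    keep = [i for i in range(imfs.shape[0]) if i not in set(artifact_idx)]
    return imfs[keep].sum(axis=0)

# e.g. drop the two highest-frequency modes identified as muscle artifact:
# clean = remove_artifact_modes(raw_channel, artifact_idx=[0, 1])
```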
Evaluating the evidence base for relational frame theory: a citation analysis.
Dymond, Simon; May, Richard J; Munnelly, Anita; Hoon, Alice E
2010-01-01
Relational frame theory (RFT) is a contemporary behavior-analytic account of language and cognition. Since it was first outlined in 1985, RFT has generated considerable controversy and debate, and several claims have been made concerning its evidence base. The present study sought to evaluate the evidence base for RFT by undertaking a citation analysis and by categorizing all articles that cited RFT-related search terms. A total of 174 articles were identified between 1991 and 2008, 62 (36%) of which were empirical and 112 (64%) were nonempirical articles. Further analyses revealed that 42 (68%) of the empirical articles were classified as empirical RFT and 20 (32%) as empirical other, whereas 27 (24%) of the nonempirical articles were assigned to the nonempirical reviews category and 85 (76%) to the nonempirical conceptual category. In addition, the present findings show that the majority of empirical research on RFT has been conducted with typically developing adult populations, on the relational frame of sameness, and has tended to be published in either The Psychological Record or the Journal of the Experimental Analysis of Behavior. Overall, RFT has made a substantial contribution to the literature in a relatively short period of time.
How GPs value guidelines applied to patients with multimorbidity: a qualitative study.
Luijks, Hilde; Lucassen, Peter; van Weel, Chris; Loeffen, Maartje; Lagro-Janssen, Antoine; Schermer, Tjard
2015-10-26
To explore and describe the value general practitioners (GPs) attribute to medical guidelines when they are applied to patients with multimorbidity, and to describe which benefits GPs experience from guideline adherence in these patients. Also, we aimed to identify limitations from guideline adherence in patients with multimorbidity, as perceived by GPs, and to describe their empirical solutions to manage these obstacles. Focus group study with purposive sampling of participants. Focus groups were guided by an experienced moderator who used an interview guide. Interviews were transcribed verbatim. Data analysis was performed by two researchers using the constant comparison analysis technique, and field notes were used in the analysis. Data collection proceeded until saturation was reached. Primary care, eastern part of The Netherlands. Dutch GPs, heterogeneous in age, sex and academic involvement. 25 GPs participated in five focus groups. GPs valued the guidance that guidelines provide, but experienced shortcomings when they were applied to patients with multimorbidity. Taking these patients' personal circumstances into account was regarded as important, but it was impeded by a consistent focus on guideline adherence. Preventative measures were considered less appropriate in (elderly) patients with multimorbidity. Moreover, the applicability of guidelines in patients with multimorbidity was questioned. GPs' extensive practical experience with managing multimorbidity resulted in several empirical solutions, for example, using their 'common sense' to respond to the perceived shortcomings. GPs applying guidelines for patients with multimorbidity integrate patient-specific factors in their medical decisions, aiming for patient-centred solutions. Such integration of clinical experience and best evidence is required to practise evidence-based medicine. More flexibility in pay-for-performance systems is needed to facilitate this integration. Several improvements in guideline reporting are necessary to enhance the applicability of guidelines in patients with multimorbidity.
Concurrent enterprise: a conceptual framework for enterprise supply-chain network activities
NASA Astrophysics Data System (ADS)
Addo-Tenkorang, Richard; Helo, Petri T.; Kantola, Jussi
2017-04-01
Supply-chain management (SCM) in manufacturing industries has evolved significantly over the years, and much recent research has focused on the development of integrated solutions: collaborative optimisation of the geographical, just-in-time (JIT), quality (customer demand/satisfaction) and return-on-investment (profit) aspects of organisational management and planning through 'best practice' business-process management concepts and applications, employing system tools such as enterprise resource planning (ERP) and SCM information technology (IT) enablers to enhance integrated product development and concurrent engineering principles. This article draws on three main applications of organisation theory to position its assumptions and proposes a feasible industry-specific framework that is not currently included in implementation level four (4) of the SCOR model, nor in other existing SCM integration reference models such as the Process Interchange Format (PIF) of the MIT process handbook or the TOVE project, and that could be replicated in other supply chains. The wider contribution of this paper is a complementary framework to the Supply Chain Council's SCOR reference model. Quantitative empirical closed-ended questionnaires, in addition to the main data collected from a qualitative real-life industrial pilot case study, were used to propose a conceptual concurrent enterprise framework for SCM network activities. The research adopts a design structure matrix simulation approach to propose an optimal enterprise SCM-networked, value-adding, customised master data-management platform/portal for efficient SCM network information exchange and an effective supply-chain (SC) network systems-design team structure. Furthermore, social network theory is employed, in a triangulation approach with statistical correlation analysis, to assess the frequency, importance, level of collaboration, mutual trust, and roles and responsibilities within the enterprise SCM network for systems product development (PD) design teams' technical communication, complemented by extensive literature review.
MEG-SIM: a web portal for testing MEG analysis methods using realistic simulated and empirical data.
Aine, C J; Sanfratello, L; Ranken, D; Best, E; MacArthur, J A; Wallace, T; Gilliam, K; Donahue, C H; Montaño, R; Bryant, J E; Scott, A; Stephen, J M
2012-04-01
MEG and EEG measure electrophysiological activity in the brain with exquisite temporal resolution. Because of this unique strength relative to noninvasive hemodynamic-based measures (fMRI, PET), the complementary nature of hemodynamic and electrophysiological techniques is becoming more widely recognized (e.g., Human Connectome Project). However, the available analysis methods for solving the inverse problem for MEG and EEG have not been compared and standardized to the extent that they have for fMRI/PET. A number of factors, including the non-uniqueness of the solution to the inverse problem for MEG/EEG, have led to multiple analysis techniques which have not been tested on consistent datasets, making direct comparisons of techniques challenging (or impossible). Since each of the methods is known to have their own set of strengths and weaknesses, it would be beneficial to quantify them. Toward this end, we are announcing the establishment of a website containing an extensive series of realistic simulated data for testing purposes (http://cobre.mrn.org/megsim/). Here, we present: 1) a brief overview of the basic types of inverse procedures; 2) the rationale and description of the testbed created; and 3) cases emphasizing functional connectivity (e.g., oscillatory activity) suitable for a wide assortment of analyses including independent component analysis (ICA), Granger Causality/Directed transfer function, and single-trial analysis. PMID:22068921
Empirical relations for cavitation and liquid impingement erosion processes
NASA Technical Reports Server (NTRS)
Rao, P. V.; Buckley, D. H.
1984-01-01
A unified power-law relationship between average erosion rate and cumulative erosion is presented. Extensive data analyses from venturi, magnetostriction (stationary and oscillating specimens), liquid drop, and jet impact devices appear to conform to this relation. A normalization technique using cavitation and liquid impingement erosion data is also presented to facilitate prediction. Attempts are made to understand the relationship between the coefficients in the power-law relationships and the material properties.
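A sketch of how such a unified power law, average rate = k * (cumulative erosion)**n, can be fitted to device data by least squares in log-log space; the numbers below are illustrative, not the report's data:

```python
import numpy as np

# Hypothetical cumulative erosion (mg) and average erosion rate (mg/h).
cum = np.array([2.0, 5.0, 10.0, 20.0, 50.0, 100.0])
rate = np.array([0.8, 1.5, 2.4, 3.9, 7.6, 12.1])

# Fit rate = k * cum**n by linear least squares on the log-log values.
n, log_k = np.polyfit(np.log(cum), np.log(rate), 1)
k = np.exp(log_k)
print(f"rate ~= {k:.2f} * cumulative**{n:.2f}")  # here n ~ 0.7
```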
An Initial Model of Requirements Traceability: An Empirical Study
1992-09-22
procedures have been used extensively in the study of human problem-solving, including such areas as general problem-solving behavior, physics problem... been doing unless you have traceability." "Humans don't go back to the requirements enough." "Traceability should be extremely helpful with... by constraints on its usage: ("Traceability needs to be something that humans can work with, not just a whip held over people." "Traceability should
Simultaneous confidence bands for Cox regression from semiparametric random censorship.
Mondal, Shoubhik; Subramanian, Sundarraman
2016-01-01
Cox regression is combined with semiparametric random censorship models to construct simultaneous confidence bands (SCBs) for subject-specific survival curves. Simulation results are presented to compare the performance of the proposed SCBs with the SCBs that are based only on standard Cox. The new SCBs provide correct empirical coverage and are more informative. The proposed SCBs are illustrated with two real examples. An extension to handle missing censoring indicators is also outlined.
Empirical evaluation of neutral interactions in host-parasite networks.
Canard, E F; Mouquet, N; Mouillot, D; Stanko, M; Miklisova, D; Gravel, D
2014-04-01
While niche-based processes have been invoked extensively to explain the structure of interaction networks, recent studies propose that neutrality could also be of great importance. Under the neutral hypothesis, network structure would simply emerge from random encounters between individuals and thus would be directly linked to species abundance. We investigated the impact of species abundance distributions on qualitative and quantitative metrics of 113 host-parasite networks. We analyzed the concordance between neutral expectations and empirical observations at interaction, species, and network levels. We found that species abundance accurately predicts network metrics at all levels. Despite host-parasite systems being constrained by physiology and immunology, our results suggest that neutrality could also explain, at least partially, their structure. We hypothesize that trait matching would determine potential interactions between species, while abundance would determine their realization.
Nonparametric spirometry reference values for Hispanic Americans.
Glenn, Nancy L; Brown, Vanessa M
2011-02-01
Recent literature cites ethnic origin as a major factor in developing pulmonary function reference values. Extensive studies established reference values for European and African Americans, but not for Hispanic Americans. The Third National Health and Nutrition Examination Survey defines Hispanic as individuals of Spanish-speaking cultures. While no group was excluded from the target population, sample size requirements only allowed inclusion of individuals who identified themselves as Mexican Americans. This research constructs nonparametric reference value confidence intervals for Hispanic American pulmonary function. The method is applicable to all ethnicities. We use empirical likelihood confidence intervals to establish normal ranges for reference values. Their major advantage: they are model-free, but share the asymptotic properties of model-based methods. Statistical comparisons indicate that empirical likelihood interval lengths are comparable to normal theory intervals. Power and efficiency studies agree with previously published theoretical results.
NASA Technical Reports Server (NTRS)
Bergrun, Norman R
1952-01-01
An empirically derived basis for predicting the area, rate, and distribution of water-drop impingement on airfoils of arbitrary section is presented. The concepts involved represent an initial step toward the development of a calculation technique which is generally applicable to the design of thermal ice-prevention equipment for airplane wing and tail surfaces. It is shown that sufficiently accurate estimates, for the purpose of heated-wing design, can be obtained by a few numerical computations once the velocity distribution over the airfoil has been determined. The calculation technique presented is based on results of extensive water-drop trajectory computations for five airfoil cases which consisted of 15-percent-thick airfoils encompassing a moderate lift-coefficient range. The differential equations pertaining to the paths of the drops were solved by a differential analyzer.
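The trajectory equations are not given in the record; a minimal sketch of the kind of drop-trajectory integration involved, assuming a simple Stokes-type drag that relaxes the drop toward a placeholder airflow field (a real heated-wing calculation would use the velocity distribution computed over the airfoil):

```python
import numpy as np
from scipy.integrate import solve_ivp

def air_velocity(x, y):
    """Placeholder airflow field; illustrative only."""
    return np.array([1.0, 0.1 * np.sin(x)])

TAU = 0.05  # assumed droplet response time (s); depends on drop size

def rhs(t, state):
    """State = (x, y, vx, vy). Stokes-type drag relaxes the drop's
    velocity toward the local air velocity with time constant TAU."""
    x, y, vx, vy = state
    ux, uy = air_velocity(x, y)
    return [vx, vy, (ux - vx) / TAU, (uy - vy) / TAU]

sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0, 1.0, 0.0], max_step=0.01)
x_path, y_path = sol.y[0], sol.y[1]  # the computed drop trajectory
```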
Mertz, Marcel; Schildmann, Jan
2018-06-01
Empirical bioethics is commonly understood as integrating empirical research with normative-ethical research in order to address an ethical issue. Methodological analyses in empirical bioethics mainly focus on the integration of socio-empirical sciences (e.g. sociology or psychology) and normative ethics. But while there are numerous multidisciplinary research projects combining life sciences and normative ethics, there is little explicit methodological reflection on how to integrate the two fields, or on the goals and rationales of such interdisciplinary cooperation. In this paper we review some drivers of the tendency of empirical bioethics methodologies to focus on the collaboration of normative ethics with the social sciences in particular. Subsequently, we argue that the ends of empirical bioethics, not the empirical methods, are decisive for the question of which empirical disciplines can contribute to empirical bioethics in a meaningful way. Using already existing types of research integration as a springboard, five possible types of research which encompass life sciences and normative analysis illustrate how such cooperation can be conceptualized from a methodological perspective within empirical bioethics. We conclude with a reflection on the limitations and challenges of empirical bioethics research that integrates life sciences.
Rolls, David A.; Wang, Peng; McBryde, Emma; Pattison, Philippa; Robins, Garry
2015-01-01
We compare two broad types of empirically grounded random network models in terms of their abilities to capture both network features and simulated Susceptible-Infected-Recovered (SIR) epidemic dynamics. The types of network models are exponential random graph models (ERGMs) and extensions of the configuration model. We use three kinds of empirical contact networks, chosen to provide both variety and realistic patterns of human contact: a highly clustered network, a bipartite network and a snowball sampled network of a “hidden population”. In the case of the snowball sampled network we present a novel method for fitting an edge-triangle model. In our results, ERGMs consistently capture clustering as well or better than configuration-type models, but the latter models better capture the node degree distribution. Despite the additional computational requirements to fit ERGMs to empirical networks, the use of ERGMs provides only a slight improvement in the ability of the models to recreate epidemic features of the empirical network in simulated SIR epidemics. Generally, SIR epidemic results from using configuration-type models fall between those from a random network model (i.e., an Erdős-Rényi model) and an ERGM. The addition of subgraphs of size four to edge-triangle type models does improve agreement with the empirical network for smaller densities in clustered networks. Additional subgraphs do not make a noticeable difference in our example, although we would expect the ability to model cliques to be helpful for contact networks exhibiting household structure. PMID:26555701
Multicultural Counseling Competencies Research: A 20-Year Content Analysis
ERIC Educational Resources Information Center
Worthington, Roger L.; Soth-McNett, Angela M.; Moreno, Matthew V.
2007-01-01
The authors conducted a 20-year content analysis of the entire field of empirical research on the multicultural counseling competencies (D. W. Sue et al., 1982). They conducted an exhaustive search for empirical research articles using PsycINFO, as well as complete reviews of the past 20 years of several journals (e.g., Journal of Counseling…
ERIC Educational Resources Information Center
Zhang, Jingjing; Skryabin, Maxim; Song, Xiongwei
2016-01-01
This study attempts to make inferences about the mechanisms that drive network change over time. It adopts simulation investigation for empirical network analysis to examine the patterns and evolution of relationships formed in the context of a massive open online course (MOOC) discussion forum. Four network effects--"homophily,"…
Functions of Research in Radical Behaviorism for the Further Development of Behavior Analysis
ERIC Educational Resources Information Center
Leigland, Sam
2010-01-01
The experimental analysis of behavior began as an inductively oriented, empirically based scientific field. As the field grew, its distinctive system of science--radical behaviorism--grew with it. The continuing growth of the empirical base of the field has been accompanied by the growth of the literature on radical behaviorism and its…
Interest Rates and Coupon Bonds in Quantum Finance
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.
2009-09-01
1. Synopsis; 2. Interest rates and coupon bonds; 3. Options and option theory; 4. Interest rate and coupon bond options; 5. Quantum field theory of bond forward interest rates; 6. Libor Market Model of interest rates; 7. Empirical analysis of forward interest rates; 8. Libor Market Model of interest rate options; 9. Numeraires for bond forward interest rates; 10. Empirical analysis of interest rate caps; 11. Coupon bond European and Asian options; 12. Empirical analysis of interest rate swaptions; 13. Correlation of coupon bond options; 14. Hedging interest rate options; 15. Interest rate Hamiltonian and option theory; 16. American options for coupon bonds and interest rates; 17. Hamiltonian derivation of coupon bond options; Appendixes; Glossaries; List of symbols; Reference; Index.
Jet Aeroacoustics: Noise Generation Mechanism and Prediction
NASA Technical Reports Server (NTRS)
Tam, Christopher
1998-01-01
This report covers the third-year research effort of the project. The research work focussed on the fine-scale mixing noise of both subsonic and supersonic jets and the effects of nozzle geometry and tabs on subsonic jet noise. In publication 1, a new semi-empirical theory of jet mixing noise from fine-scale turbulence is developed. By an analogy to gas kinetic theory, it is shown that the source of noise is related to the time fluctuations of the turbulence kinetic energy. Starting with the Reynolds-averaged Navier-Stokes equations, a formula for the radiated noise is derived. An empirical model of the space-time correlation function of the turbulence kinetic energy is adopted. The form of the model is in good agreement with the space-time two-point velocity correlation function measured by Davies and coworkers. The parameters of the correlation are related to the parameters of the k-epsilon turbulence model; thus the theory is self-contained. Extensive comparisons between the noise spectrum computed from the theory and experimental measurements have been carried out. The parameters include jet Mach number from 0.3 to 2.0 and temperature ratio from 1.0 to 4.8. Excellent agreement is found in spectrum shape, noise intensity and directivity. It is envisaged that the theory would supersede all semi-empirical and totally empirical jet noise prediction methods in current use.
Shen, Kunling; Xiong, Tengbin; Tan, Seng Chuen; Wu, Jiuhong
2016-01-01
Influenza is a common viral respiratory infection that causes epidemics and pandemics in the human population. Oseltamivir is a neuraminidase inhibitor, a new class of antiviral therapy for influenza. Although its efficacy and safety have been established, there is uncertainty regarding whether influenza-like illness (ILI) in children is best managed by oseltamivir at the onset of illness, and its cost-effectiveness in children has not been studied in China. The aim was to evaluate the cost-effectiveness of post-rapid influenza diagnostic test (RIDT) treatment with oseltamivir and empiric treatment with oseltamivir, compared with no antiviral therapy, for children with ILI. We developed a decision-analytic model based on previously published evidence to simulate and evaluate 1-year potential clinical and economic outcomes associated with three management strategies for children presenting with symptoms of influenza. Model inputs were derived from the literature and expert opinion on clinical practice and research in China. Outcome measures included costs and quality-adjusted life years (QALYs). All the interventions were compared with incremental cost-effectiveness ratios (ICERs). In the base-case analysis, empiric treatment with oseltamivir consistently produced the greatest gains in QALYs. When compared with no antiviral therapy, the empiric oseltamivir strategy is very cost-effective, with an ICER of RMB 4,438. When compared with post-RIDT treatment with oseltamivir, the empiric oseltamivir strategy is dominant. Probabilistic sensitivity analysis projected a 100% probability that empiric oseltamivir treatment would be considered a very cost-effective strategy compared to no antiviral therapy, according to the WHO recommendations for cost-effectiveness thresholds. The same was concluded with 99% probability for empiric oseltamivir treatment compared to post-RIDT treatment with oseltamivir. In the current setting of the Chinese health system, our model-based simulation analysis suggests that empiric treatment with oseltamivir is a cost-saving and very cost-effective strategy for managing children with ILI.
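A worked sketch of the ICER arithmetic underlying such comparisons; the numbers are illustrative, not the study's inputs:

```python
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    relative to a reference strategy. A strategy 'dominates' when it is
    both cheaper and more effective."""
    d_cost = cost_new - cost_ref
    d_qaly = qaly_new - qaly_ref
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant"
    return d_cost / d_qaly

# Illustrative: 200 RMB extra cost buys 0.045 extra QALYs.
print(icer(cost_new=1200.0, qaly_new=0.920,
           cost_ref=1000.0, qaly_ref=0.875))  # ~4444 RMB per QALY
```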
López-Cortés, L E; Almirante, B; Cuenca-Estrella, M; Garnacho-Montero, J; Padilla, B; Puig-Asensio, M; Ruiz-Camps, I; Rodríguez-Baño, J
2016-08-01
We compared the clinical efficacy of fluconazole and echinocandins in the treatment of candidemia in real practice. The CANDIPOP study is a prospective, population-based cohort study on candidemia carried out between May 2010 and April 2011 in 29 Spanish hospitals. Using strict inclusion criteria, we separately compared the impact of empirical and targeted therapy with fluconazole or echinocandins on 30-day mortality. Cox regression, including a propensity score (PS) for receiving echinocandins, stratified analysis on the PS quartiles and PS-based matched analyses, were performed. The empirical and targeted therapy cohorts comprised 316 and 421 cases, respectively; 30-day mortality was 18.7% with fluconazole and 33.9% with echinocandins (p 0.02) in the empirical therapy group and 19.8% with fluconazole and 27.7% with echinocandins (p 0.06) in the targeted therapy group. Multivariate Cox regression analysis including PS showed that empirical therapy with fluconazole was associated with better prognosis (adjusted hazard ratio 0.38; 95% confidence interval 0.17-0.81; p 0.01); no differences were found within each PS quartile or in cases matched according to PS. Targeted therapy with fluconazole did not show a significant association with mortality in the Cox regression analysis (adjusted hazard ratio 0.77; 95% confidence interval 0.41-1.46; p 0.63), in the PS quartiles or in PS-matched cases. The results were similar among patients with severe sepsis and septic shock. Empirical or targeted treatment with fluconazole was not associated with increased 30-day mortality compared to echinocandins among adults with candidemia. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.
Xie, Yanmei; Zhang, Biao
2017-04-20
Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and Nutrition Examination Survey (NHANES).
Evaluating the utility of two gestural discomfort evaluation methods
Son, Minseok; Jung, Jaemoon; Park, Woojin
2017-01-01
Evaluating physical discomfort of designed gestures is important for creating safe and usable gesture-based interaction systems; yet, gestural discomfort evaluation has not been extensively studied in HCI, and few evaluation methods seem currently available whose utility has been experimentally confirmed. To address this, this study empirically demonstrated the utility of the subjective rating method after a small number of gesture repetitions (a maximum of four repetitions) in evaluating designed gestures in terms of physical discomfort resulting from prolonged, repetitive gesture use. The subjective rating method has been widely used in previous gesture studies but without empirical evidence on its utility. This study also proposed a gesture discomfort evaluation method based on an existing ergonomics posture evaluation tool (Rapid Upper Limb Assessment) and demonstrated its utility in evaluating designed gestures in terms of physical discomfort resulting from prolonged, repetitive gesture use. Rapid Upper Limb Assessment is an ergonomics postural analysis tool that quantifies the work-related musculoskeletal disorders risks for manual tasks, and has been hypothesized to be capable of correctly determining discomfort resulting from prolonged, repetitive gesture use. The two methods were evaluated through comparisons against a baseline method involving discomfort rating after actual prolonged, repetitive gesture use. Correlation analyses indicated that both methods were in good agreement with the baseline. The methods proposed in this study seem useful for predicting discomfort resulting from prolonged, repetitive gesture use, and are expected to help interaction designers create safe and usable gesture-based interaction systems. PMID:28423016
NASA Astrophysics Data System (ADS)
Zhang, Meijun; Tang, Jian; Zhang, Xiaoming; Zhang, Jiaojiao
2016-03-01
The highly accurate classification ability of an intelligent diagnosis method often requires a large number of training samples with high-dimensional eigenvectors, and the characteristics of the signal must be extracted accurately. Although the existing EMD (empirical mode decomposition) and EEMD (ensemble empirical mode decomposition) are suitable for processing non-stationary and non-linear signals, their decomposition accuracy becomes very poor when a short signal, such as a hydraulic impact signal, is concerned. An improved EEMD is proposed specifically for short hydraulic impact signals. The improvements of this new EEMD are mainly reflected in four aspects: self-adaptive de-noising based on EEMD, signal extension based on SVM (support vector machine), extreme center fitting based on cubic spline interpolation, and pseudo-component exclusion based on cross-correlation analysis. After the energy eigenvector is extracted from the result of the improved EEMD, fault pattern recognition based on SVM with a small number of low-dimensional training samples is studied. Finally, the diagnostic ability of the improved EEMD+SVM method is compared with the EEMD+SVM and EMD+SVM methods; its accuracy is distinctly higher than that of the other two methods whether the dimension of the eigenvectors is low or high. The improved EEMD is well suited to the decomposition of short signals such as hydraulic impact signals, and its combination with SVM is highly capable of diagnosing hydraulic impact faults.
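Of the four listed improvements, the pseudo-component exclusion step is the easiest to sketch: keep only the IMFs whose cross-correlation with the original signal is non-negligible. The threshold here is an assumption:

```python
import numpy as np

def drop_pseudo_imfs(signal, imfs, threshold=0.1):
    """Exclude pseudo-components produced by the decomposition: an IMF is
    kept only if its correlation with the original signal exceeds the
    threshold (an assumed cut-off, to be tuned per application)."""
    kept = [imf for imf in imfs
            if abs(np.corrcoef(signal, imf)[0, 1]) >= threshold]
    return np.array(kept)
```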
Equation-free mechanistic ecosystem forecasting using empirical dynamic modeling
Ye, Hao; Beamish, Richard J.; Glaser, Sarah M.; Grant, Sue C. H.; Hsieh, Chih-hao; Richards, Laura J.; Schnute, Jon T.; Sugihara, George
2015-01-01
It is well known that current equilibrium-based models fall short as predictive descriptions of natural ecosystems, and particularly of fisheries systems that exhibit nonlinear dynamics. For example, model parameters assumed to be fixed constants may actually vary in time, models may fit well to existing data but lack out-of-sample predictive skill, and key driving variables may be misidentified due to transient (mirage) correlations that are common in nonlinear systems. With these frailties, it is somewhat surprising that static equilibrium models continue to be widely used. Here, we examine empirical dynamic modeling (EDM) as an alternative to imposed model equations, one that accommodates both nonequilibrium dynamics and nonlinearity. Using time series from nine stocks of sockeye salmon (Oncorhynchus nerka) from the Fraser River system in British Columbia, Canada, we perform, for the first time to our knowledge, a real-data comparison of contemporary fisheries models with equivalent EDM formulations that explicitly use spawning stock and environmental variables to forecast recruitment. We find that EDM models produce more accurate and precise forecasts, and unlike extensions of the classic Ricker spawner-recruit equation, they show significant improvements when environmental factors are included. Our analysis demonstrates the strategic utility of EDM for incorporating environmental influences into fisheries forecasts and, more generally, for providing insight into how environmental factors can operate in forecast models, thus paving the way for equation-free mechanistic forecasting to be applied in management contexts. PMID:25733874
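EDM forecasts are typically built on simplex projection over a time-delay embedding; a self-contained sketch under that assumption (the weighting follows the common convention and is not necessarily the authors' exact configuration):

```python
import numpy as np

def simplex_forecast(series, E=3, tp=1):
    """Leave-one-out simplex projection: forecast each point tp steps
    ahead from its E-dimensional delay embedding, using the E+1 nearest
    neighbours weighted by exponentially decaying distance."""
    series = np.asarray(series, dtype=float)
    X = np.column_stack([series[i:len(series) - E + i + 1 - tp]
                         for i in range(E)])   # delay vectors
    y = series[E - 1 + tp:]                    # tp-step-ahead targets
    preds = np.empty(len(y))
    for i in range(len(y)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                          # leave self out
        nn = np.argsort(d)[:E + 1]
        w = np.exp(-d[nn] / max(d[nn].min(), 1e-12))
        preds[i] = np.sum(w * y[nn]) / np.sum(w)
    return preds, y
```

Out-of-sample skill is then judged by comparing preds with y, which is how forecast skill and nonlinearity are commonly assessed in EDM.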
Philosophy and the front line of science.
Pernu, Tuomas K
2008-03-01
According to one traditional view, empirical science is necessarily preceded by philosophical analysis. Yet the relevance of philosophy is often doubted by those engaged in empirical sciences. I argue that these doubts can be substantiated by two theoretical problems that the traditional conception of philosophy is bound to face. First, there is a strong normative etiology to philosophical problems, theories, and notions that is difficult to reconcile with descriptive empirical study. Second, conceptual analysis (a role that is typically assigned to philosophy) seems to lose its object of study if it is granted that terms do not have purely conceptual meanings detached from their actual use in empirical sciences. These problems are particularly acute for current naturalistic philosophy of science. I suggest a more concrete integration of philosophy and the sciences as a possible way of making philosophy of science have more impact.
Predicting the mineral composition of dust aerosols - Part 1: Representing key processes
NASA Astrophysics Data System (ADS)
Perlwitz, J. P.; Pérez García-Pando, C.; Miller, R. L.
2015-02-01
Soil dust aerosols created by wind erosion are typically assigned globally uniform physical and chemical properties within Earth system models, despite known regional variations in the mineral content of the parent soil. Mineral composition of the aerosol particles is important to their interaction with climate, including shortwave absorption and radiative forcing, nucleation of cloud droplets and ice crystals, coating by heterogeneous uptake of sulfates and nitrates, and atmospheric processing of iron into bioavailable forms that increase the productivity of marine phytoplankton. Here, aerosol mineral composition is derived by extending a method that provides the composition of a wet-sieved soil. The extension accounts for measurements showing significant differences between the mineral fractions of the wet-sieved soil and the resulting aerosol concentration. For example, some phyllosilicate aerosols are more prevalent at silt sizes, even though they are nearly absent in a soil whose aggregates are dispersed by wet sieving during analysis. We reconstruct the undispersed size distribution of the original soil that is subject to wind erosion. An empirical constraint upon the relative emission of clay and silt is applied that further differentiates the soil and aerosol mineral composition. In addition, a method is proposed for mixing minerals with small impurities composed of iron oxides. These mixtures are important for transporting iron far from the dust source, because pure iron oxides are more dense and vulnerable to gravitational removal than most minerals comprising dust aerosols. A limited comparison to measurements from North Africa shows that the extension brings the model into better agreement, consistent with a more extensive comparison to global observations as well as measurements of elemental composition downwind of the Sahara, as described in companion articles.
You, Joyce H S; Chan, Eva S K; Leung, Maggie Y K; Ip, Margaret; Lee, Nelson L S
2012-01-01
Seasonal and 2009 H1N1 influenza viruses may cause severe diseases and result in excess hospitalization and mortality in the older and younger adults, respectively. Early antiviral treatment may improve clinical outcomes. We examined potential outcomes and costs of test-guided versus empirical treatment in patients hospitalized for suspected influenza in Hong Kong. We designed a decision tree to simulate potential outcomes of four management strategies in adults hospitalized for severe respiratory infection suspected of influenza: "immunofluorescence-assay" (IFA) or "polymerase-chain-reaction" (PCR)-guided oseltamivir treatment, "empirical treatment plus PCR" and "empirical treatment alone". Model inputs were derived from literature. The average prevalence (11%) of influenza in 2010-2011 (58% being 2009 H1N1) among cases of respiratory infections was used in the base-case analysis. Primary outcome simulated was cost per quality-adjusted life-year (QALY) expected (ICER) from the Hong Kong healthcare providers' perspective. In base-case analysis, "empirical treatment alone" was shown to be the most cost-effective strategy and dominated the other three options. Sensitivity analyses showed that "PCR-guided treatment" would dominate "empirical treatment alone" when the daily cost of oseltamivir exceeded USD18, or when influenza prevalence was <2.5% and the predominant circulating viruses were not 2009 H1N1. Using USD50,000 as the threshold of willingness-to-pay, "empirical treatment alone" and "PCR-guided treatment" were cost-effective 97% and 3% of time, respectively, in 10,000 Monte-Carlo simulations. During influenza epidemics, empirical antiviral treatment appears to be a cost-effective strategy in managing patients hospitalized with severe respiratory infection suspected of influenza, from the perspective of healthcare providers in Hong Kong.
A comparison of four streamflow record extension techniques
Hirsch, Robert M.
1982-01-01
One approach to developing time series of streamflow, which may be used for simulation and optimization studies of water resources development activities, is to extend an existing gage record in time by exploiting the interstation correlation between the station of interest and some nearby (long-term) base station. Four methods of extension are described, and their properties are explored. The methods are regression (REG), regression plus noise (RPN), and two new methods, maintenance of variance extension types 1 and 2 (MOVE.l, MOVE.2). MOVE.l is equivalent to a method which is widely used in psychology, biometrics, and geomorphology and which has been called by various names, e.g., ‘line of organic correlation,’ ‘reduced major axis,’ ‘unique solution,’ and ‘equivalence line.’ The methods are examined for bias and standard error of estimate of moments and order statistics, and an empirical examination is made of the preservation of historic low-flow characteristics using 50-year-long monthly records from seven streams. The REG and RPN methods are shown to have serious deficiencies as record extension techniques. MOVE.2 is shown to be marginally better than MOVE.l, according to the various comparisons of bias and accuracy.
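A minimal sketch of MOVE.1 as described: unlike ordinary regression, the slope is the ratio of standard deviations (signed by the correlation), so the extended record maintains the variance of the short record rather than shrinking it toward the mean:

```python
import numpy as np

def move1_extend(x_concurrent, y_concurrent, x_long):
    """MOVE.1 (line of organic correlation): estimate the short-record
    station y from the long-record base station x so that the mean and
    variance of the concurrent period are preserved."""
    xbar, ybar = x_concurrent.mean(), y_concurrent.mean()
    slope = np.std(y_concurrent, ddof=1) / np.std(x_concurrent, ddof=1)
    sign = np.sign(np.corrcoef(x_concurrent, y_concurrent)[0, 1])
    return ybar + sign * slope * (x_long - xbar)
```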
An assessment of laser velocimetry in hypersonic flow
NASA Technical Reports Server (NTRS)
1992-01-01
Although extensive progress has been made in computational fluid mechanics, reliable flight vehicle designs and modifications still cannot be made without recourse to extensive wind tunnel testing. Future progress in the computation of hypersonic flow fields is restricted by the need for a reliable mean flow and turbulence modeling data base which could be used to aid in the development of improved empirical models for use in numerical codes. Currently, there are few compressible flow measurements which could be used for this purpose. In this report, the results of experiments designed to assess the potential for laser velocimeter measurements of mean flow and turbulent fluctuations in hypersonic flow fields are presented. Details of a new laser velocimeter system which was designed and built for this test program are described.
Modelling erosion on a daily basis, an adaptation of the MMF approach
NASA Astrophysics Data System (ADS)
Shrestha, Dhruba Pikha; Jetten, Victor G.
2018-02-01
The negative impact of soil erosion on ecosystem services and food security is well known. At the same time, the total precipitation an area receives can vary substantially from year to year, including extreme rainfall events. Various empirical models have been used extensively across all climatic regions to assess annual erosion rates. While these models are simple to operate and do not require much input data, they do not account for the effect of extreme rainfall. Physically based models are available to simulate erosion processes, including particle detachment, transport, and deposition of sediments during a storm, but they are not suitable for assessing annual soil loss rates; moreover, storm-event data may not be available everywhere, limiting their widespread use.
Tsallis’ non-extensive free energy as a subjective value of an uncertain reward
NASA Astrophysics Data System (ADS)
Takahashi, Taiki
2009-03-01
Recent studies in neuroeconomics and econophysics revealed the importance of reward expectation in decision under uncertainty. Behavioral neuroeconomic studies have proposed that the unpredictability and the probability of an uncertain reward are distinctly encoded, as entropy and a distorted probability weight respectively, in separate neural systems. However, previous behavioral economic and decision-theoretic models could not quantify reward-seeking and uncertainty aversion in a theoretically consistent manner. In this paper, we have: (i) proposed that a generalized Helmholtz free energy in Tsallis' non-extensive thermostatistics can be utilized to quantify the perceived value of an uncertain reward, and (ii) empirically examined the explanatory powers of the models. Future directions for neuroeconomic and econophysical research utilizing the Tsallis free energy model are discussed.
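For orientation, a plausible sketch of the ingredients in our own notation (the paper's exact value function may differ): the Tsallis entropy of the reward distribution and the corresponding generalized Helmholtz free energy,

```latex
S_q \;=\; \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
F_q \;=\; \langle E \rangle \;-\; T\, S_q .
```

As q -> 1, S_q recovers the ordinary Shannon entropy; the entropic term is what lets the free energy trade off expected reward against the unpredictability of the gamble.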
On Modeling Eavesdropping Attacks in Underwater Acoustic Sensor Networks †
Wang, Qiu; Dai, Hong-Ning; Li, Xuran; Wang, Hao; Xiao, Hong
2016-01-01
The security and privacy of underwater acoustic sensor networks have received extensive attention recently due to the proliferation of underwater activities. This paper proposes an analytical model to investigate eavesdropping attacks in underwater acoustic sensor networks. Our analytical framework considers the impacts of various underwater acoustic channel conditions (such as the acoustic signal frequency, spreading factor and wind speed) and of different hydrophones (isotropic hydrophones and array hydrophones) at both network nodes and eavesdroppers. We also conduct extensive simulations to evaluate the effectiveness and the accuracy of our proposed model. Empirical results show that our proposed model is quite accurate. In addition, our results also imply that the eavesdropping probability heavily depends on both the underwater acoustic channel conditions and the features of the hydrophones. PMID:27213379
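Such channel models typically build on textbook underwater acoustics. As background (not the paper's exact eavesdropping model), here is a sketch of Thorp's empirical absorption formula combined with geometric spreading loss:

```python
import math

def thorp_absorption_db_per_km(f_khz):
    """Thorp's empirical absorption coefficient (dB/km), f in kHz."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2) + 44 * f2 / (4100 + f2)
            + 2.75e-4 * f2 + 0.003)

def path_loss_db(dist_m, f_khz, spreading_k=1.5):
    """Geometric spreading (k=1.5: practical spreading) plus absorption."""
    spreading = 10 * spreading_k * math.log10(max(dist_m, 1.0))
    absorption = (dist_m / 1000.0) * thorp_absorption_db_per_km(f_khz)
    return spreading + absorption

print(path_loss_db(dist_m=2000, f_khz=20))  # roughly 58 dB at 2 km, 20 kHz
```

A received signal falling below the eavesdropper's detection threshold after this loss is one simple way such analyses bound the eavesdropping probability.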
Reflections on the Human Terrain System During the First 4 Years
2011-09-01
...contracted social science research and analysis capability in both Iraq and Afghanistan to conduct empirical qualitative and quantitative research to... problematic. All research products in the public domain (including ethnographies produced by academic anthropologists) are accessible by intelligence...
Accuracy of Revised and Traditional Parallel Analyses for Assessing Dimensionality with Binary Data
ERIC Educational Resources Information Center
Green, Samuel B.; Redell, Nickalus; Thompson, Marilyn S.; Levy, Roy
2016-01-01
Parallel analysis (PA) is a useful empirical tool for assessing the number of factors in exploratory factor analysis. On conceptual and empirical grounds, we argue for a revision to PA that makes it more consistent with hypothesis testing. Using Monte Carlo methods, we evaluated the relative accuracy of the revised PA (R-PA) and traditional PA…
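As background, here is a minimal sketch of the traditional PA decision rule that the article revises: retain as many factors as there are observed eigenvalues exceeding those of comparable random data (the revised R-PA modifies this comparison step).

```python
import numpy as np

def parallel_analysis(data, n_sims=200, seed=0):
    """Traditional PA: count eigenvalues above the random-data mean."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand_eigs = np.empty((n_sims, p))
    for s in range(n_sims):
        sim = rng.standard_normal((n, p))
        rand_eigs[s] = np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False))[::-1]
    threshold = rand_eigs.mean(axis=0)    # some variants use the 95th percentile
    return int(np.sum(obs_eigs > threshold))

# n_factors = parallel_analysis(my_data)  # my_data: observations x items
```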
Heudebert, Gustavo R; Centor, Robert M; Klapow, Joshua C; Marks, Robert; Johnson, Lawrence; Wilcox, C Mel
2000-01-01
OBJECTIVE To determine the best treatment strategy for the management of patients presenting with symptoms consistent with uncomplicated heartburn. METHODS We performed a cost-utility analysis of 4 alternatives: empirical proton pump inhibitor, empirical histamine2-receptor antagonist, and diagnostic strategies consisting of either esophagogastroduodenoscopy (EGD) or an upper gastrointestinal series before treatment. The time horizon of the model was 1 year. The base case analysis assumed a cohort of otherwise healthy 45-year-old individuals in a primary care practice. MAIN RESULTS Empirical treatment with a proton pump inhibitor was projected to provide the greatest quality-adjusted survival for the cohort. Empirical treatment with a histamine2-receptor antagonist was projected to be the least costly of the alternatives. The marginal cost-effectiveness of using a proton pump inhibitor over a histamine2-receptor antagonist was approximately $10,400 per quality-adjusted life year (QALY) gained in the base case analysis and was less than $50,000 per QALY as long as the utility for heartburn was less than 0.95. Both diagnostic strategies were dominated by the proton pump inhibitor alternative. CONCLUSIONS Empirical treatment seems to be the optimal initial management strategy for patients with heartburn, but the choice between a proton pump inhibitor and a histamine2-receptor antagonist depends on the impact of heartburn on quality of life. PMID:10718898
1986-01-01
...the information that has been determined experimentally. The Labyrinth Seal Analysis program was, therefore, directed to the development of an... labyrinth seal performance; the program included the development of an improved empirical design model to provide the calculation of the flow... program. Phase I was directed to the analytical development of both an analysis model and an improved empirical design model. Supporting rig tests...
Mesoscale Particle-Based Model of Electrophoresis
Giera, Brian; Zepeda-Ruiz, Luis A.; Pascall, Andrew J.; ...
2015-07-31
Here, we develop and evaluate a semi-empirical particle-based model of electrophoresis using extensive mesoscale simulations. We parameterize the model using only measurable quantities from a broad set of colloidal suspensions with properties that span the experimentally relevant regime. With sufficient sampling, simulated diffusivities and electrophoretic velocities match predictions of the ubiquitous Stokes-Einstein and Henry equations, respectively. This agreement holds for non-polar and aqueous solvents or ionic liquid colloidal suspensions under a wide range of applied electric fields.
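The two benchmark relations the simulated transport coefficients are checked against are standard. A short sketch with illustrative parameter values (roughly water at room temperature and a 50 nm particle):

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_diffusivity(T, viscosity, radius):
    """Stokes-Einstein: D = kT / (6 pi eta a)."""
    return KB * T / (6 * math.pi * viscosity * radius)

def henry_velocity(eps, zeta, E, viscosity, f_henry=1.0):
    """Henry: v = (2/3) eps zeta E f(kappa*a) / eta, with f from 1
    (Hueckel limit) to 1.5 (Smoluchowski limit)."""
    return (2.0 / 3.0) * eps * zeta * E * f_henry / viscosity

D = stokes_einstein_diffusivity(T=298.0, viscosity=1e-3, radius=50e-9)
v = henry_velocity(eps=80 * 8.854e-12, zeta=0.025, E=1e4, viscosity=1e-3)
print(D, v)
```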
Bralten, Janita; Franke, Barbara; Waldman, Irwin; Rommelse, Nanda; Hartman, Catharina; Asherson, Philip; Banaschewski, Tobias; Ebstein, Richard P; Gill, Michael; Miranda, Ana; Oades, Robert D; Roeyers, Herbert; Rothenberger, Aribert; Sergeant, Joseph A; Oosterlaan, Jaap; Sonuga-Barke, Edmund; Steinhausen, Hans-Christoph; Faraone, Stephen V; Buitelaar, Jan K; Arias-Vásquez, Alejandro
2013-11-01
Because multiple genes with small effect sizes are assumed to play a role in attention-deficit/hyperactivity disorder (ADHD) etiology, considering multiple variants within the same analysis likely increases the total explained phenotypic variance, thereby boosting the power of genetic studies. This study investigated whether pathway-based analysis could bring scientists closer to unraveling the biology of ADHD. The pathway was described as a predefined gene selection based on a well-established database or literature data. Common genetic variants in pathways involved in dopamine/norepinephrine and serotonin neurotransmission and genes involved in neuritic outgrowth were investigated in cases from the International Multicentre ADHD Genetics (IMAGE) study. Multivariable analysis was performed to combine the effects of single genetic variants within the pathway genes. Phenotypes were DSM-IV symptom counts for inattention and hyperactivity/impulsivity (n = 871) and symptom severity measured with the Conners Parent (n = 930) and Teacher (n = 916) Rating Scales. Summing genetic effects of common genetic variants within the pathways showed a significant association with hyperactive/impulsive symptoms (p_empirical = .007) but not with inattentive symptoms (p_empirical = .73). Analysis of parent-rated Conners hyperactive/impulsive symptom scores validated this result (p_empirical = .0018). Teacher-rated Conners scores were not associated. Post hoc analyses showed a significant contribution of all pathways to the hyperactive/impulsive symptom domain (dopamine/norepinephrine, p_empirical = .0004; serotonin, p_empirical = .0149; neuritic outgrowth, p_empirical = .0452). The present analysis shows an association between common variants in 3 genetic pathways and the hyperactive/impulsive component of ADHD. This study demonstrates that pathway-based association analyses, using quantitative measurements of ADHD symptom domains, can increase the power of genetic analyses to identify biological risk factors involved in this disorder. Copyright © 2013 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.
Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.
Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao
2015-08-01
Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.
NASA Astrophysics Data System (ADS)
Grubov, V. V.; Runnova, A. E.; Hramov, A. E.
2018-05-01
A new method is proposed for the adaptive filtering of experimental human EEG signals and for the removal of different physiological artifacts. The algorithm includes empirical mode decomposition of the EEG, determination of the number of empirical modes to consider, analysis of the empirical modes to identify those containing artifacts, removal of these modes, and reconstruction of the EEG signal. The method was tested on experimental human EEG signals and demonstrated high efficiency in removing different types of physiological EEG artifacts.
Exponential model for option prices: Application to the Brazilian market
NASA Astrophysics Data System (ADS)
Ramos, Antônio M. T.; Carvalho, J. A.; Vasconcelos, G. L.
2016-03-01
In this paper we report an empirical analysis of the Ibovespa index of the São Paulo Stock Exchange and its respective option contracts. We compare the empirical data on the Ibovespa options with two option pricing models, namely the standard Black-Scholes model and an empirical model that assumes that the returns are exponentially distributed. It is found that at times near the option expiration date the exponential model performs better than the Black-Scholes model, in the sense that it fits the empirical data better than does the latter model.
Empirical analysis and modeling of manual turnpike tollbooths in China
NASA Astrophysics Data System (ADS)
Zhang, Hao
2017-03-01
To address the low level of service satisfaction at the tollbooths of many turnpikes in China, we conduct an empirical study and use a queueing model to investigate performance measures. In this paper, we collect archived data from six tollbooths of a turnpike in China and conduct an empirical analysis of the vehicles' time-dependent arrival process and the collectors' time-dependent service times. It shows that the vehicle arrival process follows a non-homogeneous Poisson process, while the collector service time follows a log-normal distribution. Further, we model the toll-collection process at tollbooths as an MAP/PH/1/FCFS queue for mathematical tractability and present some numerical examples.
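The two empirical findings can be combined into a small simulation: non-homogeneous Poisson arrivals (generated by thinning) feeding a single FCFS booth with log-normal service times. The rate function and all parameter values below are illustrative stand-ins, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)

def rate(t_min):
    """Illustrative time-varying arrival rate (vehicles/min), 60-min window."""
    return 2.0 + 1.5 * np.sin(np.pi * t_min / 60.0)

def nhpp_arrivals(horizon=60.0, lam_max=3.5):
    """Non-homogeneous Poisson arrivals by thinning (lam_max >= max rate)."""
    t, out = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > horizon:
            return np.array(out)
        if rng.uniform() < rate(t) / lam_max:   # accept with prob rate/lam_max
            out.append(t)

arrivals = nhpp_arrivals()
service = rng.lognormal(mean=-1.5, sigma=0.4, size=arrivals.size)  # minutes
start = np.empty_like(arrivals)
free_at = 0.0
for i, a in enumerate(arrivals):       # single FCFS server
    start[i] = max(a, free_at)
    free_at = start[i] + service[i]
print("mean wait (min):", (start - arrivals).mean())
```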
An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China
NASA Astrophysics Data System (ADS)
Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng
This study empirically analyzes the underlying factors contributing to the default rate of informal lending. The paper uses snowball-sampling interviews to collect data and a logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how informal lending differs from commercial loans. Factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial-bank or FICO credit-scoring indices. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate as to what this may imply for the roles of the formal and informal financial sectors.
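A minimal sketch of the modeling step follows, with hypothetical covariates standing in for the study's actual borrower attributes; the simulated data exist only to make the example runnable.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 500
X = np.column_stack([
    rng.normal(10, 3, n),      # e.g. loan size (hypothetical covariate)
    rng.integers(0, 2, n),     # e.g. prior relationship with lender
    rng.normal(0.2, 0.1, n),   # e.g. interest rate
])
# Simulated default indicator with an assumed true coefficient vector.
logits = -2.0 + 0.05 * X[:, 0] - 0.8 * X[:, 1] + 4.0 * X[:, 2]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)
print(model.coef_, model.intercept_)   # estimated effects on default odds
```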
Aspara, Jaakko; Klein, Jan F; Luo, Xueming; Tikkanen, Henrikki
2018-05-01
We conduct a systematic exploratory investigation of the effects of firms' existing service productivity on the success of their new service innovations. Although previous research extensively addresses service productivity and service innovation, this is the first empirical study that bridges the gap between these two research streams and examines the links between the two concepts. Based on a comprehensive data set of new service introductions in a financial services market over a 14-year period, we empirically explore the relationship between a firm's existing service productivity and the firm's success in introducing new services to the market. The results unveil a fundamental service productivity-service innovation dilemma: Being productive in existing services increases a firm's willingness to innovate new services proactively but decreases the firm's capabilities of bringing these services to the market successfully. We provide specific insights into the mechanism underlying the complex relationship between a firm's productivity in existing services, its innovation proactivity, and its service innovation success. For managers, we not only unpack and elucidate this dilemma but also demonstrate that a focused customer scope and growth market conditions may enable firms to mitigate the dilemma and successfully pursue service productivity and service innovation simultaneously.
Data analysis using a combination of independent component analysis and empirical mode decomposition
NASA Astrophysics Data System (ADS)
Lin, Shih-Lin; Tung, Pi-Cheng; Huang, Norden E.
2009-06-01
A combination of independent component analysis and empirical mode decomposition (ICA-EMD) is proposed in this paper to analyze data with a low signal-to-noise ratio. The advantages of the ICA-EMD combination are that ICA needs few sensory cues to separate the original source from unwanted noise, and EMD can effectively separate the data into its constituent parts. The case studies reported here involve original sources contaminated by white Gaussian noise. The simulation results show that the ICA-EMD combination is an effective data analysis tool.
Twenty years and going strong: A dynamic systems revolution in motor and cognitive development
Spencer, John P.; Perone, Sammy; Buss, Aaron T.
2011-01-01
This article reviews the major contributions of dynamic systems theory in advancing thinking about development, the empirical insights the theory has generated, and the key challenges for the theory on the horizon. The first section discusses the emergence of dynamic systems theory in developmental science, the core concepts of the theory, and the resonance it has with other approaches that adopt a systems metatheory. The second section reviews the work of Esther Thelen and colleagues, who revolutionized how researchers think about the field of motor development. It also reviews recent extensions of this work to the domain of cognitive development. Here, the focus is on dynamic field theory, a formal, neurally grounded approach that has yielded novel insights into the embodied nature of cognition. The final section proposes that the key challenge on the horizon is to formally specify how interactions among multiple levels of analysis interact across multiple time scales to create developmental change. PMID:22125575
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 1
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. This is Volume 1, the Executive Summary. Volume 2 contains Appendices A (Aerothermal Comparisons) and B (Flight-Derived h_i/h_u vs. M_inf Plots), and Volume 3 contains Appendix C (Comparison of Interference Factors among OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h_i/h_u Tables).
Roughness influence on human blood drop spreading and splashing
NASA Astrophysics Data System (ADS)
Smith, Fiona; Buntsma, Naomi; Brutin, David
2017-11-01
The impact behaviour of complex fluid droplets has been extensively studied but remains much debated. The Bloodstain Pattern Analysis (BPA) community encounters this scientific problem in daily casework, since bloodstains are used as evidence in crime scene reconstruction. We aim to provide fundamental explanations in the study of blood drip stains by investigating the influence of surface roughness and wettability on the splashing limit of droplets of blood, a non-Newtonian colloidal fluid. Droplets of blood impacting different surfaces perpendicularly at different velocities were recorded. The recordings, as well as the surface characteristics, were analysed in order to find an empirical correlation, since we found that roughness plays a major role in the threshold between splashing and non-splashing behaviour of blood, compared to wettability. Moreover, it appears that roughness alters the deformation of the drip stains. These observations are key to relating the features of drip stains to the impact conditions, which would answer some forensic questions.
Robust w-Estimators for Cryo-EM Class Means
Huang, Chenxi; Tagare, Hemant D.
2016-01-01
A critical step in cryogenic electron microscopy (cryo-EM) image analysis is to calculate the average of all images aligned to a projection direction. This average, called the “class mean”, improves the signal-to-noise ratio in single particle reconstruction (SPR). The averaging step is often compromised because of outlier images of ice, contaminants, and particle fragments. Outlier detection and rejection in the majority of current cryo-EM methods is done using cross-correlation with a manually determined threshold. Empirical assessment shows that the performance of these methods is very sensitive to the threshold. This paper proposes an alternative: a “w-estimator” of the average image, which is robust to outliers and which does not use a threshold. Various properties of the estimator, such as consistency and influence function are investigated. An extension of the estimator to images with different contrast transfer functions (CTFs) is also provided. Experiments with simulated and real cryo-EM images show that the proposed estimator performs quite well in the presence of outliers. PMID:26841397
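In the same spirit (though not necessarily with the authors' specific weight function), a threshold-free robust weighted mean can be sketched as an iteratively reweighted average in which weights decay smoothly with residual size, so outliers are down-weighted rather than rejected outright.

```python
import numpy as np

def robust_class_mean(images, n_iter=20, scale=1.0):
    """Iteratively reweighted mean; Cauchy-type weights are illustrative."""
    mean = images.mean(axis=0)
    for _ in range(n_iter):
        resid = np.linalg.norm((images - mean).reshape(len(images), -1), axis=1)
        w = 1.0 / (1.0 + (resid / scale) ** 2)   # smooth, no hard threshold
        mean = np.tensordot(w, images, axes=1) / w.sum()
    return mean

imgs = np.random.default_rng(3).normal(size=(100, 32, 32))
imgs[:5] += 10.0                                 # a few gross outlier images
avg = robust_class_mean(imgs, scale=np.sqrt(32 * 32))
```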
Factors influencing physicians' knowledge sharing on web medical forums.
Lin, Tung Cheng; Lai, Ming Cheng; Yang, Shu Wen
2016-09-01
Web medical forums are relatively unique as knowledge-sharing platforms because physicians participate exclusively as knowledge contributors and not as knowledge recipients. Using the perspective of social exchange theory and considering both extrinsic and intrinsic motivations, this study aims to elicit the factors that significantly influence the willingness of physicians to share professional knowledge on web medical forums and develops a research model to explore the motivations that underlie physicians' knowledge-sharing attitudes. This model hypothesizes that constructs, including shared vision, reputation, altruism, and self-efficacy, positively influence these attitudes and, by extension, positively impact knowledge-sharing intention. A conventional sampling method and the direct recruitment of physicians at their outpatient clinic gathered valid data from a total of 164 physicians for analysis in the model. The empirical results support the validity of the proposed model and identified shared vision as the most significant factor of influence on knowledge-sharing attitudes, followed in descending order by knowledge-sharing self-efficacy, reputation, and altruism. © The Author(s) 2015.
Garner, Bryan R.; Funk, Rodney R.; Hunter, Brooke D.
2012-01-01
The turnover of substance use disorder (SUD) treatment staff has been assumed to adversely impact treatment effectiveness, yet only limited research has empirically examined this assumption. Representing an extension of prior organizational-level analyses of the impact of staff turnover on client outcomes, this study examined the impact of SUD clinician turnover on adolescent treatment outcomes using a client perspective. Multilevel regression analysis did reveal that relative to those adolescents who did not experience clinician turnover, adolescents who experienced both direct and indirect clinician turnover reported a significantly higher percentage of days using alcohol or drugs at 6-month follow-up. However, clinician turnover was not found to have significant associations (negative or positive) with the other five treatment outcomes examined (e.g., substance-related problems, involvement in illegal activity). Thus, consistent with our prior findings, the current study provides additional evidence that turnover of SUD clinicians is not necessarily associated with adverse treatment outcomes. PMID:23083980
NASA Astrophysics Data System (ADS)
Fosas de Pando, Miguel; Schmid, Peter J.; Sipp, Denis
2016-11-01
Nonlinear model reduction for large-scale flows is an essential component in many fluid applications such as flow control, optimization, parameter-space exploration, and statistical analysis. In this article, we generalize the POD-DEIM method, introduced by Chaturantabut & Sorensen [1], to address nonlocal nonlinearities in the equations without loss of performance or efficiency. The nonlinear terms are represented by nested DEIM approximations using multiple expansion bases based on the proper orthogonal decomposition. These extensions are imperative, for example, for applications of the POD-DEIM method to large-scale compressible flows. The efficient implementation of the presented model-reduction technique follows our earlier work [2] on linearized and adjoint analyses and takes advantage of the modular structure of our compressible flow solver. The efficacy of the nonlinear model-reduction technique is demonstrated on the flow around an airfoil and its acoustic footprint. We obtain an accurate and robust low-dimensional model that captures the main features of the full flow.
A transmission-virulence evolutionary trade-off explains attenuation of HIV-1 in Uganda
Blanquart, François; Grabowski, Mary Kate; Herbeck, Joshua; Nalugoda, Fred; Serwadda, David; Eller, Michael A; Robb, Merlin L; Gray, Ronald; Kigozi, Godfrey; Laeyendecker, Oliver; Lythgoe, Katrina A; Nakigozi, Gertrude; Quinn, Thomas C; Reynolds, Steven J; Wawer, Maria J; Fraser, Christophe
2016-01-01
Evolutionary theory hypothesizes that intermediate virulence maximizes pathogen fitness as a result of a trade-off between virulence and transmission, but empirical evidence remains scarce. We bridge this gap using data from a large and long-standing HIV-1 prospective cohort, in Uganda. We use an epidemiological-evolutionary model parameterised with this data to derive evolutionary predictions based on analysis and detailed individual-based simulations. We robustly predict stabilising selection towards a low level of virulence, and rapid attenuation of the virus. Accordingly, set-point viral load, the most common measure of virulence, has declined in the last 20 years. Our model also predicts that subtype A is slowly outcompeting subtype D, with both subtypes becoming less virulent, as observed in the data. Reduction of set-point viral loads should have resulted in a 20% reduction in incidence, and a three years extension of untreated asymptomatic infection, increasing opportunities for timely treatment of infected individuals. DOI: http://dx.doi.org/10.7554/eLife.20492.001 PMID:27815945
Confined turbulent swirling recirculating flow predictions. Ph.D. Thesis. Final Report
NASA Technical Reports Server (NTRS)
Abujelala, M. T.; Lilley, D. G.
1985-01-01
The capability and accuracy of the STARPIC computer code in predicting confined turbulent swirling recirculating flows are presented. Inlet flow boundary conditions were demonstrated to be extremely important in simulating a flowfield via numerical calculations. The degree of swirl strength and the expansion ratio have strong effects on the characteristics of swirling flow. In a nonswirling flow, a large corner recirculation zone exists in the flowfield with an expansion ratio greater than one. However, as the degree of inlet swirl increases, the size of this zone decreases and a central recirculation zone appears near the inlet. Generally, the size of the central zone increased with swirl strength and expansion ratio. Neither the standard k-epsilon turbulence model nor its previous extensions shows effective capability for predicting confined turbulent swirling recirculating flows. However, either reduced optimum values of three parameters in the model or the empirical C_mu formulation obtained via careful analysis of available turbulence measurements can provide more acceptable accuracy in the prediction of these swirling flows.
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 3
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. Volume 2 contains Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h_i/h_u vs. M_inf Plots). This is Volume 3, containing Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h_i/h_u Tables).
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 2
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. This is Volume 2, containing Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h_i/h_u vs. M_inf Plots). Volume 3 contains Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h_i/h_u Tables).
Seven-panel solar wing deployment and on-orbit maneuvering analyses
NASA Astrophysics Data System (ADS)
Hwang, Earl
2005-05-01
BSS developed a new-generation high-power (~20 kW) solar array to meet customer demands. The array has identical north and south solar wings. Each wing consists of three conventional main solar panels plus a new swing-out design with four side panels. The fully deployed solar array surface area is 966 ft². Defining the optimum design parameters and deployment scheme for the successful deployment and on-orbit maneuvering of such a large solar array was quite a challenging task. Hence, a nonlinear math model of the deployable seven-flex-panel solar wing and a math model of the fully deployed solar array/bus-payload were developed with the Dynamic Analysis and Design System (DADS) program codes, utilizing inherited and empirical data. Through extensive parametric analyses with these math models, the optimum design parameters and the orbit-maneuvering/deployment schemes were determined to meet all design requirements and to ensure successful on-orbit solar wing deployment.
A Laboratory Study of Slope Flows Dynamics
NASA Astrophysics Data System (ADS)
Capriati, Andrea; Cenedese, Antonio; Monti, Paolo
2003-11-01
Slope flows can contribute significantly to the diurnal circulation and air quality of complex-terrain regions (mountains, valleys, etc.). During the daytime, solar heating warms the valley sides, causing up-slope (anabatic) winds. In contrast, radiative cooling of the valley sides produces cold down-slope (drainage, or katabatic) flows, characterized by small vertical extent (usually 10-200 m) and the typical features of dense gravity currents. In this paper, we present preliminary results on slope flows obtained from a series of laboratory experiments in a temperature-controlled water tank. Rakes of thermocouples are used to determine the temperature structure, and particle tracking velocimetry is used for the velocity measurements. A simple slope is represented by a plate whose temperature is forced via a set of Peltier cells. The analysis considers different slope angles, background thermal stratifications, and surface heat fluxes. Comparisons with theoretical and empirical laws from the literature are reported.
Fear of darkness, the full moon and the nocturnal ecology of African lions.
Packer, Craig; Swanson, Alexandra; Ikanda, Dennis; Kushnir, Hadas
2011-01-01
Nocturnal carnivores are widely believed to have played an important role in human evolution, driving the need for night-time shelter, the control of fire and our innate fear of darkness. However, no empirical data are available on the effects of darkness on the risks of predation in humans. We performed an extensive analysis of predatory behavior across the lunar cycle on the largest dataset of lion attacks ever assembled and found that African lions are as sensitive to moonlight when hunting humans as when hunting herbivores and that lions are most dangerous to humans when the moon is faint or below the horizon. At night, people are most active between dusk and 10:00 pm, thus most lion attacks occur in the first weeks following the full moon (when the moon rises at least an hour after sunset). Consequently, the full moon is a reliable indicator of impending danger, perhaps helping to explain why the full moon has been the subject of so many myths and misconceptions.
Ogle, Christin M; Rubin, David C; Siegler, Ilene C
2016-03-01
Using data from a longitudinal study of community-dwelling older adults, we analyzed the most extensive set of known correlates of PTSD symptoms obtained from a single sample to examine the measures' independent and combined utility in accounting for PTSD symptom severity. Fifteen measures identified as PTSD risk factors in published meta-analyses and 12 theoretically and empirically supported individual difference and health-related measures were included. Individual difference measures assessed after the trauma, including insecure attachment and factors related to the current trauma memory, such as self-rated severity, event centrality, frequency of involuntary recall, and physical reactions to the memory, accounted for symptom severity better than measures of pre-trauma factors. In an analysis restricted to prospective measures assessed before the trauma, the total variance explained decreased from 56% to 16%. Results support a model of PTSD in which characteristics of the current trauma memory promote the development and maintenance of PTSD symptoms.
Using Predictability for Lexical Segmentation.
Çöltekin, Çağrı
2017-09-01
This study investigates a strategy based on predictability of consecutive sub-lexical units in learning to segment a continuous speech stream into lexical units using computational modeling and simulations. Lexical segmentation is one of the early challenges during language acquisition, and it has been studied extensively through psycholinguistic experiments as well as computational methods. However, despite strong empirical evidence, the explicit use of predictability of basic sub-lexical units in models of segmentation is underexplored. This paper presents an incremental computational model of lexical segmentation for exploring the usefulness of predictability for lexical segmentation. We show that the predictability cue is a strong cue for segmentation. Contrary to earlier reports in the literature, the strategy yields state-of-the-art segmentation performance with an incremental computational model that uses only this particular cue in a cognitively plausible setting. The paper also reports an in-depth analysis of the model, investigating the conditions affecting the usefulness of the strategy. Copyright © 2016 Cognitive Science Society, Inc.
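One classic instantiation of such a predictability cue (a simplified stand-in, not the paper's exact measure) segments at local minima of the transitional probability between adjacent sub-lexical units:

```python
from collections import Counter

def segment(syllables):
    """Insert word boundaries where P(next | prev) dips to a local minimum."""
    pairs = Counter(zip(syllables, syllables[1:]))
    unigrams = Counter(syllables[:-1])
    tp = [pairs[(a, b)] / unigrams[a]
          for a, b in zip(syllables, syllables[1:])]
    boundaries = [i + 1 for i in range(1, len(tp) - 1)
                  if tp[i] < tp[i - 1] and tp[i] < tp[i + 1]]  # local dips
    words, prev = [], 0
    for b in boundaries + [len(syllables)]:
        words.append("".join(syllables[prev:b]))
        prev = b
    return words

stream = ["ba", "bu", "pa", "do", "ti", "ba", "bu", "pa", "go", "la", "tu"] * 8
print(segment(stream))   # high within-word TPs keep "babupa"-like units intact
```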
Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.
Said, Nadia; Engelhart, Michael; Kirches, Christian; Körkel, Stefan; Holt, Daniel V
2016-01-01
Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.
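The general pattern is to wrap the cognitive model in an objective function and hand it to an optimizer. A minimal sketch, with a quadratic placeholder objective standing in for an actual ACT-R simulation run:

```python
import numpy as np
from scipy.optimize import minimize

def objective(params):
    """Placeholder for: run the instance-based learning model with these
    parameters and return the misfit to behavioral data."""
    noise, decay = params
    return (noise - 0.25) ** 2 + (decay - 0.5) ** 2

# Derivative-free search is natural when the objective is a simulation.
res = minimize(objective, x0=np.array([0.1, 0.1]), method="Nelder-Mead")
print(res.x)  # parameter values minimizing the (simulated) misfit
```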
A reevaluation of spectral ratios for lunar mare TiO2 mapping
NASA Technical Reports Server (NTRS)
Johnson, Jeffrey R.; Larson, Stephen M.; Singer, Robert B.
1991-01-01
The empirical relation established by Charette et al. (1974) between the 400/560-nm spectral ratio of mature mare soils and weight percent TiO2 has been used extensively to map titanium content in the lunar maria. Relative reflectance spectra of mare regions show that a reference wavelength further into the near-IR, e.g., above 700 nm, could be used in place of the 560-nm band to provide greater contrast (a greater range of ratio values) and hence a more sensitive indicator of titanium content. An analysis of 400/730-nm ratio values derived from both laboratory and telescopic relative reflectance spectra suggests that this ratio provides greater sensitivity to TiO2 content than the 400/560-nm ratio. The increased range of ratio values is manifested in higher contrast 400/730-nm ratio images compared to 400/560-nm ratio images. This potential improvement in sensitivity encourages a reevaluation of the original Charette et al. (1974) relation using the 400/730-nm ratio.
Functional Parallel Factor Analysis for Functions of One- and Two-dimensional Arguments.
Choi, Ji Yeh; Hwang, Heungsun; Timmerman, Marieke E
2018-03-01
Parallel factor analysis (PARAFAC) is a useful multivariate method for decomposing three-way data that consist of three different types of entities simultaneously. This method estimates trilinear components, each of which is a low-dimensional representation of a set of entities, often called a mode, to explain the maximum variance of the data. Functional PARAFAC permits the entities in different modes to be smooth functions or curves, varying over a continuum, rather than a collection of unconnected responses. The existing functional PARAFAC methods handle functions of a one-dimensional argument (e.g., time) only. In this paper, we propose a new extension of functional PARAFAC for handling three-way data whose responses are sequenced along both a two-dimensional domain (e.g., a plane with x- and y-axis coordinates) and a one-dimensional argument. Technically, the proposed method combines PARAFAC with basis function expansion approximations, using a set of piecewise quadratic finite element basis functions for estimating two-dimensional smooth functions and a set of one-dimensional basis functions for estimating one-dimensional smooth functions. In a simulation study, the proposed method appeared to outperform the conventional PARAFAC. We apply the method to EEG data to demonstrate its empirical usefulness.
Supporting Regularized Logistic Regression Privately and Efficiently.
Li, Wenfa; Liu, Hongzhe; Yang, Peng; Xie, Wei
2016-01-01
As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data on human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work focuses on safeguarding regularized logistic regression, a widely used statistical model that has not, however, been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as research consortia or networks, as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency, and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, and network analysis. PMID:27271738
NASA Astrophysics Data System (ADS)
Mondal, Puskar; Korenaga, Jun
2018-03-01
The dispersion relation of the Rayleigh-Taylor instability, a gravitational instability associated with unstable density stratification, is of profound importance in various geophysical contexts. When more than two layers are involved, a semi-analytical technique based on the biharmonic formulation of Stokes flow has been extensively used to obtain such dispersion relation. However, this technique may become cumbersome when applied to lithospheric dynamics, where a number of layers are necessary to represent the continuous variation of viscosity over many orders of magnitude. Here, we present an alternative and more efficient method based on the propagator matrix formulation of Stokes flow. With this approach, the original instability problem is reduced to a compact eigenvalue equation whose size is solely determined by the number of primary density contrasts. We apply this new technique to the stability of the early crust, and combined with the Monte Carlo sensitivity analysis, we derive an empirical formula to compute the growth rate of the Rayleigh-Taylor instability for this particular geophysical setting. Our analysis indicates that the likelihood of crustal delamination hinges critically on the effective viscosity of eclogite.
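For orientation, the textbook inviscid end-member of this problem (the paper's propagator-matrix method generalizes it to many viscous layers): a heavy fluid of density ρ2 resting on a lighter fluid of density ρ1 is unstable, with perturbations of wavenumber k growing as e^(σt), where

```latex
\sigma^2 \;=\; \frac{\rho_2 - \rho_1}{\rho_2 + \rho_1}\, g k \;=\; A\, g k ,
```

with A the Atwood number. Viscosity and multiple layers replace this closed form with the compact eigenvalue problem described in the abstract.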
Compositional and strain analysis of In(Ga)N/GaN short period superlattices
NASA Astrophysics Data System (ADS)
Dimitrakopulos, G. P.; Vasileiadis, I. G.; Bazioti, C.; Smalc-Koziorowska, J.; Kret, S.; Dimakis, E.; Florini, N.; Kehagias, Th.; Suski, T.; Karakostas, Th.; Moustakas, T. D.; Komninou, Ph.
2018-01-01
Extensive high-resolution transmission and scanning transmission electron microscopy observations were performed on In(Ga)N/GaN multi-quantum-well short-period superlattices comprising two-dimensional quantum wells (QWs) of nominal thicknesses of 1, 2, and 4 monolayers (MLs), in order to correlate their average composition, geometry, and strain. The high-angle annular dark-field Z-contrast observations were quantified with regard to the indium content of the QWs and were correlated to their strain state using peak finding and geometrical phase analysis. Image simulations taking thorough account of the experimental imaging conditions were employed to associate the observed Z-contrast with the indium content. Energetically relaxed supercells calculated with a Tersoff empirical interatomic potential were used as input for these simulations. We found a deviation from the tetragonal distortion prescribed by continuum elasticity for thin films: the strain in the relaxed cells was lower than expected in the case of 1 ML QWs. In all samples, the QW thickness and strain were confined to at most 2 ML, with possible indium enrichment of the immediately abutting MLs. The average composition of the QWs was quantified in terms of alloy content.
Rapid decay in the relative efficiency of quarantine to halt epidemics in networks
NASA Astrophysics Data System (ADS)
Strona, Giovanni; Castellano, Claudio
2018-02-01
Several recent studies have tackled the issue of optimal network immunization by providing efficient criteria to identify key nodes to be removed in order to break apart a network, thus preventing the occurrence of extensive epidemic outbreaks. Yet, although the efficiency of these criteria has also been demonstrated in empirical networks, preventive immunization is rarely applied in real-world scenarios, where the usual approach is the a posteriori attempt to contain epidemic outbreaks using quarantine measures. Here we compare the efficiency of prevention with that of quarantine in terms of the tradeoff between the number of removed and saved nodes, on both synthetic and empirical topologies. We show how, consistent with common sense but contrary to common practice, in many cases preventing is better than curing: depending on network structure, rescuing an infected network by quarantine can become inefficient soon after the first infection.
Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.
Vassena, Eliana; Holroyd, Clay B; Alexander, William H
2017-01-01
In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.
Review essay: empires, ancient and modern.
Hall, John A
2011-09-01
This essay draws attention to two books on empires by historians which deserve the attention of sociologists. Bang's model of the workings of the Roman economy powerfully demonstrates the tributary nature of pre-industrial empires. Darwin's analysis concentrates on modern overseas empires, wholly different in character as they involved the transportation of consumption items for the many rather than luxury goods for the few. Darwin is especially good at describing the conditions of existence of late nineteenth-century empires, noting that their demise was caused most of all by the failure of balance-of-power politics in Europe. Concluding thoughts are offered about the USA. © London School of Economics and Political Science 2011.
Dealing with noise and physiological artifacts in human EEG recordings: empirical mode methods
NASA Astrophysics Data System (ADS)
Runnova, Anastasiya E.; Grubov, Vadim V.; Khramova, Marina V.; Hramov, Alexander E.
2017-04-01
In this paper we propose a new method for removing noise and physiological artifacts from human EEG recordings, based on empirical mode decomposition (the Hilbert-Huang transform). As physiological artifacts we consider specific oscillatory patterns that cause problems during EEG analysis and can be detected with additional signals recorded simultaneously with the EEG (ECG, EMG, EOG, etc.). The proposed algorithm comprises empirical mode decomposition of the EEG signal, selection of the empirical modes containing artifacts, removal of these modes, and reconstruction of the initial EEG signal. We demonstrate the efficiency of the method by removing eye-movement artifacts from a human EEG signal.
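A compact sketch of the four algorithm steps using a generic EMD implementation. The PyEMD dependency and the EOG-correlation rule for flagging artifact modes are assumptions for illustration; any routine returning intrinsic mode functions (IMFs) and any artifact detector would fit the same pipeline.

```python
import numpy as np
from PyEMD import EMD  # assumed dependency: pip install EMD-signal

def remove_artifact_modes(eeg, eog, corr_threshold=0.5):
    """EMD-based artifact removal using a simultaneously recorded EOG."""
    imfs = EMD().emd(eeg)                    # step 1: decompose into IMFs
    keep = []
    for imf in imfs:                         # steps 2-3: flag artifact modes
        r = np.corrcoef(imf, eog)[0, 1]      # illustrative detection rule
        if abs(r) < corr_threshold:
            keep.append(imf)
    return np.sum(keep, axis=0) if keep else np.zeros_like(eeg)  # step 4

# cleaned = remove_artifact_modes(eeg, eog)  # eeg, eog: 1-D float arrays
```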
fMRI Analysis-by-Synthesis Reveals a Dorsal Hierarchy That Extracts Surface Slant.
Ban, Hiroshi; Welchman, Andrew E
2015-07-08
The brain's skill in estimating the 3-D orientation of viewed surfaces supports a range of behaviors, from placing an object on a nearby table, to planning the best route when hill walking. This ability relies on integrating depth signals across extensive regions of space that exceed the receptive fields of early sensory neurons. Although hierarchical selection and pooling is central to understanding of the ventral visual pathway, the successive operations in the dorsal stream are poorly understood. Here we use computational modeling of human fMRI signals to probe the computations that extract 3-D surface orientation from binocular disparity. To understand how representations evolve across the hierarchy, we developed an inference approach using a series of generative models to explain the empirical fMRI data in different cortical areas. Specifically, we simulated the responses of candidate visual processing algorithms and tested how well they explained fMRI responses. Thereby we demonstrate a hierarchical refinement of visual representations moving from the representation of edges and figure-ground segmentation (V1, V2) to spatially extensive disparity gradients in V3A. We show that responses in V3A are little affected by low-level image covariates, and have a partial tolerance to the overall depth position. Finally, we show that responses in V3A parallel perceptual judgments of slant. This reveals a relatively short computational hierarchy that captures key information about the 3-D structure of nearby surfaces, and more generally demonstrates an analysis approach that may be of merit in a diverse range of brain imaging domains. Copyright © 2015 Ban and Welchman.
The Ideologies of American Social Critics: An Empirical Test of Kadushin's Theory
ERIC Educational Resources Information Center
Simon, David R.
1977-01-01
Examines Kadushin's earlier empirical efforts to determine the leading social critics and organizations of social criticism in America and investigates his theory through content analysis of leading journals of social criticism. (MH)
Kagel, John H.; Winkler, Robin C.
1972-01-01
The current research methods of behavioral economics are characterized by inadequate empirical foundations. Psychologists involved in the experimental analysis of behavior with their research strategies and their experimental technology, particularly that of the Token Economy, can assist in providing empirical foundations for behavioral economics. Cooperative research between economists and psychologists to this end should be immediately fruitful and mutually beneficial. PMID:16795356
ERIC Educational Resources Information Center
Adeyemo, Emily Oluseyi
2012-01-01
This study examined the impact of publication bias on a meta-analysis of empirical studies on validity of University Matriculation Examinations in Nigeria with a view to determine the level of difference between published and unpublished articles. Specifically, the design was an ex-post facto, a causal comparative design. The sample size consisted…
ERIC Educational Resources Information Center
Angrist, Joshua; Pischke, Jorn-Steffen
2010-01-01
This essay reviews progress in empirical economics since Leamer's (1983) critique. Leamer highlighted the benefits of sensitivity analysis, a procedure in which researchers show how their results change with changes in specification or functional form. Sensitivity analysis has had a salutary but not a revolutionary effect on econometric practice.…
No complexity–stability relationship in empirical ecosystems
Jacquet, Claire; Moritz, Charlotte; Morissette, Lyne; Legagneux, Pierre; Massol, François; Archambault, Philippe; Gravel, Dominique
2016-01-01
Understanding the mechanisms responsible for stability and persistence of ecosystems is one of the greatest challenges in ecology. Robert May showed that, contrary to intuition, complex randomly built ecosystems are less likely to be stable than simpler ones. Few attempts have been made to test May's prediction empirically, and the actual complexity–stability relationship in natural ecosystems remains unknown. Here we perform a stability analysis of 116 quantitative food webs sampled worldwide. We find that classic descriptors of complexity (species richness, connectance and interaction strength) are not associated with stability in empirical food webs. Further analysis reveals that a correlation between the effects of predators on prey and those of prey on predators, combined with a high frequency of weak interactions, stabilizes food web dynamics relative to the random expectation. We conclude that empirical food webs have several non-random properties contributing to the absence of a complexity–stability relationship. PMID:27553393
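The random expectation invoked here is May's classic criterion: a random community matrix with S species, connectance C, and interaction-strength spread σ is almost surely unstable once σ√(SC) exceeds the self-regulation strength d. A minimal sketch of that stability test, with all parameter values chosen for illustration rather than taken from the paper:

```python
# May's (1972) random-matrix stability test, the baseline the authors
# compare empirical food webs against. Parameter names (S, C, sigma, d)
# follow the standard convention, not the paper's code.
import numpy as np

def is_stable(S=50, C=0.2, sigma=0.5, d=1.0, rng=None):
    """Build a random community matrix and test local stability.

    S: species richness; C: connectance (probability of interaction);
    sigma: std. dev. of interaction strengths; d: self-regulation.
    Stable iff the largest real part of the eigenvalues is negative.
    """
    rng = rng or np.random.default_rng()
    A = np.where(rng.random((S, S)) < C,
                 rng.normal(0.0, sigma, (S, S)), 0.0)
    np.fill_diagonal(A, -d)  # self-limitation on the diagonal
    return np.linalg.eigvals(A).real.max() < 0

# May's criterion predicts instability when sigma * sqrt(S*C) > d;
# here sigma*sqrt(S*C) ~ 1.58 > 1, so most draws should be unstable:
stable_frac = np.mean([is_stable() for _ in range(200)])
print(f"fraction stable: {stable_frac:.2f}")
```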
An empirically based model for knowledge management in health care organizations.
Sibbald, Shannon L; Wathen, C Nadine; Kothari, Anita
2016-01-01
Knowledge management (KM) encompasses strategies, processes, and practices that allow an organization to capture, share, store, access, and use knowledge. Ideal KM combines different sources of knowledge to support innovation and improve performance. Despite the importance of KM in health care organizations (HCOs), there has been very little empirical research to describe KM in this context. This study explores KM in HCOs, focusing on the status of current intraorganizational KM. The intention is to provide insight for future studies and model development for effective KM implementation in HCOs. A qualitative methods approach was used to create an empirically based model of KM in HCOs. Methods included (a) qualitative interviews (n = 24) with senior leadership to identify types of knowledge important in these roles plus current information-seeking behaviors/needs and (b) in-depth case study with leaders in new executive positions (n = 2). The data were collected from 10 HCOs. Our empirically based model for KM was assessed for face and content validity. The findings highlight the paucity of formal KM in our sample HCOs. Organizational culture, leadership, and resources are instrumental in supporting KM processes. An executive's knowledge needs are extensive, but knowledge assets are often limited or difficult to acquire as much of the available information is not in a usable format. We propose an empirically based model for KM to highlight the importance of context (internal and external), and knowledge seeking, synthesis, sharing, and organization. Participants who reviewed the model supported its basic components and processes, and potential for incorporating KM into organizational processes. Our results articulate ways to improve KM, increase organizational learning, and support evidence-informed decision-making. This research has implications for how to better integrate evidence and knowledge into organizations while considering context and the role of organizational processes.
Alpers, Charles N.; Myers, Perry A; Millsap, Daniel; Regnier, Tamsen B; Bowell, Robert J.; Alpers, Charles N.; Jamieson, Heather E.; Nordstrom, D. Kirk; Majzlan, Juraj
2014-01-01
The Empire Mine, together with other mines in the Grass Valley mining district, produced at least 21.3 million troy ounces (663 tonnes) of gold (Au) during the 1850s through the 1950s, making it the most productive hardrock Au mining district in California history (Clark 1970). The Empire Mine State Historic Park (Empire Mine SHP or EMSHP), established in 1975, provides the public with an opportunity to see many well-preserved features of the historic mining and mineral processing operations (CDPR 2014a). A legacy of Au mining at Empire Mine and elsewhere is contamination of mine wastes and associated soils, surface waters, and groundwaters with arsenic (As), mercury (Hg), lead (Pb), and other metals. At EMSHP, As has been the principal contaminant of concern and the focus of extensive remediation efforts over the past several years by the State of California, Department of Parks and Recreation (DPR) and Newmont USA, Ltd. In addition, the site is the main focus of a multidisciplinary research project on As bioavailability and bioaccessibility led by the California Department of Toxic Substances Control (DTSC) and funded by the U.S. Environmental Protection Agency’s (USEPA’s) Brownfields Program. This chapter was prepared as a guide for a field trip to EMSHP held on June 14, 2014, in conjunction with a short course on “Environmental Geochemistry, Mineralogy, and Microbiology of Arsenic” held in Nevada City, California on June 15–16, 2014. This guide contains background information on geological setting, mining history, and environmental history at EMSHP and other historical Au mining districts in the Sierra Nevada, followed by descriptions of the field trip stops.
Discovery of Empirical Components by Information Theory
2016-08-10
AFRL-AFOSR-VA-TR-2016-0289. Principal investigator: Amit Singer, Trustees of Princeton University, 1 Nassau Hall. Dates covered: 15 Feb 2013 to 14 Feb 2016. The methods developed draw not only from traditional linear-algebra-based numerical analysis and approximation theory, but also from information theory and graph theory.
ERIC Educational Resources Information Center
Diesel, Vivien; Miná Dias, Marcelo
2016-01-01
Purpose: To analyze the Brazilian experience in designing and implementing a recent extension policy reform based on agroecology, and reflect on its wider theoretical implications for the extension reform literature. Design/methodology/approach: Using a critical public policy analysis, we characterize the evolution of Brazilian federal extension policy…
Clinical characteristics of ceftriaxone plus metronidazole in complicated intra-abdominal infection
2015-01-01
Purpose: Empirical antibiotics are the first step in the treatment of complicated intra-abdominal infection (c-IAI), such as secondary peritonitis, and the empirical regimens used are very diverse. The ceftriaxone plus metronidazole regimen (CMR) is one of the empirical antibiotic regimens used in the treatment of c-IAI. However, although CMR is widely used, its success, failure, and efficacy remain poorly studied. This retrospective study was conducted to compare the clinical efficacy of this regimen in c-IAI according to clinical characteristics. Methods: The subjects were patients in this hospital who were diagnosed with secondary peritonitis between 2009 and 2013. Retrospective analysis was performed based on records made after surgery regarding clinical characteristics including albumin level, blood pressure, pulse rate, respiration rate, smoking, age, sex, body mass index, hemoglobin, coexisting disease, leukocytosis, and APACHE (acute physiology and chronic health evaluation) II score. Results: A total of 114 patients were enrolled. In univariate analysis, the success or failure of CMR showed significant association with preoperative low albumin, old age, and preoperative tachycardia. In multivariate analysis, low albumin and preoperative tachycardia remained significant. Conclusion: An additional antibiotic treatment plan appears necessary in patients with low albumin and tachycardia when CMR is used as the empirical regimen in c-IAI. Well-designed prospective randomized clinical studies are also needed to evaluate the appropriateness of CMR and to choose an appropriate empirical antibiotic regimen for c-IAI among the many candidates in our country. PMID:26131444
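The univariate-then-multivariate screening described might be sketched as follows; the data are synthetic stand-ins and all column names are hypothetical, not the study's actual code:

```python
# A hedged sketch of a univariate screen followed by a multivariate
# logistic model for predictors of CMR failure.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data (the real study had n = 114 patients).
rng = np.random.default_rng(0)
n = 114
df = pd.DataFrame({
    "albumin": rng.normal(3.5, 0.6, n),
    "age": rng.normal(60, 15, n),
    "tachycardia": rng.binomial(1, 0.3, n),
})
logit = -2.0 * (df["albumin"] - 3.5) + 1.2 * df["tachycardia"]
df["cmr_failure"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Univariate screen: one logistic model per clinical characteristic.
for var in ["albumin", "age", "tachycardia"]:
    fit = sm.Logit(df["cmr_failure"], sm.add_constant(df[[var]])).fit(disp=0)
    print(f"{var}: p = {fit.pvalues[var]:.4f}")

# Multivariate model over covariates retained from the screen.
X = sm.add_constant(df[["albumin", "tachycardia"]])
print(sm.Logit(df["cmr_failure"], X).fit(disp=0).summary())
```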
Nakamura, Akiko; Sugimoto, Yuka; Ohishi, Kohshi; Sugawara, Yumiko; Fujieda, Atsushi; Monma, Fumihiko; Suzuki, Kei; Masuya, Masahiro; Nakase, Kazunori; Matsushima, Yoshiko; Wada, Hideo; Katayama, Naoyuki; Nobori, Tsutomu
2010-01-01
This study aimed to assess the clinical utility of PCR for the analysis of bacteria and fungi from blood for the management of febrile neutropenic patients with hematologic malignancies. Using a PCR system able to detect a broad range of bacteria and fungi, we conducted a prospective pilot study of periodic analyses of blood from patients following intensive chemotherapy. When fever occurred, it was treated with empirical antibiotic therapy, basically without knowledge of the PCR results. In 23 febrile episodes during the neutropenic period, bacteria were detected by PCR in 11 cases, while the same species were identified by blood culture in 3 cases. In 10 out of 11 PCR-positive cases, fever could be managed by empirical therapy. In the empirical-therapy-resistant case, the identification of Stenotrophomonas maltophilia by PCR led to improvement of fever. No fungi were detected by PCR in febrile cases, while Aspergillus fumigatus was detected in one afebrile patient, several days before a clinical diagnosis was made. In subsequent sporadic PCR analyses in 15 cases of febrile neutropenia, bacteria were detected by both PCR and blood culture in 7 cases and by PCR alone in 6. Fungi were not detected. While fever was improved by empirical therapy in 12 out of the 13 PCR-positive cases, the identification of Pseudomonas aeruginosa by PCR in one therapy-resistant case contributed to the successful treatment of persistent fever. Our results indicate that PCR analysis of bacteria from blood provides essential information for managing empirical-therapy-resistant febrile neutropenia. PMID:20392911
NASA Astrophysics Data System (ADS)
Mullen, Katharine M.
Human-technology integration is the replacement of human parts and the extension of human capabilities with engineered devices and substrates. The result is hybrid biological-artificial systems. We discuss here four categories of products furthering human-technology integration: wearable computers, pervasive computing environments, engineered tissues and organs, and prosthetics, and introduce examples of currently realized systems in each category. We then note that realization of a completely artificial system via the path of human-technology integration presents the prospect of empirical confirmation of an aware, artificially embodied system.
NASA Astrophysics Data System (ADS)
Mosha, Herme Joseph
1988-03-01
This article seeks to identify factors affecting the quality of primary education in five regions of Tanzania by extensively reviewing relevant literature and empirical data. Some of the shortcomings emphasised by the author are: frequent staff turnover, declining financial support for primary education, ineffective curricula, shortage of teachers' guides and textbooks, and unfavourable working conditions for teachers in rural areas. Beyond this, the need for freely available material, efficient school management and regular inspections is stressed by the author.
Compressive Properties of Extruded Polytetrafluoroethylene
2007-07-01
...against equivalent temperature (T_map) at a single strain rate (ε̇_map). This is a pragmatic, empirically based linearization and extension to large strains...one of the strain rates that was used in the experimental program, and in this case two rates were used: 0.1 s⁻¹ and 3200 s⁻¹. The value T_map is defined as T_map = T_exp + A log(ε̇_map / ε̇_exp) (11), where the subscript exp indicates the experimental values of strain rate and temperature.
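As a worked example of the reconstructed Eq. (11), a minimal sketch assuming base-10 logarithms and illustrative numerical values (none taken from the report):

```python
# Worked example of Eq. (11) as reconstructed above. The base-10 log and
# all numbers (A, temperature, rates) are illustrative assumptions.
import math

def t_map(T_exp, rate_map, rate_exp, A):
    """Equivalent temperature for strain-rate-to-temperature mapping."""
    return T_exp + A * math.log10(rate_map / rate_exp)

# Mapping a high-rate test (3200 1/s) onto the low-rate axis (0.1 1/s):
print(t_map(T_exp=296.0, rate_map=3200.0, rate_exp=0.1, A=5.0))  # ~318.5 K
```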
Dilution jet mixing program, phase 3
NASA Technical Reports Server (NTRS)
Srinivasan, R.; Coleman, E.; Myers, G.; White, C.
1985-01-01
The main objectives for the NASA Jet Mixing Phase 3 program were: extension of the data base on the mixing of single sided rows of jets in a confined cross flow to discrete slots, including streamlined, bluff, and angled injections; quantification of the effects of geometrical and flow parameters on penetration and mixing of multiple rows of jets into a confined flow; investigation of in-line, staggered, and dissimilar hole configurations; and development of empirical correlations for predicting temperature distributions for discrete slots and multiple rows of dilution holes.
Raible, C; Leidl, R
2004-11-01
The German hospital market faces an extensive process of consolidation, and in this change hospitals consider cooperation as one possibility to improve competitiveness. This study investigates explanations of the changes in the German hospital market offered by theoretical approaches from cooperation research. The aims and mechanisms of the theories, their relevance in terms of content, and their potential for empirical tests were used as criteria to assess the approaches, with current and future trends in the German hospital market providing the framework. Based on a literature review, six theoretical approaches were investigated: industrial organization, transaction cost theory, game theory, resource dependency, institutional theory, and cooperative investment and finance theory. In addition, the data needed to empirically test the theories were specified. As a general problem, some of the theoretical approaches presuppose a perfect market. This precondition is not met by the heavily regulated German hospital market. Given the current regulations and the assessment criteria, industrial organization as well as resource-dependency and institutional theory approaches showed the highest potential to explain various aspects of the changes in the hospital market. So far, none of the approaches investigated provides a comprehensive and empirically tested explanation of the changes in the German hospital market. However, some of the approaches provide a theoretical background for part of the changes. As this dynamic market is economically highly significant, there is a need for further development and empirical testing of relevant theoretical approaches.
Naikar, Neelam; Elix, Ben
2016-01-01
This paper proposes an approach for integrated system design, which has the intent of facilitating high levels of effectiveness in sociotechnical systems by promoting their capacity for adaptation. Building on earlier ideas and empirical observations, this approach recognizes that to create adaptive systems it is necessary to integrate the design of all of the system elements, including the interfaces, teams, training, and automation, such that workers are supported in adapting their behavior as well as their structure, or organization, in a coherent manner. Current approaches for work analysis and design are limited in regard to this fundamental objective, especially in cases when workers are confronted with unforeseen events. A suitable starting point is offered by cognitive work analysis (CWA), but while this framework can support actors in adapting their behavior, it does not necessarily accommodate adaptations in their structure. Moreover, associated design approaches generally focus on individual system elements, and those that consider multiple elements appear limited in their ability to facilitate integration, especially in the manner intended here. The proposed approach puts forward the set of possibilities for work organization in a system as the central mechanism for binding the design of its various elements, so that actors can adapt their structure as well as their behavior—in a unified fashion—to handle both familiar and novel conditions. Accordingly, this paper demonstrates how the set of possibilities for work organization in a system may be demarcated independently of the situation, through extensions of CWA, and how it may be utilized in design. This lynchpin, conceptualized in the form of a diagram of work organization possibilities (WOP), is important for preserving a system's inherent capacity for adaptation. Future research should focus on validating these concepts and establishing the feasibility of implementing them in industrial contexts. PMID:27445924
Austin, Peter C; Schuster, Tibor; Platt, Robert W
2015-10-15
Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the strength of the treatment-selection process increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
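One replicate of the kind of simulation described might look like the sketch below; the data-generating values, the scikit-learn propensity model, and the lifelines weighted Cox fit are all illustrative assumptions, not the authors' code:

```python
# A hedged sketch of one Monte Carlo replicate: simulate confounded
# treatment assignment with time-to-event outcomes, fit a propensity
# model, form stabilized IPTW weights, and estimate the marginal HR.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(size=n)                               # measured confounder
z = rng.binomial(1, 1 / (1 + np.exp(-x)))            # treatment-selection model
t = rng.exponential(1 / np.exp(0.5 * x - 0.3 * z))   # true log-HR = -0.3
c = rng.exponential(2.0, size=n)                     # independent censoring
df = pd.DataFrame({"time": np.minimum(t, c),
                   "event": (t <= c).astype(int), "z": z})

# Propensity score from the measured confounder, then stabilized weights.
ps = LogisticRegression().fit(x[:, None], z).predict_proba(x[:, None])[:, 1]
df["w"] = np.where(z == 1, z.mean() / ps, (1 - z.mean()) / (1 - ps))

fit = CoxPHFitter().fit(df, "time", "event", weights_col="w", robust=True)
print(fit.summary.loc["z", ["coef", "p"]])
# Power = the fraction of such replicates with p < 0.05 for z.
```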
Multiset canonical correlations analysis and multispectral, truly multitemporal remote sensing data.
Nielsen, Allan Aasbjerg
2002-01-01
This paper describes two- and multiset canonical correlations analysis (CCA) for data fusion, multisource, multiset, or multitemporal exploratory data analysis. These techniques transform multivariate multiset data into new orthogonal variables called canonical variates (CVs) which, when applied in remote sensing, exhibit ever-decreasing similarity (as expressed by correlation measures) over sets consisting of 1) spectral variables at fixed points in time (R-mode analysis), or 2) temporal variables with fixed wavelengths (T-mode analysis). The CVs are invariant to linear and affine transformations of the original variables within sets which means, for example, that the R-mode CVs are insensitive to changes over time in offset and gain in a measuring device. In a case study, CVs are calculated from Landsat Thematic Mapper (TM) data with six spectral bands over six consecutive years. Both R- and T-mode CVs clearly exhibit the desired characteristic: they show maximum similarity for the low-order canonical variates and minimum similarity for the high-order canonical variates. These characteristics are seen both visually and in objective measures. The results from the multiset CCA R- and T-mode analyses are very different. This difference is ascribed to the noise structure in the data. The CCA methods are related to partial least squares (PLS) methods. This paper very briefly describes multiset CCA-based multiset PLS. Also, the CCA methods can be applied as multivariate extensions to empirical orthogonal functions (EOF) techniques. Multiset CCA is well-suited for inclusion in geographical information systems (GIS).
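For concreteness, a minimal two-set illustration of the CCA machinery the paper generalizes, using synthetic stand-ins for two six-band acquisitions (all data and names assumed, not from the case study):

```python
# Two-set CCA on synthetic "six-band" data from two acquisition dates:
# canonical variates show maximal, then ever-decreasing, correlation,
# and are invariant to the affine gain/offset change applied to Y.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
common = rng.normal(size=(500, 6))                        # shared scene structure
X = common + 0.5 * rng.normal(size=(500, 6))              # date 1, six bands
Y = 2.0 * common + 3.0 + 0.5 * rng.normal(size=(500, 6))  # date 2: gain/offset changed

cca = CCA(n_components=6).fit(X, Y)
U, V = cca.transform(X, Y)
for k in range(6):
    r = np.corrcoef(U[:, k], V[:, k])[0, 1]
    print(f"canonical variate {k + 1}: correlation = {r:.3f}")
```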
Guo, J L; Wang, T F; Liao, J Y; Huang, C M
2016-02-01
This study assessed the applicability and efficacy of the theory of planned behavior (TPB) in predicting breastfeeding. The TPB assumes a rational approach for engaging in various behaviors, and has been used extensively for explaining health behavior. However, most studies have tested the effectiveness of TPB constructs in predicting how people perform actions for their own benefit rather than performing behaviors that are beneficial to others, such as breastfeeding infants. A meta-analytic approach could help clarify the determinants of breastfeeding practice and inform its promotion. This study used meta-analytic procedures. We searched for studies to include in our analysis, examining those published between January 1, 1990 and December 31, 2013 in PubMed, Medline, CINAHL, ProQuest, and Mosby's Index. We also reviewed journals with a history of publishing breastfeeding studies and searched reference lists for potential articles to include. Ten studies comprising a total of 2694 participants were selected for analysis. These studies yielded 10 effect sizes from the TPB, which ranged from 0.20 to 0.59. Structural equation model analysis using the pooled correlation matrix enabled us to determine the relative coefficients among TPB constructs. Attitude, subjective norms, and perceived behavioral control were all significant predictors of breastfeeding intention, whereas intention was a strong predictor of breastfeeding behavior. Perceived behavioral control reached a borderline level of significance to breastfeeding behavior. Theoretical and empirical implications are discussed from the perspective of evidence-based practice. Copyright © 2015 Elsevier Inc. All rights reserved.
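The pooled correlation matrix feeding the structural equation model is typically built entry by entry from sample correlations; a minimal sketch using Fisher's z transform, with made-up r and n values rather than the study's data:

```python
# Sample-size-weighted pooling of correlations via Fisher's z transform,
# the usual way each cell of a meta-analytic correlation matrix is built.
import numpy as np

def pooled_r(rs, ns):
    """Inverse-variance-weighted mean correlation; var(z) = 1/(n-3)."""
    zs = np.arctanh(np.asarray(rs, dtype=float))
    w = np.asarray(ns, dtype=float) - 3.0
    return np.tanh((w * zs).sum() / w.sum())

# e.g. pooling a hypothetical attitude-intention correlation across samples:
print(round(pooled_r([0.45, 0.52, 0.38], [210, 340, 150]), 3))
```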
Tsang, Hector W H; Ching, S C; Tang, K H; Lam, H T; Law, Peggy Y Y; Wan, C N
2016-05-01
Internalized stigma can have pervasive negative effects among people with severe mental illness (SMI). Although the prevalence of internalized stigma is high, there is a dearth of interventions and a lack of evidence as to their effectiveness. This study aims to identify existing therapeutic interventions and to evaluate their effectiveness in reducing internalized stigma in people with SMI via a systematic review and meta-analysis. Five electronic databases were searched. Studies were included if they (1) involved community- or hospital-based interventions targeting internalized stigma, (2) included samples in which more than 50% of participants had a diagnosis of SMI, and (3) were empirical and quantitative in nature. Fourteen articles were selected for extensive review and five for meta-analysis. Nine studies showed a significant decrease in internalized stigma and two showed sustained effects. Meta-analysis showed a small to moderate significant effect of therapeutic interventions (SMD=-0.43; p=0.003). Among the intervention elements, four studies suggested a favorable effect of psychoeducation, for which meta-analysis showed a small to moderate significant effect (SMD=-0.40; p=0.001). Most internalized stigma reduction programs appear to be effective. This systematic review cannot recommend any one intervention as more effective, although psychoeducation seems most promising. More randomized controlled trials (RCTs) of particular intervention components using standard outcome measures are recommended in future studies. Copyright © 2016. Published by Elsevier B.V.
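A pooled SMD of this kind is commonly obtained with DerSimonian-Laird random-effects weighting; a minimal sketch, with the five (SMD, SE) pairs below purely illustrative:

```python
# DerSimonian-Laird random-effects pooling of standardized mean
# differences; the input effect sizes are made up, not the study's data.
import numpy as np

def dersimonian_laird(smd, se):
    smd, v = np.asarray(smd, float), np.asarray(se, float) ** 2
    w = 1.0 / v                                   # fixed-effect weights
    fe = (w * smd).sum() / w.sum()
    q = (w * (smd - fe) ** 2).sum()               # Cochran's Q
    tau2 = max(0.0, (q - (len(smd) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    pooled = (w_re * smd).sum() / w_re.sum()
    return pooled, (1.0 / w_re.sum()) ** 0.5      # pooled SMD and its SE

smd, se = dersimonian_laird([-0.6, -0.2, -0.5, -0.4, -0.3],
                            [0.2, 0.15, 0.25, 0.2, 0.18])
print(f"pooled SMD = {smd:.2f} (SE {se:.2f})")
```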
SMART (Sports Medicine and Rehabilitation Team) Centers: An Empirical Analysis
2007-04-01
In an era of finite health care resources, increased military operational tempo, and smaller expeditionary fighting forces, the US Navy has developed SMART (Sports Medicine and Rehabilitation Team) Centers.