Conducting a Multivocal Thematic Synthesis on an Extensive Body of Literature
ERIC Educational Resources Information Center
Befus, Madelaine
2016-01-01
This paper will provide a methodology and progress report from a multivocal thematic synthesis being conducted on an extensive, diverse body of empirical studies. The study data includes a corpus of peer-reviewed empirical literature sharing a common reference published in English between 2000 and 2014. In this study, data to be synthesized share…
[Mobbing: a meta-analysis and integrative model of its antecedents and consequences].
Topa Cantisano, Gabriela; Depolo, Marco; Morales Domínguez, J Francisco
2007-02-01
Although mobbing has been extensively studied, empirical research has not led to firm conclusions regarding its antecedents and consequences, both at personal and organizational levels. An extensive literature search yielded 86 empirical studies with 93 samples. The correlation matrix obtained through meta-analytic techniques was used to test a structural equation model. Results supported hypotheses regarding organizational environmental factors as main predictors of mobbing.
The Effects of Extensive Reading on Reading Comprehension, Reading Rate, and Vocabulary Acquisition
ERIC Educational Resources Information Center
Suk, Namhee
2017-01-01
Several empirical studies and syntheses of extensive reading have concluded that extensive reading has positive impacts on language learning in second- and foreign-language settings. However, many of the studies contained methodological or curricular limitations, raising questions about the asserted positive effects of extensive reading. The…
Topa Cantisano, Gabriela; Morales Domínguez, J F; Depolo, Marco
2008-05-01
Although sexual harassment has been extensively studied, empirical research has not led to firm conclusions about its antecedents and consequences, both at the personal and organizational level. An extensive literature search yielded 42 empirical studies with 60 samples. The correlation matrix obtained through meta-analytic techniques was used to test a structural equation model. Results supported the hypotheses regarding organizational environmental factors as main predictors of harassment.
ERIC Educational Resources Information Center
Liu, Xun
2010-01-01
This study extended the technology acceptance model and empirically tested the new model with wikis, a new type of educational technology. Based on social cognitive theory and the theory of planned behavior, three new variables, wiki self-efficacy, online posting anxiety, and perceived behavioral control, were added to the original technology…
DOT National Transportation Integrated Search
2017-02-08
The study re-evaluates distress prediction models using the Mechanistic-Empirical Pavement Design Guide (MEPDG) and expands the sensitivity analysis to a wide range of pavement structures and soils. In addition, an extensive validation analysis of th...
ERIC Educational Resources Information Center
Forster, Greg
2008-01-01
The impact of Florida's "A+" accountability program, which until 2006 included a voucher program for chronically failing schools, on public school performance has been extensively studied. The results have consistently shown a positive effect on academic outcomes in Florida public schools. However, no empirical research has been done on…
Quantitative genetic versions of Hamilton's rule with empirical applications
McGlothlin, Joel W.; Wolf, Jason B.; Brodie, Edmund D.; Moore, Allen J.
2014-01-01
Hamilton's theory of inclusive fitness revolutionized our understanding of the evolution of social interactions. Surprisingly, an incorporation of Hamilton's perspective into the quantitative genetic theory of phenotypic evolution has been slow, despite the popularity of quantitative genetics in evolutionary studies. Here, we discuss several versions of Hamilton's rule for social evolution from a quantitative genetic perspective, emphasizing its utility in empirical applications. Although evolutionary quantitative genetics offers methods to measure each of the critical parameters of Hamilton's rule, empirical work has lagged behind theory. In particular, we lack studies of selection on altruistic traits in the wild. Fitness costs and benefits of altruism can be estimated using a simple extension of phenotypic selection analysis that incorporates the traits of social interactants. We also discuss the importance of considering the genetic influence of the social environment, or indirect genetic effects (IGEs), in the context of Hamilton's rule. Research in social evolution has generated an extensive body of empirical work focusing—with good reason—almost solely on relatedness. We argue that quantifying the roles of social and non-social components of selection and IGEs, in addition to relatedness, is now timely and should provide unique additional insights into social evolution. PMID:24686930
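As a point of reference for the quantitative genetic versions discussed above, the classic form of Hamilton's rule is sketched below, together with the partial-regression reading of costs and benefits used in phenotypic selection analysis extended to social partners. The notation is the conventional one, not quoted from the paper:

```latex
% Hamilton's rule: an altruistic trait spreads when the
% relatedness-weighted benefit exceeds the cost.
r\,b - c > 0
% In a phenotypic selection analysis that includes the traits of social
% interactants, c and b can be read as partial regression coefficients of
% relative fitness w on the actor's own trait z and the partner's trait z':
c = -\beta_{w z \cdot z'}, \qquad b = \beta_{w z' \cdot z}
```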
Transition mixing study empirical model report
NASA Technical Reports Server (NTRS)
Srinivasan, R.; White, C.
1988-01-01
The empirical model developed in the NASA Dilution Jet Mixing Program has been extended to include the curvature effects of transition liners. This extension is based on the results of a 3-D numerical model generated under this contract. The empirical model results agree well with the numerical model results for all test cases evaluated. The empirical model shows faster mixing rates compared to the numerical model. Both models show drift of jets toward the inner wall of a turning duct. The structure of the jets from the inner wall does not exhibit the familiar kidney-shaped structures observed for the outer wall jets or for jets injected in rectangular ducts.
An Extension of the Partial Credit Model with an Application to the Measurement of Change.
ERIC Educational Resources Information Center
Fischer, Gerhard H.; Ponocny, Ivo
1994-01-01
An extension to the partial credit model, the linear partial credit model, is considered under the assumption of a certain linear decomposition of the item x category parameters into basic parameters. A conditional maximum likelihood algorithm for estimating basic parameters is presented and illustrated with simulation and an empirical study. (SLD)
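A hedged sketch of the model structure described above, in standard item response theory notation; the design weights q and basic parameters α stand for the linear decomposition the paper assumes, and are not quoted from it:

```latex
% Partial credit model for item i with categories h = 0, ..., m_i:
P(X_i = h \mid \theta) =
  \frac{\exp\!\left(h\theta - \beta_{ih}\right)}
       {\sum_{l=0}^{m_i} \exp\!\left(l\theta - \beta_{il}\right)}
% Linear partial credit model: the item-by-category parameters are
% decomposed into basic parameters \alpha_j via known weights q_{ihj}:
\beta_{ih} = \sum_{j} q_{ihj}\,\alpha_j
```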
Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme
2008-01-01
Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. 
We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
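The neutral expectation stated above, D = 2Fst/(1 − Fst) G, can be illustrated numerically. This is a minimal sketch of the proportionality relationship only, not the CPC/Bartlett machinery of the paper; the matrices and the Fst value are made up:

```python
import numpy as np

def neutral_expectation(G, fst):
    """Among-population covariance matrix expected under neutrality:
    D = 2*Fst / (1 - Fst) * G."""
    return (2.0 * fst / (1.0 - fst)) * G

def proportionality_coefficient(D, G):
    """Least-squares estimate of k in D ~ k*G (a crude stand-in for the
    CPC-based proportionality test used in the paper)."""
    g = G.ravel()
    return float(g @ D.ravel() / (g @ g))

# Toy two-trait example, built to be exactly neutral at Fst = 0.2,
# where the expected coefficient is 2*0.2/0.8 = 0.5.
G = np.array([[2.0, 0.5],
              [0.5, 1.0]])
fst = 0.2
D = neutral_expectation(G, fst)
k_hat = proportionality_coefficient(D, G)   # -> 0.5
```

In a real analysis D and G would be MANOVA estimates from population samples, and the test would ask both whether the matrices are proportional and whether the coefficient matches its neutral value.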
ERIC Educational Resources Information Center
Oladele, O. I.; Adekoya, A. E.
2006-01-01
This paper examines the implications of farmers' propensity to discontinue the adoption of agricultural technologies in southwestern Nigeria. This is predicated on the fact that extension education process should be proactive in addressing farmers in order to sustain the adoption process. Empirical studies looking at diffusion processes from an…
ERIC Educational Resources Information Center
Sanga, Camilius; Mlozi, Malongo; Haug, Ruth; Tumbo, Siza
2016-01-01
The ubiquitous nature of mobile phones offers a noble environment where farmers can learn informally anywhere, anytime and at any location. This is an innovative way to address some of the weakness of conventional agricultural extension service. Few empirical studies have reported on the development of mobile phone application to support blended…
Study Guide in Health Economics.
ERIC Educational Resources Information Center
Dawson, George; Jablon, Bert
Prepared to assist students at Empire State College in developing learning contracts for the study of the economics of health care delivery, this study guide discusses various aspects of the topic, suggests student projects, and provides an extensive bibliography. First, introductory material discusses the relationship of economics to health care…
Research on Liquidity Risk Evaluation of Chinese A-Shares Market Based on Extension Theory
NASA Astrophysics Data System (ADS)
Bai-Qing, Sun; Peng-Xiang, Liu; Lin, Zhang; Yan-Ge, Li
This research defines the liquidity risk of the stock market in terms of matter-element theory and affair-element theory, establishes an indicator system for forewarning liquidity risks, and designs an early-warning model and process using the extension set method, the extension dependent function, and a comprehensive evaluation model. The paper then empirically studies the A-shares market using data from index 1A0001; the results show that the model describes the liquidity risk of China's A-share market well. Finally, corresponding policy recommendations are given.
NASA Astrophysics Data System (ADS)
Sword-Daniels, V. L.; Rossetto, T.; Wilson, T. M.; Sargeant, S.
2015-05-01
The essential services that support urban living are complex and interdependent, and their disruption in disasters directly affects society. Yet there are few empirical studies to inform our understanding of the vulnerabilities and resilience of complex infrastructure systems in disasters. This research takes a systems thinking approach to explore the dynamic behaviour of a network of essential services, in the presence and absence of volcanic ashfall hazards in Montserrat, West Indies. Adopting a case study methodology and qualitative methods to gather empirical data, we centre the study on the healthcare system and its interconnected network of essential services. We identify different types of relationship between sectors and develop a new interdependence classification system for analysis. Relationships are further categorised by hazard conditions, for use in extensive risk contexts. During heightened volcanic activity, relationships between systems transform in both number and type: connections increase across the network by 41%, and adapt to increase cooperation and information sharing. Interconnections add capacities to the network, increasing the resilience of prioritised sectors. This in-depth and context-specific approach provides a new methodology for studying the dynamics of infrastructure interdependence in an extensive risk context, and can be adapted for use in other hazard contexts.
Symbiotic empirical ethics: a practical methodology.
Frith, Lucy
2012-05-01
Like any discipline, bioethics is a developing field of academic inquiry; and recent trends in scholarship have been towards more engagement with empirical research. This 'empirical turn' has provoked extensive debate over how such 'descriptive' research carried out in the social sciences contributes to the distinctively normative aspect of bioethics. This paper will address this issue by developing a practical research methodology for the inclusion of data from social science studies into ethical deliberation. This methodology will be based on a naturalistic conception of ethical theory that sees practice as informing theory just as theory informs practice - the two are symbiotically related. From this engagement with practice, the ways that such theories need to be extended and developed can be determined. This is a practical methodology for integrating theory and practice that can be used in empirical studies, one that uses ethical theory both to explore the data and to draw normative conclusions. © 2010 Blackwell Publishing Ltd.
Rehabilitation Counselor Work Environment: Examining Congruence with Prototypic Work Personality
ERIC Educational Resources Information Center
Zanskas, Stephen; Strohmer, Douglas C.
2010-01-01
The profession of rehabilitation counseling has undergone extensive empirical study. Absent from this research has been a theoretical basis for describing and understanding the profession and its associated work environment. The focus of this study was to further our understanding of the nature of the rehabilitation counselor's work environment…
Characteristics of Academically Excellent Business Studies Students in a Post-1992 University
ERIC Educational Resources Information Center
Bennett, Roger; Barkensjo, Anna
2005-01-01
In contrast to the extensive investigation of the characteristics of students who fail or perform badly in "new" universities, research into the factors associated with academic excellence within post-1992 institutions has been sparse. This empirical study examined the profile of a sample of 81 high-flying business studies undergraduates…
Empirical Questionnaire Methods for Fund-Raising Campaign Preparedness in Extension
ERIC Educational Resources Information Center
Comley Adams, Catherine; Butler, Douglass A.
2017-01-01
Amid waning public financial support for Extension program offerings, highly strategic and professional fund-raising practices are necessary for gaining momentum among private philanthropists and closing the fiscal gap. University of Missouri Extension conducted a precampaign survey that invited feedback from stakeholders to inform Extension…
Empirical factors and structure transference: Returning to the London account
NASA Astrophysics Data System (ADS)
Bueno, Otávio; French, Steven; Ladyman, James
2012-05-01
We offer a framework to represent the roles of empirical and theoretical factors in theory construction, and examine a case study to illustrate how the framework can be used to illuminate central features of scientific reasoning. The case study provides an extension of French and Ladyman's (1997) analysis of Fritz and Heinz London's model of superconductivity to accommodate the role of the analogy between superconductivity and diamagnetic phenomena in the development of the model between 1935 and 1937. We focus on this case since it allows us to separate the roles of empirical and theoretical factors, and so provides an example of the utility of the approach that we have adopted. We conclude the paper by drawing on the particular framework here developed to address a range of concerns.
Technology in Gifted Education: A Review of Best Practices and Empirical Research
ERIC Educational Resources Information Center
Periathiruvadi, Sita; Rinn, Anne N.
2013-01-01
The article aims to explore the progress of technology use in gifted education and highlight the best practices and empirical research in this area. The literature on the use of technology with gifted students and their teachers has been extensive, with articles on best practices, but the empirical research in this area is still emerging. With the…
Gender Differences in Access to Extension Services and Agricultural Productivity
ERIC Educational Resources Information Center
Ragasa, Catherine; Berhane, Guush; Tadesse, Fanaye; Taffesse, Alemayehu Seyoum
2013-01-01
Purpose: This article contributes new empirical evidence and nuanced analysis on the gender difference in access to extension services and how this translates to observed differences in technology adoption and agricultural productivity. Approach: It looks at the case of Ethiopia, where substantial investments in the extension system have been…
Mathematical Modelling as a Professional Task
ERIC Educational Resources Information Center
Frejd, Peter; Bergsten, Christer
2016-01-01
Educational research literature on mathematical modelling is extensive. However, not much attention has been paid to empirical investigations of its scholarly knowledge from the perspective of didactic transposition processes. This paper reports from an interview study of mathematical modelling activities involving nine professional model…
ERIC Educational Resources Information Center
Halloran, Roberta Kathryn
2011-01-01
Self-regulation, executive function and working memory are areas of cognitive processing that have been studied extensively. Although many studies have examined the constructs, there is limited empirical support suggesting a formal link between the three cognitive processes and their prediction of academic achievement. Thus, the present study…
Group Performance in Information Systems Project Groups: An Empirical Study
ERIC Educational Resources Information Center
Bahli, Bouchaib; Buyukkurt, Meral Demirbag
2005-01-01
The importance of teamwork in Information Systems Development (ISD) practice and education has been acknowledged but not studied extensively to date. This paper tests a model of how groups participating in ISD projects perform and examines the relationships between some antecedents of this performance based on group research theory well…
ERIC Educational Resources Information Center
Dee, Thomas; Penner, Emily
2016-01-01
An extensive theoretical and qualitative literature stresses the promise of instructional practices and content aligned with the cultural experiences of minority students. Ethnic studies courses provide a growing but controversial example of such "culturally relevant pedagogy." However, the empirical evidence on the effectiveness of…
Dimensions of Early Speech Sound Disorders: A Factor Analytic Study
ERIC Educational Resources Information Center
Lewis, Barbara A.; Freebairn, Lisa A.; Hansen, Amy J.; Stein, Catherine M.; Shriberg, Lawrence D.; Iyengar, Sudha K.; Taylor, H. Gerry
2006-01-01
The goal of this study was to classify children with speech sound disorders (SSD) empirically, using factor analytic techniques. Participants were 3- to 7-year-olds enrolled in speech/language therapy (N = 185). Factor analysis of an extensive battery of speech and language measures provided support for two distinct factors, representing the skill…
Modeling the effects of study abroad programs on college students
Alvin H. Yu; Garry E. Chick; Duarte B. Morais; Chung-Hsien Lin
2009-01-01
This study explored the possibility of modeling the effects of a study abroad program on students from a university in the northeastern United States. A program effect model was proposed after conducting an extensive literature review and empirically examining a sample of 265 participants in 2005. Exploratory factor analysis (EFA), confirmatory factor analysis (CFA),...
The Role of Key Qualifications in the Transition from Vocational Education to Work
ERIC Educational Resources Information Center
van Zolingen, S. J.
2002-01-01
This study presents a new definition of key qualifications related to occupations based on an extensive literature search. The empirical aspect of this study describes a Delphi study focused on policy where a number of key qualifications were operationalized for three selected jobs: commercial employee at a bank, claims assessor or acceptor at an…
A glacier runoff extension to the Precipitation Runoff Modeling System
A. E. Van Beusekom; R. J. Viger
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...
Welch, Vivian; Jull, J; Petkovic, J; Armstrong, R; Boyer, Y; Cuervo, L G; Edwards, Sjl; Lydiatt, A; Gough, D; Grimshaw, J; Kristjansson, E; Mbuagbaw, L; McGowan, J; Moher, D; Pantoja, T; Petticrew, M; Pottie, K; Rader, T; Shea, B; Taljaard, M; Waters, E; Weijer, C; Wells, G A; White, H; Whitehead, M; Tugwell, P
2015-10-21
Health equity concerns the absence of avoidable and unfair differences in health. Randomized controlled trials (RCTs) can provide evidence about the impact of an intervention on health equity for specific disadvantaged populations or in general populations; this is important for equity-focused decision-making. Previous work has identified a lack of adequate reporting guidelines for assessing health equity in RCTs. The objective of this study is to develop guidelines to improve the reporting of health equity considerations in RCTs, as an extension of the Consolidated Standards of Reporting Trials (CONSORT). A six-phase study using integrated knowledge translation governed by a study executive and advisory board will assemble empirical evidence to inform the CONSORT-equity extension. To create the guideline, the following steps are proposed: (1) develop a conceptual framework for identifying "equity-relevant trials," (2) assess empirical evidence regarding reporting of equity-relevant trials, (3) consult with global methods and content experts on how to improve reporting of health equity in RCTs, (4) collect broad feedback and prioritize items needed to improve reporting of health equity in RCTs, (5) establish consensus on the CONSORT-equity extension: the guideline for equity-relevant trials, and (6) broadly disseminate and implement the CONSORT-equity extension. This work will be relevant to a broad range of RCTs addressing questions of effectiveness for strategies to improve practice and policy in the areas of social determinants of health, clinical care, health systems, public health, and international development, where health and/or access to health care is a primary outcome. The outcomes include a reporting guideline (CONSORT-equity extension) for equity-relevant RCTs and a knowledge translation strategy to broadly encourage its uptake and use by journal editors, authors, and funding agencies.
NASA Astrophysics Data System (ADS)
Elliott, R. M.; Gibson, R. A.; Carson, T. B.; Marasco, D. E.; Culligan, P. J.; McGillis, W. R.
2016-07-01
Green roofs have been utilized for urban stormwater management due to their ability to capture rainwater locally. Studies of the most common type, extensive green roofs, have demonstrated that green roofs can retain significant amounts of stormwater, but have also shown variation in seasonal performance. The purpose of this study is to determine how time of year impacts the hydrologic performance of extensive green roofs considering the covariates of antecedent dry weather period (ADWP), potential evapotranspiration (ET0) and storm event size. To do this, nearly four years of monitoring data from two full-scale extensive green roofs (with differing substrate depths of 100 mm and 31 mm) are analyzed. The annual performance is then modeled using a common empirical relationship between rainfall and green roof runoff, with the addition of Julian day in one approach, ET0 in another, and both ADWP and ET0 in a third approach. Together the monitoring and modeling results confirm that stormwater retention is highest in warmer months, the green roofs retain more rainfall with longer ADWPs, and the seasonal variations in behavior are more pronounced for the roof with the thinner media than the roof with the deeper media. Overall, the ability of seasonal accounting to improve stormwater retention modeling is demonstrated; modification of the empirical model to include ADWP and ET0 improves the model R² from 0.944 to 0.975 for the thinner roof, and from 0.866 to 0.870 for the deeper roof. Furthermore, estimating the runoff with the empirical approach was shown to be more accurate than using a water balance model, with model R² of 0.944 and 0.866 compared to 0.975 and 0.866 for the thinner and deeper roof, respectively. This finding is attributed to the difficulty of accurately parameterizing the water balance model.
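The covariate-augmented empirical model described above can be sketched as an ordinary least-squares regression of event runoff on rainfall, ADWP, and ET0. The functional form, coefficients, and data ranges below are illustrative assumptions for the sketch, not the study's fitted model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic storm events -- all values hypothetical, chosen only to
# illustrate the regression structure.
rain = rng.uniform(2.0, 50.0, n)    # event rainfall depth, mm
adwp = rng.uniform(0.5, 14.0, n)    # antecedent dry weather period, days
et0 = rng.uniform(0.5, 5.0, n)      # potential evapotranspiration, mm/day

# Assume runoff grows with rainfall and shrinks with ADWP and ET0
# (longer dry spells and higher ET0 free up substrate storage).
runoff = 0.8 * rain - 0.6 * adwp - 1.5 * et0 + rng.normal(0.0, 1.0, n)

# Ordinary least squares: runoff ~ 1 + rainfall + ADWP + ET0
X = np.column_stack([np.ones(n), rain, adwp, et0])
beta, *_ = np.linalg.lstsq(X, runoff, rcond=None)

pred = X @ beta
r2 = 1.0 - np.sum((runoff - pred) ** 2) / np.sum((runoff - runoff.mean()) ** 2)
```

Adding the ADWP and ET0 columns to the design matrix is the analogue of the study's third modeling approach; dropping them recovers the plain rainfall-runoff relationship.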
Fostering Effective Leadership in Foreign Contexts through Study of Cultural Values
ERIC Educational Resources Information Center
Schenck, Andrew D.
2016-01-01
While leadership styles have been extensively examined, cultural biases implicit within research methodologies often preclude application of results in foreign contexts. To more holistically comprehend the impact of culture on leadership, belief systems were empirically correlated to both transactional and transformational tendencies in public…
ERIC Educational Resources Information Center
Lloyd, Eva; Edmonds, Casey; Downs, Celony; Crutchley, Rebecca; Paffard, Fran
2017-01-01
The acquisition of everyday scientific concepts by 3-6-year-old children attending early childhood institutions has been widely studied. In contrast, research on science learning processes among younger children is less extensive. This paper reports on findings from an exploratory empirical study undertaken in a "stay and play" service…
Delay and Probability Discounting in Humans: An Overview
ERIC Educational Resources Information Center
McKerchar, Todd L.; Renda, C. Renee
2012-01-01
The purpose of this review is to introduce the reader to the concepts of delay and probability discounting as well as the major empirical findings to emerge from research with humans on these concepts. First, we review a seminal discounting study by Rachlin, Raineri, and Cross (1991) as well as an influential extension of this study by Madden,…
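The delay and probability discounting concepts reviewed above are typically fit with hyperbolic forms; a minimal sketch, with the rate parameters k and h as hypothetical values:

```python
def hyperbolic_value(amount, delay, k):
    """Subjective value of a delayed reward under the hyperbolic model
    V = A / (1 + k*D); k is a hypothetical discounting rate."""
    return amount / (1.0 + k * delay)

def probability_value(amount, p, h):
    """Probability discounting with odds against theta = (1 - p) / p,
    V = A / (1 + h*theta) -- the parallel form fit to probabilistic
    rewards in this literature."""
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

# A $100 reward delayed 30 days at k = 0.05/day:
v_delayed = hyperbolic_value(100, 30, 0.05)   # -> 40.0
# The same $100 at even odds (p = 0.5) with h = 1:
v_risky = probability_value(100, 0.5, 1.0)    # -> 50.0
```

Steeper discounting corresponds to larger k (or h); fitting these curves to indifference points is how the studies cited above estimate individual discounting rates.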
Future Orientation, School Contexts, and Problem Behaviors: A Multilevel Study
ERIC Educational Resources Information Center
Chen, Pan; Vazsonyi, Alexander T.
2013-01-01
The association between future orientation and problem behaviors has received extensive empirical attention; however, previous work has not considered school contextual influences on this link. Using a sample of N = 9,163 9th to 12th graders (51.0% females) from N = 85 high schools of the National Longitudinal Study of Adolescent Health, the…
An Empirical Study on Behavioural Intention to Reuse E-Learning Systems in Rural China
ERIC Educational Resources Information Center
Li, Yan; Duan, Yanqing; Fu, Zetian; Alford, Philip
2012-01-01
The learner's acceptance of e-learning systems has received extensive attention in prior studies, but how their experience of using e-learning systems impacts on their behavioural intention to reuse those systems has attracted limited research. As the applications of e-learning are still gaining momentum in developing countries, such as China,…
Strategic Planning, Recasts, Noticing, and L2 Development
ERIC Educational Resources Information Center
Hama, Mika
2012-01-01
Since the mid-1990s, the link between recasts and L2 development has been extensively tested, and the results from those studies have largely demonstrated that recasts have a positive effect on L2 learning. With this firm support from previous empirical evidence, studies have begun to focus on how recasts assist learning and under what conditions…
ERIC Educational Resources Information Center
Rimpiläinen, Sanna
2015-01-01
What do different research methods and approaches "do" in practice? The article seeks to discuss this point by drawing upon socio-material research approaches and empirical examples taken from the early stages of an extensive case study on an interdisciplinary project between two multidisciplinary fields of study, education and computer…
ERIC Educational Resources Information Center
Schmid, Richard F.; Bernard, Robert M.; Borokhovski, Eugene; Tamim, Rana; Abrami, Philip C.; Wade, C. Anne; Surkes, Michael A.; Lowerison, Gretchen
2009-01-01
This paper reports the findings of a Stage I meta-analysis exploring the achievement effects of computer-based technology use in higher education classrooms (non-distance education). An extensive literature search revealed more than 6,000 potentially relevant primary empirical studies. Analysis of a representative sample of 231 studies (k = 310)…
Literacy, Competence and Meaning-Making: A Human Sciences Approach
ERIC Educational Resources Information Center
Nikolajeva, Maria
2010-01-01
This semiotically informed article problematizes the concept of literacy as an aesthetic activity rather than reading skills and offers strategies for assessing young readers' understanding of fictional texts. Although not based on empirical research, the essay refers to and theorizes from extensive field studies of children's responses to…
Empirical Evaluation of Directional-Dependence Tests
ERIC Educational Resources Information Center
Thoemmes, Felix
2015-01-01
Testing of directional dependence is a method to infer causal direction that recently has attracted some attention. Previous examples by e.g. von Eye and DeShon (2012a) and extensive simulation studies by Pornprasertmanit and Little (2012) have demonstrated that under specific assumptions, directional-dependence tests can recover the true causal…
Child Psychotherapy Dropout: An Empirical Research Review
ERIC Educational Resources Information Center
Deakin, Elisabeth; Gastaud, Marina; Nunes, Maria Lucia Tiellet
2012-01-01
This study aims to discuss the most recent data about child psychotherapy dropout, especially child psychoanalytical psychotherapy. The authors also try to offer some possible alternatives to prevent such a phenomenon. The definition of "child psychotherapy dropout" is extensively discussed. The goal has been to attempt to create a standardised…
Information Sharing in the Field of Design Research
ERIC Educational Resources Information Center
Pilerot, Ola
2015-01-01
Introduction: This paper reports on an extensive research project which aimed at exploring information sharing activities in a scholarly context. The paper presents and synthesises findings from a literature review and three qualitative case studies. The empirical setting is a geographically distributed Nordic network of design scholars. Method:…
Relationship between Defenses, Personality, and Affect during a Stress Task in Normal Adolescents
ERIC Educational Resources Information Center
Steiner, Hans; Erickson, Sarah J.; MacLean, Peggy; Medic, Sanja; Plattner, Belinda; Koopman, Cheryl
2007-01-01
Objective: Although there are extensive data on the relationship between personality and stress reactivity in adults, there is little comparable empirical research with adolescents. This study examines the simultaneous relationships between long term functioning (personality, defenses) and observed stress reactivity (affect) in adolescents.…
Creativity and Flow in Musical Composition: An Empirical Investigation
ERIC Educational Resources Information Center
MacDonald, Raymond; Byrne, Charles; Carlton, Lana
2006-01-01
Although an extensive literature exists on creativity and music, there is a lack of published research investigating possible links between musical creativity and Csikszentmihalyi's concept of flow or optimal experience. This article examines a group composition task to study the relationships between creativity, flow and the quality of the…
A study of multiplex data bus techniques for the space shuttle
NASA Technical Reports Server (NTRS)
Kearney, R. J.; Kalange, M. A.
1972-01-01
A comprehensive technology base for the design of a multiplexed data bus subsystem is provided. Extensive analyses, both analytical and empirical, were performed. Subjects covered are classified under the following headings: requirements identification and analysis; transmission media studies; signal design and detection studies; synchronization, timing, and control studies; user-subsystem interface studies; operational reliability analyses; design of candidate data bus configurations; and evaluation of candidate data bus designs.
ERIC Educational Resources Information Center
Matson, Johnny L.; Kozlowski, Alison M.; Worley, Julie A.; Shoemaker, Mary E.; Sipes, Megan; Horovitz, Max
2011-01-01
An extensive literature on the causes of challenging behaviors has been developed, primarily in the applied behavior analysis literature. One hundred and seventy-three empirical studies were reviewed where functional assessment serves as the primary method of identifying these causes. Most of the studies were able to identify a clear function or…
Does Grade Inflation Affect the Credibility of Grades? Evidence from US Law School Admissions
ERIC Educational Resources Information Center
Wongsurawat, Winai
2009-01-01
While the nature and causes of university grade inflation have been extensively studied, little empirical research on the consequence of this phenomenon is currently available. The present study uses data for 48 US law schools to analyze admission decisions in 1995, 2000, and 2007, a period during which university grade inflation appears to have…
ERIC Educational Resources Information Center
Gilmore, Linda; Cuskelly, Monica; Browning, Melissa
2015-01-01
The main purpose of the current study was to provide empirical evidence to support or refute assumptions of phenotypic deficits in motivation for children with Down syndrome (DS). Children with moderate intellectual disability (MID) associated with etiologies other than DS were recruited in an extension of a previous study that involved children…
ERIC Educational Resources Information Center
Mak, Jennifer Y.; Cheung, Siu-Yin; King, Carina C.; Lam, Eddie T. C.
2016-01-01
There have been extensive studies of local residents' perception and reaction to the impacts of mega events. However, there is limited empirical research on the social impacts that shape foreign attitudes toward the host country. The purpose of this study was to develop and validate the Olympic Games Attitude Scale (OGAS) to examine viewers'…
Training Older Adults about Alzheimer's Disease--Impact on Knowledge and Fear
ERIC Educational Resources Information Center
Scerri, Anthony; Scerri, Charles
2017-01-01
Although the impact of Alzheimer's disease training programs directed to informal and formal caregivers has been extensively studied, programs for older adults who do not have the disease are relatively few. Moreover, increased knowledge increases fear of the disease, even though there is little empirical evidence to support this. This study…
ERIC Educational Resources Information Center
St. Pierre, Tena L.; Kaltreider, D. Lynne
2004-01-01
Despite availability of empirically supported school-based substance abuse prevention programs, adoption and implementation fidelity of such programs appear to be low. A replicated case study was conducted to investigate school adoption and implementation processes of the EXSELS model (Project ALERT delivered by program leaders through Cooperative…
Building More Solid Bridges between Buddhism and Western Psychology
ERIC Educational Resources Information Center
Sugamura, Genji; Haruki, Yutaka; Koshikawa, Fusako
2007-01-01
Introducing the ways of cultivating mental balance, B. A. Wallace and S. L. Shapiro attempted to build bridges between Buddhism and psychology. Their systematic categorization of Buddhist teachings and extensive review of empirical support from Western psychology are valuable for future study. However, it remains a matter of concern that some more…
A Social Psychological Exploration of Power Motivation Among Disadvantaged Workers.
ERIC Educational Resources Information Center
Levitin, Teresa Ellen
An extensive review of the literature on the social psychology of social power led to the conclusion that the area contains many unrelated, noncumulative theoretical and empirical works. Three conceptual distinctions were introduced to facilitate the systematic study of social power. Effectance motivation was used to describe the joint, often…
Can Multifactor Models of Teaching Improve Teacher Effectiveness Measures?
ERIC Educational Resources Information Center
Lazarev, Valeriy; Newman, Denis
2014-01-01
NCLB waiver requirements have led to development of teacher evaluation systems, in which student growth is a significant component. Recent empirical research has been focusing on metrics of student growth--value-added scores in particular--and their relationship to other metrics. An extensive set of recent teacher-evaluation studies conducted by…
Parenting Behaviour among Parents of Children with Autism Spectrum Disorder
ERIC Educational Resources Information Center
Lambrechts, Greet; Van Leeuwen, Karla; Boonen, Hannah; Maes, Bea; Noens, Ilse
2011-01-01
In contrast to the extensive body of empirical findings on parental perceptions, parenting cognitions, and coping in families with a child with autism spectrum disorder (ASD), research on parenting itself is very scarce. A first goal of this study was to examine the factor structure and internal consistency of two scales to measure parenting…
An operational GLS model for hydrologic regression
Tasker, Gary D.; Stedinger, J.R.
1989-01-01
Recent Monte Carlo studies have documented the value of generalized least squares (GLS) procedures to estimate empirical relationships between streamflow statistics and physiographic basin characteristics. This paper presents a number of extensions of the GLS method that deal with realities and complexities of regional hydrologic data sets that were not addressed in the simulation studies. These extensions include: (1) a more realistic model of the underlying model errors; (2) smoothed estimates of cross correlation of flows; (3) procedures for including historical flow data; (4) diagnostic statistics describing leverage and influence for GLS regression; and (5) the formulation of a mathematical program for evaluating future gaging activities. © 1989.
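The GLS estimator underlying the abstract above can be sketched as follows. The data and the error covariance structure are invented for illustration, not drawn from Tasker and Stedinger's study.

```python
import numpy as np

# Generalized least squares: beta_hat = (X' W X)^-1 X' W y with W = Omega^-1,
# where Omega is the error covariance matrix (assumed known here).
def gls(X, y, omega):
    """GLS coefficient estimates for y = X @ beta + e, Cov(e) = omega."""
    xtw = X.T @ np.linalg.inv(omega)
    return np.linalg.solve(xtw @ X, xtw @ y)

# Toy data: an intercept plus one basin characteristic predicting a
# streamflow statistic, with equicorrelated errors (illustrative only).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
omega = 0.1 * np.eye(50) + 0.05          # independent + shared error variance
e = rng.multivariate_normal(np.zeros(50), omega)
y = X @ np.array([2.0, 0.5]) + e
beta_hat = gls(X, y, omega)
```

Unlike ordinary least squares, the weighting by `Omega^-1` downweights observations whose errors are correlated, which is the motivation for GLS in regional hydrologic regression.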
A License to Produce? Farmer Interpretations of the New Food Security Agenda
ERIC Educational Resources Information Center
Fish, Rob; Lobley, Matt; Winter, Michael
2013-01-01
Drawing on the findings of empirical research conducted in the South West of England, this paper explores how farmers make sense of re-emerging imperatives for "food security" in UK policy and political discourse. The analysis presented is based on two types of empirical inquiry. First, an extensive survey of 1543 farmers, exploring the…
ERIC Educational Resources Information Center
Spennemann, Dirk H. R.; Atkinson, John; Cornforth, David
2007-01-01
Most universities have invested in extensive infrastructure in the form of computer laboratories and computer kiosks. However, is this investment justified when it is suggested that students work predominantly from home using their own computers? This paper provides an empirical study investigating how students at a regional multi-campus…
ERIC Educational Resources Information Center
Nesset, Valerie
2015-01-01
Introduction: As part of a larger study in 2006 of the information-seeking behaviour of third-grade students in Montreal, Quebec, Canada, a model of their information-seeking behaviour was developed. To further improve the model, an extensive examination of the literature into information-seeking behaviour and information literacy was conducted…
ERIC Educational Resources Information Center
Cheung, Ronnie; Vogel, Doug
2013-01-01
Collaborative technologies support group work in project-based environments. In this study, we enhance the technology acceptance model to explain the factors that influence the acceptance of Google Applications for collaborative learning. The enhanced model was empirically evaluated using survey data collected from 136 students enrolled in a…
The Effect of Automobile Safety on Vehicle Type Choice: An Empirical Study.
ERIC Educational Resources Information Center
McCarthy, Patrick S.
An analysis was made of the extent to which the safety characteristics of new vehicles affect consumer purchase decisions. Using an extensive data set that combines vehicle data collected by the Automobile Club of Southern California Target Car Program with the responses from a national household survey of new car buyers, a statistical model of…
An Automated Individual Feedback and Marking System: An Empirical Study
ERIC Educational Resources Information Center
Barker, Trevor
2011-01-01
The recent National Students Survey showed that feedback to students was an ongoing problem in Higher Education. This paper reports on the extension of our past research into the provision of automated feedback for objective testing. In the research presented here, the system has been further developed for marking practical and essay questions and…
ERIC Educational Resources Information Center
Inoue, Chihiro
2016-01-01
The constructs of complexity, accuracy and fluency (CAF) have been used extensively to investigate learner performance on second language tasks. However, a serious concern is that the variables used to measure these constructs are sometimes used conventionally without any empirical justification. It is crucial for researchers to understand how…
The Past, Present and Future of Geodemographic Research in the United States and United Kingdom
Singleton, Alexander D.; Spielman, Seth E.
2014-01-01
This article presents an extensive comparative review of the emergence and application of geodemographics in both the United States and United Kingdom, situating them as an extension of earlier empirically driven models of urban socio-spatial structure. The empirical and theoretical basis for this generalization technique is also considered. Findings demonstrate critical differences in both the application and development of geodemographics between the United States and United Kingdom resulting from their diverging histories, variable data economies, and availability of academic or free classifications. Finally, current methodological research is reviewed, linking this discussion prospectively to the changing spatial data economy in both the United States and United Kingdom. PMID:25484455
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-15
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PF10-5-000] Empire Pipeline, Inc.; Notice of Intent To Prepare an Environmental Assessment for the Planned Tioga County Extension Project, Request for Comments on Environmental Issues, and Notice of Public Scoping Meeting April 7, 2010. The staff of the Federal Energy...
ERIC Educational Resources Information Center
De Rosa, Marcello; Bartoli, Luca
2017-01-01
Purpose: The aim of the paper is to evaluate how advisory services stimulate the adoption of rural development policies (RDP) aiming at value creation. Design/methodology/approach: By linking the use of agricultural extension services (AES) to policies for value creation, we will put forward an empirical analysis in Italy, with the aim of…
NASA Astrophysics Data System (ADS)
Levy, M. C.; Thompson, S. E.; Cohn, A.
2014-12-01
Land use/cover change (LUCC) has occurred extensively in the Brazilian Amazon rainforest-savanna transition. Agricultural development-driven LUCC at regional scales can alter surface energy budgets, evapotranspiration (ET) and rainfall; these hydroclimatic changes impact streamflows, and thus hydropower. To date, there is only limited empirical understanding of these complex land-water-energy nexus dynamics, yet understanding is important to developing countries where both agriculture and hydropower are expanding and intensifying. To observe these changes and their interconnections, we synthesize a novel combination of ground network, remotely sensed, and empirically modeled data for LUCC, rainfall, flows, and hydropower potential. We connect the extensive temporal and spatial trends in LUCC occurring from 2000-2012 (and thus observable in the satellite record) to long-term historical flow records and run-of-river hydropower generation potential estimates. Changes in hydrologic condition are observed in terms of dry and wet season moments, extremes, and flow duration curves. Run-of-river hydropower generation potential is modeled at basin gauge points using equation models parameterized with literature-based low-head turbine efficiencies, and simple algorithms establishing optimal head and capacity from elevation and flows, respectively. Regression analyses are used to demonstrate a preliminary causal analysis of LUCC impacts to flow and energy, and discuss extension of the analysis to ungauged basins. The results are transferable to tropical and transitional forest regions worldwide where simultaneous agricultural and hydropower development potentially compete for coupled components of regional water cycles, and where policy makers and planners require an understanding of LUCC impacts to hydroclimate-dependent industries and ecosystems.
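The run-of-river generation potential mentioned above rests on the standard hydropower equation P = η·ρ·g·Q·H. The sketch below uses that textbook formula with illustrative numbers, not values or turbine parameters from the study.

```python
# Run-of-river hydropower potential: P = eta * rho * g * Q * H,
# with discharge Q in m^3/s and hydraulic head H in m.
def hydropower_potential_kw(flow_m3s, head_m, efficiency=0.85):
    """Power in kilowatts for a given flow, head, and turbine efficiency."""
    rho, g = 1000.0, 9.81   # water density (kg/m^3), gravity (m/s^2)
    return efficiency * rho * g * flow_m3s * head_m / 1000.0

# Example: 10 m^3/s through a 5 m low-head site at 85% turbine efficiency.
p_kw = hydropower_potential_kw(10.0, 5.0)  # about 417 kW
```

Because potential scales linearly with flow, any LUCC-driven change in streamflow translates directly into a proportional change in run-of-river generation at a fixed site.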
Modern Experience in City Combat
1987-03-01
Extensive media coverage of Beirut served to erode both domestic and international support. Surprise. As in military operations on other terrain... social reasons which constrain military actions to some degree. At the same time an empirical study has little difficulty in distinguishing in most...Review, 38-39. Bureau of Applied Social Research, Columbia University. (1953). Korean urbanization: Past development and future potentials. Maxwell
The Role of Status in Producing Depressed Entitlement in Women's and Men's Pay Allocations
ERIC Educational Resources Information Center
Hogue, Mary; Yoder, Janice D.
2003-01-01
Extensive empirical evidence confirms a depressed entitlement effect wherein women pay themselves less than men for comparable work and believe the allocation fair. The present study tests the hypothesis that status subordination linked to being female underlies at least some of this effect. A 2 x 3 design crossed 180 undergraduates' gender with a…
[The neurodynamic core of consciousness and neural Darwinism].
Ibáñez, A
In recent decades, the scientific study of consciousness within the cognitive neurosciences has come to be considered one of the greatest challenges of contemporary science. Gerald Edelman's theory of consciousness is one of the most promising and controversial perspectives. This theory stands out for its approach to topics usually avoided by other neurophysiologic theories of consciousness, such as the neurophysiologic explanation of qualia. The goal of this paper is to review the dynamic core theory of consciousness, presenting the main features of the theory, analyzing its explanatory strategies and empirical extensions, and elaborating some critical considerations about the possibility of the neuroscientific study of qualia. The central and additional theoretical components are analyzed, emphasizing its ontological, restrictive and explanatory assumptions. The properties of conscious phenomena and their cerebral correlates as advanced by the theory are described, and finally its experiments and empirical extensions are examined. The explanatory strategies of the theory are analyzed, based on conceptual isomorphism between the phenomenological properties and the neurophysiologic and mathematical measures. Some criticisms can be raised about the limitations of the dynamic core theory, especially regarding its account of the so-called 'hard problem' of consciousness, or qualia.
More than associations: an ideomotor perspective on mirror neurons.
Brass, Marcel; Muhle-Karbe, Paul S
2014-04-01
In this commentary, we propose an extension of the associative approach of mirror neurons, namely, ideomotor theory. Ideomotor theory assumes that actions are controlled by anticipatory representations of their sensory consequences. As we outline below, this extension is necessary to clarify a number of empirical observations that are difficult to explain from a purely associative perspective.
Causal Responsibility and Counterfactuals
Lagnado, David A; Gerstenberg, Tobias; Zultan, Ro'i
2013-01-01
How do people attribute responsibility in situations where the contributions of multiple agents combine to produce a joint outcome? The prevalence of over-determination in such cases makes this a difficult problem for counterfactual theories of causal responsibility. In this article, we explore a general framework for assigning responsibility in multiple agent contexts. We draw on the structural model account of actual causation (e.g., Halpern & Pearl, 2005) and its extension to responsibility judgments (Chockler & Halpern, 2004). We review the main theoretical and empirical issues that arise from this literature and propose a novel model of intuitive judgments of responsibility. This model is a function of both pivotality (whether an agent made a difference to the outcome) and criticality (how important the agent is perceived to be for the outcome, before any actions are taken). The model explains empirical results from previous studies and is supported by a new experiment that manipulates both pivotality and criticality. We also discuss possible extensions of this model to deal with a broader range of causal situations. Overall, our approach emphasizes the close interrelations between causality, counterfactuals, and responsibility attributions. PMID:23855451
An Initial Model of Requirements Traceability an Empirical Study
1992-09-22
procedures have been used extensively in the study of human problem-solving, including such areas as general problem-solving behavior, physics problem...been doing unless you have traceability." "Humans don't go back to the requirements enough." "Traceability should be extremely helpful with...by constraints on its usage: ("Traceability needs to be something that humans can work with, not just a whip held over people." "Traceability should
ERIC Educational Resources Information Center
Mukala, Patrick; Cerone, Antonio; Turini, Franco
2017-01-01
Free/Libre Open Source Software (FLOSS) environments are increasingly described as learning environments where practical software engineering skills can be acquired. Numerous studies have extensively investigated how knowledge is acquired in these environments through a collaborative learning model that defines a learning process. Such a learning…
ERIC Educational Resources Information Center
De Grip, Andries; Sauermann, Jan
2013-01-01
Although the transfer of on-the-job training to the workplace belongs to the realm of educational research, it is also highly related to labour economics. In the economic literature, the transfer of training is based on the theoretical framework of human capital theory and has been extensively analysed empirically in econometric studies that take…
ERIC Educational Resources Information Center
Tilak, Jandhyala B. G.
An extensive survey of empirical research on education as related to poverty, growth, and income distribution is presented, with the focus on 21 developing nations. The study uses the latest available data on alternative measures of income distribution, income shares of various population groups by income classes, and poverty ratios. The analysis…
Tsallis’ non-extensive free energy as a subjective value of an uncertain reward
NASA Astrophysics Data System (ADS)
Takahashi, Taiki
2009-03-01
Recent studies in neuroeconomics and econophysics revealed the importance of reward expectation in decision under uncertainty. Behavioral neuroeconomic studies have proposed that the unpredictability and the probability of an uncertain reward are distinctly encoded as entropy and a distorted probability weight, respectively, in the separate neural systems. However, previous behavioral economic and decision-theoretic models could not quantify reward-seeking and uncertainty aversion in a theoretically consistent manner. In this paper, we have: (i) proposed that generalized Helmholtz free energy in Tsallis’ non-extensive thermostatistics can be utilized to quantify a perceived value of an uncertain reward, and (ii) empirically examined the explanatory powers of the models. Future study directions in neuroeconomics and econophysics by utilizing the Tsallis’ free energy model are discussed.
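As a rough sketch of the idea in the abstract above: Tsallis entropy quantifies the unpredictability of a gamble, and a free-energy-style value subtracts an entropy penalty from the expected reward. The functional form, the parameter values, and the names below are assumptions for illustration, not Takahashi's exact model.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis non-extensive entropy S_q = (1 - sum_i p_i**q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):              # the q -> 1 limit recovers Shannon entropy
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def subjective_value(rewards, probs, q=0.8, temperature=0.5):
    """Free-energy-style value: expected reward minus an entropy penalty
    for unpredictability (sketch only, not the paper's exact equation)."""
    rewards, probs = np.asarray(rewards, float), np.asarray(probs, float)
    return float(rewards @ probs - temperature * tsallis_entropy(probs, q))

# An uncertainty-averse agent values a 50/50 gamble below its expected value of 50:
v = subjective_value([100.0, 0.0], [0.5, 0.5])
```

The sign of the temperature parameter controls whether the agent is uncertainty-averse (positive) or uncertainty-seeking (negative), which is how such models separate reward-seeking from uncertainty attitudes.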
Terror management theory applied clinically: implications for existential-integrative psychotherapy.
Lewis, Adam M
2014-01-01
Existential psychotherapy and Terror Management Theory (TMT) offer explanations for the potential psychological effects of death awareness, although their respective literature bases differ in clarity, research, and implications for treating psychopathology. Existential therapy is often opaque to many therapists, in part due to the lack of consensus on what constitutes its practice, limited published practical examples, and few empirical studies examining its efficacy. By contrast, TMT has an extensive empirical literature base, both within social psychology and spanning multiple disciplines, although previously unexplored within clinical and counseling psychology. This article explores the implications of a proposed TMT-integrated existential therapy (TIE), bridging the gap between disciplines in order to meet the needs of the aging population and current challenges facing existential therapists.
Terrorism as a process: a critical review of Moghaddam's "Staircase to Terrorism".
Lygre, Ragnhild B; Eid, Jarle; Larsson, Gerry; Ranstorp, Magnus
2011-12-01
This study reviews empirical evidence for Moghaddam's model "Staircase to Terrorism," which portrays terrorism as a process of six consecutive steps culminating in terrorism. An extensive literature search, where 2,564 publications on terrorism were screened, resulted in 38 articles which were subject to further analysis. The results showed that while most of the theories and processes linked to Moghaddam's model are supported by empirical evidence, the proposed transitions between the different steps are not. These results may question the validity of a linear stepwise model and may suggest that a combination of mechanisms/factors could combine in different ways to produce terrorism. © 2011 The Authors. Scandinavian Journal of Psychology © 2011 The Scandinavian Psychological Associations.
Nonparametric spirometry reference values for Hispanic Americans.
Glenn, Nancy L; Brown, Vanessa M
2011-02-01
Recent literature cites ethnic origin as a major factor in developing pulmonary function reference values. Extensive studies established reference values for European and African Americans, but not for Hispanic Americans. The Third National Health and Nutrition Examination Survey defines Hispanic as individuals of Spanish-speaking cultures. While no group was excluded from the target population, sample size requirements only allowed inclusion of individuals who identified themselves as Mexican Americans. This research constructs nonparametric reference value confidence intervals for Hispanic American pulmonary function. The method is applicable to all ethnicities. We use empirical likelihood confidence intervals to establish normal ranges for reference values. Its major advantage is that it is model-free, yet it shares the asymptotic properties of model-based methods. Statistical comparisons indicate that empirical likelihood interval lengths are comparable to those of normal-theory intervals. Power and efficiency studies agree with previously published theoretical results.
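A minimal sketch of a model-free reference range is given below, with a bootstrap confidence interval standing in for the empirical likelihood intervals the abstract describes (empirical likelihood proper requires constrained optimization and is not reproduced here). The simulated FEV1 values are invented for illustration, not real survey data.

```python
import numpy as np

def reference_interval(values, lower=2.5, upper=97.5):
    """Model-free reference range: empirical percentiles of observed values."""
    return np.percentile(np.asarray(values, dtype=float), [lower, upper])

def bootstrap_percentile_ci(values, pct, n_boot=500, alpha=0.05, seed=0):
    """Bootstrap confidence interval for one percentile; a simple stand-in
    for the empirical likelihood intervals described in the abstract."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    boots = [np.percentile(rng.choice(values, size=values.size), pct)
             for _ in range(n_boot)]
    return np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Simulated FEV1 values in litres (illustrative only).
rng = np.random.default_rng(1)
fev1 = rng.normal(3.0, 0.5, size=400)
ref_low, ref_high = reference_interval(fev1)
ci_low = bootstrap_percentile_ci(fev1, 2.5)
```

Like the empirical likelihood approach, the percentile and bootstrap methods make no distributional assumption, which is what allows a single procedure to be applied across ethnic groups.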
NASA Astrophysics Data System (ADS)
Xu, M., III; Liu, X.
2017-12-01
Over the past 60 years, both the runoff and the sediment load in the Yellow River Basin have shown significant decreasing trends owing to the influences of human activities and climate change. Quantifying the impact of each factor (e.g. precipitation, sediment-trapping dams, pasture, terracing) on runoff and sediment load is among the key issues for guiding the implementation of water and soil conservation measures and for predicting future trends. Hundreds of methods have been developed for studying runoff and sediment load in the Yellow River Basin. Generally, these methods can be classified into empirical methods and physically based models. The empirical methods, including the hydrological method and the soil and water conservation method, are widely used in Yellow River management engineering. These methods generally apply statistical analyses, such as regression, to build empirical relationships between the main characteristic variables in a river basin. The elasticity method, used extensively in hydrological research, can also be classified as an empirical method, as it is mathematically equivalent to the hydrological method. Physically based models mainly include conceptual models and distributed models. Conceptual models are usually lumped models (e.g. the SYMHD model) and can be regarded as a transition between empirical models and distributed models. The literature shows that fewer studies have applied distributed models than empirical models, as the runoff and sediment load simulations from distributed models (e.g. the Digital Yellow Integrated Model and the Geomorphology-Based Hydrological Model) have usually been unsatisfactory owing to the intensive human activities in the Yellow River Basin.
Therefore, this study primarily summarizes the empirical models applied in the Yellow River Basin and theoretically analyzes the main causes of the significantly different results obtained with different empirical methods. In addition, we put forward an assessment framework for methods of studying runoff and sediment load variations in the Yellow River Basin, covering input data, model structure, and result output. The framework was then applied to the Huangfuchuan River.
Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B
2006-08-01
Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
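The borrowing-of-information idea in the abstract above can be sketched as a normal-normal, precision-weighted update. This is a generic empirical Bayes shrinkage illustration with invented numbers, not the authors' exact statistic.

```python
def empirical_bayes_update(z_study, var_study, prior_scores, prior_vars):
    """Precision-weighted (normal-normal) update of one study's linkage
    score using scores from other genome scans as prior information.
    Generic shrinkage sketch, not the paper's exact method."""
    # Prior: precision-weighted mean of the other scans' scores.
    prior_prec = sum(1.0 / v for v in prior_vars)
    prior_mean = sum(z / v for z, v in zip(prior_scores, prior_vars)) / prior_prec
    # Posterior: combine the study's own score with the prior.
    post_prec = 1.0 / var_study + prior_prec
    post_mean = (z_study / var_study + prior_mean * prior_prec) / post_prec
    return post_mean, 1.0 / post_prec

# A strong signal in one scan is shrunk toward weaker signals elsewhere,
# and the combined estimate has smaller variance than any single scan:
post_mean, post_var = empirical_bayes_update(3.5, 1.0, [1.0, 1.5], [1.0, 1.0])
```

Note that the update does not require the scans to share a marker map: each study contributes only a score and a variance at the location of interest, which mirrors the flexibility claimed in the abstract.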
ERIC Educational Resources Information Center
Desjardins, Richard
2013-01-01
This study considers the extensive critique of the impact of the "market" or "neoliberal" model on learning and its outcomes in the light of alternative models. The purpose is to consider the potential impacts of the market on learning and its outcomes and to contextualise critique by considering alternative coordination…
Spencer, James Herbert
2013-04-01
The literature on development has focused on the concept of transition in understanding the emergent challenges facing poor but rapidly developing countries. Scholars have focused extensively on the health and urban transitions associated with this change and, in particular, its use for understanding emerging infectious diseases. However, few have developed explicit empirical measures to quantify the extent to which a transitions focus is useful for theory, policy, and practice. Using open source data on avian influenza in 2004 and 2005 and the Vietnam Census of Population and Housing, this paper introduces the Kuznets curve as a tool for empirically estimating transition and disease. Findings suggest that the Kuznets curve is a viable tool for empirically assessing the role of transitional dynamics in the emergence of new infectious diseases.
Understanding medication compliance and persistence from an economics perspective.
Elliott, Rachel A; Shinogle, Judith A; Peele, Pamela; Bhosle, Monali; Hughes, Dyfrig A
2008-01-01
An increased understanding of the reasons for noncompliance and lack of persistence with prescribed medication is an important step to improve treatment effectiveness, and thus patient health. Explanations have been attempted from epidemiological, sociological, and psychological perspectives. Economic models (utility maximization, time preferences, health capital, bilateral bargaining, stated preference, and prospect theory) may contribute to the understanding of medication-taking behavior. Economic models are applied to medication noncompliance. Traditional consumer choice models under a budget constraint do apply to medication-taking behavior in that increased prices cause decreased utilization. Nevertheless, empiric evidence suggests that budget constraints are not the only factor affecting consumer choice around medicines. Examination of time preference models suggests that the intuitive association between time preference and medication compliance has not been investigated extensively, and has not been proven empirically. The health capital model has theoretical relevance, but has not been applied to compliance. Bilateral bargaining may present an alternative model to concordance of the patient-prescriber relationship, taking account of game-playing by either party. Nevertheless, there is limited empiric evidence to test its usefulness. Stated preference methods have been applied most extensively to medicines use. Evidence suggests that patients' preferences are consistently affected by side effects, and that preferences change over time, with age and experience. Prospect theory attempts to explain how new information changes risk perceptions and associated behavior but has not been applied empirically to medication use. Economic models of behavior may contribute to the understanding of medication use, but more empiric work is needed to assess their applicability.
Ternès, Nils; Rotolo, Federico; Michiels, Stefan
2016-07-10
Correct selection of prognostic biomarkers among multiple candidates is becoming increasingly challenging as the dimensionality of biological data becomes higher. Therefore, minimizing the false discovery rate (FDR) is of primary importance, while a low false negative rate (FNR) is a complementary measure. The lasso is a popular selection method in Cox regression, but its results depend heavily on the penalty parameter λ. Usually, λ is chosen using the maximum cross-validated log-likelihood (max-cvl). However, this method often has a very high FDR. We review methods for a more conservative choice of λ. We propose an empirical extension of the cvl by adding a penalization term, which trades off between the goodness of fit and the parsimony of the model, leading to the selection of fewer biomarkers and, as we show, to a reduction of the FDR without a large increase in FNR. We conducted a simulation study considering null and moderately sparse alternative scenarios and compared our approach with the standard lasso and 10 other competitors: Akaike information criterion (AIC), corrected AIC, Bayesian information criterion (BIC), extended BIC, Hannan and Quinn information criterion (HQIC), risk information criterion (RIC), one-standard-error rule, adaptive lasso, stability selection, and percentile lasso. Our extension achieved the best compromise across all scenarios between a reduction of the FDR and a limited rise in the FNR, followed by the AIC, the RIC, and the adaptive lasso, which performed well in some settings. We illustrate the methods using gene expression data from 523 breast cancer patients. In conclusion, we propose to apply our extension to the lasso whenever a stringent FDR with a limited FNR is targeted. Copyright © 2016 John Wiley & Sons, Ltd.
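The trade-off the abstract describes, maximizing cross-validated log-likelihood versus penalizing model size, can be sketched numerically. The λ grid, cvl values, selection counts, and the penalty weight 0.5 below are all invented for illustration; this is not the paper's actual criterion or data:

```python
import numpy as np

# Hypothetical lambda grid with cross-validated log-likelihoods (cvl)
# and the number of biomarkers the lasso selects at each lambda.
lambdas = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
cvl     = np.array([-100.0, -100.5, -101.5, -104.0, -110.0])
n_sel   = np.array([40, 25, 12, 5, 1])

# Standard rule: pick the lambda maximizing cvl (tends to over-select).
lam_max_cvl = lambdas[np.argmax(cvl)]

# Penalized rule in the spirit of the paper's extension: trade goodness
# of fit against parsimony (the weight 0.5 is an arbitrary illustration).
lam_penalized = lambdas[np.argmax(cvl - 0.5 * n_sel)]

print(lam_max_cvl, lam_penalized)
```

Because the penalty charges each selected biomarker, the penalized rule settles on a larger λ and hence a sparser model, which is the mechanism behind the FDR reduction reported in the abstract.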
An empirical potential for simulating vacancy clusters in tungsten.
Mason, D R; Nguyen-Manh, D; Becquart, C S
2017-12-20
We present an empirical interatomic potential for tungsten, particularly well suited for simulations of vacancy-type defects. We compare energies and structures of vacancy clusters generated with the empirical potential with an extensive new database of values computed using density functional theory, and show that the new potential predicts low-energy defect structures and formation energies with high accuracy. A significant difference from other popular embedded-atom empirical potentials for tungsten is the correct prediction of surface energies. Interstitial properties and short-range pairwise behaviour remain similar to the Ackland-Thetford potential on which it is based, making this potential well suited to simulations of microstructural evolution following irradiation damage cascades. Using atomistic kinetic Monte Carlo simulations, we predict vacancy cluster dissociation in the range 1100-1300 K, the temperature range generally associated with stage IV recovery.
Network accessibility & the evolution of urban employment.
DOT National Transportation Integrated Search
2011-06-01
This research examines the impact of accessibility on the growth of employment centers in the Los Angeles Region between 1980 and 2000. There is extensive empirical documentation of polycentricity, the presence of multiple concentrations of em...
Darwin and Evolutionary Psychology
ERIC Educational Resources Information Center
Ghiselin, Michael T.
1973-01-01
Darwin's views on various psychological behaviors were significant. Basing his conclusions on empirical research, he wrote extensively on the phylogeny of behavior, emotional expression, sexual selection, instincts, evolution of morals, ontogeny of behavior, and genetics of behavior. (PS)
Adler, Jonathan M; Chin, Erica D; Kolisetty, Aiswarya P; Oltmanns, Thomas F
2012-08-01
While identity disturbance has long been considered one of the defining features of Borderline Personality Disorder (BPD), the present study marks only the third empirical investigation to assess it and the first to do so from the perspective of research on narrative identity. Drawing on the rich tradition of studying narrative identity, the present study examined identity disturbance in a group of 40 mid-life adults, 20 with features of BPD and a matched sample of 20 without BPD. Extensive life story interviews were analyzed for a variety of narrative elements and the themes of agency, communion fulfillment (but not communion), and narrative coherence significantly distinguished the stories of those people with features of BPD from those without the disorder. In addition, associations between the theme of agency and psychopathology were evident six and twelve months following the life story interview. This study seeks to bridge the mutually-informative fields of research on personality disorders and normal identity processes.
Disorders without borders: current and future directions in the meta-structure of mental disorders.
Carragher, Natacha; Krueger, Robert F; Eaton, Nicholas R; Slade, Tim
2015-03-01
Classification is the cornerstone of clinical diagnostic practice and research. However, the extant psychiatric classification systems are not well supported by research evidence. In particular, extensive comorbidity among putatively distinct disorders flags an urgent need for fundamental changes in how we conceptualize psychopathology. Over the past decade, research has coalesced on an empirically based model that suggests many common mental disorders are structured according to two correlated latent dimensions: internalizing and externalizing. We review and discuss the development of a dimensional-spectrum model which organizes mental disorders in an empirically based manner. We also touch upon changes in the DSM-5 and put forward recommendations for future research endeavors. Our review highlights substantial empirical support for the empirically based internalizing-externalizing model of psychopathology, which provides a parsimonious means of addressing comorbidity. As future research goals, we suggest that the field would benefit from: expanding the meta-structure of psychopathology to include additional disorders, development of empirically based thresholds, inclusion of a developmental perspective, and intertwining genomic and neuroscience dimensions with the empirical structure of psychopathology.
Modeling thermal sensation in a Mediterranean climate—a comparison of linear and ordinal models
NASA Astrophysics Data System (ADS)
Pantavou, Katerina; Lykoudis, Spyridon
2014-08-01
A simple thermo-physiological model of outdoor thermal sensation, adjusted with psychological factors, is developed with the aim of predicting thermal sensation in Mediterranean climates. Microclimatic measurements, together with interviews on personal and psychological conditions, were carried out in a square, a street canyon and a coastal location of the greater urban area of Athens, Greece. Multiple linear and ordinal regression were applied to estimate thermal sensation using either all recorded parameters or specific, empirically selected subsets, producing so-called extensive and empirical models, respectively. Meteorological, thermo-physiological and overall models, the last of these also considering psychological factors, were developed. Predictions improved when personal and psychological factors were taken into account, compared to meteorological models. The model based on ordinal regression reproduced extreme values of the thermal sensation vote more adequately than the linear regression one, while the empirical model produced satisfactory results relative to the extensive model. The effects of adaptation and expectation on the thermal sensation vote were introduced into the models by means of exposure time, season and preferences related to air temperature and irradiation. The assessment of thermal sensation could be a useful criterion in decision making regarding public health, outdoor space planning and tourism.
An empirically based model for knowledge management in health care organizations.
Sibbald, Shannon L; Wathen, C Nadine; Kothari, Anita
2016-01-01
Knowledge management (KM) encompasses strategies, processes, and practices that allow an organization to capture, share, store, access, and use knowledge. Ideal KM combines different sources of knowledge to support innovation and improve performance. Despite the importance of KM in health care organizations (HCOs), there has been very little empirical research to describe KM in this context. This study explores KM in HCOs, focusing on the status of current intraorganizational KM. The intention is to provide insight for future studies and model development for effective KM implementation in HCOs. A qualitative methods approach was used to create an empirically based model of KM in HCOs. Methods included (a) qualitative interviews (n = 24) with senior leadership to identify types of knowledge important in these roles plus current information-seeking behaviors/needs and (b) in-depth case study with leaders in new executive positions (n = 2). The data were collected from 10 HCOs. Our empirically based model for KM was assessed for face and content validity. The findings highlight the paucity of formal KM in our sample HCOs. Organizational culture, leadership, and resources are instrumental in supporting KM processes. An executive's knowledge needs are extensive, but knowledge assets are often limited or difficult to acquire as much of the available information is not in a usable format. We propose an empirically based model for KM to highlight the importance of context (internal and external), and knowledge seeking, synthesis, sharing, and organization. Participants who reviewed the model supported its basic components and processes, and potential for incorporating KM into organizational processes. Our results articulate ways to improve KM, increase organizational learning, and support evidence-informed decision-making. 
This research has implications for how to better integrate evidence and knowledge into organizations while considering context and the role of organizational processes.
Conceptual analyses of extensible booms to support a solar sail
NASA Technical Reports Server (NTRS)
Crawford, R. F.; Benton, M. D.
1977-01-01
Extensible booms which could function as the diagonal spars and central mast of an 800 meter square, non-rotating Solar Sailing Vehicle were conceptually designed and analyzed. The boom design concept that was investigated is an extensible lattice boom which is stowed and deployed by elastically coiling and uncoiling its continuous longerons. The seven different free-span lengths in each spar which would minimize the total weights of the spars and mast were determined. Boom weights were calculated by using a semi-empirical formulation which related the overall weight of a boom to the weight of its longerons.
Effect of treatment in a constructed wetland on toxicity of textile wastewater
Baughman, G.L.; Perkins, W.S.; Lasier, P.J.; Winger, P.V.
2003-01-01
Constructed wetlands for treating wastewater have proliferated in recent years and their characteristics have been studied extensively. In most cases, constructed wetlands have been used primarily for removal of nutrients and heavy metals. Extensive literature is available concerning construction and use of wetlands for treatment of wastewater. Even so, quantitative descriptions of wetland function and processes are highly empirical and difficult to extrapolate. The processes involved in removal of pollutants by wetlands are poorly understood, especially for waste streams as complex as textile effluents. The few studies conducted on treatment of textile wastewater in constructed wetlands were cited in earlier publications. Results of a two-year study of a full-scale wetland treating textile effluent are presented here. The paper describes the effects of the wetland on aquatic toxicity of the wastewater and draws conclusions about the utility and limitations of constructed wetlands for treatment of textile effluents.
Some empirical evidence for ecological dissonance theory.
Miller, D I; Verhoek-Miller, N; Giesen, J M; Wells-Parker, E
2000-04-01
Using Festinger's cognitive dissonance theory as a model, the extension to Barker's ecological theory, referred to as ecological dissonance theory, was developed. Designed to examine the motivational dynamics involved when environmental systems are in conflict with each other or with cognitive systems, ecological dissonance theory yielded five propositions that were tested in 10 studies. This summary of the studies suggests that operationally defined measures of ecological dissonance may correlate with workers' job satisfaction, job involvement, alienation from their work and, to a lesser extent, their conflict resolution behavior and communication style.
Earth orbital teleoperator systems evaluation
NASA Technical Reports Server (NTRS)
Shields, N. L., Jr.; Slaughter, P. H.; Brye, R. G.; Henderson, D. E.
1979-01-01
The mechanical extension of the human operator to remote and specialized environments poses a series of complex operational questions. A technical and scientific team was organized to investigate these questions through conducting specific laboratory and analytical studies. The intent of the studies was to determine the human operator requirements for remotely manned systems and to determine the particular effects that various system parameters have on human operator performance. In so doing, certain design criteria based on empirically derived data concerning the ultimate control system, the human operator, were added to the Teleoperator Development Program.
Taking Innovation To Scale In Primary Care Practices: The Functions Of Health Care Extension.
Ono, Sarah S; Crabtree, Benjamin F; Hemler, Jennifer R; Balasubramanian, Bijal A; Edwards, Samuel T; Green, Larry A; Kaufman, Arthur; Solberg, Leif I; Miller, William L; Woodson, Tanisha Tate; Sweeney, Shannon M; Cohen, Deborah J
2018-02-01
Health care extension is an approach to providing external support to primary care practices with the aim of diffusing innovation. EvidenceNOW was launched to rapidly disseminate and implement evidence-based guidelines for cardiovascular preventive care in the primary care setting. Seven regional grantee cooperatives provided the foundational elements of health care extension (technological and quality improvement support, practice capacity building, and linking with community resources) to more than two hundred primary care practices in each region. This article describes how the cooperatives varied in their approaches to extension and provides early empirical evidence that health care extension is a feasible and potentially useful approach for providing quality improvement support to primary care practices. With investment, health care extension may be an effective platform for federal and state quality improvement efforts to create economies of scale and provide practices with more robust and coordinated support services.
Empirical evaluation of neutral interactions in host-parasite networks.
Canard, E F; Mouquet, N; Mouillot, D; Stanko, M; Miklisova, D; Gravel, D
2014-04-01
While niche-based processes have been invoked extensively to explain the structure of interaction networks, recent studies propose that neutrality could also be of great importance. Under the neutral hypothesis, network structure would simply emerge from random encounters between individuals and thus would be directly linked to species abundance. We investigated the impact of species abundance distributions on qualitative and quantitative metrics of 113 host-parasite networks. We analyzed the concordance between neutral expectations and empirical observations at interaction, species, and network levels. We found that species abundance accurately predicts network metrics at all levels. Despite host-parasite systems being constrained by physiology and immunology, our results suggest that neutrality could also explain, at least partially, their structure. We hypothesize that trait matching would determine potential interactions between species, while abundance would determine their realization.
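The neutral hypothesis in the abstract, network structure emerging from random encounters in proportion to species abundance, can be sketched as a simple null model. The lognormal abundances, community sizes, and encounter count below are invented purely for illustration, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical lognormal abundances for hosts and parasites.
hosts = rng.lognormal(sigma=1.0, size=20)
parasites = rng.lognormal(sigma=1.0, size=30)

# Under neutrality, the probability of an encounter between a host
# species and a parasite species is proportional to the product of
# their abundances.
p = np.outer(hosts, parasites)
p /= p.sum()

# Draw a fixed number of random individual encounters.
encounters = rng.multinomial(2000, p.ravel()).reshape(20, 30)

# Connectance of the resulting network: fraction of species pairs
# between which at least one encounter was realized.
connectance = (encounters > 0).mean()
print(round(float(connectance), 3))
```

Comparing metrics of such abundance-driven random networks with the observed host-parasite networks is the kind of concordance test the study performs at the interaction, species, and network levels.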
NASA Astrophysics Data System (ADS)
Nightingale, James; Wang, Qi; Grecos, Christos; Goma, Sergio
2014-02-01
High Efficiency Video Coding (HEVC), the latest video compression standard (also known as H.265), can deliver video streams of comparable quality to the current H.264 Advanced Video Coding (H.264/AVC) standard with a 50% reduction in bandwidth. Research into SHVC, the scalable extension to the HEVC standard, is still in its infancy. One important area for investigation is whether, given the greater compression ratio of HEVC (and SHVC), the loss of packets containing video content will have a greater impact on the quality of delivered video than is the case with H.264/AVC or its scalable extension H.264/SVC. In this work we empirically evaluate the layer-based, in-network adaptation of video streams encoded using SHVC in situations where dynamically changing bandwidths and datagram loss ratios require the real-time adaptation of video streams. Through extensive experimentation, we establish a comprehensive set of benchmarks for SHVC-based high-definition video streaming in loss-prone network environments such as those commonly found in mobile networks. Among other results, we highlight that packet losses of only 1% can lead to a substantial reduction in PSNR of over 3 dB and to error propagation in over 130 pictures following the one in which the loss occurred. This work is among the earliest studies in this area to report benchmark evaluation results for the effects of datagram loss on SHVC picture quality and to offer empirical and analytical insights into SHVC adaptation to lossy, mobile networking conditions.
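PSNR, the metric behind the abstract's 3 dB figure, is a simple function of the mean squared error between a reference frame and the decoded frame. A minimal sketch for 8-bit pictures (the toy frames below are random noise, not real SHVC output):

```python
import numpy as np

def psnr(reference, degraded, peak=255.0):
    """Peak signal-to-noise ratio in dB between two 8-bit frames."""
    mse = np.mean((reference.astype(np.float64) - degraded.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Toy 64x64 frames: a reference and a mildly corrupted copy.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noisy = np.clip(ref.astype(int) + rng.integers(-5, 6, size=ref.shape), 0, 255)
print(round(psnr(ref, noisy), 1))
```

Since PSNR is logarithmic in the MSE, a drop of about 3 dB corresponds roughly to a doubling of the mean squared error, which is why the reported 1% packet loss represents a substantial quality degradation.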
Theory of earthquakes interevent times applied to financial markets
NASA Astrophysics Data System (ADS)
Jagielski, Maciej; Kutner, Ryszard; Sornette, Didier
2017-10-01
We analyze the probability density function (PDF) of waiting times between financial loss exceedances. The empirical PDFs are fitted with the self-excited Hawkes conditional Poisson process with a long power law memory kernel. The Hawkes process is the simplest extension of the Poisson process that takes into account how past events influence the occurrence of future events. By analyzing the empirical data for 15 different financial assets, we show that the formalism of the Hawkes process used for earthquakes can successfully model the PDF of interevent times between successive market losses.
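A self-excited Hawkes process with a power-law memory kernel, the model the abstract fits to loss interevent times, can be simulated with Ogata's thinning algorithm. The parameter values below are arbitrary, chosen only so that the branching ratio stays below one (a stable, subcritical process):

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_hawkes(mu=0.5, alpha=0.3, c=1.0, beta=1.5, t_max=200.0):
    """Ogata thinning for a Hawkes process with power-law kernel
    phi(t) = alpha * (t + c)**(-beta). Thinning with the current
    intensity as the bound is valid here because the conditional
    intensity is non-increasing between events."""
    events, t = [], 0.0
    while True:
        # Current intensity bounds the intensity until the next event.
        lam_bar = mu + sum(alpha * (t - s + c) ** (-beta) for s in events)
        t += rng.exponential(1.0 / lam_bar)
        if t >= t_max:
            break
        lam_t = mu + sum(alpha * (t - s + c) ** (-beta) for s in events)
        if rng.uniform() <= lam_t / lam_bar:  # accept with prob lam(t)/lam_bar
            events.append(t)
    return np.array(events)

times = simulate_hawkes()
interevent = np.diff(times)  # the quantity whose PDF the paper studies
print(len(times))
```

The empirical distribution of `interevent` from such simulations is what gets compared against the waiting times between observed financial loss exceedances.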
Shear in high strength concrete bridge girders : technical report.
DOT National Transportation Integrated Search
2013-04-01
Prestressed Concrete (PC) I-girders are used extensively as the primary superstructure components in Texas highway bridges. A simple semi-empirical equation was developed at the University of Houston (UH) to predict the shear strength of PC I-girde...
Quantifying patterns of research interest evolution
NASA Astrophysics Data System (ADS)
Jia, Tao; Wang, Dashun; Szymanski, Boleslaw
Changing and shifting research interest is an integral part of a scientific career. Despite extensive investigations of various factors that influence a scientist's choice of research topics, quantitative assessments of mechanisms that give rise to macroscopic patterns characterizing research interest evolution of individual scientists remain limited. Here we perform a large-scale analysis of extensive publication records, finding that research interest change follows a reproducible pattern characterized by an exponential distribution. We identify three fundamental features responsible for the observed exponential distribution, which arise from a subtle interplay between exploitation and exploration in research interest evolution. We develop a random walk based model, which adequately reproduces our empirical observations. Our study presents one of the first quantitative analyses of macroscopic patterns governing research interest change, documenting a high degree of regularity underlying scientific research and individual careers.
Schohl, Kirsten A; Van Hecke, Amy V; Carson, Audrey Meyer; Dolan, Bridget; Karst, Jeffrey; Stevens, Sheryl
2014-03-01
This study aimed to evaluate the Program for the Education and Enrichment of Relational Skills (PEERS: Laugeson et al. in J Autism Dev Disord 39(4):596-606, 2009). PEERS focuses on improving friendship quality and social skills among adolescents with higher-functioning ASD. Fifty-eight participants aged 11-16 years were randomly assigned to either an immediate treatment or a waitlist comparison group. Results revealed that, in comparison to the waitlist group, the treatment group significantly improved their knowledge of PEERS concepts and friendship skills, increased their number of get-togethers, and decreased in their levels of social anxiety, core autistic symptoms, and problem behaviors from pre- to post-PEERS. This study provides the first independent replication and extension of the empirically supported PEERS social skills intervention for adolescents with ASD.
Sinabro: A Smartphone-Integrated Opportunistic Electrocardiogram Monitoring System
Kwon, Sungjun; Lee, Dongseok; Kim, Jeehoon; Lee, Youngki; Kang, Seungwoo; Seo, Sangwon; Park, Kwangsuk
2016-01-01
In our preliminary study, we proposed a smartphone-integrated, unobtrusive electrocardiogram (ECG) monitoring system, Sinabro, which monitors a user’s ECG opportunistically during daily smartphone use without explicit user intervention. The proposed system also monitors ECG-derived features, such as heart rate (HR) and heart rate variability (HRV), to support the pervasive healthcare apps for smartphones based on the user’s high-level contexts, such as stress and affective state levels. In this study, we have extended the Sinabro system by: (1) upgrading the sensor device; (2) improving the feature extraction process; and (3) evaluating extensions of the system. We evaluated these extensions with a good set of algorithm parameters that were suggested based on empirical analyses. The results showed that the system could capture ECG reliably and extract highly accurate ECG-derived features with a reasonable rate of data drop during the user’s daily smartphone use. PMID:26978364
Empirical Tryout of a New Statistic for Detecting Temporally Inconsistent Responders.
Kerry, Matthew J
2018-01-01
Statistical screening of self-report data is often advised to support the quality of analyzed responses, for example by reducing insufficient effort responding (IER). One recently introduced index, based on Mahalanobis's D for detecting outliers in cross-sectional designs, replaces centered scores with difference scores between repeated-measure items and is termed person temporal consistency (D²ptc). Although the adapted D²ptc index demonstrated usefulness in simulation datasets, it has not been applied to empirical data. The current study addresses D²ptc's low uptake by critically appraising its performance across three empirical applications. Independent samples were selected to represent a range of scenarios commonly encountered by organizational researchers. First, in Sample 1, a repeated measure of future time perspective (FTP) in experienced working adults (age >40 years; n = 620) indicated that temporal inconsistency was significantly related to respondent age and item reverse-scoring. Second, in repeated measures of team efficacy aggregations, D²ptc successfully detected team-level inconsistency across repeated performance cycles. Third, the usefulness of D²ptc was examined in an experimental study dataset of subjective life expectancy, which indicated significantly more stable responding in experimental conditions compared to controls. The empirical findings support D²ptc's flexible and useful application to distinct study designs. Discussion centers on current limitations and further extensions that may be of value to psychologists screening self-report data to strengthen response quality and the meaningfulness of inferences from repeated-measures self-reports. Taken together, the findings support the usefulness of the newly devised statistic for detecting IER and other extreme response patterns.
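The adaptation described here swaps the centered scores in the usual Mahalanobis D² for test-retest difference scores. A minimal numpy sketch on synthetic data (all values invented, with one deliberately inconsistent responder planted in row 0):

```python
import numpy as np

rng = np.random.default_rng(3)
n, items = 100, 6
time1 = rng.normal(size=(n, items))
time2 = time1 + rng.normal(scale=0.3, size=(n, items))  # mostly consistent
time2[0] = rng.normal(size=items)                       # inconsistent responder

d = time2 - time1                  # difference scores, not centered scores
centered = d - d.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(d, rowvar=False))

# Squared Mahalanobis distance of each person's difference-score vector:
# large values flag temporally inconsistent responders.
d2_ptc = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
print(int(np.argmax(d2_ptc)))
```

Flagged respondents can then be inspected or excluded before substantive analysis, which is the screening use case the study evaluates.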
Raeven, Vivian M; Spoorenberg, Simone M C; Boersma, Wim G; van de Garde, Ewoudt M W; Cannegieter, Suzanne C; Voorn, G P Paul; Bos, Willem Jan W; van Steenbergen, Jim E
2016-06-17
Microorganisms causing community-acquired pneumonia (CAP) can be categorised into viral, typical and atypical (Legionella species, Coxiella burnetii, Mycoplasma pneumoniae, and Chlamydia species). Extensive microbiological testing to identify the causative microorganism is not routinely recommended, and empiric treatment does not always cover atypical pathogens. To improve epidemiologic knowledge of CAP and to guide empiric antibiotic choice, we investigated whether atypical microorganisms are associated with a particular season or with the patient characteristics age, gender, or chronic obstructive pulmonary disease (COPD). A data analysis was performed on databases from four prospective studies, all of which included adult patients hospitalised with CAP in the Netherlands (N = 980) and performed extensive microbiological testing. A main causative agent was identified in 565/980 (57.7 %) patients. Of these, 117 (20.7 %) were atypical microorganisms. This percentage was 40.4 % (57/141) during the non-respiratory season (week 20 to week 39, early May to early October), and 67.2 % (41/61) for patients under the age of 60 during this season. Factors associated with atypical causative agents were: CAP acquired in the non-respiratory season (odds ratio (OR) 4.3, 95 % CI 2.68-6.84), age <60 years (OR 2.9, 95 % CI 1.83-4.66), male gender (OR 1.7, 95 % CI 1.06-2.71) and absence of COPD (OR 0.2, 95 % CI 0.12-0.52). Atypical causative agents in CAP are thus associated with the non-respiratory season, age <60 years, male gender and absence of COPD. Therefore, to maximise its yield, extensive microbiological testing should be considered in patients <60 years old who are admitted with CAP from early May to early October. NCT00471640 , NCT00170196 (numbers of original studies).
Equal Work, Unequal Pay: Gender Discrimination within Work-Similar Occupations.
ERIC Educational Resources Information Center
Kemp, Alice Abel; Beck, E. M.
1986-01-01
Describes an empirical method to identify work-similar occupations using selected measures from the Dictionary of Occupational Titles. Examines male-female earnings differences within a group of work-similar occupations and finds that discrimination against females is extensive. (Author/CH)
The influence of tie strength on evolutionary games on networks: An empirical investigation
NASA Astrophysics Data System (ADS)
Buesser, Pierre; Peña, Jorge; Pestelacci, Enea; Tomassini, Marco
2011-11-01
Extending previous work on unweighted networks, we present here a systematic numerical investigation of standard evolutionary games on weighted networks. In the absence of any reliable model for generating weighted social networks, we attribute weights to links in a few ways supported by empirical data ranging from totally uncorrelated to weighted bipartite networks. The results of the extensive simulation work on standard complex network models show that, except in a case that does not seem to be common in social networks, taking the tie strength into account does not change in a radical manner the long-run steady-state behavior of the studied games. Besides model networks, we also included a real-life case drawn from a coauthorship network. In this case also, taking the weights into account only changes the results slightly with respect to the raw unweighted graph, although to draw more reliable conclusions on real social networks many more cases should be studied as these weighted networks become available.
Goldstein, Naomi E. S.; Kemp, Kathleen A.; Leff, Stephen S.; Lochman, John E.
2014-01-01
The use of manual-based interventions tends to improve client outcomes and promote replicability. With an increasingly strong link between funding and the use of empirically supported prevention and intervention programs, manual development and adaptation have become research priorities. As a result, researchers and scholars have generated guidelines for developing manuals from scratch, but there are no extant guidelines for adapting empirically supported, manualized prevention and intervention programs for use with new populations. Thus, this article proposes step-by-step guidelines for the manual adaptation process. It also describes two adaptations of an extensively researched anger management intervention to exemplify how an empirically supported program was systematically and efficiently adapted to achieve similar outcomes with vastly different populations in unique settings. PMID:25110403
Gene-Environment Interplay in Twin Models
Hatemi, Peter K.
2013-01-01
In this article, we respond to Shultziner’s critique that argues that identical twins are more alike not because of genetic similarity, but because they select into more similar environments and respond to stimuli in comparable ways, and that these effects bias twin model estimates to such an extent that they are invalid. The essay further argues that the theory and methods that undergird twin models, as well as the empirical studies which rely upon them, are unaware of these potential biases. We correct this and other misunderstandings in the essay and find that gene-environment (GE) interplay is a well-articulated concept in behavior genetics and political science, operationalized as gene-environment correlation and gene-environment interaction. Both are incorporated into interpretations of the classical twin design (CTD) and estimated in numerous empirical studies through extensions of the CTD. We then conduct simulations to quantify the influence of GE interplay on estimates from the CTD. Due to the criticism’s mischaracterization of the CTD and GE interplay, combined with the absence of any empirical evidence to counter what is presented in the extant literature and this article, we conclude that the critique does not enhance our understanding of the processes that drive political traits, genetic or otherwise. PMID:24808718
The Ambiguity of Artworks –A Guideline for Empirical Aesthetics Research with Artworks as Stimuli
Hayn-Leichsenring, Gregor U.
2017-01-01
The aim of this work is to provide researchers from the field of aesthetics with a guideline on working with artworks as stimuli. Empirical aesthetics research is complicated by the uncertainty of the object of research. There is no way to unquestionably tell whether an object is an artwork or not. However, although the extension of the term artwork (i.e., the range of objects to which this concept applies) remains vague, the different intensions of the term artwork (i.e., the internal concept that constitutes a formal definition) are well defined. Here, I review the various concepts of artworks (i.e., intensions) that scientists from different fields use in current research in empirical aesthetics. The selection of stimuli is often not explained and/or does not match the focus of the study. An application of two or more intensions within one study leads to an indeterminacy of the stimuli and, thus, to systematic problems concerning the interpretation and comparability of the experimental results. Based on these intensions and the Pleasure-Interest Model of Aesthetic Liking (Graf and Landwehr, 2015), I compiled a decision tree in order to provide researchers with an instrument that allows better control over their stimuli. PMID:29123494
Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang
2011-02-01
This study develops a set of appropriate performance evaluation indices, based mainly on the balanced scorecard (BSC), for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Adequate performance evaluation indices were selected through literature review and consultation with experts who have practical experience in extension education. The decision making trial and evaluation laboratory (DEMATEL) and the analytic network process (ANP) were then used, respectively, to establish the causal relationships among the four BSC perspectives and the relative weights of the evaluation indices. Building on these results, an empirical performance evaluation of the extension education centers of three universities in Taoyuan County, Taiwan, is illustrated by applying VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). The analysis indicates that "Learning and growth" is the most influential factor, affecting the other three perspectives. In addition, the "Internal process" and "Financial" perspectives play important roles in the performance evaluation of extension education centers. The top three key performance indices are "After-sales service", "Turnover volume", and "Net income". The proposed evaluation model can serve as a reference for extension education centers in universities to prioritize improvements on the key performance indices after performing VIKOR analyses. 2010 Elsevier Ltd. All rights reserved.
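The VIKOR compromise-ranking step described above can be sketched as follows. The decision matrix and weights below are hypothetical illustrations, not the study's actual indices or ANP-derived weights, and all criteria are assumed benefit-type.

```python
# Minimal VIKOR sketch (hypothetical data, not the study's indices or weights).
# Rows: alternatives (e.g., three extension education centers); columns: criteria.

def vikor(matrix, weights, v=0.5):
    """Rank alternatives; lower Q is better. All criteria assumed benefit-type."""
    ncrit = len(matrix[0])
    best = [max(row[j] for row in matrix) for j in range(ncrit)]
    worst = [min(row[j] for row in matrix) for j in range(ncrit)]
    S, R = [], []
    for row in matrix:
        d = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
             for j in range(ncrit)]
        S.append(sum(d))   # group utility (weighted sum of regrets)
        R.append(max(d))   # individual regret (worst single criterion)
    s_best, s_worst = min(S), max(S)
    r_best, r_worst = min(R), max(R)
    # Q blends group utility and individual regret; v weights the strategy.
    Q = [v * (S[i] - s_best) / (s_worst - s_best)
         + (1 - v) * (R[i] - r_best) / (r_worst - r_best)
         for i in range(len(matrix))]
    return Q

scores = [[7, 8, 6], [9, 6, 7], [6, 7, 9]]   # hypothetical center scores
weights = [0.4, 0.35, 0.25]                   # hypothetical criterion weights
print(vikor(scores, weights))
```

The alternative with the smallest Q is the compromise solution; in practice the weights would come from the DEMATEL/ANP stage described in the abstract.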
Taking Innovation To Scale In Primary Care Practices: The Functions Of Health Care Extension
Ono, Sarah S.; Crabtree, Benjamin F.; Hemler, Jennifer R.; Balasubramanian, Bijal A.; Edwards, Samuel T.; Green, Larry A.; Kaufman, Arthur; Solberg, Leif I.; Miller, William L.; Woodson, Tanisha Tate; Sweeney, Shannon M.; Cohen, Deborah J.
2018-01-01
Health care extension is an approach to providing external support to primary care practices with the aim of diffusing innovation. EvidenceNOW was launched to rapidly disseminate and implement evidence-based guidelines for cardiovascular preventive care in the primary care setting. Seven regional grantee cooperatives provided the foundational elements of health care extension—technological and quality improvement support, practice capacity building, and linking with community resources—to more than two hundred primary care practices in each region. This article describes how the cooperatives varied in their approaches to extension and provides early empirical evidence that health care extension is a feasible and potentially useful approach for providing quality improvement support to primary care practices. With investment, health care extension may be an effective platform for federal and state quality improvement efforts to create economies of scale and provide practices with more robust and coordinated support services. PMID:29401016
Rapid decay in the relative efficiency of quarantine to halt epidemics in networks
NASA Astrophysics Data System (ADS)
Strona, Giovanni; Castellano, Claudio
2018-02-01
Several recent studies have tackled the issue of optimal network immunization by providing efficient criteria to identify key nodes to be removed in order to break apart a network, thus preventing the occurrence of extensive epidemic outbreaks. Yet, although the efficiency of those criteria has been demonstrated also in empirical networks, preventive immunization is rarely applied to real-world scenarios, where the usual approach is the a posteriori attempt to contain epidemic outbreaks using quarantine measures. Here we compare the efficiency of prevention with that of quarantine in terms of the tradeoff between the number of removed and saved nodes on both synthetic and empirical topologies. We show how, consistent with common sense, but contrary to common practice, in many cases preventing is better than curing: depending on network structure, rescuing an infected network by quarantine could become inefficient soon after the first infection.
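The preventive-immunization side of the comparison above can be sketched as a toy discrete-time SIR simulation on a synthetic network. The Erdős-Rényi graph, infection probability, and hub-removal rule here are illustrative assumptions, not the authors' setup, and the quarantine arm is omitted for brevity.

```python
# Toy sketch: preventive removal of high-degree nodes before an SIR outbreak.
# Network model, parameters, and removal rule are illustrative assumptions.
import random

def er_graph(n, p, rng):
    """Build an Erdos-Renyi random graph as an adjacency dict."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def sir(adj, removed, beta, rng, seed_node=0):
    """Discrete-time SIR; `removed` nodes cannot be infected.
    Returns the number of nodes ever infected."""
    if seed_node in removed:
        return 0
    infected, recovered = {seed_node}, set()
    while infected:
        new = set()
        for i in infected:
            for j in adj[i]:
                if j not in infected and j not in recovered and j not in removed:
                    if rng.random() < beta:
                        new.add(j)
        recovered |= infected   # infectious for one step, then recovered
        infected = new
    return len(recovered)

rng = random.Random(42)
g = er_graph(200, 0.03, rng)
hubs = sorted(g, key=lambda i: len(g[i]), reverse=True)[:20]  # top-degree nodes
no_action = sir(g, set(), 0.2, random.Random(1))
prevented = sir(g, set(hubs), 0.2, random.Random(1))
print(no_action, prevented)
```

Comparing the outbreak size with and without hub removal gives one point of the removed-versus-saved tradeoff the study quantifies; a quarantine strategy would instead remove nodes after the outbreak has begun.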
Rosen, Baruch; Tepper, Yotam; Bar-Oz, Guy
2018-01-01
Metric data of 6th century CE pigeons from the Negev Desert, Israel, are employed to test competing hypotheses on flock management strategies: that directed selection for size or shape took place under intensive management; or, alternatively, that stabilizing selection was a stronger determinant of size and shape under extensive management conditions. The results of the analysis support the second hypothesis by demonstrating that the Byzantine Negev pigeons were like wild pigeon (Columba livia) in shape, albeit small-sized. The inferred extensive management system is then discussed in the context of pigeon domestication and human micro-ecologies in marginal regions. PMID:29561880
A comparison of four streamflow record extension techniques
Hirsch, Robert M.
1982-01-01
One approach to developing time series of streamflow, which may be used for simulation and optimization studies of water resources development activities, is to extend an existing gage record in time by exploiting the interstation correlation between the station of interest and some nearby (long-term) base station. Four methods of extension are described, and their properties are explored. The methods are regression (REG), regression plus noise (RPN), and two new methods, maintenance of variance extension types 1 and 2 (MOVE.1, MOVE.2). MOVE.1 is equivalent to a method which is widely used in psychology, biometrics, and geomorphology and which has been called by various names, e.g., 'line of organic correlation,' 'reduced major axis,' 'unique solution,' and 'equivalence line.' The methods are examined for bias and standard error of estimate of moments and order statistics, and an empirical examination is made of the preservation of historic low-flow characteristics using 50-year-long monthly records from seven streams. The REG and RPN methods are shown to have serious deficiencies as record extension techniques. MOVE.2 is shown to be marginally better than MOVE.1, according to the various comparisons of bias and accuracy.
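The MOVE.1 technique named above (the line of organic correlation, or reduced major axis) can be sketched as follows. Unlike ordinary least-squares regression, it preserves the variance of the short record when extending it. The flow values here are synthetic, not data from the study.

```python
# Sketch of MOVE.1 (line of organic correlation / reduced major axis).
# Unlike OLS regression, the estimates preserve the short record's variance.
# All flow values are synthetic illustrations.
import statistics as st

def move1(short, base_concurrent, base_extension):
    """Extend `short` using a nearby base-station record.

    short           -- flows at the short-record station (concurrent period)
    base_concurrent -- base-station flows for the same concurrent period
    base_extension  -- base-station flows for the period to be estimated
    """
    my, sy = st.mean(short), st.stdev(short)
    mx, sx = st.mean(base_concurrent), st.stdev(base_concurrent)
    # Only the sign of the correlation matters; slope magnitude is sy/sx.
    cov = sum((a - my) * (b - mx) for a, b in zip(short, base_concurrent))
    sign = 1.0 if cov >= 0 else -1.0
    return [my + sign * (sy / sx) * (x - mx) for x in base_extension]

short = [12.0, 20.0, 15.0, 30.0, 22.0]   # synthetic monthly flows
base = [10.0, 18.0, 14.0, 27.0, 21.0]    # concurrent base-station record
estimates = move1(short, base, [16.0, 25.0])
print(estimates)
```

Applying the transfer to the concurrent base record reproduces the short record's standard deviation exactly, which is the "maintenance of variance" property that makes MOVE.1 preferable to REG for simulating flow characteristics.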
[The Antonine plague: A global pestilence in the second century AD].
Sáez, Andrés
2016-04-01
The Antonine plague was the first pestilence to affect the Western world globally. It touched every aspect of life in the Roman Empire: economics, politics, religion, and culture. Specialists place the mortality rate at about 10% of the population. Moreover, the cultural and territorial unity of the Roman Empire helped spread the plague, much as a comparable pandemic could spread in our own society. In conclusion, it is argued that the epidemic was global both in its geographical extension and in its effects on the population.
Rolls, David A.; Wang, Peng; McBryde, Emma; Pattison, Philippa; Robins, Garry
2015-01-01
We compare two broad types of empirically grounded random network models in terms of their abilities to capture both network features and simulated Susceptible-Infected-Recovered (SIR) epidemic dynamics. The types of network models are exponential random graph models (ERGMs) and extensions of the configuration model. We use three kinds of empirical contact networks, chosen to provide both variety and realistic patterns of human contact: a highly clustered network, a bipartite network and a snowball sampled network of a “hidden population”. In the case of the snowball sampled network we present a novel method for fitting an edge-triangle model. In our results, ERGMs consistently capture clustering as well or better than configuration-type models, but the latter models better capture the node degree distribution. Despite the additional computational requirements to fit ERGMs to empirical networks, the use of ERGMs provides only a slight improvement in the ability of the models to recreate epidemic features of the empirical network in simulated SIR epidemics. Generally, SIR epidemic results from using configuration-type models fall between those from a random network model (i.e., an Erdős-Rényi model) and an ERGM. The addition of subgraphs of size four to edge-triangle type models does improve agreement with the empirical network for smaller densities in clustered networks. Additional subgraphs do not make a noticeable difference in our example, although we would expect the ability to model cliques to be helpful for contact networks exhibiting household structure. PMID:26555701
Ivory, James D; Williams, Dmitri; Martins, Nicole; Consalvo, Mia
2009-08-01
Although violent video game content and its effects have been examined extensively by empirical research, verbal aggression in the form of profanity has received less attention. Building on preliminary findings from previous studies, an extensive content analysis of profanity in video games was conducted using a sample of the 150 top-selling video games across all popular game platforms (including home consoles, portable consoles, and personal computers). The frequency of profanity, both in general and across three profanity categories, was measured and compared to games' ratings, sales, and platforms. Generally, profanity was found in about one in five games and appeared primarily in games rated for teenagers or above. Games containing profanity, however, tended to contain it frequently. Profanity was not found to be related to games' sales or platforms.
Aspara, Jaakko; Klein, Jan F; Luo, Xueming; Tikkanen, Henrikki
2018-05-01
We conduct a systematic exploratory investigation of the effects of firms' existing service productivity on the success of their new service innovations. Although previous research extensively addresses service productivity and service innovation, this is the first empirical study that bridges the gap between these two research streams and examines the links between the two concepts. Based on a comprehensive data set of new service introductions in a financial services market over a 14-year period, we empirically explore the relationship between a firm's existing service productivity and the firm's success in introducing new services to the market. The results unveil a fundamental service productivity-service innovation dilemma: Being productive in existing services increases a firm's willingness to innovate new services proactively but decreases the firm's capabilities of bringing these services to the market successfully. We provide specific insights into the mechanism underlying the complex relationship between a firm's productivity in existing services, its innovation proactivity, and its service innovation success. For managers, we not only unpack and elucidate this dilemma but also demonstrate that a focused customer scope and growth market conditions may enable firms to mitigate the dilemma and successfully pursue service productivity and service innovation simultaneously.
Theorizing and Researching Levels of Processing in Self-Regulated Learning
ERIC Educational Resources Information Center
Winne, Philip H.
2018-01-01
Background: Deep versus surface knowledge is widely discussed by educational practitioners. A corresponding construct, levels of processing, has received extensive theoretical and empirical attention in learning science and psychology. In both arenas, lower levels of information and shallower levels of processing are predicted and generally…
Identity Texts and Academic Achievement: Connecting the Dots in Multilingual School Contexts
ERIC Educational Resources Information Center
Cummins, Jim; Hu, Shirley; Markus, Paula; Kristiina Montero, M.
2015-01-01
The construct of "identity text" conjoins notions of identity affirmation and literacy engagement as equally relevant to addressing causes of underachievement among low socioeconomic status, multilingual, and marginalized group students. Despite extensive empirical evidence supporting the impact on academic achievement of both identity…
The areal reduction factor: A new analytical expression for the Lazio Region in central Italy
NASA Astrophysics Data System (ADS)
Mineo, C.; Ridolfi, E.; Napolitano, F.; Russo, F.
2018-05-01
For the study and modeling of hydrological phenomena, both in urban and rural areas, a proper estimation of the areal reduction factor (ARF) is crucial. In this paper, we estimated the ARF from observed rainfall data as the ratio between the average rainfall occurring in a specific area and the point rainfall. Then, we compared the obtained ARF values with some of the most widespread empirical approaches in literature which are used when rainfall observations are not available. Results highlight that the literature formulations can lead to a substantial over- or underestimation of the ARF estimated from observed data. These findings can have severe consequences, especially in the design of hydraulic structures where empirical formulations are extensively applied. The aim of this paper is to present a new analytical relationship with an explicit dependence on the rainfall duration and area that can better represent the ARF-area trend over the area case of study. The analytical curve presented here can find an important application to estimate the ARF values for design purposes. The test study area is the Lazio Region (central Italy).
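The observation-based ARF estimate defined above (the average rainfall over an area divided by the point rainfall) can be sketched directly. The gauge depths below are hypothetical, not the Lazio Region data.

```python
# Sketch of the observation-based ARF estimate described above:
# areal average rainfall divided by point rainfall, for one event/duration.
# Gauge depths are hypothetical.

def areal_reduction_factor(gauge_depths, point_depth):
    """ARF = mean areal rainfall / point rainfall."""
    areal_mean = sum(gauge_depths) / len(gauge_depths)
    return areal_mean / point_depth

# Hypothetical event: depths (mm) at gauges inside the area; the maximum
# point depth is taken as the reference point rainfall.
gauges = [31.0, 24.5, 28.0, 19.5]
point = max(gauges)
print(areal_reduction_factor(gauges, point))
```

Because the areal mean cannot exceed the maximum point depth, the ARF computed this way lies between 0 and 1, decreasing as the area grows relative to the storm extent; the paper's contribution is an analytical curve for this dependence on duration and area.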
Multilevel corporate environmental responsibility.
Karassin, Orr; Bar-Haim, Aviad
2016-12-01
The multilevel empirical study of the antecedents of corporate social responsibility (CSR) has been identified as "the first knowledge gap" in CSR research. Based on an extensive literature review, the present study outlines a conceptual multilevel model of CSR, then designs and empirically validates an operational multilevel model of the principal driving factors affecting corporate environmental responsibility (CER), as a measure of CSR. Both conceptual and operational models incorporate three levels of analysis: institutional, organizational, and individual. The multilevel nature of the design allows for the assessment of the relative importance of the levels and of their components in the achievement of CER. Unweighted least squares (ULS) regression analysis reveals that the institutional-level variables have medium relationships with CER, some variables having a negative effect. The organizational level is revealed as having strong and positive significant relationships with CER, with organizational culture and managers' attitudes and behaviors as significant driving forces. The study demonstrates the importance of multilevel analysis in improving the understanding of CSR drivers, relative to single level models, even if the significance of specific drivers and levels may vary by context. Copyright © 2016 Elsevier Ltd. All rights reserved.
Rossi, Fabrizio; Barth, James R; Cebula, Richard J
2018-06-01
The data presented in this article are related to the research article entitled "Do shareholder coalitions affect agency costs? Evidence from Italian-listed companies", Research in International Business and Finance, Forthcoming (Rossi et al., 2018) [1]. The study presents an empirical analysis using an extensive balanced panel dataset of 163 Italian listed companies for the period 2002-2013, yielding 1,956 firm-year observations. The sample consists primarily of manufacturing firms, but also includes some service enterprises; all financial firms and regulated utilities are excluded. We collected data on ownership structure for the entire study period. Information was acquired from the Consob website and the individual company reports on corporate governance. Data on firm-level indicators (debt-to-capital ratio, firm size, and age of the firm) for all companies in the sample were collected from Datastream, Bloomberg, and Calepino dell'Azionista, as well as obtained manually from the financial statements of the individual companies being studied. Our dataset contains several measures of ownership structure for Italian listed companies.
High throughput film dosimetry in homogeneous and heterogeneous media for a small animal irradiator
Wack, L.; Ngwa, W.; Tryggestad, E.; Tsiamas, P.; Berbeco, R.; Ng, S.K.; Hesser, J.
2013-01-01
Purpose: We have established a high-throughput Gafchromic film dosimetry protocol for narrow kilo-voltage beams in homogeneous and heterogeneous media for small-animal radiotherapy applications. The kV beam characterization is based on extensive Gafchromic film dosimetry data acquired in homogeneous and heterogeneous media. An empirical model is used for parameterization of depth and off-axis dependence of measured data.
Methods: We have modified previously published methods of film dosimetry to suit the specific tasks of the study. Unlike film protocols used in previous studies, our protocol employs simultaneous multichannel scanning and analysis of up to nine Gafchromic films per scan. A scanner and background correction were implemented to improve accuracy of the measurements. Measurements were taken in homogeneous and inhomogeneous phantoms at 220 kVp and a field size of 5 × 5 mm2. The results were compared against Monte Carlo simulations.
Results: Dose differences caused by variations in background signal were effectively removed by the corrections applied. Measurements in homogeneous phantoms were used to empirically characterize beam data in homogeneous and heterogeneous media. Film measurements in inhomogeneous phantoms and their empirical parameterization differed by about 2%–3%. The model differed from MC by about 1% (water, lung) to 7% (bone). Good agreement was found for measured and modelled off-axis ratios.
Conclusions: EBT2 films are a valuable tool for characterization of narrow kV beams, though care must be taken to eliminate disturbances caused by varying background signals. The usefulness of the empirical beam model in interpretation and parameterization of film data was demonstrated. PMID:23510532
AN EMPIRICAL FORMULA FOR THE DISTRIBUTION FUNCTION OF A THIN EXPONENTIAL DISC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Sanjib; Bland-Hawthorn, Joss
2013-08-20
An empirical formula for a Shu distribution function that reproduces a thin disc with exponential surface density to good accuracy is presented. The formula has two free parameters that specify the functional form of the velocity dispersion. Conventionally, this requires the use of an iterative algorithm to produce the correct solution, which is computationally taxing for applications like Markov Chain Monte Carlo model fitting. The formula has been shown to work for flat, rising, and falling rotation curves. Application of this methodology to one of the Dehnen distribution functions is also shown. Finally, an extension of this formula to reproduce velocity dispersion profiles that are an exponential function of radius is also presented. Our empirical formula should greatly aid the efficient comparison of disc models with large stellar surveys or N-body simulations.
Fertility transitions and schooling: from micro- to macro-level associations.
Eloundou-Enyegue, Parfait M; Giroux, Sarah C
2012-11-01
Research on the schooling implications of fertility transitions often faces an aggregation problem: despite policy interest in macro-level outcomes, empirical studies usually focus on the micro-level effects of sibsize on schooling. This article proposes an aggregation framework for moving from micro- to macro-level associations between fertility and schooling. The proposed framework is an improvement over previous aggregation methods in that it considers concurrent changes in the effects of sibsize, socioeconomic context, and family structure. The framework is illustrated with data from six sub-Saharan countries. Possible extensions are discussed.
Wodarski, John S; Feit, Marvin D
2011-01-01
The problematic behaviors of teenagers and the subsequent negative consequences are extensive and well documented: unwanted pregnancy, substance abuse, violent behavior, depression, and social and psychological consequences of unemployment. In this article, the authors review an approach that uses a cooperative learning, empirically based intervention that employs peers as teachers. This intervention of choice is Teams-Games-Tournaments (TGT), a paradigm backed by five decades of empirical support. The application of TGT in preventive health programs incorporates elements in common with other prevention programs that are based on a public health orientation and constitute the essential components of health education, that is, skills training and practice in applying skills. The TGT intervention supports the idea that children and adolescents from various socioeconomic classes, between the ages of 8 and 18 and in classrooms or groups ranging in size from 4 to 17 members, can work together for one another. TGT has been applied successfully in such diverse areas as adolescent development, sexuality education, psychoactive substance abuse education, anger control, coping with depression and suicide, nutrition, comprehensive employment preparation, and family intervention. This article reviews the extensive research on TGT using examples of successful projects in substance abuse, violence, and nutrition. Issues are raised that relate to the implementation of preventive health strategies for adolescents, including cognitive aspects, social and family networks, and intervention components.
Inducing Fuzzy Models for Student Classification
ERIC Educational Resources Information Center
Nykanen, Ossi
2006-01-01
We report an approach for implementing predictive fuzzy systems that manage capturing both the imprecision of the empirically induced classifications and the imprecision of the intuitive linguistic expressions via the extensive use of fuzzy sets. From end-users' point of view, the approach enables encapsulating the technical details of the…
Initial Development and Validation of the Global Citizenship Scale
ERIC Educational Resources Information Center
Morais, Duarte B.; Ogden, Anthony C.
2011-01-01
The purpose of this article is to report on the initial development of a theoretically grounded and empirically validated scale to measure global citizenship. The methodology employed is multi-faceted, including two expert face validity trials, extensive exploratory and confirmatory factor analyses with multiple datasets, and a series of three…
ERIC Educational Resources Information Center
Huberman, Bernardo A.; Loch, Christoph H.; Onculer, Ayse
2004-01-01
The striving for status has long been recognized in sociology and economics. Extensive theoretical arguments and empirical evidence propose that people view status as a sign of competence and pursue it as a means to achieve power and resources. A small literature, however, based on arguments from biology and evolutionary psychology, proposes that…
Visual resources and the public: an empirical approach
Rachel Kaplan
1979-01-01
Visual resource management systems incorporate many assumptions about how people see the landscape. While these assumptions are not articulated, they nonetheless affect the decision process. Problems inherent in some of these assumptions are examined. Extensive research based on people's preference ratings of different settings provides insight into people's...
High-resolution, spatially extensive climate grids can be useful in regional hydrologic applications. However, in regions where precipitation is dominated by snow, snowmelt models are often used to account for timing and magnitude of water delivery. We developed an empirical, non...
Olson's "Cognitive Development": A Commentary.
ERIC Educational Resources Information Center
Follettie, Joseph F.
This report is a review of Olson's "Cognitive Development." Unlike a typical book review it does not compare and contrast the author's theoretical framework and methodological practices with those of others in the field, but rather it extensively describes and critiques the reported empirical work. The reasons given for this approach are that…
Assessing the Value of E-Learning Systems
ERIC Educational Resources Information Center
Levy, Yair
2006-01-01
"Assessing the Value of E-Learning Systems" provides an extensive literature review pulling theories from the field of information systems, psychology and cognitive sciences, distance and online learning, as well as marketing and decision sciences. This book provides empirical evidence for the power of measuring value in the context of e-learning…
The Status of Projective Techniques: Or, "Wishing Won't Make It Go Away."
ERIC Educational Resources Information Center
Piotrowski, Chris
The predicted decline in usefulness and emphasis of projective techniques was analyzed from several different perspectives including the academic community, members of the American Psychological Association (APA) Division 12, internship centers, the applied clinical setting, and private practitioners. In addition, an extensive review of empirical,…
NASA Astrophysics Data System (ADS)
Fioretti, Guido
2007-02-01
The production function maps the inputs of a firm or a productive system onto its outputs. This article expounds generalizations of the production function that include state variables, organizational structures and increasing returns to scale. These extensions are needed in order to explain the regularities of the empirical distributions of certain economic variables.
ERIC Educational Resources Information Center
Barrow, Robin
2004-01-01
Recent empirical research into the brain, while reinforcing the view that we are extensively "programmed", does not refute the idea of a distinctive human mind. The human mind is primarily a product of the human capacity for a distinctive kind of language. Human language is thus what gives us our consciousness, reasoning capacity and autonomy. To…
Empirical Validation and Application of the Computing Attitudes Survey
ERIC Educational Resources Information Center
Dorn, Brian; Elliott Tew, Allison
2015-01-01
Student attitudes play an important role in shaping learning experiences. However, few validated instruments exist for measuring student attitude development in a discipline-specific way. In this paper, we present the design, development, and validation of the computing attitudes survey (CAS). The CAS is an extension of the Colorado Learning…
Linking knowledge and action through mental models of sustainable agriculture.
Hoffman, Matthew; Lubell, Mark; Hillis, Vicken
2014-09-09
Linking knowledge to action requires understanding how decision-makers conceptualize sustainability. This paper empirically analyzes farmer "mental models" of sustainability from three winegrape-growing regions of California where local extension programs have focused on sustainable agriculture. The mental models are represented as networks where sustainability concepts are nodes, and links are established when a farmer mentions two concepts in their stated definition of sustainability. The results suggest that winegrape grower mental models of sustainability are hierarchically structured, relatively similar across regions, and strongly linked to participation in extension programs and adoption of sustainable farm practices. We discuss the implications of our findings for the debate over the meaning of sustainability, and the role of local extension programs in managing knowledge systems.
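The co-mention network construction described above can be sketched as follows: each sustainability concept is a node, and two concepts are linked (with a weight) whenever the same farmer mentions both in one stated definition. The farmer responses below are invented for illustration, not the study's interview data.

```python
# Sketch of the co-mention network described above: concepts are nodes,
# and an edge links two concepts mentioned together in one farmer's
# definition of sustainability. Responses are invented for illustration.
from itertools import combinations
from collections import Counter

definitions = [
    {"soil health", "profitability", "water use"},
    {"soil health", "biodiversity"},
    {"profitability", "water use", "biodiversity"},
]

edges = Counter()
for concepts in definitions:
    # Sorting gives each unordered pair a canonical key.
    for a, b in combinations(sorted(concepts), 2):
        edges[(a, b)] += 1   # edge weight = number of co-mentions

for (a, b), w in sorted(edges.items()):
    print(f"{a} -- {b}: {w}")
```

Heavily weighted edges mark concept pairs that many farmers link, which is the kind of structure the study then compares across regions and against extension-program participation.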
From conscious thought to automatic action: A simulation account of action planning.
Martiny-Huenger, Torsten; Martiny, Sarah E; Parks-Stamm, Elizabeth J; Pfeiffer, Elisa; Gollwitzer, Peter M
2017-10-01
We provide a theoretical framework and empirical evidence for how verbally planning an action creates direct perception-action links and behavioral automaticity. We argue that planning actions in an if (situation)-then (action) format induces sensorimotor simulations (i.e., activity patterns reenacting the event in the sensory and motor brain areas) of the anticipated situation and the intended action. Due to their temporal overlap, these activity patterns become linked. Whenever the previously simulated situation is encountered, the previously simulated action is partially reactivated through spreading activation and thus more likely to be executed. In 4 experiments (N = 363), we investigated the relation between specific if-then action plans worded to activate simulations of elbow flexion versus extension movements and actual elbow flexion versus extension movements in a subsequent, ostensibly unrelated categorization task. As expected, linking a critical stimulus to intended actions that implied elbow flexion movements (e.g., grabbing it for consumption) subsequently facilitated elbow flexion movements upon encountering the critical stimulus. However, linking a critical stimulus to actions that implied elbow extension movements (e.g., pointing at it) subsequently facilitated elbow extension movements upon encountering the critical stimulus. Thus, minor differences (i.e., exchanging the words "point at" with "grab") in verbally formulated action plans (i.e., conscious thought) had systematic consequences on subsequent actions. The question of how conscious thought can induce stimulus-triggered action is illuminated by the provided theoretical framework and the respective empirical evidence, facilitating the understanding of behavioral automaticity and human agency. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Villar, S E J; Edwards, H G M
2005-05-01
Seventy-five specimens from thirty fragments of Roman villa wall-paintings from sites in Burgos, Castilla y León, Spain, have been analysed by Raman spectroscopy. This is the first reported Raman spectroscopic study of Roman wall-paintings from Spain. The extensive range of tonalities and colour compositions contrasts with the results found in other provinces of the Roman Empire, for example Romano-British villas. Calcite, aragonite, haematite, caput mortuum, cinnabar, limonite, goethite, cuprorivaite, lazurite, green earth, carbon and verdigris have been found as pigments. Some mineral mixtures with different tonalities were made using strategies different from those more usually found. Of particular interest is the assignment of the Tarna mine as the origin of the cinnabar used to obtain the red colour in some of the specimens analysed here. The wide range of colours, tonalities and minerals found at some of the sites studied in this work is suggestive of a high social status for the community.
Sexual selection and mate choice.
Andersson, Malte; Simmons, Leigh W
2006-06-01
The past two decades have seen extensive growth of sexual selection research. Theoretical and empirical work has clarified many components of pre- and postcopulatory sexual selection, such as aggressive competition, mate choice, sperm utilization and sexual conflict. Genetic mechanisms of mate choice evolution have been less amenable to empirical testing, but molecular genetic analyses can now be used for incisive experimentation. Here, we highlight some of the currently debated areas in pre- and postcopulatory sexual selection. We identify where new techniques can help estimate the relative roles of the various selection mechanisms that might work together in the evolution of mating preferences and attractive traits, and in sperm-egg interactions.
Wavenumber selection in Bénard convection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Catton, I.
1988-11-01
The results of three related studies dealing with wavenumber selection in Rayleigh-Bénard convection are reported. The first, an extension of the power integral method, is used to argue for the existence of multiple wavenumbers at all supercritical Rayleigh numbers. Most existing closure schemes are shown to be inadequate. A thermodynamic stability criterion is shown to give reasonable results but requires empirical measurement of one parameter for closure. The third study uses an asymptotic approach based in part on geometric considerations and requires no empiricism to obtain good predictions of the wavenumber. These predictions, however, can only be used for certain planforms of convection.
Measuring and modeling salience with the theory of visual attention.
Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid
2017-08-01
For almost three decades, the theory of visual attention (TVA) has been successful in mathematically describing and explaining a wide variety of phenomena in visual selection and recognition with high quantitative precision. Interestingly, the influence of feature contrast on attention has been included in TVA only recently, although it has been extensively studied outside the TVA framework. The present approach further develops this extension of TVA's scope by measuring and modeling salience. An empirical measure of salience is achieved by linking different (orientation and luminance) contrasts to a TVA parameter. In the modeling part, the function relating feature contrasts to salience is described mathematically and tested against alternatives by Bayesian model comparison. This model comparison reveals that the power function is an appropriate model of salience growth in the dimensions of orientation and luminance contrast. Furthermore, if contrasts from the two dimensions are combined, their saliences combine additively.
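The power-function relation can be illustrated with a minimal sketch: fitting salience = a·contrast^b by least squares in log-log space. The contrast levels, salience values, and closed-form fit below are illustrative assumptions, not the authors' data or method (they used Bayesian model comparison over candidate functions).

```python
import math

# Hypothetical salience estimates (TVA attentional weights, arbitrary units)
# at several orientation-contrast levels; values are illustrative only.
contrast = [5, 10, 20, 40, 80]          # degrees of orientation contrast
salience = [0.8, 1.3, 2.1, 3.4, 5.5]

def fit_power_law(x, y):
    """Fit y = a * x**b by ordinary least squares in log-log space."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

a, b = fit_power_law(contrast, salience)
print(f"salience ~ {a:.2f} * contrast^{b:.2f}")
```

With these illustrative numbers the fitted exponent is below 1, i.e. salience grows with contrast but with diminishing returns, which is the qualitative shape a power-law model captures.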
Semi-Empirical Prediction of Aircraft Low-Speed Aerodynamic Characteristics
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2015-01-01
This paper lays out a comprehensive methodology for computing a low-speed, high-lift polar, without requiring additional details about the aircraft design beyond what is typically available at the conceptual design stage. Introducing low-order, physics-based aerodynamic analyses allows the methodology to be more applicable to unconventional aircraft concepts than traditional, fully-empirical methods. The methodology uses empirical relationships for flap lift effectiveness, chord extension, drag-coefficient increment and maximum lift coefficient of various types of flap systems as a function of flap deflection, and combines these increments with the characteristics of the unflapped airfoils. Once the aerodynamic characteristics of the flapped sections are known, a vortex-lattice analysis calculates the three-dimensional lift, drag and moment coefficients of the whole aircraft configuration. This paper details the results of two validation cases: a supercritical airfoil model with several types of flaps; and a 12-foot, full-span aircraft model with slats and double-slotted flaps.
[Legionnaire's disease with predominant liver involvement].
Magro Molina, A; Plaza Poquet, V; Giner Galvañ, V
2002-04-01
Like other pneumonias due to atypical agents, pneumonia due to Legionella pneumophila has no characteristic clinical features, although fever and non-productive cough are almost constant, and diarrhea with changes in mental status is common. Hyponatremia and moderate transient hypertransaminasemia are also common. Severe systemic involvement after hematogenous dissemination, similar to that described for typical bacterial pneumonias, is a prominent difference from other atypical agents, with high mortality rates in the absence of appropriate treatment. Etiological diagnosis is very difficult and is normally achieved late in the course of the infection. Because of these diagnostic difficulties and the potential mortality in predisposed patients, empirical antibiotic therapy has been extensively recommended. We present a patient with critical community-acquired pneumonia due to Legionella pneumophila serogroup 1, with liver alteration as the main manifestation and a good response to empirical therapy with clarithromycin and rifampin. We recommend the empirical use of such therapy in pneumonias without microbiological diagnosis and an unfavorable course.
Alpers, Charles N.; Myers, Perry A; Millsap, Daniel; Regnier, Tamsen B; Bowell, Robert J.; Alpers, Charles N.; Jamieson, Heather E.; Nordstrom, D. Kirk; Majzlan, Juraj
2014-01-01
The Empire Mine, together with other mines in the Grass Valley mining district, produced at least 21.3 million troy ounces (663 tonnes) of gold (Au) during the 1850s through the 1950s, making it the most productive hardrock Au mining district in California history (Clark 1970). The Empire Mine State Historic Park (Empire Mine SHP or EMSHP), established in 1975, provides the public with an opportunity to see many well-preserved features of the historic mining and mineral processing operations (CDPR 2014a). A legacy of Au mining at Empire Mine and elsewhere is contamination of mine wastes and associated soils, surface waters, and groundwaters with arsenic (As), mercury (Hg), lead (Pb), and other metals. At EMSHP, As has been the principal contaminant of concern and the focus of extensive remediation efforts over the past several years by the State of California, Department of Parks and Recreation (DPR) and Newmont USA, Ltd. In addition, the site is the main focus of a multidisciplinary research project on As bioavailability and bioaccessibility led by the California Department of Toxic Substances Control (DTSC) and funded by the U.S. Environmental Protection Agency’s (USEPA’s) Brownfields Program. This chapter was prepared as a guide for a field trip to EMSHP held on June 14, 2014, in conjunction with a short course on “Environmental Geochemistry, Mineralogy, and Microbiology of Arsenic” held in Nevada City, California on June 15–16, 2014. This guide contains background information on geological setting, mining history, and environmental history at EMSHP and other historical Au mining districts in the Sierra Nevada, followed by descriptions of the field trip stops.
Sample Size Estimation in Cluster Randomized Educational Trials: An Empirical Bayes Approach
ERIC Educational Resources Information Center
Rotondi, Michael A.; Donner, Allan
2009-01-01
The educational field has now accumulated an extensive literature reporting on values of the intraclass correlation coefficient, a parameter essential to determining the required size of a planned cluster randomized trial. We propose here a simple simulation-based approach including all relevant information that can facilitate this task. An…
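For context, the standard calculation that a reported intraclass correlation feeds into is the design-effect inflation of a simple two-arm sample size. The sketch below uses that textbook formula with illustrative inputs; it is not the authors' simulation-based empirical Bayes approach.

```python
import math

def cluster_trial_sample_size(delta, sigma, icc, cluster_size,
                              z_alpha=1.96, z_power=0.84):
    """Clusters per arm for a two-arm cluster randomized trial, inflating
    the individually-randomized sample size by the design effect
    1 + (m - 1) * rho, where m is cluster size and rho is the ICC.
    Default z values give alpha = .05 (two-sided) and 80% power."""
    n_individual = 2 * ((z_alpha + z_power) * sigma / delta) ** 2  # per arm
    design_effect = 1 + (cluster_size - 1) * icc
    return math.ceil(n_individual * design_effect / cluster_size)

# e.g. detecting a 0.25 SD effect with classrooms of 25 pupils and ICC = 0.20
print(cluster_trial_sample_size(delta=0.25, sigma=1.0, icc=0.20,
                                cluster_size=25))  # → 59 clusters per arm
```

Note how sensitive the answer is to the ICC: with icc=0 the same inputs need only 11 clusters per arm, which is why accumulated empirical ICC values matter so much at the planning stage.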
English as a Lingua Franca in Europe: An Empirical Perspective
ERIC Educational Resources Information Center
Breiteneder, Angelika
2009-01-01
In 2008, the need for intra-European communication had long exceeded the limits set by language barriers. As a result, English acts extensively as a lingua franca among Europeans with different mother tongues, particularly so in the professional domains of education, business, international relations and scientific research. Yet, despite its…
Lessons Learned from Instructional Design Theory: An Application in Management Education
ERIC Educational Resources Information Center
Burke, Lisa A.
2007-01-01
Given that many doctoral programs do not provide extensive training on how to present course information in the classroom, the current paper looks to educational psychology theory and research for guidance. Richard Mayer and others' copious empirical work on effective and ineffective instructional design, along with relevant research findings in…
Sex Ratios, Economic Power, and Women's Roles: A Theoretical Extension and Empirical Test.
ERIC Educational Resources Information Center
South, Scott J.
1988-01-01
Tested hypotheses concerning sex ratios, women's roles, and economic power with data from 111 countries. Found undersupply of women positively associated with proportion of women who marry and fertility rate; inversely associated with women's average age at marriage, literacy rate, and divorce rate. Suggests women's economic power may counteract…
ERIC Educational Resources Information Center
Cousans, Fran; Patterson, Fiona; Edwards, Helena; Walker, Kim; McLachlan, John C.; Good, David
2017-01-01
Although there is extensive evidence confirming the predictive validity of situational judgement tests (SJTs) in medical education, there remains a shortage of evidence for their predictive validity for performance of postgraduate trainees in their first role in clinical practice. Moreover, to date few researchers have empirically examined the…
Learning in the Liminal Space: A Semiotic Approach to Threshold Concepts
ERIC Educational Resources Information Center
Land, Ray; Rattray, Julie; Vivian, Peter
2014-01-01
The threshold concepts approach to student learning and curriculum design now informs an empirical research base comprising over 170 disciplinary and professional contexts. It draws extensively on the notion of troublesomeness in a "liminal" space of learning. The latter is a transformative state in the process of learning in which there…
The net economic value of wilderness
J. Michael Bowker; J.E. Harvard; John C. Bergstrom; H. Ken Cordell; Donald B.K. English; John B. Loomis
2005-01-01
The purpose of this chapter is to inventory and assess what is currently known about the economic or "dollar" values accruing to Americans from the National Wilderness Preservation System. This chapter identifies the benefits of Wilderness and the economic value of these benefits through an extensive review of published conceptual and empirical literature. It...
A Unified Approach to Measurement Error and Missing Data: Details and Extensions
ERIC Educational Resources Information Center
Blackwell, Matthew; Honaker, James; King, Gary
2017-01-01
We extend a unified and easy-to-use approach to measurement error and missing data. In our companion article, Blackwell, Honaker, and King give an intuitive overview of the new technique, along with practical suggestions and empirical applications. Here, we offer more precise technical details, more sophisticated measurement error model…
Task Oriented Tools for Information Retrieval
ERIC Educational Resources Information Center
Yang, Peilin
2017-01-01
Information Retrieval (IR) is one of the most rapidly evolving research fields and has drawn extensive attention in recent years. Because of its empirical nature, the advance of the IR field is closely related to the development of various toolkits. While the traditional IR toolkit mainly provides a platform to evaluate the effectiveness of retrieval…
Implicit Theories of Ability in Physical Education: Current Issues and Future Directions
ERIC Educational Resources Information Center
Warburton, Victoria Emily; Spray, Christopher Mark
2017-01-01
Purpose: In light of the extensive empirical evidence that implicit theories have important motivational consequences for young people across a range of educational settings we seek to provide a summary of, and personal reflection on, implicit theory research and practice in physical education (PE). Overview: We first provide an introduction to…
Expert Panel Reviews of Research Centers: The Site Visit Process
ERIC Educational Resources Information Center
Lawrenz, Frances; Thao, Mao; Johnson, Kelli
2012-01-01
Site visits are used extensively in a variety of settings within the evaluation community. They are especially common in making summative value decisions about the quality and worth of research programs/centers. However, there has been little empirical research and guidance about how to appropriately conduct evaluative site visits of research…
ERIC Educational Resources Information Center
Brasiel, Sarah; Martin, Taylor; Jeong, Soojeong; Yuan, Min
2016-01-01
An extensive body of research has demonstrated that the use of technology in K-12 classrooms, such as the Internet, computers, and software programs, enhances the learning of mathematics (Cheung & Slavin, 2013; Cohen & Hollebrands, 2011). In particular, growing empirical evidence supports that certain types of technology, such as…
Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model
ERIC Educational Resources Information Center
Sandaire, Johnny
2009-01-01
A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…
The Construction of the Self: A Developmental Perspective.
ERIC Educational Resources Information Center
Harter, Susan
Drawing upon extensive theoretical knowledge and decades of empirical research, this book traces changes in the structure and content of self-representations from early childhood through late adolescence. Chapter 1 includes a discussion of the self as subject (I-self) and object (Me-self) and describes the historical roots of contemporary issues…
School Reforms, Principal Leadership, and Teacher Resistance: Evidence from Korea
ERIC Educational Resources Information Center
Park, Joo-Ho; Jeong, Dong Wook
2013-01-01
Many countries design and implement school change with a focus on the fundamental reconfiguration in the structures of schooling. In this article, we examined the relationship between principal leadership and teacher resistance to school reforms driven by external interveners. For an empirical analysis, we took advantage of extensive data derived…
Systems for Instructional Improvement: Creating Coherence from the Classroom to the District Office
ERIC Educational Resources Information Center
Cobb, Paul; Jackson, Kara; Henrick, Erin; Smith, Thomas M.
2018-01-01
In "Systems for Instructional Improvement," Paul Cobb and his colleagues draw on their extensive research to propose a series of specific, empirically grounded recommendations that together constitute a theory of action for advancing instruction at scale. The authors outline the elements of a coherent instructional system; describe…
A Comparison of Flexible Prompt Fading and Constant Time Delay for Five Children with Autism
ERIC Educational Resources Information Center
Soluaga, Doris; Leaf, Justin B.; Taubman, Mitchell; McEachin, John; Leaf, Ron
2008-01-01
Given the increasing rates of autism, identifying prompting procedures that can assist in the development of more optimal learning opportunities for this population is critical. Extensive empirical research exists supporting the effectiveness of various prompting strategies. Constant time delay (CTD) is a highly implemented prompting procedure…
Mixture Distribution Latent State-Trait Analysis: Basic Ideas and Applications
ERIC Educational Resources Information Center
Courvoisier, Delphine S.; Eid, Michael; Nussbeck, Fridtjof W.
2007-01-01
Extensions of latent state-trait models for continuous observed variables to mixture latent state-trait models with and without covariates of change are presented that can separate individuals differing in their occasion-specific variability. An empirical application to the repeated measurement of mood states (N = 501) revealed that a model with 2…
Application of LSP Texts in Translator Training
ERIC Educational Resources Information Center
Ilynska, Larisa; Smirnova, Tatjana; Platonova, Marina
2017-01-01
The paper presents discussion of the results of extensive empirical research into efficient methods of educating and training translators of LSP (language for special purposes) texts. The methodology is based on using popular LSP texts in the respective fields as one of the main media for translator training. The aim of the paper is to investigate…
Willis, Gordon; Lawrence, Deirdre; Hartman, Anne; Stapleton Kudela, Martha; Levin, Kerry; Forsyth, Barbara
2008-06-01
Because of the vital need to attain cross-cultural comparability of estimates of tobacco use across subgroups of the U.S. population that differ in primary language use, the National Cancer Institute (NCI) Tobacco Use Special Cessation Supplement to the Current Population Survey (TUSCS-CPS) was translated into Spanish, Chinese (Mandarin and Cantonese), Korean, Vietnamese, and Khmer (Cambodian). The questionnaire translations were extensively tested using an eight-step process that focused on both translation procedures and empirical pretesting. The resulting translations are available on the Internet at http://riskfactor.cancer.gov/studies/tus-cps/translation/questionnaires.html for tobacco researchers to use in their own surveys, either in full, or as material to be selected as appropriate. This manuscript provides information to guide researchers in accessing and using the translations, and describes the empirical procedures used to develop and pretest them (cognitive interviewing and behavior coding). We also provide recommendations concerning the further development of questionnaire translations.
Ben-Shachar, Rotem; Koelle, Katia
2018-06-15
An extensive body of theory addresses the topic of pathogen virulence evolution, yet few studies have empirically demonstrated the presence of fitness trade-offs that would select for intermediate virulence. Here we show the presence of transmission-clearance trade-offs in dengue virus using viremia measurements. By fitting a within-host model to these data, we further find that the interaction between dengue and the host immune response can account for the observed trade-offs. Finally, we consider dengue virulence evolution when selection acts on the virus's production rate. By combining within-host model simulations with empirical findings on how host viral load affects human-to-mosquito transmission success, we show that the virus's transmission potential is maximized at production rates associated with intermediate virulence and that the optimal production rate critically depends on dengue's epidemiological context. These results indicate that long-term changes in dengue's global distribution impact the invasion and spread of virulent dengue virus genotypes.
Harmonic analysis of electrified railway based on improved HHT
NASA Astrophysics Data System (ADS)
Wang, Feng
2018-04-01
In this paper, the causes and harms of harmonics in the current electric locomotive electrical system are first studied and analyzed. Based on the characteristics of the harmonics in the electrical system, the Hilbert-Huang transform (HHT) method is introduced. Building on an in-depth analysis of the empirical mode decomposition method and the Hilbert transform method, the causes of, and solutions to, the endpoint effect and modal aliasing problems in the HHT method are explored. For the endpoint effect, this paper uses a point-symmetric extension method to extend the collected data; to address the modal aliasing problem, it preprocesses the signal with a high-frequency auxiliary harmonic and gives an empirical formula for that harmonic. Finally, combining the suppression of the HHT endpoint effect and the modal aliasing problem, an improved HHT method is proposed and simulated in MATLAB. The simulation results show that the improved HHT is effective for the electric locomotive power supply system.
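The endpoint-extension idea can be sketched as follows. A point-symmetric (odd) extension mirrors the signal about each endpoint before empirical mode decomposition, so the cubic-spline envelopes are anchored beyond the data and the end swings do not distort the inner modes. This is a generic illustration of the technique, not the paper's exact algorithm, whose details are not given in the abstract.

```python
def point_symmetric_extend(signal, n):
    """Extend a sampled signal by n points at each end using point symmetry
    (odd reflection about each endpoint): the value at distance d beyond an
    endpoint is 2*endpoint - signal[d inside]."""
    left = [2 * signal[0] - v for v in signal[n:0:-1]]
    right = [2 * signal[-1] - v for v in signal[-2:-n - 2:-1]]
    return left + list(signal) + right

x = [0.0, 1.0, 0.5, -0.5, -1.0, 0.0]
ext = point_symmetric_extend(x, 2)
print(ext)  # → [-0.5, -1.0, 0.0, 1.0, 0.5, -0.5, -1.0, 0.0, 1.0, 0.5]
```

After the extended signal is decomposed, the extrapolated samples are discarded and only the segment corresponding to the original data is kept.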
Experiments in dilution jet mixing effects of multiple rows and non-circular orifices
NASA Technical Reports Server (NTRS)
Holdeman, J. D.; Srinivasan, R.; Coleman, E. B.; Meyers, G. D.; White, C. D.
1985-01-01
Experimental and empirical model results are presented that extend previous studies of the mixing of single-sided and opposed rows of jets in a confined duct flow to include effects of non-circular orifices and double rows of jets. Analysis of the mean temperature data obtained in this investigation showed that the effects of orifice shape and double rows are significant only in the region close to the injection plane, provided that the orifices are symmetric with respect to the main flow direction. The penetration and mixing of jets from 45-degree slanted slots is slightly less than that from equivalent-area symmetric orifices. The penetration from 2-dimensional slots is similar to that from equivalent-area closely-spaced rows of holes, but the mixing is slower for the 2-D slots. Calculated mean temperature profiles downstream of jets from non-circular and double rows of orifices, made using an extension developed for a previous empirical model, are shown to be in good agreement with the measured distributions.
Rector, Neil A; Man, Vincent; Lerman, Bethany
2014-06-01
Cognitive-behavioural therapy (CBT) is an empirically supported treatment for anxiety disorders. CBT treatments are based on disorder-specific protocols that have been developed to target individual anxiety disorders, even though anxiety disorders frequently co-occur and are comorbid with depression. Given the high rates of diagnostic comorbidity, substantial overlap in dimensional symptom ratings, and extensive evidence that the mood and anxiety disorders share a common set of psychological and biological vulnerabilities, transdiagnostic CBT protocols have recently been developed to treat the commonalities among the mood and anxiety disorders. We conducted a selective review of empirical developments in the transdiagnostic CBT treatment of anxiety and depression (2008-2013). Preliminary evidence suggests that theoretically based transdiagnostic CBT approaches lead to large treatment effects on the primary anxiety disorder, considerable reduction of diagnostic comorbidity, and some preliminary effects regarding the impact on the putative, shared psychological mechanisms. However, the empirical literature remains tentative owing to relatively small samples, limited direct comparisons with disorder-specific CBT protocols, and the relative absence of the study of disorder-specific compared with shared mechanisms of action in treatment. We conclude with a treatment conceptualization of the new transdiagnostic interventions as complementary, rather than contradictory, to disorder-specific CBT.
Waadeland, Carl Haakon
2017-01-01
Results from different empirical investigations on gestural aspects of timed rhythmic movements indicate that the production of asymmetric movement trajectories is a feature that seems to be a common characteristic of various performances of repetitive rhythmic patterns. The behavioural or neural origin of these asymmetrical trajectories is, however, not identified. In the present study we outline a theoretical model that is capable of producing syntheses of asymmetric movement trajectories documented in empirical investigations by Balasubramaniam et al. (2004). Characteristic qualities of the extension/flexion profiles in the observed asymmetric trajectories are reproduced, and we conduct an experiment similar to Balasubramaniam et al. (2004) to show that the empirically documented movement trajectories and our modelled approximations share the same spectral components. The model is based on an application of frequency modulated movements, and a theoretical interpretation offered by the model is to view paced rhythmic movements as a result of an unpaced movement being "stretched" and "compressed", caused by the presence of a metronome. We discuss our model construction within the framework of event-based and emergent timing, and argue that a change between these timing modes might be reflected by the strength of the modulation in our model.
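The frequency-modulation idea can be sketched as a phase-modulated oscillation: a carrier movement whose phase is "stretched" and "compressed" by a modulator. The carrier, modulator, and modulation-depth values below are arbitrary illustrations, not the authors' fitted parameters; the point is only that nonzero modulation shifts the peak of each cycle, producing asymmetric extension and flexion phases.

```python
import math

def fm_trajectory(t, f_c=1.0, f_m=1.0, beta=0.5):
    """Position of a frequency-modulated oscillation at time t:
    a carrier at f_c Hz whose phase is modulated at f_m Hz with depth beta."""
    return math.sin(2 * math.pi * f_c * t
                    + beta * math.sin(2 * math.pi * f_m * t))

# Locate the peak of the first half-cycle: with beta > 0 it arrives
# before t = 0.25, so the rise is faster than the fall (asymmetry).
ts = [k / 1000 for k in range(500)]
t_peak = max(ts, key=fm_trajectory)
```

With beta = 0 the peak sits exactly at a quarter period; increasing beta pulls it earlier, which is one way a single modulation-strength parameter can index the degree of trajectory asymmetry.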
Temperament, Emotion and Childhood Stuttering
Jones, Robin; Choi, Dahye; Conture, Edward; Walden, Tedra
2015-01-01
The purpose of this article is to provide a brief description of temperament and emotion, review empirical evidence pertaining to their possible association with childhood stuttering, and discuss possible clinical implications. In general, temperament is typically thought of as an individual's constitutionally (biologically) based behavioral proclivities. These proclivities often include emotional reactivity and self-regulation. Reactivity refers to arousal of emotions, motor activity, and attention, and self-regulation refers to the ability to moderate those tendencies. The trait-like nature of temperament makes it potentially salient to our understanding of the onset and development of stuttering because temperamental tendencies may result in greater reactivity or difficulty in coping. Emotions, which are more state-like and variable, may influence the variation of stuttering commonly observed both within and between speaking situations. Temperament and emotion may serve as a causal contributor to developmental stuttering, with empirical findings indicating that preschool-aged children who stutter (CWS) exhibit differences in temperament and emotion when compared with children who do not stutter (CWNS). Given that empirical study of temperament in preschool-aged CWS is nascent, extensive discussion of clinical implications is challenging. With that caution, we present some early possibilities, including matching treatment approaches with the child's temperamental profile and using temperament as a predictor of treatment outcome. PMID:24782274
Urbanowicz, Richard A; McClure, C Patrick; King, Barnabas; Mason, Christopher P; Ball, Jonathan K; Tarr, Alexander W
2016-09-01
Retrovirus pseudotypes are a highly tractable model used to study the entry pathways of enveloped viruses. This model has been extensively applied to the study of the hepatitis C virus (HCV) entry pathway, preclinical screening of antiviral antibodies and for assessing the phenotype of patient-derived viruses using HCV pseudoparticles (HCVpp) possessing the HCV E1 and E2 glycoproteins. However, not all patient-isolated clones produce particles that are infectious in this model. This study investigated factors that might limit phenotyping of patient-isolated HCV glycoproteins. Genetically related HCV glycoproteins from quasispecies in individual patients were discovered to behave very differently in this entry model. Empirical optimization of the ratio of packaging construct and glycoprotein-encoding plasmid was required for successful HCVpp genesis for different clones. The selection of retroviral packaging construct also influenced the function of HCV pseudoparticles. Some glycoprotein constructs tolerated a wide range of assay parameters, while others were much more sensitive to alterations. Furthermore, glycoproteins previously characterized as unable to mediate entry were found to be functional. These findings were validated using chimeric cell-cultured HCV bearing these glycoproteins. Using the same empirical approach we demonstrated that generation of infectious ebolavirus pseudoviruses (EBOVpv) was also sensitive to the amount and ratio of plasmids used, and that protocols for optimal production of these pseudoviruses are dependent on the exact virus glycoprotein construct. These findings demonstrate that it is crucial for studies utilizing pseudoviruses to conduct empirical optimization of pseudotype production for each specific glycoprotein sequence to achieve optimal titres and facilitate accurate phenotyping.
Are artworks more like people than artifacts? Individual concepts and their extensions.
Newman, George E; Bartels, Daniel M; Smith, Rosanna K
2014-10-01
This paper examines people's reasoning about identity continuity (i.e., how people decide that a particular object is the same object over time) and its relation to previous research on how people value one-of-a-kind artifacts, such as artwork. We propose that judgments about the continuity of artworks are related to judgments about the continuity of individual persons because art objects are seen as physical extensions of their creators. We report a reanalysis of previous data and the results of two new empirical studies that test this hypothesis. The first study demonstrates that the mere categorization of an object as "art" versus "a tool" changes people's intuitions about the persistence of those objects over time. In a second study, we examine some conditions that may lead artworks to be thought of as different from other artifacts. These observations inform both current understanding of what makes some objects one-of-a-kind as well as broader questions regarding how people intuitively think about the persistence of human agents. Copyright © 2014 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
Carozza, D. A.; Bianchi, D.; Galbraith, E. D.
2015-12-01
Environmental change and the exploitation of marine resources have had profound impacts on marine communities, with potential implications for ocean biogeochemistry and food security. In order to study such global-scale problems, it is helpful to have computationally efficient numerical models that predict the first-order features of fish biomass production as a function of the environment, based on empirical and mechanistic understandings of marine ecosystems. Here we describe the ecological module of the BiOeconomic mArine Trophic Size-spectrum (BOATS) model, which takes an Earth-system approach to modeling fish biomass at the global scale. The ecological model is designed to be used on an Earth System model grid, and determines size spectra of fish biomass by explicitly resolving life history as a function of local temperature and net primary production. Biomass production is limited by the availability of photosynthetic energy to upper trophic levels, following empirical trophic efficiency scalings, and by well-established empirical temperature-dependent growth rates. Natural mortality is calculated using an empirical size-based relationship, while reproduction and recruitment depend on both the food availability to larvae from net primary production and the production of eggs by mature adult fish. We describe predicted biomass spectra and compare them to observations, and conduct a sensitivity study to determine how they change as a function of net primary production and temperature. The model relies on a limited number of parameters compared to similar modeling efforts, while retaining realistic representations of biological and ecological processes, and is computationally efficient, allowing extensive parameter-space analyses even when implemented globally.
As such, it enables the exploration of the linkages between ocean biogeochemistry, climate, and upper trophic levels at the global scale, as well as a representation of fish biomass for idealized studies of fisheries.
NASA Astrophysics Data System (ADS)
Carozza, David Anthony; Bianchi, Daniele; Galbraith, Eric Douglas
2016-04-01
Environmental change and the exploitation of marine resources have had profound impacts on marine communities, with potential implications for ocean biogeochemistry and food security. In order to study such global-scale problems, it is helpful to have computationally efficient numerical models that predict the first-order features of fish biomass production as a function of the environment, based on empirical and mechanistic understandings of marine ecosystems. Here we describe the ecological module of the BiOeconomic mArine Trophic Size-spectrum (BOATS) model, which takes an Earth-system approach to modelling fish biomass at the global scale. The ecological model is designed to be used on an Earth-system model grid, and determines size spectra of fish biomass by explicitly resolving life history as a function of local temperature and net primary production. Biomass production is limited by the availability of photosynthetic energy to upper trophic levels, following empirical trophic efficiency scalings, and by well-established empirical temperature-dependent growth rates. Natural mortality is calculated using an empirical size-based relationship, while reproduction and recruitment depend on both the food availability to larvae from net primary production and the production of eggs by mature adult fish. We describe predicted biomass spectra and compare them to observations, and conduct a sensitivity study to determine how they change as a function of net primary production and temperature. The model relies on a limited number of parameters compared to similar modelling efforts, while retaining reasonably realistic representations of biological and ecological processes, and is computationally efficient, allowing extensive parameter-space analyses even when implemented globally. 
As such, it enables the exploration of the linkages between ocean biogeochemistry, climate, and upper trophic levels at the global scale, as well as a representation of fish biomass for idealized studies of fisheries.
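The empirical scalings the abstract lists (allometric growth modulated by temperature, size-based natural mortality) can be sketched as simple functions. This is a hedged illustration, not the BOATS code: the constants A, b, M0, c and the activation energy E below are placeholder values.

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def growth_rate(mass_g, temp_C, A=0.5, b=-0.25, E=0.45):
    """Allometric growth (g/day) scaled by an Arrhenius temperature factor
    referenced to 10 degrees C. A, b, E are illustrative placeholder values."""
    T, T0 = temp_C + 273.15, 283.15
    arrhenius = math.exp(-(E / BOLTZMANN_EV) * (1.0 / T - 1.0 / T0))
    return A * mass_g ** (1.0 + b) * arrhenius

def natural_mortality(mass_g, M0=0.03, c=-0.25):
    """Empirical size-based mortality rate (1/day): smaller fish die faster."""
    return M0 * mass_g ** c
```

Plugging functions of this form into a discretized size spectrum is what lets the model resolve life history from just temperature and net primary production.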
Judge, W Q; Zeithaml, C P
1992-01-01
As the health care environment becomes more competitive, nonprofit hospitals are under pressure to adopt for-profit business practices. Based on an extensive field study, this research examines the central issue of organizational governance by comparing the strategic roles of nonprofit hospital boards with for-profit industrial boards. The results show that nonprofit hospital boards are generally more involved in the strategic decision process than their for-profit counterparts. If this governance activity is seen as desirable, hospital boards should exercise caution in emulating for-profit board practices. PMID:1563953
Maximum Likelihood Estimations and EM Algorithms with Length-biased Data
Qin, Jing; Ning, Jing; Liu, Hao; Shen, Yu
2012-01-01
Length-biased sampling has been well recognized in economics, industrial reliability, etiology applications, and epidemiological, genetic, and cancer screening studies. Length-biased right-censored data have a unique structure different from traditional survival data, and the nonparametric and semiparametric estimation and inference methods for traditional survival data are not directly applicable to them. We propose new expectation-maximization algorithms for estimation based on full likelihoods involving infinite-dimensional parameters under three settings for length-biased data: estimating the nonparametric distribution function, estimating the nonparametric hazard function under an increasing failure rate constraint, and jointly estimating the baseline hazard function and the covariate coefficients under the Cox proportional hazards model. Extensive empirical simulation studies show that the maximum likelihood estimators perform well with moderate sample sizes and are more efficient than the estimating-equation approaches. The proposed estimators are also more robust to various right-censoring mechanisms. We prove the strong consistency of the estimators, and establish the asymptotic normality of the semiparametric maximum likelihood estimators under the Cox model using modern empirical process theory. We apply the proposed methods to a prevalent cohort medical study. Supplemental materials are available online. PMID:22323840
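The core issue with length-biased data can be seen in a few lines: when sampling probability is proportional to length, the naive sample mean is inflated, and a weighted (harmonic-mean) estimator corrects it. This toy uses the fact that a length-biased draw from an exponential with mean mu follows a Gamma(shape 2, scale mu) distribution; it illustrates the bias itself, not the paper's EM algorithms.

```python
import random

random.seed(0)
mu = 2.0  # true mean of the underlying exponential lifetime distribution

# Under length-biased sampling, a lifetime from Exp(mean=mu) is observed with
# probability proportional to its length; the observed lengths then follow a
# Gamma distribution with shape 2 and scale mu.
sample = [random.gammavariate(2.0, mu) for _ in range(200000)]

naive = sum(sample) / len(sample)                       # converges to 2*mu, not mu
corrected = len(sample) / sum(1.0 / x for x in sample)  # harmonic-mean estimate of mu
```

The harmonic-mean correction is the simplest inverse-length weighting; the paper's full-likelihood EM estimators additionally handle right censoring and covariates.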
Intonation as an interface between language and affect.
Grandjean, Didier; Bänziger, Tanja; Scherer, Klaus R
2006-01-01
The vocal expression of human emotions is embedded within language and the study of intonation has to take into account two interacting levels of information--emotional and semantic meaning. In addition to the discussion of this dual coding system, an extension of Brunswik's lens model is proposed. This model includes the influences of conventions, norms, and display rules (pull effects) and psychobiological mechanisms (push effects) on emotional vocalizations produced by the speaker (encoding) and the reciprocal influences of these two aspects on attributions made by the listener (decoding), allowing the dissociation and systematic study of the production and perception of intonation. Three empirical studies are described as examples of possibilities of dissociating these different phenomena at the behavioral and neurological levels in the study of intonation.
A discrete element method-based approach to predict the breakage of coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, Varun; Sun, Xin; Xu, Wei
Pulverization is an essential pre-combustion technique employed for solid fuels, such as coal, to reduce particle sizes. Smaller particles ensure rapid and complete combustion, leading to low carbon emissions. Traditionally, the resulting particle size distributions from pulverizers have been determined by empirical or semi-empirical approaches that rely on extensive data gathered over several decades during operations or experiments, with limited predictive capabilities for new coals and processes. Our work presents a Discrete Element Method (DEM)-based computational approach to model coal particle breakage with experimentally characterized coal physical properties. We also examined the effect of select operating parameters on the breakage behavior of coal particles.
A discrete element method-based approach to predict the breakage of coal
Gupta, Varun; Sun, Xin; Xu, Wei; ...
2017-08-05
Pulverization is an essential pre-combustion technique employed for solid fuels, such as coal, to reduce particle sizes. Smaller particles ensure rapid and complete combustion, leading to low carbon emissions. Traditionally, the resulting particle size distributions from pulverizers have been determined by empirical or semi-empirical approaches that rely on extensive data gathered over several decades during operations or experiments, with limited predictive capabilities for new coals and processes. Our work presents a Discrete Element Method (DEM)-based computational approach to model coal particle breakage with experimentally characterized coal physical properties. We also examined the effect of select operating parameters on the breakage behavior of coal particles.
Scarbecz, Mark
2004-11-01
Despite some important differences, relationships among dental team members bear striking similarities to marital relationships. Empirical research on marital interaction can be useful in enhancing relationships among dental team members. As with marriage, it is unrealistic to expect that conflict and differences of opinion will never occur among dental team members. However, a set of principles derived from extensive, empirical, behavioral science research on marital interaction can provide dental teams with strategies for strengthening working relationships and managing conflict. Benefits of using these principles may include a reduction in employee turnover, improvements in efficiency and productivity, and the creation of an environment that helps attract and retain patients.
ERIC Educational Resources Information Center
Horner, Robert H.; Kincaid, Donald; Sugai, George; Lewis, Timothy; Eber, Lucille; Barrett, Susan; Dickey, Celeste Rossetto; Richter, Mary; Sullivan, Erin; Boezio, Cyndi; Algozzine, Bob; Reynolds, Heather; Johnson, Nanci
2014-01-01
Scaling of evidence-based practices in education has received extensive discussion but little empirical evaluation. We present here a descriptive summary of the experience from seven states with a history of implementing and scaling School-Wide Positive Behavioral Interventions and Supports (SWPBIS) over the past decade. Each state has been…
Leadership Matters: Teachers' Roles in School Decision Making and School Performance
ERIC Educational Resources Information Center
Ingersoll, Richard M.; Sirinides, Philip; Dougherty, Patrick
2018-01-01
Given the prominence of both instructional leadership and teacher leadership in the realms of school reform and policy, not surprisingly, both have also been the focus of extensive empirical research. But there have been limits to this research. It is, for example, unclear which of the many key elements of instructional leadership are more, or…
ERIC Educational Resources Information Center
Lin, Yu-Wei; Zini, Enrico
2008-01-01
This empirical paper shows how free/libre open source software (FLOSS) contributes to mutual and collaborative learning in an educational environment. Unlike proprietary software, FLOSS allows extensive customisation of software to support the needs of local users better. This also allows users to participate more proactively in the development…
Putting Children Front and Center: Building Coordinated Social Policy for America's Children
ERIC Educational Resources Information Center
Zaff, Jonathan F.; Smerdon, Becky
2009-01-01
In this article, we argue that policymakers in America should reference a coherent, comprehensive, and child-centered framework for children. That is, based on an extensive review of the empirical literature on the first two decades of life, we conclude that policies should address the needs of young people throughout the first two decades of…
ERIC Educational Resources Information Center
Lim, Lois; Oei, Adam C.
2015-01-01
Despite the widespread use of Orton-Gillingham (OG) based approaches to dyslexia remediation, empirical support documenting its effectiveness is lacking. Recently, Chia and Houghton demonstrated the effectiveness of the OG approach for remediation of dyslexia in Singapore. As a conceptual replication and extension of that research, we report…
What We Are Learning about How the Brain Learns-Implications for the Use of Video in the Classroom.
ERIC Educational Resources Information Center
Davidson, Tom; McKenzie, Barbara K.
2000-01-01
Describes empirical research in the fields of neurology and cognitive science that is being conducted to determine how and why the brain learns. Explains ways that video is compatible with how the brain learns and suggests it should be used more extensively by teachers and library media specialists. (LRW)
Worldwide Ocean Optics Database (WOOD)
2002-09-30
The database will include inherent and apparent optical properties, including diffuse attenuation, beam attenuation, and scattering. Computed results will be derived using empirical algorithms (e.g., beam attenuation estimated from diffuse attenuation and backscatter data), and error estimates will be provided for these computed results. Data from ONR-funded bio-optical cruises will be given priority for loading.
ERIC Educational Resources Information Center
Hofman, Roelande H.; de Boom, Jan; Meeuwisse, Marieke; Hofman, W. H. Adriaan
2013-01-01
Despite the extensive literature on educational innovations, there is only limited empirical research available into the impact of innovations on student achievement. In this article, the following research questions will be answered: What form do innovations in secondary education take, are there types of innovative schools, and what effect do…
An Empirical Look at Recipient Benefits Associated with a University-Issued Student Leadership Award
ERIC Educational Resources Information Center
Adams, Robyn L.
2012-01-01
Within academia there is an elaborate and extensive system of awards for both students and faculty (Frey, 2006). Although the majority of student-based awards are for outstanding leadership and related accomplishments, there has been virtually no research on the impact of receiving such a leadership award (Frey, 2006). Due to the conspicuous…
The Educational Use of Social Annotation Tools in Higher Education: A Literature Review
ERIC Educational Resources Information Center
Novak, Elena; Razzouk, Rim; Johnson, Tristan E.
2012-01-01
This paper presents a literature review of empirical research related to the use and effect of online social annotation (SA) tools in higher education settings. SA technology is an emerging educational technology that has not yet been extensively used and examined in education. As such, the research focusing on this technology is still very…
ERIC Educational Resources Information Center
Desjardins, Richard; Ederer, Peer
2015-01-01
This article explores the relative importance of different socio-demographic and practice-oriented factors that are related to proficiency in problem solving in technology-rich environments (PSTREs) and by extension may be related to complex problem solving (CPS). The empirical analysis focuses on the proficiency measurements of PSTRE made…
Equality of Opportunity and Equality of Outcome
ERIC Educational Resources Information Center
Kodelja, Zdenko
2016-01-01
The report on the findings of extensive empirical research on equality of educational opportunities carried out in the United States on a very large sample of public schools by Coleman and his colleagues has had a major impact on education policy and has given rise to a large amount of research and various interpretations. However, as some…
An Econometric Examination of the Behavioral Perspective Model in the Context of Norwegian Retailing
ERIC Educational Resources Information Center
Sigurdsson, Valdimar; Kahamseh, Saeed; Gunnarsson, Didrik; Larsen, Nils Magne; Foxall, Gordon R.
2013-01-01
The behavioral perspective model's (BPM; Foxall, 1990) retailing literature is built on extensive empirical research and techniques that were originally refined in choice experiments in behavioral economics and behavior analysis, and then tested mostly on British consumer panel data. We test the BPM in the context of Norwegian retailing. This…
Modeling wildland fire propagation with level set methods
V. Mallet; D.E Keyes; F.E. Fendell
2009-01-01
Level set methods are versatile and extensible techniques for general front tracking problems, including the practically important problem of predicting the advance of a fire front across expanses of surface vegetation. Given a rule, empirical or otherwise, to specify the rate of advance of an infinitesimal segment of fire front arc normal to itself (i.e., given the...
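The rule the authors refer to prescribes a normal speed F for each front segment; in level set form this is the evolution phi_t + F*|grad phi| = 0. A minimal sketch with a constant spread rate and first-order upwind differences follows (grid size, time step, and F are arbitrary illustration values, not a fire-spread model):

```python
import math

# Level set evolution phi_t + F |grad phi| = 0 for a circular front spreading
# at constant normal speed F. First-order upwind scheme; CFL: F*dt/h <= 1.
n, h, F, dt, steps = 101, 0.02, 1.0, 0.005, 40
cx = cy = (n // 2) * h
# Signed distance to the initial front of radius 0.3 (negative = burned area).
phi = [[math.hypot(i * h - cx, j * h - cy) - 0.3 for j in range(n)]
       for i in range(n)]

for _ in range(steps):
    new = [row[:] for row in phi]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            dxm = (phi[i][j] - phi[i - 1][j]) / h   # backward differences
            dxp = (phi[i + 1][j] - phi[i][j]) / h   # forward differences
            dym = (phi[i][j] - phi[i][j - 1]) / h
            dyp = (phi[i][j + 1] - phi[i][j]) / h
            # Upwind gradient magnitude for an outward-moving front (F > 0).
            grad = math.sqrt(max(dxm, 0.0) ** 2 + min(dxp, 0.0) ** 2 +
                             max(dym, 0.0) ** 2 + min(dyp, 0.0) ** 2)
            new[i][j] = phi[i][j] - dt * F * grad
    phi = new
# After steps*dt = 0.2 time units the front radius has grown to about 0.5.
```

The fire front at any time is the zero contour of phi; replacing the constant F with an empirical spread-rate rule depending on fuel, wind, and slope gives the formulation discussed in the paper.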
Callinan, R B; Sammut, J; Fraser, G C
2005-02-28
Severe dermatitis and branchitis are described in a wild population of empire gudgeon Hypseleotris compressa, an Australian eleotrid, exposed naturally to runoff from acid sulfate soils (ASS) in a drained estuarine embayment in eastern Australia. After at least 2 d exposure to pH < 4, and up to 7 d exposure to pH < 6, approximately 50% of the fish sampled had moderate to severe diffuse epidermal hyperplasia, usually at scale margins, and scattered areas of moderate to severe, focal to locally extensive, subacute, necrotising dermatitis. Saprolegnia spp. had invaded epidermis in some inflamed areas. In gills, there was moderate to severe hyperplasia and necrosis of secondary lamellar epithelium, with fusion of adjacent secondary lamellae. Inorganic monomeric aluminium and calcium concentrations in water at the site during the event were 27.7 and 16.6 mg l(-1), respectively. Large numbers of empire gudgeons at the study site had died after at least 8 d exposure to pH < 4, and up to 13 d exposure to pH < 6. These findings provide clear evidence that acidification of estuarine systems by runoff from ASS has deleterious effects on aquatic biota. Furthermore, study findings suggest a mechanism whereby lesions of epizootic ulcerative syndrome (EUS) may be initiated in estuarine fishes by a combination of sublethal exposure to ASS runoff and Aphanomyces invadans infection, a suggestion consistent with the geographic and temporal distribution of EUS outbreaks in Australian estuaries.
Risky forward interest rates and swaptions: Quantum finance model and empirical results
NASA Astrophysics Data System (ADS)
Baaquie, Belal Ehsan; Yu, Miao; Bhanap, Jitendra
2018-02-01
Risk-free forward interest rates (Diebold and Li, 2006 [1]; Jamshidian, 1991 [2]), realized most prominently by US Treasury bonds, have been studied extensively. In Baaquie (2010), models of risk-free bonds and their forward interest rates based on a quantum field theoretic formulation were discussed, including the empirical evidence supporting these models. Here, the quantum finance formulation of risk-free forward interest rates is extended to the case of risky forward interest rates, with the Singapore and Malaysian forward interest rates as specific cases. The main feature of the quantum finance model is that the risky forward interest rates are modeled both (a) as a stand-alone case and (b) as driven by the US forward interest rates plus a spread, having its own term structure, above the US rates. Both the US forward interest rates and the term structure of the spread are modeled by a two-dimensional Euclidean quantum field. As a precursor to the evaluation of a put option on a Singapore coupon bond, the quantum finance model for swaptions is tested against empirical data on US Dollar swaptions, showing that the model is quite accurate, and a prediction for the market price of the put option on Singapore coupon bonds is obtained. The model is generalized to the Malaysian case, where the forward interest rates are shown to have anomalies absent in the US and Singapore data, and a prediction for a Malaysian interest rate swap is obtained.
Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques
2012-09-01
The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, an alternative implementation to the algorithmic definition of the so-called "sifting process" used in Huang's original EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Several 2-D extensions of the EMD method have recently been proposed; despite some effort, these 2-D versions perform poorly and are very time-consuming. In this paper, an extension of the PDE-based approach to 2-D space is therefore described in detail. The approach has been applied to both signal and image decomposition, and the obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
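For readers unfamiliar with the algorithmic "sifting process" that the PDE approach replaces: a single sift estimates upper and lower envelopes through the local extrema and subtracts their mean. The sketch below uses linear interpolation for the envelopes (Huang's original method uses cubic splines) and assumes the input has interior extrema:

```python
import math

def sift_once(x):
    """One sifting step: subtract the mean of the upper and lower envelopes.
    Linear interpolation stands in for the cubic splines of Huang's method."""
    n = len(x)
    maxi = [i for i in range(1, n - 1) if x[i - 1] < x[i] >= x[i + 1]]
    mini = [i for i in range(1, n - 1) if x[i - 1] > x[i] <= x[i + 1]]

    def envelope(idx):
        # Piecewise-linear curve through the extrema, pinned at both endpoints.
        pts = [0] + idx + [n - 1]
        vals = [x[p] for p in pts]
        env, k = [], 0
        for i in range(n):
            while k < len(pts) - 2 and i > pts[k + 1]:
                k += 1
            t = (i - pts[k]) / (pts[k + 1] - pts[k])
            env.append(vals[k] + t * (vals[k + 1] - vals[k]))
        return env

    upper, lower = envelope(maxi), envelope(mini)
    return [xi - (u + l) / 2.0 for xi, u, l in zip(x, upper, lower)]

# Demo: a fast oscillation riding on a slow linear trend; one sift removes
# most of the trend (the estimated mean envelope) and keeps the oscillation.
signal = [math.sin(0.5 * i) + 0.05 * i for i in range(200)]
imf_candidate = sift_once(signal)
```

Iterating this step until a stopping criterion yields one intrinsic mode function; the paper's contribution is to replace the envelope-mean step with a PDE-based diffusion filter that is better suited to 2-D data.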
A glacier runoff extension to the Precipitation Runoff Modeling System
Van Beusekom, Ashley E.; Viger, Roland
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while maintaining model usability. PRMSglacier is validated on two Alaskan basins, the Wolverine and Gulkana Glacier basins, which have been studied since 1966 and have a substantial amount of data with which to test model performance over a long period of time covering a wide range of climatic and hydrologic conditions. When error in field measurements is considered, the Nash-Sutcliffe efficiencies of streamflow are 0.87 and 0.86, the absolute bias fractions of the winter mass balance simulations are 0.10 and 0.08, and the absolute bias fractions of the summer mass balances are 0.01 and 0.03, all computed over 42 years for the Wolverine and Gulkana Glacier basins, respectively. Without taking into account measurement error, the values are still within the range achieved by the more computationally expensive codes tested over shorter time periods.
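The Nash-Sutcliffe efficiency used to score the streamflow simulations compares model error against a mean-only baseline; a value of 1.0 is a perfect fit and 0.0 means the model is no better than always predicting the observed mean:

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of obs about its mean.
    1.0 is a perfect fit; 0.0 means no better than predicting the mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_mean = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / ss_mean
```

On this scale, the reported streamflow efficiencies of 0.87 and 0.86 indicate that PRMSglacier explains most of the observed variance in both basins.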
Out-of-Sample Extensions for Non-Parametric Kernel Methods.
Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang
2017-02-01
Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies were devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to utilize the information of the data, which may potentially characterize the data similarity better. The kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension for the out-of-sample data points, and thus cannot be applied to inductive learning. In this paper, we show how to make the nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over the state-of-the-art parametric kernel methods.
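The regression idea can be sketched concretely: fit each column of the learned kernel matrix by kernel ridge regression over a base RBF kernel, then evaluate the fitted functions at the new point. This is a simplified reading of the approach, with arbitrary gamma and ridge parameters; a sanity check is that when the "learned" kernel is the base kernel itself, the extension reproduces the base kernel's values at the new point.

```python
import math

def rbf(a, b, gamma=1.0):
    """Base RBF kernel between two points given as coordinate lists."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def solve(A, rhs):
    """Gaussian elimination with partial pivoting (fine for tiny systems)."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def extend_kernel(X, K_learned, z, lam=1e-6, gamma=1.0):
    """Out-of-sample row of a nonparametric kernel: regress each learned-kernel
    column on the base kernel (kernel ridge regression), then evaluate at z."""
    n = len(X)
    K0 = [[rbf(X[i], X[j], gamma) + (lam if i == j else 0.0) for j in range(n)]
          for i in range(n)]
    kz = [rbf(z, X[i], gamma) for i in range(n)]
    row = []
    for j in range(n):
        alpha = solve(K0, [K_learned[i][j] for i in range(n)])
        row.append(sum(kz[i] * alpha[i] for i in range(n)))
    return row

# Sanity check: if the "learned" kernel is the base kernel itself, the
# extension should reproduce the base kernel's values at the new point.
X = [[0.0], [1.0], [2.0]]
K = [[rbf(X[i], X[j]) for j in range(3)] for i in range(3)]
row = extend_kernel(X, K, [0.5])
```

In the inductive setting this gives a kernel function defined everywhere, so a transductively learned nonparametric kernel can be applied to points unseen at training time.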
NASA Technical Reports Server (NTRS)
Bergrun, N. R.
1951-01-01
An empirical method for the determination of the area, rate, and distribution of water-drop impingement on airfoils of arbitrary section is presented. The procedure represents an initial step toward the development of a method which is generally applicable in the design of thermal ice-prevention equipment for airplane wing and tail surfaces. Results given by the proposed empirical method are expected to be sufficiently accurate for the purpose of heated-wing design, and can be obtained from a few numerical computations once the velocity distribution over the airfoil has been determined. The empirical method presented for incompressible flow is based on results of extensive water-drop trajectory computations for five airfoil cases which consisted of 15-percent-thick airfoils encompassing a moderate lift-coefficient range. The differential equations pertaining to the paths of the drops were solved by a differential analyzer. The method developed for incompressible flow is extended to the calculation of area and rate of impingement on straight wings in subsonic compressible flow to indicate the probable effects of compressibility for airfoils at low subsonic Mach numbers.
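The drop-trajectory equations solved on the differential analyzer balance droplet inertia against aerodynamic drag. A toy forward-Euler version with Stokes-like drag and a uniform airstream gives the flavor (the real computations use the airfoil's velocity field; tau, the speeds, and the step sizes here are arbitrary illustration values):

```python
# Forward-Euler integration of a water drop relaxing toward the airstream
# under Stokes-like drag: dv/dt = (u_air - v) / tau. A uniform free stream
# stands in for the airfoil's potential-flow velocity field.
tau, dt, nsteps = 0.01, 0.001, 2000
u_air = (50.0, 0.0)                  # assumed free-stream velocity, m/s
pos, vel = [0.0, 0.0], [0.0, 5.0]    # drop launched with cross-stream motion
for _ in range(nsteps):
    vel[0] += dt * (u_air[0] - vel[0]) / tau
    vel[1] += dt * (u_air[1] - vel[1]) / tau
    pos[0] += dt * vel[0]
    pos[1] += dt * vel[1]
# The drop forgets its initial velocity on the timescale tau: cross-stream
# drift saturates near vel_y0 * tau while the drop is swept downstream.
```

Because the inertial timescale tau grows with drop size, larger drops deviate more from the curved streamlines around an airfoil and impinge over a wider area, which is exactly what the impingement-area correlations capture.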
Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers
García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta
2016-01-01
The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine. PMID:28773653
Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers.
García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta
2016-06-29
The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine.
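The span-versus-RMR classification task can be illustrated with a deliberately simple stand-in for the paper's SVM and extreme learning machine classifiers: a nearest-centroid rule on invented (critical span, rock mass rating) cases. The data points below are illustrative only, not drawn from the Canadian cut-and-fill database:

```python
# Hypothetical case histories as (critical span in metres, rock mass rating);
# the values are invented so the toy works without feature scaling.
stable = [(3.0, 75.0), (5.0, 80.0), (8.0, 85.0)]
unstable = [(10.0, 40.0), (12.0, 55.0), (15.0, 60.0)]

def centroid(points):
    return tuple(sum(c) / len(points) for c in zip(*points))

c_stable, c_unstable = centroid(stable), centroid(unstable)

def classify(span, rmr):
    """Nearest-centroid rule: a simple stand-in for the SVM and extreme
    learning machine classifiers used to partition the span graph."""
    d_s = (span - c_stable[0]) ** 2 + (rmr - c_stable[1]) ** 2
    d_u = (span - c_unstable[0]) ** 2 + (rmr - c_unstable[1]) ** 2
    return "stable" if d_s < d_u else "unstable"
```

A real implementation would scale the two axes before measuring distance and, as the paper argues, would use a classifier that makes no assumption about the shape of the boundary; the workflow of adding new field observations and re-fitting stays the same.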
Theoretical and Empirical Analysis of a Spatial EA Parallel Boosting Algorithm.
Kamath, Uday; Domeniconi, Carlotta; De Jong, Kenneth
2018-01-01
Many real-world problems involve massive amounts of data. Under these circumstances learning algorithms often become prohibitively expensive, making scalability a pressing issue to be addressed. A common approach is to perform sampling to reduce the size of the dataset and enable efficient learning. Alternatively, one customizes learning algorithms to achieve scalability. In either case, the key challenge is to obtain algorithmic efficiency without compromising the quality of the results. In this article we discuss a meta-learning algorithm (PSBML) that combines concepts from spatially structured evolutionary algorithms (SSEAs) with concepts from ensemble and boosting methodologies to achieve the desired scalability property. We present both theoretical and empirical analyses which show that PSBML preserves a critical property of boosting, specifically, convergence to a distribution centered around the margin. We then present additional empirical analyses showing that this meta-level algorithm provides a general and effective framework that can be used in combination with a variety of learning classifiers. We perform extensive experiments to investigate the trade-off achieved between scalability and accuracy, and robustness to noise, on both synthetic and real-world data. These empirical results corroborate our theoretical analysis, and demonstrate the potential of PSBML in achieving scalability without sacrificing accuracy.
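The margin-concentration property examined in the paper can be seen in miniature with plain AdaBoost on six 1-D points: after a few rounds, the sample weights pile up on the two examples whose labels clash with their neighbors. This toy shows the boosting behavior PSBML is proven to preserve, not PSBML itself:

```python
import math

# Toy AdaBoost with decision stumps. Labels overlap near x = 2-3 (the +1 at
# x=2 and the -1 at x=3 sit on the wrong side of their neighbors), so the
# weights progressively concentrate on those two margin examples.
X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [-1, -1, 1, -1, 1, 1]
w = [1.0 / len(X)] * len(X)

def best_stump(w):
    """Weighted-error-minimizing threshold rule sign * (x > thr ? 1 : -1)."""
    best = None
    for thr in (0.5, 1.5, 2.5, 3.5, 4.5):
        for sign in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if sign * (1 if xi > thr else -1) != yi)
            if best is None or err < best[0]:
                best = (err, thr, sign)
    return best

for _ in range(3):  # three boosting rounds
    err, thr, sign = best_stump(w)
    alpha = 0.5 * math.log((1 - err) / max(err, 1e-9))
    for i, (xi, yi) in enumerate(zip(X, y)):
        pred = sign * (1 if xi > thr else -1)
        w[i] *= math.exp(-alpha * yi * pred)  # up-weight mistakes
    total = sum(w)
    w = [wi / total for wi in w]
```

PSBML obtains the same concentration effect implicitly, through the local competition and replication dynamics of its spatial grid rather than through explicit weight updates.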
Morselli, Davide; Passini, Stefano
2015-11-01
In Crimes of Obedience, Kelman and Hamilton argue that societies can be protected from the degeneration of authority only when citizenship is based on a strong values orientation. This reference to values may be the weakest point in their theory, because they do not explicitly define these values. Nevertheless, their empirical findings suggest that the authors are referring to specific democratic principles and universal values (e.g., equality, fairness, harmlessness). In this article, a composite index known as the value-oriented citizenship (VOC) index is introduced and empirically analysed. The results confirm that the VOC index discriminates between people who relate to authority based on values rather than on their role or on rules in general. The article discusses the utility of the VOC index for developing Kelman and Hamilton's framework further empirically, as well as its implications for the analysis of the relationship between individuals and authority. Copyright © 2015 Elsevier Inc. All rights reserved.
The ancient city of Rome, its empire, and the spread of tuberculosis in Europe.
Eddy, Jared J
2015-06-01
The formation of the Roman Empire constituted an unprecedented joining of Mediterranean and European lands and peoples, centering on the capital of Rome. During the late Roman Republic and early Roman Empire (ca. 200 B.C.-ca. 200 A.D.), urbanization and population growth led to conditions favorable to the spread of tuberculosis throughout Italy and especially within Rome itself. Trade and military expansion would have acted as vehicles for the further extension of tuberculosis to the provinces via direct transmission from Italian-born Romans to the native populations. However, an alternative explanation may better account for the increase in the number of archeological cases of tuberculosis at the start of the Roman era. A literature review of Roman-era cases and their locations suggests that the development of an urban, Roman way of life resulted in significant increases in prevalence in regions where tuberculosis had previously been endemic only at a low level. Copyright © 2015 Elsevier Ltd. All rights reserved.
Sorption and reemission of formaldehyde by gypsum wallboard. Report for June 1990-August 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, J.C.S.
1993-01-01
The paper gives results of an analysis of the sorption and desorption of formaldehyde by unpainted gypsum wallboard, using a mass transfer model based on the Langmuir sorption isotherm. The sorption and desorption rate constants are determined from short-term experimental data. Long-term sorption and desorption curves are developed by the mass transfer model without any adjustable parameters. Compared with other empirically developed models, the mass transfer model has more extensive applicability and elucidates the sorption and desorption mechanism in a way that empirical models cannot. The mass transfer model is also more feasible and accurate than empirical models for applications such as scale-up and exposure assessment. For a typical indoor environment, the model predicts that gypsum wallboard is a much stronger sink for formaldehyde than for other indoor air pollutants such as tetrachloroethylene and ethylbenzene. The strong sink effect is reflected by the high equilibrium capacity and the slow decay of the desorption curve.
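The Langmuir-based mass transfer picture in this abstract can be sketched with a simple mass balance: sorption fills the remaining surface capacity while desorption releases sorbed mass. The rate constants and concentrations below are hypothetical placeholders, not the report's fitted values:

```python
# Langmuir-type sink model: uptake rate depends on remaining surface capacity.
#   dM/dt = ka * C * (Mmax - M) - kd * M
# M: sorbed mass, C: air concentration, ka/kd: sorption/desorption rate constants.
def simulate_sorption(ka, kd, c_air, m_max, m0=0.0, dt=0.1, steps=1000):
    """Explicit-Euler integration of the sorbed-mass balance; returns the trajectory."""
    m = m0
    history = []
    for _ in range(steps):
        dm = ka * c_air * (m_max - m) - kd * m
        m += dm * dt
        history.append(m)
    return history

def equilibrium_mass(ka, kd, c_air, m_max):
    """Equilibrium capacity obtained by setting dM/dt = 0."""
    return ka * c_air * m_max / (ka * c_air + kd)
```

Setting dM/dt = 0 gives the equilibrium capacity the abstract refers to; the simulated sorption curve relaxes toward that value, and a slow decay of the desorption branch corresponds to a small kd.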
ERIC Educational Resources Information Center
Taylor, Lauren J.; Maybery, Murray T.; Wray, John; Ravine, David; Hunt, Anna; Whitehouse, Andrew J. O.
2013-01-01
Extensive empirical evidence indicates that the lesser variant of Autism Spectrum Disorders (ASD) involves a communication impairment that is similar to, but milder than, the deficit in clinical ASD. This research explored the relationship between the broader autism phenotype (BAP) among parents, an index of genetic liability for ASD, and proband…
Artistry and Analysis: Student Experiences of UK Practice-Based Doctorates in Art and Design
ERIC Educational Resources Information Center
Collinson, Jacquelyn Allen
2005-01-01
During the last decade, doctoral education has been the focus of much international academic attention. This period has also witnessed the rapid growth of practice-based research degrees in art and design in the UK. To date, however, there has been no extensive empirical research on the subjective experiences of students undertaking this form of…
An Empirical Analysis of the Navy Junior Reserve Officer Training Corps (NJROTC)
1987-12-01
…the following: 1. The overall total; 2. The row totals; 3. The column totals. "Supertables" were also extensively used. A supertable is essentially a collection
ERIC Educational Resources Information Center
Polit, Denise; And Others
To expand the use of women in nontraditional industrial careers, the U.S. Air Force examined the questions of recruiting, selecting, and training women for traditionally male blue collar work. An extensive review of the literature revealed that little empirical data on the effectiveness of various administrative policies had been collected. The…
Anthony H. Conner; Melissa S. Reeves
2001-01-01
Computational chemistry methods can be used to explore the theoretical chemistry behind reactive systems, to compare the relative chemical reactivity of different systems, and, by extension, to predict the reactivity of new systems. Ongoing research has focused on the reactivity of a wide variety of phenolic compounds with formaldehyde using semi-empirical and ab...
ERIC Educational Resources Information Center
Shupe, Ellen I.; Pung, Stephanie K.
2011-01-01
Although issues related to the role of librarians have long been discussed in the literature on academic librarianship, there has been little attempt to incorporate the extensive psychological theory and research on role-related issues. In the current article we review the empirical literature on the role of librarians, with a particular focus on…
ERIC Educational Resources Information Center
Dymond, Simon; Alonso-Alvarez, Benigno
2010-01-01
In a recent article, Schlinger (2008) marked the 50th anniversary of the publication of Skinner's "Verbal Behavior" (1957) by considering its impact on the field of behaviorism and research on verbal behavior. In the present article, we comment on Schlinger's conclusions regarding the impact of the book and highlight the extensions and…
NASA Technical Reports Server (NTRS)
Morris, Carl N.
1987-01-01
Motivated by the LANDSAT problem of estimating the probability of crop or geological types based on multi-channel satellite imagery data, Morris and Kostal (1983), Hill, Hinkley, Kostal, and Morris (1984), and Morris, Hinkley, and Johnston (1985) developed an empirical Bayes approach to this problem. Here, researchers return to those developments, making certain improvements and extensions, but restricting attention to the binary case of only two attributes.
ERIC Educational Resources Information Center
Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Martin, Toby L.; Julio, Flavia
2012-01-01
Mario Bunge is one of the most prolific philosophers of our time. Over the past sixty years he has written extensively about semantics, ontology, epistemology, philosophy of science and ethics. Bunge has been interested in the philosophical and methodological implications of modern psychology and more specifically in the philosophies of the…
Good Practices for Learning to Recognize Actions Using FV and VLAD.
Wu, Jianxin; Zhang, Yu; Lin, Weiyao
2016-12-01
High dimensional representations such as Fisher vectors (FV) and vectors of locally aggregated descriptors (VLAD) have shown state-of-the-art accuracy for action recognition in videos. The high dimensionality, on the other hand, also causes computational difficulties when scaling up to large-scale video data. This paper makes three lines of contributions to learning to recognize actions using high dimensional representations. First, we reviewed several existing techniques that improve upon FV or VLAD in image classification, and performed extensive empirical evaluations to assess their applicability for action recognition. Our analyses of these empirical results show that normality and bimodality are essential to achieve high accuracy. Second, we proposed a new pooling strategy for VLAD and three simple, efficient, and effective transformations for both FV and VLAD. Both proposed methods have shown higher accuracy than the original FV/VLAD method in extensive evaluations. Third, we proposed and evaluated new feature selection and compression methods for the FV and VLAD representations. This strategy uses only 4% of the storage of the original representation, but achieves comparable or even higher accuracy. Based on these contributions, we recommend a set of good practices for action recognition in videos for practitioners in this field.
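A standard example of the kind of transformation used to improve the normality of FV/VLAD vectors is the signed square root ("power normalization") followed by L2 normalization. The sketch below shows that widely used recipe; it is not necessarily one of the three transformations proposed in the paper:

```python
import math

def power_l2_normalize(v, alpha=0.5):
    """Signed power normalization followed by L2 normalization.

    Damps the peaky, heavy-tailed components of FV/VLAD-style vectors
    (alpha=0.5 is the common signed-square-root choice), then rescales
    the result to unit Euclidean length.
    """
    powered = [math.copysign(abs(x) ** alpha, x) for x in v]
    norm = math.sqrt(sum(x * x for x in powered)) or 1.0  # guard the all-zero vector
    return [x / norm for x in powered]
```

After this transformation, dot products between descriptors behave more like comparisons of normally distributed features, which is consistent with the paper's observation that normality matters for accuracy.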
The Song Remains the Same: A Replication and Extension of the MUSIC Model.
Rentfrow, Peter J; Goldberg, Lewis R; Stillwell, David J; Kosinski, Michal; Gosling, Samuel D; Levitin, Daniel J
2012-12-01
There is overwhelming anecdotal and empirical evidence for individual differences in musical preferences. However, little is known about what drives those preferences. Are people drawn to particular musical genres (e.g., rap, jazz) or to certain musical properties (e.g., lively, loud)? Recent findings suggest that musical preferences can be conceptualized in terms of five orthogonal dimensions: Mellow, Unpretentious, Sophisticated, Intense, and Contemporary (conveniently, MUSIC). The aim of the present research is to replicate and extend that work by empirically examining the hypothesis that musical preferences are based on preferences for particular musical properties and psychological attributes as opposed to musical genres. Findings from Study 1 replicated the five-factor MUSIC structure using musical excerpts from a variety of genres and subgenres and revealed musical attributes that differentiate each factor. Results from Studies 2 and 3 show that the MUSIC structure is recoverable using musical pieces from only the jazz and rock genres, respectively. Taken together, the current work provides strong evidence that preferences for music are determined by specific musical attributes and that the MUSIC model is a robust framework for conceptualizing and measuring such preferences.
Light during darkness and cancer: relationships in circadian photoreception and tumor biology.
Jasser, Samar A; Blask, David E; Brainard, George C
2006-05-01
The relationship between circadian phototransduction and circadian-regulated processes is poorly understood. Melatonin, commonly a circadian phase marker, may play a direct role in a myriad of physiologic processes. The circadian rhythm for pineal melatonin secretion is regulated by the hypothalamic suprachiasmatic nucleus (SCN). Its neural source of light input is a unique subset of intrinsically photosensitive retinal ganglion cells expressing melanopsin, the primary circadian photopigment in rodents and primates. Action spectra of melatonin suppression by light have shown that light in the 446-477 nm range, distinct from the visual system's peak sensitivity, is optimal for stimulating the human circadian system. Breast cancer is the oncological disease entity whose relationship to circadian rhythm fluctuations has perhaps been most extensively studied. Empirical data has increasingly supported the hypothesis that higher risk of breast cancer in industrialized countries is partly due to increased exposure to light at night. Studies of tumor biology implicate melatonin as a potential mediator of this effect. Yet, causality between lifestyle factors and circadian tumor biology remains elusive and likely reflects significant variability with physiologic context. Continued rigorous empirical inquiry into the physiology and clinical implications of these habitual, integrated aspects of life is highly warranted at this time.
A discrete element method-based approach to predict the breakage of coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, Varun; Sun, Xin; Xu, Wei
Pulverization is an essential pre-combustion technique employed for solid fuels, such as coal, to reduce particle sizes. Smaller particles ensure rapid and complete combustion, leading to low carbon emissions. Traditionally, the resulting particle size distributions from pulverizers have been informed by empirical or semi-empirical approaches that rely on extensive data gathered over several decades during operations or experiments. However, the predictive capabilities for new coals and processes are limited. This work presents a Discrete Element Method based computational framework to predict the particle size distribution resulting from the breakage of coal particles, characterized by the coal's physical properties. The effect of certain operating parameters on the breakage behavior of coal particles is also examined.
Empirical mode decomposition-based facial pose estimation inside video sequences
NASA Astrophysics Data System (ADS)
Qing, Chunmei; Jiang, Jianmin; Yang, Zhijing
2010-03-01
We describe a new pose-estimation algorithm that integrates the strengths of both empirical mode decomposition (EMD) and mutual information. While mutual information is exploited to measure the similarity between facial images to estimate poses, EMD is exploited to decompose input facial images into a number of intrinsic mode function (IMF) components, which redistribute the effects of noise, expression changes, and illumination variations such that, when the input facial image is described by the selected IMF components, all these negative effects can be minimized. Extensive experiments were carried out in comparison to existing representative techniques, and the results show that the proposed algorithm achieves better pose-estimation performance, with robustness to noise corruption, illumination variation, and facial expressions.
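The mutual-information similarity measure used here for comparing facial images can be sketched with a plain histogram estimator. The binning scheme below is a hypothetical simplification of what an implementation might use:

```python
import math
from collections import Counter

def mutual_information(xs, ys, bins=8):
    """Histogram-based MI estimate (in nats) between two equal-length intensity sequences."""
    assert len(xs) == len(ys)

    def bin_index(v, lo, hi):
        # Map a value into one of `bins` equal-width bins over [lo, hi].
        if hi == lo:
            return 0
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)

    bx = [bin_index(v, min(xs), max(xs)) for v in xs]
    by = [bin_index(v, min(ys), max(ys)) for v in ys]
    n = len(xs)
    pxy, px, py = Counter(zip(bx, by)), Counter(bx), Counter(by)
    mi = 0.0
    for (i, j), c in pxy.items():
        p = c / n
        # p * log( p(x,y) / (p(x) * p(y)) ), with counts converted to probabilities.
        mi += p * math.log(p * n * n / (px[i] * py[j]))
    return mi
```

Two identical sequences yield the maximum MI achievable for the chosen binning, while a constant sequence carries no information about the other; a pose estimator of this kind would pick the reference pose whose image maximizes MI with the input.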
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chremos, Alexandros, E-mail: achremos@imperial.ac.uk; Nikoubashman, Arash, E-mail: arashn@princeton.edu; Panagiotopoulos, Athanassios Z.
In this contribution, we develop a coarse-graining methodology for mapping specific block copolymer systems to bead-spring particle-based models. We map the constituent Kuhn segments to Lennard-Jones particles, and establish a semi-empirical correlation between the experimentally determined Flory-Huggins parameter χ and the interaction of the model potential. For these purposes, we have performed an extensive set of isobaric-isothermal Monte Carlo simulations of binary mixtures of Lennard-Jones particles with the same size but with asymmetric energetic parameters. The phase behavior of these monomeric mixtures is then extended to chains with finite sizes through theoretical considerations. Such a top-down coarse-graining approach is important from a computational point of view, since many characteristic features of block copolymer systems are on time and length scales which are still inaccessible through fully atomistic simulations. We demonstrate the applicability of our method for generating parameters by reproducing the morphology diagram of a specific diblock copolymer, namely, poly(styrene-b-methyl methacrylate), which has been extensively studied in experiments.
Feldstein Ewing, Sarah W; Chung, Tammy
2013-06-01
Research on mechanisms of behavior change provides an innovative method to improve treatment for addictive behaviors. An important extension of mechanisms of change research involves the use of translational approaches, which examine how basic biological (i.e., brain-based mechanisms) and behavioral factors interact in initiating and sustaining positive behavior change as a result of psychotherapy. Articles in this special issue include integrative conceptual reviews and innovative empirical research on brain-based mechanisms that may underlie risk for addictive behaviors and response to psychotherapy from adolescence through adulthood. Review articles discuss hypothesized mechanisms of change for cognitive and behavioral therapies, mindfulness-based interventions, and neuroeconomic approaches. Empirical articles cover a range of addictive behaviors, including use of alcohol, cigarettes, marijuana, cocaine, and pathological gambling and represent a variety of imaging approaches including fMRI, magneto-encephalography, real-time fMRI, and diffusion tensor imaging. Additionally, a few empirical studies directly examine brain-based mechanisms of change, whereas others examine brain-based indicators as predictors of treatment outcome. Finally, two commentaries discuss craving as a core feature of addiction, and the importance of a developmental approach to examining mechanisms of change. Ultimately, translational research on mechanisms of behavior change holds promise for increasing understanding of how psychotherapy may modify brain structure and functioning and facilitate the initiation and maintenance of positive treatment outcomes for addictive behaviors. 2013 APA, all rights reserved
Aimé, Carla; André, Jean-Baptiste; Raymond, Michel
2017-07-01
Menopause, the permanent cessation of ovulation, occurs in humans well before the end of the expected lifespan, leading to an extensive post-reproductive period that remains a puzzle for evolutionary biologists. All human populations display this particularity; thus, it is difficult to empirically evaluate the conditions for its emergence. In this study, we used artificial neural networks to model the emergence and evolution of allocation decisions related to reproduction in simulated populations. When allocation decisions were allowed to evolve freely, both menopause and an extensive post-reproductive lifespan emerged under some ecological conditions. This result allowed us to test various hypotheses about the conditions required for the emergence of menopause and an extensive post-reproductive lifespan. Our findings did not support the Maternal Hypothesis (that menopause evolved to avoid the risk of dying in childbirth, which is higher in older women). In contrast, the results supported a prediction shared by the Grandmother Hypothesis and the Embodied Capital Model: an extensive post-reproductive lifespan allows resource reallocation that increases the fertility of children and the survival of grandchildren. Furthermore, neural capital development and the skill intensiveness of the foraging niche, rather than strength, played a major role in shaping the age profile of somatic and cognitive senescence in our simulated populations. This result supports the Embodied Capital Model over the Grandmother Hypothesis. Finally, in simulated populations where menopause had already evolved, we found that a reduced post-reproductive lifespan led to reduced fertility in children and reduced survival in grandchildren. The results are discussed in the context of the evolutionary emergence of menopause and extensive post-reproductive lifespan.
Observations of increased tropical rainfall preceded by air passage over forests.
Spracklen, D V; Arnold, S R; Taylor, C M
2012-09-13
Vegetation affects precipitation patterns by mediating moisture, energy and trace-gas fluxes between the surface and atmosphere. When forests are replaced by pasture or crops, evapotranspiration of moisture from soil and vegetation is often diminished, leading to reduced atmospheric humidity and potentially suppressing precipitation. Climate models predict that large-scale tropical deforestation causes reduced regional precipitation, although the magnitude of the effect is model and resolution dependent. In contrast, observational studies have linked deforestation to increased precipitation locally but have been unable to explore the impact of large-scale deforestation. Here we use satellite remote-sensing data of tropical precipitation and vegetation, combined with simulated atmospheric transport patterns, to assess the pan-tropical effect of forests on tropical rainfall. We find that for more than 60 per cent of the tropical land surface (latitudes 30 degrees south to 30 degrees north), air that has passed over extensive vegetation in the preceding few days produces at least twice as much rain as air that has passed over little vegetation. We demonstrate that this empirical correlation is consistent with evapotranspiration maintaining atmospheric moisture in air that passes over extensive vegetation. We combine these empirical relationships with current trends of Amazonian deforestation to estimate reductions of 12 and 21 per cent in wet-season and dry-season precipitation respectively across the Amazon basin by 2050, due to less-efficient moisture recycling. Our observation-based results complement similar estimates from climate models, in which the physical mechanisms and feedbacks at work could be explored in more detail.
2017-01-01
Although the study of gene expression variation in the absence of genetic or environmental cues, or gene expression heterogeneity, has intensified considerably in recent years, many basic and applied biological fields still remain unaware of how useful the study of gene expression heterogeneity patterns might be for the characterization of biological systems and/or processes. Largely based on the modulating effect that chromatin compaction has on gene expression heterogeneity, and on the extensive changes in chromatin compaction known to occur in specialized cells that are naturally or artificially induced to revert to less specialized states, or dedifferentiate, I recently hypothesized that processes that concur with cell dedifferentiation would show an extensive reduction in gene expression heterogeneity. Confirmation of the existence of such a trend could be of wide interest because of the biomedical and biotechnological relevance of cell dedifferentiation-based processes, i.e., regenerative development, cancer, human induced pluripotent stem cells, or plant somatic embryogenesis. Here, I report the first empirical evidence consistent with the existence of an extensive reduction in gene expression heterogeneity in processes that concur with cell dedifferentiation, obtained by analyzing transcriptome dynamics along forearm regenerative development in Ambystoma mexicanum (axolotl). I also briefly discuss the utility that the study of gene expression heterogeneity dynamics might have for the characterization of cell dedifferentiation-based processes, and for the engineering of tools that afford better monitoring and modulation of such processes. Finally, I reflect on how a transitional reduction in gene expression heterogeneity in dedifferentiated cells can promote a long-term increase in phenotypic heterogeneity following cell dedifferentiation, with potential adverse effects for biomedical and biotechnological applications. PMID:29134148
Dignity in the care of older people – a review of the theoretical and empirical literature
Gallagher, Ann; Li, Sarah; Wainwright, Paul; Jones, Ian Rees; Lee, Diana
2008-01-01
Background: Dignity has become a central concern in UK health policy in relation to older and vulnerable people. The empirical and theoretical literature relating to dignity is extensive and as likely to confound and confuse as to clarify the meaning of dignity for nurses in practice. The aim of this paper is to examine the literature critically and to address the following questions: What does dignity mean? What promotes and diminishes dignity? And how might dignity be operationalised in the care of older people? This paper critically reviews the theoretical and empirical literature relating to dignity and clarifies the meaning and implications of dignity in relation to the care of older people. If nurses are to provide dignified care, clarification is an essential first step. Methods: This is a review article, critically examining papers reporting theoretical perspectives and empirical studies relating to dignity. The following databases were searched: ASSIA, BHI, CINAHL, Social Services Abstracts, IBSS, Web of Knowledge Social Sciences Citation Index and Arts & Humanities Citation Index, supplemented by the location of books and chapters in the philosophy literature. An analytical approach was adopted to the publications reviewed, focusing on the objectives of the review. Results and discussion: We review a range of theoretical and empirical accounts of dignity and identify key dignity-promoting factors evident in the literature, including staff attitudes and behaviour; environment; culture of care; and the performance of specific care activities. Although there is scope to learn more about cultural aspects of dignity, we know a good deal about dignity in care in general terms. Conclusion: We argue that what is required is sufficient support and education to help nurses understand dignity, and adequate resources to operationalise dignity in their everyday practice. Using the themes identified in our review, we offer proposals for the direction of future research. PMID:18620561
Reconstructing the world trade multiplex: The role of intensive and extensive biases
NASA Astrophysics Data System (ADS)
Mastrandrea, Rossana; Squartini, Tiziano; Fagiolo, Giorgio; Garlaschelli, Diego
2014-12-01
In economic and financial networks, the strength of each node always has an important economic meaning, such as the size of supply and demand, import and export, or financial exposure. Constructing null models of networks matching the observed strengths of all nodes is crucial in order to either detect interesting deviations of an empirical network from economically meaningful benchmarks or reconstruct the most likely structure of an economic network when the latter is unknown. However, several studies have proved that real economic networks and multiplexes topologically differ from configurations inferred only from node strengths. Here we provide a detailed analysis of the world trade multiplex by comparing it to an enhanced null model that simultaneously reproduces the strength and the degree of each node. We study several temporal snapshots and almost 100 layers (commodity classes) of the multiplex and find that the observed properties are systematically well reproduced by our model. Our formalism allows us to introduce the (static) concept of extensive and intensive bias, defined as a measurable tendency of the network to prefer either the formation of extra links or the reinforcement of link weights, with respect to a reference case where only strengths are enforced. Our findings complement the existing economic literature on (dynamic) intensive and extensive trade margins. More generally, they show that real-world multiplexes can be strongly shaped by layer-specific local constraints.
NASA Astrophysics Data System (ADS)
Hoose, C.; Hande, L. B.; Mohler, O.; Niemand, M.; Paukert, M.; Reichardt, I.; Ullrich, R.
2016-12-01
Between 0 and -37°C, ice formation in clouds is triggered by aerosol particles acting as heterogeneous ice nuclei. At lower temperatures, heterogeneous ice nucleation on aerosols can occur at lower supersaturations than homogeneous freezing of solutes. In laboratory experiments, the ability of different aerosol species (e.g., desert dusts, soot, biological particles) to nucleate ice has been studied in detail and quantified via various theoretical or empirical parameterization approaches. For experiments in the AIDA cloud chamber, we have quantified the ice nucleation efficiency via a temperature- and supersaturation-dependent ice nucleation active site density. Here we present a new empirical parameterization scheme for immersion and deposition ice nucleation on desert dust and soot based on these experimental data. The application of this parameterization to the simulation of cirrus clouds, deep convective clouds, and orographic clouds will be shown, including the extension of the scheme to the treatment of freezing of rain drops. The results are compared to other heterogeneous ice nucleation schemes. Furthermore, an aerosol-dependent parameterization of contact ice nucleation is presented.
Day, Troy
2016-04-01
Epigenetic inheritance is the transmission of nongenetic material such as gene expression levels, RNA and other biomolecules from parents to offspring. There is a growing realization that such forms of inheritance can play an important role in evolution. Bacteria represent a prime example of epigenetic inheritance because a large array of cellular components is transmitted to offspring, in addition to genetic material. Interestingly, there is an extensive and growing empirical literature showing that many bacteria can form 'persister' cells that are phenotypically resistant or tolerant to antibiotics, but most of these results are not interpreted within the context of epigenetic inheritance. Instead, persister cells are usually viewed as a genetically encoded bet-hedging strategy that has evolved in response to a fluctuating environment. Here I show, using a relatively simple model, that many of these empirical findings can be more simply understood as arising from a combination of epigenetic inheritance and cellular noise. I therefore suggest that phenotypic drug tolerance in bacteria might represent one of the best-studied examples of evolution under epigenetic inheritance. © 2016 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foight, Dillon R.; Slane, Patrick O.; Güver, Tolga
We present a comprehensive study of interstellar X-ray extinction using the extensive Chandra supernova remnant (SNR) archive and use our results to refine the empirical relation between the hydrogen column density and optical extinction. In our analysis, we make use of the large, uniform data sample to assess various systematic uncertainties in the measurement of the interstellar X-ray absorption. Specifically, we address systematic uncertainties that originate from (i) the emission models used to fit SNR spectra; (ii) the spatial variations within individual remnants; (iii) the physical conditions of the remnant such as composition, temperature, and non-equilibrium regions; and (iv) the model used for the absorption of X-rays in the interstellar medium. Using a Bayesian framework to quantify these systematic uncertainties, and combining the resulting hydrogen column density measurements with the measurements of optical extinction toward the same remnants, we find the empirical relation N_H = (2.87 ± 0.12) × 10^21 A_V cm^-2, which is significantly higher than the previous measurements.
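The relation quoted above is linear in A_V, so converting an optical-extinction measurement into a hydrogen column density is a one-line computation. A minimal sketch (the function name is ours, and the quoted uncertainty on the coefficient is omitted):

```python
def nh_from_av(av_mag):
    """Hydrogen column density N_H (cm^-2) implied by an optical extinction
    A_V (magnitudes), using the best-fit coefficient from the abstract:
    N_H = 2.87e21 * A_V. Illustrative only; error propagation is omitted."""
    return 2.87e21 * av_mag
```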
The Regulation of Task Performance: A Trans-Disciplinary Review
Clark, Ian; Dumas, Guillaume
2016-01-01
Definitions of meta-cognition typically have two components: (1) knowledge about one's own cognitive functioning; and, (2) control over one's own cognitive activities. Since Flavell and his colleagues provided the empirical foundation on which to build studies of meta-cognition and the autonoetic (self) knowledge required for effective learning, the intervening years have seen the extensive dissemination of theoretical and empirical research on meta-cognition, which now encompasses a variety of issues and domains including educational psychology and neuroscience. Nevertheless, the psychological and neural underpinnings of meta-cognitive predictions and reflections that determine subsequent regulation of task performance remain ill understood. This article provides an outline of meta-cognition in the science of education with evidence drawn from neuroimaging, psycho-physiological, and psychological literature. We will rigorously explore research that addresses the pivotal role of the prefrontal cortex (PFC) in controlling the meta-cognitive processes that underpin the self-regulated learning (SRL) strategies learners employ to regulate task performance. The article delineates what those strategies are, and how the learning environment can facilitate or frustrate strategy use by influencing learners' self-efficacy. PMID:26779050
Evaluating temperature as a driver of changing coastal biodiversity
NASA Astrophysics Data System (ADS)
Batt, R. D.; Morley, J. W.; Selden, R. L.; Tingley, M. W.; Pinsky, M. L.
2016-02-01
Coastal waters are warming for many regions of the world, but the impacts on biodiversity are unclear. Theoretical mechanisms for temperature-driven changes in diversity include the expansion of highly diverse warm-water communities, and a tendency for colonizations to occur more rapidly than extinctions. However, these hypotheses remain untested. In fact, some surveys of biodiversity indicate no systematic change in local species richness for most regions of the world. We evaluated the empirical evidence for these proposed mechanisms using long-term and spatially extensive surveys in conjunction with statistical methods robust to observational biases. In contrast to other empirical studies, we identified consistent increases in the richness of North American demersal communities in recent decades. The changes in these communities are associated with changing water temperatures, but are not well-predicted by proposed mechanisms. Most theoretical expectations for how temperature may change biodiversity involve biogeographic dynamics. By determining the timing and locations of colonization and extinction events of 2000 species, we provide a rare assessment of the merits and shortcomings of these hypotheses as they pertain to observed changes in coastal biodiversity.
Bogodistov, Yevgen; Dost, Florian
2017-01-01
This study reveals that Duchenne (genuine) and non-Duchenne (non-genuine, polite) smiles are implicitly associated with psychological proximity and distance, respectively. These findings link two extensive research streams from human communication and psychology. Interestingly, extant construal-level theory research suggests the link may work as smiles signaling either a benign situation or politeness, resulting in conflicting predictions for the association between smile type and psychological distance. The current study uses implicit association tests to reveal theoretically and empirically consistent non-Duchenne-smile–distance and Duchenne-smile–proximity associations for all four types of psychological distance: temporal, spatial, social, and hypothetical. Practically, the results suggest several useful applications of non-Duchenne smiles in human communication contexts. PMID:28848483
Birth order has no effect on intelligence: a reply and extension of previous findings.
Wichman, Aaron L; Rodgers, Joseph Lee; Maccallum, Robert C
2007-09-01
We address points raised by Zajonc and Sulloway, who reject findings showing that birth order has no effect on intelligence. Many objections to findings of null birth-order results seem to stem from a misunderstanding of the difference between study designs where birth order is confounded with true causal influences on intelligence across families and designs that control for some of these influences. We discuss some of the consequences of not appreciating the nature of this difference. When between-family confounds are controlled using appropriate study designs and techniques such as multilevel modeling, birth order is shown not to influence intelligence. We conclude with an empirical investigation of the replicability and generalizability of this approach.
The Ethics of Human Life Extension: The Second Argument from Evolution.
Gyngell, Chris
2015-12-01
One argument that is sometimes made against pursuing radical forms of human life extension is that such interventions will make the species less evolvable, which would be morally undesirable. In this article, I discuss the empirical and evaluative claims of this argument. I argue that radical increases in life expectancy could, in principle, reduce the evolutionary potential of human populations through both biological and cultural mechanisms. I further argue that if life extension did reduce the evolvability of the species, this will be undesirable for three reasons: (1) it may increase the species' susceptibility to extinction risks, (2) it may adversely affect institutions and practices that promote well-being, and (3) it may impede moral progress. © The Author 2015. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved.
ERIC Educational Resources Information Center
Sanchez-Franco, Manuel J.; Martinez-Lopez, Francisco J.; Martin-Velicia, Felix A.
2009-01-01
Our research specifically focuses on the effects of the national cultural background of educators on the acceptance and usage of ICT, particularly the Web as an extensive and expanding information base that provides the ultimate in resource-rich learning. Most research has used North Americans as subjects. For this reason, we interviewed…
ERIC Educational Resources Information Center
Downey, James P.; Kher, Hemant V.
2015-01-01
Technology training in the classroom is critical in preparing students for upper level classes as well as professional careers, especially in fields such as technology. One of the key enablers to this process is computer self-efficacy (CSE), which has an extensive stream of empirical research. Despite this, one of the missing pieces is how CSE…
Military Suicide Research Consortium: Extension to New Opportunities and Challenges
2017-04-01
ERIC Educational Resources Information Center
Crisp, Nicola Elinor
2013-01-01
While some African American students perform as well as or better than their White peers on standardized tests, African Americans as a group attain lower scores on standardized tests than their White peers. This phenomenon has been addressed extensively in educational research. However, not much empirical research has been conducted to investigate…
ERIC Educational Resources Information Center
Kampourakis, Kostas
2016-01-01
Teaching about nature of science (NOS) is considered as an important goal of science education in various countries. Extensive empirical research about how some aspects of NOS can be effectively taught is also available. The most widely adopted conceptualization of NOS is based on a small number of general aspects of NOS, which fall into two…
ERIC Educational Resources Information Center
Guth, Jessica
2008-01-01
This paper, based on extensive empirical work with Polish and Bulgarian scientists in Germany and the UK, examines the impact of the EU enlargement including the free movement of persons provisions on the mobility of scientists from Eastern to Western Europe. It focuses on early career researchers and particularly PhD candidates and begins by…
ERIC Educational Resources Information Center
Ghaffarzadegan, Navid; Stewart, Thomas R.
2011-01-01
Elwin, Juslin, Olsson, and Enkvist (2007) and Henriksson, Elwin, and Juslin (2010) offered the constructivist coding hypothesis to describe how people code the outcomes of their decisions when availability of feedback is conditional on the decision. They provided empirical evidence only for the 0.5 base rate condition. This commentary argues that…
A General Model for Estimating Macroevolutionary Landscapes.
Boucher, Florian C; Démery, Vincent; Conti, Elena; Harmon, Luke J; Uyeda, Josef
2018-03-01
The evolution of quantitative characters over long timescales is often studied using stochastic diffusion models. The current toolbox available to students of macroevolution is however limited to two main models: Brownian motion and the Ornstein-Uhlenbeck process, plus some of their extensions. Here, we present a very general model for inferring the dynamics of quantitative characters evolving under both random diffusion and deterministic forces of any possible shape and strength, which can accommodate interesting evolutionary scenarios like directional trends, disruptive selection, or macroevolutionary landscapes with multiple peaks. This model is based on a general partial differential equation widely used in statistical mechanics: the Fokker-Planck equation, also known in population genetics as the Kolmogorov forward equation. We thus call the model FPK, for Fokker-Planck-Kolmogorov. We first explain how this model can be used to describe macroevolutionary landscapes over which quantitative traits evolve and, more importantly, we detail how it can be fitted to empirical data. Using simulations, we show that the model has good behavior both in terms of discrimination from alternative models and in terms of parameter inference. We provide R code to fit the model to empirical data using either maximum-likelihood or Bayesian estimation, and illustrate the use of this code with two empirical examples of body mass evolution in mammals. FPK should greatly expand the set of macroevolutionary scenarios that can be studied since it opens the way to estimating macroevolutionary landscapes of any conceivable shape. [Adaptation; bounds; diffusion; FPK model; macroevolution; maximum-likelihood estimation; MCMC methods; phylogenetic comparative data; selection.].
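The FPK model's core object is the Fokker-Planck (Kolmogorov forward) equation for the probability density of a trait evolving under random diffusion plus a deterministic force. The authors provide R code for fitting; purely as an illustration of the underlying dynamics (not their implementation), a flux-conservative finite-difference scheme for the 1-D equation with reflecting bounds might look like:

```python
def fpk_evolve(p, x, mu, sigma2, dt, steps):
    """Evolve a 1-D density p(x) under dp/dt = -d/dx[mu(x) p] + (sigma2/2) d2p/dx2,
    i.e. the Fokker-Planck / Kolmogorov forward equation, on the uniform grid x
    with zero-flux (reflecting) boundaries, as appropriate for a bounded trait.
    Explicit flux-conservative scheme; illustrative sketch only."""
    dx = x[1] - x[0]
    n = len(p)
    for _ in range(steps):
        flux = [0.0] * (n + 1)          # flux[i] sits between cells i-1 and i
        for i in range(1, n):           # boundary fluxes stay 0 (reflecting)
            adv = mu(0.5 * (x[i - 1] + x[i])) * 0.5 * (p[i - 1] + p[i])
            dif = 0.5 * sigma2 * (p[i] - p[i - 1]) / dx
            flux[i] = adv - dif
        p = [p[i] + dt / dx * (flux[i] - flux[i + 1]) for i in range(n)]
    return p
```

Because the interior fluxes telescope and the boundary fluxes are zero, total probability mass (sum(p) * dx) is conserved exactly, which is a useful sanity check when experimenting with drift functions mu of arbitrary shape, e.g. multi-peaked macroevolutionary landscapes.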
Jenkins, Emily K; Kothari, Anita; Bungay, Vicky; Johnson, Joy L; Oliffe, John L
2016-08-30
Much of the research and theorising in the knowledge translation (KT) field has focused on clinical settings, providing little guidance to those working in community settings. In this study, we build on previous research in community-based KT by detailing the theory-driven and empirically-informed CollaboraKTion framework. A case study design and ethnographic methods were utilised to gain an in-depth understanding of the processes for conducting a community-based KT study as a means to distilling the CollaboraKTion framework. Drawing on extensive field notes describing fieldwork observations and interactions as well as evidence from the participatory research and KT literature, we detail the processes and steps undertaken in this community-based KT study as well as their rationale and the challenges encountered. In an effort to build upon existing knowledge, Kitson and colleagues' co-KT framework, which provides guidance for conducting KT aimed at addressing population-level health, was applied as a coding structure to inform the current analysis. This approach was selected because it (1) supported the application of an existing community-based KT framework to empirical data and (2) provided an opportunity to contribute to the theory and practice gaps in the community-based KT literature through an inductively derived empirical example. Analysis revealed that community-based KT is an iterative process that can be viewed as comprising five overarching processes: (1) contacting and connecting; (2) deepening understandings; (3) adapting and applying the knowledge base; (4) supporting and evaluating continued action; and (5) transitioning and embedding as well as several key elements within each of these processes (e.g. building on existing knowledge, establishing partnerships). These empirically informed theory advancements in KT and participatory research traditions are summarised in the CollaboraKTion framework.
We suggest that community-based KT researchers place less emphasis on enhancing uptake of specific interventions and focus on collaboratively identifying and creating changes to the contextual factors that influence health outcomes. The CollaboraKTion framework can be used to guide the development, implementation and evaluation of contextually relevant, evidence-informed initiatives aimed at improving population health, while providing a foundation for future research and practice in this emergent KT area.
Liao, J. G.; Mcmurry, Timothy; Berg, Arthur
2014-01-01
Empirical Bayes methods have been extensively used for microarray data analysis by modeling the large number of unknown parameters as random effects. Empirical Bayes allows borrowing information across genes and can automatically adjust for multiple testing and selection bias. However, the standard empirical Bayes model can perform poorly if the assumed working prior deviates from the true prior. This paper proposes a new rank-conditioned inference in which the shrinkage and confidence intervals are based on the distribution of the error conditioned on rank of the data. Our approach is in contrast to a Bayesian posterior, which conditions on the data themselves. The new method is almost as efficient as standard Bayesian methods when the working prior is close to the true prior, and it is much more robust when the working prior is not close. In addition, it allows a more accurate (but also more complex) non-parametric estimate of the prior to be easily incorporated, resulting in improved inference. The new method’s prior robustness is demonstrated via simulation experiments. Application to a breast cancer gene expression microarray dataset is presented. Our R package rank.Shrinkage provides a ready-to-use implementation of the proposed methodology. PMID:23934072
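As a point of reference for the rank-conditioned approach, the standard normal-normal empirical Bayes shrinkage that the paper improves on can be sketched in a few lines. This is a generic moment-based version (not the authors' code, and not their rank-conditioned estimator):

```python
import statistics

def eb_shrink(estimates, se):
    """Standard empirical Bayes shrinkage: each observed x_i ~ N(theta_i, se^2)
    with a working prior theta_i ~ N(mu, tau^2); mu and tau^2 are estimated
    from the data by moments, and every estimate is pulled toward mu."""
    mu = statistics.fmean(estimates)
    tau2 = max(statistics.pvariance(estimates) - se ** 2, 0.0)
    w = tau2 / (tau2 + se ** 2)        # shrinkage weight in [0, 1]
    return [mu + w * (x - mu) for x in estimates]
```

When the working normal prior is far from the true prior this estimator degrades, which is exactly the failure mode the paper's rank-conditioned inference is designed to be robust against.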
Spectrum of bacteremia in posthematopoietic stem cell transplant patients from an Indian center.
Ghafur, A; Devarajan, V; Raj, R; Easow, J; Raja, T
2016-01-01
Despite the relatively low prevalence of Gram-positive bacteremic infections in Indian oncology patients, glycopeptides are extensively used for empirical management of febrile neutropenia. Our aim was to analyze the spectrum of bacteremia in posthematopoietic stem cell transplant (HSCT) recipients in our center and make a recommendation on glycopeptide use in this patient population. We retrospectively analyzed bacteremia data from HSCT recipients in a tertiary care oncology and transplant center in South India between 2011 and 2013. In 217 patients, 52 bacteremic episodes were identified. The majority of the isolates were Gram-negative (88.4%), with very few Gram-positive isolates (7.69%). Glycopeptides need not be included in the empirical antibiotic regimen in post-HSCT settings with very low Gram-positive infection rates.
ERIC Educational Resources Information Center
Gansemer, Lawrence P.; Bealer, Robert C.
Using data generated from the records of 460 rural-reared Pennsylvania males contacted initially as sophomores in 1947 and again in 1957 and 1971, an effort was made to replicate the tradition of path analytic, causal modeling of status attainment in American society and to assess the empirical efficacy of certain family input variables not…
Acoustic Scattering by Near-Surface Inhomogeneities in Porous Media
1990-02-21
surfaces [8]. Recently, this empirical model has been replaced by a more rigorous microstructural model [9]. Here, the acoustical characteristics of...boundaries. A discussion of how ground acoustic characteristics are modelled then follows, with the chapter being concluded by a brief summary. 3.1...of ground acoustic characteristics, with particular emphasis on the four-parameter model of Attenborough, that will be used extensively later.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sundararaman, Ravishankar; Gunceler, Deniz; Arias, T. A.
2014-10-07
Continuum solvation models enable efficient first principles calculations of chemical reactions in solution, but require extensive parametrization and fitting for each solvent and class of solute systems. Here, we examine the assumptions of continuum solvation models in detail and replace empirical terms with physical models in order to construct a minimally-empirical solvation model. Specifically, we derive solvent radii from the nonlocal dielectric response of the solvent from ab initio calculations, construct a closed-form and parameter-free weighted-density approximation for the free energy of the cavity formation, and employ a pair-potential approximation for the dispersion energy. We show that the resulting model with a single solvent-independent parameter: the electron density threshold (n_c), and a single solvent-dependent parameter: the dispersion scale factor (s_6), reproduces solvation energies of organic molecules in water, chloroform, and carbon tetrachloride with RMS errors of 1.1, 0.6 and 0.5 kcal/mol, respectively. We additionally show that fitting the solvent-dependent s_6 parameter to the solvation energy of a single non-polar molecule does not substantially increase these errors. Parametrization of this model for other solvents, therefore, requires minimal effort and is possible without extensive databases of experimental solvation free energies.
Outcome-Dependent Sampling with Interval-Censored Failure Time Data
Zhou, Qingning; Cai, Jianwen; Zhou, Haibo
2017-01-01
Epidemiologic studies and disease prevention trials often seek to relate an exposure variable to a failure time that suffers from interval-censoring. When the failure rate is low and the time intervals are wide, a large cohort is often required so as to yield reliable precision on the exposure-failure-time relationship. However, large cohort studies with simple random sampling could be prohibitive for investigators with a limited budget, especially when the exposure variables are expensive to obtain. Alternative cost-effective sampling designs and inference procedures are therefore desirable. We propose an outcome-dependent sampling (ODS) design with interval-censored failure time data, where we enrich the observed sample by selectively including certain more informative failure subjects. We develop a novel sieve semiparametric maximum empirical likelihood approach for fitting the proportional hazards model to data from the proposed interval-censoring ODS design. This approach employs the empirical likelihood and sieve methods to deal with the infinite-dimensional nuisance parameters, which greatly reduces the dimensionality of the estimation problem and eases the computation difficulty. The consistency and asymptotic normality of the resulting regression parameter estimator are established. The results from our extensive simulation study show that the proposed design and method works well for practical situations and is more efficient than the alternative designs and competing approaches. An example from the Atherosclerosis Risk in Communities (ARIC) study is provided for illustration. PMID:28771664
Gui, Jiang; Moore, Jason H.; Williams, Scott M.; Andrews, Peter; Hillege, Hans L.; van der Harst, Pim; Navis, Gerjan; Van Gilst, Wiek H.; Asselbergs, Folkert W.; Gilbert-Diamond, Diane
2013-01-01
We present an extension of the two-class multifactor dimensionality reduction (MDR) algorithm that enables detection and characterization of epistatic SNP-SNP interactions in the context of a quantitative trait. The proposed Quantitative MDR (QMDR) method handles continuous data by modifying MDR’s constructive induction algorithm to use a T-test. QMDR replaces the balanced accuracy metric with a T-test statistic as the score to determine the best interaction model. We used a simulation to identify the empirical distribution of QMDR’s testing score. We then applied QMDR to genetic data from the ongoing prospective Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. PMID:23805232
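The key change QMDR makes to two-class MDR is the scoring function: genotype cells are pooled into 'high' and 'low' groups by mean trait value, and the pooled grouping is scored with a two-sample t statistic rather than balanced accuracy. A minimal sketch of that scoring step (names are ours; the Welch form of the statistic is an assumption):

```python
import math
import statistics

def qmdr_score(trait, group):
    """Score an MDR-style high/low partition of samples on a quantitative
    trait with a two-sample (Welch) t statistic, the score QMDR substitutes
    for balanced accuracy. 'group' holds 'high' or 'low' per sample."""
    hi = [t for t, g in zip(trait, group) if g == 'high']
    lo = [t for t, g in zip(trait, group) if g == 'low']
    m_hi, m_lo = statistics.fmean(hi), statistics.fmean(lo)
    v_hi, v_lo = statistics.variance(hi), statistics.variance(lo)
    return (m_hi - m_lo) / math.sqrt(v_hi / len(hi) + v_lo / len(lo))
```

In the full algorithm this score would be computed for every candidate SNP-SNP interaction model, with its null distribution obtained empirically, e.g. by permuting the trait, as the abstract's simulation step does.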
Length of hospitalization and outcome of commitment and recommitment hearings.
Parry, C D; Turkheimer, E
1992-01-01
Despite extensive legislative reformulation of civil commitment procedures, empirical studies have shown that civil commitment hearings continue to be largely nonadversarial. The authors observed all civil commitment hearings during a three-month period at a large state hospital in Virginia and examined the characteristics of patients and the actions of attorneys, clinical examiners, and judges as a function of the length of time the patient had been in the hospital. The analysis revealed that as the length of a patient's hospitalization increased, the hearings became shorter and less adversarial; patients tended to show fewer signs of acute psychiatric illness and more signs of chronic schizophrenia. The implications of these findings for civil commitment policy are discussed.
Does reflection lead to wise choices?
Bortolotti, Lisa
2011-01-01
Does conscious reflection lead to good decision-making? Whereas engaging in reflection is traditionally thought to be the best way to make wise choices, recent psychological evidence undermines the role of reflection in lay and expert judgement. The literature suggests that thinking about reasons does not improve the choices people make, and that experts do not engage in reflection, but base their judgements on intuition, often shaped by extensive previous experience. Can we square the traditional accounts of wisdom with the results of these empirical studies? Should we even attempt to? I shall defend the view that philosophy and cognitive sciences genuinely interact in tackling questions such as whether reflection leads to making wise choices. PMID:22408385
"Fuzziness" in the celular interactome: a historical perspective.
Welch, G Rickey
2012-01-01
Some historical background is given for appreciating the impact of the empirical construct known as the cellular protein-protein interactome, which is a seemingly de novo entity that has arisen of late within the context of postgenomic systems biology. The approach here builds on a generalized principle of "fuzziness" in protein behavior, proposed by Tompa and Fuxreiter (1). Recent controversies in the analysis and interpretation of the interactome studies are rationalized historically under the auspices of this concept. There is an extensive literature on protein-protein interactions, dating to the mid-1900s, which may help clarify the "fuzziness" in the interactome picture and, also, provide a basis for understanding the physiological importance of protein-protein interactions in vivo.
An optimal control strategy for two-dimensional motion camouflage with non-holonomic constraints.
Rañó, Iñaki
2012-07-01
Motion camouflage is a stealth behaviour observed both in hover-flies and in dragonflies. Existing controllers for mimicking motion camouflage generate this behaviour on an empirical basis or without considering the kinematic motion restrictions present in animal trajectories. This study summarises our formal contributions to solve the generation of motion camouflage as a non-linear optimal control problem. The dynamics of the system capture the kinematic restrictions to motion of the agents, while the performance index ensures camouflage trajectories. An extensive set of simulations support the technique, and a novel analysis of the obtained trajectories contributes to our understanding of possible mechanisms to obtain sensor based motion camouflage, for instance, in mobile robots.
McArtor, Daniel B.; Lubke, Gitta H.; Bergeman, C. S.
2017-01-01
Person-centered methods are useful for studying individual differences in terms of (dis)similarities between response profiles on multivariate outcomes. Multivariate distance matrix regression (MDMR) tests the significance of associations of response profile (dis)similarities and a set of predictors using permutation tests. This paper extends MDMR by deriving and empirically validating the asymptotic null distribution of its test statistic, and by proposing an effect size for individual outcome variables, which is shown to recover true associations. These extensions alleviate the computational burden of permutation tests currently used in MDMR and render more informative results, thus making MDMR accessible to new research domains. PMID:27738957
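For a single categorical predictor, the MDMR test statistic reduces to the familiar pseudo-F computed directly from the distance matrix, and the permutation test the paper seeks to replace works as below. This is an illustrative sketch of that one-way special case (function names ours), not the authors' general implementation:

```python
import itertools
import random

def permanova_f(dist, groups):
    """Pseudo-F for a one-way grouping (the simplest MDMR design):
    dist is a symmetric n x n distance matrix, groups one label per row."""
    n = len(groups)
    ss_total = sum(dist[i][j] ** 2
                   for i, j in itertools.combinations(range(n), 2)) / n
    ss_within = 0.0
    for g in set(groups):
        idx = [i for i in range(n) if groups[i] == g]
        ss_within += sum(dist[i][j] ** 2
                         for i, j in itertools.combinations(idx, 2)) / len(idx)
    a = len(set(groups))
    return ((ss_total - ss_within) / (a - 1)) / (ss_within / (n - a))

def permutation_p(dist, groups, n_perm=999, seed=0):
    """Permutation p-value: the step MDMR's asymptotic null replaces."""
    rng = random.Random(seed)
    obs = permanova_f(dist, groups)
    hits = sum(permanova_f(dist, rng.sample(groups, len(groups))) >= obs
               for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)
```

The paper's contribution is an asymptotic null distribution for this statistic, which avoids the cost of the permutation loop when samples or outcome sets are large.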
Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W; Müller, Klaus-Robert; Lemm, Steven
2013-01-01
Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study for the US, European, and Hong Kong stock market we show that our proposed method leads to improved portfolio allocation.
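The estimator class in question builds a covariance matrix from a low-rank factor part plus idiosyncratic variances. A one-factor sketch of how such a matrix is assembled (illustrative only; the fitting of loadings and the paper's DVA correction to the resulting spectrum are not shown):

```python
def factor_covariance(loadings, specific_var):
    """Covariance implied by a one-factor model: Sigma = b b^T + diag(psi),
    where b holds the factor loadings and psi the specific variances."""
    k = len(loadings)
    return [[loadings[i] * loadings[j] + (specific_var[i] if i == j else 0.0)
             for j in range(k)] for i in range(k)]
```

The systematic error the abstract describes arises when such a model is fitted to finite samples; DVA adjusts the fitted directions' variances rather than the assembly step shown here.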
Methodological convergence of program evaluation designs.
Chacón-Moscoso, Salvador; Anguera, M Teresa; Sanduvete-Chaves, Susana; Sánchez-Martín, Milagrosa
2014-01-01
The dichotomous view that opposes experimental/quasi-experimental to non-experimental/ethnographic studies still persists, but despite the extensive use of non-experimental/ethnographic studies, the most systematic work on methodological quality has been developed for experimental and quasi-experimental studies. This hinders evaluators' and planners' practice of empirical program evaluation, a sphere in which the distinction between types of study is continually changing and increasingly blurred. Based on the classical validity framework of experimental/quasi-experimental studies, we review the literature in order to analyze the convergence of design elements bearing on methodological quality in primary studies in systematic reviews and in ethnographic research. We specify the relevant design elements that should be taken into account to improve validity and generalization in program evaluation practice across methodologies, from a practical and complementary methodological perspective. We recommend ways to improve design elements so as to enhance validity and generalization in program evaluation practice.
Effects of shape and stroke parameters on the propulsion performance of an axisymmetric swimmer.
Peng, Jifeng; Alben, Silas
2012-03-01
In nature, there exists a special group of aquatic animals which have an axisymmetric body and whose primary swimming mechanism is to use periodic body contractions to generate vortex rings in the surrounding fluid. Using jellyfish medusae as an example, this study develops a mathematical model of body kinematics of an axisymmetric swimmer and uses a computational approach to investigate the induced vortex wakes. Wake characteristics are identified for swimmers using jet propulsion and rowing, two mechanisms identified in previous studies of medusan propulsion. The parameter space of body kinematics is explored through four quantities: a measure of body shape, stroke amplitude, the ratio between body contraction duration and extension duration, and the pulsing frequency. The effects of these parameters on thrust, input power requirement and circulation production are quantified. Two metrics, cruising speed and energy cost of locomotion, are used to evaluate the propulsion performance. The study finds that a more prolate-shaped swimmer with larger stroke amplitudes is able to swim faster, but its cost of locomotion is also higher. In contrast, a more oblate-shaped swimmer with smaller stroke amplitudes uses less energy for its locomotion, but swims more slowly. Compared with symmetric strokes with equal durations of contraction and extension, faster bell contractions increase the swimming speed whereas faster bell extensions decrease it, but both require a larger energy input. This study shows that besides the well-studied correlations between medusan body shape and locomotion, stroke variables also affect the propulsion performance. It provides a framework for comparing the propulsion performance of axisymmetric swimmers based on their body kinematics when it is difficult to measure and analyze their wakes empirically. The knowledge from this study is also useful for the design of robotic swimmers that use axisymmetric body contractions for propulsion.
Scribner, Kim T.; Pearce, John M.; Baker, Allan J.
2000-01-01
The recent proliferation and greater accessibility of molecular genetic markers has led to a growing appreciation of the ecological and evolutionary inferences that can be drawn from molecular characterizations of individuals and populations (Burke et al. 1992, Avise 1994). Different techniques have the ability to target DNA sequences which have different patterns of inheritance, different modes and rates of evolution and, concomitantly, different levels of variation. In the quest for 'the right marker for the right job', microsatellites have been widely embraced as the marker of choice for many empirical genetic studies. The proliferation of microsatellite loci for various species and the voluminous literature compiled in very few years associated with their evolution and use in various research applications, exemplifies their growing importance as a research tool in the biological sciences. The ability to define allelic states based on variation at the nucleotide level has afforded unparalleled opportunities to document the actual mutational process and rates of evolution at individual microsatellite loci. The scrutiny to which these loci have been subjected has resulted in data that raise issues pertaining to assumptions formerly stated, but largely untestable for other marker classes. Indeed this is an active arena for theoretical and empirical work. Given the extensive and ever-increasing literature on various statistical methodologies and cautionary notes regarding the uses of microsatellites, some consideration should be given to the unique characteristics of these loci when determining how and under what conditions they can be employed.
Power independent EMG based gesture recognition for robotics.
Li, Ling; Looney, David; Park, Cheolsoo; Rehman, Naveed U; Mandic, Danilo P
2011-01-01
A novel method for detecting muscle contraction is presented and further developed to identify four different gestures, facilitating a hand-gesture-controlled robot system. The method is based on surface electromyograph (EMG) measurements from groups of arm muscles. Cross-information is preserved by processing the EMG channels simultaneously using a recent multivariate extension of Empirical Mode Decomposition (EMD). Next, phase synchrony measures are employed to make the system robust to the different power levels caused by electrode placements and impedances. The multiple pairwise muscle synchronies are used as features of a discrete gesture space comprising four gestures (flexion, extension, pronation, supination). Simulations on real-time robot control illustrate the enhanced accuracy and robustness of the proposed methodology.
Social learning and the development of individual and group behaviour in mammal societies
Thornton, Alex; Clutton-Brock, Tim
2011-01-01
As in human societies, social learning may play an important role in shaping individual and group characteristics in other mammals. Here, we review research on non-primate mammals, concentrating on work at our long-term meerkat study site, where longitudinal data and field experiments have generated important insights into the role of social learning under natural conditions. Meerkats live under high predation pressure and occupy a difficult foraging niche. Accordingly, pups make extensive use of social information in learning to avoid predation and obtain food. Where individual learning is costly or opportunities are lacking, as in the acquisition of prey-handling skills, adults play an active role in promoting learning through teaching. Social learning can also cause information to spread through groups, but our data suggest that this does not necessarily result in homogeneous, group-wide traditions. Moreover, traditions are commonly eroded by individual learning. We suggest that traditions will only persist where there are high costs of deviating from the group norm or where skill development requires extensive time and effort. Persistent traditions could, theoretically, modify selection pressures and influence genetic evolution. Further empirical studies of social learning in natural populations are now urgently needed to substantiate theoretical claims. PMID:21357220
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Miao; Du, Yonghui; Gao, Lili
A recent experimental study reported the successful synthesis of an orthorhombic FeB{sub 4} with a high hardness of 62(5) GPa [H. Gou et al., Phys. Rev. Lett. 111, 157002 (2013)], which has reignited extensive interest in whether transition-metal boride compounds can become superhard materials. This, however, contradicts theoretical studies suggesting that transition-metal boron compounds are unlikely to become superhard. Here, we examined the structural and electronic properties of FeB{sub 4} using density functional theory. The electronic calculations show good metallicity and covalent Fe–B bonding. Meanwhile, we extensively investigated the stress-strain relations of FeB{sub 4} under various tensile and shear loading directions. The calculated weakest tensile and shear stresses are 40 GPa and 25 GPa, respectively. Further simulations (e.g., of the electron localization function and bond lengths along the weakest loading direction) show that the weak Fe–B bonding is responsible for this low hardness. Moreover, these results are consistent with the Vickers hardness values (11.7–32.3 GPa) obtained from different empirical hardness models, which fall below the superhardness threshold of 40 GPa. Our results suggest that FeB{sub 4} is a hard material but is unlikely to become superhard (>40 GPa).
Marino, Nicholas Dos Anjos Cristiano; Romero, Gustavo Quevedo; Farjalla, Vinicius Fortes
2018-03-01
Ecologists have extensively investigated the effect of warming on consumer-resource interactions, with experiments revealing that warming can strengthen, weaken or have no net effect on top-down control of resources. These experiments have inspired a body of theoretical work to explain the variation in the effect of warming on top-down control. However, there has been no quantitative attempt to reconcile theory with outcomes from empirical studies. To address the gap between theory and experiment, we performed a meta-analysis to examine the combined effect of experimental warming and top-down control on resource biomass and determined potential sources of variation across experiments. We show that differences in experimental outcomes are related to systematic variation in the geographical distribution of studies. Specifically, warming strengthened top-down control when experiments were conducted in colder regions, but had the opposite effect in warmer regions. Furthermore, we found that differences in the thermoregulation strategy of the consumer and openness of experimental arenas to dispersal can contribute to some deviation from the overall geographical pattern. These results reconcile empirical findings and support the expectation of geographical variation in the response of consumer-resource interactions to warming. © 2018 John Wiley & Sons Ltd/CNRS.
Correcting for population structure and kinship using the linear mixed model: theory and extensions.
Hoffman, Gabriel E
2013-01-01
Population structure and kinship are widespread confounding factors in genome-wide association studies (GWAS). It has been standard practice to include principal components of the genotypes in a regression model in order to account for population structure. More recently, the linear mixed model (LMM) has emerged as a powerful method for simultaneously accounting for population structure and kinship. The statistical theory underlying the differences in empirical performance between modeling principal components as fixed versus random effects has not been thoroughly examined. We undertake an analysis to formalize the relationship between these widely used methods and elucidate the statistical properties of each. Moreover, we introduce a new statistic, effective degrees of freedom, that serves as a metric of model complexity and a novel low rank linear mixed model (LRLMM) to learn the dimensionality of the correction for population structure and kinship, and we assess its performance through simulations. A comparison of the results of LRLMM and a standard LMM analysis applied to GWAS data from the Multi-Ethnic Study of Atherosclerosis (MESA) illustrates how our theoretical results translate into empirical properties of the mixed model. Finally, the analysis demonstrates the ability of the LRLMM to substantially boost the strength of an association for HDL cholesterol in Europeans.
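The effective-degrees-of-freedom statistic introduced in this abstract quantifies how much a shrinkage (random-effect) fit "spends" relative to a fixed-effect fit. A toy sketch of the standard ridge-type formula edf(lambda) = sum s_i^2 / (s_i^2 + lambda), using hypothetical singular values rather than anything from MESA or the LRLMM itself:

```python
def effective_dof(singular_values, lam):
    """Effective degrees of freedom of a ridge-type (random-effect) fit.

    For a design with singular values s_i, penalizing coefficients with
    strength lam gives edf = sum s_i^2 / (s_i^2 + lam).  With lam = 0
    the fit is the fixed-effect case: edf equals the number of components."""
    return sum(s * s / (s * s + lam) for s in singular_values)

s = [10.0, 5.0, 1.0]            # hypothetical singular values of the PC design
fixed = effective_dof(s, 0.0)    # 3.0: principal components as fixed effects
random_fx = effective_dof(s, 25.0)  # < 3: modeling them as random shrinks edf
```

This makes concrete why modeling ancestry components as random effects (as an LMM does) is less complex than entering the same components as fixed covariates: directions with small singular values are shrunk heavily and contribute little to the edf.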
Pocock, Nicola S; Phua, Kai Hong
2011-05-04
Medical tourism is a growing phenomenon with policy implications for health systems, particularly of destination countries. Private actors and governments in Southeast Asia are promoting the medical tourist industry, but the potential impact on health systems, particularly in terms of equity in access and availability for local consumers, is unclear. This article presents a conceptual framework that outlines the policy implications of medical tourism's growth for health systems, drawing on the cases of Thailand, Singapore and Malaysia, three regional hubs for medical tourism, via an extensive review of academic and grey literature. Variables for further analysis of the potential impact of medical tourism on health systems are also identified. The framework can provide a basis for empirical, in-country studies weighing the benefits and disadvantages of medical tourism for health systems. The policy implications described are of particular relevance for policymakers and industry practitioners in other Southeast Asian countries with similar health systems where governments have expressed interest in facilitating the growth of the medical tourist industry. This article calls for a universal definition of medical tourism and medical tourists to be enunciated, as well as concerted data collection efforts, to be undertaken prior to any meaningful empirical analysis of medical tourism's impact on health systems.
The psychobiological theory of temperament and character: comment on Farmer and Goldberg (2008).
Cloninger, C Robert
2008-09-01
The revised Temperament and Character Inventory (TCI-R) is the third stage of development of a widely used multiscale personality inventory that began with the Tridimensional Personality Questionnaire (TPQ) and then the Temperament and Character Inventory (TCI). The author describes the third stage of the psychobiological theory of temperament and character; empirical tests of its predictions from genetics, neurobiology, psychosocial development, and clinical studies; and empirical findings that stimulated incremental changes in theory and test construction. Linear factor analysis is an inadequate method for evaluating the nonlinear and dynamical nature of the intrapsychic processes that influence human personality. Traits derived by factor analysis under the doubtful assumption of linearity are actually heterogeneous composites of rational and emotional processes that differ fundamentally in their underlying brain processes. The predictions of the psychobiological theory are strongly validated by extensive data from genetics, neurobiology, longitudinal studies of development, and clinical assessment. The distinction between temperament and character allows the TCI and TCI-R to outperform other popular personality inventories in distinguishing individuals with personality disorders from others and in describing the developmental path to well-being in terms of dynamical processes within the individual that are useful for both research and clinical practice. (c) 2008 APA, all rights reserved.
An analysis of empirical estimates of sexual aggression victimization and perpetration.
Spitzberg, B H
1999-01-01
Estimates of prevalence for several categories of sexual coercion, including rape and attempted rape, were statistically aggregated across 120 studies, involving over 100,000 subjects. According to the data, almost 13% of women and over 3% of men have been raped, and almost 5% of men claim to have perpetrated rape. In contrast, about 25% of women and men claim to have been sexually coerced and to have perpetrated sexual coercion. In general, the mediating variables examined--population type, decade, date of publication, and type of operationalization--were not consistently related to rates of victimization or perpetration. Nevertheless, the extensive variation among study estimates strongly suggests the possibility of systematic sources of variation that have yet to be identified. Further analyses are called for to disentangle such sources.
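Statistical aggregation of prevalence estimates across studies, as described in this abstract, reduces in its simplest form to a sample-size-weighted pooled proportion. A minimal sketch with hypothetical study-level figures (Spitzberg's actual weighting scheme may differ):

```python
def pooled_prevalence(studies):
    """Sample-size-weighted pooled proportion across studies.

    `studies` is a list of (n, p) pairs: sample size and observed rate.
    This is the simplest fixed-weight aggregation; it illustrates the
    kind of pooling such reviews perform."""
    total_n = sum(n for n, _ in studies)
    return sum(n * p for n, p in studies) / total_n

# hypothetical study-level victimization rates, for illustration only
studies = [(500, 0.10), (1500, 0.14), (1000, 0.12)]
rate = pooled_prevalence(studies)   # (50 + 210 + 120) / 3000
```

Weighting by sample size keeps one small outlier study from dominating the pooled figure, though, as the abstract notes, it cannot by itself explain systematic between-study variation.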
Scoping and sensitivity analyses for the Demonstration Tokamak Hybrid Reactor (DTHR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sink, D.A.; Gibson, G.
1979-03-01
The results of an extensive set of parametric studies are presented which provide analytical data of the effects of various tokamak parameters on the performance and cost of the DTHR (Demonstration Tokamak Hybrid Reactor). The studies were centered on a point design which is described in detail. Variations in the device size, neutron wall loading, and plasma aspect ratio are presented, and the effects on direct hardware costs, fissile fuel production (breeding), fusion power production, electrical power consumption, and thermal power production are shown graphically. The studies considered both ignition and beam-driven operations of DTHR and yielded results based on two empirical scaling laws presently used in reactor studies. Sensitivity studies were also made for variations in the following key parameters: the plasma elongation, the minor radius, the TF coil peak field, the neutral beam injection power, and the Z/sub eff/ of the plasma.
Militarism and globalization: Is there an empirical link?
Irandoust, Manuchehr
2018-01-01
Despite the fact that previous studies have extensively investigated the causal nexus between military expenditure and economic growth in both developed and developing countries, those studies have not considered the role of globalization. The aim of this study is to examine the relationship between militarism and globalization for the top 15 military spenders over the period 1990-2012. The bootstrap panel Granger causality approach is utilized to detect the direction of causality. The results show that military expenditure and overall globalization are causally related in most of the countries under review. This implies that countries experiencing greater globalization have seen relatively large increases in militarization over the past 20 years. The policy implication of the findings is that greater military spending by a country increases the likelihood of military conflict in the future, the anticipation of which discourages globalization.
Nonlinear dynamical modes of climate variability: from curves to manifolds
NASA Astrophysics Data System (ADS)
Gavrilov, Andrey; Mukhin, Dmitry; Loskutov, Evgeny; Feigin, Alexander
2016-04-01
The necessity of efficient dimensionality-reduction methods capturing the dynamical properties of a system from observed data is evident. A recent study shows that nonlinear dynamical mode (NDM) expansion is able to solve this problem and provide adequate phase variables in climate data analysis [1]. A single NDM is a logical extension of a linear spatio-temporal structure (like an empirical orthogonal function pattern): it is constructed as a nonlinear transformation of a hidden scalar time series to the space of observed variables, i.e., a projection of the observed dataset onto a nonlinear curve. Both the hidden time series and the parameters of the curve are learned simultaneously using a Bayesian approach. The only prior information about the hidden signal is the assumption of its smoothness. The optimal nonlinearity degree and smoothness are found using the Bayesian evidence technique. In this work we extend the approach further and look for vector hidden signals instead of scalar ones, with the same smoothness restriction. As a result we resolve multidimensional manifolds instead of sums of curves. The dimension of the hidden manifold is also optimized using Bayesian evidence. The efficiency of the extension is demonstrated on model examples, and results of application to climate data are presented and discussed. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS). 1. Mukhin, D., Gavrilov, A., Feigin, A., Loskutov, E., & Kurths, J. (2015). Principal nonlinear dynamical modes of climate variability. Scientific Reports, 5, 15510. http://doi.org/10.1038/srep15510
Empirical relations for cavitation and liquid impingement erosion processes
NASA Technical Reports Server (NTRS)
Rao, P. V.; Buckley, D. H.
1984-01-01
A unified power-law relationship between average erosion rate and cumulative erosion is presented. Extensive data analyses from venturi, magnetostriction (stationary and oscillating specimens), liquid drop, and jet impact devices appear to conform to this relation. A normalization technique using cavitation and liquid impingement erosion data is also presented to facilitate prediction. Attempts are made to understand the relationship between the coefficients in the power-law relationships and the material properties.
An Empirical Approach to Analysis of Similarities between Software Failure Regions
1991-09-01
cycle costs after the software has been marketed (Alberts, 1976). Unfortunately, extensive software testing is frequently necessary in spite of...incidence is primarily syntactic. This mixing of semantic and syntactic forms in the same analysis could lead to some distortion, especially since the...of formulae to improve readability or to indicate precedence of operations. * All definitions within 'Condition I' of a failure region are assumed to
Simultaneous confidence bands for Cox regression from semiparametric random censorship.
Mondal, Shoubhik; Subramanian, Sundarraman
2016-01-01
Cox regression is combined with semiparametric random censorship models to construct simultaneous confidence bands (SCBs) for subject-specific survival curves. Simulation results are presented to compare the performance of the proposed SCBs with the SCBs that are based only on standard Cox. The new SCBs provide correct empirical coverage and are more informative. The proposed SCBs are illustrated with two real examples. An extension to handle missing censoring indicators is also outlined.
Does the U.S. exercise contagion on Italy? A theoretical model and empirical evidence
NASA Astrophysics Data System (ADS)
Cerqueti, Roy; Fenga, Livio; Ventura, Marco
2018-06-01
This paper deals with the theme of contagion in financial markets. To this end, we develop a model based on Mixed Poisson Processes to describe the abnormal returns of the financial markets of two countries. In so doing, the article defines the theoretical conditions to be satisfied in order to state that one of them - the so-called leader - exercises contagion on the others - the followers. Specifically, we employ an invariance result stating that a suitable transformation of a Mixed Poisson Process is still a Mixed Poisson Process. The theoretical claim is validated by an extensive simulation analysis grounded on empirical data. The countries considered are the U.S. (as the leader) and Italy (as the follower), and the period under scrutiny is very large, ranging from 1970 to 2014.
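A Mixed Poisson Process, as used in this abstract, is a Poisson count whose intensity is itself a random variable; the mixing inflates the variance above the mean (overdispersion). A toy simulation with a Gamma mixing distribution (hypothetical parameters, not the authors' fitted model):

```python
import math
import random

def poisson(lam, rng):
    """One Poisson(lam) draw via Knuth's multiplication algorithm."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def mixed_poisson_sample(n, rng):
    """n counts from a Poisson whose intensity is Gamma-distributed,
    i.e. a Mixed Poisson (negative-binomial) model -- a stand-in for
    the abnormal-return counts studied in the paper."""
    return [poisson(rng.gammavariate(2.0, 1.5), rng) for _ in range(n)]

rng = random.Random(42)
counts = mixed_poisson_sample(20000, rng)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# var exceeds mean: the signature of mixing that a plain Poisson lacks
```

For Gamma(shape=2, scale=1.5) mixing, the theoretical mean is 3 while the variance is mean + shape * scale^2 = 7.5, so the overdispersion is easy to see even in modest samples.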
NASA Technical Reports Server (NTRS)
Bergrun, Norman R
1952-01-01
An empirically derived basis for predicting the area, rate, and distribution of water-drop impingement on airfoils of arbitrary section is presented. The concepts involved represent an initial step toward the development of a calculation technique which is generally applicable to the design of thermal ice-prevention equipment for airplane wing and tail surfaces. It is shown that sufficiently accurate estimates, for the purpose of heated-wing design, can be obtained by a few numerical computations once the velocity distribution over the airfoil has been determined. The calculation technique presented is based on results of extensive water-drop trajectory computations for five airfoil cases which consisted of 15-percent-thick airfoils encompassing a moderate lift-coefficient range. The differential equations pertaining to the paths of the drops were solved by a differential analyzer.
Empirical molecular-dynamics study of diffusion in liquid semiconductors
NASA Astrophysics Data System (ADS)
Yu, W.; Wang, Z. Q.; Stroud, D.
1996-11-01
We report the results of an extensive molecular-dynamics study of diffusion in liquid Si and Ge (l-Si and l-Ge) and of impurities in l-Ge, using empirical Stillinger-Weber (SW) potentials with several choices of parameters. We use a numerical algorithm in which the three-body part of the SW potential is decomposed into products of two-body potentials, thereby permitting the study of large systems. One choice of SW parameters agrees very well with the observed l-Ge structure factors. The diffusion coefficients D(T) at melting are found to be approximately 6.4×10-5 cm2/s for l-Si, in good agreement with previous calculations, and about 4.2×10-5 and 4.6×10-5 cm2/s for two models of l-Ge. In all cases, D(T) can be fitted to an activated temperature dependence, with activation energies Ed of about 0.42 eV for l-Si, and 0.32 or 0.26 eV for two models of l-Ge, as calculated from either the Einstein relation or from a Green-Kubo-type integration of the velocity autocorrelation function. D(T) for Si impurities in l-Ge is found to be very similar to the self-diffusion coefficient of l-Ge. We briefly discuss possible reasons why the SW potentials give D(T)'s substantially lower than ab initio predictions.
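The two quantities this abstract extracts from the simulations, the diffusion coefficient via the Einstein relation and the activation energy from an Arrhenius fit of D(T), are simple to compute once the mean-square displacement is known. A sketch with illustrative numbers (the 6.4e-5 cm^2/s and 0.42 eV figures come from the abstract; the second temperature point is hypothetical):

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def einstein_diffusion(msd, t):
    """Einstein relation in three dimensions: D = <r^2> / (6 t)."""
    return msd / (6.0 * t)

def activation_energy(d1, t1, d2, t2):
    """Arrhenius fit through two points of D(T) = D0 * exp(-Ed / (kB T))."""
    return K_B * math.log(d1 / d2) / (1.0 / t2 - 1.0 / t1)

d_melt = 6.4e-5                 # cm^2/s for l-Si at melting (from the abstract)
t1, t2 = 1685.0, 1800.0         # K; the second point is illustrative
# synthesize a consistent second point from Ed = 0.42 eV, then recover it
d2 = d_melt * math.exp(-0.42 / K_B * (1.0 / t2 - 1.0 / t1))
ed = activation_energy(d_melt, t1, d2, t2)   # recovers 0.42 eV
```

In practice D is taken from the slope of the MSD over a long trajectory (or from the Green-Kubo velocity-autocorrelation integral the abstract mentions), and the Arrhenius fit uses many temperatures, not two.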
Chilenski, Sarah M; Olson, Jonathan R; Schulte, Jill A; Perkins, Daniel F; Spoth, Richard
2015-02-01
Prior theoretical and empirical research suggests that multiple aspects of an organization's context are likely related to a number of factors, from their interest and ability to adopt new programming, to client outcomes. A limited amount of the prior research has taken a more community-wide perspective by examining factors that associate with community readiness for change, leaving how these findings generalize to community organizations that conduct prevention or positive youth development programs unknown. Thus for the current study, we examined how the organizational context of the Cooperative Extension System (CES) associates with current attitudes and practices regarding prevention and evidence-based programming. Attitudes and practices have been found in the empirical literature to be key indicators of an organization's readiness to adopt prevention and evidence-based programming. Based on multi-level mixed models, results indicate that organizational management practices distinct from program delivery may affect an organization's readiness to adopt and implement new prevention and evidence-based youth programs, thereby limiting the potential public health impact of evidence-based programs. Openness to change, openness of leadership, and communication were the strongest predictors identified within this study. An organization's morale was also found to be a strong predictor of an organization's readiness. The findings of the current study are discussed in terms of implications for prevention and intervention.
Jet Aeroacoustics: Noise Generation Mechanism and Prediction
NASA Technical Reports Server (NTRS)
Tam, Christopher
1998-01-01
This report covers the third year of the project's research effort. The work focused on the fine-scale mixing noise of both subsonic and supersonic jets and on the effects of nozzle geometry and tabs on subsonic jet noise. In publication 1, a new semi-empirical theory of jet mixing noise from fine-scale turbulence is developed. By analogy to gas kinetic theory, it is shown that the source of noise is related to the time fluctuations of the turbulence kinetic energy. Starting from the Reynolds-averaged Navier-Stokes equations, a formula for the radiated noise is derived. An empirical model of the space-time correlation function of the turbulence kinetic energy is adopted. The form of the model is in good agreement with the space-time two-point velocity correlation function measured by Davies and coworkers. The parameters of the correlation are related to the parameters of the k-epsilon turbulence model; thus the theory is self-contained. Extensive comparisons between the computed noise spectra of the theory and experimental measurements have been carried out. The parameters include jet Mach numbers from 0.3 to 2.0 and temperature ratios from 1.0 to 4.8. Excellent agreement is found in spectrum shape, noise intensity, and directivity. It is envisaged that the theory will supersede all semi-empirical and totally empirical jet noise prediction methods in current use.
NASA Astrophysics Data System (ADS)
Pries-Heje, Jan; Baskerville, Richard L.
This paper elaborates a design science approach for management planning anchored to the concept of a management design theory. Unlike the notions of design theories arising from information systems, management design theories can appear as a system of technological rules, much as a system of hypotheses or propositions can embody scientific theories. The paper illustrates this form of management design theories with three grounded cases. These grounded cases include a software process improvement study, a user involvement study, and an organizational change study. Collectively these studies demonstrate how design theories founded on technological rules can not only improve the design of information systems, but that these concepts have great practical value for improving the framing of strategic organizational design decisions about such systems. Each case is either grounded in an empirical sense, that is to say, actual practice, or it is grounded to practices described extensively in the practical literature. Such design theories will help managers more easily approach complex, strategic decisions.
A quantitative test of population genetics using spatiogenetic patterns in bacterial colonies.
Korolev, Kirill S; Xavier, João B; Nelson, David R; Foster, Kevin R
2011-10-01
It is widely accepted that population-genetics theory is the cornerstone of evolutionary analyses. Empirical tests of the theory, however, are challenging because of the complex relationships between space, dispersal, and evolution. Critically, we lack quantitative validation of the spatial models of population genetics. Here we combine analytics, on- and off-lattice simulations, and experiments with bacteria to perform quantitative tests of the theory. We study two bacterial species, the gut microbe Escherichia coli and the opportunistic pathogen Pseudomonas aeruginosa, and show that spatiogenetic patterns in colony biofilms of both species are accurately described by an extension of the one-dimensional stepping-stone model. We use one empirical measure, genetic diversity at the colony periphery, to parameterize our models and show that we can then accurately predict another key variable: the degree of short-range cell migration along an edge. Moreover, the model allows us to estimate other key parameters, including effective population size (density) at the expansion frontier. While our experimental system is a simplification of natural microbial communities, we argue that it constitutes proof of principle that the spatial models of population genetics can quantitatively capture organismal evolution.
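The one-dimensional stepping-stone dynamics invoked in this abstract can be illustrated with a toy simulation. This is a minimal sketch, not the authors' model or code: a ring of demes in which each deme copies a random nearest neighbour every generation, so neutral drift coarsens allele diversity into sectors. All names and parameter values are illustrative.

```python
import numpy as np

def stepping_stone(n_demes=200, n_alleles=10, n_gens=500, seed=0):
    """Simulate a 1D stepping-stone (voter) model on a periodic frontier.

    Each generation, every deme is replaced by a copy of a randomly
    chosen nearest neighbour, so allele diversity coarsens into sectors.
    """
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, n_alleles, size=n_demes)
    for _ in range(n_gens):
        # pick a left (-1) or right (+1) neighbour for every deme at once
        offsets = rng.choice([-1, 1], size=n_demes)
        pop = pop[(np.arange(n_demes) + offsets) % n_demes]
    return pop

def heterozygosity(pop):
    """Probability that two randomly sampled demes carry different alleles."""
    _, counts = np.unique(pop, return_counts=True)
    f = counts / counts.sum()
    return 1.0 - np.sum(f ** 2)

final = stepping_stone()
# Neutral drift reduces diversity below its initial expected value
# of roughly 1 - 1/n_alleles = 0.9
h = heterozygosity(final)
```

In the experiments summarized above, an analogous diversity measure at the colony periphery is the quantity used to parameterize the model.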
The Dilemma of Service Productivity and Service Innovation
Aspara, Jaakko; Klein, Jan F.; Luo, Xueming; Tikkanen, Henrikki
2017-01-01
We conduct a systematic exploratory investigation of the effects of firms’ existing service productivity on the success of their new service innovations. Although previous research extensively addresses service productivity and service innovation, this is the first empirical study that bridges the gap between these two research streams and examines the links between the two concepts. Based on a comprehensive data set of new service introductions in a financial services market over a 14-year period, we empirically explore the relationship between a firm’s existing service productivity and the firm’s success in introducing new services to the market. The results unveil a fundamental service productivity-service innovation dilemma: Being productive in existing services increases a firm’s willingness to innovate new services proactively but decreases the firm’s capabilities of bringing these services to the market successfully. We provide specific insights into the mechanism underlying the complex relationship between a firm’s productivity in existing services, its innovation proactivity, and its service innovation success. For managers, we not only unpack and elucidate this dilemma but also demonstrate that a focused customer scope and growth market conditions may enable firms to mitigate the dilemma and successfully pursue service productivity and service innovation simultaneously. PMID:29706764
Unified Least Squares Methods for the Evaluation of Diagnostic Tests With the Gold Standard
Tang, Liansheng Larry; Yuan, Ao; Collins, John; Che, Xuan; Chan, Leighton
2017-01-01
The article proposes a unified least squares method to estimate the receiver operating characteristic (ROC) parameters for continuous and ordinal diagnostic tests, such as cancer biomarkers. The method is based on a linear model framework using the empirically estimated sensitivities and specificities as input “data.” It gives consistent estimates for regression and accuracy parameters when the underlying continuous test results are normally distributed after some monotonic transformation. The key difference between the proposed method and the method of Tang and Zhou lies in the response variable. The response variable in the latter is transformed empirical ROC curves at different thresholds. It takes on many values for continuous test results, but few values for ordinal test results. The limited number of values for the response variable makes it impractical for ordinal data. However, the response variable in the proposed method takes on many more distinct values so that the method yields valid estimates for ordinal data. Extensive simulation studies are conducted to investigate and compare the finite sample performance of the proposed method with an existing method, and the method is then used to analyze two real cancer diagnostic examples as an illustration. PMID:28469385
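The article's estimator itself is not reproduced here, but the general idea of least squares on empirically estimated operating points can be sketched under the binormal assumption, where probit-transformed sensitivities are linear in probit-transformed false positive rates: Phi^{-1}(TPR) = a + b * Phi^{-1}(FPR). Below is a minimal, stdlib-plus-NumPy illustration on synthetic data (not the article's method, data, or parameter names).

```python
import numpy as np
from statistics import NormalDist

nd = NormalDist()
rng = np.random.default_rng(1)

# Synthetic binormal test scores: healthy ~ N(0,1), diseased ~ N(1.5,1),
# so the true ROC is Phi(a + b * Phi^{-1}(FPR)) with a = 1.5, b = 1.
healthy = rng.normal(0.0, 1.0, 4000)
diseased = rng.normal(1.5, 1.0, 4000)

# Empirical sensitivities/"1 - specificities" at a grid of thresholds
thresholds = np.linspace(-1.0, 2.5, 25)
fpr = np.array([(healthy > t).mean() for t in thresholds])
tpr = np.array([(diseased > t).mean() for t in thresholds])

# Probit-transform the empirical operating points; OLS then recovers (a, b)
x = np.array([nd.inv_cdf(p) for p in fpr])
y = np.array([nd.inv_cdf(p) for p in tpr])
X = np.column_stack([np.ones_like(x), x])
(a_hat, b_hat), *_ = np.linalg.lstsq(X, y, rcond=None)
```

The point of the sketch is that the "data" fed to the linear model are themselves empirical estimates, which is the framework the abstract describes.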
ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.
Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro
2018-01-01
The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to a wide range of empirical data. The complexity of this distribution makes the use of computational tools an essential element, so there is a strong need for efficient and versatile computational tools for research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools programmed in Python and developed for the numerical analysis of data involving the ex-Gaussian probability density. To validate the package, we present an extensive analysis of fits obtained with it, discuss the advantages of and differences between the least squares and maximum likelihood methods, and quantitatively evaluate the goodness of the obtained fits (a point usually overlooked in most of the literature in the area). The analysis allows one to identify outliers in the empirical datasets and to determine, on clear criteria, whether data trimming is needed and at which points it should be done.
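The ExGUtils API itself is not reproduced here. As a hedged, NumPy-only sketch of working with the ex-Gaussian (a Gaussian plus an independent exponential), the standard method-of-moments estimators recover the three parameters from the sample mean, variance, and skewness; function names and parameter values are illustrative, not the package's.

```python
import numpy as np

def exgauss_sample(mu, sigma, tau, size, rng):
    """Draw ex-Gaussian variates: a Gaussian plus an independent exponential."""
    return rng.normal(mu, sigma, size) + rng.exponential(tau, size)

def exgauss_moments_fit(x):
    """Method-of-moments estimates (mu, sigma, tau) for ex-Gaussian data.

    Uses mean = mu + tau, var = sigma^2 + tau^2, and
    skew = 2 tau^3 / (sigma^2 + tau^2)^(3/2).
    """
    m, s = x.mean(), x.std(ddof=1)
    skew = np.mean(((x - m) / s) ** 3)
    tau = s * (skew / 2.0) ** (1.0 / 3.0)
    mu = m - tau
    sigma = np.sqrt(max(s * s - tau * tau, 0.0))
    return mu, sigma, tau

# Reaction-time-like synthetic data (values in milliseconds, illustrative)
rng = np.random.default_rng(7)
data = exgauss_sample(mu=300.0, sigma=40.0, tau=100.0, size=50_000, rng=rng)
mu_hat, sigma_hat, tau_hat = exgauss_moments_fit(data)
```

Least squares and maximum likelihood fits, as compared in the manuscript, generally improve on these moment estimates but need the same three-parameter density as a starting point.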
True Randomness from Big Data.
Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang
2016-09-26
Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical-world empirical extractors as measured by standardized tests.
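The paper's extractor construction is not reproduced here. As a minimal illustration of the general idea of extracting unbiased bits from a biased source, here is the classical von Neumann extractor, a far simpler technique than the one the paper proposes, and one that (unlike the paper's method) assumes i.i.d. input bits.

```python
import random

def von_neumann_extract(bits):
    """Classical von Neumann extractor: map non-overlapping pairs
    01 -> 0 and 10 -> 1; discard 00 and 11. For i.i.d. biased coin
    flips, the output bits are exactly unbiased."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

random.seed(3)
# Heavily biased source: P(1) = 0.8
raw = [1 if random.random() < 0.8 else 0 for _ in range(200_000)]
clean = von_neumann_extract(raw)
mean = sum(clean) / len(clean)  # close to 0.5 despite the biased input
```

The cost is throughput: only a fraction 2p(1-p) of input pairs yield an output bit, which is one reason practical extractors for big sources need more sophisticated constructions.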
Research-Based Implementation of Peer Instruction: A Literature Review
Vickrey, Trisha; Rosploch, Kaitlyn; Rahmanian, Reihaneh; Pilarz, Matthew; Stains, Marilyne
2015-01-01
Current instructional reforms in undergraduate science, technology, engineering, and mathematics (STEM) courses have focused on enhancing adoption of evidence-based instructional practices among STEM faculty members. These practices have been empirically demonstrated to enhance student learning and attitudes. However, research indicates that instructors often adapt rather than adopt practices, unknowingly compromising their effectiveness. Thus, there is a need to raise awareness of the research-based implementation of these practices, develop fidelity of implementation protocols to understand adaptations being made, and ultimately characterize the true impact of reform efforts based on these practices. Peer instruction (PI) is an example of an evidence-based instructional practice that consists of asking students conceptual questions during class time and collecting their answers via clickers or response cards. Extensive research has been conducted by physics and biology education researchers to evaluate the effectiveness of this practice and to better understand the intricacies of its implementation. PI has also been investigated in other disciplines, such as chemistry and computer science. This article reviews and summarizes these various bodies of research and provides instructors and researchers with a research-based model for the effective implementation of PI. Limitations of current studies and recommendations for future empirical inquiries are also provided. PMID:25713095
Evaluating the utility of two gestural discomfort evaluation methods
Son, Minseok; Jung, Jaemoon; Park, Woojin
2017-01-01
Evaluating physical discomfort of designed gestures is important for creating safe and usable gesture-based interaction systems; yet, gestural discomfort evaluation has not been extensively studied in HCI, and few evaluation methods seem currently available whose utility has been experimentally confirmed. To address this, this study empirically demonstrated the utility of the subjective rating method after a small number of gesture repetitions (a maximum of four repetitions) in evaluating designed gestures in terms of physical discomfort resulting from prolonged, repetitive gesture use. The subjective rating method has been widely used in previous gesture studies but without empirical evidence on its utility. This study also proposed a gesture discomfort evaluation method based on an existing ergonomics posture evaluation tool (Rapid Upper Limb Assessment) and demonstrated its utility in evaluating designed gestures in terms of physical discomfort resulting from prolonged, repetitive gesture use. Rapid Upper Limb Assessment is an ergonomics postural analysis tool that quantifies the work-related musculoskeletal disorders risks for manual tasks, and has been hypothesized to be capable of correctly determining discomfort resulting from prolonged, repetitive gesture use. The two methods were evaluated through comparisons against a baseline method involving discomfort rating after actual prolonged, repetitive gesture use. Correlation analyses indicated that both methods were in good agreement with the baseline. The methods proposed in this study seem useful for predicting discomfort resulting from prolonged, repetitive gesture use, and are expected to help interaction designers create safe and usable gesture-based interaction systems. PMID:28423016
An assessment of laser velocimetry in hypersonic flow
NASA Technical Reports Server (NTRS)
1992-01-01
Although extensive progress has been made in computational fluid mechanics, reliable flight vehicle designs and modifications still cannot be made without recourse to extensive wind tunnel testing. Future progress in the computation of hypersonic flow fields is restricted by the need for a reliable mean flow and turbulence modeling data base which could be used to aid in the development of improved empirical models for use in numerical codes. Currently, there are few compressible flow measurements which could be used for this purpose. In this report, the results of experiments designed to assess the potential for laser velocimeter measurements of mean flow and turbulent fluctuations in hypersonic flow fields are presented. Details of a new laser velocimeter system which was designed and built for this test program are described.
Modelling erosion on a daily basis, an adaptation of the MMF approach
NASA Astrophysics Data System (ADS)
Shrestha, Dhruba Pikha; Jetten, Victor G.
2018-02-01
The negative impact of soil erosion on ecosystem services and food security is well known. At the same time, the total precipitation an area receives can vary from year to year, and extreme rains may occur. Various empirical models have been used extensively in all climatic regions to assess annual erosion rates. While these models are simple to operate and do not require a lot of input data, they do not take the effect of extreme rain into account. Physically based models can simulate erosion processes during a storm, including particle detachment and the transport and deposition of sediments, but they are not applicable for assessing annual soil loss rates. Moreover, storm event data may not be available everywhere, limiting their widespread use.
On Modeling Eavesdropping Attacks in Underwater Acoustic Sensor Networks †
Wang, Qiu; Dai, Hong-Ning; Li, Xuran; Wang, Hao; Xiao, Hong
2016-01-01
The security and privacy of underwater acoustic sensor networks has received extensive attention recently due to the proliferation of underwater activities. This paper proposes an analytical model to investigate the eavesdropping attacks in underwater acoustic sensor networks. Our analytical framework considers the impacts of various underwater acoustic channel conditions (such as the acoustic signal frequency, spreading factor and wind speed) and different hydrophones (isotropic hydrophones and array hydrophones) in terms of network nodes and eavesdroppers. We also conduct extensive simulations to evaluate the effectiveness and the accuracy of our proposed model. Empirical results show that our proposed model is quite accurate. In addition, our results also imply that the eavesdropping probability heavily depends on both the underwater acoustic channel conditions and the features of hydrophones. PMID:27213379
Recent tests of the equilibrium-point hypothesis (lambda model).
Feldman, A G; Ostry, D J; Levin, M F; Gribble, P L; Mitnitski, A B
1998-07-01
The lambda model of the equilibrium-point hypothesis (Feldman & Levin, 1995) is an approach to motor control which, like physics, is based on a logical system coordinating empirical data. The model has gone through an interesting period. On one hand, several nontrivial predictions of the model have been successfully verified in recent studies. In addition, the explanatory and predictive capacity of the model has been enhanced by its extension to multimuscle and multijoint systems. On the other hand, claims have recently appeared suggesting that the model should be abandoned. The present paper focuses on these claims and concludes that they are unfounded. Much of the experimental data that have been used to reject the model are actually consistent with it.
Enabling communication concurrency through flexible MPI endpoints
Dinan, James; Grant, Ryan E.; Balaji, Pavan; ...
2014-09-23
MPI defines a one-to-one relationship between MPI processes and ranks. This model captures many use cases effectively; however, it also limits communication concurrency and interoperability between MPI and programming models that utilize threads. Our paper describes the MPI endpoints extension, which relaxes the longstanding one-to-one relationship between MPI processes and ranks. Using endpoints, an MPI implementation can map separate communication contexts to threads, allowing them to drive communication independently. Also, endpoints enable threads to be addressable in MPI operations, enhancing interoperability between MPI and other programming models. Furthermore, these characteristics are illustrated through several examples and an empirical study that contrasts current multithreaded communication performance with the need for high degrees of communication concurrency to achieve peak communication performance.
The humanistic psychology-positive psychology divide: contrasts in philosophical foundations.
Waterman, Alan S
2013-04-01
The relationship between the fields of humanistic and positive psychology has been marked by continued tension and ambivalence. This tension can be traced to extensive differences in the philosophical grounding characterizing the two perspectives within psychology. These differences exist with respect to (a) ontology, including the ways in which human nature is conceptualized regarding human potentials and well-being; (b) epistemology, specifically, the choice of research strategies for the empirical study of these concepts; and (c) practical philosophy, particularly the goals and strategies adopted when conducting therapy or undertaking counseling interventions. Because of this philosophical divide, adherents of the two perspectives may best be advised to pursue separately their shared desire to understand and promote human potentials and well-being.
Framing the inborn aging process and longevity science.
Farrelly, Colin
2010-06-01
The medical sciences are currently dominated by the "disease-model" approach to health extension, an approach that prioritizes the study of pathological mechanisms with the goal of discovering treatment modalities for specific diseases. This approach has marginalized research on the aging process itself, research that could lead to an intervention that retards aging, thus conferring health dividends that would far exceed what could be expected by eliminating any specific disease of aging. This paper offers a diagnosis of how this sub-optimal approach to health extension arose and some general prescriptions concerning how progress could be made in terms of adopting a more rational approach to health extension. Drawing on empirical findings from psychology and economics, "prospect theory" is applied to the challenges of "framing" the inborn aging process given the cognitive capacities of real (rather than rational) decision-makers under conditions of risk and uncertainty. Prospect theory reveals that preferences are in fact dependent on whether particular outcomes of a choice are regarded as "a loss" or "a gain", relative to a reference point (or "aspiration level for survival"). And this has significant consequences for the way biogerontologists ought to characterise the central aspirations of the field (i.e. to prevent disease versus extend lifespan). Furthermore, it reveals the importance of shifting the existing reference point of the medical sciences to one that is shaped by the findings of evolutionary biology and biodemography.
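The prospect-theoretic framing argument above can be made concrete with the Tversky-Kahneman (1992) value function and their commonly cited parameter estimates (alpha = beta = 0.88, lambda = 2.25). This is a standard illustration of loss aversion relative to a reference point, not a model taken from the paper itself.

```python
def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky-Kahneman (1992) value function: concave for gains,
    convex and steeper (loss aversion, lambda > 1) for losses,
    measured relative to a reference point at x = 0."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# Framing the same outcome as a loss makes it loom larger than an
# equal-sized gain relative to the reference point: |v(-10)| > v(+10).
gain = pt_value(10.0)
loss = pt_value(-10.0)
```

This asymmetry is why the choice of reference point ("aspiration level for survival") matters for how interventions in the aging process are framed: the same health outcome is weighted differently when cast as a forgone gain versus a loss.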
NASA Technical Reports Server (NTRS)
Devenport, William J.; Glegg, Stewart A. L.
1995-01-01
This report summarizes accomplishments and progress for the period ending April 1995. Much of the work during this period has concentrated on preparation for an analysis of data produced by an extensive wind tunnel test. Time has also been spent further developing an empirical theory to account for the effects of blade-vortex interaction upon the circulation distribution of the vortex and on preliminary measurements aimed at controlling the vortex core size.
Mesoscale Particle-Based Model of Electrophoresis
Giera, Brian; Zepeda-Ruiz, Luis A.; Pascall, Andrew J.; ...
2015-07-31
Here, we develop and evaluate a semi-empirical particle-based model of electrophoresis using extensive mesoscale simulations. We parameterize the model using only measurable quantities from a broad set of colloidal suspensions with properties that span the experimentally relevant regime. With sufficient sampling, simulated diffusivities and electrophoretic velocities match predictions of the ubiquitous Stokes-Einstein and Henry equations, respectively. This agreement holds for non-polar and aqueous solvents or ionic liquid colloidal suspensions under a wide range of applied electric fields.
Heat transfer correlations for multilayer insulation systems
NASA Astrophysics Data System (ADS)
Krishnaprakas, C. K.; Badari Narayana, K.; Dutta, Pradip
2000-01-01
Multilayer insulation (MLI) blankets are extensively used in spacecraft as lightweight thermal protection systems. Full heat transfer analysis of MLI is often too complex for practical design applications, so for engineering design purposes it is necessary to have simpler procedures to evaluate the heat transfer rate through MLI. In this paper, four different empirical models for heat transfer are evaluated by fitting against experimentally observed heat flux through MLI blankets of various configurations, and the results are discussed.
Gender earnings and poverty reduction: post-Communist Uzbekistan.
Bhat, Bilal Ahmad Bhat
2011-01-01
Women get less of the material resources, social status, power, and opportunities for self-actualization than do men who share their social location, be it a location based on class, race, occupation, ethnicity, religion, education, nationality, or any intersection of these factors. The process of feminization of poverty in Central Asia and Uzbekistan is intimately connected to the cultural and institutional limitations that put a ceiling on women's involvement in economic activity. This article attempts to study and explore gender in the context of poverty reduction in Uzbekistan, the most populated state of Central Asia, to understand the ways in which poverty and other forms of deprivation demand women's participation in a variety of contexts. The study is primarily an empirical one and is based on an extensive sociological investigation in the field.
NASA Astrophysics Data System (ADS)
Tiwari, Harinarayan; Sharma, Nayan
2017-05-01
This research paper focuses on the need for turbulence measurements, on instruments reliable enough to capture turbulence, on different turbulence parameters, and on some advanced methodologies that can decompose turbulence structures at different levels near hydraulic structures. Small-scale turbulence research has valid prospects in open channel flow. The relevance of the study is amplified because introducing any hydraulic structure in the channel disturbs the natural flow and creates a discontinuity. To recover from this discontinuity, the piano key weir (PKW) might be used with sloped keys. The constraints of empirical results in the vicinity of the PKW necessitate extensive laboratory experiments with fair and reliable instrumentation techniques. Using principal component analysis, the acoustic Doppler velocimeter was established to be best suited, within a range of limitations. Wavelet analysis is proposed to better decompose the underlying turbulence structure.
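The wavelet decomposition proposed in this abstract is not specified in detail. As a minimal illustration of the idea, a single level of the orthonormal Haar transform splits a velocity record into a coarse (large-scale) approximation and small-scale details while preserving energy; the synthetic signal below is illustrative only.

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar wavelet transform.

    Splits a signal of even length into a coarse approximation
    (scaled local averages) and details (scaled local differences);
    energy is preserved: ||x||^2 = ||approx||^2 + ||detail||^2.
    """
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

# Synthetic "velocity" record: slow trend plus a fast fluctuation
t = np.arange(1024)
u = np.sin(2 * np.pi * t / 256) + 0.3 * np.sin(2 * np.pi * t / 8)
approx, detail = haar_step(u)
energy_in = np.sum(u ** 2)
energy_out = np.sum(approx ** 2) + np.sum(detail ** 2)
```

Applying the step recursively to the approximation yields the multi-level decomposition that separates turbulence structures at different scales.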
Understanding Drug Use Over the Life Course: Past, Present, and Future
Hser, Yih-Ing; Hamilton, Alison; Niv, Noosha
2009-01-01
Over the past 20 years, much exciting addiction research has been conducted. Extensive knowledge has been gathered about comorbid issues, particularly mental health disorders, HIV, and criminal justice involvement. Health services addiction research has become increasingly sophisticated, shifting its focus from patients to also consider services, organizations, and financing structures. Furthermore, through several long-term follow-up studies, empirical evidence convincingly demonstrates that drug dependence is not an acute disorder and is best understood through a life course perspective with an emphasis on chronicity. This article highlights three major directions for future addiction research: developing strategies for chronic care (including longitudinal intervention studies), furthering cross-system linkage and coordination, and utilizing innovative methods (e.g., growth curve modeling, longitudinal mixed methods research) to strengthen the evidence base for the life course perspective on drug addiction. PMID:21234276
Ray tracing study of rising tone EMIC-triggered emissions
NASA Astrophysics Data System (ADS)
Hanzelka, Miroslav; Santolík, Ondřej; Grison, Benjamin; Cornilleau-Wehrlin, Nicole
2017-04-01
ElectroMagnetic Ion Cyclotron (EMIC) triggered emissions have been the subject of extensive theoretical and experimental research in recent years. These emissions are characterized by high coherence values and a frequency range of 0.5-2.0 Hz, close to the local helium gyrofrequency. We perform ray tracing case studies of rising tone EMIC-triggered emissions observed by the Cluster spacecraft in both nightside and dayside regions off the equatorial plane. By comparing simulated and measured wave properties, namely wave vector orientation, group velocity, dispersion, and ellipticity of polarization, we determine possible source locations. A diffusive equilibrium density model and other semi-empirical models are used, with the ion composition inferred from cross-over frequencies. Ray tracing simulations are done in the cold plasma approximation with the inclusion of Landau and cyclotron damping. Various widths, locations, and profiles of the plasmapause are tested.
NASA Astrophysics Data System (ADS)
Weißenborn, E.; Bossmeyer, T.; Bertram, T.
2011-08-01
Tighter emission regulations are driving the development of advanced engine control strategies that rely on feedback information from the combustion chamber. In this context, alternatives to expensive in-cylinder pressure sensors are especially sought. The present study addresses these issues through a simulation-based approach. It focuses on the extension of an empirical, zero-dimensional cylinder pressure model using the engine speed signal in order to detect cylinder-wise variations in combustion. As a special feature, only information available from the standard sensor configuration is utilized. Within the study, different methods for the model-based reconstruction of the combustion pressure, including nonlinear Kalman filtering, are compared. As a result, the accuracy of the cylinder pressure model can be enhanced. At the same time, the inevitable limitations of the proposed methods are outlined.
McVey, Alana J; Dolan, Bridget K; Willar, Kirsten S; Pleiss, Sheryl; Karst, Jeffrey S; Casnar, Christina L; Caiozzo, Christina; Vogt, Elisabeth M; Gordon, Nakia S; Van Hecke, Amy Vaughan
2016-12-01
Young adults with ASD experience difficulties with social skills, empathy, loneliness, and social anxiety. One intervention, PEERS® for Young Adults, shows promise in addressing these challenges. The present study replicated and extended the original study by recruiting a larger sample (N = 56), employing a gold standard ASD assessment tool, and examining changes in social anxiety utilizing a randomized controlled trial design. Results indicated improvements in social responsiveness (SSIS-RS SS, p = .006 and CPB, p = .005; SRS, p = .004), PEERS® knowledge (TYASSK, p = .001), empathy (EQ, p = .044), direct interactions (QSQ-YA, p = .059), and social anxiety (LSAS-SR, p = .019). Findings demonstrate further empirical support for the intervention for individuals with ASD.
Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.
Vassena, Eliana; Holroyd, Clay B; Alexander, William H
2017-01-01
In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.
Hemakom, Apit; Goverdovsky, Valentin; Looney, David; Mandic, Danilo P
2016-04-13
An extension to multivariate empirical mode decomposition (MEMD), termed adaptive-projection intrinsically transformed MEMD (APIT-MEMD), is proposed to cater for power imbalances and inter-channel correlations in real-world multichannel data. It is shown that APIT-MEMD exhibits similar or better performance than MEMD for a large number of projection vectors, whereas it outperforms MEMD in the critical case of a small number of projection vectors within the sifting algorithm. We also employ the noise-assisted APIT-MEMD within our proposed intrinsic multiscale analysis framework and illustrate the advantages of such an approach in a notoriously noise-dominated cooperative brain-computer interface (BCI) based on steady-state visual evoked potentials and P300 responses. Finally, we show that for a joint cognitive BCI task, the proposed intrinsic multiscale analysis framework improves system performance in terms of the information transfer rate. © 2016 The Author(s).
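At the heart of both MEMD and APIT-MEMD is the projection of the multichannel signal onto direction vectors on a unit hypersphere, with envelopes interpolated along each direction. The following is an illustrative sketch of that standard projection step only (not APIT's adaptive variant): function names are our own, random direction sampling stands in for the low-discrepancy Hammersley sampling used in practice, and full sifting with maxima and minima envelopes is omitted.

```python
import numpy as np

def direction_vectors(n_dirs, n_channels, seed=0):
    """Quasi-uniform unit vectors on the (n_channels-1)-sphere (random here;
    MEMD implementations typically use a Hammersley sequence instead)."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=(n_dirs, n_channels))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def mean_envelope(signal, n_dirs=16):
    """One building block of MEMD sifting: average of multivariate envelopes
    obtained along each projection direction (maxima envelopes only here)."""
    n, c = signal.shape
    t = np.arange(n)
    env_sum = np.zeros_like(signal, dtype=float)
    for d in direction_vectors(n_dirs, c):
        proj = signal @ d  # scalar projection of each sample onto direction d
        # local maxima of the projection define the envelope knots
        idx = np.where((proj[1:-1] > proj[:-2]) & (proj[1:-1] > proj[2:]))[0] + 1
        if len(idx) < 2:
            continue
        for ch in range(c):  # interpolate each channel through the knots
            env_sum[:, ch] += np.interp(t, idx, signal[idx, ch])
    return env_sum / n_dirs

# bivariate test signal: a 5 Hz rotation in two channels
t = np.linspace(0, 1, 500)
x = np.column_stack([np.sin(2*np.pi*5*t), 0.5*np.cos(2*np.pi*5*t)])
m = mean_envelope(x)
print(m.shape)  # (500, 2)
```

In full MEMD, the minima envelopes are also computed, the mean envelope is subtracted, and the process iterates until an intrinsic mode function is isolated; APIT-MEMD additionally adapts the direction set to the data's power imbalance.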
NASA Astrophysics Data System (ADS)
Cañon-Tapia, Edgardo; Mendoza-Borunda, Ramón
2014-06-01
The distribution of volcanic features is ultimately controlled by processes taking place beneath the surface of a planet. For this reason, characterization of volcano distribution at a global scale can be used to obtain insights concerning dynamic aspects of planetary interiors. To date, studies of this type have focused on volcanic features of a specific type, or have concentrated on relatively small regions. In this paper (the first of a series of three) we describe the distribution of volcanic features observed over the entire surface of the Earth, combining an extensive database of submarine and subaerial volcanoes. The analysis is based on spatial density contours obtained with the Fisher kernel. Using an empirical approach that makes no a priori assumptions concerning the number of modes that should characterize the density distribution of volcanism, we identified the most significant modes. Using those modes as a base, the relevant distance for the formation of clusters of volcanoes is constrained to be on the order of 100 to 200 km. In addition, the most significant modes lead to the identification of clusters that outline the most important tectonic margins on Earth without the need for ad hoc assumptions. Consequently, we suggest that this method has the potential to yield insights about the probable occurrence of tectonic features within other planets.
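The Fisher kernel used for the spatial density contours is, in essence, a von Mises-Fisher kernel density estimate on the sphere, where the concentration parameter kappa plays the role of an inverse bandwidth. A minimal sketch under that assumption, with hypothetical volcano coordinates chosen purely for illustration:

```python
import numpy as np

def to_unit(lat_deg, lon_deg):
    """Convert latitude/longitude (degrees) to unit vectors on the sphere."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    return np.column_stack([np.cos(lat) * np.cos(lon),
                            np.cos(lat) * np.sin(lon),
                            np.sin(lat)])

def fisher_density(eval_pts, volcano_pts, kappa=50.0):
    """Spherical kernel density with a Fisher (von Mises-Fisher) kernel.
    Larger kappa -> tighter kernels around each volcano. The kernel's
    normalizing constant on the sphere is kappa / (4*pi*sinh(kappa))."""
    c = kappa / (4 * np.pi * np.sinh(kappa))
    dots = eval_pts @ volcano_pts.T  # cosines of angular distances
    return c * np.exp(kappa * dots).mean(axis=1)

# hypothetical volcano coordinates (lat, lon), for illustration only
volcanoes = to_unit(np.array([19.4, 35.4, -39.3]),
                    np.array([-155.3, -111.7, 175.6]))
# evaluate density at a point near the first volcano and at a remote point
grid = to_unit(np.array([19.0, 0.0]), np.array([-155.0, 0.0]))
d = fisher_density(grid, volcanoes, kappa=50.0)
print(d[0] > d[1])  # True: density is higher near a volcano than far away
```

Density contours over a global grid of such evaluations, at several kappa values, would reproduce the kind of multi-scale mode analysis the abstract describes.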
Response of the Morus bombycis growing season to temperature and its latitudinal pattern in Japan.
Doi, Hideyuki
2012-09-01
Changes in leaf phenology lengthen the growing season length (GSL, the number of days between leaf budburst and leaf fall) under global warming. GSL and the leaf phenology response to climate change are among the most important predictors of climate change effects on plants. Empirical evidence of climatic effects on GSL remains scarce, however, especially at regional scales and across latitudinal gradients. This study analyzed datasets of leaf budburst and leaf fall phenology in Morus bombycis (Urticales), observed by the Japan Meteorological Agency (JMA) from 1953 to 2005 over a wide range of latitudes in Japan (31 to 44° N). Single regression slopes of leaf phenological timing against air temperature across Japan were calculated, and their spatial patterns were tested using general linear models. The results showed that GSL extension was caused mainly by a delay in leaf fall phenology. Relationships between latitude and the responses of leaf phenology and GSL to air temperature were significantly negative: the responses at lower latitudes were larger than those at higher latitudes. The findings indicate that GSL extension should be considered with regard to both latitude and climate change.
Salloch, Sabine; Schildmann, Jan; Vollmann, Jochen
2012-04-13
The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such linkage between the empirical research and the normative analysis. In the first part of this paper, we will outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) The complete lack of normative analysis, and (2) cryptonormativity and a missing account with regard to the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration will be presented and how these concepts may contribute to improve the linkage between normative and empirical aspects of empirical research in medical ethics will be demonstrated. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis.
NASA Astrophysics Data System (ADS)
Mullen, Katharine M.
Human-technology integration is the replacement of human parts and the extension of human capabilities with engineered devices and substrates. The result is hybrid biological-artificial systems. We discuss here four categories of products furthering human-technology integration: wearable computers, pervasive computing environments, engineered tissues and organs, and prosthetics, and introduce examples of currently realized systems in each category. We then note that realization of a completely artificial system via the path of human-technology integration presents the prospect of empirical confirmation of an aware, artificially embodied system.
NASA Astrophysics Data System (ADS)
Mosha, Herme Joseph
1988-03-01
This article seeks to identify factors affecting the quality of primary education in five regions of Tanzania by extensively reviewing relevant literature and empirical data. Some of the shortcomings emphasised by the author are: frequent staff turnover, declining financial support for primary education, ineffective curricula, shortage of teachers' guides and textbooks, and unfavourable working conditions for teachers in rural areas. Beyond this, the need for freely available material, efficient school management and regular inspections is stressed by the author.
Compressive Properties of Extruded Polytetrafluoroethylene
2007-07-01
against equivalent temperature (T_map) at a single strain rate (ε̇_map). This is a pragmatic, empirically based linearization and extension to large strains...one of the strain rates that was used in the experimental program, and in this case two rates were used: 0.1 s⁻¹ and 3200 s⁻¹. The value T_map is...defined as T_map = T_exp + A(log ε̇_map - log ε̇_exp) (Eq. 11), where the subscript exp indicates the experimental values of strain rate and temperature.
Dilution jet mixing program, phase 3
NASA Technical Reports Server (NTRS)
Srinivasan, R.; Coleman, E.; Myers, G.; White, C.
1985-01-01
The main objectives for the NASA Jet Mixing Phase 3 program were: extension of the data base on the mixing of single sided rows of jets in a confined cross flow to discrete slots, including streamlined, bluff, and angled injections; quantification of the effects of geometrical and flow parameters on penetration and mixing of multiple rows of jets into a confined flow; investigation of in-line, staggered, and dissimilar hole configurations; and development of empirical correlations for predicting temperature distributions for discrete slots and multiple rows of dilution holes.
Raible, C; Leidl, R
2004-11-01
The German hospital market faces an extensive process of consolidation. Amid this change, hospitals consider cooperation as one possibility to improve competitiveness. The aim of this study was to investigate whether theoretical approaches from cooperation research can explain the changes in the German hospital market. The aims and mechanisms of the theories, their relevance in terms of content, and their potential for empirical testing were used as criteria to assess the approaches, with current and future trends in the German hospital market providing the framework. Based on a literature review, six theoretical approaches were investigated: industrial organization, transaction cost theory, game theory, resource dependency, institutional theory, and co-operative investment and finance theory. In addition, the data needed to empirically test the theories were specified. As a general problem, some of the theoretical approaches presuppose a perfect market. This precondition is not met by the heavily regulated German hospital market. Given the current regulations and the assessment criteria, industrial organization as well as resource-dependency and institutional theory approaches showed the highest potential to explain various aspects of the changes in the hospital market. So far, none of the approaches investigated provides a comprehensive and empirically tested explanation of the changes in the German hospital market. However, some of the approaches provide a theoretical background for part of the changes. As this dynamic market is economically of high significance, there is a need for further development and empirical testing of relevant theoretical approaches.
NASA Astrophysics Data System (ADS)
Bijl, Piet; Reynolds, Joseph P.; Vos, Wouter K.; Hogervorst, Maarten A.; Fanning, Jonathan D.
2011-05-01
The TTP (Targeting Task Performance) metric, developed at NVESD, is the current standard US Army model to predict EO/IR Target Acquisition performance. This model, however, does not have a corresponding lab or field test to empirically assess the performance of a camera system. The TOD (Triangle Orientation Discrimination) method, developed at TNO in The Netherlands, provides such a measurement. In this study, we make a direct comparison between TOD performance for a range of sensors and the extensive historical US observer performance database built to develop and calibrate the TTP metric. The US perception data were collected from military personnel performing an identification task on a standard 12 target, 12 aspect tactical vehicle image set that was processed through simulated sensors for which the most fundamental sensor parameters such as blur, sampling, spatial and temporal noise were varied. In the present study, we measured TOD sensor performance using exactly the same sensors processing a set of TOD triangle test patterns. The study shows that good overall agreement is obtained when the ratio between target characteristic size and TOD test pattern size at threshold equals 6.3. Note that this number is purely based on empirical data without any intermediate modeling. The calibration of the TOD to the TTP is highly beneficial to the sensor modeling and testing community for a variety of reasons. These include: i) a connection between requirement specification and acceptance testing, and ii) a very efficient method to quickly validate or extend the TTP range prediction model to new systems and tasks.
A new radio propagation model at 2.4 GHz for wireless medical body sensors in outdoor environment.
Yang, Daniel S
2013-01-01
This study investigates the effect of antenna height, receive antenna placement on the human body, and distance between transmitter and receiver on the loss of wireless signal power, in order to develop a wireless propagation model for wireless body sensors. Although many studies looked at the effect of distance, few studies were found that methodically investigated the effect of antenna height and antenna placement on the human body. Transmit antenna heights of 1, 2, and 3 meters, receive antenna heights of 1 and 1.65 meters, "on-body" and "off-body" placements of the receive antenna, and a total of 11 distances ranging from 1 to 45 meters are tested in relation to received power in dBm. Multiple regression is used to analyze the data. Significance of a variable is tested by comparing its p-value with alpha, and model fit is assessed using the adjusted R² and the residual standard error. It is found that an increase in antenna height would increase power, but only for the transmit antenna. The receive antenna height has a surprising, opposite effect in the on-body case and an insignificant effect in the off-body case. To formalize the propagation model, coefficient values from the multiple regression are incorporated in an extension of the log-distance model to produce a new empirical model for the on-body and off-body cases, and the new empirical model could conceivably be utilized to design more reliable wireless links for medical body sensors.
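The model described is an extension of the classic log-distance path-loss model with antenna-height terms, fitted by multiple regression. A minimal sketch on synthetic data (the coefficients, noise level, and variable set below are illustrative, not the paper's fitted values):

```python
import numpy as np

# Hypothetical measurements for illustration: received power (dBm) versus
# distance, transmit antenna height (m), and receive antenna height (m),
# spanning roughly the ranges tested in the study.
rng = np.random.default_rng(1)
n = 60
dist = rng.uniform(1, 45, n)
h_tx = rng.choice([1.0, 2.0, 3.0], n)
h_rx = rng.choice([1.0, 1.65], n)
# synthetic "truth": log-distance path loss (exponent 2.2) plus height effects
p_dbm = -40 - 10*2.2*np.log10(dist) + 1.5*h_tx - 2.0*h_rx + rng.normal(0, 1, n)

# Multiple regression (ordinary least squares) for the extended model:
#   P = b0 + b1*log10(d) + b2*h_tx + b3*h_rx
X = np.column_stack([np.ones(n), np.log10(dist), h_tx, h_rx])
beta, *_ = np.linalg.lstsq(X, p_dbm, rcond=None)

# In the log-distance model, the slope on log10(d) is -10n, so the fitted
# path-loss exponent is recovered as:
path_loss_exponent = -beta[1] / 10
print(round(path_loss_exponent, 2))  # close to the synthetic truth of 2.2
```

The sign and size of `beta[2]` and `beta[3]` play the role of the antenna-height effects the abstract reports (positive for transmit height, and a separate fit per on-body/off-body case).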
NASA Astrophysics Data System (ADS)
Zhu, Xiaowei; Iungo, G. Valerio; Leonardi, Stefano; Anderson, William
2017-02-01
For a horizontally homogeneous, neutrally stratified atmospheric boundary layer (ABL), aerodynamic roughness length, z_0, is the effective elevation at which the streamwise component of mean velocity is zero. A priori prediction of z_0 based on topographic attributes remains an open line of inquiry in planetary boundary-layer research. Urban topographies - the topic of this study - exhibit spatial heterogeneities associated with variability of building height, width, and proximity with adjacent buildings; such variability renders a priori, prognostic z_0 models appealing. Here, large-eddy simulation (LES) has been used in an extensive parametric study to characterize the ABL response (and z_0) to a range of synthetic, urban-like topographies wherein statistical moments of the topography have been systematically varied. Using LES results, we determined the hierarchical influence of topographic moments relevant to setting z_0. We demonstrate that standard deviation and skewness are important, while kurtosis is negligible. This finding is reconciled with a model recently proposed by Flack and Schultz (J Fluids Eng 132:041203-1-041203-10, 2010), who demonstrate that z_0 can be modelled with standard deviation and skewness, and two empirical coefficients (one for each moment). We find that the empirical coefficient related to skewness is not constant, but exhibits a dependence on standard deviation over certain ranges. For idealized, quasi-uniform cubic topographies and for complex, fully random urban-like topographies, we demonstrate strong performance of the generalized Flack and Schultz model against contemporary roughness correlations.
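The Flack and Schultz form referred to predicts roughness from two topographic moments, standard deviation and skewness, with two empirical coefficients. A sketch of that functional form follows; the coefficients a and b and the terrain samples are illustrative placeholders, not the published calibration.

```python
import numpy as np

def roughness_length(heights, a=0.25, b=1.0):
    """Flack & Schultz-style prognostic roughness model: z0 predicted from the
    standard deviation and skewness of the height field via two empirical
    coefficients. a and b here are illustrative, not the published values."""
    sigma = heights.std()
    skew = ((heights - heights.mean())**3).mean() / sigma**3
    return a * sigma * (1.0 + skew)**b

# two synthetic "terrains": one symmetric (skewness ~ 0), one with
# positively skewed, peak-dominated heights
rng = np.random.default_rng(0)
flat = rng.normal(0.0, 1.0, 10_000)
peaky = np.abs(rng.normal(0.0, 1.0, 10_000))

# positive skewness increases predicted z0 even though peaky has the
# smaller standard deviation
print(roughness_length(peaky) > roughness_length(flat))  # True
```

The LES finding quoted in the abstract, that the skewness coefficient is itself a function of standard deviation over certain ranges, would correspond to replacing the constant b with b(sigma) in this sketch.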
Stochastic tools hidden behind the empirical dielectric relaxation laws
NASA Astrophysics Data System (ADS)
Stanislavsky, Aleksander; Weron, Karina
2017-03-01
The paper is devoted to recent advances in stochastic modeling of anomalous kinetic processes observed in dielectric materials, which are prominent examples of disordered (complex) systems. Theoretical studies of the dynamical properties of 'structures with variations' (Goldenfeld and Kadanoff 1999 Science 284 87-9) require the application of mathematical tools by means of which their random nature can be analyzed and, independently of the details distinguishing various systems (dipolar materials, glasses, semiconductors, liquid crystals, polymers, etc.), the empirical universal kinetic patterns can be derived. We begin with a brief survey of the historical background of the dielectric relaxation study. After a short outline of the theoretical ideas providing the random tools applicable to modeling of relaxation phenomena, we present probabilistic implications for the study of relaxation-rate distribution models. In the framework of the probability distribution of relaxation rates, we consider the description of complex systems in which relaxing entities form random clusters interacting with each other and with single entities. Then we focus on stochastic mechanisms of the relaxation phenomenon. We discuss the diffusion approach and its usefulness for understanding the anomalous dynamics of relaxing systems. We also discuss extensions of the diffusive approach to systems under tempered random processes. Useful relationships among different stochastic approaches to the anomalous dynamics of complex systems allow us to get a fresh look at this subject. The paper closes with a final discussion on achievements of stochastic tools describing the anomalous time evolution of complex systems.
Linear combination methods to improve diagnostic/prognostic accuracy on future observations
Kang, Le; Liu, Aiyi; Tian, Lili
2014-01-01
Multiple diagnostic tests or biomarkers can be combined to improve diagnostic accuracy. The problem of finding the optimal linear combinations of biomarkers to maximise the area under the receiver operating characteristic curve has been extensively addressed in the literature. The purpose of this article is threefold: (1) to provide an extensive review of the existing methods for biomarker combination; (2) to propose a new combination method, namely, the nonparametric stepwise approach; (3) to use the leave-one-pair-out cross-validation method, instead of the re-substitution method, which is overoptimistic and hence might lead to wrong conclusions, to empirically evaluate and compare the performance of different linear combination methods in yielding the largest area under the receiver operating characteristic curve. A data set of Duchenne muscular dystrophy was analysed to illustrate the applications of the discussed combination methods. PMID:23592714
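The quantity at the centre of this literature, the empirical AUC of a linear biomarker combination, can be computed directly from the Mann-Whitney statistic. A sketch with two hypothetical, weakly separating biomarkers (the leave-one-pair-out cross-validation loop and the weight search are omitted for brevity; an equal-weight combination stands in for an optimised one):

```python
import numpy as np

def auc(scores, labels):
    """Empirical AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen diseased score exceeds a randomly chosen healthy score,
    counting ties as 1/2."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

# two hypothetical biomarkers, each weakly separating cases from controls
rng = np.random.default_rng(2)
n = 200
labels = rng.integers(0, 2, n)
m1 = rng.normal(labels * 0.8, 1.0)
m2 = rng.normal(labels * 0.8, 1.0)

a_single = auc(m1, labels)        # one biomarker alone
a_combo = auc(m1 + m2, labels)    # equal-weight linear combination
print(round(a_single, 2), round(a_combo, 2))
```

An optimal-combination method searches over the weight vector to maximise this AUC, and leave-one-pair-out cross-validation re-estimates the weights with each case-control pair held out so the reported AUC is not re-substitution-biased.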
Wangmo, Tenzin; Hauri, Sirin; Gennet, Eloise; Anane-Sarpong, Evelyn; Provoost, Veerle; Elger, Bernice S
2018-02-07
A review of literature published a decade ago noted a significant increase in empirical papers across nine bioethics journals. This study provides an update on the presence of empirical papers in the same nine journals. It first evaluates whether the empirical trend is continuing as noted in the previous study, and second, how it is changing, that is, what are the characteristics of the empirical works published in these nine bioethics journals. A review of the same nine journals (Bioethics; Journal of Medical Ethics; Journal of Clinical Ethics; Nursing Ethics; Cambridge Quarterly of Healthcare Ethics; Hastings Center Report; Theoretical Medicine and Bioethics; Christian Bioethics; and Kennedy Institute of Ethics Journal) was conducted for a 12-year period from 2004 to 2015. Data obtained were analysed descriptively and using a non-parametric Chi-square test. Of the total number of original papers (N = 5567) published in the nine bioethics journals, 18.1% (n = 1007) collected and analysed empirical data. Journal of Medical Ethics and Nursing Ethics led the empirical publications, accounting for 89.4% of all empirical papers. The former published significantly more quantitative papers than qualitative, whereas the latter published more qualitative papers. Our analysis reveals no significant difference (χ2 = 2.857; p = 0.091) between the proportion of empirical papers published in 2004-2009 and 2010-2015. However, the increasing empirical trend has continued in these journals, with the proportion of empirical papers increasing from 14.9% in 2004 to 17.8% in 2015. This study presents the current state of affairs regarding empirical research published in nine bioethics journals. In the quarter century of data that is available about the nine bioethics journals studied in two reviews, the proportion of empirical publications continues to increase, signifying a trend towards empirical research in bioethics.
The growing volume is mainly attributable to two journals: Journal of Medical Ethics and Nursing Ethics. This descriptive study further maps the still developing field of empirical research in bioethics. Additional studies are needed to completely map the nature and extent of empirical research in bioethics to inform the ongoing debate about the value of empirical research for bioethics.
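The period comparison reported above is a standard Pearson chi-square test on a 2x2 contingency table (empirical vs. non-empirical papers, by period). A sketch with purely hypothetical counts (the study's own per-period counts are not reproduced here):

```python
import numpy as np

def chi2_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (no continuity correction): sum of (observed - expected)^2 / expected,
    with expected counts from the row and column margins."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()
    return ((table - expected)**2 / expected).sum()

# hypothetical counts for illustration only:
# [empirical, non-empirical] papers in two publication periods
period_a = [15, 85]
period_b = [25, 75]
stat = chi2_2x2([period_a, period_b])
print(stat)  # 3.125, below the 5% critical value of 3.841 at df = 1
```

With one degree of freedom, the statistic is compared against 3.841 at alpha = 0.05; the study's reported χ2 = 2.857 (p = 0.091) falls below that threshold, hence the "no significant difference" conclusion.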
Predicting Seagrass Occurrence in a Changing Climate Using Random Forests
NASA Astrophysics Data System (ADS)
Aydin, O.; Butler, K. A.
2017-12-01
Seagrasses are marine plants that can quickly sequester vast amounts of carbon (up to 100 times more, and 12 times faster, than tropical forests). In this work, we present an integrated GIS and machine learning approach to build a data-driven model of seagrass presence-absence. We outline a random forest approach that avoids the prevalence bias in many ecological presence-absence models. One of our goals is to predict global seagrass occurrence from a spatially limited training sample. In addition, we conduct a sensitivity study which investigates the vulnerability of seagrass to changing climate conditions. We integrate multiple data sources, including fine-scale seagrass data from MarineCadastre.gov and the recently released, globally extensive, publicly available Ecological Marine Units (EMU) dataset. These data are used to train a model for seagrass occurrence along the U.S. coast. In situ ocean data are interpolated using Empirical Bayesian Kriging (EBK) to produce globally extensive prediction variables. A neural network is used to estimate probable future values of prediction variables such as ocean temperature to assess the impact of a warming climate on seagrass occurrence. The proposed workflow can be generalized to many presence-absence models.
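One common way a presence-absence random forest can counteract prevalence bias is by re-weighting the rare presence class. A minimal sketch on synthetic data (the predictors, the ~10% prevalence, and the `class_weight="balanced"` choice are illustrative assumptions, not the authors' configuration):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical predictors standing in for EMU-like ocean variables
# (e.g. temperature, salinity, depth), with a rare presence class.
rng = np.random.default_rng(3)
n = 2000
X = rng.normal(size=(n, 3))
# presence is rare (~10%) and driven mainly by the first predictor
presence = (X[:, 0] + 0.3 * rng.normal(size=n) > 1.3).astype(int)

# class_weight="balanced" up-weights errors on the rare presence class so
# the forest does not minimize error by simply predicting "absent" everywhere
rf = RandomForestClassifier(n_estimators=200,
                            class_weight="balanced",
                            random_state=0).fit(X, presence)
prob = rf.predict_proba(X)[:, 1]  # predicted probability of presence
print(prob[presence == 1].mean() > prob[presence == 0].mean())  # True
```

In the workflow described, the fitted probabilities would then be re-evaluated under neural-network projections of the predictor variables (e.g. warmer ocean temperature) to map sensitivity of predicted occurrence to climate change.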
Malik, Salim S; Lythgoe, Mark P; McPhail, Mark; Monahan, Kevin J
2017-11-30
Around 5% of colorectal cancers are due to mutations within DNA mismatch repair genes, resulting in Lynch syndrome (LS). These mutations have a high penetrance, with early onset of colorectal cancer at a mean age of 45 years. The mainstay of surgical management is either a segmental or an extensive colectomy. Currently there is no unified agreement as to which management strategy is superior, due to the limited conclusive empirical evidence available. A systematic review and meta-analysis was conducted to evaluate the risk of metachronous colorectal cancer (MCC) and mortality in LS following segmental and extensive colectomy. A systematic review of the PubMed database was conducted. Studies were included or excluded based on pre-specified criteria. To assess the risk of MCC and mortality attributed to segmental or extensive colectomies, relative risks (RR) and corresponding 95% confidence intervals (CI) were calculated. Publication bias was investigated using funnel plots. Data about mortality, as well as patient ascertainment [Amsterdam criteria (AC), germline mutation (GM)], were also extracted. Statistical analysis was conducted using R (version 3.2.3). The literature search identified 85 studies. After further analysis, ten studies were eligible for inclusion in data synthesis. Pooled data identified 1389 patients followed up for a mean of 100.7 months, with a mean age of onset of 45.5 years. A total of 1119 patients underwent segmental colectomies, with an absolute risk of MCC in this group of 22.4% at the end of follow-up. The 270 patients who had extensive colectomies had an MCC absolute risk of 4.7% (0% in those with a panproctocolectomy). Segmental colectomy was significantly associated with an increased relative risk of MCC (RR = 5.12; 95% CI 2.88-9.11; Fig. 1), although no significant association with mortality was identified (RR = 1.65; 95% CI 0.90-3.02).
There was no statistically significant difference in the risk of MCC between the AC and GM cohorts (p = 0.5, Chi-squared test). In LS, segmental colectomy results in a significantly increased risk of developing MCC. Although the choice of segmental or extensive colectomy has no statistically significant impact on mortality, the choice of initial surgical management can affect a patient's requirement for further surgery. An extensive colectomy can result in a decreased need for further surgery, reduced hospital stays, and lower associated costs. The significant difference in the risk of MCC following segmental or extensive colectomy should be discussed with patients when deciding on appropriate management. An individualised approach should be utilised, taking into account the patient's age, co-morbidities, and genotype. In order to determine likely germline-specific effects, or a difference in survival, larger and more comprehensive studies are required.
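The relative risks and confidence intervals reported in such meta-analyses are built from event counts using the standard log-RR variance. A sketch with illustrative counts close to the pooled cohort sizes above (note the paper's RR of 5.12 is a pooled meta-analytic estimate across studies, so a single crude 2x2 table will not reproduce it exactly; the event counts below are assumptions):

```python
import math

def relative_risk(a, n1, b, n2):
    """Relative risk of an event in group 1 (a events out of n1 patients)
    vs. group 2 (b events out of n2), with a 95% CI from the standard
    variance of log(RR): 1/a - 1/n1 + 1/b - 1/n2."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# illustrative counts: ~22% MCC of 1119 segmental vs. ~4.8% of 270 extensive
rr, lo, hi = relative_risk(250, 1119, 13, 270)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # crude RR ≈ 4.64
```

A CI whose lower bound exceeds 1 (as here, and as in the paper's 2.88-9.11) indicates a statistically significant excess risk in the segmental group; the mortality RR of 1.65 (CI 0.90-3.02) spans 1 and is therefore non-significant.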
Evaluating the febrile patient with a rash.
McKinnon, H D; Howard, T
2000-08-15
The differential diagnosis for febrile patients with a rash is extensive. Diseases that present with fever and rash are usually classified according to the morphology of the primary lesion. Rashes can be categorized as maculopapular (centrally and peripherally distributed), petechial, diffusely erythematous with desquamation, vesiculobullous-pustular and nodular. Potential causes include viruses, bacteria, spirochetes, rickettsiae, medications and rheumatologic diseases. A thorough history and a careful physical examination are essential to making a correct diagnosis. Although laboratory studies can be useful in confirming the diagnosis, test results often are not available immediately. Because the severity of these illnesses can vary from minor (roseola) to life-threatening (meningococcemia), the family physician must make prompt management decisions regarding empiric therapy. Hospitalization, isolation and antimicrobial therapy often must be considered when a patient presents with fever and a rash.
Semantic memory: a feature-based analysis and new norms for Italian.
Montefinese, Maria; Ambrosini, Ettore; Fairfield, Beth; Mammarella, Nicola
2013-06-01
Semantic norms for properties produced by native speakers are valuable tools for researchers interested in the structure of semantic memory and in category-specific semantic deficits in individuals following brain damage. The aims of this study were threefold. First, we sought to extend existing semantic norms by adopting an empirical approach to category (Exp. 1) and concept (Exp. 2) selection, in order to obtain a more representative set of semantic memory features. Second, we extensively outlined a new set of semantic production norms collected from Italian native speakers for 120 artifactual and natural basic-level concepts, using numerous measures and statistics following a feature-listing task (Exp. 3b). Finally, we aimed to create a new publicly accessible database, since only a few existing databases are publicly available online.
Dynamics of information diffusion and its applications on complex networks
NASA Astrophysics Data System (ADS)
Zhang, Zi-Ke; Liu, Chuang; Zhan, Xiu-Xiu; Lu, Xin; Zhang, Chu-Xu; Zhang, Yi-Cheng
2016-09-01
The ongoing rapid expansion of the World Wide Web (WWW) has greatly increased the volume of information transmitted from heterogeneous individuals to various systems. Information diffusion has been studied extensively by a broad range of communities, including social and computer scientists, physicists, and interdisciplinary researchers. Despite substantial theoretical and empirical work, a unification and comparison of the different theories and approaches is lacking, which impedes further advances. In this article, we review recent developments in information diffusion and discuss the major challenges. We compare and evaluate available models and algorithms with respect to their physical roles and optimization designs. Potential impacts and future directions are discussed. We emphasize that information diffusion has great scientific depth and combines diverse research fields, which makes it interesting for physicists as well as interdisciplinary researchers.
Social Learning Strategies: Bridge-Building between Fields.
Kendal, Rachel L; Boogert, Neeltje J; Rendell, Luke; Laland, Kevin N; Webster, Mike; Jones, Patricia L
2018-07-01
While social learning is widespread, indiscriminate copying of others is rarely beneficial. Theory suggests that individuals should be selective in what, when, and whom they copy, by following 'social learning strategies' (SLSs). The SLS concept has stimulated extensive experimental work, integrated theory and empirical findings, and given impetus to the social learning and cultural evolution fields. However, the SLS concept needs updating to accommodate recent findings that individuals switch between strategies flexibly, that multiple strategies are deployed simultaneously, and that there is no one-to-one correspondence between the psychological heuristics deployed and the resulting population-level patterns. The field would also benefit from the simultaneous study of mechanism and function. SLSs provide a useful vehicle for bridge-building between cognitive psychology, neuroscience, and evolutionary biology. Copyright © 2018. Published by Elsevier Ltd.
Mining Personal Data Using Smartphones and Wearable Devices: A Survey
Rehman, Muhammad Habib ur; Liew, Chee Sun; Wah, Teh Ying; Shuja, Junaid; Daghighi, Babak
2015-01-01
The staggering growth in smartphone and wearable device use has led to a massive-scale generation of personal (user-specific) data. To explore, analyze, and extract useful information and knowledge from this deluge of personal data, one has to leverage these devices as data-mining platforms in ubiquitous, pervasive, and big data environments. This study presents the personal ecosystem, in which all computational resources, communication facilities, storage and knowledge management systems are available in the user's proximity. An extensive review of the recent literature has been conducted and a detailed taxonomy is presented. Performance evaluation metrics and their empirical evidence are sorted out in this paper. Finally, we highlight some future research directions and potentially emerging application areas for personal data mining using smartphones and wearable devices. PMID:25688592
Infliximab-Related Infusion Reactions: Systematic Review
Ron, Yulia; Kivity, Shmuel; Ben-Horin, Shomron; Israeli, Eran; Fraser, Gerald M.; Dotan, Iris; Chowers, Yehuda; Confino-Cohen, Ronit; Weiss, Batia
2015-01-01
Objective: Administration of infliximab is associated with a well-recognised risk of infusion reactions. The lack of a mechanism-based rationale for their prevention, and the absence of adequate and well-controlled studies, have led to the use of diverse empirical administration protocols. The aim of this study is to perform a systematic review of the evidence behind the strategies for preventing infusion reactions to infliximab, and for controlling the reactions once they occur. Methods: We conducted an extensive search of the MEDLINE [PubMed] electronic database for reports on various aspects of infusion reactions to infliximab in IBD patients. Results: We examined the full texts of 105 potentially eligible articles. No randomised controlled trials that pre-defined infusion reactions as a primary outcome were found. Three RCTs evaluated infusion reactions as a secondary outcome; another four RCTs included infusion reactions in the safety evaluation analysis; and 62 additional studies focused on various aspects of mechanisms, risk, primary and secondary preventive measures, and management algorithms. Seven studies were added by a manual search of the reference lists of the relevant articles. A total of 76 original studies were included in the quantitative analysis of the existing strategies. Conclusions: There is still a paucity of systematic and controlled data on the risk, prevention, and management of infusion reactions to infliximab. We present working algorithms based on a systematic and extensive review of the available data. More randomised controlled trials are needed to investigate the efficacy of the proposed preventive and management algorithms. PMID:26092578
Limits of the memory coefficient in measuring correlated bursts
NASA Astrophysics Data System (ADS)
Jo, Hang-Hyun; Hiraoka, Takayuki
2018-03-01
Temporal inhomogeneities in event sequences of natural and social phenomena have been characterized in terms of interevent times and correlations between interevent times. The inhomogeneities of interevent times have been extensively studied, while the correlations between interevent times, often called correlated bursts, are far from being fully understood. For measuring the correlated bursts, two relevant approaches were suggested, i.e., memory coefficient and burst size distribution. Here a burst size denotes the number of events in a bursty train detected for a given time window. Empirical analyses have revealed that the larger memory coefficient tends to be associated with the heavier tail of the burst size distribution. In particular, empirical findings in human activities appear inconsistent, such that the memory coefficient is close to 0, while burst size distributions follow a power law. In order to comprehend these observations, by assuming the conditional independence between consecutive interevent times, we derive the analytical form of the memory coefficient as a function of parameters describing interevent time and burst size distributions. Our analytical result can explain the general tendency of the larger memory coefficient being associated with the heavier tail of burst size distribution. We also find that the apparently inconsistent observations in human activities are compatible with each other, indicating that the memory coefficient has limits to measure the correlated bursts.
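The memory coefficient discussed above has a compact operational definition in this literature (following Goh and Barabási's burstiness/memory framework): it is the Pearson correlation between consecutive interevent times. A minimal sketch, with synthetic heavy-tailed but independent interevent times standing in for real event sequences:

```python
import numpy as np

def memory_coefficient(interevent_times):
    """Memory coefficient M: the Pearson correlation between
    consecutive interevent times (tau_i, tau_{i+1})."""
    tau = np.asarray(interevent_times, dtype=float)
    x, y = tau[:-1], tau[1:]
    return ((x - x.mean()) * (y - y.mean())).mean() / (x.std() * y.std())

# Independent (shuffled) interevent times give M near 0 even when the
# interevent-time distribution itself is broad, mirroring the paper's
# point that M ~ 0 can coexist with heavy-tailed burst sizes.
rng = np.random.default_rng(0)
tau = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
print(memory_coefficient(tau))           # near zero
print(memory_coefficient(np.sort(tau)))  # strongly positive
```

Sorting the same sequence induces strong correlations between neighbours, which drives M toward 1; this is the kind of ordering effect the burst-size distribution probes differently.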
Environmental Kuznets Curve Hypothesis: A Perspective of Sustainable Development in Indonesia
NASA Astrophysics Data System (ADS)
Nuansa, Citrasmara Galuh; Widodo, Wahyu
2018-02-01
Sustainable development rests on three main pillars, namely environmental, economic, and social, and is a concept of national development aimed at achieving inclusive economic growth, good environmental quality, and improved welfare. However, the dominance of economic factors causes various environmental problems. This phenomenon occurs in most developing countries, including Indonesia. The relationship between economic activity and environmental quality has been widely discussed and empirically tested by scholars. This descriptive study analyses the Environmental Kuznets Curve (EKC) hypothesis from a perspective of sustainable development in Indonesia. The EKC hypothesis posits that the relationship between economic growth and environmental degradation forms an inverted U-curve: at the beginning of development, environmental quality decreases as economic growth increases, and then, after a certain turning point, environmental quality gradually improves. This paper discusses how the relationship between environmental quality and economic growth in Indonesia has been investigated. The preliminary results show that most of the empirical studies use the conventional approach, in which CO2 emissions are used as the proxy for environmental degradation. Evidence for the inverted U-curve is also inconclusive. Therefore, further research on the relationship between economic growth and environmental quality in Indonesia using the EKC hypothesis is required.
Maraolo, Alberto Enrico; Cascella, Marco; Corcione, Silvia; Cuomo, Arturo; Nappa, Salvatore; Borgia, Guglielmo; De Rosa, Francesco Giuseppe; Gentile, Ivan
2017-09-01
Pseudomonas aeruginosa (PA) is one of the most important causes of healthcare-related infections among Gram-negative bacteria. The best therapeutic approach is controversial, especially for multidrug-resistant (MDR) and extensively drug-resistant (XDR) strains, as well as for the most severe patients, such as those in the intensive care unit (ICU). Areas covered: This article addresses several points. First, the main microbiological aspects of PA, focusing on its wide array of resistance mechanisms. Second, the risk factors and worse outcomes linked to MDR-PA infection. Third, the pharmacological peculiarities of ICU patients, which make the choice of a proper antimicrobial therapy difficult. Finally, the current therapeutic options against MDR-PA are reviewed, taking into account the main variables that drive antimicrobial optimization in critically ill patients. The literature search was carried out using PubMed and Web of Science. Expert commentary: Methodologically rigorous studies are urgently needed to clarify crucial aspects of treatment against MDR-PA, namely monotherapy versus combination therapy in empiric and targeted settings. In the meantime, useful options are represented by newly approved drugs, such as ceftolozane/tazobactam and ceftazidime/avibactam. In critically ill patients, at least as an empirical approach, combination therapy is a prudent choice when an MDR-PA strain is suspected.
Adopting adequate leaching requirement for practical response models of basil to salinity
NASA Astrophysics Data System (ADS)
Babazadeh, Hossein; Tabrizi, Mahdi Sarai; Darvishi, Hossein Hassanpour
2016-07-01
Several mathematical models are used for assessing plant response to root-zone salinity. The objectives of this study were to quantify the yield threshold of basil plants with respect to irrigation water salinity and to investigate the possibility of using irrigation water salinity instead of saturated-extract salinity in the available mathematical models for estimating yield. To achieve these objectives, an extensive greenhouse experiment was conducted with 13 irrigation water salinity levels, namely 1.175 dS m-1 (control treatment) and 1.8 to 10 dS m-1. The results indicated that, among these models, the modified discount model (a well-known statistically based root water uptake model) produced more accurate results in simulating the basil yield reduction function from irrigation water salinities. Overall, the statistical model of Steppuhn et al. based on the modified discount model and the math-empirical model of van Genuchten and Hoffman provided the best results. In general, all of the statistical models produced very similar results, and their results were better than those of the math-empirical models. It was also concluded that if enough leaching was present, there was no significant difference between models based on soil saturated-extract salinity and models using irrigation water salinity.
NASA Astrophysics Data System (ADS)
Amri, N.; Hashim, M. I.; Ismail, N.; Rohman, F. S.; Bashah, N. A. A.
2017-09-01
Electrocoagulation (EC) is a promising technology that is extensively used to remove fluoride ions efficiently from industrial wastewater. However, the mechanisms and factors affecting the fluoride removal process have received very little consideration. To determine the efficiency of fluoride removal in the EC process, the effects of operating parameters such as voltage and electrolysis time were investigated in this study. A batch experiment with monopolar aluminium electrodes was conducted to identify a model of fluoride removal using an empirical model equation. The EC process was investigated over a range of voltages (3-12 V) and electrolysis times (0-60 minutes) at a constant initial fluoride concentration of 25 mg/L. The results show that the fluoride removal efficiency increased steadily with increasing voltage and electrolysis time. The best fluoride removal efficiency, 94.8%, was obtained at a 25 mg/L initial fluoride concentration, a voltage of 12 V, and 60 minutes of electrolysis. The results indicated that the rate constant k and the reaction order n decreased as the voltage increased. A fluoride removal rate model was developed from the empirical model equation using the correlation of k and n. Overall, the results show that the EC process can be considered a potential alternative technology for fluoride removal from wastewater.
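The abstract does not state the exact form of its empirical model equation, so the sketch below assumes a generic power-law rate law, -dC/dt = k·C^n, and recovers k and n by log-log linear regression; the data are synthetic, not the paper's measurements:

```python
import numpy as np

def fit_rate_law(conc, times):
    """Estimate k and n in the assumed rate law -dC/dt = k * C**n
    by linear regression of log(rate) on log(concentration)."""
    rate = -np.gradient(conc, times)                # numerical -dC/dt
    n, log_k = np.polyfit(np.log(conc), np.log(rate), 1)
    return np.exp(log_k), n                         # k, n

# Synthetic first-order decay (n = 1, k = 0.05 per minute), mimicking a
# 25 mg/L initial fluoride concentration over a 60-minute run.
t = np.linspace(0.0, 60.0, 61)
C = 25.0 * np.exp(-0.05 * t)
k, n = fit_rate_law(C, t)
print(round(k, 3), round(n, 2))
```

With real data, k and n would be re-fitted at each voltage, which is how a trend such as "k and n decrease with increasing voltage" could be quantified.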
Delineating genetic relationships among the Maya.
Ibarra-Rivera, Lisa; Mirabal, Sheyla; Regueiro, Manuela M; Herrera, Rene J
2008-03-01
By 250 AD, the Classic Maya had become the most advanced civilization in the New World, possessing the only well-developed hieroglyphic writing system of the time and an advanced knowledge of mathematics, astronomy and architecture. Though only ruins of the empire remain, 7.5 million Mayan descendants still occupy areas of Mexico, Guatemala, Belize, El Salvador, and Honduras. Although they inhabit distant and distinct territories, speak more than 28 languages, and have been historically divided by warfare and a city-state-like political system, they share characteristics such as rituals and artistic and architectural motifs that distinguish them as unequivocally Maya. This study was undertaken to determine whether these similarities among Mayan communities mirror genetic affinities or are merely a reflection of their common culture. Four Mayan populations were investigated (the K'iche and Kakchikel from Guatemala and the Campeche and Yucatan from Mexico) and compared with previously published populations across 15 autosomal STR loci. As a whole, the Maya emerge as a distinct group within Mesoamerica, indicating that they are more similar to each other than to other Mesoamerican groups. The data suggest that although geographic and political boundaries existed among Mayan communities, genetic exchanges between the different Mayan groups have occurred, supporting theories of extensive trading throughout the empire. 2007 Wiley-Liss, Inc.
EMPIRE: Nuclear Reaction Model Code System for Data Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herman, M.; Capote, R.; Carlson, B.V.
EMPIRE is a modular system of nuclear reaction codes, comprising various nuclear models, and designed for calculations over a broad range of energies and incident particles. A projectile can be a neutron, proton, any ion (including heavy ions) or a photon. The energy range extends from the beginning of the unresolved resonance region for neutron-induced reactions (~keV) and goes up to several hundred MeV for heavy-ion induced reactions. The code accounts for the major nuclear reaction mechanisms, including direct, pre-equilibrium and compound nucleus ones. Direct reactions are described by a generalized optical model (ECIS03) or by the simplified coupled-channels approach (CCFUS). The pre-equilibrium mechanism can be treated by a deformation-dependent multi-step direct (ORION + TRISTAN) model, by a NVWY multi-step compound one, or by either a pre-equilibrium exciton model with cluster emission (PCROSS) or another with full angular momentum coupling (DEGAS). Finally, the compound nucleus decay is described by the full-featured Hauser-Feshbach model with γ-cascade and width fluctuations. Advanced treatment of the fission channel takes into account transmission through a multiple-humped fission barrier with absorption in the wells. The fission probability is derived in the WKB approximation within the optical model of fission. Several options for nuclear level densities include the EMPIRE-specific approach, which accounts for the effects of the dynamic deformation of a fast rotating nucleus, the classical Gilbert-Cameron approach, and pre-calculated tables obtained with a microscopic model based on HFB single-particle level schemes with collective enhancement. A comprehensive library of input parameters covers nuclear masses, optical model parameters, ground state deformations, discrete levels and decay schemes, level densities, fission barriers, moments of inertia and γ-ray strength functions.
The results can be converted into ENDF-6 formatted files using the accompanying code EMPEND and completed with neutron resonances extracted from the existing evaluations. The package contains the full EXFOR (CSISRS) library of experimental reaction data that are automatically retrieved during the calculations. Publication-quality graphs can be obtained using the powerful and flexible plotting package ZVView. The graphic user interface, written in Tcl/Tk, provides for easy operation of the system. This paper describes the capabilities of the code, outlines the physical models, and indicates the parameter libraries used by EMPIRE to predict reaction cross sections and spectra, mainly for nucleon-induced reactions. Selected applications of EMPIRE are discussed, the most important being the extensive use of the code in evaluations of neutron reactions for the new US library ENDF/B-VII.0. Future extensions of the system are outlined, including a neutron resonance module as well as capabilities for generating covariances, using both KALMAN and Monte Carlo methods, that are still being advanced and refined.
Viallon, Vivian; Banerjee, Onureena; Jougla, Eric; Rey, Grégoire; Coste, Joel
2014-03-01
Looking for associations among multiple variables is a topical issue in statistics due to the increasing amount of data encountered in biology, medicine, and many other domains involving statistical applications. Graphical models have recently gained popularity for this purpose in the statistical literature. In the binary case, however, exact inference is generally very slow or even intractable because of the form of the so-called log-partition function. In this paper, we review various approximate methods for structure selection in binary graphical models that have recently been proposed in the literature and compare them through an extensive simulation study. We also propose a modification of one existing method, that is shown to achieve good performance and to be generally very fast. We conclude with an application in which we search for associations among causes of death recorded on French death certificates. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Local Linear Regression for Data with AR Errors.
Li, Runze; Li, Yan
2009-07-01
In many statistical applications, data are collected over time, and they are likely correlated. In this paper, we investigate how to incorporate this correlation information into local linear regression. Under the assumption that the error process is an autoregressive process, a new estimation procedure is proposed for the nonparametric regression by combining the local linear regression method with profile least squares techniques. We further propose the SCAD-penalized profile least squares method to determine the order of the autoregressive process. Extensive Monte Carlo simulation studies are conducted to examine the finite-sample performance of the proposed procedure and to compare it with the existing one. In our empirical studies, the newly proposed procedures dramatically improve the accuracy of naive local linear regression with a working-independence error structure. We illustrate the proposed methodology by an analysis of a real data set.
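The profile least squares procedure builds on the ordinary local linear smoother; a minimal sketch of that building block (without the AR-error correction, and with made-up data) is:

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear estimate of m(x0) for y = m(x) + error: weighted
    least squares of y on (1, x - x0) with Gaussian kernel weights."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)           # kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]                                    # intercept = m(x0)

# Synthetic regression: m(x) = sin(x) with independent noise, so the
# estimate at pi/2 should be close to 1.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, np.pi, 400))
y = np.sin(x) + 0.1 * rng.standard_normal(400)
print(local_linear(np.pi / 2, x, y, h=0.2))
```

Under AR errors, the paper's procedure would re-weight or pre-whiten the residuals instead of treating them as independent, which is what the working-independence version above implicitly does.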
2015-01-01
The recent availability of high-frequency data has permitted more efficient ways of computing volatility. However, estimation of volatility from asset price observations is challenging because observed high-frequency data are generally affected by microstructure noise. We address this issue by using the Fourier estimator of instantaneous volatility introduced by Malliavin and Mancino (2002). We prove a central limit theorem for this estimator with optimal rate and asymptotic variance. An extensive simulation study shows the accuracy of the spot volatility estimates obtained using the Fourier estimator and its robustness even in the presence of different microstructure noise specifications. An empirical analysis of high-frequency data (U.S. S&P 500 and FIB 30 indices) illustrates how the Fourier spot volatility estimates can be successfully used to study intraday variations of volatility and to predict intraday Value at Risk. PMID:26421617
Joeng, Hee-Koung; Chen, Ming-Hui; Kang, Sangwook
2015-01-01
Discrete survival data are routinely encountered in many fields of study, including behavioral science, economics, epidemiology, medicine, and social science. In this paper, we develop a class of proportional exponentiated link transformed hazards (ELTH) models. We carry out a detailed examination of the role of links in fitting discrete survival data and estimating regression coefficients. Several interesting results are established regarding the choice of links and baseline hazards. We also characterize the conditions for improper survival functions and the conditions for existence of the maximum likelihood estimates under the proposed ELTH models. An extensive simulation study is conducted to examine the empirical performance of the parameter estimates under the Cox proportional hazards model when discrete survival times are treated as continuous, and of the model comparison criteria AIC and BIC in determining links and baseline hazards. A SEER breast cancer dataset is analyzed in detail to further demonstrate the proposed methodology. PMID:25772374
Chen, Gang; Xu, Zhengyuan; Ding, Haipeng; Sadler, Brian
2009-03-02
We consider outdoor non-line-of-sight deep ultraviolet (UV) solar blind communications at ranges up to 100 m, with different transmitter and receiver geometries. We propose an empirical channel path loss model, and fit the model based on extensive measurements. We observe range-dependent power decay with a power exponent that varies from 0.4 to 2.4 with varying geometry. We compare with the single scattering model, and show that the single scattering assumption leads to a model that is not accurate for small apex angles. Our model is then used to study fundamental communication system performance trade-offs among transmitted optical power, range, link geometry, data rate, and bit error rate. Both weak and strong solar background radiation scenarios are considered to bound detection performance. These results provide guidelines to system design.
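The range-dependent power decay described above suggests a path loss model of the form L(r) = ξ·r^α; a small sketch of how the exponent α would be estimated from measurements by log-log regression (the data here are synthetic and noiseless, and ξ is an illustrative constant, not a value from the paper):

```python
import numpy as np

def fit_path_loss(r, loss):
    """Fit L(r) = xi * r**alpha by linear regression of log L on log r."""
    alpha, log_xi = np.polyfit(np.log(r), np.log(loss), 1)
    return np.exp(log_xi), alpha

# Synthetic measurements at ranges up to 100 m with alpha = 1.5, inside
# the 0.4-2.4 range of exponents the abstract reports across geometries.
r = np.array([10.0, 20.0, 40.0, 60.0, 80.0, 100.0])   # range in meters
loss = 3.0 * r ** 1.5                                  # noiseless example
xi, alpha = fit_path_loss(r, loss)
print(round(alpha, 2))   # -> 1.5
```

With real measurements the fit would be repeated per transmitter/receiver geometry, yielding the geometry-dependent exponents the abstract describes.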
Building more solid bridges between Buddhism and Western psychology.
Sugamura, Genji; Haruki, Yutaka; Koshikawa, Fusako
2007-12-01
Introducing the ways of cultivating mental balance, B. A. Wallace and S. L. Shapiro attempted to build bridges between Buddhism and psychology. Their systematic categorization of Buddhist teachings and extensive review of empirical support from Western psychology are valuable for future study. However, it remains a matter of concern that some more profound parts of Buddhist philosophy can be disregarded by focusing only on practical aspects of Buddhism within the context of mental health. In this comment, the authors briefly address four substantial themes to be considered: reality, identity, causality, and logicality. They suggest that the way to interpret Buddhism as techniques for well-being would certainly be viable in encouraging the study of Buddhist teachings in psychology. Yet, such attempts should not result in superficial imports and applications of Buddhist practices but give due weight to the deeper philosophical issues to build more solid bridges between Buddhism and psychology. © 2007 APA.
López-Ariztegui, N; Lobato-Casado, P; Muñoz-Escudero, F; Polo-Martín, M; Montes-Gonzalo, M C; Alvarez-Tejerina, A
We report a case of subacute encephalopathy in which the entire diagnostic workup was negative and which responded to steroid therapy. The patient was a 22-year-old female with no relevant past history who presented with symptoms of subacute encephalopathy consisting of behavioural disorders, generalised seizures and bradypsychia, which gradually progressed to a state of diminished consciousness. While she was in hospital, a full battery of diagnostic tests was conducted, the results of which were either normal or negative; the electroencephalogram was repeatedly abnormal, and detection of protein 14-3-3 in cerebrospinal fluid was positive. Empirical corticosteroid therapy was begun, with clinical and electrophysiological improvement, and the patient recovered completely without any sequelae. With no evidence of autoimmune thyroid disease, although non-specific autoimmunity was present, the patient was diagnosed as having non-vasculitic autoimmune meningoencephalitis.
A Multicenter Evaluation of Prolonged Empiric Antibiotic Therapy in Adult ICUs in the United States.
Thomas, Zachariah; Bandali, Farooq; Sankaranarayanan, Jayashri; Reardon, Tom; Olsen, Keith M
2015-12-01
The purpose of this study is to determine the rate of prolonged empiric antibiotic therapy in adult ICUs in the United States. Our secondary objective is to examine the relationship between the prolonged empiric antibiotic therapy rate and certain ICU characteristics. Multicenter, prospective, observational, 72-hour snapshot study. Sixty-seven ICUs from 32 hospitals in the United States. Nine hundred ninety-eight patients admitted to the ICU between midnight on June 20, 2011, and June 21, 2011, were included in the study. None. Antibiotic orders were categorized as prophylactic, definitive, empiric, or prolonged empiric antibiotic therapy. Prolonged empiric antibiotic therapy was defined as empiric antibiotics that continued for at least 72 hours in the absence of adjudicated infection. Standard definitions from the Centers for Disease Control and Prevention were used to determine infection. Prolonged empiric antibiotic therapy rate was determined as the ratio of the total number of empiric antibiotics continued for at least 72 hours divided by the total number of empiric antibiotics. Univariate analysis of factors associated with the ICU prolonged empiric antibiotic therapy rate was conducted using Student t test. A total of 660 unique antibiotics were prescribed as empiric therapy to 364 patients. Of the empiric antibiotics, 333 of 660 (50%) were continued for at least 72 hours in instances where Centers for Disease Control and Prevention infection criteria were not met. Suspected pneumonia accounted for approximately 60% of empiric antibiotic use. The most frequently prescribed empiric antibiotics were vancomycin and piperacillin/tazobactam. ICUs that utilized invasive techniques for the diagnosis of ventilator-associated pneumonia had lower rates of prolonged empiric antibiotic therapy than those that did not, 45.1% versus 59.5% (p = 0.03). No other institutional factor was significantly associated with prolonged empiric antibiotic therapy rate. 
Half of all empiric antibiotics ordered in critically ill patients are continued for at least 72 hours in the absence of adjudicated infection. Additional studies are needed to confirm these findings and to determine the risks and benefits of prolonged empiric therapy in the critically ill.
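The study's rate definition reduces to a simple ratio; with the counts reported in the abstract (333 of 660 empiric antibiotics continued for at least 72 hours without adjudicated infection):

```python
# Prolonged empiric antibiotic therapy (PEAT) rate, as defined in the
# study: empiric antibiotics continued for at least 72 hours without
# adjudicated infection, divided by all empiric antibiotics.
def peat_rate(prolonged, total_empiric):
    return prolonged / total_empiric

print(round(100 * peat_rate(333, 660), 1))  # -> 50.5 (percent)
```

The per-ICU rates compared in the univariate analysis (e.g. 45.1% vs 59.5%) are this same ratio computed within each unit.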
Absorption line indices in the UV. I. Empirical and theoretical stellar population models
NASA Astrophysics Data System (ADS)
Maraston, C.; Nieves Colmenárez, L.; Bender, R.; Thomas, D.
2009-01-01
Aims: Stellar absorption lines in the optical (e.g. the Lick system) have been extensively studied and constitute an important stellar population diagnostic for galaxies in the local universe and up to moderate redshifts. Proceeding towards higher look-back times, galaxies are younger and the ultraviolet becomes the relevant spectral region where the dominant stellar populations shine. A comprehensive study of ultraviolet absorption lines of stellar population models is however still lacking. With this in mind, we study absorption line indices in the far and mid-ultraviolet in order to determine age and metallicity indicators for UV-bright stellar populations in the local universe as well as at high redshift. Methods: We explore empirical and theoretical spectral libraries and use evolutionary population synthesis to compute synthetic line indices of stellar population models. From the empirical side, we exploit the IUE low-resolution library of stellar spectra and system of absorption lines, from which we derive analytical functions (fitting functions) describing the strength of stellar line indices as a function of gravity, temperature and metallicity. The fitting functions are entered into an evolutionary population synthesis code in order to compute the integrated line indices of stellar population models. The same line indices are also directly evaluated on theoretical spectral energy distributions of stellar population models based on Kurucz high-resolution synthetic spectra. In order to select indices that can be used as age and/or metallicity indicators for distant galaxies and globular clusters, we compare the models to data of template globular clusters from the Magellanic Clouds with independently known ages and metallicities. Results: We provide synthetic line indices in the wavelength range ~1200 Å to ~3000 Å for stellar populations of various ages and metallicities. This adds several new indices to the already well-studied CIV and SiIV absorptions.
Based on the comparison with globular cluster data, we select a set of 11 indices blueward of the 2000 Å rest-frame that allows us to recover well the ages and the metallicities of the clusters. These indices are ideal to study ages and metallicities of young galaxies at high redshift. We also provide the synthetic high-resolution stellar population SEDs.
Verstraelen, Toon; Van Speybroeck, Veronique; Waroquier, Michel
2009-07-28
An extensive benchmark of the electronegativity equalization method (EEM) and the split charge equilibration (SQE) model on a very diverse set of organic molecules is presented. These models efficiently compute atomic partial charges and are used in the development of polarizable force fields. The predicted partial charges depend on empirical parameters, which are calibrated to reproduce results from quantum mechanical calculations. Recently, SQE was presented as an extension of the EEM that obtains the correct size dependence of the molecular polarizability. In this work, 12 parametrization protocols are applied to each model and the optimal parameters are benchmarked systematically. The training data for the empirical parameters comprise MP2/Aug-CC-pVDZ calculations on 500 organic molecules containing the elements H, C, N, O, F, S, Cl, and Br. These molecules were selected by an ingenious and autonomous protocol from an initial set of almost 500,000 small organic molecules. It is clear that the SQE model outperforms the EEM in all benchmark assessments. When Hirshfeld-I charges are used for the calibration, the SQE model optimally reproduces the molecular electrostatic potential from the ab initio calculations. Applications to chain molecules, i.e., alkanes, alkenes, and alpha-alanine helices, confirm that the EEM gives rise to divergent behavior of the polarizability, while the SQE model shows the correct trends. We conclude that the SQE model is an essential component of a polarizable force field, showing several advantages over the original EEM.
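At its core, EEM reduces to a small constrained linear solve: the charges minimize a quadratic energy E = Σᵢ χᵢqᵢ + ½ qᵀHq subject to a fixed total charge, handled with a Lagrange multiplier. A hedged sketch with made-up, uncalibrated electronegativity (χ) and hardness (H) values, not the parameters benchmarked in the paper:

```python
import numpy as np

def eem_charges(chi, H, q_total=0.0):
    """Solve the EEM stationarity conditions H q + lam = -chi together
    with the total-charge constraint sum(q) = q_total."""
    n = len(chi)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = H
    A[:n, n] = 1.0          # Lagrange-multiplier column
    A[n, :n] = 1.0          # total-charge constraint row
    b = np.concatenate([-np.asarray(chi, dtype=float), [q_total]])
    return np.linalg.solve(A, b)[:n]

# Two-atom toy molecule: atom 2 is more electronegative, so it should
# pick up the negative partial charge. Values are illustrative only.
chi = np.array([0.3, 0.5])
H = np.array([[1.0, 0.2],
              [0.2, 1.0]])
q = eem_charges(chi, H)
print(np.round(q, 3))
```

SQE replaces the atomic charges with bond ("split") charge-transfer variables, which is what restores the correct size scaling of the polarizability; the solve above is the EEM baseline that SQE extends.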
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lombardo, N.J.; Marseille, T.J.; White, M.D.
TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes include conduction in solid structures, convection across fluid-solid boundaries, and radiation between interacting surfaces. Metal-water reaction kinetics are modeled with empirical relationships to predict the oxidation rates of steam-exposed Zircaloy and uranium metal. The metal-water oxidation models are parabolic in form with an Arrhenius temperature dependence. Uranium oxidation begins when fuel cladding failure occurs; Zircaloy oxidation occurs continuously at temperatures above 1300°F when metal and steam are available. From the metal-water reactions, the hydrogen generation rate, total hydrogen release, and temporal and spatial distribution of oxide formations are computed. Consumption of steam from the oxidation reactions and the effect of hydrogen on the coolant properties are modeled for independent coolant flow channels. Fission product release from exposed uranium metal in Zircaloy-clad fuel is modeled using empirical time and temperature relationships that consider the release to be subject to oxidation and volatilization/diffusion ("bake-out") release mechanisms. Release of the volatile species of iodine (I), tellurium (Te), cesium (Cs), ruthenium (Ru), strontium (Sr), zirconium (Zr), cerium (Ce), and barium (Ba) from uranium metal fuel may be modeled.
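The parabolic-with-Arrhenius oxidation model described above can be sketched as follows; the rate-constant parameters A and Q below are placeholders for illustration, not TRUMP-BD's actual Zircaloy or uranium correlations:

```python
import math

def oxide_weight_gain(t, T, A=3.0e3, Q=1.7e5, R=8.314):
    """Parabolic metal-water oxidation kinetics with an Arrhenius rate constant:
    w^2 = Kp * t, with Kp = A * exp(-Q / (R * T)).
    t in seconds, T in kelvin; w in arbitrary units (parameters are placeholders)."""
    Kp = A * math.exp(-Q / (R * T))  # Arrhenius temperature dependence
    return math.sqrt(Kp * t)         # parabolic growth: w ~ sqrt(t)
```

The parabolic form captures the self-limiting character of oxide growth: quadrupling the exposure time only doubles the oxide weight gain, while the Arrhenius factor makes the rate rise steeply with temperature.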
Isaksen, Geir Villy; Hopmann, Kathrin Helen; Åqvist, Johan; Brandsdal, Bjørn Olav
2016-04-12
Purine nucleoside phosphorylase (PNP) catalyzes the reversible phosphorolysis of purine ribonucleosides and 2'-deoxyribonucleosides, yielding the purine base and (2'-deoxy)ribose 1-phosphate as products. While this enzyme has been extensively studied, several questions with respect to the catalytic mechanism have remained largely unanswered. The role of the phosphate and key amino acid residues in the catalytic reaction, as well as the purine ring protonation state, is elucidated using density functional theory calculations and extensive empirical valence bond (EVB) simulations. Free energy surfaces for adenosine, inosine, and guanosine are fitted to ab initio data and yield quantitative agreement with experimental data when the surfaces are used to model the corresponding enzymatic reactions. The cognate substrates, the 6-oxopurines inosine and guanosine, interact with PNP through extensive hydrogen bonding, but the substrate specificity is found to be a direct result of the electrostatic preorganization energy along the reaction coordinate. Asn243 has previously been identified as a key residue providing substrate specificity. Mutation of Asn243 to Asp has dramatic effects on the substrate specificity, making 6-amino- and 6-oxopurines equally good as substrates. The principal effect of this particular mutation is the change in the electrostatic preorganization energy between the native enzyme and the Asn243Asp mutant, clearly favoring adenosine over inosine and guanosine. Thus, the EVB simulations show that this particular mutation affects the electrostatic preorganization of the active site, which in turn can explain the substrate specificity.
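The EVB approach mentioned here builds reaction surfaces by mixing diabatic (valence-bond) states; in the standard two-state case the ground-state energy comes from diagonalizing a 2x2 Hamiltonian. A minimal illustration of that mixing step only, not the paper's parametrized free energy surfaces:

```python
import numpy as np

def evb_ground_state(eps1, eps2, H12):
    """Ground-state adiabatic energy of a two-state EVB Hamiltonian
    [[eps1, H12], [H12, eps2]]:
    E_g = 0.5*(eps1 + eps2) - 0.5*sqrt((eps1 - eps2)**2 + 4*H12**2)."""
    return 0.5 * (eps1 + eps2) - 0.5 * np.sqrt((eps1 - eps2) ** 2 + 4.0 * H12 ** 2)
```

At the crossing of the two diabatic states (eps1 = eps2) the off-diagonal coupling H12 lowers the adiabatic barrier by exactly H12, which is how EVB surfaces are calibrated against ab initio or experimental barrier heights.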
Toward a Principled Sampling Theory for Quasi-Orders
Ünlü, Ali; Schrepp, Martin
2016-01-01
Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders as their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, even on up to 50 items, the new algorithms create close-to-representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601
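To see why naive sampling needs the bias correction the paper develops, consider the obvious repair-by-closure sampler. This sketch is not the paper's inductive procedure; it illustrates the kind of bias (toward denser quasi-orders) that transitivity repair introduces:

```python
import random
from itertools import product

def random_quasi_order(n, p=0.3, seed=None):
    """Naive sampler: draw a random reflexive relation on n items, then
    repair transitivity by taking the transitive (Warshall) closure.
    Closure-based repair biases toward denser quasi-orders, which is
    exactly the problem principled bias-correction algorithms address."""
    rng = random.Random(seed)
    rel = [[i == j or rng.random() < p for j in range(n)] for i in range(n)]
    for k, i, j in product(range(n), repeat=3):  # Warshall transitive closure
        if rel[i][k] and rel[k][j]:
            rel[i][j] = True
    return rel

def is_quasi_order(rel):
    """Check reflexivity and transitivity of a boolean relation matrix."""
    n = len(rel)
    refl = all(rel[i][i] for i in range(n))
    trans = all(rel[i][j] for i in range(n) for j in range(n)
                for k in range(n) if rel[i][k] and rel[k][j])
    return refl and trans
```

Every output of `random_quasi_order` is a valid quasi-order, but the samples are far from uniform over all quasi-orders, motivating the paper's doubly inductive construction.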
Agroskin, Dmitrij; Jonas, Eva; Klackl, Johannes; Prentice, Mike
2016-01-01
The hypothesis that people respond to reminders of mortality with closed-minded, ethnocentric attitudes has received extensive empirical support, largely from research in the Terror Management Theory (TMT) tradition. However, the basic motivational and neural processes that underlie this effect remain largely hypothetical. According to recent neuropsychological theorizing, mortality salience (MS) effects on cultural closed-mindedness may be mediated by activity in the behavioral inhibition system (BIS), which leads to passive avoidance and decreased approach motivation. This should be especially true for people motivated to avoid unfamiliar and potentially threatening stimuli as reflected in a high need for closure (NFC). In two studies involving moderated mediation analyses, people high on trait NFC responded to MS with increased BIS activity (as indicated by EEG and the line bisection task), which is characteristic of inhibited approach motivation. BIS activity, in turn, predicted a reluctance to explore foreign cultures (Study 1) and generalized ethnocentric attitudes (Study 2). In a third study, inhibition was induced directly and caused an increase in ethnocentrism for people high on NFC. Moreover, the effect of the inhibition manipulation × NFC interaction on ethnocentrism was explained by increases in BIS-related affect (i.e., anxious inhibition) at high NFC. To our knowledge, this research is the first to establish an empirical link between very basic, neurally-instantiated inhibitory processes and rather complex, higher-order manifestations of intergroup negativity in response to MS. Our findings contribute to a fuller understanding of the cultural worldview defense phenomenon by illuminating the motivational underpinnings of cultural closed-mindedness in the wake of existential threat. PMID:27826261
Empirical resistive-force theory for slender biological filaments in shear-thinning fluids
NASA Astrophysics Data System (ADS)
Riley, Emily E.; Lauga, Eric
2017-06-01
Many cells exploit the bending or rotation of flagellar filaments in order to self-propel in viscous fluids. While appropriate theoretical modeling is available to capture flagellar locomotion in simple, Newtonian fluids, formidable computations are required to theoretically address locomotion in complex, nonlinear fluids, e.g., mucus. Based on experimental measurements of the motion of rigid rods in non-Newtonian fluids and on the classical Carreau fluid model, we propose empirical extensions of the classical Newtonian resistive-force theory to model the waving of slender filaments in non-Newtonian fluids. By assuming the flow near the flagellum to be locally Newtonian, we propose a self-consistent way to estimate the typical shear rate in the fluid, which we then use to construct correction factors to the Newtonian local drag coefficients. The resulting non-Newtonian resistive-force theory, while empirical, is consistent with the Newtonian limit and with the experiments. We then use our models to address waving locomotion in non-Newtonian fluids and show that the resulting swimming speeds are systematically lowered, a result which we are able to capture asymptotically and to interpret physically. An application of the models to recent experimental results on the locomotion of Caenorhabditis elegans in polymeric solutions shows reasonable agreement and thus captures the main physics of swimming in shear-thinning fluids.
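The Carreau model referenced above gives the shear-rate-dependent viscosity eta(g) = eta_inf + (eta0 - eta_inf) * (1 + (lam*g)^2)^((n-1)/2). A sketch of the viscosity law and of a simple correction factor eta(g)/eta0 applied to the Newtonian drag coefficients; treating that ratio as the correction is a simplifying assumption here, not the paper's exact construction:

```python
def carreau_viscosity(gamma_dot, eta0, eta_inf, lam, n):
    """Carreau viscosity at shear rate gamma_dot:
    eta = eta_inf + (eta0 - eta_inf) * (1 + (lam*gamma_dot)**2)**((n - 1) / 2).
    For n < 1 the fluid is shear-thinning."""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * gamma_dot) ** 2) ** ((n - 1) / 2.0)

def corrected_drag(c_newtonian, gamma_dot, eta0, eta_inf, lam, n):
    """Assumed local correction: scale a Newtonian resistive-force drag
    coefficient by the viscosity ratio at the estimated local shear rate."""
    return c_newtonian * carreau_viscosity(gamma_dot, eta0, eta_inf, lam, n) / eta0
```

At zero shear rate the Newtonian limit is recovered exactly (correction factor 1), consistent with the abstract's requirement that the empirical theory reduce to the Newtonian one.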
Selecting a restoration technique to minimize OCR error.
Cannon, M; Fugate, M; Hush, D R; Scovel, C
2003-01-01
This paper introduces a learning problem related to the task of converting printed documents to ASCII text files. The goal of the learning procedure is to produce a function that maps documents to restoration techniques in such a way that on average the restored documents have minimum optical character recognition error. We derive a general form for the optimal function and use it to motivate the development of a nonparametric method based on nearest neighbors. We also develop a direct method of solution based on empirical error minimization for which we prove a finite sample bound on estimation error that is independent of distribution. We show that this empirical error minimization problem is an extension of the empirical optimization problem for traditional M-class classification with general loss function and prove computational hardness for this problem. We then derive a simple iterative algorithm called generalized multiclass ratchet (GMR) and prove that it produces an optimal function asymptotically (with probability 1). To obtain the GMR algorithm we introduce a new data map that extends Kesler's construction for the multiclass problem and then apply an algorithm called Ratchet to this mapped data, where Ratchet is a modification of the Pocket algorithm. Finally, we apply these methods to a collection of documents and report on the experimental results.
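The nearest-neighbor method described maps a query document to the restoration technique with the smallest observed OCR error on its nearest training document. A minimal sketch; the feature representation and the per-technique error matrix are assumed inputs, not the paper's actual document features:

```python
import numpy as np

def nn_select_technique(train_feats, train_errs, query):
    """Nearest-neighbor restoration-technique selector.
    train_feats: (N, d) feature vectors of training documents.
    train_errs:  train_errs[i][t] = measured OCR error of technique t on doc i.
    query:       (d,) feature vector of the new document.
    Returns the index of the technique with minimum error on the nearest neighbor."""
    dists = np.linalg.norm(np.asarray(train_feats, float) - np.asarray(query, float), axis=1)
    nearest = int(np.argmin(dists))              # closest training document
    return int(np.argmin(train_errs[nearest]))   # best technique observed on it
```

The direct empirical-error-minimization route in the paper instead searches a function class for the mapping with minimum average restored-document error, which is the harder problem addressed by GMR.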
Was Newton right? A search for non-Newtonian behavior of weak-field gravity
NASA Astrophysics Data System (ADS)
Boynton, Paul; Moore, Michael; Newman, Riley; Berg, Eric; Bonicalzi, Ricco; McKenney, Keven
2014-06-01
Empirical tests of Einstein's metric theory of gravitation, even in the non-relativistic, weak-field limit, could play an important role in judging theory-driven extensions of the current Standard Model of fundamental interactions. Guided by Galileo's work and his own experiments, Newton formulated a theory of gravity in which the force of attraction between two bodies is independent of composition and proportional to the inertia of each, thereby transparently satisfying Galileo's empirically informed conjecture regarding the Universality of Free Fall. Similarly, Einstein honored the manifest success of Newton's theory by assuring that the linearized equations of GTR matched the Newtonian formalism under "classical" conditions. Each of these steps, however, was explicitly an approximation raised to the status of principle. Perhaps, at some level, Newtonian gravity does not accurately describe the physical interaction between uncharged, unmagnetized, macroscopic bits of ordinary matter. What if Newton were wrong? Detecting any significant deviation from Newtonian behavior, no matter how small, could provide new insights and possibly reveal new physics. In the context of physics as an empirical science, for us this yet unanswered question constitutes sufficient motivation to attempt precision measurements of the kind described here. In this paper we report the current status of a project to search for violation of the Newtonian inverse square law of gravity.
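Searches for deviations from the inverse square law are commonly framed in terms of a Yukawa-type modification of the Newtonian potential; this parametrization is standard in the weak-field-gravity literature but is not stated in the abstract itself:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def yukawa_potential(r, m1, m2, alpha=0.0, lam=1.0):
    """Newtonian potential with a Yukawa correction:
    V(r) = -G*m1*m2/r * (1 + alpha * exp(-r/lam)),
    where alpha is the relative strength of the new interaction and
    lam its range. alpha = 0 recovers Newton exactly."""
    return -G * m1 * m2 / r * (1.0 + alpha * math.exp(-r / lam))
```

Torsion-balance experiments of the kind described constrain the allowed (alpha, lam) region: a nonzero alpha at some range lam would be precisely the "non-Newtonian behavior" the title asks about.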
Chirumbolo, Antonio; Urbini, Flavio; Callea, Antonino; Lo Presti, Alessandro; Talamo, Alessandra
2017-01-01
One of the more visible effects of societal changes is increased feelings of uncertainty in the workforce. In fact, job insecurity represents a crucial occupational risk factor and a major job stressor that has negative consequences for both organizational well-being and individual health. Many studies have focused on the consequences of the fear and perception of losing one's job as a whole (quantitative job insecurity), while more recently research has begun to examine more extensively the worries about and perceptions of losing valued job features (qualitative job insecurity). The vast majority of studies, however, have investigated the effects of quantitative and qualitative job insecurity separately. In this paper, we propose the Job Insecurity Integrated Model, aimed at examining the effects of quantitative and qualitative job insecurity on their short-term and long-term outcomes. This model was empirically tested in two independent studies, hypothesizing that qualitative job insecurity mediates the effects of quantitative job insecurity on different outcomes, such as work engagement and organizational identification (Study 1), and job satisfaction, commitment, psychological stress, and turnover intention (Study 2). Study 1 was conducted with 329 employees of private firms, and Study 2 with 278 employees from both the public sector and private firms. Results robustly showed that qualitative job insecurity totally mediated the effects of quantitative job insecurity on all the considered outcomes. By showing that the effects of quantitative job insecurity on its outcomes pass through qualitative job insecurity, the Job Insecurity Integrated Model contributes to clarifying previous findings in job insecurity research and puts forward a framework that could profitably produce new investigations with important theoretical and practical implications. PMID:29250013
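The mediation structure tested here (quantitative insecurity -> qualitative insecurity -> outcomes) can be illustrated with a bare-bones product-of-coefficients sketch. Real analyses of this kind use structural equation models with latent variables and significance tests for the indirect effect; this is only the arithmetic skeleton:

```python
import numpy as np

def simple_mediation(x, m, y):
    """Product-of-coefficients mediation sketch:
    a  = slope of m ~ x                (predictor -> mediator)
    b  = slope of y ~ m, controlling x (mediator -> outcome)
    c' = slope of y ~ x, controlling m (direct effect)
    Returns (indirect effect a*b, direct effect c')."""
    a = np.polyfit(x, m, 1)[0]
    X = np.column_stack([np.ones_like(x), x, m])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    c_prime, b = coef[1], coef[2]
    return a * b, c_prime
```

"Total mediation," as reported in the abstract, corresponds to the direct effect c' being indistinguishable from zero once the mediator is in the model.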
A Study Space Analysis and Narrative Review of Trauma-Informed Mediators of Dating Violence.
Cascardi, Michele; Jouriles, Ernest N
2018-07-01
Research linking child maltreatment and dating violence in adolescence and emerging adulthood has proliferated in the past two decades; however, the precise mechanisms by which these experiences are related remain elusive. A trauma-informed perspective suggests four particularly promising mediators: maladaptive attachment, emotion regulation difficulties, emotional distress, and hostility. The current article characterizes the status of the empirical literature examining these four mediators using a study space analysis and a narrative review of existing research. An extensive literature search identified 42 papers (44 studies) that met the following criteria: (1) at least one measure of child maltreatment (emotional, physical, sexual, neglect, or exposure to intimate partner violence); (2) a measure of one of the four mediator variables; (3) a measure of dating violence perpetration or victimization; and (4) a sample of adolescents or young adults. The study space analysis suggested several important observations about the research on this topic, including a dearth of studies examining hostility as a mediator and little research using prospective designs or clinical samples. There are also limitations with the conceptualization and measurement of dating violence, child maltreatment, and some of the mediator variables. In addition, few studies examined more than one mediator variable in the same study. The narrative review suggested that maladaptive attachment (specifically insecure attachment styles), emotion regulation difficulties (specifically regulation of the emotion of anger), and emotional distress construed broadly represent promising mediators of the association between child maltreatment and dating violence, but conclusions about mediation must remain tentative given the state of the literature. The discussion offers recommendations for improved theoretical and empirical rigor to advance future research on mechanisms linking child maltreatment and dating violence.
NASA Technical Reports Server (NTRS)
Persing, T. Ray; Bellish, Christine A.; Brandon, Jay; Kenney, P. Sean; Carzoo, Susan; Buttrill, Catherine; Guenther, Arlene
2005-01-01
Several aircraft airframe modeling approaches are currently being used in the DoD community for acquisition, threat evaluation, training, and other purposes. To date there has been no clear empirical study of the impact of airframe simulation fidelity on piloted real-time aircraft simulation study results, or when use of a particular level of fidelity is indicated. This paper documents a series of piloted simulation studies using three different levels of airframe model fidelity. This study was conducted using the NASA Langley Differential Maneuvering Simulator. Evaluations were conducted with three pilots for scenarios requiring extensive maneuvering of the airplanes during air combat. In many cases, a low-fidelity modified point-mass model may be sufficient to evaluate the combat effectiveness of the aircraft. However, in cases where high angle-of-attack flying qualities and aerodynamic performance are a factor or when precision tracking ability of the aircraft must be represented, use of high-fidelity models is indicated.
Agorastos, Agorastos; Metscher, Tanja; Huber, Christian G; Jelinek, Lena; Vitzthum, Francesca; Muhtz, Christoph; Kellner, Michael; Moritz, Steffen
2012-10-01
The relation between religiosity/spirituality (R/S), personal beliefs, and mental health has been extensively studied. However, concerning anxiety disorders (ADs), empirical evidence is scarce. This study investigated the differences in R/S and magical/paranormal ideation among obsessive-compulsive disorder patients (OCD; n = 49), patients with other ADs (n = 36), and healthy controls (HCs; n = 35). Our results suggest negative religious coping as being the only parameter showing significantly higher scores in OCD and AD participants in comparison with HCs. Negative religious coping reflects negative functional expressions of R/S in stressful situations. Logistic regression also suggested negative religious coping as the strongest predictor of affiliation to the nonhealthy group. Further results show no significant differences in other R/S, magical, and paranormal ideation traits among groups. This study underlines an important role of negative religious coping in ADs yet does not clearly indicate a specific causality. Religiosity-sensitive treatments targeting cognitive aspects of negative religious coping are discussed.
Satagopan, Jaya M; Sen, Ananda; Zhou, Qin; Lan, Qing; Rothman, Nathaniel; Langseth, Hilde; Engel, Lawrence S
2016-06-01
Matched case-control studies are popular designs used in epidemiology for assessing the effects of exposures on binary traits. Modern studies increasingly enjoy the ability to examine a large number of exposures in a comprehensive manner. However, several risk factors often tend to be related in a nontrivial way, undermining efforts to identify the risk factors using standard analytic methods due to inflated type-I errors and possible masking of effects. Epidemiologists often use data reduction techniques by grouping the prognostic factors using a thematic approach, with themes deriving from biological considerations. We propose shrinkage-type estimators based on Bayesian penalization methods to estimate the effects of the risk factors using these themes. The properties of the estimators are examined using extensive simulations. The methodology is illustrated using data from a matched case-control study of polychlorinated biphenyls in relation to the etiology of non-Hodgkin's lymphoma. © 2015, The International Biometric Society.
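The shrinkage-type estimators described here rest on Bayesian penalization, where a Gaussian prior on coefficients is equivalent to an L2 (ridge) penalty. A minimal ridge sketch on a linear model; the paper's actual estimators target conditional logistic likelihoods for matched case-control data with thematic grouping of exposures, which this does not implement:

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge (Gaussian-prior / L2-penalized) estimator:
    beta = (X'X + lam*I)^{-1} X'y.
    Larger lam shrinks coefficients toward zero, trading a little bias
    for reduced variance among correlated predictors."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

Shrinkage is exactly what helps with the abstract's stated problem: correlated risk factors inflate type-I error and mask each other under unpenalized estimation, and pooling them within biologically motivated themes regularizes the fit further.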
Chaudhari, Mangesh I; Muralidharan, Ajay; Pratt, Lawrence R; Rempe, Susan B
2018-02-12
Progress in understanding liquid ethylene carbonate (EC) and propylene carbonate (PC) on the basis of molecular simulation, emphasizing simple models of interatomic forces, is reviewed. Results on the bulk liquids are examined from the perspective of anticipated applications to materials for electrical energy storage devices. Preliminary results on electrochemical double-layer capacitors based on carbon nanotube forests and on model solid-electrolyte interphase (SEI) layers of lithium ion batteries are considered as examples. The basic results discussed suggest that an empirically parameterized, non-polarizable force field can reproduce experimental structural, thermodynamic, and dielectric properties of EC and PC liquids with acceptable accuracy. More sophisticated force fields might include molecular polarizability and Buckingham-model description of inter-atomic overlap repulsions as extensions to Lennard-Jones models of van der Waals interactions. Simple approaches should be similarly successful also for applications to organic molecular ions in EC/PC solutions, but the important case of Li[Formula: see text] deserves special attention because of the particularly strong interactions of that small ion with neighboring solvent molecules. To treat the Li[Formula: see text] ions in liquid EC/PC solutions, we identify interaction models defined by empirically scaled partial charges for ion-solvent interactions. The empirical adjustments use more basic inputs, electronic structure calculations and ab initio molecular dynamics simulations, and also experimental results on Li[Formula: see text] thermodynamics and transport in EC/PC solutions. Application of such models to the mechanism of Li[Formula: see text] transport in glassy SEI models emphasizes the advantage of long time-scale molecular dynamics studies of these non-equilibrium materials.
Guideline recommendations and antimicrobial resistance: the need for a change.
Elias, Christelle; Moja, Lorenzo; Mertz, Dominik; Loeb, Mark; Forte, Gilles; Magrini, Nicola
2017-07-26
Antimicrobial resistance has become a global burden to which inappropriate antimicrobial use is an important contributing factor. Any decision on the selection of antibiotics should consider their effects on antimicrobial resistance. The objective of this study was to assess the extent to which antibiotic prescribing guidelines have considered resistance patterns when making recommendations for five highly prevalent infectious syndromes. We used Medline searches, complemented with extensive use of Web search engines, to identify guidelines on empirical treatment of community-acquired pneumonia, urinary tract infections, acute otitis media, rhinosinusitis, and pharyngitis. We collected data on microbiology and resistance patterns and identified discrete pattern categories. We assessed the extent to which recommendations considered resistance, in addition to efficacy and safety, when recommending antibiotics. We identified 135 guidelines, which reported a total of 251 recommendations. Most (103/135, 79%) were from developed countries. Community-acquired pneumonia was the syndrome most represented (51, 39%). In only 16 (6.4%) recommendations was the selection of empirical antibiotics discussed in relation to resistance and specific microbiological data. In a further 69 (27.5%) recommendations, references were made to resistance, but the attempt was inconsistent. Across syndromes, 12 patterns of resistance with implications for recommendations were observed. 50% to 75% of recommendations did not attempt to set the recommendation in the context of these patterns. There is consistent evidence that guidelines on empirical antibiotic use did not routinely consider resistance in their recommendations. Decision-makers should analyse and report the extent of local resistance patterns to allow better decision-making.
Moustafa, Ahmed A.; Wufong, Ella; Servatius, Richard J.; Pang, Kevin C. H.; Gluck, Mark A.; Myers, Catherine E.
2013-01-01
A recurrent-network model provides a unified account of the hippocampal region in mediating the representation of temporal information in classical eyeblink conditioning. Much empirical research is consistent with a general conclusion that delay conditioning (in which the conditioned stimulus CS and unconditioned stimulus US overlap and co-terminate) is independent of the hippocampal system, while trace conditioning (in which the CS terminates before US onset) depends on the hippocampus. However, recent studies show that, under some circumstances, delay conditioning can be hippocampal-dependent and trace conditioning can be spared following hippocampal lesion. Here, we present an extension of our prior trial-level models of hippocampal function and stimulus representation that can explain these findings within a unified framework. Specifically, the current model includes adaptive recurrent collateral connections that aid in the representation of intra-trial temporal information. With this model, as in our prior models, we argue that the hippocampus is not specialized for conditioned response timing, but rather is a general-purpose system that learns to predict the next state of all stimuli given the current state of variables encoded by activity in recurrent collaterals. As such, the model correctly predicts that hippocampal involvement in classical conditioning should be critical not only when there is an intervening trace interval, but also when there is a long delay between CS onset and US onset. Our model simulates empirical data from many variants of classical conditioning, including delay and trace paradigms in which the length of the CS, the inter-stimulus interval, or the trace interval is varied. Finally, we discuss model limitations, future directions, and several novel empirical predictions of this temporal processing model of hippocampal function and learning. PMID:23178699
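The model's core idea, learning to predict the next state of all stimuli from the current state, can be illustrated with a toy linear recurrent predictor trained by gradient descent. This is a sketch only; the paper's architecture, with adaptive recurrent collaterals representing intra-trial temporal information, is far richer:

```python
import numpy as np

def train_next_state_predictor(states, lr=0.05, epochs=300, seed=0):
    """Fit a linear map W so that states[t] @ W approximates states[t+1],
    by gradient descent on the mean squared prediction error.
    states: (T, n) array of stimulus-state vectors over a trial."""
    rng = np.random.default_rng(seed)
    n = states.shape[1]
    W = rng.normal(scale=0.1, size=(n, n))   # small random initial weights
    X, Y = states[:-1], states[1:]           # (current state, next state) pairs
    for _ in range(epochs):
        err = X @ W - Y                      # prediction error on every step
        W -= lr * X.T @ err / len(X)         # gradient step on squared error
    return W
```

Even this toy version captures the claim that such a predictor cares about temporal structure per se, not response timing: it simply learns whatever state-to-state transitions the trial contains.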
Stochastic Geometric Network Models for Groups of Functional and Structural Connectomes
Friedman, Eric J.; Landsberg, Adam S.; Owen, Julia P.; Li, Yi-Ou; Mukherjee, Pratik
2014-01-01
Structural and functional connectomes are emerging as important instruments in the study of normal brain function and in the development of new biomarkers for a variety of brain disorders. In contrast to single-network studies that presently dominate the (non-connectome) network literature, connectome analyses typically examine groups of empirical networks and then compare these against standard (stochastic) network models. Current practice in connectome studies is to employ stochastic network models derived from social science and engineering contexts as the basis for the comparison. However, these are not necessarily best suited for the analysis of connectomes, which often contain groups of very closely related networks, such as occurs with a set of controls or a set of patients with a specific disorder. This paper studies important extensions of standard stochastic models that make them better adapted for analysis of connectomes, and develops new statistical fitting methodologies that account for inter-subject variations. The extensions explicitly incorporate geometric information about a network based on distances and inter/intra hemispherical asymmetries (to supplement ordinary degree-distribution information), and utilize a stochastic choice of networks' density levels (for fixed threshold networks) to better capture the variance in average connectivity among subjects. The new statistical tools introduced here allow one to compare groups of networks by matching both their average characteristics and the variations among them. A notable finding is that connectomes have high “smallworldness” beyond that arising from geometric and degree considerations alone. PMID:25067815
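One way to "explicitly incorporate geometric information" into a stochastic network model, as described above, is to make edge probability decay with inter-node distance. A minimal sketch; the exponential-decay form and the beta parameter are illustrative assumptions, not the paper's fitted connectome models:

```python
import math
import random

def geometric_stochastic_graph(coords, beta=1.0, seed=None):
    """Distance-penalized random graph: connect nodes i and j with
    probability exp(-beta * d_ij), so spatially close regions are more
    likely to be linked. beta = 0 ignores geometry (complete graph in
    expectation); large beta suppresses long-range edges."""
    rng = random.Random(seed)
    n = len(coords)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(coords[i], coords[j])
            if rng.random() < math.exp(-beta * d):
                edges.add((i, j))
    return edges
```

A fuller model in the paper's spirit would additionally match degree distributions, hemispheric asymmetries, and a stochastic density level per subject; the distance penalty here is just the geometric ingredient.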
NASA Astrophysics Data System (ADS)
Karimzadeh, Shaghayegh; Askan, Aysegul
2018-04-01
Located within a basin structure, at the conjunction of the North East Anatolian, North Anatolian and Ovacik Faults, Erzincan city center (Turkey) is one of the most hazardous regions in the world. The combination of the seismotectonic and geological settings of the region has resulted in a series of significant seismic events, including the 1939 (Ms = 7.8) and 1992 (Mw = 6.6) earthquakes. The devastating 1939 earthquake occurred in the pre-instrumental era in the region, with no local seismograms available. Thus, a limited number of studies exist on that earthquake. However, the 1992 event, despite the sparse local network at that time, has been studied extensively. This study aims to simulate the 1939 Erzincan earthquake using available regional seismic and geological parameters. Despite several uncertainties involved, such an effort to quantitatively model the 1939 earthquake is promising, given the historical reports of extensive damage and fatalities in the area. The results of this study are expressed in terms of anticipated acceleration time histories at certain locations, the spatial distribution of selected ground motion parameters, and felt intensity maps in the region. Simulated motions are first compared against empirical ground motion prediction equations derived with both local and global datasets. Next, anticipated intensity maps of the 1939 earthquake are obtained using local correlations between peak ground motion parameters and felt intensity values. Comparisons of the estimated intensity distributions with the corresponding observed intensities indicate a reasonable modeling of the 1939 earthquake.
Why is it hard to make progress in assessing children's decision-making competence?
Hein, Irma M; Troost, Pieter W; Broersma, Alice; de Vries, Martine C; Daams, Joost G; Lindauer, Ramón J L
2015-01-10
For decades, the discussion on children's competence to consent to medical issues has concentrated around normative concerns, with little progress in clinical practices. Decision-making competence is an important condition in the informed consent model. In pediatrics, clinicians need to strike a proper balance in order to both protect children's interests when they are not fully able to do so themselves and to respect their autonomy when they are. Children's competence to consent, however, is currently not assessed in a standardized way. Moreover, the correlation between competence to give informed consent and age in children has never been systematically investigated, nor do we know which factors exactly contribute to children's competence. This article aims to identify these gaps in knowledge and suggests options for dealing with the obstacles in empirical research in order to advance policies and practices regarding children's medical decision-making competence. Understanding children's competency is hampered by the law. Legislative regulations concerning competency are established on a strong presumption that persons older than a certain age are competent, whereas younger persons are not. Furthermore, a number of contextual factors are believed to influence a child's decision-making competence: the developmental stage of children, the influence of parents and peers, the quality of information provision, life experience, the type of medical decision, and so on. Ostensibly, these diverse and extensive barriers hinder any form of advancement in this conflicted area. Addressing these obstacles encourages the discussion on children's competency, in which the most prominent question concerns the lack of a clear operationalization of children's competence to consent. Empirical data are needed to substantiate the discussion. The empirical approach offers an opportunity to give direction to the debate.
Recommendations for future research include: studying a standardized assessment instrument covering all four relevant dimensions of competence (understanding, reasoning, appreciation, expressing a choice), including a study population of children covering the full age range of 7 to 18 years, improving information provision, and assessing relevant contextual data.
A computational approach to compare regression modelling strategies in prediction research.
Pajouheshnia, Romin; Pestman, Wiebe R; Teerenstra, Steven; Groenwold, Rolf H H
2016-08-25
It is often unclear which approach to fit, assess and adjust a model will yield the most accurate prediction model. We present an extension of an approach for comparing modelling strategies in linear regression to the setting of logistic regression and demonstrate its application in clinical prediction research. A framework for comparing logistic regression modelling strategies by their likelihoods was formulated using a wrapper approach. Five different strategies for modelling, including simple shrinkage methods, were compared in four empirical data sets to illustrate the concept of a priori strategy comparison. Simulations were performed in both randomly generated data and empirical data to investigate the influence of data characteristics on strategy performance. We applied the comparison framework in a case study setting. Optimal strategies were selected based on the results of a priori comparisons in a clinical data set, and the performance of models built according to each strategy was assessed using the Brier score and calibration plots. The performance of modelling strategies was highly dependent on the characteristics of the development data in both linear and logistic regression settings. A priori comparisons in four empirical data sets found that no strategy consistently outperformed the others. The percentage of times that a model adjustment strategy outperformed a logistic model ranged from 3.9% to 94.9%, depending on the strategy and data set. However, in our case study setting the a priori selection of optimal methods did not result in detectable improvement in model performance when assessed in an external data set. The performance of prediction modelling strategies is a data-dependent process and can be highly variable between data sets within the same clinical domain. A priori strategy comparison can be used to determine an optimal logistic regression modelling strategy for a given data set before selecting a final modelling approach.
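The core idea of comparing modelling strategies on held-out data can be sketched briefly. This is a simplified illustration, not the authors' exact likelihood-based wrapper: two logistic-regression strategies (effectively unpenalized maximum likelihood versus ridge shrinkage) are scored by held-out log-likelihood (log-loss) and Brier score on synthetic data; the data set and all parameter choices are illustrative assumptions.

```python
# Sketch: a priori comparison of two logistic-regression modelling strategies.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import brier_score_loss, log_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.4, random_state=0)

strategies = {
    "maximum likelihood": LogisticRegression(C=1e6, max_iter=1000),  # ~unpenalized
    "ridge shrinkage": LogisticRegression(C=0.1, max_iter=1000),     # shrunken
}
results = {}
for name, model in strategies.items():
    p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    results[name] = (log_loss(y_te, p), brier_score_loss(y_te, p))
    print(f"{name}: log-loss={results[name][0]:.3f}, Brier={results[name][1]:.3f}")
```

Which strategy wins depends on the data characteristics, which is precisely the paper's point: the comparison must be repeated per data set rather than assumed.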
Group Theoretical Characterization of Wave Equations
NASA Astrophysics Data System (ADS)
Nisticò, Giuseppe
2017-12-01
Group theoretical methods, worked out in particular by Mackey and Wigner, allow one to attain the explicit Quantum Theory of a free particle through a purely deductive development based on symmetry principles. The extension of these methods to the case of an interacting particle finds a serious obstacle in the loss of the symmetry condition for the transformations of Galilei's group. The known attempts towards such an extension introduce restrictions which lead to theories that are empirically too limited. In the present article we show how the difficulties raised by the loss of symmetry can be overcome without the restrictions that affect the past attempts. According to our results, the different specific forms of the wave equation of an interacting particle are implied by particular first-order invariance properties that characterize the interaction with respect to specific sub-groups of Galilean transformations. Moreover, the possibility of yet unknown forms of the wave equation is left open.
Frictional velocity-weakening in landslides on Earth and on other planetary bodies.
Lucas, Antoine; Mangeney, Anne; Ampuero, Jean Paul
2014-03-04
One of the ultimate goals in landslide hazard assessment is to predict maximum landslide extension and velocity. Despite much work, the physical processes governing energy dissipation during these natural granular flows remain uncertain. Field observations show that large landslides travel over unexpectedly long distances, suggesting low dissipation. Numerical simulations of landslides require a small friction coefficient to reproduce the extension of their deposits. Here, based on analytical and numerical solutions for granular flows constrained by remote-sensing observations, we develop a consistent method to estimate the effective friction coefficient of landslides. This method uses a constant basal friction coefficient that reproduces the first-order landslide properties. We show that friction decreases with increasing volume or, more fundamentally, with increasing sliding velocity. Inspired by frictional weakening mechanisms thought to operate during earthquakes, we propose an empirical velocity-weakening friction law under a unifying phenomenological framework applicable to small and large landslides observed on Earth and beyond.
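A velocity-weakening friction law of the general kind described above can be written as a smooth interpolation between a static friction coefficient at slow sliding and a much lower weakened value at high velocity. The sketch below uses that generic form with illustrative parameter values (an assumption, not the paper's fitted law or constants).

```python
# Sketch of a generic velocity-weakening friction law: effective friction
# decreases from mu_0 (slow sliding) toward mu_w (fast sliding), with U_w
# setting the characteristic weakening velocity. Values are illustrative.
def mu_eff(U, mu_0=0.6, mu_w=0.1, U_w=4.0):
    """Effective friction coefficient at sliding velocity U (m/s)."""
    return mu_w + (mu_0 - mu_w) / (1.0 + U / U_w)

for U in (0.1, 1.0, 10.0, 100.0):
    print(f"U = {U:6.1f} m/s  ->  mu_eff = {mu_eff(U):.3f}")
```

Because large landslides slide faster, such a law reproduces the observation that effective friction decreases with volume: the velocity, not the volume itself, is the more fundamental control.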
NASA Astrophysics Data System (ADS)
Steffen, Julien; Hartke, Bernd
2017-10-01
Building on the recently published quantum-mechanically derived force field (QMDFF) and its empirical valence bond extension, EVB-QMDFF, it is now possible to generate a reliable potential energy surface for any given elementary reaction step in an essentially black box manner. This requires a limited and pre-defined set of reference data near the reaction path and generates an accurate approximation of the reference potential energy surface, on and off the reaction path. This intermediate representation can be used to generate reaction rate data, with far better accuracy and reliability than with traditional approaches based on transition state theory (TST) or variational extensions thereof (VTST), even if those include sophisticated tunneling corrections. However, the additional expense at the reference level remains very modest. We demonstrate all this for three arbitrarily chosen example reactions.
Bayesian Group Bridge for Bi-level Variable Selection.
Mallick, Himel; Yi, Nengjun
2017-06-01
A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.
Modeling species-abundance relationships in multi-species collections
Peng, S.; Yin, Z.; Ren, H.; Guo, Q.
2003-01-01
The species-abundance relationship is one of the most fundamental aspects of community ecology. Since Motomura first developed the geometric series model to describe the features of community structure, ecologists have developed many other models to fit the species-abundance data in communities. These models can be classified into empirical and theoretical ones, including (1) statistical models, i.e., negative binomial distribution (and its extension), log-series distribution (and its extension), geometric distribution, lognormal distribution, and Poisson-lognormal distribution; (2) niche models, i.e., geometric series, broken stick, overlapping niche, particulate niche, random assortment, dominance pre-emption, dominance decay, random fraction, weighted random fraction, composite niche, and the Zipf or Zipf-Mandelbrot model; and (3) dynamic models describing community dynamics and the restrictive function of the environment on the community. These models have different characteristics and fit species-abundance data in various communities or collections. Among them, the log-series distribution, lognormal distribution, geometric series, and broken stick model have been most widely used.
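Two of the most widely used niche models named above have simple closed forms. The sketch below follows the standard textbook formulations (Motomura's geometric series with niche pre-emption fraction k, and MacArthur's broken stick) for N individuals distributed over S ranked species; the example values of N, S and k are arbitrary.

```python
# Expected abundances under two classical niche models, ranks i = 1..S.
def geometric_series(N, S, k):
    """Motomura: n_i = N * C_k * k * (1-k)**(i-1), C_k normalizes the sum to N."""
    C_k = 1.0 / (1.0 - (1.0 - k) ** S)
    return [N * C_k * k * (1.0 - k) ** (i - 1) for i in range(1, S + 1)]

def broken_stick(N, S):
    """MacArthur: E(n_i) = (N/S) * sum_{j=i}^{S} 1/j."""
    return [(N / S) * sum(1.0 / j for j in range(i, S + 1)) for i in range(1, S + 1)]

abund_geo = geometric_series(1000, 10, 0.4)
abund_bs = broken_stick(1000, 10)
print([round(x) for x in abund_geo[:3]])
print([round(x) for x in abund_bs[:3]])
```

The geometric series yields a much steeper rank-abundance curve than the broken stick, which is why the two models fit strongly dominated versus more equitable communities, respectively.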
Genotype imputation in a coalescent model with infinitely-many-sites mutation
Huang, Lucy; Buzbas, Erkan O.; Rosenberg, Noah A.
2012-01-01
Empirical studies have identified population-genetic factors as important determinants of the properties of genotype-imputation accuracy in imputation-based disease association studies. Here, we develop a simple coalescent model of three sequences that we use to explore the theoretical basis for the influence of these factors on genotype-imputation accuracy, under the assumption of infinitely-many-sites mutation. Employing a demographic model in which two populations diverged at a given time in the past, we derive the approximate expectation and variance of imputation accuracy in a study sequence sampled from one of the two populations, choosing between two reference sequences, one sampled from the same population as the study sequence and the other sampled from the other population. We show that under this model, imputation accuracy—as measured by the proportion of polymorphic sites that are imputed correctly in the study sequence—increases in expectation with the mutation rate, the proportion of the markers in a chromosomal region that are genotyped, and the time to divergence between the study and reference populations. Each of these effects derives largely from an increase in information available for determining the reference sequence that is genetically most similar to the sequence targeted for imputation. We analyze as a function of divergence time the expected gain in imputation accuracy in the target using a reference sequence from the same population as the target rather than from the other population. Together with a growing body of empirical investigations of genotype imputation in diverse human populations, our modeling framework lays a foundation for extending imputation techniques to novel populations that have not yet been extensively examined. PMID:23079542
NASA Astrophysics Data System (ADS)
Rautenbach, Victoria; Coetzee, Serena; Çöltekin, Arzu
2017-05-01
Topographic maps are among the most commonly used map types, however, their complex and information-rich designs depicting natural, human-made and cultural features make them difficult to read. Regardless of their complexity, spatial planners make extensive use of topographic maps in their work. On the other hand, various studies suggest that map literacy among the development planning professionals in South Africa is not very high. The widespread use of topographic maps combined with the low levels of map literacy presents challenges for effective development planning. In this paper we address some of these challenges by developing a specialized task taxonomy based on systematically assessed map literacy levels; and conducting an empirical experiment with topographic maps to evaluate our task taxonomy. In such empirical studies if non-realistic tasks are used, the results of map literacy tests may be skewed. Furthermore, experience and familiarity with the studied map type play a role in map literacy. There is thus a need to develop map literacy tests aimed at planners specifically. We developed a taxonomy of realistic map reading tasks typically executed during the planning process. The taxonomy defines six levels tasks of increasing difficulty and complexity, ranging from recognising symbols to extracting knowledge. We hypothesized that competence in the first four levels indicates functional map literacy. In this paper, we present results from an empirical experiment with 49 map literate participants solving a subset of tasks from the first four levels of the taxonomy with a topographic map. Our findings suggest that the proposed taxonomy is a good reference for evaluating topographic map literacy. Participants solved the tasks on all four levels as expected and we therefore conclude that the experiment based on the first four levels of the taxonomy successfully determined the functional map literacy of the participants. 
We plan to continue the study for the remaining levels, repeat the experiments with a group of map illiterate participants to confirm that the taxonomy can also be used to determine map illiteracy.
Kaplan, Warren Allan; Ritz, Lindsay Sarah; Vitello, Marie
2011-01-01
Objectives: The objective of this study was to assess the existing theoretical and empirical literature examining the link between "local production" of pharmaceuticals and medical devices and increased local access to these products. Our preliminary hypothesis is that studies showing a robust relationship between local production and access to medical products are sparse, at best. Methods: An extensive literature search was conducted using a wide variety of databases and search terms intending to capture as many different aspects of this issue as possible. The results of the search were reviewed and categorized according to their relevance to the research question. The literature was also reviewed to determine the rigor used to examine the effects of local production and what implications these experiences hold for other developing countries. Results: Literature addressing the benefits of local production and the link between it and access to medical products is sparse, mainly descriptive and lacking empirical evidence. Of the literature we reviewed that addressed comparative economics and strategic planning of multinational and domestic firms, there are few dealing with emerging markets and lower-middle income countries and even fewer that compare local biomedical producers with multinational corporations in terms of a reasonable metric. What comparisons exist mainly relate to prices of local versus foreign/multinational produced medicines. Conclusions: An assessment of the existing theoretical and empirical literature examining the link between "local production" of pharmaceuticals and medical devices and increased local access to these products reveals a paucity of literature explicitly dealing with this issue. Of the literature that does exist, methods used to date are insufficient to prove a robust relationship between local production of medical products and access to these products. 
There are mixed messages from various studies, and although the studies may correctly depict specific situations in specific countries with reference to specific products, such evidence cannot be generalized. Our review strongly supports the need for further research in understanding the dynamic link between local production and access to medical products. PMID:23093883
U.S. national report to the International Union of Geodesy and Geophysics
NASA Technical Reports Server (NTRS)
Gorney, D. J.
1987-01-01
This paper highlights progress by U.S. authors during 1983-1986 in the broad area of auroral research. Atmospheric emissions and their use as a tool for remote-sensing the dynamics, energetics, and effects of auroral activity is a subject which is emphasized here because of the vast progress made in this area on both observational and theoretical fronts. The evolution of primary auroral electrons, the acceleration of auroral ions, small-scale electric fields, auroral kilometric radiation, auroral empirical models and activity indices are also reviewed. An extensive bibliography is supplied.
Open source approaches to health information systems in Kenya.
Drury, Peter; Dahlman, Bruce
2005-01-01
This paper focuses on the experience to date of an installation of a Free Open Source Software (FOSS) product, Care2X, at a church hospital in Kenya. The FOSS movement has been maturing rapidly. In developed countries, its benefits relative to proprietary software have been extensively discussed and ways of quantifying the total costs of the development have been developed. Nevertheless, empirical data on the impact of FOSS, particularly in the developing world, concerning its use and development is still quite limited, although the possibilities of FOSS are becoming increasingly attractive.
Whistles, bells, and cogs in machines: Thomas Huxley and epiphenomenalism.
Greenwood, John
2010-01-01
In this paper I try to shed some historical light upon the doctrine of epiphenomenalism, by focusing on the version of epiphenomenalism championed by Thomas Huxley, which is often treated as a classic statement of the doctrine. I argue that it is doubtful if Huxley held any form of metaphysical epiphenomenalism, and that he held a more limited form of empirical epiphenomenalism with respect to consciousness but not with respect to mentality per se. Contrary to what is conventionally supposed, Huxley's empirical epiphenomenalism with respect to consciousness was not simply based upon the demonstration of the neurophysiological basis of conscious mentality, or derived from the extension of mechanistic and reflexive principles of explanation to encompass all forms of animal and human behavior, but was based upon the demonstration of purposive and coordinated animal and human behavior in the absence of consciousness. Given Huxley's own treatment of mentality, his characterization of animals and humans as "conscious automata" was not well chosen.
Evolution beyond neo-Darwinism: a new conceptual framework.
Noble, Denis
2015-01-01
Experimental results in epigenetics and related fields of biological research show that the Modern Synthesis (neo-Darwinist) theory of evolution requires either extension or replacement. This article examines the conceptual framework of neo-Darwinism, including the concepts of 'gene', 'selfish', 'code', 'program', 'blueprint', 'book of life', 'replicator' and 'vehicle'. This form of representation is a barrier to extending or replacing existing theory as it confuses conceptual and empirical matters. These need to be clearly distinguished. In the case of the central concept of 'gene', the definition has moved all the way from describing a necessary cause (defined in terms of the inheritable phenotype itself) to an empirically testable hypothesis (in terms of causation by DNA sequences). Neo-Darwinism also privileges 'genes' in causation, whereas in multi-way networks of interactions there can be no privileged cause. An alternative conceptual framework is proposed that avoids these problems, and which is more favourable to an integrated systems view of evolution. © 2015. Published by The Company of Biologists Ltd.
Suppressing disease spreading by using information diffusion on multiplex networks.
Wang, Wei; Liu, Quan-Hui; Cai, Shi-Min; Tang, Ming; Braunstein, Lidia A; Stanley, H Eugene
2016-07-06
Although there is always an interplay between the dynamics of information diffusion and disease spreading, the empirical research on the systemic coevolution mechanisms connecting these two spreading dynamics is still lacking. Here we investigate the coevolution mechanisms and dynamics between information and disease spreading by utilizing real data and a proposed spreading model on a multiplex network. Our empirical analysis finds asymmetrical interactions between the information and disease spreading dynamics. Our results obtained from both the theoretical framework and extensive stochastic numerical simulations suggest that an information outbreak can be triggered in a communication network by its own spreading dynamics or by a disease outbreak on a contact network, but that the disease threshold is not affected by information spreading. Our key finding is that there is an optimal information transmission rate that markedly suppresses the disease spreading. We find that the time evolution of the dynamics in the proposed model qualitatively agrees with the real-world spreading processes at the optimal information transmission rate.
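The asymmetric coupling described above can be caricatured in a toy well-mixed model: awareness spreads like an epidemic, infected individuals also become aware, and aware individuals are less susceptible. This is a deliberately crude mean-field sketch with made-up rates, not the paper's multiplex-network model; in particular, in this caricature more information monotonically suppresses disease, whereas the paper's network model finds an optimal (interior) information transmission rate.

```python
# Toy mean-field coevolution: i = infected fraction (SIS-like contact layer),
# a = aware fraction (UAU-like communication layer). All rates are invented.
def coevolve(beta=0.3, lam=0.2, gamma=0.1, delta=0.15, reduction=0.6, steps=500):
    """lam: information transmission rate; reduction: infection-rate cut when aware."""
    i, a = 0.01, 0.01
    for _ in range(steps):
        beta_eff = beta * (1 - reduction * a)        # aware people infect less easily
        di = beta_eff * i * (1 - i) - gamma * i      # disease dynamics
        # awareness spreads on its own AND is seeded by infection (asymmetry)
        da = lam * a * (1 - a) + 0.1 * i * (1 - a) - delta * a
        i, a = i + di, a + da
    return i

for lam in (0.0, 0.2, 0.4):
    print(f"info rate {lam}: endemic infected fraction ~ {coevolve(lam=lam):.3f}")
```

Even this caricature reproduces one asymmetry from the abstract: disease seeds awareness (the 0.1*i term), but awareness only modulates, and cannot by itself eliminate, the disease dynamics.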
Wittgenstein's neurophenomenology.
Cole, J
2007-06-01
Wittgenstein, despite being considered an analytical philosopher, has been quoted extensively by neurologists like Oliver Sacks. This paper explores how Wittgenstein, despite suggesting that science was antithetical to philosophy, made observations relevant to cognitive neuroscience. His work on the inner and the outer, the relation between language and sensation or perception, and on the embodied nature of emotion and its communication, is important for an understanding of neurological impairment beyond our experience. In some of his enigmatic short writing his insights are pertinent to patients' experience, say of pain, Capgras' Syndrome and spinal cord injury. He also made observations on movement sense, will and action. He did not engage in empirical science, nor obtain data in any conventional sense. But his genius was not confined to abstract philosophy. His powers of observation and introspection led him to explore lived experience in new ways, some of which are only now being approached empirically. The method of science, he once wrote, leads philosophy into complete darkness. Had he lived today, one hopes that even he might have changed his mind.
Popular Medicine and Empirics in Greece, 1900–1950: An Oral History Approach
Hionidou, Violetta
2016-01-01
Western literature has focused on medical plurality but also on the pervasive existence of quacks who managed to survive from at least the eighteenth to the twentieth century. Focal points of their practices have been their efforts at enrichment and their extensive advertising. In Greece, empirical, untrained healers in the first half of the twentieth century do not fit in with this picture. They did not ask for payment, although they did accept ‘gifts’; they did not advertise their practice; and they had fixed places of residence. Licensed physicians did not undertake a concerted attack against them, as happened in the West against the quacks, and neither did the state. In this paper, it is argued that both the protection offered by their localities to resident popular healers and the healers’ lack of demand for monetary payment were jointly responsible for the lack of prosecutions of popular healers. Moreover, the linking of popular medicine with ancient traditions, as put forward by influential folklore studies, also reduced the likelihood of an aggressive discourse against the popular healers. Although the Greek situation in the early twentieth century contrasts with the historiography on quacks, it is much more in line with that on wise women and cunning-folk. It is thus the identification of these groups of healers in Greece and elsewhere, mostly through the use of oral histories but also through folklore studies, that reveals a different story from that of the aggressive discourse of medical men against quacks. PMID:27628859
Testing the Goodwin growth-cycle macroeconomic dynamics in Brazil
NASA Astrophysics Data System (ADS)
Moura, N. J.; Ribeiro, Marcelo B.
2013-05-01
This paper discusses the empirical validity of Goodwin’s (1967) macroeconomic model of growth with cycles by assuming that the individual income distribution of the Brazilian society is described by the Gompertz-Pareto distribution (GPD). This is formed by the combination of the Gompertz curve, representing the overwhelming majority of the population (~99%), with the Pareto power law, representing the tiny richest part (~1%). In line with Goodwin’s original model, we identify the Gompertzian part with the workers and the Paretian component with the class of capitalists. Since the GPD parameters are obtained for each year and the Goodwin macroeconomics is a time evolving model, we use previously determined, and further extended here, Brazilian GPD parameters, as well as unemployment data, to study the time evolution of these quantities in Brazil from 1981 to 2009 by means of the Goodwin dynamics. This is done in the original Goodwin model and an extension advanced by Desai et al. (2006). As far as Brazilian data is concerned, our results show partial qualitative and quantitative agreement with both models in the studied time period, although the original one provides better data fit. Nevertheless, both models fall short of a good empirical agreement as they predict single center cycles which were not found in the data. We discuss the specific points where the Goodwin dynamics must be improved in order to provide a more realistic representation of the dynamics of economic systems.
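The original Goodwin (1967) dynamics being tested is a Lotka-Volterra system in the wage share u and employment rate v, with a linear Phillips curve. The sketch below integrates the textbook form of the model; the parameter values and initial conditions are illustrative assumptions, not the Brazilian estimates from the paper. The closed orbits around the fixed point are the "single center cycles" the abstract refers to.

```python
# Textbook Goodwin model: du/dt = u*(rho*v - gamma - alpha),
#                         dv/dt = v*((1-u)/sigma - alpha - beta),
# integrated with forward Euler. Parameters are illustrative only.
def goodwin(u=0.8, v=0.9, sigma=2.5, alpha=0.02, beta=0.01, gamma=0.5, rho=0.6,
            dt=0.01, steps=10000):
    path = [(u, v)]
    for _ in range(steps):
        du = u * (rho * v - gamma - alpha) * dt          # wages: Phillips curve
        dv = v * ((1 - u) / sigma - alpha - beta) * dt   # employment: accumulation
        u, v = u + du, v + dv
        path.append((u, v))
    return path

path = goodwin()
# The single center around which all orbits cycle:
u_star = 1 - 2.5 * (0.02 + 0.01)   # = 1 - sigma*(alpha+beta)
v_star = (0.5 + 0.02) / 0.6        # = (gamma+alpha)/rho
print(round(u_star, 3), round(v_star, 3))
```

Because the center is structurally unstable (a Lotka-Volterra property), small changes to the Phillips curve or accumulation rule turn it into a spiral, which is one route by which extensions such as Desai et al. (2006) modify the dynamics.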
NASA Technical Reports Server (NTRS)
Yang, H. Q.; West, Jeff
2018-01-01
Determination of slosh damping is a very challenging task as there is no analytical solution. The damping physics involves vorticity dissipation, which requires the full solution of the nonlinear Navier-Stokes equations. As a result, previous investigations were mainly carried out by extensive experiments. A systematic study is needed to understand the damping physics of baffled tanks, to identify the difference between the empirical Miles equation and experimental measurements, and to develop new semi-empirical relations to better represent the real damping physics. The approach of this study is to use Computational Fluid Dynamics (CFD) technology to shed light on the damping mechanisms of a baffled tank. First, a 1-D Navier-Stokes equation representing different length scales and time scales in the baffle damping physics is developed and analyzed. Loci-STREAM-VOF, a well validated CFD solver developed at NASA MSFC, is applied to study the vorticity field around a baffle and around the fluid-gas interface to highlight the dissipation mechanisms at different slosh amplitudes. Previous measurement data is then used to validate the CFD damping results. The study found several critical parameters controlling fluid damping from a baffle: local slosh amplitude to baffle thickness (A/t), surface liquid depth to tank radius (d/R), local slosh amplitude to baffle width (A/W), and non-dimensional slosh frequency. The simulation highlights three significant damping regimes where different mechanisms dominate. The study proves that the previously found discrepancies between the Miles equation and experimental measurement are not due to measurement scatter, but rather due to different damping mechanisms at various slosh amplitudes. The limitations on the use of the Miles equation are discussed based on the flow regime.
Variations in embodied energy and carbon emission intensities of construction materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wan Omar, Wan-Mohd-Sabki; School of Environmental Engineering, Universiti Malaysia Perlis, 02600 Arau, Perlis; Doh, Jeung-Hwan, E-mail: j.doh@griffith.edu.au
2014-11-15
Identification of parameter variation allows us to conduct a more detailed life cycle assessment (LCA) of the embodied energy and carbon emissions of materials over their life cycle. Previous research has demonstrated that hybrid LCA (HLCA) can generally overcome the incompleteness and accuracy problems of embodied energy (EE) and embodied carbon (EC) emission assessment. Unfortunately, the current interpretation and quantification procedure has not been extensively and empirically studied in a qualitative manner, especially the hybridisation of process LCA and I-O LCA. To address this weakness, this study empirically demonstrates the changes in EE and EC intensities caused by variations in key parameters of material production. Using Australia and Malaysia as a case study, the results are compared with previous hybrid models to identify key parameters and issues. The parameters considered in this study are technological changes, energy tariffs, primary energy factors, the disaggregation constant, emission factors, and material price fluctuation. It was found that changes in technological efficiency, energy tariffs and material prices caused significant variations in the model. Finally, the comparison of hybrid models revealed that non-energy-intensive materials greatly influence the variations due to high indirect energy and carbon emissions in the upstream boundary of material production; any decision related to these materials should therefore be considered carefully.
Highlights:
• We investigate the EE and EC intensity variation in Australia and Malaysia.
• The influences of parameter variations on the hybrid LCA model were evaluated.
• Key significant contributions to the EE and EC intensity variation were identified.
• High indirect EE and EC content caused significant variation in hybrid LCA models.
• Non-energy-intensive materials caused variation between hybrid LCA models.
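The core hybrid idea, a process inventory for the inputs that are directly measured plus an input-output (I-O) term filling the truncated upstream remainder, can be sketched as follows. The coverage fraction, price, and intensity values below are hypothetical placeholders, not figures from the study.

```python
def hybrid_embodied_energy(process_mj_per_kg, price_per_kg,
                           io_intensity_mj_per_dollar, coverage):
    """Combine process-based and I-O-based embodied energy (EE).

    coverage: fraction of the upstream system already captured by the
    process inventory; the I-O term fills only the uncovered remainder,
    avoiding double counting. All inputs here are illustrative.
    """
    io_total = price_per_kg * io_intensity_mj_per_dollar
    return process_mj_per_kg + (1.0 - coverage) * io_total

# Material price fluctuation propagates directly into the I-O term,
# which is one reason the study flags prices as a key parameter.
base = hybrid_embodied_energy(30.0, 0.12, 40.0, 0.7)        # MJ/kg
high_price = hybrid_embodied_energy(30.0, 0.18, 40.0, 0.7)  # MJ/kg
```

Because the I-O term is denominated in currency, energy tariffs and price changes shift the hybrid intensity even when the physical process inventory is unchanged, consistent with the sensitivity the authors report.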
NASA Technical Reports Server (NTRS)
Yang, H. Q.; West, Jeff
2016-01-01
Determination of slosh damping is a very challenging task, as there is no analytical solution. The damping physics involves vorticity dissipation, which requires the full solution of the nonlinear Navier-Stokes equations. As a result, previous investigations were mainly carried out by extensive experiments. A systematic study is needed to understand the damping physics of baffled tanks, to identify the difference between the empirical Miles equation and experimental measurements, and to develop new semi-empirical relations that better represent the real damping physics. The approach of this study is to use Computational Fluid Dynamics (CFD) technology to shed light on the damping mechanisms of a baffled tank. First, a 1-D Navier-Stokes equation representing the different length scales and time scales in the baffle damping physics is developed and analyzed. Loci-STREAM-VOF, a well-validated CFD solver developed at NASA MSFC, is applied to study the vorticity field around a baffle and around the fluid-gas interface to highlight the dissipation mechanisms at different slosh amplitudes. Previous measurement data are then used to validate the CFD damping results. The study found several critical parameters controlling fluid damping from a baffle: local slosh amplitude to baffle thickness (A/t), surface liquid depth to tank radius (d/R), local slosh amplitude to baffle width (A/W), and non-dimensional slosh frequency. The simulation highlights three significant damping regimes in which different mechanisms dominate. The study shows that the previously found discrepancies between the Miles equation and experimental measurements are due not to measurement scatter but to different damping mechanisms at various slosh amplitudes. The limitations on the use of the Miles equation are discussed based on the flow regime.
Sensitivity Analysis of Empirical Results on Civil War Onset
ERIC Educational Resources Information Center
Hegre, Havard; Sambanis, Nicholas
2006-01-01
In the literature on civil war onset, several empirical results are not robust or replicable across studies. Studies use different definitions of civil war and analyze different time periods, so readers cannot easily determine if differences in empirical results are due to those factors or if most empirical results are just not robust. The authors…
Atay, Christina; Conway, Erin R.; Angus, Daniel; Wiles, Janet; Baker, Rosemary; Chenery, Helen J.
2015-01-01
The progressive neuropathology involved in dementia frequently causes a gradual decline in communication skills. Communication partners who are unaware of the specific communication problems faced by people with dementia (PWD) can inadvertently challenge their conversation partner, leading to distress and a reduced flow of information between speakers. Previous research has produced an extensive literature base recommending strategies to facilitate conversational engagement in dementia. However, empirical evidence for the beneficial effects of these strategies on conversational dynamics is sparse. This study uses a time-efficient computational discourse analysis tool called Discursis to examine the link between specific communication behaviours and content-based conversational engagement in 20 conversations between PWD living in residential aged-care facilities and care staff members. Conversations analysed here were baseline conversations recorded before staff members underwent communication training. Care staff members spontaneously exhibited a wide range of facilitative and non-facilitative communication behaviours, which were coded for analysis of conversation dynamics within these baseline conversations. A hybrid approach combining manual coding and automated Discursis metric analysis provides two sets of novel insights. Firstly, this study revealed nine communication behaviours that, if used by the care staff member in a given turn, significantly increased the appearance of subsequent content-based engagement in the conversation by PWD. Secondly, the current findings reveal alignment between human- and computer-generated labelling of communication behaviour for 8 out of the total 22 behaviours under investigation. The approach demonstrated in this study provides an empirical procedure for the detailed evaluation of content-based conversational engagement associated with specific communication behaviours. PMID:26658135
Safety of meropenem in patients reporting penicillin allergy: lack of allergic cross reactions.
Cunha, B A; Hamid, N S; Krol, V; Eisenstein, L
2008-04-01
Over the years, meropenem has become the mainstay of empiric therapy for serious systemic infections in critically ill patients. Although we have had extensive clinical experience since 1996 using meropenem safely in hundreds of patients with reported allergic reactions to penicillin, without any adverse events, we had not published our experience. This study was conducted to document our clinical practice experience. Accordingly, over a 12-month period we prospectively monitored 110 patients with reported penicillin allergic reactions who were treated with meropenem. Since early empiric therapy in such patients is essential, there is often no time for penicillin skin testing; penicillin skin testing was not done in this "real world" clinical study. Patients were divided into two groups, depending on the nature of their penicillin allergic reactions. During the 12-month period, 110 patients with non-anaphylactic (59) and anaphylactic (51) penicillin allergic reactions tolerated prolonged meropenem therapy (1-4 weeks) safely, without any allergic reactions. Based on these data and our previous clinical experience, there appears to be little or no potential for cross reactivity between meropenem and penicillins, even in patients with a definite history of anaphylactic reactions to penicillins. To the best of our knowledge, this is the first prospective clinical study demonstrating that meropenem may be safely given, without penicillin skin testing, to patients with reported allergic reactions to penicillin, including those with anaphylactic reactions. We conclude that meropenem may be given safely to patients reporting a history of non-anaphylactic or anaphylactic allergic reactions to penicillins without penicillin skin testing.
NASA Astrophysics Data System (ADS)
Kuzle, A.
2018-06-01
The important role that metacognition plays as a predictor of students' mathematical learning and mathematical problem-solving has been extensively documented. But only recently has attention turned to the primary grades, and more research is needed at this level. The goals of this paper are threefold: (1) to present a framework for metacognition during mathematics problem-solving, (2) to describe a multi-method interview approach developed to study students' mathematical metacognition, and (3) to empirically evaluate the utility of the model and the adaptation of the approach in the context of grade 2 and grade 4 mathematics problem-solving. The results are discussed with regard not only to further development of the adapted multi-method interview approach, but also to its theoretical and practical implications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smyth, Padhraic
2013-07-22
This is the final report for a DOE-funded research project describing the outcome of research using non-homogeneous hidden Markov models (NHMMs) and coupled ocean-atmosphere (O-A) intermediate-complexity models (ICMs) to identify the potentially predictable modes of climate variability and to investigate their impacts at the regional scale. The main results consist of extensive development of the hidden Markov models for rainfall simulation and downscaling, specifically within the non-stationary climate change context, together with the development of parallelized software; application of NHMMs to downscaling of rainfall projections over India; identification and analysis of decadal climate signals in data and models; and studies of climate variability in terms of the dynamics of atmospheric flow regimes.
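The "non-homogeneous" element of an NHMM is that transition probabilities are modulated by exogenous covariates (e.g. large-scale climate predictors), typically through a logistic link. A stripped-down sketch of just that mechanism for a two-state wet/dry rainfall-occurrence chain is below; a full NHMM adds a hidden weather-state layer on top, and all coefficients here are invented for illustration.

```python
import numpy as np

def transition_matrix(x, b0, b1):
    """2-state non-homogeneous transition probabilities (0=dry, 1=wet),
    with each transition logit driven by a climate covariate x."""
    p01 = 1.0 / (1.0 + np.exp(-(b0[0] + b1[0] * x)))  # dry -> wet
    p11 = 1.0 / (1.0 + np.exp(-(b0[1] + b1[1] * x)))  # wet -> wet
    return np.array([[1.0 - p01, p01],
                     [1.0 - p11, p11]])

def simulate_rainfall_occurrence(covariate, b0, b1, rng):
    """Simulate daily wet/dry occurrence; the covariate series makes the
    chain non-stationary, unlike a plain Markov chain."""
    state, out = 0, []
    for x in covariate:
        p = transition_matrix(x, b0, b1)[state]
        state = rng.choice(2, p=p)
        out.append(state)
    return np.array(out)

rng = np.random.default_rng(0)
b0, b1 = np.array([-1.0, 0.0]), np.array([2.0, 2.0])
wet_days = simulate_rainfall_occurrence(np.full(365, 1.0), b0, b1, rng)
```

Downscaling then amounts to feeding projected covariates (e.g. from a climate model) through the fitted transition structure, which is how non-stationary change enters the simulated rainfall.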
Beyond upgrading typologies - In search of a better deal for honey value chains in Brazil.
Figueiredo Junior, Hugo S de; Meuwissen, Miranda P M; van der Lans, Ivo A; Oude Lansink, Alfons G J M
2017-01-01
Selection of value chain strategies by development practitioners and value chain participants themselves has been restricted to preset types of upgrading. This paper argues for an extension of the range of strategy solutions to value chains. An empirical application identifies successful strategies for honey value chains in Brazil for 2015-2020. Strategy and performance indicators were selected using the value chain Structure-Conduct-Performance (SCP) framework. Experts' opinion was elicited in a Delphi for business scenarios, and adaptive conjoint analysis was used to identify strategies for increasing production growth and local value-added. This study identifies important strategies beyond upgrading typologies, and finds that important strategies differ by performance goal and scenario. The value chain SCP allows searching for promising strategies towards performance-the "better deal"-in an integrated way.
Beyond upgrading typologies – In search of a better deal for honey value chains in Brazil
Meuwissen, Miranda P. M.; van der Lans, Ivo A.; Oude Lansink, Alfons G. J. M.
2017-01-01
Selection of value chain strategies by development practitioners and value chain participants themselves has been restricted to preset types of upgrading. This paper argues for an extension of the range of strategy solutions to value chains. An empirical application identifies successful strategies for honey value chains in Brazil for 2015–2020. Strategy and performance indicators were selected using the value chain Structure-Conduct-Performance (SCP) framework. Experts’ opinion was elicited in a Delphi for business scenarios, and adaptive conjoint analysis was used to identify strategies for increasing production growth and local value-added. This study identifies important strategies beyond upgrading typologies, and finds that important strategies differ by performance goal and scenario. The value chain SCP allows searching for promising strategies towards performance–the “better deal”–in an integrated way. PMID:28742804
Frustrated Freedom: The Effects of Agency and Wealth on Wellbeing in Rural Mozambique.
Victor, Bart; Fischer, Edward; Cooil, Bruce; Vergara, Alfredo; Mukolo, Abraham; Blevins, Meridith
2013-07-01
In Sen's capability view of poverty, wellbeing is threatened by both deficits of wealth and deficits of individual agency. Sen further predicts that "unfreedom," or low levels of agency, will suppress the wellbeing effects of higher levels of wealth. The current paper extends Sen's view to include a condition, labeled "frustrated freedom," in which relatively higher levels of agency can heighten the poverty effects of relatively low levels of material wealth. Applying data from a large-scale population study of female heads of household in rural Mozambique, the paper empirically tests Sen's view and the proposed extension. As predicted, agency is found to moderate the relationship between wealth and wellbeing, uncovering evidence of both unfreedom and frustrated freedom in the population. Further research into the complex dynamics of wellbeing and poverty is called for by the authors.
Brzank, P
2009-03-01
Due to its prevalence and its serious health and social impacts, domestic violence against women is considered a problem of high relevance both for the victims themselves and for society as a whole. Empirical data confirm this for Germany as well. Nevertheless, its extensive implications and their interdependencies have hardly been examined. In this article, a definition of domestic violence is given, followed by a brief introduction to the problem. Next, an overview of results from national and international surveys regarding the social consequences and economic costs of domestic violence is presented. The burden for subsequent generations becomes obvious, including the consequences for family and social relationship structures and for victims' occupational situations, productivity loss, high risk of poverty, homelessness, and the interdependencies with health status. Estimates from international studies of the societal costs illustrate the economic dimension.
Who Is Doing the Housework in Multicultural Britain?
Kan, Man-Yee; Laurie, Heather
2016-01-01
There is an extensive literature on the domestic division of labour within married and cohabiting couples and its relationship to gender equality within the household and the labour market. Most UK research focuses on the white majority population or is ethnicity ‘blind’, effectively ignoring potentially significant intersections between gender, ethnicity, socio-economic position and domestic labour. Quantitative empirical research on the domestic division of labour across ethnic groups has not been possible due to a lack of data that enables disaggregation by ethnic group. We address this gap using data from a nationally representative panel survey, Understanding Society, the UK Household Longitudinal Study containing sufficient sample sizes of ethnic minority groups for meaningful comparisons. We find significant variations in patterns of domestic labour by ethnic group, gender, education and employment status after controlling for individual and household characteristics. PMID:29416186
Provoost, Veerle
2015-03-01
This paper aims to provide a description of how authors publishing in medical ethics journals have made use of empirical research data in papers on the topic of gamete or embryo donation by means of references to studies conducted by others (secondary use). Rather than making a direct contribution to the theoretical methodological literature about the role empirical research data could play or should play in ethics studies, the focus is on the particular uses of these data and the problems that can be encountered with this use. In the selection of papers examined, apart from being used to describe the context, empirical evidence was mainly used to recount problems that needed solving. Few of the authors looked critically at the quality of the studies they quoted, and several instances were found of empirical data being used poorly or inappropriately. This study provides some initial baseline evidence that shows empirical data, in the form of references to studies, are sometimes being used in inappropriate ways. This suggests that medical ethicists should be more concerned about the quality of the empirical data selected, the appropriateness of the choice for a particular type of data (from a particular type of study) and the correct integration of this evidence in sound argumentation. Given that empirical data can be misused also when merely cited instead of reported, it may be worthwhile to explore good practice requirements for this type of use of empirical data in medical ethics.
Norén, Karin; Angerbjörn, Anders
2014-05-01
Many key species in northern ecosystems are characterised by high-amplitude cyclic population demography. In 1924, Charles Elton described the ecology and evolution of cyclic populations in a classic paper and, since then, a major focus has been the underlying causes of population cycles. Elton hypothesised that fluctuations reduced population genetic variation and influenced the direction of selection pressures. In concordance with Elton, present theories concern the direct consequences of population cycles for genetic structure due to the processes of genetic drift and selection, but also include feedback models of genetic composition on population dynamics. Most of these theories gained mathematical support during the 1970s and onwards, but due to methodological drawbacks, difficulties in long-term sampling and a complex interplay between microevolutionary processes, clear empirical data allowing the testing of these predictions are still scarce. Current genetic tools allow for estimates of genetic variation and identification of adaptive genomic regions, making this an ideal time to revisit this subject. Herein, we attempt to contribute towards a consensus regarding the enigma described by Elton almost 90 years ago. We present nine predictions covering the direct and genetic feedback consequences of population cycles on genetic variation and population structure, and review the empirical evidence. Generally, empirical support for the predictions was low and scattered, with obvious gaps in the understanding of basic population processes. We conclude that genetic variation in northern cyclic populations generally is high and that the geographic distribution and amount of diversity are usually suggested to be determined by various forms of context- and density-dependent dispersal exceeding the impact of genetic drift. Furthermore, we found few clear signatures of selection determining genetic composition in cyclic populations. 
Dispersal is assumed to have a strong impact on genetic structuring, and we suggest that the signatures of other microevolutionary processes, such as genetic drift and selection, are weaker and have been overshadowed by density-dependent dispersal. We emphasise that basic biological and demographic questions still need to be answered, and stress the importance of extensive sampling, the appropriate choice of tools, and the value of standardised protocols. © 2013 The Authors. Biological Reviews © 2013 Cambridge Philosophical Society.
NASA Astrophysics Data System (ADS)
Monteys, Xavier; Harris, Paul; Caloca, Silvia
2014-05-01
The coastal shallow-water zone can be a challenging and expensive environment in which to acquire bathymetry and other oceanographic data using traditional survey methods. Dangers and limited swath coverage make some of these areas unfeasible to survey using ship-borne systems, and turbidity can preclude marine LIDAR. As a result, an extensive part of the coastline worldwide remains completely unmapped. Satellite EO multispectral data, after processing, allow timely, cost-efficient and quality-controlled information to be used for planning, monitoring and regulating coastal environments. They have the potential to deliver repeated derivation of medium-resolution bathymetry, coastal water properties and seafloor characteristics in shallow waters. Over the last 30 years, satellite passive imaging methods for bathymetry extraction, implementing analytical or empirical methods, have had limited success in predicting water depths. Different wavelengths of solar light penetrate the water column to varying depths; they can provide acceptable results up to 20 m but become less accurate in deeper waters. The study area is located in the inner part of Dublin Bay, on the east coast of Ireland. The region investigated is a C-shaped inlet 10 km long and 5 km wide, with water depths ranging from 0 to 10 m. The methodology employed in this research uses a ratio of reflectances from SPOT 5 satellite bands, differing from standard linear transform algorithms. High-accuracy water depths were derived from multibeam data. The final empirical model uses spatially weighted geographical tools to retrieve predicted depths. The results of this paper confirm that SPOT satellite scenes are suitable for predicting depths using empirical models in very shallow embayments. Spatial regression models show better adjustment in the predictions than non-spatial models. The spatial regression equation used provides realistic results down to 6 m below the water surface, with reliable and error-controlled depths. Bathymetric extraction approaches involving satellite imagery are regarded as a fast, successful and economically advantageous solution for automatic water depth calculation in shallow and complex environments.
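Band-ratio bathymetry of this kind commonly follows the log-ratio formulation of Stumpf et al. (2003): depth is assumed linear in the ratio of log-transformed band reflectances, and the two coefficients are calibrated against reference depths such as multibeam soundings. A sketch with synthetic, noiseless reflectances; the band values, the linear relation, and the coefficients are fabricated for illustration.

```python
import numpy as np

def ratio_predictor(r_i, r_j, n=1000.0):
    """Stumpf-style log-ratio of two water-reflectance bands;
    the constant n keeps both logarithms positive."""
    return np.log(n * r_i) / np.log(n * r_j)

def calibrate(r_i, r_j, known_depths):
    """Least-squares fit of depth = m1 * ratio + m0 against reference
    depths (e.g. multibeam soundings)."""
    x = ratio_predictor(r_i, r_j)
    m1, m0 = np.polyfit(x, known_depths, 1)
    return m1, m0

# Synthetic scene: fixed green band, blue band constructed so the
# log-ratio is exactly linear in depth (purely illustrative)
rng = np.random.default_rng(1)
depths = rng.uniform(1.0, 6.0, 50)
r_green = np.full(50, 0.05)
ratio = 1.0 + 0.05 * depths
r_blue = np.exp(ratio * np.log(1000.0 * 0.05)) / 1000.0

m1, m0 = calibrate(r_blue, r_green, depths)
predicted = m1 * ratio_predictor(r_blue, r_green) + m0
```

A spatially weighted variant, as used in the study, would let m1 and m0 vary across the scene (e.g. via geographically weighted regression) instead of fitting one global pair.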
Oracle or Monacle: Research Concerning Attitudes Toward Feminism.
ERIC Educational Resources Information Center
Prescott, Suzanne; Schmid, Margaret
Both popular studies and more serious empirical studies of attitudes toward feminism are reviewed beginning with Clifford Kirkpatrick's early empirical work and including the more recent empirical studies completed since 1970. The review examines the contents of items used to measure feminism, and the methodology and sampling used in studies, as…
There Is Time for Calculation in Speed Chess, and Calculation Accuracy Increases With Expertise.
Chang, Yu-Hsuan A; Lane, David M
2016-01-01
The recognition-action theory of chess skill holds that expertise in chess is due primarily to the ability to recognize familiar patterns of pieces. Despite its widespread acclaim, empirical evidence for this theory is indirect. One source of indirect evidence is that there is a high correlation between speed chess and standard chess. Assuming that there is little or no time for calculation in speed chess, this high correlation implies that calculation is not the primary factor in standard chess. Two studies were conducted analyzing 100 games of speed chess. In Study 1, we examined the distributions of move times, and the key finding was that players often spent considerable time on a few moves. Moreover, stronger players were more likely than weaker players to do so. Study 2 examined skill differences in calculation by examining poor moves. The stronger players made proportionally fewer blunders (moves that a 2-ply search would have revealed to be errors). Overall, the poor moves made by the weaker players would have required a less extensive search to be revealed as poor moves than the poor moves made by the stronger players. Apparently, the stronger players are searching deeper and more accurately. These results are difficult to reconcile with the view that speed chess does not allow players time to calculate extensively and call into question the assertion that the high correlation between speed chess and standard chess supports recognition-action theory.
Empire: An Analytical Category for Educational Research
ERIC Educational Resources Information Center
Coloma, Roland Sintos
2013-01-01
In this article Roland Sintos Coloma argues for the relevance of empire as an analytical category in educational research. He points out the silence in mainstream studies of education on the subject of empire, the various interpretive approaches to deploying empire as an analytic, and the importance of indigeneity in research on empire and…
Structural aspects of Lorentz-violating quantum field theory
NASA Astrophysics Data System (ADS)
Cambiaso, M.; Lehnert, R.; Potting, R.
2018-01-01
In the last couple of decades the Standard Model Extension has emerged as a fruitful framework for analyzing the empirical and theoretical extent of the validity of cornerstones of modern particle physics, namely Special Relativity and the discrete symmetries C, P and T (or combinations of these). The Standard Model Extension allows one to contrast high-precision experimental tests with posited alterations representing minute Lorentz and/or CPT violations. To date no violation of these symmetry principles has been observed in experiments, most of which were prompted by the Standard Model Extension. From the latter, bounds on the extent of departures from Lorentz and CPT symmetries can be obtained with ever-increasing accuracy. These analyses have mostly focused on tree-level processes. In this presentation I would like to comment on structural aspects of perturbative Lorentz-violating quantum field theory. I will show that insight coming from radiative corrections demands a careful reassessment of perturbation theory. Specifically, I will argue that both the standard renormalization procedure and the Lehmann-Symanzik-Zimmermann reduction formalism need to be adapted, given that the asymptotic single-particle states can receive quantum corrections from Lorentz-violating operators that are not present in the original Lagrangian.
Matching weights to simultaneously compare three treatment groups: Comparison to three-way matching
Yoshida, Kazuki; Hernández-Díaz, Sonia; Solomon, Daniel H.; Jackson, John W.; Gagne, Joshua J.; Glynn, Robert J.; Franklin, Jessica M.
2017-01-01
BACKGROUND Propensity score matching is a commonly used tool. However, its use in settings with more than two treatment groups has been less frequent. We examined the performance of a recently developed propensity score weighting method in the three treatment group setting. METHODS The matching weight method is an extension of inverse probability of treatment weighting (IPTW) that reweights both exposed and unexposed groups to emulate a propensity score matched population. Matching weights can generalize to multiple treatment groups. The performance of matching weights in the three-group setting was compared via simulation to three-way 1:1:1 propensity score matching and IPTW. We also applied these methods to an empirical example that compared the safety of three analgesics. RESULTS Matching weights had similar bias, but better mean squared error (MSE) compared to three-way matching in all scenarios. The benefits were more pronounced in scenarios with a rare outcome, unequally sized treatment groups, or poor covariate overlap. IPTW’s performance was highly dependent on covariate overlap. In the empirical example, matching weights achieved the best balance for 24 out of 35 covariates. Hazard ratios were numerically similar to matching. However, the confidence intervals were narrower for matching weights. CONCLUSIONS Matching weights demonstrated improved performance over three-way matching in terms of MSE, particularly in simulation scenarios where finding matched subjects was difficult. Given its natural extension to settings with even more than three groups, we recommend matching weights for comparing outcomes across multiple treatment groups, particularly in settings with rare outcomes or unequal exposure distributions. PMID:28151746
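The matching-weight construction generalizes naturally to k groups: each unit is weighted by the smallest of its generalized propensity scores divided by the score of the group it actually received (this weighting is commonly attributed to Li and Greene, 2013). A minimal sketch; the propensity scores and labels below are made up for illustration.

```python
import numpy as np

def matching_weights(ps, treatment):
    """Matching weights for multiple treatment groups.

    ps: (n, k) array of generalized propensity scores (rows sum to 1).
    treatment: (n,) integer group labels in 0..k-1.
    Each unit gets weight min_k ps[i, k] / ps[i, treatment[i]],
    emulating a 1:1:...:1 propensity-score-matched population.
    """
    ps = np.asarray(ps, dtype=float)
    assigned = ps[np.arange(len(treatment)), treatment]
    return ps.min(axis=1) / assigned

# Hypothetical scores for three units, one per treatment group
ps = np.array([[0.6, 0.3, 0.1],
               [0.2, 0.5, 0.3],
               [0.1, 0.2, 0.7]])
z = np.array([0, 1, 2])
w = matching_weights(ps, z)
```

Because the numerator is the minimum score, units that any group is unlikely to receive are down-weighted everywhere, which is why the method degrades gracefully under poor covariate overlap, unlike IPTW, whose weights can explode.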
NASA Astrophysics Data System (ADS)
Lo, C.; Kuo-Chen, H.; Hsu, S.
2013-12-01
The active Taiwan orogen is situated at the tectonic convergence between the Philippine Sea plate and the Eurasian passive margin. Thick crust under the Central Range of Taiwan was demonstrated by results from the TAIGER project during 2004-2009. The results show that the deepest Moho (~60 km) is located at the eastern flank of the Central Range, while the average crustal thickness is over 50 km beneath the whole mountain range from south to north. Physically, the thickened crust provides an excess of gravitational potential energy (GPE) with respect to its vicinity, implying that the Central Range itself is in an intrinsically extensional stress environment. However, due to limited geophysical information, this phenomenon was not well evaluated and was not considered one of the important factors in the Taiwan mountain-building process. In this study, we calculate the GPE of the whole Taiwan region from recent Vp tomography via a seismic velocity-rock density empirical relationship. From the catalogue of earthquake focal mechanisms of the Broadband Array in Taiwan for Seismology (BATS), quite a number of extensional earthquakes are distributed 10-40 km deep in and around the Central Range, where the crustal potential energy is distinctly higher. Moreover, the principal axes of these extensional earthquakes are mainly normal to the large gradient of crustal ΔGPE at the edge of the Central Range. Accordingly, we conclude that the Central Range is undergoing mountain building by strong plate collision, while at the same time it bears gravitationally unstable extension due to its inherently buoyant thickened crust.
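GPE per unit area is commonly computed as the depth integral of the lithostatic overburden stress down to a common compensation depth, with density supplied here by a velocity-density relation applied to the Vp model. A numerical sketch of that column integral; the layer densities and thicknesses below are generic illustrative values, not the study's.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def column_gpe(rho, z_top, z_bottom, n=1000):
    """GPE per unit area of a rock column: the integral of the
    lithostatic stress sigma_zz(z) over depth, where sigma_zz is itself
    the integral of rho*g from the surface down to z.

    rho: callable giving density (kg/m^3) vs depth z (m, positive down).
    Returns J/m^2; simple Riemann sums, adequate for a sketch.
    """
    z = np.linspace(z_top, z_bottom, n)
    dz = z[1] - z[0]
    overburden = np.cumsum(rho(z) * G * dz)  # lithostatic stress profile
    return overburden.sum() * dz

# Hypothetical two-layer column: 50 km crust over mantle
def rho_layered(z):
    return np.where(z < 50e3, 2800.0, 3300.0)

gpe = column_gpe(rho_layered, 0.0, 100e3)
```

Mapping this quantity over a grid and taking horizontal differences gives the ΔGPE field whose gradient the abstract compares with the extensional focal-mechanism axes.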
Progressive Recombination Suppression and Differentiation in Recently Evolved Neo-sex Chromosomes
Natri, Heini M.; Shikano, Takahito; Merilä, Juha
2013-01-01
Recombination suppression leads to the structural and functional differentiation of sex chromosomes and is thus a crucial step in the process of sex chromosome evolution. Despite extensive theoretical work, the exact processes and mechanisms of recombination suppression and differentiation are not well understood. In threespine sticklebacks (Gasterosteus aculeatus), a different sex chromosome system has recently evolved by a fusion between the Y chromosome and an autosome in the Japan Sea lineage, which diverged from the ancestor of other lineages approximately 2 Ma. We investigated the evolutionary dynamics and differentiation processes of sex chromosomes based on comparative analyses of these divergent lineages using 63 microsatellite loci. Both chromosome-wide differentiation patterns and phylogenetic inferences with X and Y alleles indicated that the ancestral sex chromosomes were extensively differentiated before the divergence of these lineages. In contrast, genetic differentiation appeared to have proceeded only in a small region of the neo-sex chromosomes. The recombination maps constructed for the Japan Sea lineage indicated that recombination has been suppressed or reduced over a large region spanning the ancestral and neo-sex chromosomes. Chromosomal regions exhibiting genetic differentiation and suppressed or reduced recombination were detected continuously and sequentially in the neo-sex chromosomes, suggesting that differentiation has gradually spread from the fusion point following the extension of recombination suppression. Our study illustrates an ongoing process of sex chromosome differentiation, providing empirical support for the theoretical model postulating that recombination suppression and differentiation proceed in a gradual manner in the very early stage of sex chromosome evolution. PMID:23436913
Depression as a systemic syndrome: mapping the feedback loops of major depressive disorder.
Wittenborn, A K; Rahmandad, H; Rick, J; Hosseinichimeh, N
2016-02-01
Depression is a complex public health problem with considerable variation in treatment response. The systemic complexity of depression, or the feedback processes among diverse drivers of the disorder, contribute to the persistence of depression. This paper extends prior attempts to understand the complex causal feedback mechanisms that underlie depression by presenting the first broad boundary causal loop diagram of depression dynamics. We applied qualitative system dynamics methods to map the broad feedback mechanisms of depression. We used a structured approach to identify candidate causal mechanisms of depression in the literature. We assessed the strength of empirical support for each mechanism and prioritized those with support from validation studies. Through an iterative process, we synthesized the empirical literature and created a conceptual model of major depressive disorder. The literature review and synthesis resulted in the development of the first causal loop diagram of reinforcing feedback processes of depression. It proposes candidate drivers of illness, or inertial factors, and their temporal functioning, as well as the interactions among drivers of depression. The final causal loop diagram defines 13 key reinforcing feedback loops that involve nine candidate drivers of depression. Future research is needed to expand upon this initial model of depression dynamics. Quantitative extensions may result in a better understanding of the systemic syndrome of depression and contribute to personalized methods of evaluation, prevention and intervention. PMID:26621339
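The quantitative extensions the authors call for would turn the causal loop diagram into a simulable system. The following toy sketch illustrates what one reinforcing feedback loop of that kind looks like when simulated; the variables (mood, rumination), coefficients, and functional forms are hypothetical illustrations, not taken from the paper's model.

```python
# Toy simulation of a single reinforcing feedback loop: depressed mood
# fuels rumination, and rumination in turn deepens depressed mood.
# All names and coefficients here are hypothetical, for illustration only.

def simulate(steps=100, dt=0.1, a=0.4, b=0.5, decay=0.3, mood0=0.1):
    mood, rumination = mood0, 0.0
    history = []
    for _ in range(steps):
        d_mood = a * rumination - decay * mood   # rumination worsens mood
        d_rum = b * mood - decay * rumination    # low mood feeds rumination
        mood += dt * d_mood
        rumination += dt * d_rum
        history.append(mood)
    return history

traj = simulate()
```

With these coefficients the loop gain outweighs the decay terms, so the trajectory grows over time — the defining signature of a reinforcing loop, and the kind of behavior a quantified version of the diagram would let one test against data.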
Cao, Zheng; Bowie, James U
2014-01-01
Equilibrium H/D fractionation factors have been extensively employed to qualitatively assess hydrogen bond strengths in protein structure, enzyme active sites, and DNA. It remains unclear how fractionation factors correlate with hydrogen bond free energies, however. Here we develop an empirical relationship between fractionation factors and free energy, allowing for the simple and quantitative measurement of hydrogen bond free energies. Applying our empirical relationship to prior fractionation factor studies in proteins, we find: [1] Within the folded state, backbone hydrogen bonds are only marginally stronger on average in α-helices compared to β-sheets by ∼0.2 kcal/mol. [2] Charge-stabilized hydrogen bonds are stronger than neutral hydrogen bonds by ∼2 kcal/mol on average, and can be as strong as –7 kcal/mol. [3] Changes in a few hydrogen bonds during an enzyme catalytic cycle can stabilize an intermediate state by –4.2 kcal/mol. [4] Backbone hydrogen bonds can make a large overall contribution to the energetics of conformational changes, possibly playing an important role in directing conformational changes. [5] Backbone hydrogen bonding becomes more uniform overall upon ligand binding, which may facilitate participation of the entire protein structure in events at the active site. Our energetic scale provides a simple method for further exploration of hydrogen bond free energies. PMID:24501090
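The paper's contribution is an empirical calibration between fractionation factors and hydrogen bond free energies; that calibration is not reproduced here. As a hedged sketch of the general idea, the textbook thermodynamic form for converting an equilibrium-constant-like quantity into a free energy is:

```python
import math

R = 1.987e-3  # gas constant in kcal/(mol*K)

def fractionation_to_free_energy(phi, T=298.15):
    """Textbook conversion of an equilibrium constant-like fractionation
    factor phi into a free energy (kcal/mol): Delta G = -R*T*ln(phi).
    This is NOT the paper's empirical relationship, only the generic
    thermodynamic form such a calibration builds on."""
    return -R * T * math.log(phi)

# A fractionation factor below 1 maps to a positive free energy term:
dg = fractionation_to_free_energy(0.5)  # ~ +0.41 kcal/mol at 298 K
```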
Parameterization of water vapor using high-resolution GPS data and empirical models
NASA Astrophysics Data System (ADS)
Ningombam, Shantikumar S.; Jade, Sridevi; Shrungeshwara, T. S.
2018-03-01
The present work evaluates eleven existing empirical models for estimating Precipitable Water Vapor (PWV) over a high-altitude (4500 m amsl), cold-desert environment. These models have been tested extensively and used globally to estimate PWV at low-altitude sites (below 1000 m amsl). The moist parameters used in the models are water vapor scale height (Hc), dew point temperature (Td) and water vapor pressure (Es0). These moist parameters are derived from surface air temperature and relative humidity measured at high temporal resolution by an automated weather station. The performance of these models is examined statistically against observed high-resolution GPS (GPSPWV) data over the region (2005-2012). The correlation coefficient (R) between the observed GPSPWV and model PWV is 0.98 for daily data and varies diurnally from 0.93 to 0.97. Parameterization of the moisture parameters was studied in depth (at 2 h to monthly time scales) using GPSPWV, Td, and Es0. The slope of the linear relationship between GPSPWV and Td varies from 0.073 °C⁻¹ to 0.106 °C⁻¹ (R: 0.83 to 0.97), while that between GPSPWV and Es0 varies from 1.688 to 2.209 (R: 0.95 to 0.99) at daily, monthly and diurnal time scales. In addition, the moist parameters for the cold-desert, high-altitude environment are examined in depth at various time scales during 2005-2012.
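The linear parameterization PWV = m·Td + c that the study calibrates can be sketched with an ordinary least-squares fit. The data below are synthetic (a slope of 0.09 mm °C⁻¹, roughly mid-range of the reported 0.073-0.106 °C⁻¹), not the station observations used in the paper.

```python
import numpy as np

# Synthetic stand-in for dew point temperature (degC) and GPS-derived PWV (mm);
# the true relationship is built in so the fit can recover it.
rng = np.random.default_rng(0)
td = rng.uniform(-20, 10, 200)
pwv = 0.09 * td + 3.0 + rng.normal(0, 0.05, 200)

# Least-squares slope/intercept and correlation coefficient, as in the study's
# statistical evaluation of the PWV-Td relationship.
m, c = np.polyfit(td, pwv, 1)
r = np.corrcoef(td, pwv)[0, 1]
```

The recovered slope `m` falls near 0.09 and `r` near 1, mirroring the strong PWV-Td correlations (0.83-0.97) the paper reports at its various time scales.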
Analysis of dystonic tremor in musicians using empirical mode decomposition.
Lee, A; Schoonderwaldt, E; Chadde, M; Altenmüller, E
2015-01-01
We tested the hypotheses that tremor amplitude in musicians with task-specific dystonia is higher at the affected finger (dystonic tremor, DT) or the adjacent finger (tremor associated with dystonia, TAD) than (1) in matched fingers of healthy musicians and non-musicians, and (2) in the unaffected and non-adjacent fingers of the affected side within patients. We measured 21 patients, 21 healthy musicians and 24 non-musicians. Participants exerted a flexion-extension movement. Instantaneous frequency and amplitude values were obtained with empirical mode decomposition and a Hilbert transform, allowing comparison of tremor amplitudes throughout the movement at various frequency ranges. We did not find a significant difference in tremor amplitude between patients and controls for either DT or TAD. Nor did tremor amplitude differ in the within-patient comparisons. Both hypotheses were rejected, and apparently neither DT nor TAD occurs in musician's dystonia of the fingers. This is the first study assessing DT and TAD in musician's dystonia. Our finding suggests that even though musician's dystonia is an excellent model for malplasticity due to excessive practice, it does not seem to provide a good model for DT. Rather, it seems that musician's dystonia may manifest itself either as dystonic cramping without tremor or as task-specific tremor without overt dystonic cramping. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
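The study's analysis pipeline pairs empirical mode decomposition with a Hilbert transform to obtain instantaneous amplitude and frequency. The EMD step is omitted in this numpy-only sketch; the analytic-signal (Hilbert) step is applied to one synthetic oscillatory mode (an 8 Hz stand-in for an extracted IMF), not to recorded movement data.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (the standard Hilbert-transform construction)."""
    n = len(x)
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h)

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
imf = 0.5 * np.sin(2 * np.pi * 8 * t)         # synthetic 8 Hz mode

a = analytic_signal(imf)
inst_amplitude = np.abs(a)                    # instantaneous amplitude
inst_phase = np.unwrap(np.angle(a))
inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)  # instantaneous frequency, Hz
```

For this pure tone the instantaneous amplitude stays near 0.5 and the instantaneous frequency near 8 Hz, which is exactly the per-sample amplitude readout the study compares across fingers and groups.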
Wolfe, C R
2001-02-01
Analogy and metaphor are figurative forms of communication that help people integrate new information with prior knowledge to facilitate comprehension and appropriate inferences. The novelty and versatility of the Web place cognitive burdens on learners that can be overcome through the use of analogies and metaphors. This paper explores three uses of figurative communication as design elements in Web-based learning environments, and provides empirical illustrations of each. First, extended analogies can be used as the basis of cover stories that create an analogy between the learner's position and a hypothetical situation. The Dragonfly Web pages make extensive use of analogous cover stories in the design of interactive decision-making games. Feedback from visitors, patterns of usage, and external reviews provide evidence of effectiveness. A second approach is visual analogies based on the principles of ecological psychology. An empirical example suggests that visual analogies are most effective when there is a one-to-one correspondence between the base and visual target analogs. The use of learner-generated analogies is a third approach. Data from an offline study with undergraduate science students are presented indicating that generating analogies is associated with significant improvements in the ability to place events in natural history on a time line. It is concluded that cyberspace itself might form the basis of the next guiding metaphor of mind.
Vu, Duy; Lomi, Alessandro; Mascia, Daniele; Pallotti, Francesca
2017-06-30
The main objective of this paper is to introduce and illustrate relational event models, a new class of statistical models for the analysis of time-stamped data with complex temporal and relational dependencies. We outline the main differences between recently proposed relational event models and more conventional network models based on the graph-theoretic formalism typically adopted in empirical studies of social networks. Our main contribution involves the definition and implementation of a marked point process extension of currently available models. According to this approach, the sequence of events of interest is decomposed into two components: (a) event time and (b) event destination. This decomposition transforms the problem of selection of event destination in relational event models into a conditional multinomial logistic regression problem. The main advantages of this formulation are the possibility of controlling for the effect of event-specific data and a significant reduction in the estimation time of currently available relational event models. We demonstrate the empirical value of the model in an analysis of interhospital patient transfers within a regional community of health care organizations. We conclude with a discussion of how the models we presented help to overcome some of the limitations of statistical models for networks that are currently available. Copyright © 2017 John Wiley & Sons, Ltd.
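The decomposition described above reduces destination choice to a conditional multinomial logit over the candidate receivers of each event. The sketch below simulates that choice problem and recovers the coefficients by gradient ascent on the conditional log-likelihood; the data, features, and coefficients are simulated, not the hospital transfer data analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_events, n_cand, n_feat = 2000, 5, 2

# Covariates of each candidate destination at each event (e.g. capacity,
# past ties); names and values are illustrative only.
X = rng.uniform(-1, 1, (n_events, n_cand, n_feat))
beta_true = np.array([1.0, -0.5])

# Simulate destination choices: P(dest = k) is a softmax over utilities.
util = X @ beta_true
p = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
chosen = np.array([rng.choice(n_cand, p=pi) for pi in p])

# Fit by gradient ascent on the conditional multinomial log-likelihood:
# gradient = observed covariates of the chosen destination minus their
# model-expected value over candidates.
beta = np.zeros(n_feat)
for _ in range(300):
    u = X @ beta
    prob = np.exp(u) / np.exp(u).sum(axis=1, keepdims=True)
    grad = (X[np.arange(n_events), chosen]
            - (prob[..., None] * X).sum(axis=1)).mean(axis=0)
    beta += 2.0 * grad
```

The estimated `beta` lands close to `beta_true`, illustrating why the reformulation is attractive: the destination sub-problem inherits the well-understood machinery of multinomial logistic regression.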
Cognitive behavioral treatment outcomes in adolescent ADHD.
Antshel, Kevin M; Faraone, Stephen V; Gordon, Michael
2014-08-01
To assess the efficacy of cognitive behavioral therapy (CBT) for managing adolescent ADHD. A total of 68 adolescents with ADHD and associated psychiatric comorbidities completed a manualized CBT treatment protocol. The intervention used in the study was a downward extension of the Safren et al. program for adults with ADHD who have symptoms unresolved by medication. Outcome variables consisted of narrow-band (ADHD) and broadband (e.g., mood, anxiety, conduct) symptom measures (Behavior Assessment System for Children-2nd edition and ADHD-Rating Scales) as well as functioning measures (parent/teacher ratings and several ecologically valid real-world measures). Treatment effects emerged on medication dosage, parent rating of pharmacotherapy adherence, adolescent self-report of personal adjustment (e.g., self-esteem), parent and teacher ratings of inattentive symptoms, school attendance, school tardiness, parent report of peer, family and academic functioning, and teacher report of adolescent relationship with teacher, academic progress, and adolescent self-esteem. Adolescents with ADHD and comorbid oppositional defiant disorder were rated by parents and teachers as benefiting less from the CBT intervention. Adolescents with ADHD and comorbid anxiety/depression were rated by parents and teachers as benefiting more from the CBT intervention. A downward extension of an empirically validated adult ADHD CBT protocol can benefit some adolescents with ADHD. © 2012 SAGE Publications.
ERIC Educational Resources Information Center
Hallberg, Kelly
2013-01-01
This dissertation is a collection of three papers that employ empirical within study comparisons (WSCs) to identify conditions that support causal inference in observational studies. WSC studies empirically estimate the extent to which a given observational study reproduces the result of a randomized clinical trial (RCT) when both share the same…
NASA Astrophysics Data System (ADS)
Renneke, Richard M.
Field Reversed Configuration plasmas (FRCs) have been created in the Field Reversed Experiment-Liner (FRX-L) with density 2-6 × 10²² m⁻³, total temperature 300-400 eV, and lifetime on the order of 10 μs. This thesis investigates global energy balance in high-density FRCs for the first time. The zero-dimensional approach to global energy balance developed by Rej and Tuszewski (Phys. Fluids 27, p. 1514, 1984) is utilized here. From the shots analyzed with this method, it is clear that energy loss from these FRCs is dominated by particle and thermal (collisional) losses. The fraction of radiative losses in the total loss is an order of magnitude lower than in previous FRC experiments. This is reasonable at high density based on empirical scaling from the extensive database of tokamak plasma experiments. Ohmic dissipation, which heats the plasma when trapped magnetic field decays to create an electric field, is an important source of heating for the plasma. Ohmic heating shows a correlation with increasing effective Lundquist number (S*). Empirical evidence suggests S* can be increased by lowering the density, which does not serve the goals of FRX-L. A better way to improve ohmic heating is to trap more poloidal flux. This dissertation shows that FRX-L follows a semi-empirical scaling law which predicts plasma temperature gains for larger poloidal flux. Flux (τφ) and particle (τN) lifetimes for these FRCs were typically shorter than 10 μs. Approximately 1/3 of the particle and flux lifetimes for these FRCs did not follow the usual τN ≈ τφ scaling of low-density FRCs, but instead showed τN ≥ τφ. However, scatter in the data indicates that the average performance of FRCs on FRX-L yields the typical (for FRCs) relationship τN ≈ τφ. Fusion energy gain Q was extrapolated for the shots analyzed in this study using a zero-dimensional scaling code with liner effects.
The predicted Q is below the desired value of 0.1 (Schoenberg et al., LA-UR-98-2413, 1998). Reaching Q = 0.1 is predicted to require a larger plasma pressure than shown in the present data. This can be accomplished by increasing the plasma density (through larger fill pressure) and maintaining temperature with increased flux trapping. Larger Q and other benefits could be realized by raising the plasma pressure in future FRX-L shots. The innovation in this work is the author's extension of the global power balance model to include a time history of the plasma discharge. This extension required rigorous checking of the power balance model using internal density profiles provided by the multichord interferometer. Typical magnitudes of the parameters calculated by the model are ~500 MW total loss power, ~100 MW ohmic heating power, and ~200 MW total compression (input) power. Radiation was never measured above 5 MW, which is why it was deemed insignificant. It should be noted that these numbers are merely estimates and vary widely between shots.
Is Empiricism Empirically False? Lessons from Early Nervous Systems.
Miłkowski, Marcin
2017-01-01
Recent work on the skin-brain thesis (de Wiljes et al. 2015; Keijzer 2015; Keijzer et al. 2013) suggests the possibility of empirical evidence that empiricism is false. It implies that early animals need no traditional sensory receptors to be engaged in cognitive activity. The neural structure required to coordinate extensive sheets of contractile tissue for motility provides the starting point for a new multicellular organized form of sensing. Moving a body by muscle contraction provides the basis for a multicellular organization that is sensitive to external surface structure at the scale of the animal body. In other words, the nervous system first evolved for action, not for receiving sensory input. Thus, sensory input is not required for minimal cognition; only action is. The whole body of an organism, in particular its highly specific animal sensorimotor organization, reflects the bodily and environmental spatiotemporal structure. The skin-brain thesis suggests that, in contrast to empiricist claims that cognition is constituted by sensory systems, cognition may also be constituted by action-oriented feedback mechanisms. Instead of positing the reflex arc as the elementary building block of nervous systems, it proposes that endogenous motor activity is crucial for cognitive processes. In the paper, I discuss whether the skin-brain thesis and its supporting evidence can really be used to overthrow the main tenet of empiricism empirically, by pointing to cognizing agents that lack any sensory apparatus.
Hemakom, Apit; Powezka, Katarzyna; Goverdovsky, Valentin; Jaffer, Usman; Mandic, Danilo P
2017-12-01
A highly localized data-association measure, termed intrinsic synchrosqueezing transform (ISC), is proposed for the analysis of coupled nonlinear and non-stationary multivariate signals. This is achieved based on a combination of noise-assisted multivariate empirical mode decomposition and short-time Fourier transform-based univariate and multivariate synchrosqueezing transforms. It is shown that the ISC outperforms six other combinations of algorithms in estimating degrees of synchrony in synthetic linear and nonlinear bivariate signals. Its advantage is further illustrated in the precise identification of the synchronized respiratory and heart rate variability frequencies among a subset of bass singers of a professional choir, where it distinctly exhibits better performance than the continuous wavelet transform-based ISC. We also introduce an extension to the intrinsic phase synchrony (IPS) measure, referred to as nested intrinsic phase synchrony (N-IPS), for the empirical quantification of physically meaningful and straightforward-to-interpret trends in phase synchrony. The N-IPS is employed to reveal physically meaningful variations in the levels of cooperation in choir singing and performing a surgical procedure. Both the proposed techniques successfully reveal degrees of synchronization of the physiological signals in two different aspects: (i) precise localization of synchrony in time and frequency (ISC), and (ii) large-scale analysis for the empirical quantification of physically meaningful trends in synchrony (N-IPS).
Introduction: social complexity and the bow in the prehistoric North American record.
Bingham, Paul M; Souza, Joanne; Blitz, John H
2013-01-01
This Special Issue of Evolutionary Anthropology grew out of a symposium at the 2012 Society for American Archaeology (SAA) meeting in Memphis, Tennessee (April 18-22). The goal of the symposium was to explore what we will argue is one of the most important and promising opportunities in the global archeological enterprise. In late prehistoric North America, the initial rise of cultures of strikingly enhanced complexity and the local introduction of a novel weapon technology, the bow, apparently correlate intimately in a diverse set of independent cases across the continent, as originally pointed out by Blitz. If this empirical relationship ultimately proves robust, it gives us an unprecedented opportunity to evaluate hypotheses for the causal processes producing social complexity and, by extension, to assess the possibility of a universal theory of history. The rise of comparably complex cultures was much more recent in North America than it was elsewhere and the resulting fresher archeological record is relatively well explored. These and other features make prehistoric North America a unique empirical environment. Together, the symposium and this issue have brought together outstanding investigators with both empirical and theoretical expertise. The strong cross-feeding and extended interactions between these investigators have given us all the opportunity to advance the promising exploration of what we call the North American Neolithic transitions. Our goal in this paper is to contextualize this issue. Copyright © 2013 Wiley Periodicals, Inc.
Semi-empirical and phenomenological instrument functions for the scanning tunneling microscope
NASA Astrophysics Data System (ADS)
Feuchtwang, T. E.; Cutler, P. H.; Notea, A.
1988-08-01
Recent progress in the development of a convenient algorithm for determining a quantitative local density of states (LDOS) of the sample, from data measured in the STM, is reviewed. It is argued that the sample LDOS strikes a good balance between the information content of a surface characteristic and the effort required to obtain it experimentally. Hence, procedures that determine the sample LDOS as directly, and in as tip-model-independent a manner, as possible are emphasized. The solution of the STM's "inverse" problem in terms of novel versions of the instrument (or Green) function technique is considered in preference to the well-known, more direct solutions. Two types of instrument functions are considered: approximations of the basic tip-instrument function obtained from the transfer Hamiltonian theory of STM-STS, and phenomenological instrument functions devised as a systematic scheme for semi-empirical first-order corrections of "ideal" models. The instrument function, in this case, describes the corrections as the response of an independent component of the measuring apparatus inserted between the "ideal" instrument and the measured data. This linear response theory of measurement is reviewed and applied. A procedure for estimating the consistency of the model and the systematic errors due to the use of an approximate instrument function is presented. The independence of the instrument function techniques from explicit microscopic models of the tip is noted. The need for semi-empirical, as opposed to strictly empirical or analytical, determination of the instrument function is discussed. The extension of the theory to the scanning tunneling spectrometer is noted, as well as its use in a theory of resolution.
Empirical Scientific Research and Legal Studies Research--A Missing Link
ERIC Educational Resources Information Center
Landry, Robert J., III
2016-01-01
This article begins with an overview of what is meant by empirical scientific research in the context of legal studies. With that backdrop, the argument is presented that without engaging in normative, theoretical, and doctrinal research in tandem with empirical scientific research, the role of legal studies scholarship in making meaningful…
On the prediction of auto-rotational characteristics of light airplane fuselages
NASA Technical Reports Server (NTRS)
Pamadi, B. N.; Taylor, L. W., Jr.
1984-01-01
A semi-empirical theory is presented for the estimation of aerodynamic forces and moments acting on a steadily rotating (spinning) airplane fuselage, with a particular emphasis on the prediction of its auto-rotational behavior. This approach is based on an extension of the available analytical methods for high angle of attack and side-slip and then coupling this procedure with strip theory for application to a rotating airplane fuselage. The analysis is applied to the fuselage of a light general aviation airplane and the results are shown to be in fair agreement with experimental data.
Utilization of humus-rich forest soil (mull) in geochemical exploration for gold
Curtin, Gary C.; Lakin, H.W.; Neuerburg, G.J.; Hubert, A.E.
1968-01-01
Distribution of gold in humus-rich forest soil (mull) reflects the known distribution of gold deposits in bedrock in the Empire district, Colorado. Gold from the bedrock is accumulated by pine and aspen trees and is concentrated in the mull by the decay of organic litter from the trees. Anomalies in mull which do not coincide with known gold deposits merit further exploration. The gold anomalies in soil (6- to 12-inch depth) and in float pebbles and cobbles poorly reflect the known distribution of gold deposits in bedrock beneath the extensive cover of colluvium and glacial drift.
Creep, Fatigue and Environmental Interactions and Their Effect on Crack Growth in Superalloys
NASA Technical Reports Server (NTRS)
Telesman, J.; Gabb, T. P.; Ghosn, L. J.; Smith, T.
2017-01-01
Complex interactions of creep, fatigue and environment control dwell fatigue crack growth (DFCG) in superalloys. Crack tip stress relaxation during dwells significantly changes the crack driving force and influences DFCG. The linear elastic fracture mechanics parameter, Kmax, is unsuitable for correlating DFCG behavior due to extensive visco-plastic deformation. The magnitude of the remaining crack tip axial stresses controls DFCG resistance, owing to the brittle, intergranular nature of the crack growth process. A new empirical parameter, Ksrf, is proposed which incorporates the visco-plastic evolution of the magnitude of the remaining crack tip stresses. Previous work was performed at 704 °C; the present work extends it to 760 °C.
Roughness influence on human blood drop spreading and splashing
NASA Astrophysics Data System (ADS)
Smith, Fiona; Buntsma, Naomi; Brutin, David
2017-11-01
The impact behaviour of complex fluid droplets is a topic that has been extensively studied but remains much debated. The Bloodstain Pattern Analysis (BPA) community encounters this scientific problem in daily practical cases, since bloodstains are used as evidence in crime scene reconstruction. We aim to provide fundamental explanations in the study of blood drip stains by investigating the influence of surface roughness and wettability on the splashing limit of droplets of blood, a non-Newtonian colloidal fluid. Droplets of blood impacting different surfaces perpendicularly at different velocities were recorded. The recordings, as well as the surface characteristics, were analysed in order to find an empirical correlation, since we found that roughness, compared to wettability, plays the major role in the threshold of the splashing/non-splashing behaviour of blood. Moreover, it appears that roughness alters the deformation of the drip stains. These observations are key to relating features of drip stains to the impacting conditions, which would address some forensic questions.
Collapse of Corroded Pipelines under Combined Tension and External Pressure
Ye, Hao; Yan, Sunting; Jin, Zhijiang
2016-01-01
In this work, the collapse of corroded pipelines under combined external pressure and tension is investigated through a numerical method. Axially uniform corrosion with symmetric imperfections is considered first. After verification against existing experimental results, the finite element model is used to study the effect of tension on collapse pressure. An extensive parametric study is carried out using a Python script and a FORTRAN subroutine to investigate the influence of geometric parameters on the collapse behavior under combined loads. The results are used to develop an empirical equation for estimating the collapse pressure under tension. In addition, the effects of loading path, initial imperfection length, yielding anisotropy and corrosion defect length on the collapse behavior are also investigated. It is found that tension has a significant influence on the collapse pressure of corroded pipelines. Loading path and anisotropic yielding are also important factors affecting the collapse behavior. For pipelines with relatively long corrosion defects, axially uniform corrosion models could be used to estimate the collapse pressure. PMID:27111544
Garner, Bryan R.; Funk, Rodney R.; Hunter, Brooke D.
2012-01-01
The turnover of substance use disorder (SUD) treatment staff has been assumed to adversely impact treatment effectiveness, yet only limited research has empirically examined this assumption. Representing an extension of prior organizational-level analyses of the impact of staff turnover on client outcomes, this study examined the impact of SUD clinician turnover on adolescent treatment outcomes from a client perspective. Multilevel regression analysis revealed that, relative to adolescents who did not experience clinician turnover, adolescents who experienced both direct and indirect clinician turnover reported a significantly higher percentage of days using alcohol or drugs at 6-month follow-up. However, clinician turnover was not found to have significant associations (negative or positive) with the other five treatment outcomes examined (e.g., substance-related problems, involvement in illegal activity). Thus, consistent with our prior findings, the current study provides additional evidence that turnover of SUD clinicians is not necessarily associated with adverse treatment outcomes. PMID:23083980
Do adolescent delinquency and problem drinking share psychosocial risk factors? A literature review.
Curcio, Angela L; Mak, Anita S; George, Amanda M
2013-04-01
Despite the prevalence and damaging effects of adolescent problem drinking, relative to delinquency, far less research has focused on drinking using an integrated theoretical approach. The aim of the current research was to review existing literature on psychosocial risk factors for delinquency and problem drinking, and explore whether integrating elements of social learning theory with an established psychosocial control theory of delinquency could explain adolescent problem drinking. We reviewed 71 studies published post-1990 with particular focus on articles that empirically researched risk factors for adolescent problem drinking and delinquency in separate and concurrent studies and meta-analytic reviews. We found shared risk factors for adolescent delinquency and problem drinking that are encompassed by an extension of psychosocial control theory. The potential of an extended psychosocial control theory providing a parsimonious theoretical approach to explaining delinquency, problem drinking and other adolescent problem behaviours, along with suggestions for future investigations, is discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Beaumont, Samuel; Otero, Toribio F.
2018-07-01
Polypyrrole film electrodes are constituted by multielectronic electrochemical molecular machines (every polymeric molecule), counterions and water, mimicking the intracellular matrix of muscular cells. The influence of the electrolyte concentration on the reversible oxidation/reduction of polypyrrole films was studied in NaCl aqueous solutions by consecutive square potential waves. The consumed redox charge and the consumed electrical energy change as a function of the concentration. That means that the extension (the consumed charge) of the reaction, which involves conformational, or allosteric, movements of the reacting polymeric chains (molecular machines), responds to (senses) the chemical energy of the reaction ambient. A theoretical description of the attained empirical results is presented, yielding the sensing equations and the concomitant sensitivities. Those results could indicate the origin and nature of the neural signals sent to the brain from biological haptic muscles, which work by cooperative actuation of the actin-myosin molecular machines driven by chemical reactions while sensing, simultaneously, the fatigue state of the muscle.
NASA Technical Reports Server (NTRS)
Lyon, R. J. P.; Prelat, A. E.; Kirk, R. (Principal Investigator)
1981-01-01
An attempt was made to match HCMM- and U2HCMR-derived temperature data over two test sites of very local size to similar data collected in the field at nearly the same times. Results indicate that HCMM investigations using resolution cells of 500 m or so are best conducted with areally extensive sites, rather than point observations. The excellent-quality day-VIS imagery is particularly useful for lineament studies, as is the DELTA-T imagery. Attempts to register the ground-observed temperatures (even for 0.5 sq mile targets) were unsuccessful due to excessive pixel-to-pixel noise in the HCMM data. Several computer models were explored and related to thermal parameter value changes with observed data. Unless quite complex models are used, with many parameters that can be observed (perhaps not even measured) only under remote sensing conditions (e.g., roughness, wind shear, etc.), the model outputs do not match the observed data. Empirical relationships may be the most readily studied.
Description and Evaluation of a Measurement Technique for Assessment of Performing Gender
Harris, Kathleen Mullan; Halpern, Carolyn Tucker
2016-01-01
The influence of masculinity and femininity on behaviors and outcomes has been extensively studied in social science research using various measurement strategies. In the present paper, we describe and evaluate a measurement technique that uses existing survey items to capture the extent to which an individual behaves similarly to their same-gender peers. We use data from the first four waves of The National Longitudinal Study of Adolescent to Adult Health (Add Health), a nationally representative sample of adolescents (age 12–18) in the United States who were re-interviewed at ages 13–19, 18–26, and 24–32. We estimate split-half reliability and provide evidence that supports the validity of this measurement technique. We demonstrate that the resulting measure does not perform as a trait measure and is associated with involvement in violent fights, a pattern consistent with theory and empirical findings. This measurement technique represents a novel approach for gender researchers with the potential for expanding our current knowledge base. PMID:28630528
Identifying habitat sinks: A case study of Cooper's hawks in an urban environment
Mannan, R.W.; Steidl, R.J.; Boal, C.W.
2008-01-01
We studied a population of Cooper's hawks (Accipiter cooperii) in Tucson, Arizona from 1994 to 2005. High rates of mortality of nestlings from an urban-related disease prompted speculation that the area represented an ecological trap and habitat sink for Cooper's hawks. In this paper, we used estimates of survival and productivity from 11 years of monitoring to develop an estimate of the rate of population change, λ, for Cooper's hawks in the area. We used a Cormack-Jolly-Seber approach to estimate survival of breeding hawks, and a stochastic, stage-based matrix to estimate λ. Despite the urban-related disease, the estimate of λ indicated that the area does not function as a habitat sink for Cooper's hawks (λ = 1.11 ± 0.047; P = 0.0073 for the null hypothesis of λ ≤ 1). Because data required to reliably identify habitat sinks are extensive and difficult to acquire, we suggest that the concept of habitat sinks be applied cautiously until substantiated with reliable empirical evidence. © 2008 Springer Science+Business Media, LLC.
NASA Astrophysics Data System (ADS)
Hoyt, Taylor J.; Freedman, Wendy L.; Madore, Barry F.; Seibert, Mark; Beaton, Rachael L.; Hatt, Dylan; Jang, In Sung; Lee, Myung Gyoon; Monson, Andrew J.; Rich, Jeffrey A.
2018-05-01
We present a new empirical JHK absolute calibration of the tip of the red giant branch (TRGB) in the Large Magellanic Cloud (LMC). We use published data from the extensive Near-Infrared Synoptic Survey containing 3.5 million stars, 65,000 of which are red giants that fall within one magnitude of the TRGB. Adopting the TRGB slopes from a companion study of the isolated dwarf galaxy IC 1613, as well as an LMC distance modulus of μ0 = 18.49 mag from (geometric) detached eclipsing binaries, we derive absolute JHK zero points for the near-infrared TRGB. For a comparison with measurements in the bar alone, we apply the calibrated JHK TRGB to a 500 deg² area of the 2MASS survey. The TRGB reveals the 3D structure of the LMC with a tilt in the direction perpendicular to the major axis of the bar, in agreement with previous studies.
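The calibration step described above reduces to simple distance-modulus arithmetic: an absolute TRGB magnitude is the apparent TRGB magnitude minus the adopted LMC distance modulus. A minimal sketch follows; the apparent J-band magnitude used here is a made-up placeholder, not a value from the paper.

```python
# Distance-modulus arithmetic: M = m - mu. Only mu_LMC = 18.49 mag comes
# from the abstract; the apparent TRGB magnitude below is hypothetical.
mu_LMC = 18.49                 # adopted LMC distance modulus (mag)
m_J_trgb = 13.30               # hypothetical apparent J-band TRGB magnitude (mag)
M_J_trgb = m_J_trgb - mu_LMC   # resulting absolute zero point (mag)
print(f"M_J(TRGB) = {M_J_trgb:.2f} mag")  # M_J(TRGB) = -5.19 mag
```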
Performance bounds for modal analysis using sparse linear arrays
NASA Astrophysics Data System (ADS)
Li, Yuanxin; Pezeshki, Ali; Scharf, Louis L.; Chi, Yuejie
2017-05-01
We study the performance of modal analysis using sparse linear arrays (SLAs), such as nested and co-prime arrays, in both first-order and second-order measurement models. We treat SLAs as constructed from a subset of sensors in a dense uniform linear array (ULA), and characterize the performance loss of SLAs with respect to the ULA due to using far fewer sensors. In particular, we claim that, given the same aperture, in order to achieve comparable performance in terms of the Cramér-Rao bound (CRB) for modal analysis, SLAs require more snapshots: approximately the number of snapshots used by the ULA multiplied by the compression ratio in the number of sensors. This is shown analytically for the case with one undamped mode, as well as empirically via extensive numerical experiments for more complex scenarios. Moreover, the misspecified CRB proposed by Richmond and Horowitz is also studied, where SLAs suffer a greater performance loss than their ULA counterpart.
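The claimed snapshot scaling can be sketched as a one-line rule of thumb. This is only an illustration of the relationship stated in the abstract, not the authors' analysis; the function name and the example numbers are mine.

```python
# Sketch of the scaling claimed in the abstract: for comparable CRB at the
# same aperture, an SLA needs roughly (N_ula / N_sla) times as many
# snapshots as the full ULA. All names and values are illustrative.

def required_sla_snapshots(ula_snapshots: int, n_ula: int, n_sla: int) -> int:
    """Estimate the snapshots an SLA needs to match the ULA's CRB."""
    compression_ratio = n_ula / n_sla  # how many fewer sensors the SLA uses
    return round(ula_snapshots * compression_ratio)

# Example: a 16-sensor ULA observed over 100 snapshots vs. a 6-sensor SLA
print(required_sla_snapshots(100, n_ula=16, n_sla=6))  # 267
```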
Transposed-letter priming effects in reading aloud words and nonwords.
Mousikou, Petroula; Kinoshita, Sachiko; Wu, Simon; Norris, Dennis
2015-10-01
A masked nonword prime generated by transposing adjacent inner letters in a word (e.g., jugde) facilitates the recognition of the target word (JUDGE) more than a prime in which the relevant letters are replaced by different letters (e.g., junpe). This transposed-letter (TL) priming effect has been widely interpreted as evidence that the coding of letter position is flexible, rather than precise. Although the TL priming effect has been extensively investigated in the domain of visual word recognition using the lexical decision task, very few studies have investigated this empirical phenomenon in reading aloud. In the present study, we investigated TL priming effects in reading aloud words and nonwords and found that these effects are of equal magnitude for the two types of items. We take this result as support for the view that the TL priming effect arises from noisy perception of letter order within the prime prior to the mapping of orthography to phonology.
Ogle, Christin M; Rubin, David C; Siegler, Ilene C
2016-03-01
Using data from a longitudinal study of community-dwelling older adults, we analyzed the most extensive set of known correlates of PTSD symptoms obtained from a single sample to examine the measures' independent and combined utility in accounting for PTSD symptom severity. Fifteen measures identified as PTSD risk factors in published meta-analyses and 12 theoretically and empirically supported individual difference and health-related measures were included. Individual difference measures assessed after the trauma, including insecure attachment and factors related to the current trauma memory, such as self-rated severity, event centrality, frequency of involuntary recall, and physical reactions to the memory, accounted for symptom severity better than measures of pre-trauma factors. In an analysis restricted to prospective measures assessed before the trauma, the total variance explained decreased from 56% to 16%. Results support a model of PTSD in which characteristics of the current trauma memory promote the development and maintenance of PTSD symptoms.
Using Predictability for Lexical Segmentation.
Çöltekin, Çağrı
2017-09-01
This study investigates a strategy based on predictability of consecutive sub-lexical units in learning to segment a continuous speech stream into lexical units using computational modeling and simulations. Lexical segmentation is one of the early challenges during language acquisition, and it has been studied extensively through psycholinguistic experiments as well as computational methods. However, despite strong empirical evidence, the explicit use of predictability of basic sub-lexical units in models of segmentation is underexplored. This paper presents an incremental computational model of lexical segmentation for exploring the usefulness of predictability for lexical segmentation. We show that the predictability cue is a strong cue for segmentation. Contrary to earlier reports in the literature, the strategy yields state-of-the-art segmentation performance with an incremental computational model that uses only this particular cue in a cognitively plausible setting. The paper also reports an in-depth analysis of the model, investigating the conditions affecting the usefulness of the strategy. Copyright © 2016 Cognitive Science Society, Inc.
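As a toy illustration of the general predictability cue (not the paper's incremental model, which is not specified in the abstract), a batch segmenter can place a word boundary wherever the forward transitional probability between consecutive sub-lexical units drops below a threshold. All names, the threshold, and the toy data are illustrative.

```python
from collections import Counter

def segment_by_predictability(utterances, threshold=0.6):
    """Insert a boundary wherever P(next unit | current unit), estimated
    from unit bigram counts, falls below `threshold` (low predictability
    suggests a word edge). `utterances` is a list of lists of units."""
    pair_counts, unit_counts = Counter(), Counter()
    for utt in utterances:
        for a, b in zip(utt, utt[1:]):
            pair_counts[(a, b)] += 1
            unit_counts[a] += 1
    words = []
    for utt in utterances:
        word = [utt[0]]
        for a, b in zip(utt, utt[1:]):
            tp = pair_counts[(a, b)] / unit_counts[a]  # forward transitional prob.
            if tp < threshold:          # unpredictable transition -> boundary
                words.append(word)
                word = []
            word.append(b)
        words.append(word)
    return words

# Toy syllable streams: "ba-by" is internally predictable, its right edge is not
utts = [["ba", "by", "mi", "lk"], ["ba", "by", "do", "g"]]
print(segment_by_predictability(utts))
# [['ba', 'by'], ['mi', 'lk'], ['ba', 'by'], ['do', 'g']]
```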
The role of post-adoption phase trust in B2C e-service loyalty: towards a more comprehensive picture
NASA Astrophysics Data System (ADS)
Mäntymäki, Matti
Despite the extensive interest in trust within the information systems (IS) and e-commerce disciplines, only a few studies examine trust in the post-adoption phase of the customer relationship. Not only gaining new customers by increasing adoption, but also keeping existing ones loyal, is widely considered important for e-business success. This paper scrutinizes the role of trust in customer loyalty, focusing on B2C e-services, by conducting a three-sectional literature review spanning IS, e-commerce and marketing. The key findings of this study are: (1) literature discussing the role of trust after the adoption phase is relatively scarce and fragmented; (2) in empirical testing, trust is mostly treated as a monolithic construct; (3) quantitative research methods dominate the field; and (4) since trust may play a role throughout the relationship, dynamic approaches to scrutinizing trust would also be appropriate. Implications of these findings are discussed and ideas for further research suggested.
Winman, Thomas; Rystedt, Hans
2011-03-01
The implementation of generic models for organizing information in complex institutions like those in healthcare creates a gap between standardization and the need for locally relevant knowledge. The present study addresses how this gap can be bridged by focusing on the practical work of healthcare staff in transforming information in EPRs into knowledge that is useful for everyday work. Video recording of shift handovers on a rehabilitation ward serves as the empirical case. The results show how extensive selections and reorganizations of information in EPRs are carried out in order to transform information into professionally relevant accounts. We argue that knowledge about the institutional obligations and professional ways of construing information are fundamental for these transitions. The findings point to the need to consider the role of professional knowledge inherent in unpacking information in efforts to develop information systems intended to bridge between institutional and professional boundaries in healthcare. © The Author(s) 2011.
Li, Yang; Fu, Hua; Zhao, Fang; Luo, Jianfeng; Kawachi, Ichiro
2013-09-01
The effect of individual educational attainment on health has been extensively documented in western countries, whereas empirical evidence of education spillover effects in marital dyads is scarce and inconsistent. A total of 2764 individuals (or 1382 marital dyads) were surveyed in the Shanghai Healthy City Project 2008. Logistic regression models were used for analysis, and all analyses were stratified by gender. Significant protective associations were observed in univariate models linking general health status to the individual's own educational attainment and to their partner's educational level. After controlling for presence of chronic conditions, lifestyle factors, and social support, these associations were attenuated. The authors found a gender difference in the association of spouse's educational attainment with self-rated health. The influence of education on health may be partly mediated by lifestyle and other factors.
Health consumer groups in the UK: a new social movement?
Allsop, Judith; Jones, Kathryn; Baggott, Rob
2004-09-01
This paper argues that a health consumer movement has developed in the United Kingdom over the last decade. Drawing on two empirical studies of groups that promote and/or represent the interests of patients, users and carers, it argues that groups formed by people with personal experience of a condition are now more widespread. Feelings of pain and loss can lead to the identification of others in a similar position, and to the formation of groups and action in the political sphere. Research shows that groups share a common discourse and follow similar participative practices, and there is extensive networking. Informal and formal alliances have formed to pursue joint action and indicate a wider health consumer movement. As governments have also increased the opportunities for participation, this has the potential for patients and carers to shape services in ways more responsive to their needs.
Reciprocity of weighted networks
Squartini, Tiziano; Picciolo, Francesco; Ruzzenenti, Franco; Garlaschelli, Diego
2013-01-01
In directed networks, reciprocal links have dramatic effects on dynamical processes, network growth, and higher-order structures such as motifs and communities. While the reciprocity of binary networks has been extensively studied, that of weighted networks is still poorly understood, implying an ever-increasing gap between the availability of weighted network data and our understanding of their dyadic properties. Here we introduce a general approach to the reciprocity of weighted networks, and define quantities and null models that consistently capture empirical reciprocity patterns at different structural levels. We show that, counter-intuitively, previous reciprocity measures based on the similarity of mutual weights are uninformative. By contrast, our measures allow us to consistently classify different weighted networks according to their reciprocity, track the evolution of a network's reciprocity over time, identify patterns at the level of dyads and vertices, and distinguish the effects of flux (im)balances or other (a)symmetries from a true tendency towards (anti-)reciprocation. PMID:24056721
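A common way to quantify weighted reciprocity, consistent with the approach described in this abstract, is to take the reciprocated portion of each dyad as min(w_ij, w_ji) and divide the total reciprocated weight by the total weight. The sketch below assumes this definition; the function name and toy matrices are mine.

```python
import numpy as np

def weighted_reciprocity(W):
    """Ratio of reciprocated weight to total weight in a directed weighted
    network: r = sum_ij min(w_ij, w_ji) / sum_ij w_ij. Assumes a square
    non-negative weight matrix with a zero diagonal."""
    W = np.asarray(W, dtype=float)
    reciprocated = np.minimum(W, W.T).sum()  # overlap of each dyad's two flows
    return reciprocated / W.sum()

# Fully reciprocated two-node network vs. a purely one-way flow
sym = [[0, 2], [2, 0]]
oneway = [[0, 3], [0, 0]]
print(weighted_reciprocity(sym), weighted_reciprocity(oneway))  # 1.0 0.0
```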
The nitrogen-vacancy colour centre in diamond
NASA Astrophysics Data System (ADS)
Doherty, Marcus W.; Manson, Neil B.; Delaney, Paul; Jelezko, Fedor; Wrachtrup, Jörg; Hollenberg, Lloyd C. L.
2013-07-01
The nitrogen-vacancy (NV) colour centre in diamond is an important physical system for emergent quantum technologies, including quantum metrology, information processing and communications, as well as for various nanotechnologies, such as biological and sub-diffraction-limit imaging, and for tests of entanglement in quantum mechanics. Given this array of existing and potential applications and the almost 50 years of NV research, one would expect that the physics of the centre is well understood. However, the study of the NV centre has proved challenging, with many early assertions now believed false and many remaining issues yet to be resolved. This review represents the first time that the key empirical and ab initio results have been extracted from the extensive NV literature and assembled into one consistent picture of the current understanding of the centre. As a result, the key unresolved issues concerning the NV centre are identified and the possible avenues for their resolution are examined.
Anchoring effect on first passage process in Taiwan financial market
NASA Astrophysics Data System (ADS)
Liu, Hsing; Liao, Chi-Yo; Ko, Jing-Yuan; Lih, Jiann-Shing
2017-07-01
Empirical analysis of the price fluctuations of financial markets has received extensive attention because a substantial amount of financial market data has been collected and because of advances in data-mining techniques. Price fluctuation trends can help investors to make informed trading decisions, but such decisions may also be affected by a psychological factor: the anchoring effect. This study explores the intraday price time series of Taiwan futures, and applies a diffusion model and quantitative methods to analyze the relationship between the anchoring effect and price fluctuations during the first passage process. Our results indicate that power-law scaling and anomalous diffusion of stock price fluctuations are related to the anchoring effect. Moreover, microscopic price fluctuations before the switching point in the first passage process correspond with long-term price fluctuations of Taiwan's stock market. We find that microscopic trends could provide useful information for understanding macroscopic trends in stock markets.
Climate Change: Modeling the Human Response
NASA Astrophysics Data System (ADS)
Oppenheimer, M.; Hsiang, S. M.; Kopp, R. E.
2012-12-01
Integrated assessment models have historically relied on forward modeling including, where possible, process-based representations to project climate change impacts. Some recent impact studies incorporate the effects of human responses to initial physical impacts, such as adaptation in agricultural systems, migration in response to drought, and climate-related changes in worker productivity. Sometimes the human response ameliorates the initial physical impacts, sometimes it aggravates it, and sometimes it displaces it onto others. In these arenas, understanding of underlying socioeconomic mechanisms is extremely limited. Consequently, for some sectors where sufficient data has accumulated, empirically based statistical models of human responses to past climate variability and change have been used to infer response sensitivities which may apply under certain conditions to future impacts, allowing a broad extension of integrated assessment into the realm of human adaptation. We discuss the insights gained from and limitations of such modeling for benefit-cost analysis of climate change.
Personality and Depression: Explanatory Models and Review of the Evidence
Klein, Daniel N.; Kotov, Roman; Bufferd, Sara J.
2012-01-01
Understanding the association between personality and depression has implications for elucidating etiology and comorbidity, identifying at-risk individuals, and tailoring treatment. We discuss seven major models that have been proposed to explain the relation between personality and depression, and we review key methodological issues, including study design, the heterogeneity of mood disorders, and the assessment of personality. We then selectively review the extensive empirical literature on the role of personality traits in depression in adults and children. Current evidence suggests that depression is linked to traits such as neuroticism/negative emotionality, extraversion/positive emotionality, and conscientiousness. Moreover, personality characteristics appear to contribute to the onset and course of depression through a variety of pathways. Implications for prevention and prediction of treatment response are discussed, as well as specific considerations to guide future research on the relation between personality and depression. PMID:21166535
Lu, Zhong-Lin
2009-01-01
Sensory physiologists and psychologists have recognized the influence of attention on human performance for more than 100 years. Since the 1970s, controlled and extensive experiments have examined effects of selective attention to a location in space or to an object. In addition to behavioral studies, cognitive neuroscientists have investigated the neural bases of attention. In this paper, I briefly review some classical attention paradigms, recent advances in the theory of attention, and some new insights from psychophysics and cognitive neuroscience. The focus is on the mechanisms of attention, that is, how attention improves human performance. Situations in which the perception of objects is unchanged, but performance may differ due to different decision structures, are distinguished from those in which attention changes the perceptual processes. The perceptual template model is introduced as a theoretical framework for analyzing mechanisms of attention. I also present empirical evidence for two attention mechanisms, stimulus enhancement and external noise exclusion, from psychophysics, neurophysiology and brain imaging. PMID:20523762
Merging analytic and empirical GEO debris synchronization dynamics
NASA Astrophysics Data System (ADS)
Anderson, Paul V.; McKnight, Darren S.; Di Pentino, Frank; Schaub, Hanspeter
2016-09-01
The motion of abandoned satellites near the geostationary (GEO) region has been extensively studied, modeled, and compared to the motion of station-kept, operational satellites, providing insights into the evolution of uncontrolled orbits at GEO. Analytic developments produced a family of curves represented in the ascending node versus inclination space describing the long-term precession of the orbit plane at GEO, and forecasted the clustering of objects at the geopotential wells. However, recent investigations were undertaken to characterize apparent anomalistic behavior of GEO objects and classification of objects into related families. This paper provides a unifying summary of early bottom-up analytical theory with more recent top-down operational observations, highlighting the common linkage between these dimensions of GEO object behavior. This paper also identifies the relevance of these patterns of life tendencies for future operations at and near GEO, and discusses the long-term implications of these patterns of life for space situational awareness activities in this regime.
Brain Ways: Meynert, Bachelard and the Material Imagination of the Inner Life
Phelps, Scott
2016-01-01
The Austrian psychiatrist Theodor Meynert’s anatomical theories of the brain and nerves are laden with metaphorical imagery, ranging from the colonies of empire to the tentacles of jellyfish. This paper analyses among Meynert’s earliest works a different set of less obvious metaphors, namely, the fibres, threads, branches and paths used to elaborate the brain’s interior. I argue that these metaphors of material, or what the philosopher Gaston Bachelard called ‘material images’, helped Meynert not only to imaginatively extend the tracts of fibrous tissue inside the brain but to insinuate their function as pathways co-extensive with the mind. Above all, with reference to Bachelard’s study of the material imagination, I argue that Meynert helped entrench the historical intuition that the mind, whatever it was, consisted of some interiority – one which came to be increasingly articulated through the fibrous confines of the brain. PMID:27292326
Performance of Environmental Resources of a Tourist Destination
2013-01-01
Despite the apparent importance of destinations’ environmental resources, there appears to be little theoretical and applied research explicitly focusing on destination environmental supply. This research attempts to address this gap in the literature. First, it reviews and evaluates the body of research in tourism environmental resources and proposes a conceptual model to test their performance. The model combines tourism supply–demand view with importance–performance gaps and was used to survey tourism in Slovenia. The results show that the studied destination uses its environmental resources too extensively and that Slovenian environmental tourism experience does not meet visitors’ expectations. This finding challenges Slovenian policy makers, who position Slovenia as a green destination. The proposed model can form the basis for further conceptual and empirical research into the tourism contributions of environmental resources. In its present form, it can be used to examine environmental performance and to suggest policy implications for any destination. PMID:29901033
The strength-of-weak-ties perspective on creativity: a comprehensive examination and extension.
Baer, Markus
2010-05-01
Disentangling the effects of weak ties on creativity, the present study separated, both theoretically and empirically, the effects of the size and strength of actors' idea networks and examined their joint impact while simultaneously considering the separate, moderating role of network diversity. I hypothesized that idea networks of optimal size and weak strength were more likely to boost creativity when they afforded actors access to a wide range of different social circles. In addition, I examined whether the joint effects of network size, strength, and diversity on creativity were further qualified by the openness to experience personality dimension. As expected, results indicated that actors were most creative when they maintained idea networks of optimal size, weak strength, and high diversity and when they scored high on the openness dimension. The implications of these results are discussed. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Influence of the impact energy on the pattern of blood drip stains
NASA Astrophysics Data System (ADS)
Smith, F. R.; Nicloux, C.; Brutin, D.
2018-01-01
The maximum spreading diameter of complex fluid droplets has been extensively studied and explained by numerous physical models. This research therefore focuses on a different aspect: the bulging outer rim observed after evaporation in the final dried pattern of blood droplets. A correlation is found between the inner diameter, the maximum outer diameter, and the impact speed. This shows how the drying mechanism of a blood drip stain is influenced by the impact energy, which induces a larger spreading diameter and thus a different redistribution of red blood cells inside the droplet. An empirical relation is established between the final dried pattern of a passive bloodstain and its impact speed, yielding a possible forensic application. Indeed, being able to relate the energy of the drop accurately to its final pattern would give investigators a clue, as currently no such simple and accurate tool exists.
Insight into point defects and impurities in titanium from first principles
NASA Astrophysics Data System (ADS)
Nayak, Sanjeev K.; Hung, Cain J.; Sharma, Vinit; Alpay, S. Pamir; Dongare, Avinash M.; Brindley, William J.; Hebert, Rainer J.
2018-03-01
Titanium alloys find extensive use in the aerospace and biomedical industries due to a unique combination of strength, density, and corrosion resistance. Decades of mostly experimental research have led to a large body of knowledge of the processing-microstructure-properties linkages. However, much of the existing understanding of point defects, which play a significant role in the mechanical properties of titanium, is based on semi-empirical rules. In this work, we present the results of a detailed self-consistent first-principles study developed to determine formation energies of intrinsic point defects, including vacancies and self-interstitials, and extrinsic point defects, such as interstitial and substitutional impurities/dopants. We find that most elements, regardless of size, prefer substitutional positions, but highly electronegative elements, such as C, N, O, F, S, and Cl, some of which are common impurities in Ti, occupy interstitial positions.
Understanding the links between education and smoking.
Maralani, Vida
2014-11-01
This study extends the theoretical and empirical literature on the relationship between education and smoking by focusing on the life course links between experiences from adolescence and health outcomes in adulthood. Differences in smoking by completed education are apparent at ages 12-18, long before that education is acquired. I use characteristics from the teenage years, including social networks, future expectations, and school experiences measured before the start of smoking regularly to predict smoking in adulthood. Results show that school policies, peers, and youths' mortality expectations predict smoking in adulthood but that college aspirations and analytical skills do not. I also show that smoking status at age 16 predicts both completed education and adult smoking, controlling for an extensive set of covariates. Overall, educational inequalities in smoking are better understood as a bundling of advantageous statuses that develops in childhood, rather than the effect of education producing better health. Copyright © 2014 Elsevier Inc. All rights reserved.
Asymptotics of empirical eigenstructure for high dimensional spiked covariance.
Wang, Weichen; Fan, Jianqing
2017-06-01
We derive the asymptotic distributions of the spiked eigenvalues and eigenvectors under a generalized and unified asymptotic regime, which takes into account the magnitude of spiked eigenvalues, sample size, and dimensionality. This regime allows high dimensionality and diverging eigenvalues and provides new insights into the roles that the leading eigenvalues, sample size, and dimensionality play in principal component analysis. Our results are a natural extension of those in Paul (2007) to a more general setting and solve the rates of convergence problems in Shen et al. (2013). They also reveal the biases of estimating leading eigenvalues and eigenvectors by using principal component analysis, and lead to a new covariance estimator for the approximate factor model, called shrinkage principal orthogonal complement thresholding (S-POET), that corrects the biases. Our results are successfully applied to outstanding problems in estimation of risks of large portfolios and false discovery proportions for dependent test statistics and are illustrated by simulation studies.
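The bias the authors correct for is easy to see in a small simulation: in the high-dimensional regime, the leading sample eigenvalue of a spiked covariance systematically overestimates the true spike. The sketch below is a generic illustration of that phenomenon, not the paper's estimator; all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, spike = 200, 400, 10.0  # high-dimensional regime: dimensionality p > sample size n

# Population covariance: diagonal with one spiked eigenvalue, rest equal to 1
cov_eigs = np.ones(p)
cov_eigs[0] = spike

# Sample with independent coordinates scaled to those variances
X = rng.standard_normal((n, p)) * np.sqrt(cov_eigs)

# Leading eigenvalue of the sample covariance overshoots the true spike
sample_cov = X.T @ X / n
top = np.linalg.eigvalsh(sample_cov)[-1]
print(f"true spike = {spike}, leading sample eigenvalue = {top:.2f}")
```

With p/n = 2, the overshoot is substantial, which is exactly the kind of bias a shrinkage correction such as the S-POET estimator described above is designed to remove.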
Charged particle concepts for fog dispersion
NASA Technical Reports Server (NTRS)
Frost, W.; Collins, F. G.; Koepf, D.
1981-01-01
Charged particle techniques hold promise for dispersing warm fog in the terminal area of commercial airports. This report focuses on features of the charged particle technique which require further study. The basic physical principles of the technique and the major verification experiments carried out in the past are described. The fundamentals of nozzle operation are given. The nozzle characteristics and the theory of particle charging in the nozzle are discussed, drawing on the extensive literature on electrostatic precipitation for environmental pollution control, along with some preliminary reported analyses of the jet characteristics and of the interaction with neighboring jets. The equations governing the transfer of water substance and of electrical charge are given, together with a brief description of the semi-empirical mathematical expressions required to close them. Finally, the necessary ingredients of a field experiment to verify the system, once a prototype is built, are described.
NASA Astrophysics Data System (ADS)
Blum, Jürgen
2018-03-01
After 25 years of laboratory research on protoplanetary dust agglomeration, a consistent picture of the various processes that involve colliding dust aggregates has emerged. Besides sticking, bouncing, and fragmentation, other effects, such as erosion and mass transfer, have now been extensively studied. Coagulation simulations consistently show that μm-sized dust grains can grow to mm- to cm-sized aggregates before they encounter the bouncing barrier, whereas sub-μm-sized water-ice particles can grow directly to planetesimal sizes. For siliceous materials, other processes must be responsible for turning the dust aggregates into planetesimals. In this article, these processes are discussed, and the physical properties of the emerging dusty or icy planetesimals are presented and compared to empirical evidence from within and beyond the Solar System. In conclusion, the formation of planetesimals by gravitational collapse of dust "pebbles" appears most likely.
Frustrated Freedom: The Effects of Agency and Wealth on Wellbeing in Rural Mozambique
Victor, Bart; Fischer, Edward; Cooil, Bruce; Vergara, Alfredo; Mukolo, Abraham; Blevins, Meridith
2014-01-01
In Sen's capability view of poverty, wellbeing is threatened by both deficits of wealth and deficits of individual agency. Sen further predicts that "unfreedom," or low levels of agency, will suppress the wellbeing effects of higher levels of wealth. The current paper extends Sen's view to include a condition, labeled "frustrated freedom," in which relatively higher levels of agency can heighten the poverty effects of relatively low levels of material wealth. Applying data from a large-scale population study of female heads of household in rural Mozambique, the paper empirically tests Sen's view and the proposed extension. As predicted, agency is found to moderate the relationship between wealth and wellbeing, uncovering evidence of both unfreedom and frustrated freedom in the population. The authors call for further research into the complex dynamics of wellbeing and poverty. PMID:25125791
Perpetrators of spousal homicide: a review.
Aldridge, Mari L; Browne, Kevin D
2003-07-01
It has been argued that individuals who engage in spouse abuse escalate their violence toward their partners, which can culminate in the death of either the assaulter or the victim. The aim of this review is to identify risk factors that determine whether an abusive relationship will end in eventual death. An extensive search revealed 22 empirical research studies on risk factors for spousal homicide. The circumstances of spousal homicide are described and salient risk factors are highlighted. In the United Kingdom, 37% of murdered women were killed by a current or former intimate partner, compared with 6% of murdered men. The most common cause of an intimate partner's death in England and Wales was attack with a sharp implement or strangulation; by contrast, the most common cause of spousal homicide in the United States was shooting. Nine major risk factors are identified that may help predict the probability of a partner homicide and prevent future victims.
Enhancing Team Composition in Professional Networks: Problem Definitions and Fast Solutions
Li, Liangyue; Tong, Hanghang; Cao, Nan; Ehrlich, Kate; Lin, Yu-Ru; Buchler, Norbou
2017-01-01
In this paper, we study ways to enhance the composition of teams based on new requirements in a collaborative environment. We focus on recommending team members who can maintain the team’s performance by minimizing changes to the team’s skills and social structure. Our recommendations are based on computing team-level similarity, which includes skill similarity, structural similarity as well as the synergy between the two. Current heuristic approaches are one-dimensional and not comprehensive, as they consider the two aspects independently. To formalize team-level similarity, we adopt the notion of graph kernel of attributed graphs to encompass the two aspects and their interaction. To tackle the computational challenges, we propose a family of fast algorithms by (a) designing effective pruning strategies, and (b) exploring the smoothness between the existing and the new team structures. Extensive empirical evaluations on real world datasets validate the effectiveness and efficiency of our algorithms. PMID:29104408
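A toy sketch of team-level similarity in the spirit described above, combining a skill term with a structural term. The scoring formula, names, and data are invented for illustration and are not the authors' graph-kernel algorithm:

```python
import numpy as np

def skill_similarity(a, b):
    """Cosine similarity between two skill vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def structural_similarity(adj, member, candidate_ties):
    """Fraction of the leaver's team ties that the candidate also has."""
    old = adj[member]
    denom = np.count_nonzero(old) or 1
    return float(np.count_nonzero(old * candidate_ties) / denom)

# hypothetical team of 4; member 2 leaves and must be replaced
skills = np.array([[1, 0, 1], [0, 1, 1], [1, 1, 0], [0, 0, 1]], dtype=float)
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 1],
                [1, 1, 0, 0],
                [0, 1, 0, 0]], dtype=float)

# candidate -> (skill vector, ties to the remaining team members)
candidates = {
    "alice": (np.array([1.0, 1.0, 0.0]), np.array([1.0, 1.0, 0.0, 0.0])),
    "bob":   (np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, 0.0, 1.0])),
}

# score couples the two aspects instead of treating them independently
scores = {
    name: skill_similarity(skills[2], s) * structural_similarity(adj, 2, ties)
    for name, (s, ties) in candidates.items()
}
best = max(scores, key=scores.get)
print(best, scores)  # alice matches both the leaver's skills and ties
```

Multiplying the two terms is the simplest way to capture their interaction; the paper's attributed-graph kernels generalize this idea to whole-team comparisons with pruning for speed.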
Intra-seasonal NDVI change projections in semi-arid Africa
Funk, Christopher C.; Brown, Molly E.
2006-01-01
Early warning systems (EWS) tend to focus on the identification of slow-onset disasters such as famine and epidemic disease. Since hazardous environmental conditions often precede disastrous outcomes by many months, effective monitoring via satellite and in situ observations can successfully guide mitigation activities. Accurate short-term forecasts of NDVI could increase lead times, making early warning earlier. This paper presents a simple empirical model for making 1- to 4-month NDVI projections. These statistical projections are based on parameterized satellite rainfall estimates (RFE) and relative humidity demand (RHD). A quasi-global, 1-month-ahead, 1° study demonstrates reasonable accuracies in many semi-arid regions. In Africa, a 0.1° cross-validated skill assessment quantifies the technique's applicability at 1- to 4-month forecast intervals. These results suggest that useful projections can be made over many semi-arid, food-insecure regions of Africa, with plausible extensions to drought-prone areas of Asia, Australia, and South America.
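A minimal sketch of an empirical projection of this kind, assuming a simple linear response of next-month NDVI to current rainfall and humidity demand. The synthetic data and coefficients are illustrative, not the paper's parameterization:

```python
import numpy as np

rng = np.random.default_rng(1)
months = 120
rfe = rng.gamma(2.0, 30.0, months)        # satellite rainfall estimate, mm
rhd = rng.uniform(0.2, 0.9, months)       # relative humidity demand proxy

# synthetic "truth": greenness responds to last month's rain and dryness
ndvi = np.empty(months)
ndvi[0] = 0.3
ndvi[1:] = (0.15 + 0.002 * rfe[:-1] - 0.1 * rhd[:-1]
            + rng.normal(0.0, 0.02, months - 1))

# ordinary least squares fit of NDVI(t+1) on RFE(t) and RHD(t)
X = np.column_stack([np.ones(months - 1), rfe[:-1], rhd[:-1]])
coef, *_ = np.linalg.lstsq(X, ndvi[1:], rcond=None)

# one-month-ahead projection from the latest observations
projection = coef @ np.array([1.0, rfe[-1], rhd[-1]])
print(f"projected NDVI next month: {projection:.3f}")
```

Because rainfall leads greenness by a month or more, a regression of this shape buys lead time; the paper's model plays the same role with cross-validated skill maps at 0.1° resolution.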
Two Strategies for Qualitative Content Analysis: An Intramethod Approach to Triangulation.
Renz, Susan M; Carrington, Jane M; Badger, Terry A
2018-04-01
The overarching aim of qualitative research is to gain an understanding of certain social phenomena. Qualitative research involves the studied use and collection of empirical materials, all to describe moments and meanings in individuals' lives. Data derived from these various materials require a form of analysis of the content, focusing on written or spoken language as communication, to provide context and understanding of the message. Qualitative research often involves the collection of data through extensive interviews, note taking, and tape recording. These methods are time- and labor-intensive. With the advances in computerized text analysis software, the practice of combining methods to analyze qualitative data can assist the researcher in making large data sets more manageable and enhance the trustworthiness of the results. This article will describe a novel process of combining two methods of qualitative data analysis, or intramethod triangulation, as a means to provide a deeper analysis of text.
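As a toy illustration of the kind of computerized text analysis that can support manual coding, one might count occurrences of predefined code words across transcripts. The codebook and transcripts below are invented; this is not the authors' procedure:

```python
from collections import Counter
import re

# hypothetical codebook of a priori codes from the qualitative framework
codebook = {"support", "barrier", "trust"}

# invented interview excerpts standing in for transcribed data
transcripts = [
    "The nurses gave real support, but cost was a barrier.",
    "I trust my doctor; that trust made every barrier feel smaller.",
]

# tally code-word frequencies across all transcripts
counts = Counter()
for text in transcripts:
    for token in re.findall(r"[a-z]+", text.lower()):
        if token in codebook:
            counts[token] += 1

print(counts.most_common())  # [('barrier', 2), ('trust', 2), ('support', 1)]
```

Frequencies like these do not replace interpretive coding; in an intramethod design they triangulate it, flagging where the software's counts and the researcher's themes diverge.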
Ultrasonic inspection of carbon fiber reinforced plastic by means of sample-recognition methods
NASA Technical Reports Server (NTRS)
Bilgram, R.
1985-01-01
In the case of carbon fiber reinforced plastic (CFRP), it has not yet been possible to detect nonlocal defects and aging-related material degradation with nondestructive inspection methods. An approach for overcoming these difficulties involves extending the ultrasonic inspection procedure with signal processing and pattern-recognition methods. The basic concept is the realization that the ultrasonic signal contains information about the medium which is not utilized in conventional ultrasonic inspection. However, the analytical study of the physical processes involved is very complex. For this reason, an empirical approach is employed to make use of this previously unutilized information. The approach uses reference signals obtained from material specimens of different quality. The implementation of these concepts for the ultrasonic inspection of CFRP laminates is discussed.
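The reference-signal idea can be sketched as a nearest-reference classification in a simple feature space. The features, synthetic signals, and labels below are invented for illustration and are not the report's procedure:

```python
import numpy as np

def features(signal):
    """Crude feature vector: peak amplitude and spectral centroid."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.arange(spectrum.size)
    centroid = float((freqs * spectrum).sum() / spectrum.sum())
    return np.array([float(np.max(np.abs(signal))), centroid])

def classify(signal, references):
    """Assign the quality label of the nearest reference in feature space."""
    f = features(signal)
    return min(references,
               key=lambda label: np.linalg.norm(f - features(references[label])))

# synthetic echoes from specimens of known quality (reference signals)
t = np.linspace(0, 1, 256, endpoint=False)
references = {
    "sound":    np.sin(2 * np.pi * 40 * t) * np.exp(-4 * t),        # strong echo
    "degraded": 0.3 * np.sin(2 * np.pi * 25 * t) * np.exp(-8 * t),  # damped, shifted
}

# probe echo resembling the degraded specimen
probe = 0.32 * np.sin(2 * np.pi * 26 * t) * np.exp(-7 * t)
print(classify(probe, references))
```

The point is that features discarded by conventional peak-amplitude gating (here, the spectral centroid) carry material information once references of known quality are available to compare against.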
Reprint of "Theoretical description of metal/oxide interfacial properties: The case of MgO/Ag(001)"
NASA Astrophysics Data System (ADS)
Prada, Stefano; Giordano, Livia; Pacchioni, Gianfranco; Goniakowski, Jacek
2017-02-01
We compare the performance of different DFT functionals applied to ultra-thin MgO(100) films supported on the Ag(100) surface, a prototypical weakly interacting oxide/metal interface that has been extensively studied in the past. Beyond the semi-local DFT-GGA approximation, we also use the hybrid DFT-HSE approach to improve the description of the oxide electronic structure. Moreover, to better account for the interfacial adhesion, we include van der Waals interactions by means of either the semi-empirical force fields of Grimme (DFT-D2 and DFT-D2*) or the self-consistent density functional optB88-vdW. We compare and discuss the results on the structural, electronic, and adhesion characteristics of the interface as obtained for pristine and oxygen-deficient Ag-supported MgO films in the 1-4 ML thickness range.
Theoretical description of metal/oxide interfacial properties: The case of MgO/Ag(001)
NASA Astrophysics Data System (ADS)
Prada, Stefano; Giordano, Livia; Pacchioni, Gianfranco; Goniakowski, Jacek
2016-12-01
We compare the performance of different DFT functionals applied to ultra-thin MgO(100) films supported on the Ag(100) surface, a prototypical weakly interacting oxide/metal interface that has been extensively studied in the past. Beyond the semi-local DFT-GGA approximation, we also use the hybrid DFT-HSE approach to improve the description of the oxide electronic structure. Moreover, to better account for the interfacial adhesion, we include van der Waals interactions by means of either the semi-empirical force fields of Grimme (DFT-D2 and DFT-D2*) or the self-consistent density functional optB88-vdW. We compare and discuss the results on the structural, electronic, and adhesion characteristics of the interface as obtained for pristine and oxygen-deficient Ag-supported MgO films in the 1-4 ML thickness range.
Bender, Donna S; Morey, Leslie C; Skodol, Andrew E
2011-07-01
Personality disorders are associated with fundamental disturbances of self and interpersonal relations, problems that vary in severity within and across disorders. This review surveyed clinician-rated measures of personality psychopathology that focus on self-other dimensions to explore the feasibility and utility of constructing a scale of severity of impairment in personality functioning for DSM-5. Robust elements of the instruments were considered in creating a continuum of personality functioning based on aspects of identity, self-direction, empathy, and intimacy. Building on preliminary findings (Morey et al., 2011/this issue), the proposed Levels of Personality Functioning will be subjected to extensive empirical testing in the DSM-5 field trials and elsewhere. The resulting version of this severity measure is expected to have clinical utility in identifying personality psychopathology, planning treatment, building the therapeutic alliance, and studying treatment course and outcome.