Bridging process-based and empirical approaches to modeling tree growth
Harry T. Valentine; Annikki Makela
2005-01-01
The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...
Familiarizing Students with the Empirically Supported Treatment Approaches for Childhood Problems.
ERIC Educational Resources Information Center
Wilkins, Victoria; Chambliss, Catherine
The clinical research literature exploring the efficacy of particular treatment approaches is reviewed with the intent to facilitate the training of counseling students. Empirically supported treatments (ESTs) are defined operationally as evidence-based treatments following the listing of empirically validated psychological treatments reported by…
A Theoretical and Empirical Comparison of Three Approaches to Achievement Testing.
ERIC Educational Resources Information Center
Haladyna, Tom; Roid, Gale
Three approaches to the construction of achievement tests are compared: construct, operational, and empirical. The construct approach is based upon classical test theory and measures an abstract representation of the instructional objectives. The operational approach specifies instructional intent through instructional objectives, facet design,…
North Dakota implementation of mechanistic-empirical pavement design guide (MEPDG).
DOT National Transportation Integrated Search
2014-12-01
North Dakota currently designs roads based on the AASHTO Design Guide procedure, which is based on the empirical findings of the AASHTO Road Test of the late 1950s. However, limitations of the current empirical approach have prompted AASHTO to mo...
Characterization and effectiveness of pay-for-performance in ophthalmology: a systematic review.
Herbst, Tim; Emmert, Martin
2017-06-05
To identify, characterize and compare existing pay-for-performance approaches and their impact on the quality of care and efficiency in ophthalmology. A systematic evidence-based review was conducted. Literature written in English, French, and German and published between 2000 and 2015 was searched in the following databases: Medline (via PubMed), NCBI web site, Scopus, Web of Knowledge, Econlit and the Cochrane Library. Empirical as well as descriptive articles were included. Controlled clinical trials, meta-analyses, randomized controlled studies as well as observational studies were included as empirical articles. Systematic characterization of identified pay-for-performance approaches (P4P approaches) was conducted according to the "Model for Implementing and Monitoring Incentives for Quality" (MIMIQ). Methodological quality of empirical articles was assessed according to the Critical Appraisal Skills Programme (CASP) checklists. Overall, 13 relevant articles were included. Eleven articles were descriptive and two articles included empirical analyses. Based on these articles, four different pay-for-performance approaches implemented in the United States were identified. With regard to quality and incentive elements, systematic comparison showed numerous differences between P4P approaches. Empirical studies showed isolated cost or quality effects, while a simultaneous examination of these effects was missing. Research results show that experiences with pay-for-performance approaches in ophthalmology are limited. Identified approaches differ with regard to quality and incentive elements, restricting comparability. Two empirical studies are insufficient to draw strong conclusions about the effectiveness and efficiency of these approaches.
DOT National Transportation Integrated Search
1997-05-01
Current pavement design procedures are based principally on empirical approaches. The current trend toward developing more mechanistic-empirical pavement design methods led Minnesota to develop the Minnesota Road Research Project (Mn/ROAD), a lo...
Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D
2013-01-01
Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, the fold-change criteria of the Significance Analysis of Microarrays method are problematic and can critically alter the conclusion of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold-change threshold, since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed.
The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next-generation sequencing RNA-seq data analysis.
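The resampling idea at the heart of this record can be illustrated with a minimal sketch: a two-sample permutation p-value (distribution-free, so no normality assumption) followed by Benjamini-Hochberg false-discovery-rate control. This is an illustrative stand-in, not the authors' Resampling-based empirical Bayes implementation; all names and parameter values are hypothetical.

```python
import random

def perm_pvalue(x, y, n_perm=2000, rng=random.Random(0)):
    """Two-sample permutation p-value for a difference in means."""
    obs = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        px, py = pooled[:len(x)], pooled[len(x):]
        if abs(sum(px) / len(px) - sum(py) / len(py)) >= obs:
            hits += 1
    # add-one correction keeps the p-value strictly positive
    return (hits + 1) / (n_perm + 1)

def benjamini_hochberg(pvals, alpha=0.05):
    """Indices of hypotheses rejected at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank
    return set(order[:k])
```

In the published method, an empirical Bayes step additionally pools variance information across genes before the resampling; the sketch keeps only the resampling and FDR-control components.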
We show that a conditional probability analysis that utilizes a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criterai from empirical data. The critical step in this approach is transforming the response ...
Empirical likelihood-based tests for stochastic ordering
BARMI, HAMMOU EL; MCKEAGUE, IAN W.
2013-01-01
This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the “decline and fall” phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142
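The notion of stochastic ordering being tested here (X stochastically smaller than Y when F_X(t) ≥ F_Y(t) for all t) can be illustrated with a much cruder one-sided Kolmogorov-Smirnov-type statistic. This is only a sketch of the quantity under examination, not the localized empirical likelihood statistic of the paper.

```python
def ecdf(sample):
    """Return the empirical CDF of a sample as a callable."""
    s = sorted(sample)
    n = len(s)
    def F(t):
        # fraction of observations <= t (linear scan: fine for a sketch)
        return sum(1 for v in s if v <= t) / n
    return F

def one_sided_ks(x, y):
    """sup_t [F_y(t) - F_x(t)] over the pooled sample points.

    Values near 0 are consistent with X being stochastically smaller
    than Y (i.e., F_x >= F_y everywhere); large positive values are
    evidence against that ordering.
    """
    Fx, Fy = ecdf(x), ecdf(y)
    return max(Fy(t) - Fx(t) for t in sorted(set(x) | set(y)))
```

The paper's test instead integrates a localized empirical likelihood statistic against the pooled empirical distribution, which yields a distribution-free limit in terms of Brownian bridges.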
ERIC Educational Resources Information Center
Lassnigg, Lorenz; Vogtenhuber, Stefan
2011-01-01
The empirical approach referred to in this article describes the relationship between education and training (ET) supply and employment in Austria; the use of the new ISCED (International Standard Classification of Education) fields of study variable makes this approach applicable abroad. The purpose is to explore a system that produces timely…
Using an empirical and rule-based modeling approach to map cause of disturbance in U.S
Todd A. Schroeder; Gretchen G. Moisen; Karen Schleeweis; Chris Toney; Warren B. Cohen; Zhiqiang Yang; Elizabeth A. Freeman
2015-01-01
Recently completing over a decade of research, the NASA/NACP funded North American Forest Dynamics (NAFD) project has led to several important advancements in the way U.S. forest disturbance dynamics are mapped at regional and continental scales. One major contribution has been the development of an empirical and rule-based modeling approach which addresses two of the...
A. Weiskittel; D. Maguire; R. Monserud
2007-01-01
Hybrid models offer the opportunity to improve future growth projections by combining advantages of both empirical and process-based modeling approaches. Hybrid models have been constructed in several regions and their performance relative to a purely empirical approach has varied. A hybrid model was constructed for intensively managed Douglas-fir plantations in the...
Empirical projection-based basis-component decomposition method
NASA Astrophysics Data System (ADS)
Brendel, Bernhard; Roessl, Ewald; Schlomka, Jens-Peter; Proksa, Roland
2009-02-01
Advances in the development of semiconductor based, photon-counting x-ray detectors stimulate research in the domain of energy-resolving pre-clinical and clinical computed tomography (CT). For counting detectors acquiring x-ray attenuation in at least three different energy windows, an extended basis component decomposition can be performed in which in addition to the conventional approach of Alvarez and Macovski a third basis component is introduced, e.g., a gadolinium based CT contrast material. After the decomposition of the measured projection data into the basis component projections, conventional filtered-backprojection reconstruction is performed to obtain the basis-component images. In recent work, this basis component decomposition was obtained by maximizing the likelihood-function of the measurements. This procedure is time consuming and often unstable for excessively noisy data or low intrinsic energy resolution of the detector. Therefore, alternative procedures are of interest. Here, we introduce a generalization of the idea of empirical dual-energy processing published by Stenner et al. to multi-energy, photon-counting CT raw data. Instead of working in the image-domain, we use prior spectral knowledge about the acquisition system (tube spectra, bin sensitivities) to parameterize the line-integrals of the basis component decomposition directly in the projection domain. We compare this empirical approach with the maximum-likelihood (ML) approach considering image noise and image bias (artifacts) and see that only moderate noise increase is to be expected for small bias in the empirical approach. Given the drastic reduction of pre-processing time, the empirical approach is considered a viable alternative to the ML approach.
Knight, Rod
2016-05-01
The field of population and public health ethics (PPHE) has yet to fully embrace the generation of evidence as an important project. This article reviews the philosophical debates related to the 'empirical turn' in clinical bioethics, and critically analyses how PPHE has engaged, and can engage, with the philosophical implications of generating empirical data within the task of normative inquiry. A set of five conceptual and theoretical issues pertaining to population health that are unresolved and could potentially benefit from empirical PPHE approaches to normative inquiry are discussed. Each issue differs from traditional empirical bioethical approaches, in that they emphasize (1) concerns related to the population, (2) 'upstream' policy-relevant health interventions - within and outside of the health care system and (3) the prevention of illness and disease. Within each theoretical issue, a conceptual example from population and public health approaches to HIV prevention and health promotion is interrogated. Based on the review and critical analysis, this article concludes that empirical-normative approaches to population and public health ethics would be most usefully pursued as an iterative project (rather than as a linear project), in which the normative informs the empirical questions to be asked and new empirical evidence constantly directs conceptualizations of what constitutes morally robust public health practices. Finally, a conceptualization of an empirical population and public health ethics is advanced in order to open up new interdisciplinary 'spaces', in which empirical and normative approaches to ethical inquiry are transparently (and ethically) integrated. © The Author(s) 2015.
Mao, Ningying; Lesher, Beth; Liu, Qifa; Qin, Lei; Chen, Yixi; Gao, Xin; Earnshaw, Stephanie R; McDade, Cheryl L; Charbonneau, Claudie
2016-01-01
Invasive fungal infections (IFIs) require rapid diagnosis and treatment. A decision-analytic model was used to estimate total costs and survival associated with a diagnostic-driven (DD) or an empiric treatment approach in neutropenic patients with hematological malignancies receiving chemotherapy or autologous/allogeneic stem cell transplants in Shanghai, Beijing, Chengdu, and Guangzhou, the People's Republic of China. Treatment initiation for the empiric approach occurred after clinical suspicion of an IFI; treatment initiation for the DD approach occurred after clinical suspicion and a positive IFI diagnostic test result. Model inputs were obtained from the literature; treatment patterns and resource use were based on clinical opinion. Total costs were lower for the DD versus the empiric approach in Shanghai (¥3,232 vs ¥4,331), Beijing (¥3,894 vs ¥4,864), Chengdu, (¥4,632 vs ¥5,795), and Guangzhou (¥8,489 vs ¥9,795). Antifungal administration was lower using the DD (5.7%) than empiric (9.8%) approach, with similar survival rates. Results from one-way and probabilistic sensitivity analyses were most sensitive to changes in diagnostic test sensitivity and IFI incidence; the DD approach dominated the empiric approach in 88% of scenarios. These results suggest that a DD compared to an empiric treatment approach in the People's Republic of China may be cost saving, with similar overall survival in immunocompromised patients with suspected IFIs.
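The decision-analytic comparison boils down to an expected-cost calculation over a small probability tree. A minimal sketch, with all parameter values hypothetical placeholders rather than the paper's inputs:

```python
def expected_cost(p_ifi, sens, spec, c_antifungal, c_diagnostic, diagnostic_driven):
    """Expected per-patient cost of an antifungal treatment strategy.

    Empiric: every clinically suspected patient is treated.
    Diagnostic-driven (DD): treatment only after a positive test,
    so the treated fraction depends on test sensitivity/specificity.
    """
    if not diagnostic_driven:
        return c_antifungal  # all suspected patients receive antifungals
    # DD: pay for the test; treat true positives and false positives
    p_treated = p_ifi * sens + (1 - p_ifi) * (1 - spec)
    return c_diagnostic + p_treated * c_antifungal
```

With placeholder values (IFI prevalence 0.1, sensitivity 0.8, specificity 0.9, antifungal course ¥4,000, test ¥300), the DD strategy costs 300 + 0.17 × 4000 = ¥980 versus ¥4,000 for empiric treatment, which mirrors why the DD approach reduced antifungal administration in the study. The published model additionally accounts for survival, downstream resource use, and city-specific inputs.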
Recognizing of stereotypic patterns in epileptic EEG using empirical modes and wavelets
NASA Astrophysics Data System (ADS)
Grubov, V. V.; Sitnikova, E.; Pavlov, A. N.; Koronovskii, A. A.; Hramov, A. E.
2017-11-01
Epileptic activity in the form of spike-wave discharges (SWD) appears in the electroencephalogram (EEG) during absence seizures. This paper evaluates two approaches for detecting stereotypic rhythmic activities in EEG: the continuous wavelet transform (CWT) and the empirical mode decomposition (EMD). The CWT is a well-known method of time-frequency analysis of EEG, whereas EMD is a relatively novel approach for extracting a signal's waveforms. A new method for pattern recognition based on a combination of CWT and EMD is proposed. This combined approach achieved a sensitivity of 86.5% and a specificity of 92.9% for sleep spindles, and 97.6% and 93.2%, respectively, for SWD. Considering the strong within- and between-subject variability of sleep spindles, the obtained detection efficiency was high in comparison with other CWT-based methods. It is concluded that combining a wavelet-based approach with empirical modes improves the quality of automatic detection of stereotypic patterns in the rat EEG.
ERIC Educational Resources Information Center
Merrick, K. E.
2010-01-01
This correspondence describes an adaptation of puzzle-based learning to teaching an introductory computer programming course. Students from two offerings of the course--with and without the puzzle-based learning--were surveyed over a two-year period. Empirical results show that the synthesis of puzzle-based learning concepts with existing course…
Performance-Based Service Quality Model: An Empirical Study on Japanese Universities
ERIC Educational Resources Information Center
Sultan, Parves; Wong, Ho
2010-01-01
Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using the Cronbach's alpha.…
NASA Astrophysics Data System (ADS)
Prabhu Verleker, Akshay; Fang, Qianqian; Choi, Mi-Ran; Clare, Susan; Stantz, Keith M.
2015-03-01
The purpose of this study is to develop an alternate empirical approach to estimate near-infrared (NIR) photon propagation and quantify optically induced drug release in brain metastasis, without relying on computationally expensive Monte Carlo techniques (the gold standard). Targeted drug delivery with optically induced drug release is a noninvasive means to treat cancers and metastasis. This study is part of a larger project to treat brain metastasis by delivering lapatinib-drug nanocomplexes and activating NIR-induced drug release. The empirical model was developed using a weighted approach to estimate photon scattering in tissues and calibrated using a GPU-based 3D Monte Carlo. The model was tested against the Monte Carlo in optical brain phantoms for pencil beams (width 1 mm) and broad beams (width 10 mm), for different albedos alongside the diffusion equation, and in simulated brain phantoms resembling white matter (μs' = 8.25 mm^-1, μa = 0.005 mm^-1) and gray matter (μs' = 2.45 mm^-1, μa = 0.035 mm^-1) at a wavelength of 800 nm. The goodness of fit between the two models was determined using the coefficient of determination (R-squared analysis). Preliminary results show the empirical algorithm matches Monte Carlo simulated fluence over a wide range of albedo (0.7 to 0.99), while the diffusion equation fails for lower albedo. The photon fluence generated by the empirical code matched the Monte Carlo in homogeneous phantoms (R^2 = 0.99). While the GPU-based Monte Carlo achieved a 300× acceleration compared with earlier CPU-based models, the empirical code is 700× faster than the Monte Carlo for a typical super-Gaussian laser beam.
A Perspective on Computational Human Performance Models as Design Tools
NASA Technical Reports Server (NTRS)
Jones, Patricia M.
2010-01-01
The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.
ERIC Educational Resources Information Center
Gillespie, Ann
2014-01-01
Introduction: This research is the first to investigate the experiences of teacher-librarians as evidence-based practice. An empirically derived model is presented in this paper. Method: This qualitative study utilised the expanded critical incident approach, and investigated the real-life experiences of fifteen Australian teacher-librarians,…
Alladin, Assen; Sabatini, Linda; Amundson, Jon K
2007-04-01
This paper briefly surveys the trend of and controversy surrounding empirical validation in psychotherapy. Empirical validation of hypnotherapy has paralleled the practice of validation in psychotherapy and the professionalization of clinical psychology, in general. This evolution in determining what counts as evidence for bona fide clinical practice has gone from theory-driven clinical approaches in the 1960s and 1970s through critical attempts at categorization of empirically supported therapies in the 1990s on to the concept of evidence-based practice in 2006. Implications of this progression in professional psychology are discussed in the light of hypnosis's current quest for validation and empirical accreditation.
Feldstein Ewing, Sarah W; Chung, Tammy
2013-06-01
Research on mechanisms of behavior change provides an innovative method to improve treatment for addictive behaviors. An important extension of mechanisms of change research involves the use of translational approaches, which examine how basic biological (i.e., brain-based mechanisms) and behavioral factors interact in initiating and sustaining positive behavior change as a result of psychotherapy. Articles in this special issue include integrative conceptual reviews and innovative empirical research on brain-based mechanisms that may underlie risk for addictive behaviors and response to psychotherapy from adolescence through adulthood. Review articles discuss hypothesized mechanisms of change for cognitive and behavioral therapies, mindfulness-based interventions, and neuroeconomic approaches. Empirical articles cover a range of addictive behaviors, including use of alcohol, cigarettes, marijuana, cocaine, and pathological gambling and represent a variety of imaging approaches including fMRI, magneto-encephalography, real-time fMRI, and diffusion tensor imaging. Additionally, a few empirical studies directly examine brain-based mechanisms of change, whereas others examine brain-based indicators as predictors of treatment outcome. Finally, two commentaries discuss craving as a core feature of addiction, and the importance of a developmental approach to examining mechanisms of change. Ultimately, translational research on mechanisms of behavior change holds promise for increasing understanding of how psychotherapy may modify brain structure and functioning and facilitate the initiation and maintenance of positive treatment outcomes for addictive behaviors. 2013 APA, all rights reserved
NASA Astrophysics Data System (ADS)
Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.
2012-04-01
In the context of a national energy company (EDF: Électricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards, and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise applied to hydrological forecasts, which is essential to synthesize the available information coming from different meteorological and hydrological models and from human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), in which substantial human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical model of the empirical error of perfect forecasts, using streamflow sub-samples stratified by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples stratified by quantile class, streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure a good post-processing of the hydrological ensemble, allowing a clear improvement in the reliability, skill, and sharpness of ensemble forecasts.
The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which cannot take into account hydrological dynamics and processes, i.e., sample heterogeneity: the same streamflow range can correspond to different processes, such as rising limbs or recessions, whose uncertainties differ. The dynamical approach improves the reliability, skill, and sharpness of forecasts and globally reduces confidence interval width. Compared in detail, the dynamical approach allows a noticeable reduction of confidence intervals during recessions, where uncertainty is relatively lower, and a slight increase of confidence intervals during rising limbs or snowmelt, where uncertainty is greater. The dynamical approach, validated by forecasters' experience (forecasters considered the empirical approach not discriminative enough), improved forecasters' confidence and the communication of uncertainties. Montanari, A. and Brath, A. (2004). A stochastic approach for assessing the uncertainty of rainfall-runoff simulations. Water Resources Research, 40, W01106, doi:10.1029/2003WR002540. Schaefli, B., Balin Talamba, D. and Musy, A. (2007). Quantifying hydrological modeling errors through a mixture of normal distributions. Journal of Hydrology, 332, 303-315.
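The empirical post-processing approach described above, modeling forecast error by streamflow quantile class (simplified here to ignore lead time and the dynamical stratification), can be sketched as follows. Function names and the relative-error formulation are illustrative assumptions, not EDF's operational code.

```python
from bisect import bisect_right
from collections import defaultdict

def error_quantiles_by_class(forecasts, observations, class_edges,
                             probs=(0.1, 0.5, 0.9)):
    """Empirical quantiles of relative error, grouped by forecast class.

    class_edges: streamflow thresholds delimiting the quantile classes.
    Returns {class_index: [error quantiles]} for dressing point forecasts.
    """
    errors = defaultdict(list)
    for f, o in zip(forecasts, observations):
        errors[bisect_right(class_edges, f)].append(o / f - 1.0)
    table = {}
    for c, errs in errors.items():
        errs.sort()
        # crude plug-in quantiles: fine for a sketch
        table[c] = [errs[min(int(p * len(errs)), len(errs) - 1)] for p in probs]
    return table

def dress(forecast, class_edges, quantile_table):
    """Turn a point forecast into an empirical predictive interval."""
    q = quantile_table[bisect_right(class_edges, forecast)]
    return [forecast * (1.0 + e) for e in q]
```

The dynamical approach of the paper would additionally stratify the archived errors by streamflow variation (rising limb vs recession), which is exactly what lets it narrow intervals during recessions and widen them during rising limbs.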
Systematic approach to developing empirical interatomic potentials for III-N semiconductors
NASA Astrophysics Data System (ADS)
Ito, Tomonori; Akiyama, Toru; Nakamura, Kohji
2016-05-01
A systematic approach to the derivation of empirical interatomic potentials is developed for III-N semiconductors with the aid of ab initio calculations. The parameter values of an empirical potential based on the bond-order potential are determined by reproducing the cohesive energy differences among 3-fold coordinated hexagonal, 4-fold coordinated zinc blende, wurtzite, and 6-fold coordinated rocksalt structures in BN, AlN, GaN, and InN. The bond order p is introduced as a function of the coordination number Z in the form p = a exp(-bZ^n) for Z ≤ 4 and p = (4/Z)^α for Z ≥ 4. Moreover, the energy difference between the wurtzite and zinc blende structures can be successfully evaluated by considering interactions beyond the second-nearest neighbors as a function of ionicity. This approach is feasible for developing empirical interatomic potentials applicable to systems consisting of poorly coordinated atoms at surfaces and interfaces, including nanostructures.
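The piecewise bond-order function quoted in the abstract is straightforward to express in code; the parameter values used below are illustrative only, since the fitted values are material-specific (BN, AlN, GaN, and InN each get their own set).

```python
import math

def bond_order(Z, a, b, n, alpha):
    """Bond order p as a function of coordination number Z, following
    the piecewise form quoted in the abstract:

        p = a * exp(-b * Z**n)   for Z <= 4
        p = (4 / Z)**alpha       for Z >= 4
    """
    if Z <= 4:
        return a * math.exp(-b * Z**n)
    return (4.0 / Z)**alpha
```

Since (4/Z)^α equals 1 at Z = 4, continuity of the two branches at Z = 4 requires a·exp(-b·4^n) = 1, which fixes a once b and n are chosen.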
ERIC Educational Resources Information Center
Vogelezang, Michiel; Van Berkel, Berry; Verdonk, Adri
2015-01-01
Between 1970 and 1990, the Dutch working group "Empirical Introduction to Chemistry" developed a secondary school chemistry education curriculum based on the educational vision of the mathematicians van Hiele and van Hiele-Geldof. This approach viewed learning as a process in which students must go through discontinuous level transitions…
ERIC Educational Resources Information Center
van der Linden, Wim J.; Eggen, Theo J. H. M.
A procedure for the sequential optimization of the calibration of an item bank is given. The procedure is based on an empirical Bayes approach to a reformulation of the Rasch model as a model for paired comparisons between the difficulties of test items in which ties are allowed to occur. First, it is indicated how a paired-comparisons design…
MacDonald, Donald D.; Dipinto, Lisa M.; Field, Jay; Ingersoll, Christopher G.; Long, Edward R.; Swartz, Richard C.
2000-01-01
Sediment-quality guidelines (SQGs) have been published for polychlorinated biphenyls (PCBs) using both empirical and theoretical approaches. Empirically based guidelines have been developed using the screening-level concentration, effects range, effects level, and apparent effects threshold approaches. Theoretically based guidelines have been developed using the equilibrium-partitioning approach. Empirically-based guidelines were classified into three general categories, in accordance with their original narrative intents, and used to develop three consensus-based sediment effect concentrations (SECs) for total PCBs (tPCBs), including a threshold effect concentration, a midrange effect concentration, and an extreme effect concentration. Consensus-based SECs were derived because they estimate the central tendency of the published SQGs and, thus, reconcile the guidance values that have been derived using various approaches. Initially, consensus-based SECs for tPCBs were developed separately for freshwater sediments and for marine and estuarine sediments. Because the respective SECs were statistically similar, the underlying SQGs were subsequently merged and used to formulate more generally applicable SECs. The three consensus-based SECs were then evaluated for reliability using matching sediment chemistry and toxicity data from field studies, dose-response data from spiked-sediment toxicity tests, and SQGs derived from the equilibrium-partitioning approach. The results of this evaluation demonstrated that the consensus-based SECs can accurately predict both the presence and absence of toxicity in field-collected sediments. Importantly, the incidence of toxicity increases incrementally with increasing concentrations of tPCBs. Moreover, the consensus-based SECs are comparable to the chronic toxicity thresholds that have been estimated from dose-response data and equilibrium-partitioning models. 
Therefore, consensus-based SECs provide a unifying synthesis of existing SQGs, reflect causal rather than correlative effects, and accurately predict sediment toxicity in PCB-contaminated sediments.
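A consensus-based effect concentration that "estimates the central tendency of the published SQGs" can be illustrated with a geometric mean, a natural choice for concentration-scale data. This is one plausible reading of "central tendency," not necessarily the paper's exact aggregation rule, and the guideline values in the test are invented for illustration.

```python
import math

def consensus_sec(guideline_values):
    """Geometric mean of published sediment-quality guideline values
    (e.g., total PCBs in mg/kg) as one estimate of their central tendency."""
    logs = [math.log(v) for v in guideline_values]
    return math.exp(sum(logs) / len(logs))
```

Averaging on the log scale keeps a single very high or very low published guideline from dominating the consensus value, which matters when merging guidelines derived by different approaches.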
Rational design of gene-based vaccines.
Barouch, Dan H
2006-01-01
Vaccine development has traditionally been an empirical discipline. Classical vaccine strategies include the development of attenuated organisms, whole killed organisms, and protein subunits, followed by empirical optimization and iterative improvements. While these strategies have been remarkably successful for a wide variety of viruses and bacteria, these approaches have proven more limited for pathogens that require cellular immune responses for their control. In this review, current strategies to develop and optimize gene-based vaccines are described, with an emphasis on novel approaches to improve plasmid DNA vaccines and recombinant adenovirus vector-based vaccines. Copyright 2006 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.
A review of covariate selection for non-experimental comparative effectiveness research.
Sauer, Brian C; Brookhart, M Alan; Roy, Jason; VanderWeele, Tyler
2013-11-01
This paper addresses strategies for selecting variables for adjustment in non-experimental comparative effectiveness research and uses causal graphs to illustrate the causal network that relates treatment to outcome. Variables in the causal network take on multiple structural forms. Adjustment for a common cause pathway between treatment and outcome can remove confounding, whereas adjustment for other structural types may increase bias. For this reason, variable selection would ideally be based on an understanding of the causal network; however, the true causal network is rarely known. Therefore, we describe more practical variable selection approaches based on background knowledge when the causal structure is only partially known. These approaches include adjustment for all observed pretreatment variables thought to have some connection to the outcome, all known risk factors for the outcome, and all direct causes of the treatment or the outcome. Empirical approaches, such as forward and backward selection and automatic high-dimensional proxy adjustment, are also discussed. As there is a continuum between knowing and not knowing the causal, structural relations of variables, we recommend addressing variable selection in a practical way that involves a combination of background knowledge and empirical selection and that uses high-dimensional approaches. This empirical approach can be used to select from a set of a priori variables based on the researcher's knowledge to be included in the final analysis or to identify additional variables for consideration. This more limited use of empirically derived variables may reduce confounding while simultaneously reducing the risk of including variables that may increase bias. Copyright © 2013 John Wiley & Sons, Ltd.
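One of the empirical approaches mentioned above, ranking candidate pretreatment covariates by their marginal association with the outcome, can be sketched as follows. The simulated data and variable roles are assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated pretreatment covariates; columns 0 and 1 are true risk factors
# for the outcome, the remaining columns are noise (all roles illustrative).
n, p = 500, 6
X = rng.normal(size=(n, p))
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=n)

def rank_covariates(X, y):
    """Rank candidate covariates by absolute marginal correlation with the
    outcome, a simple empirical prioritization step."""
    scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    return sorted(range(X.shape[1]), key=lambda j: -scores[j])

order = rank_covariates(X, y)  # the two true risk factors surface first
```

In practice such an empirical ranking would be combined with background knowledge, as the abstract recommends, since a purely marginal screen cannot distinguish confounders from bias-amplifying variables.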
Complex dynamics and empirical evidence (Invited Paper)
NASA Astrophysics Data System (ADS)
Delli Gatti, Domenico; Gaffeo, Edoardo; Giulioni, Gianfranco; Gallegati, Mauro; Kirman, Alan; Palestrini, Antonio; Russo, Alberto
2005-05-01
Standard macroeconomics, based on a reductionist approach centered on the representative agent, is badly equipped to explain the empirical evidence where heterogeneity and industrial dynamics are the rule. In this paper we show that a simple agent-based model of heterogeneous financially fragile agents is able to replicate a large number of scaling type stylized facts with a remarkable degree of statistical precision.
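A minimal sketch of the kind of heterogeneous-agent dynamics described above: firm equity evolves under multiplicative shocks with an exit-and-entry barrier, which generates a strongly right-skewed size distribution. All parameters are illustrative assumptions, not those of the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each firm's equity follows multiplicative shocks; firms hitting the
# bankruptcy barrier are replaced by small entrants (parameters illustrative).
n_firms, n_steps, entrant_size = 1000, 500, 1.0
equity = np.ones(n_firms)

for _ in range(n_steps):
    shocks = rng.normal(loc=0.0, scale=0.1, size=n_firms)
    equity *= 1.0 + shocks
    bankrupt = equity < 0.5 * entrant_size  # exit, replaced by an entrant
    equity[bankrupt] = entrant_size

# The cross-sectional firm-size distribution becomes heavily right-skewed,
# a scaling-type stylized fact of industrial dynamics.
skewness = ((equity - equity.mean()) ** 3).mean() / equity.std() ** 3
```

Even this stripped-down mechanism (Gibrat-style growth plus a lower barrier) produces the fat-tailed cross-section that representative-agent models cannot.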
A discrete element method-based approach to predict the breakage of coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, Varun; Sun, Xin; Xu, Wei
Pulverization is an essential pre-combustion technique employed for solid fuels, such as coal, to reduce particle sizes. Smaller particles ensure rapid and complete combustion, leading to low carbon emissions. Traditionally, the resulting particle size distributions from pulverizers have been determined by empirical or semi-empirical approaches that rely on extensive data gathered over several decades during operations or experiments, with limited predictive capabilities for new coals and processes. Our work presents a Discrete Element Method (DEM)-based computational approach to model coal particle breakage with experimentally characterized coal physical properties. We also examined the effect of select operating parameters on the breakage behavior of coal particles.
A discrete element method-based approach to predict the breakage of coal
Gupta, Varun; Sun, Xin; Xu, Wei; ...
2017-08-05
Classification of Marital Relationships: An Empirical Approach.
ERIC Educational Resources Information Center
Snyder, Douglas K.; Smith, Gregory T.
1986-01-01
Derives an empirically based classification system of marital relationships, employing a multidimensional self-report measure of marital interaction. Spouses' profiles on the Marital Satisfaction Inventory for samples of clinic and nonclinic couples were subjected to cluster analysis, resulting in separate five-group typologies for husbands and…
NASA Astrophysics Data System (ADS)
Tsutsumi, Morito; Seya, Hajime
2009-12-01
This study discusses the theoretical foundation of the application of spatial hedonic approaches—the hedonic approach employing spatial econometrics or/and spatial statistics—to benefits evaluation. The study highlights the limitations of the spatial econometrics approach since it uses a spatial weight matrix that is not employed by the spatial statistics approach. Further, the study presents empirical analyses by applying the Spatial Autoregressive Error Model (SAEM), which is based on the spatial econometrics approach, and the Spatial Process Model (SPM), which is based on the spatial statistics approach. SPMs are conducted based on both isotropy and anisotropy and applied to different mesh sizes. The empirical analysis reveals that the estimated benefits are quite different, especially between isotropic and anisotropic SPM and between isotropic SPM and SAEM; the estimated benefits are similar for SAEM and anisotropic SPM. The study demonstrates that the mesh size does not affect the estimated amount of benefits. Finally, the study provides a confidence interval for the estimated benefits and raises an issue with regard to benefit evaluation.
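The spatial weight matrix that distinguishes the spatial-econometrics approach can be sketched as follows: a row-standardized inverse-distance W with a cutoff, the kind of matrix assumed by models such as the SAEM. The coordinates are illustrative.

```python
import numpy as np

# Coordinates of observation points (illustrative).
coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])

def weight_matrix(coords, cutoff):
    """Row-standardized inverse-distance spatial weight matrix with a
    distance cutoff; rows with no neighbors are left as zeros."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    W = np.where((d > 0) & (d <= cutoff), 1.0 / d, 0.0)
    row_sums = W.sum(axis=1, keepdims=True)
    return np.divide(W, row_sums, out=np.zeros_like(W), where=row_sums > 0)

W = weight_matrix(coords, cutoff=2.0)
```

Note that the isolated fourth point gets an all-zero row; how such cases and the cutoff itself are chosen is exactly the kind of specification decision that the spatial-statistics approach avoids by modeling the covariance directly.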
Vujanovic, Anka A; Meyer, Thomas D; Heads, Angela M; Stotts, Angela L; Villarreal, Yolanda R; Schmitz, Joy M
2017-07-01
The co-occurrence of depression and substance use disorders (SUD) is highly prevalent and associated with poor treatment outcomes for both disorders. As compared to individuals suffering from either disorder alone, individuals with both conditions are likely to endure a more severe and chronic clinical course with worse treatment outcomes. Thus, current practice guidelines recommend treating these co-occurring disorders simultaneously. The overarching aims of this narrative are two-fold: (1) to provide an updated review of the current empirical status of integrated psychotherapy approaches for SUD and depression comorbidity, based on models of traditional cognitive-behavioral therapy (CBT) and newer third-wave CBT approaches, including acceptance- and mindfulness-based interventions and behavioral activation (BA); and (2) to propose a novel theoretical framework for transdiagnostic CBT for SUD-depression, based upon empirically grounded psychological mechanisms underlying this highly prevalent comorbidity. Traditional CBT approaches for the treatment of SUD-depression are well-studied. Despite advances in the development and evaluation of various third-wave psychotherapies, more work needs to be done to evaluate the efficacy of such approaches for SUD-depression. Informed by this summary of the evidence, we propose a transdiagnostic therapy approach that aims to integrate treatment elements found in empirically supported CBT-based interventions for SUD and depression. By targeting shared cognitive-affective processes underlying SUD-depression, transdiagnostic treatment models have the potential to offer a novel clinical approach to treating this difficult-to-treat comorbidity and relevant, co-occurring psychiatric disturbances, such as posttraumatic stress.
Rollover risk prediction of heavy vehicles by reliability index and empirical modelling
NASA Astrophysics Data System (ADS)
Sellami, Yamine; Imine, Hocine; Boubezoul, Abderrahmane; Cadiou, Jean-Charles
2018-03-01
This paper focuses on a combination of a reliability-based approach and an empirical modelling approach for rollover risk assessment of heavy vehicles. A reliability-based warning system is developed to alert the driver to a potential rollover before the vehicle enters a bend. The idea behind the proposed methodology is to estimate the rollover risk as the probability that the vehicle load transfer ratio (LTR) exceeds a critical threshold. Accordingly, a so-called reliability index may be used as a measure to assess the vehicle's safe functioning. In the reliability method, computing the maximum of the LTR requires predicting the vehicle dynamics over the bend, which can in some cases be intractable or time-consuming. To improve the reliability computation time, an empirical model is developed to substitute for the vehicle dynamics and rollover models. This is done using the SVM (Support Vector Machine) algorithm. The preliminary results demonstrate the effectiveness of the proposed approach.
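The probability-of-exceedance idea can be sketched with a Monte Carlo estimate over uncertain vehicle parameters. The LTR formula below is the textbook quasi-static approximation, and all distributions and thresholds are illustrative assumptions, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(42)
g = 9.81  # m/s^2

def load_transfer_ratio(a_y, cg_height, track_width):
    """Quasi-static LTR approximation: 2*h*a_y / (t*g); |LTR| = 1 means
    wheel lift-off. A textbook simplification, not the paper's model."""
    return 2.0 * cg_height * a_y / (track_width * g)

# Uncertain parameters for a heavy vehicle entering a bend (illustrative).
n = 100_000
cg_height = rng.normal(1.8, 0.15, n)    # center-of-gravity height, m
track_width = rng.normal(2.0, 0.05, n)  # m
a_y = rng.normal(3.5, 0.6, n)           # lateral acceleration, m/s^2

ltr = load_transfer_ratio(a_y, cg_height, track_width)
p_rollover = np.mean(np.abs(ltr) > 0.8)  # P(LTR exceeds critical threshold)
```

The SVM surrogate in the paper plays the role of `load_transfer_ratio` here: it replaces a full dynamic simulation so that this exceedance probability can be evaluated quickly enough for an on-board warning system.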
Attachment-Based Family Therapy: A Review of the Empirical Support.
Diamond, Guy; Russon, Jody; Levy, Suzanne
2016-09-01
Attachment-based family therapy (ABFT) is an empirically supported treatment designed to capitalize on the innate, biological desire for meaningful and secure relationships. The therapy is grounded in attachment theory and provides an interpersonal, process-oriented, trauma-focused approach to treating adolescent depression, suicidality, and trauma. Although a process-oriented therapy, ABFT offers a clear structure and road map to help therapists quickly address attachment ruptures that lie at the core of family conflict. Several clinical trials and process studies have demonstrated empirical support for the model and its proposed mechanism of change. This article provides an overview of the clinical model and the existing empirical support for ABFT. © 2016 Family Process Institute.
Nonlinear bulging factor based on R-curve data
NASA Technical Reports Server (NTRS)
Jeong, David Y.; Tong, Pin
1994-01-01
In this paper, a nonlinear bulging factor is derived using a strain energy approach combined with dimensional analysis. The functional form of the bulging factor contains an empirical constant that is determined using R-curve data from unstiffened flat and curved panel tests. The determination of this empirical constant is based on the assumption that the R-curve is the same for both flat and curved panels.
Competency-Based Curriculum Development: A Pragmatic Approach
ERIC Educational Resources Information Center
Broski, David; And Others
1977-01-01
Examines the concept of competency-based education, describes an experience-based model for its development, and discusses some empirically derived rules-of-thumb for its application in allied health. (HD)
Health care information systems and formula-based reimbursement: an empirical study.
Palley, M A; Conger, S
1995-01-01
Current initiatives in health care administration use formula-based approaches to reimbursement. Examples of such approaches include capitation and diagnosis related groups (DRGs). These approaches seek to contain medical costs and to facilitate managerial control over scarce health care resources. This article considers various characteristics of formula-based reimbursement, their operationalization on hospital information systems, and how these relate to hospital compliance costs.
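The formula-based mechanisms mentioned above can be illustrated with a stylized DRG payment calculation. The function, base rate, and weight are hypothetical textbook simplifications, not the article's operationalization.

```python
def drg_payment(base_rate, drg_weight, outlier_adjustment=0.0):
    """Stylized formula-based reimbursement: a fixed hospital base rate
    scaled by the case's DRG relative weight (a textbook simplification)."""
    return base_rate * drg_weight + outlier_adjustment

# A case whose DRG carries a relative weight of 1.25 at a $6,000 base rate.
payment = drg_payment(base_rate=6000.0, drg_weight=1.25)
```

The compliance costs the article discusses arise precisely because the hospital information system must capture, code, and audit the inputs to formulas like this one.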
ERIC Educational Resources Information Center
Chomsky, Noam
2015-01-01
Core concepts of language are highly contested. In some cases this is legitimate: real empirical and conceptual issues arise. In other cases, it seems that controversies are based on misunderstanding. A number of crucial cases are reviewed, and an approach to language is outlined that appears to have strong conceptual and empirical motivation, and…
DOT National Transportation Integrated Search
2011-07-01
Current pavement design based on the AASHTO Design Guide uses an empirical approach from the results of the AASHO Road Test conducted in 1958. To address some of the limitations of the original design guide, AASHTO developed a new guide: Mechanistic ...
Designing Educative Curriculum Materials: A Theoretically and Empirically Driven Process
ERIC Educational Resources Information Center
Davis, Elizabeth A.; Palincsar, Annemarie Sullivan; Arias, Anna Maria; Bismack, Amber Schultz; Marulis, Loren M.; Iwashyna, Stefanie K.
2014-01-01
In this article, the authors argue for a design process in the development of educative curriculum materials that is theoretically and empirically driven. Using a design-based research approach, they describe their design process for incorporating educative features intended to promote teacher learning into existing, high-quality curriculum…
2018-04-01
empirical, external energy-damage correlation methods for evaluating hearing damage risk associated with impulsive noise exposure. AHAAH applies the … is validated against the measured results of human exposures to impulsive sounds, and unlike wholly empirical correlation approaches, AHAAH's … a measured level (LAEQ8 of 85 dB). The approach in MIL-STD-1474E is very different. Previous standards tried to find a correlation between some
2016-09-01
is to fit empirical Beta distributions to observed data, and then to use a randomization approach to make inferences on the difference between … a Ridit analysis on the often sparse data sets in many Flying Qualities applications. The method of this paper is to fit empirical Beta … One such measure is the discrete-probability-distribution version of the (squared) 'Hellinger Distance' (Yang & Le Cam, 2000): H²(P, Q) = 1 − Σᵢ √(pᵢ qᵢ)
Castelnuovo, Gianluca
2010-01-01
The field of research and practice in psychotherapy has been deeply influenced by two different approaches: the empirically supported treatments (ESTs) movement, linked with the evidence-based medicine (EBM) perspective, and the "Common Factors" approach, typically connected with the "Dodo Bird Verdict". Regarding the first perspective, a list of ESTs in the mental health field has been maintained since 1998, with criteria for "well-established" and "probably efficacious" treatments. The development of these paradigms was motivated by the emergence of a "managerial" approach and related remuneration systems for mental health providers and insurance companies. In this article, ESTs are presented, along with some possible criticisms. Finally, complementary approaches that could add different evidence to psychotherapy research, in comparison with the traditional EBM approach, are presented. PMID:21833197
Measuring 'virtue' in medicine.
Kotzee, Ben; Ignatowicz, Agnieszka
2016-06-01
Virtue-approaches to medical ethics are becoming ever more influential. Virtue theorists advocate redefining right or good action in medicine in terms of the character of the doctor performing the action (rather than adherence to rules or principles). In medical education, too, calls are growing to reconceive medical education as a form of character formation (rather than instruction in rules or principles). Empirical studies of doctors' ethics from a virtue-perspective, however, are few and far between. In this respect, theoretical and empirical study of medical ethics are out of alignment. In this paper, we survey the empirical study of medical ethics and find that most studies of doctors' ethics are rules- or principles-based and not virtue-based. We outline the challenges that exist for studying medical ethics empirically from a virtue-based perspective and canvas the runners and riders in the effort to find virtue-based assessments of medical ethics.
Kaleem, Muhammad; Gurve, Dharmendra; Guergachi, Aziz; Krishnan, Sridhar
2018-06-25
The objective of the work described in this paper is the development of a computationally efficient methodology for patient-specific automatic seizure detection in long-term multi-channel EEG recordings. Approach: A novel patient-specific seizure detection approach based on a signal-derived Empirical Mode Decomposition (EMD) dictionary is proposed. For this purpose, we use an empirical framework for EMD-based dictionary creation and learning, inspired by traditional dictionary learning methods, in which the EMD-based dictionary is learned from the multi-channel EEG data being analyzed for automatic seizure detection. We present the algorithm for dictionary creation and learning, whose purpose is to learn dictionaries with a small number of atoms. Using training signals belonging to seizure and non-seizure classes, an initial dictionary, termed the raw dictionary, is formed. The atoms of the raw dictionary are composed of intrinsic mode functions obtained after decomposition of the training signals using the empirical mode decomposition algorithm. The raw dictionary is then trained using a learning algorithm, resulting in a substantial decrease in the number of atoms in the trained dictionary. The trained dictionary is then used for automatic seizure detection, such that coefficients of orthogonal projections of test signals against the trained dictionary form the features used for classification of test signals into seizure and non-seizure classes. Thus no hand-engineered features have to be extracted from the data as in traditional seizure detection approaches. Main results: The performance of the proposed approach is validated using the CHB-MIT benchmark database, and averaged accuracy, sensitivity and specificity values of 92.9%, 94.3% and 91.5%, respectively, are obtained using a support vector machine classifier and five-fold cross-validation.
These results are compared with other approaches using the same database, and the suitability of the approach for seizure detection in long-term multi-channel EEG recordings is discussed. Significance: The proposed approach describes a computationally efficient method for automatic seizure detection in long-term multi-channel EEG recordings. The method does not rely on hand-engineered features, as are required in traditional approaches. Furthermore, the approach is suitable for scenarios where the dictionary once formed and trained can be used for automatic seizure detection of newly recorded data, making the approach suitable for long-term multi-channel EEG recordings. © 2018 IOP Publishing Ltd.
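The projection-coefficient feature extraction described above can be sketched with plain linear algebra. Here the dictionary atoms are stand-in sinusoids rather than EMD-derived intrinsic mode functions, and all sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in "dictionary" of atoms; in the paper these are intrinsic mode
# functions from EMD of training EEG, here simply sinusoids for illustration.
sig_len, n_atoms = 256, 8
t = np.linspace(0.0, 1.0, sig_len)
atoms = np.stack([np.sin(2 * np.pi * (k + 1) * t) for k in range(n_atoms)])

def projection_features(signal, atoms):
    """Least-squares projection coefficients of a signal onto the dictionary
    atoms; these coefficients serve as the classification features."""
    coeffs, *_ = np.linalg.lstsq(atoms.T, signal, rcond=None)
    return coeffs

# A noisy test signal dominated by atom 2 yields its largest coefficient there.
signal = 3.0 * atoms[2] + 0.1 * rng.normal(size=sig_len)
feats = projection_features(signal, atoms)
```

A classifier (the paper uses an SVM) then operates on `feats` instead of hand-engineered features, which is the point of the dictionary approach.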
A four stage approach for ontology-based health information system design.
Kuziemsky, Craig E; Lau, Francis
2010-11-01
To describe and illustrate a four stage methodological approach to capture user knowledge in a biomedical domain area, use that knowledge to design an ontology, and then implement and evaluate the ontology as a health information system (HIS). A hybrid participatory design-grounded theory (GT-PD) method was used to obtain data and code them for ontology development. Prototyping was used to implement the ontology as a computer-based tool. Usability testing evaluated the computer-based tool. An empirically derived domain ontology and set of three problem-solving approaches were developed as a formalized model of the concepts and categories from the GT coding. The ontology and problem-solving approaches were used to design and implement a HIS that tested favorably in usability testing. The four stage approach illustrated in this paper is useful for designing and implementing an ontology as the basis for a HIS. The approach extends existing ontology development methodologies by providing an empirical basis for theory incorporated into ontology design. Copyright © 2010 Elsevier B.V. All rights reserved.
The Utility of the Pattern of the Strengths and Weaknesses Approach
ERIC Educational Resources Information Center
Fiorello, Catherine A.; Flanagan, Dawn P.; Hale, James B.
2014-01-01
Unlike ability-achievement discrepancy and response-to-intervention approaches, the processing strengths and weaknesses (PSW) approach is the only empirically based approach that attempts to identify the pattern of deficit in the basic psychological processes that interferes with academic achievement for children with specific learning…
The Foundations of Accent and Intelligibility in Pronunciation Research
ERIC Educational Resources Information Center
Munro, Murray J.; Derwing, Tracey M.
2011-01-01
Our goal in developing this timeline was to trace the empirical bases of current approaches to L2 pronunciation teaching, with particular attention to the concepts of "accent" and "intelligibility". The process of identifying suitable works for inclusion challenged us in several ways. First, the number of empirical studies of pronunciation…
ERIC Educational Resources Information Center
Wilkins, Victoria; Chambliss, Catherine
When training counseling students, it is important to familiarize them with the clinical research literature exploring the efficacy of particular treatments. The bulk of the document is comprised of a review of empirically supported treatments (ESTs). ESTs or evidence-based treatments are grounded in studies recommended by the American…
The Theoretical and Empirical Basis for Meditation as an Intervention for PTSD
ERIC Educational Resources Information Center
Lang, Ariel J.; Strauss, Jennifer L.; Bomyea, Jessica; Bormann, Jill E.; Hickman, Steven D.; Good, Raquel C.; Essex, Michael
2012-01-01
In spite of the existence of good empirically supported treatments for posttraumatic stress disorder (PTSD), consumers and providers continue to ask for more options for managing this common and often chronic condition. Meditation-based approaches are being widely implemented, but there is minimal research rigorously assessing their effectiveness.…
ERIC Educational Resources Information Center
Kratochwill, Thomas R.; Stoiber, Karen Callan
2000-01-01
Developmental psychopathology and principles advance in Hughes' target article can be useful to promote development, evaluation, and application of empirically supported interventions (ESIs), but embracing a pathological framework is extremely limited given the diversity in theoretical approaches relevant to school-based ESIs. Argues that in order…
Using Loss Functions for DIF Detection: An Empirical Bayes Approach.
ERIC Educational Resources Information Center
Zwick, Rebecca; Thayer, Dorothy; Lewis, Charles
2000-01-01
Studied a method for flagging differential item functioning (DIF) based on loss functions. Builds on earlier research that led to the development of an empirical Bayes enhancement to the Mantel-Haenszel DIF analysis. Tested the method through simulation and found its performance better than some commonly used DIF classification systems. (SLD)
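The empirical Bayes enhancement referenced above amounts to shrinking each item's Mantel-Haenszel DIF estimate toward a prior mean, with more shrinkage for noisier estimates. The sketch below assumes hypothetical D-DIF values, standard errors, and prior variance; it illustrates the shrinkage mechanism, not the study's exact model.

```python
import numpy as np

# Hypothetical Mantel-Haenszel D-DIF estimates and standard errors for five
# items (values illustrative, not from the study).
d_dif = np.array([-1.8, -0.4, 0.1, 0.6, 2.1])
se = np.array([0.9, 0.3, 0.2, 0.4, 1.0])

def eb_shrink(d, se, prior_mean=0.0, prior_var=0.5):
    """Empirical Bayes posterior means: each estimate is pulled toward the
    prior mean, more strongly when its sampling variance is large."""
    w = prior_var / (prior_var + se ** 2)
    return w * d + (1.0 - w) * prior_mean

posterior = eb_shrink(d_dif, se)
```

Flagging decisions (or loss functions) can then be applied to the stabilized posterior values rather than the noisy raw estimates, which is what improves classification over threshold rules applied directly.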
Educational Leadership in a Competitive State: A Contradiction in Terms?
ERIC Educational Resources Information Center
Moos, Lejf
2012-01-01
Purpose: The purpose of this paper is to explore how important the choice of theoretical perspective is on the analyses of empirical data from a Danish case study. Design/methodology/approach: The empirical bases for the analyses are qualitative, longitudinal case studies of school leadership in the International Successful School Principalship…
Aguilar-Guisado, Manuela; Martín-Peña, Almudena; Espigado, Ildefonso; Ruiz Pérez de Pipaon, Maite; Falantes, José; de la Cruz, Fátima; Cisneros, José M.
2012-01-01
Background Giving antifungal therapy exclusively to selected patients with persistent febrile neutropenia may avoid over-treatment without increasing mortality. The aim of this study was to validate an innovative diagnostic and therapeutic approach based on assessing patients’ risk profile and clinical criteria in order to select those patients requiring antifungal therapy. The efficacy of this approach was compared to that of universal empirical antifungal therapy. Design and Methods This was a prospective study which included all consecutive adult hematology patients with neutropenia and fever refractory to 5 days of empirical antibacterial therapy admitted to a teaching hospital in Spain over a 2-year period. A diagnostic and therapeutic approach based on clinical criteria and risk profile was applied in order to select patients for antifungal therapy. The sensitivity, specificity and negative predictive value of this approach and also the overall success rate, according to the same criteria of efficacy described in classical clinical trials, were analyzed. Results Eighty-five episodes were included, 35 of them (41.2%) in patients at high risk of invasive fungal infections. Antifungal therapy was not indicated in 33 episodes (38.8%). The overall incidence of proven and probable invasive fungal infections was 14.1%, all of which occurred in patients who had received empirical antifungal therapy. The 30-day crude mortality rate was 15.3% and the invasive fungal infection-related mortality rate was 2.8% (2/72). The overall success rate following the diagnostic and therapeutic approach was 36.5% compared with 33.9% and 33.7% obtained in the trial by Walsh et al. The sensitivity, specificity and negative predictive value of the study approach were 100%, 52.4% and 100%, respectively. 
Conclusions Based on the high negative predictive value of this diagnostic and therapeutic approach in persistent febrile neutropenia patients with hematologic malignancies or patients who have received a hematopoietic stem cell transplant, the approach is useful for identifying patients who are not likely to develop invasive fungal infection and do not, therefore, require antifungal therapy. The effectiveness of the strategy is similar to that of universal empirical antifungal therapy reported in controlled trials. PMID:22058202
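The diagnostic measures reported above follow directly from a 2x2 decision table. The counts below are hypothetical, chosen only to reproduce percentages like those reported; the actual table is not given in the abstract.

```python
def diagnostic_measures(tp, fp, fn, tn):
    """Sensitivity, specificity and negative predictive value from the
    counts of a 2x2 decision table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, npv

# With zero false negatives (fn == 0), sensitivity and NPV are both 100%
# regardless of the other counts (counts below are illustrative).
sens, spec, npv = diagnostic_measures(tp=12, fp=30, fn=0, tn=33)
```

This makes the abstract's logic explicit: a 100% NPV means no patient the approach withheld antifungal therapy from went on to develop an invasive fungal infection.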
Raible, C; Leidl, R
2004-11-01
The German hospital market faces an extensive process of consolidation. Amid this change, hospitals consider cooperation as one way to improve competitiveness. The objective of this study was to investigate explanations of changes in the German hospital market using theoretical approaches from cooperation research. The aims and mechanisms of the theories, their relevance in terms of content, and their potential for empirical testing were used as criteria to assess the approaches, with current and future trends in the German hospital market providing the framework. Based on a literature review, six theoretical approaches were investigated: industrial organization, transaction cost theory, game theory, resource dependency, institutional theory, and co-operative investment and finance theory. In addition, the data needed to empirically test the theories were specified. As a general problem, some of the theoretical approaches presuppose a perfect market. This precondition is not met by the heavily regulated German hospital market. Given the current regulations and the assessment criteria, the industrial organization, resource-dependency and institutional theory approaches showed the highest potential to explain various aspects of the changes in the hospital market. So far, none of the approaches investigated provides a comprehensive and empirically tested explanation of the changes in the German hospital market. However, some of the approaches provide a theoretical background for part of the changes. As this dynamic market is economically of high significance, there is a need for further development and empirical testing of relevant theoretical approaches.
Hypnosis and the treatment of posttraumatic conditions: an evidence-based approach.
Lynn, Steven Jay; Cardeña, Etzel
2007-04-01
This article reviews the evidence for the use of hypnosis in the treatment of posttraumatic conditions including posttraumatic stress disorder and acute stress disorder. The review focuses on empirically supported principles and practices and suggests that hypnosis can be a useful adjunctive procedure in the treatment of posttraumatic conditions. Cognitive-behavioral and exposure-based interventions, which have the greatest empirical support, are highlighted, and an illustrative case study is presented.
Martins Pereira, Sandra; de Sá Brandão, Patrícia Joana; Araújo, Joana; Carvalho, Ana Sofia
2017-01-01
Introduction Antimicrobial resistance (AMR) is a challenging global and public health issue, raising bioethical challenges, considerations and strategies. Objectives This research protocol presents a conceptual model leading to formulating an empirically based bioethics framework for antibiotic use, AMR and designing ethically robust strategies to protect human health. Methods Mixed methods research will be used and operationalized into five substudies. The bioethical framework will encompass and integrate two theoretical models: global bioethics and ethical decision-making. Results Being a study protocol, this article reports on planned and ongoing research. Conclusions Based on data collection, future findings and using a comprehensive, integrative, evidence-based approach, a step-by-step bioethical framework will be developed for (i) responsible use of antibiotics in healthcare and (ii) design of strategies to decrease AMR. This will entail the analysis and interpretation of approaches from several bioethical theories, including deontological and consequentialist approaches, and the implications of uncertainty to these approaches. PMID:28459355
Marto, Aminaton; Jahed Armaghani, Danial; Tonnizam Mohamad, Edy; Makhtar, Ahmad Mahir
2014-01-01
Flyrock is one of the major disturbances induced by blasting which may cause severe damage to nearby structures. This phenomenon has to be precisely predicted and subsequently controlled through the changing in the blast design to minimize potential risk of blasting. The scope of this study is to predict flyrock induced by blasting through a novel approach based on the combination of imperialist competitive algorithm (ICA) and artificial neural network (ANN). For this purpose, the parameters of 113 blasting operations were accurately recorded and flyrock distances were measured for each operation. By applying the sensitivity analysis, maximum charge per delay and powder factor were determined as the most influential parameters on flyrock. In the light of this analysis, two new empirical predictors were developed to predict flyrock distance. For a comparison purpose, a predeveloped backpropagation (BP) ANN was developed and the results were compared with those of the proposed ICA-ANN model and empirical predictors. The results clearly showed the superiority of the proposed ICA-ANN model in comparison with the proposed BP-ANN model and empirical approaches. PMID:25147856
NASA Astrophysics Data System (ADS)
Dobronets, Boris S.; Popova, Olga A.
2018-05-01
The paper considers a new approach to regression modeling that uses aggregated data presented in the form of density functions. Approaches to improving the reliability of the aggregation of empirical data are considered: improving accuracy and estimating errors. We discuss data aggregation procedures as a preprocessing stage for subsequent regression modeling. An important feature of the study is a demonstration of how to represent the aggregated data. It is proposed to use piecewise polynomial models, including spline aggregate functions. We show that the proposed data aggregation can be interpreted as a frequency distribution, whose properties are studied using the density function concept. Various types of mathematical models of data aggregation are discussed. For the construction of regression models, it is proposed to use data representation procedures based on piecewise polynomial models. New approaches to modeling functional dependencies based on spline aggregations are proposed.
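A rough numpy sketch of the aggregation idea described here: raw data are reduced to a frequency distribution (a normalized histogram as a density estimate) and then represented by piecewise polynomials. The two-piece cubic below is a simplified stand-in for the spline aggregate functions; all numbers are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
sample = rng.normal(0.0, 1.0, 20000)  # raw empirical data to be aggregated

# Aggregate the data into a frequency distribution (normalized histogram = density).
counts, edges = np.histogram(sample, bins=40, range=(-4, 4), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Represent the aggregated density with piecewise polynomials: one cubic per half-axis.
pieces = []
for mask in (centers <= 0, centers > 0):
    pieces.append(np.polyfit(centers[mask], counts[mask], deg=3))

def density(x):
    """Evaluate the piecewise polynomial model of the aggregated density."""
    p = pieces[0] if x <= 0 else pieces[1]
    return np.polyval(p, x)

# The piecewise model approximates the standard normal density near the mode.
print(round(density(0.0), 2))
```

A regression model would then be built on such density representations rather than on the raw observations.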
NASA Astrophysics Data System (ADS)
Buchecker, M.; Menzel, S.; Home, R.
2013-06-01
Recent literature suggests that dialogic forms of risk communication are more effective in building stakeholders' hazard-related social capacities. In spite of the high theoretical expectations, there is a lack of univocal empirical evidence on the relevance of these effects, mainly due to the methodological limitations of existing evaluation approaches. In our paper we aim to elicit the contribution of participatory river revitalisation projects to stakeholders' social capacity building by triangulating the findings of three evaluation studies based on different approaches: a field-experimental approach, a qualitative long-term ex-post approach, and a cross-sectional household survey. The results revealed that social learning and avoiding the loss of trust were more relevant benefits of participatory flood management than acceptance building. The results suggest that stakeholder involvement should be more explicitly designed as a tool for long-term social learning.
Evidence-based Nursing Education - a Systematic Review of Empirical Research
Reiber, Karin
2011-01-01
The project „Evidence-based Nursing Education – Preparatory Stage“, funded by the Landesstiftung Baden-Württemberg within the programme Impulsfinanzierung Forschung (Funding to Stimulate Research), aims to collect information on current research concerned with nursing education and to process existing data. The results of empirical research which has already been carried out were systematically evaluated with the aim of identifying further topics, fields and matters of interest for empirical research in nursing education. In the course of the project, the available empirical studies on nursing education were scientifically analysed and systematised. The over-arching aim of the evidence-based training approach – which extends beyond the aims of this project – is the conception, organisation and evaluation of vocational training and educational processes in the caring professions on the basis of empirical data. The following contribution first provides a systematic, theoretical link to the over-arching reference framework, as the evidence-based approach is adapted from thematically related specialist fields. The research design of the project is oriented towards criteria introduced from a selection of studies and carries out a two-stage systematic review of the selected studies. As a result, the current status of research in nursing education, as well as its organisation and structure, and questions relating to specialist training and comparative education are introduced and discussed. Finally, the empirical research on nursing training is critically appraised as a complementary element in educational theory/psychology of learning and in the ethical tradition of research. This contribution aims, on the one hand, to derive and describe the methods used, and to introduce the steps followed in gathering and evaluating the data. On the other hand, it is intended to give a systematic overview of empirical research work in nursing education.
In order to preserve a holistic view of the research field and methods, detailed individual findings are not included. PMID:21818237
Consumer trust in food safety--a multidisciplinary approach and empirical evidence from Taiwan.
Chen, Mei-Fang
2008-12-01
Food scandals in recent years have increased consumers' risk perceptions of foods and decreased their trust in food safety. A better understanding of consumer trust in food safety can improve the effectiveness of public policy and allow the development of best practice in risk communication. This study proposes a research framework from a psychometric approach to investigate the relationships between consumer trust in food safety and the antecedents of risk perceptions of foods, based on a reflexive modernization perspective and a cultural theory perspective, in the hope of benefiting future empirical study. The empirical results from a structural equation modeling analysis of Taiwan as a case in point reveal that this research framework based on a multidisciplinary perspective can be a valuable tool for a growing understanding of consumer trust in food safety. The antecedents in the psychometric research framework, comprising reflexive modernization factors and cultural theory factors, have all been supported in this study except the consumer's perception of pessimism toward food. Moreover, the empirical results of repeated measures analysis of variance give more detailed information for grasping empirical implications and provide some suggestions to the actors and institutions involved in the food supply chain in Taiwan.
Classroom EFL Writing: The Alignment-Oriented Approach
ERIC Educational Resources Information Center
Haiyan, Miao; Rilong, Liu
2016-01-01
This paper outlines the alignment-oriented approach in classroom EFL writing. Based on a review of the characteristics of the written language and comparison between the product-focused approach and the process-focused approach, the paper proposes a practical classroom procedure as to how to teach EFL writing. A follow-up empirical study is…
ERIC Educational Resources Information Center
Houser, Bonnie L.
2017-01-01
There are relatively few empirical studies that examine whether using a competency-based education (CBE) approach results in increased student learning or achievement when compared to traditional education approaches. This study uses a quantitative research methodology, a nonexperimental comparative descriptive research design, and a two-group…
ERIC Educational Resources Information Center
Wilson, Amanda; Hainey, Thomas; Connolly, Thomas M.
2013-01-01
Newer approaches such as games-based learning (GBL) and games-based construction are being adopted to motivate and engage students within the Curriculum for Excellence (CfE) in Scotland. GBL and games-based construction suffer from a dearth of empirical evidence supporting their validity as teaching and learning approaches. To address this issue…
2012-01-01
Background An important question in the analysis of biochemical data is that of identifying subsets of molecular variables that may jointly influence a biological response. Statistical variable selection methods have been widely used for this purpose. In many settings, it may be important to incorporate ancillary biological information concerning the variables of interest. Pathway and network maps are one example of a source of such information. However, although ancillary information is increasingly available, it is not always clear how it should be used nor how it should be weighted in relation to primary data. Results We put forward an approach in which biological knowledge is incorporated using informative prior distributions over variable subsets, with prior information selected and weighted in an automated, objective manner using an empirical Bayes formulation. We employ continuous, linear models with interaction terms and exploit biochemically-motivated sparsity constraints to permit exact inference. We show an example of priors for pathway- and network-based information and illustrate our proposed method on both synthetic response data and by an application to cancer drug response data. Comparisons are also made to alternative Bayesian and frequentist penalised-likelihood methods for incorporating network-based information. Conclusions The empirical Bayes method proposed here can aid prior elicitation for Bayesian variable selection studies and help to guard against mis-specification of priors. Empirical Bayes, together with the proposed pathway-based priors, results in an approach with a competitive variable selection performance. In addition, the overall procedure is fast, deterministic, and has very few user-set parameters, yet is capable of capturing interplay between molecular players. The approach presented is general and readily applicable in any setting with multiple sources of biological prior knowledge. PMID:22578440
Increasing Functional Communication in Non-Speaking Preschool Children: Comparison of PECS and VOCA
ERIC Educational Resources Information Center
Bock, Stacey Jones; Stoner, Julia B.; Beck, Ann R.; Hanley, Laurie; Prochnow, Jessica
2005-01-01
For individuals who have complex communication needs and for the interventionists who work with them, the collection of empirically derived data that support the use of an intervention approach is critical. The purposes of this study were to continue building an empirically derived base of support for, and to compare the relative effectiveness of…
ERIC Educational Resources Information Center
Ghahramanlou-Holloway, Marjan; Cox, Daniel W.; Greene, Farrah N.
2012-01-01
To date, no empirically based inpatient intervention for individuals who have attempted suicide exists. We present an overview of a novel psychotherapeutic approach, Post-Admission Cognitive Therapy (PACT), currently under development and empirical testing for inpatients who have been admitted for a recent suicide attempt. PACT is adapted from an…
ERIC Educational Resources Information Center
Teixeira, Pedro Nuno; Rocha, Vera; Biscaia, Ricardo; Cardoso, Margarida Fonseca
2012-01-01
The expansion of higher education systems has often been associated with the need for increasing diversification, namely at the program level, based on the pressures to adapt more general programmes to a more diverse student population and multiple regional, social, and economic needs. This paper explores empirically the question of programme…
Mission Operations Planning with Preferences: An Empirical Study
NASA Technical Reports Server (NTRS)
Bresina, John L.; Khatib, Lina; McGann, Conor
2006-01-01
This paper presents an empirical study of some non-exhaustive approaches to optimizing preferences within the context of constraint-based, mixed-initiative planning for mission operations. This work is motivated by the experience of deploying and operating the MAPGEN (Mixed-initiative Activity Plan GENerator) system for the Mars Exploration Rover Mission. Responsiveness to the user is one of the important requirements for MAPGEN; hence, the additional computation time needed to optimize preferences must be kept within reasonable bounds. This was the primary motivation for studying non-exhaustive optimization approaches. The specific goals of the empirical study are to assess the impact on solution quality of two greedy heuristics used in MAPGEN and to assess the improvement gained by applying a linear programming optimization technique to the final solution.
Mitchell, John T.; Zylowska, Lidia; Kollins, Scott H.
2015-01-01
Research examining nonpharmacological interventions for adults diagnosed with attention-deficit/hyperactivity disorder (ADHD) has expanded in recent years and provides patients with more treatment options. Mindfulness-based training is an example of an intervention that is gaining promising preliminary empirical support and is increasingly administered in clinical settings. The aim of this review is to provide a rationale for the application of mindfulness to individuals diagnosed with ADHD, describe the current state of the empirical basis for mindfulness training in ADHD, and summarize a treatment approach specific to adults diagnosed with ADHD: the Mindful Awareness Practices (MAPs) for ADHD Program. Two case study examples are provided to demonstrate relevant clinical issues for practitioners interested in this approach. Directions for future research, including mindfulness meditation as a standalone treatment and as a complementary approach to cognitive-behavioral therapy, are provided. PMID:25908900
Ethnomathematics in Perspective of Sundanese Culture
ERIC Educational Resources Information Center
Abdullah, Atje Setiawan
2017-01-01
This study is an exploratory research aims to find and know about a phenomenon by exploration. Therefore, the approach used in this study is ethnographic approach, an empirical and theoretical approach to get description and deep analysis about a culture based on field study. From the sustainable interviews and confirmation about field research…
Alegre-Cortés, J; Soto-Sánchez, C; Pizá, Á G; Albarracín, A L; Farfán, F D; Felice, C J; Fernández, E
2016-07-15
Linear analysis has classically provided powerful tools for understanding the behavior of neural populations, but neuron responses to real-world stimulation are nonlinear under some conditions, and many neuronal components demonstrate strong nonlinear behavior. In spite of this, the temporal and frequency dynamics of neural populations under sensory stimulation have usually been analyzed with linear approaches. In this paper, we propose the use of Noise-Assisted Multivariate Empirical Mode Decomposition (NA-MEMD), a data-driven, template-free algorithm, plus the Hilbert transform as a suitable tool for analyzing population oscillatory dynamics in a multi-dimensional space with instantaneous frequency (IF) resolution. The proposed approach was able to extract oscillatory information from neurophysiological data of deep vibrissal nerve and visual cortex multiunit recordings that was not evidenced using linear approaches with fixed bases such as Fourier analysis. Texture discrimination analysis performance increased when NA-MEMD plus the Hilbert transform was implemented, compared to linear techniques, and cortical oscillatory population activity was analyzed with precise time-frequency resolution. NA-MEMD plus the Hilbert transform is thus an improved method for analyzing neuronal population oscillatory dynamics, overcoming the linear and stationary assumptions of classical methods. Copyright © 2016 Elsevier B.V. All rights reserved.
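The Hilbert-transform stage of such a pipeline can be sketched in plain numpy: build the analytic signal in the frequency domain, then read instantaneous frequency off the unwrapped phase. A synthetic 12 Hz oscillation stands in for a decomposed neural mode; this illustrates only the Hilbert step, not NA-MEMD itself:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the standard frequency-domain Hilbert construction."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# A 12 Hz test oscillation, standing in for one decomposed mode of a neural recording.
fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
imf = np.sin(2 * np.pi * 12.0 * t)

z = analytic_signal(imf)
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency in Hz

print(round(np.median(inst_freq), 1))  # ≈ 12.0
```

In the multivariate setting, the same phase analysis would be applied to each NA-MEMD mode of each channel.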
ERIC Educational Resources Information Center
Harvey, Marina; Ambler, Trudy; Cahir, Jayde
2017-01-01
Anecdotal and empirical evidence indicates that mentoring can be a successful strategy for supporting professional learning, yet limited literature exists on approaches to mentoring designed specifically for academics working in higher education. The aim of this study was to create an approach to mentoring tailored to the needs of academics and…
A Computer-Based Game That Promotes Mathematics Learning More than a Conventional Approach
ERIC Educational Resources Information Center
McLaren, Bruce M.; Adams, Deanne M.; Mayer, Richard E.; Forlizzi, Jodi
2017-01-01
Excitement about learning from computer-based games has been palpable in recent years and has led to the development of many educational games. However, there are relatively few sound empirical studies in the scientific literature that have shown the benefits of learning mathematics from games as opposed to more traditional approaches. The…
Charting Collective Knowledge: Supporting Self-Regulated Learning in the Workplace
ERIC Educational Resources Information Center
Littlejohn, Allison; Milligan, Colin; Margaryan, Anoush
2012-01-01
Purpose: This study aims to outline an approach to improving the effectiveness of work-based learning through knowledge creation and enhancing self-regulated learning. The paper presents a case example of a novel approach to learning through knowledge creation in the workplace. This case example is based on empirical data collected through a study…
An Acceptance and Mindfulness-Based Approach to Social Phobia: A Case Study
ERIC Educational Resources Information Center
Brady, Victoria Popick; Whitman, Sarah M.
2012-01-01
Over the past few years, there has been a proliferation of theoretical discussions and empirical research on the use of acceptance and mindfulness-based therapies to treat anxiety disorders. Because these treatment approaches are in their infancy, many clinicians may still be uncertain about how to apply such treatments in their work with clients.…
Gynecological surgery from the Hippocratics to the fall of the Roman Empire.
Bliquez, Lawrence J
2010-01-01
The article aims to explore advances in Greco-Roman gynecological surgery, with particular emphasis on the Roman Empire. The development and improvement of the Roman surgical instrumentarium occurred in tandem with surgical advances, gynecological as well as general. It might therefore be said that the approach taken in this paper is one based on material culture.
ERIC Educational Resources Information Center
Huerta, Juan Carlos; Sperry, Rita
2013-01-01
This article outlines a systematic and manageable method for learning community program assessment based on collecting empirical direct measures of student learning. Developed at Texas A&M University--Corpus Christi where all full-time, first-year students are in learning communities, the approach ties integrative assignment design to a rubric…
Literacy and science: each in the service of the other.
Pearson, P David; Moje, Elizabeth; Greenleaf, Cynthia
2010-04-23
We use conceptual and empirical lenses to examine synergies between inquiry science and literacy teaching and learning in the K-12 (kindergarten through high school) curriculum. We address two questions: (i) how can reading and writing be used as tools to support inquiry-based science, and (ii) how do reading and writing benefit when embedded in an inquiry-based science setting? After elaborating the theoretical and empirical support for integrated approaches, we discuss how to support their implementation in today's complicated curricular landscape.
Ultrasonic nondestructive evaluation, microstructure, and mechanical property interrelations
NASA Technical Reports Server (NTRS)
Vary, A.
1984-01-01
Ultrasonic techniques for mechanical property characterizations are reviewed and conceptual models are advanced for explaining and interpreting the empirically based results. At present, the technology is generally empirically based and is emerging from the research laboratory. Advancement of the technology will require establishment of theoretical foundations for the experimentally observed interrelations among ultrasonic measurements, mechanical properties, and microstructure. Conceptual models are applied to ultrasonic assessment of fracture toughness to illustrate an approach for predicting correlations found among ultrasonic measurements, microstructure, and mechanical properties.
ERIC Educational Resources Information Center
Martino, Steve; Gallon, Steve; Ball, Samuel A.; Carroll, Kathleen M.
2007-01-01
A clinical trials training approach to supervision is a promising and empirically supported method for preparing addiction counselors to implement evidence-based behavioral treatments in community treatment programs. This supervision approach has three main components: (1) direct observation of treatment sessions; (2) structured performance…
Stylistics in Teacher Training: Research Programs and Future Prospects
ERIC Educational Resources Information Center
Ventura, Ana Clara
2016-01-01
The aim of this research is to analyse and systematize the conceptual and empirical bases of the available literature on research approaches, objects of study, and future prospects in the field of stylistics, in order to encourage best practice in teacher training. Three research approaches are presented: the empiricist-behaviorist approach, the…
Comparing and Contrasting Consensus versus Empirical Domains
Jason, Leonard A.; Kot, Bobby; Sunnquist, Madison; Brown, Abigail; Reed, Jordan; Furst, Jacob; Newton, Julia L.; Strand, Elin Bolle; Vernon, Suzanne D.
2015-01-01
Background Since the publication of the CFS case definition [1], a number of other criteria have been proposed, including the Canadian Consensus Criteria [2] and the Myalgic Encephalomyelitis: International Consensus Criteria [3]. Purpose The current study compared the domains developed through consensus methods to those obtained through more empirical approaches using factor analysis. Methods Using data mining, we compared and contrasted fundamental features of consensus-based criteria versus empirical latent factors. In general, both approaches identified the Fatigue/Post-exertional malaise domain as best differentiating patients from controls. Results Findings indicated that the Fukuda et al. criteria had the worst sensitivity and specificity. Conclusions These outcomes might help both theorists and researchers better determine which fundamental domains should be used for the case definition. PMID:26977374
A discrete element method-based approach to predict the breakage of coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, Varun; Sun, Xin; Xu, Wei
Pulverization is an essential pre-combustion technique employed for solid fuels, such as coal, to reduce particle sizes. Smaller particles ensure rapid and complete combustion, leading to low carbon emissions. Traditionally, the resulting particle size distributions from pulverizers have been informed by empirical or semi-empirical approaches that rely on extensive data gathered over several decades during operations or experiments. However, the predictive capabilities for new coals and processes are limited. This work presents a Discrete Element Method based computational framework to predict the particle size distribution resulting from the breakage of coal particles, characterized by the coal's physical properties. The effect of certain operating parameters on the breakage behavior of coal particles is also examined.
Understanding similarity of groundwater systems with empirical copulas
NASA Astrophysics Data System (ADS)
Haaf, Ezra; Kumar, Rohini; Samaniego, Luis; Barthel, Roland
2016-04-01
Within the classification framework for groundwater systems that aims at identifying similarity of hydrogeological systems and transferring information from a well-observed to an ungauged system (Haaf and Barthel, 2015; Haaf and Barthel, 2016), we propose a copula-based method for describing groundwater-system similarity. Copulas are an emerging method in the hydrological sciences that make it possible to model the dependence structure of two groundwater-level time series independently of the effects of their marginal distributions. This study is based on Samaniego et al. (2010), which described an approach for calculating dissimilarity measures from bivariate empirical copula densities of streamflow time series; streamflow is subsequently predicted in ungauged basins by transferring properties from similar catchments. The proposed approach is innovative because copula-based similarity has not yet been applied to groundwater systems. Here we estimate the pairwise dependence structure of 600 wells in Southern Germany using 10 years of weekly groundwater level observations. Based on these empirical copulas, dissimilarity measures are estimated, such as the copula's lower- and upper-corner cumulated probability and copula-based Spearman's rank correlation, as proposed by Samaniego et al. (2010). For the characterization of groundwater systems, copula-based metrics are compared with dissimilarities obtained from precipitation signals corresponding to the presumed area of influence of each groundwater well. This promising approach provides a new tool for advancing similarity-based classification of groundwater system dynamics. Haaf, E., Barthel, R., 2015. Methods for assessing hydrogeological similarity and for classification of groundwater systems on the regional scale, EGU General Assembly 2015, Vienna, Austria. Haaf, E., Barthel, R., 2016. An approach for classification of hydrogeological systems at the regional scale based on groundwater hydrographs, EGU General Assembly 2016, Vienna, Austria. Samaniego, L., Bardossy, A., Kumar, R., 2010. Streamflow prediction in ungauged catchments using copula-based dissimilarity measures. Water Resources Research, 46. DOI:10.1029/2008wr007695
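A minimal sketch of the copula idea behind this abstract: rank-transform paired series into pseudo-observations (removing the marginals) and compare a lower-corner probability for a dependent versus an independent well pair. The data are synthetic and the study's actual dissimilarity measures are richer than this:

```python
import numpy as np

def pseudo_observations(x):
    """Rank-transform a series to (0,1): the marginal distributions drop out."""
    n = len(x)
    return (np.argsort(np.argsort(x)) + 1) / (n + 1)

def lower_corner(u, v, q=0.2):
    """Empirical copula value C(q, q): fraction of weeks with both ranks below q."""
    return np.mean((u <= q) & (v <= q))

rng = np.random.default_rng(2)
n = 520                                   # roughly 10 years of weekly levels
base = rng.normal(size=n)                 # reference groundwater-level series
well_a = base + 0.5 * rng.normal(size=n)  # well dependent on the reference system
well_b = rng.normal(size=n)               # unrelated well

u = pseudo_observations(base)
ca = lower_corner(u, pseudo_observations(well_a))
cb = lower_corner(u, pseudo_observations(well_b))
print(round(ca, 2), round(cb, 2))  # dependent pair well above the independent pair
```

For independent series C(q, q) is near q², while strong dependence pushes it toward q, which is what makes corner probabilities usable as (dis)similarity measures.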
Uncertainty quantification in Eulerian-Lagrangian models for particle-laden flows
NASA Astrophysics Data System (ADS)
Fountoulakis, Vasileios; Jacobs, Gustaaf; Udaykumar, Hs
2017-11-01
A common approach to ameliorate the computational burden in simulations of particle-laden flows is to use a point-particle based Eulerian-Lagrangian model, which traces individual particles in their Lagrangian frame and models particles as mathematical points. The particle motion is determined by Stokes drag law, which is empirically corrected for Reynolds number, Mach number and other parameters. The empirical corrections are subject to uncertainty. Treating them as random variables renders the coupled system of PDEs and ODEs stochastic. An approach to quantify the propagation of this parametric uncertainty to the particle solution variables is proposed. The approach is based on averaging of the governing equations and allows for estimation of the first moments of the quantities of interest. We demonstrate the feasibility of our proposed methodology of uncertainty quantification of particle-laden flows on one-dimensional linear and nonlinear Eulerian-Lagrangian systems. This research is supported by AFOSR under Grant FA9550-16-1-0008.
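As a hedged illustration of treating an empirical drag correction as a random variable, the sketch below propagates its uncertainty through a point-particle Stokes drag ODE by plain Monte Carlo sampling (not the moment-averaging method of the paper); all parameter values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

def settle_velocity(tau_p, f_drag, t_end=0.05, dt=1e-4, u_fluid=1.0):
    """Integrate a point particle under corrected Stokes drag with forward Euler:
       du/dt = f_drag * (u_fluid - u) / tau_p."""
    u = 0.0
    for _ in range(int(t_end / dt)):
        u += dt * f_drag * (u_fluid - u) / tau_p
    return u

# Treat the empirical drag correction as a random variable (hypothetical +/-10% spread).
tau_p = 0.01
samples = rng.normal(loc=1.0, scale=0.1, size=500)
velocities = np.array([settle_velocity(tau_p, f) for f in samples])

# First moments of the quantity of interest, as targeted by the averaging approach.
print(round(velocities.mean(), 3), round(velocities.std(), 3))
```

The averaging approach in the abstract aims to estimate these first moments directly from averaged governing equations rather than by sampling.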
Chomsky, Noam
2015-02-01
Core concepts of language are highly contested. In some cases this is legitimate: real empirical and conceptual issues arise. In other cases, it seems that controversies are based on misunderstanding. A number of crucial cases are reviewed, and an approach to language is outlined that appears to have strong conceptual and empirical motivation, and to lead to conclusions about a number of significant issues that differ from some conventional beliefs.
NASA Astrophysics Data System (ADS)
Shiri, Jalal
2018-06-01
Among the different reference evapotranspiration (ETo) modeling approaches, mass transfer-based methods have been less studied. These approaches utilize temperature and wind speed records. On the other hand, the empirical equations proposed in this context generally produce weak simulations, except when a local calibration is used to improve their performance. This can be a crucial drawback for those equations in case of local data scarcity for the calibration procedure, so the application of heuristic methods can be considered as a substitute. However, given that wind speed records usually have higher variation magnitudes than the other meteorological parameters, coupling a wavelet transform with the heuristic models is necessary. In the present paper, a coupled wavelet-random forest (WRF) methodology is proposed for the first time to improve the performance accuracy of the mass transfer-based ETo estimation approaches, using cross-validation data management scenarios at both local and cross-station scales. The obtained results revealed that the new coupled WRF model (with minimum scatter index values of 0.150 and 0.192 for local and external applications, respectively) improved the performance accuracy of the single RF models as well as the empirical equations to a great extent.
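A sketch of the two ingredients named in this abstract that can be written down without the paper's data: a generic Dalton-style mass-transfer ETo form (the coefficients a and b are placeholders for the locally calibrated or learned values) and the scatter index used to score models:

```python
import numpy as np

def sat_vapour_pressure(t_c):
    """Tetens formula for saturation vapour pressure (kPa) at temperature t_c (deg C)."""
    return 0.6108 * np.exp(17.27 * t_c / (t_c + 237.3))

def eto_mass_transfer(t_c, rh, u2, a=0.35, b=0.54):
    """Generic mass-transfer form  a*(1 + b*u2)*(e_s - e_a); a, b are placeholder
    coefficients of the kind that would be locally calibrated."""
    es = sat_vapour_pressure(t_c)
    ea = es * rh / 100.0
    return a * (1.0 + b * u2) * (es - ea)

def scatter_index(obs, sim):
    """Scatter index = RMSE normalized by the mean of the observations."""
    return np.sqrt(np.mean((obs - sim) ** 2)) / np.mean(obs)

rng = np.random.default_rng(4)
t = rng.uniform(10, 35, 200)      # air temperature, deg C
rh = rng.uniform(30, 90, 200)     # relative humidity, %
u2 = rng.uniform(0.5, 5.0, 200)   # wind speed at 2 m, m/s

obs = eto_mass_transfer(t, rh, u2) * rng.lognormal(0.0, 0.1, 200)  # synthetic "observed"
sim = eto_mass_transfer(t, rh, u2)
si = scatter_index(obs, sim)
print(round(si, 3))
```

In the paper's setup, the random forest (fed wavelet-decomposed inputs) would replace the fixed-coefficient equation, and the scatter index would compare the two.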
Integrating WEPP into the WEPS infrastructure
USDA-ARS?s Scientific Manuscript database
The Wind Erosion Prediction System (WEPS) and the Water Erosion Prediction Project (WEPP) share a common modeling philosophy, that of moving away from primarily empirically based models based on indices or "average conditions", and toward a more process based approach which can be evaluated using ac...
Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques
2012-09-01
The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, the use of an alternative implementation of the algorithmic definition of the so-called "sifting process" used in the original Huang EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method compared to the original EMD algorithmic version was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed; despite some effort, these 2-D versions perform poorly and are very time consuming. So in this paper, an extension of the PDE-based approach to 2-D space is extensively described. This approach has been applied to both signal and image decomposition. The obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data, and results are provided for the case of image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
Krueger, Robert F; Tackett, Jennifer L; MacDonald, Angus
2016-11-01
Traditionally, psychopathology has been conceptualized in terms of polythetic categories derived from committee deliberations and enshrined in authoritative psychiatric nosologies-most notably the Diagnostic and Statistical Manual of Mental Disorders (DSM; American Psychiatric Association [APA], 2013). As the limitations of this form of classification have become evident, empirical data have been increasingly relied upon to investigate the structure of psychopathology. These efforts have borne fruit in terms of an increasingly consistent set of psychopathological constructs closely connected with similar personality constructs. However, the work of validating these constructs using convergent sources of data is an ongoing enterprise. This special section collects several new efforts to use structural approaches to study the validity of this empirically based organizational scheme for psychopathology. Inasmuch as a structural approach reflects the natural organization of psychopathology, it has great potential to facilitate comprehensive organization of information on the correlates of psychopathology, providing evidence for the convergent and discriminant validity of an empirical approach to classification. Here, we highlight several themes that emerge from this burgeoning literature. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Prediction of the Dynamic Yield Strength of Metals Using Two Structural-Temporal Parameters
NASA Astrophysics Data System (ADS)
Selyutina, N. S.; Petrov, Yu. V.
2018-02-01
The behavior of the yield strength of steel and a number of aluminum alloys is investigated over a wide range of strain rates, based on the incubation time criterion of yield and the empirical models of Johnson-Cook and Cowper-Symonds. In this paper, expressions for the parameters of the empirical models are derived from the characteristics of the incubation time criterion, and satisfactory agreement between these expressions and experimental results is obtained. The parameters of the empirical models can depend on the strain rate, whereas the characteristics of the incubation time criterion are independent of the loading history and are connected with the structural and temporal features of the plastic deformation process. This gives the incubation-time approach an advantage over the empirical models and yields an effective and convenient equation for determining the yield strength over a wider range of strain rates.
NASA Astrophysics Data System (ADS)
Tu, Rui; Wang, Rongjiang; Zhang, Yong; Walter, Thomas R.
2014-06-01
The description of static displacements associated with earthquakes is traditionally achieved using GPS, EDM or InSAR data. In addition, displacement histories can be derived from strong-motion records, allowing an improvement of geodetic networks at a high sampling rate and a better physical understanding of earthquake processes. Strong-motion records require a correction procedure appropriate for baseline shifts that may be caused by rotational motion, tilting and other instrumental effects. Common methods use an empirical bilinear correction on the velocity seismograms integrated from the strong-motion records. In this study, we overcome the weaknesses of an empirically based bilinear baseline correction scheme by using a net-based criterion to select the timing parameters. This idea is based on the physical principle that low-frequency seismic waveforms at neighbouring stations are coherent if the interstation distance is much smaller than the distance to the seismic source. For a dense strong-motion network, it is plausible to select the timing parameters so that the correlation coefficient between the velocity seismograms of two neighbouring stations is maximized after the baseline correction. We applied this new concept to the KiK-Net and K-Net strong-motion data available for the 2011 Mw 9.0 Tohoku earthquake. We compared the derived coseismic static displacement with high-quality GPS data, and with the results obtained using empirical methods. The results show that the proposed net-based approach is feasible and more robust than the individual empirical approaches. The outliers caused by unknown problems in the measurement system can be easily detected and quantified.
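The net-based idea of choosing timing parameters by maximizing inter-station correlation can be illustrated with a toy example; the bilinear correction below is deliberately simplified (drift slope fitted to the record's tail), and all signals are synthetic:

```python
import numpy as np

def correct(vel, t, t1):
    """Remove a simplified bilinear baseline: zero before t1, then a linear
    drift whose slope is fitted to the (assumed quiescent) tail of the record."""
    tail = t >= t[0] + 0.8 * (t[-1] - t[0])
    slope = np.polyfit(t[tail], vel[tail], 1)[0]
    ramp = np.where(t > t1, slope * (t - t1), 0.0)
    return vel - ramp

def best_t1(vel, neighbor_vel, t, candidates):
    """Net-based selection: pick the timing parameter whose corrected record
    correlates best with a neighbouring station's velocity seismogram."""
    def corr(t1):
        return np.corrcoef(correct(vel, t, t1), neighbor_vel)[0, 1]
    return max(candidates, key=corr)

# synthetic case: a clean low-frequency pulse plus a drift starting at t = 4
t = np.linspace(0, 10, 2000)
clean = np.exp(-((t - 2) / 0.3) ** 2) * np.sin(2 * np.pi * t)
drifted = clean + np.where(t > 4, 0.05 * (t - 4), 0.0)
t1_hat = best_t1(drifted, clean, t, [2.0, 3.0, 4.0, 5.0, 6.0])
```

Here the neighbour is taken to be already clean; in the network setting both records are corrected and the pairwise correlation is maximized jointly.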
Determining the non-inferiority margin for patient reported outcomes.
Gerlinger, Christoph; Schmelter, Thomas
2011-01-01
One of the cornerstones of any non-inferiority trial is the choice of the non-inferiority margin delta. This threshold of clinical relevance is very difficult to determine, and in practice, delta is often "negotiated" between the sponsor of the trial and the regulatory agencies. However, for patient reported, or more precisely patient observed outcomes, the patients' minimal clinically important difference (MCID) can be determined empirically by relating the treatment effect, for example, a change on a 100-mm visual analogue scale, to the patient's satisfaction with the change. This MCID can then be used to define delta. We used an anchor-based approach with non-parametric discriminant analysis and ROC analysis and a distribution-based approach with Norman's half standard deviation rule to determine delta in three examples: endometriosis-related pelvic pain measured on a 100-mm visual analogue scale, facial acne measured by lesion counts, and hot flush counts. For each of these examples, all three methods yielded quite similar results. In two of the cases, the empirically derived MCIDs were smaller than or similar to the deltas used before in non-inferiority trials, and in the third case, the empirically derived MCID was used to derive a responder definition that was accepted by the FDA. In conclusion, for patient-observed endpoints, delta can be derived empirically. In our view, this is a better approach than asking the clinician for a "nice round number" for delta, such as 10, 50%, π, e, or i. Copyright © 2011 John Wiley & Sons, Ltd.
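A rough sketch of the two kinds of approach mentioned, an anchor-based cutoff chosen from the ROC curve (via the Youden index) and Norman's half-standard-deviation rule, follows; the data and numbers are hypothetical, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical trial: change scores on a 100-mm VAS, with a binary
# satisfaction anchor ("satisfied with the change")
change = np.concatenate([rng.normal(30, 8, 120), rng.normal(8, 8, 80)])
satisfied = np.array([True] * 120 + [False] * 80)

def anchor_mcid(change, satisfied):
    """Anchor-based MCID: the cutoff maximizing the Youden index
    J = sensitivity + specificity - 1 along the ROC curve."""
    best_cut, best_j = None, -1.0
    for c in np.unique(change):
        sens = np.mean(change[satisfied] >= c)
        spec = np.mean(change[~satisfied] < c)
        if sens + spec - 1.0 > best_j:
            best_cut, best_j = c, sens + spec - 1.0
    return best_cut

def half_sd_mcid(scores):
    """Distribution-based MCID: Norman's half-standard-deviation rule."""
    return 0.5 * np.std(scores, ddof=1)
```

With well-separated satisfied and unsatisfied groups, the anchor-based cutoff lands between the two group means, which is the intuition behind using it as a candidate delta.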
ERIC Educational Resources Information Center
Suh, Youngsuk; Talley, Anna E.
2015-01-01
This study compared and illustrated four differential distractor functioning (DDF) detection methods for analyzing multiple-choice items. The log-linear approach, two item response theory-model-based approaches with likelihood ratio tests, and the odds ratio approach were compared to examine the congruence among the four DDF detection methods.…
Salloch, Sabine; Wäscher, Sebastian; Vollmann, Jochen; Schildmann, Jan
2015-04-04
Empirical-ethical research constitutes a relatively new field which integrates socio-empirical research and normative analysis. As direct inferences from descriptive data to normative conclusions are problematic, an ethical framework is needed to determine the relevance of the empirical data for normative argument. While issues of normative-empirical collaboration and questions of empirical methodology have been widely discussed in the literature, the normative methodology of empirical-ethical research has seldom been addressed. Based on our own research experience, we discuss one aspect of this normative methodology, namely the selection of an ethical theory serving as a background for empirical-ethical research. Whereas criteria for a good ethical theory in philosophical ethics are usually related to inherent aspects, such as the theory's clarity or coherence, additional points have to be considered in the field of empirical-ethical research. Three of these additional criteria are discussed in the article: (a) the adequacy of the ethical theory for the issue at stake, (b) the theory's suitability for the purposes and design of the empirical-ethical research project, and (c) the interrelation between the ethical theory selected and the theoretical backgrounds of the socio-empirical research. Using the example of our own study on the development of interventions to support clinical decision-making in oncology, we show how the selection of an ethical theory as a normative background for empirical-ethical research can proceed. We also discuss the limitations of the procedures chosen in our project. The article stresses that a systematic and reasoned approach to theory selection in empirical-ethical research should be given priority over an accidental or implicit choice of the normative framework for one's own research. It furthermore shows that the overall design of an empirical-ethical study is a multi-faceted endeavor which has to balance theoretical and pragmatic considerations.
Meta-Analysis of Group Learning Activities: Empirically Based Teaching Recommendations
ERIC Educational Resources Information Center
Tomcho, Thomas J.; Foels, Rob
2012-01-01
Teaching researchers commonly employ group-based collaborative learning approaches in Teaching of Psychology teaching activities. However, the authors know relatively little about the effectiveness of group-based activities in relation to known psychological processes associated with group dynamics. Therefore, the authors conducted a meta-analytic…
Against the empirical viability of the Deutsch-Wallace-Everett approach to quantum mechanics
NASA Astrophysics Data System (ADS)
Dawid, Richard; Thébault, Karim P. Y.
2014-08-01
The subjective Everettian approach to quantum mechanics presented by Deutsch and Wallace fails to constitute an empirically viable theory of quantum phenomena. The decision theoretic implementation of the Born rule realized in this approach provides no basis for rejecting Everettian quantum mechanics in the face of empirical data that contradicts the Born rule. The approach of Greaves and Myrvold, which provides a subjective implementation of the Born rule as well but derives it from empirical data rather than decision theoretic arguments, avoids the problem faced by Deutsch and Wallace and is empirically viable. However, there is good reason to cast doubt on its scientific value.
NASA Astrophysics Data System (ADS)
Li, Ning; Yang, Jianguo; Zhou, Rui; Liang, Caiping
2016-04-01
Knock is one of the major constraints on improving the performance and thermal efficiency of spark ignition (SI) engines. It can also result in severe permanent engine damage under certain operating conditions. Based on the ensemble empirical mode decomposition (EEMD), this paper proposes a new approach to determining the knock characteristics of SI engines. By adding uniformly distributed, finite white Gaussian noise, the EEMD can preserve signal continuity across different scales and therefore alleviates the mode-mixing problem that occurs in the classic empirical mode decomposition (EMD). The feasibility of applying the EEMD to detect the knock signatures of a test SI engine via the pressure signal measured from the combustion chamber and the vibration signal measured from the cylinder head is investigated. Experimental results show that the EEMD-based method is able to detect the knock signatures from both the pressure signal and the vibration signal, even in the initial stage of knock. Finally, by comparing the results with those obtained by the short-time Fourier transform (STFT), the Wigner-Ville distribution (WVD) and the discrete wavelet transform (DWT), the superiority of the EEMD method in determining knock characteristics is demonstrated.
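A compact sketch of the EEMD idea described above, averaging EMD decompositions of many noise-perturbed copies of the signal, follows; the crude spline-based sifting and the fixed IMF count are simplifying assumptions for illustration, not the authors' knock-detection pipeline:

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(h, t, n_iter=8):
    # isolate one IMF candidate by repeatedly removing the mean envelope
    for _ in range(n_iter):
        mx = argrelextrema(h, np.greater)[0]
        mn = argrelextrema(h, np.less)[0]
        if len(mx) < 2 or len(mn) < 2:
            break
        mean_env = (CubicSpline(t[mx], h[mx])(t) + CubicSpline(t[mn], h[mn])(t)) / 2
        h = h - mean_env
    return h

def emd(x, t, n_imfs=3):
    imfs, r = [], x.copy()
    for _ in range(n_imfs):
        imf = sift(r, t)
        imfs.append(imf)
        r = r - imf
    return imfs, r  # by construction sum(imfs) + r == x

def eemd(x, t, n_ensemble=100, noise_std=0.05, n_imfs=3):
    """EEMD: decompose noise-perturbed copies and average the IMFs,
    which alleviates the mode mixing seen in plain EMD."""
    rng = np.random.default_rng(0)
    acc = [np.zeros_like(x) for _ in range(n_imfs)]
    acc_r = np.zeros_like(x)
    for _ in range(n_ensemble):
        imfs, r = emd(x + rng.normal(0, noise_std, x.size), t, n_imfs)
        for k in range(n_imfs):
            acc[k] += imfs[k]
        acc_r += r
    return [a / n_ensemble for a in acc], acc_r / n_ensemble

t = np.linspace(0, 1, 800)
x = np.sin(2 * np.pi * 25 * t) + np.sin(2 * np.pi * 4 * t)
imfs, residual = eemd(x, t)
```

Because each ensemble member reconstructs its noisy input exactly, the averaged components reconstruct the original signal up to the averaged noise, which shrinks as the ensemble grows.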
ERIC Educational Resources Information Center
Kadir, Z. Abdul; Abdullah, N. H.; Anthony, E.; Salleh, B. Mohd; Kamarulzaman, R.
2016-01-01
Problem-based Learning (PBL) approach has been widely used in various disciplines since it is claimed to improve students' soft skills. However, empirical supports on the effect of PBL on problem solving skills have been lacking and anecdotal in nature. This study aimed to determine the effect of PBL approach on students' problem solving skills…
ERIC Educational Resources Information Center
Irvin, Larry K.; Horner, Robert H.; Ingram, Kimberly; Todd, Anne W.; Sugai, George; Sampson, Nadia Katul; Boland, Joseph B.
2006-01-01
In this evaluation we used Messick's construct validity as a conceptual framework for an empirical study assessing the validity of use, utility, and impact of office discipline referral (ODR) measures for data-based decision making about student behavior in schools. The Messick approach provided a rubric for testing the fit of our theory of use of…
Approaches to Learning Information Literacy: A Phenomenographic Study
ERIC Educational Resources Information Center
Diehm, Rae-Anne; Lupton, Mandy
2012-01-01
This paper reports on an empirical study that explores the ways students approach learning to find and use information. Based on interviews with 15 education students in an Australian university, this study uses phenomenography as its methodological and theoretical basis. The study reveals that students use three main strategies for learning…
When ICT Meets Schools: Differentiation, Complexity and Adaptability
ERIC Educational Resources Information Center
Tubin, Dorit
2007-01-01
Purpose: The purpose of this study is to explore the interaction between information communication technology (ICT) and the school's organizational structure, and propose an analytical model based both on Luhmann's system theory and empirical findings. Design/methodology/approach: The approach of building a theory from a case study research along…
Best Practices Inquiry: A Multidimensional, Value-Critical Framework
ERIC Educational Resources Information Center
Petr, Christopher G.; Walter, Uta M.
2005-01-01
This article offers a multidimensional framework that broadens current approaches to "best practices" inquiry to include (1) the perspectives of both the consumers of services and professional practitioners and (2) a value-based critique. The predominant empirical approach to best practices inquiry is a necessary, but not sufficient, component of…
I show that a conditional probability analysis using a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data, such as the Maryland Biological Streams Survey (MBSS) data.
Evaluating a Pivot-Based Approach for Bilingual Lexicon Extraction
Kim, Jae-Hoon; Kwon, Hong-Seok; Seo, Hyeong-Won
2015-01-01
A pivot-based approach for bilingual lexicon extraction is based on the similarity of context vectors represented by words in a pivot language such as English. In this paper, in order to show the validity and usability of the pivot-based approach, we evaluate it together with two different methods for estimating context vectors: one estimates them from two parallel corpora based on word association between source words (resp. target words) and pivot words, and the other estimates them from two parallel corpora based on word alignment tools for statistical machine translation. Empirical results on two language pairs (Korean-Spanish and Korean-French) show that the pivot-based approach is very promising for resource-poor languages, confirming its validity and usability. Furthermore, our method also performs well for words with low frequency. PMID:25983745
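The core similarity computation in such a pivot-based approach can be illustrated with toy context vectors over a hypothetical English pivot vocabulary; the words and co-occurrence counts below are invented:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two context vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# toy co-occurrence counts of each word with a shared English pivot vocabulary
pivot_vocab = ["water", "drink", "run"]   # hypothetical pivot words
src = np.array([3.0, 2.0, 0.0])           # source-language word's context vector
cand_good = np.array([4.0, 3.0, 0.5])     # translation candidate A
cand_bad = np.array([0.0, 0.5, 5.0])      # translation candidate B
```

Translation candidates are ranked by this similarity: a candidate whose pivot-language context resembles the source word's context scores higher.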
ERIC Educational Resources Information Center
Flewelling, Robert L.; Austin, David; Hale, Kelly; LaPlante, Marcia; Liebig, Melissa; Piasecki, Linda; Uerz, Lori
2005-01-01
Despite the popularity and perceived potential effectiveness of community-based coalitions in helping to prevent and reduce adolescent substance use, empirical evidence supporting this approach is sparse. Many reasons have been suggested for why coalition-based prevention initiatives, and community-level interventions in general, have not…
ERIC Educational Resources Information Center
Brassler, Mirjam; Dettmers, Jan
2017-01-01
Interdisciplinary competence is important in academia for both employability and sustainable development. However, to date, there are no specific interdisciplinary education models and, naturally, no empirical studies to assess them. Since problem-based learning (PBL) and project-based learning (PjBL) are learning approaches that emphasize…
Moran, Nikki
2014-01-01
The agenda in music research that is broadly recognized as embodied music cognition has arrived hand-in-hand with a social interpretation of music, focusing on the real-world basis of its performance and fostering an empirical approach to musician movement that addresses the communicative function and potential of those movements. However, embodied cognition emerged from traditional cognitivism, which produced a body of scientific explanation of music-theoretic concepts. The analytical object of this corpus is based on a particular imagined encounter: a listener responding to an idealized "work." Although this problem of essentialism has been identified within mainstream musicology, its lingering effects may spill over into interdisciplinary, empirical research. This paper defines the situation in terms of its legacy of individualism and offers an alternative sketch of musical activity as performance event, a model that highlights the social interaction processes at the heart of musical behavior. I describe some recent empirical work based on interaction-oriented approaches, arguing that this particular focus, on the social interaction process itself, creates a distinctive and promising agenda for further research into embodied music cognition.
A preliminary study of mechanistic approach in pavement design to accommodate climate change effects
NASA Astrophysics Data System (ADS)
Harnaeni, S. R.; Pramesti, F. P.; Budiarto, A.; Setyawan, A.
2018-03-01
Road damage is caused by several factors, including climate change, overloading, and inappropriate procedures for materials and the construction process. Climate change, meanwhile, is a phenomenon that cannot be avoided; its observed effects include rising air temperature, sea-level rise, changes in rainfall, and the intensity of extreme weather phenomena. Previous studies have shown the impacts of climate change on road damage. Therefore, several measures to anticipate the damage should be considered during planning and construction in order to reduce the cost of road maintenance. Three approaches are generally applied in the design of flexible pavement thickness: the mechanistic approach, the mechanistic-empirical (ME) approach, and the empirical approach. The advantages of a mechanistic or mechanistic-empirical approach are its efficiency and reliability in the design of flexible pavement thickness, as well as its capacity to accommodate climate change, compared with the empirical approach. In Indonesia, however, the design of flexible pavement thickness generally still applies the empirical approach. This preliminary study aims to emphasize the importance of shifting towards a mechanistic approach in the design of flexible pavement thickness.
NASA Astrophysics Data System (ADS)
Hargrove, W. W.; Hoffman, F. M.; Kumar, J.; Spruce, J.; Norman, S. P.
2013-12-01
Here we present diverse examples where empirical mining and statistical analysis of large data sets have already been shown to be useful for a wide variety of practical decision-making problems within the realm of large-scale ecology. Because a full understanding and appreciation of particular ecological phenomena are possible only after hypothesis-directed research regarding the existence and nature of that process, some ecologists may feel that purely empirical data harvesting may represent a less-than-satisfactory approach. Restricting ourselves exclusively to process-driven approaches, however, may actually slow progress, particularly for more complex or subtle ecological processes. We may not be able to afford the delays caused by such directed approaches. Rather than attempting to formulate and ask every relevant question correctly, empirical methods allow trends, relationships and associations to emerge freely from the data themselves, unencumbered by a priori theories, ideas and prejudices that have been imposed upon them. Although they cannot directly demonstrate causality, empirical methods can be extremely efficient at uncovering strong correlations with intermediate "linking" variables. In practice, these correlative structures and linking variables, once identified, may provide sufficient predictive power to be useful themselves. Such correlation "shadows" of causation can be harnessed by, e.g., Bayesian Belief Nets, which bias ecological management decisions, made with incomplete information, toward favorable outcomes. Empirical data-harvesting also generates a myriad of testable hypotheses regarding processes, some of which may even be correct. 
Quantitative statistical regionalizations based on multivariate similarity have lent insights into carbon eddy-flux direction and magnitude, wildfire biophysical conditions, phenological ecoregions useful for vegetation type mapping and monitoring, forest disease risk maps (e.g., sudden oak death), global aquatic ecoregion risk maps for aquatic invasives, and forest vertical structure ecoregions (e.g., using extensive LiDAR data sets). Multivariate Spatio-Temporal Clustering, which quantitatively places alternative future conditions on a common footing with present conditions, allows prediction of present and future shifts in tree species ranges, given alternative climatic change forecasts. ForWarn, a forest disturbance detection and monitoring system mining 12 years of national 8-day MODIS phenology data, has been operating since 2010, producing national maps every 8 days showing many kinds of potential forest disturbances. Forest resource managers can view disturbance maps via a web-based viewer, and alerts are issued when particular forest disturbances are seen. Regression-based decadal trend analysis showing long-term forest thrive and decline areas, and individual-based, brute-force supercomputing to map potential movement corridors and migration routes across landscapes will also be discussed. As significant ecological changes occur with increasing rapidity, such empirical data-mining approaches may be the most efficient means to help land managers find the best, most-actionable policies and decision strategies.
Billieux, Joël; Philippot, Pierre; Schmid, Cécile; Maurage, Pierre; De Mol, Jan; Van der Linden, Martial
2015-01-01
Dysfunctional use of the mobile phone has often been conceptualized as a 'behavioural addiction' that shares most features with drug addictions. In the current article, we challenge the clinical utility of the addiction model as applied to mobile phone overuse. We describe the case of a woman who overuses her mobile phone from two distinct approaches: (1) a symptom-based categorical approach inspired from the addiction model of dysfunctional mobile phone use and (2) a process-based approach resulting from an idiosyncratic clinical case conceptualization. In the case depicted here, the addiction model was shown to lead to standardized and non-relevant treatment, whereas the clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific, empirically based psychological interventions. This finding highlights that conceptualizing excessive behaviours (e.g., gambling and sex) within the addiction model can be a simplification of an individual's psychological functioning, offering only limited clinical relevance. The addiction model, applied to excessive behaviours (e.g., gambling, sex and Internet-related activities) may lead to non-relevant standardized treatments. Clinical case conceptualization allowed identification of specific psychological processes that can be targeted with specific empirically based psychological interventions. The biomedical model might lead to the simplification of an individual's psychological functioning with limited clinical relevance. Copyright © 2014 John Wiley & Sons, Ltd.
A new approach in psychotherapy: ACT (acceptance and commitment therapy).
McHugh, Louise
2011-09-01
Acceptance and commitment therapy (ACT) focuses on enhancing psychological flexibility in the service of achieving core life values. One thing that distinguishes ACT from other psychotherapies is its grounding in empirical behavioural science. The results of the latter suggest that the capacity for human language can produce seriously negative psychological effects under certain circumstances. ACT is a therapeutic approach in which the negative effects of human language are undermined so as to support flexible values based living. ACT therapeutic work involves six key processes proposed under the "hexaflex" model. ACT has received considerable empirical support at a number of different levels of analysis.
Empirical data and moral theory. A plea for integrated empirical ethics.
Molewijk, Bert; Stiggelbout, Anne M; Otten, Wilma; Dupuis, Heleen M; Kievit, Job
2004-01-01
Ethicists differ considerably in their reasons for using empirical data. This paper presents a brief overview of four traditional approaches to the use of empirical data: "the prescriptive applied ethicists," "the theorists," "the critical applied ethicists," and "the particularists." The main aim of this paper is to introduce a fifth approach of more recent date (i.e. "integrated empirical ethics") and to offer some methodological directives for research in integrated empirical ethics. All five approaches are presented in a table for heuristic purposes. The table consists of eight columns: "view on distinction descriptive-prescriptive sciences," "location of moral authority," "central goal(s)," "types of normativity," "use of empirical data," "method," "interaction empirical data and moral theory," and "cooperation with descriptive sciences." Ethicists can use the table in order to identify their own approach. Reflection on these issues prior to starting research in empirical ethics should lead to harmonization of the different scientific disciplines and effective planning of the final research design. Integrated empirical ethics (IEE) refers to studies in which ethicists and descriptive scientists cooperate continuously and intensively. Both disciplines try to integrate moral theory and empirical data in order to reach a normative conclusion with respect to a specific social practice. IEE is not wholly prescriptive or wholly descriptive since IEE assumes an interdependence between facts and values and between the empirical and the normative. The paper ends with three suggestions concerning some of the future challenges of integrated empirical ethics.
NASA Astrophysics Data System (ADS)
Hsiao, Y. R.; Tsai, C.
2017-12-01
As the WHO Air Quality Guideline indicates, ambient air pollution puts world populations under threat of fatal illnesses (e.g. heart disease, lung cancer, asthma), raising concerns about air pollution sources and related factors. This study presents a novel approach to investigating the multiscale variations of PM2.5 in southern Taiwan over the past decade, together with four meteorological influencing factors (temperature, relative humidity, precipitation and wind speed), based on the Noise-assisted Multivariate Empirical Mode Decomposition (NAMEMD) algorithm, Hilbert Spectral Analysis (HSA) and the Time-dependent Intrinsic Correlation (TDIC) method. The NAMEMD algorithm is a fully data-driven approach designed for nonlinear and nonstationary multivariate signals, and is used to decompose multivariate signals into collections of channels of Intrinsic Mode Functions (IMFs). The TDIC method is an EMD-based method that uses a set of sliding window sizes to quantify localized correlation coefficients for multiscale signals. With the alignment property and quasi-dyadic filter bank of the NAMEMD algorithm, one can produce the same number of IMFs for all variables and estimate the cross-correlation more accurately. The spectral representation of the NAMEMD-HSA method is compared with Complementary Ensemble Empirical Mode Decomposition/Hilbert Spectral Analysis (CEEMD-HSA) and wavelet analysis, and the NAMEMD-based TDIC analysis is compared with CEEMD-based TDIC analysis and traditional correlation analysis.
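The essence of TDIC, correlation computed inside sliding windows, can be sketched with a single fixed window size (the full method scans a set of window sizes tied to the IMF scales, which this toy version omits):

```python
import numpy as np

def sliding_corr(a, b, win):
    """Localized Pearson correlation in a sliding window, the core idea
    behind TDIC-style time-dependent correlation (one fixed scale only)."""
    half = win // 2
    out = np.full(a.size, np.nan)  # edges where the window does not fit stay NaN
    for i in range(half, a.size - half):
        out[i] = np.corrcoef(a[i - half:i + half + 1],
                             b[i - half:i + half + 1])[0, 1]
    return out

t = np.linspace(0, 1, 1000)
a = np.sin(2 * np.pi * 10 * t)
b = np.where(t < 0.5, a, -a)   # correlation flips sign halfway through
r = sliding_corr(a, b, 101)
```

A global correlation coefficient would average the two regimes to roughly zero; the windowed version resolves the sign flip, which is exactly why a time-dependent measure is useful for nonstationary series.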
Fire risk in San Diego County, California: A weighted Bayesian model approach
Kolden, Crystal A.; Weigel, Timothy J.
2007-01-01
Fire risk models are widely utilized to mitigate wildfire hazards, but models are often based on expert opinions of less understood fire-ignition and spread processes. In this study, we used an empirically derived weights-of-evidence model to assess what factors produce fire ignitions east of San Diego, California. We created and validated a dynamic model of fire-ignition risk based on land characteristics and existing fire-ignition history data, and predicted ignition risk for a future urbanization scenario. We then combined our empirical ignition-risk model with a fuzzy fire behavior-risk model developed by wildfire experts to create a hybrid model of overall fire risk. We found that roads influence fire ignitions and that future growth will increase risk in new rural development areas. We conclude that empirically derived risk models and hybrid models offer an alternative method to assess current and future fire risk based on management actions.
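The weights-of-evidence calculation underlying such models can be sketched for a single binary evidence layer; the ignition counts below are invented for illustration:

```python
import numpy as np

def weights_of_evidence(evidence, event):
    """Positive and negative weights for one binary evidence layer:
    W+ = ln[P(B|D) / P(B|~D)], W- = ln[P(~B|D) / P(~B|~D)],
    where B is the evidence pattern and D the event (e.g. an ignition)."""
    p_b_d = np.mean(evidence[event])     # P(evidence | event)
    p_b_nd = np.mean(evidence[~event])   # P(evidence | no event)
    w_plus = np.log(p_b_d / p_b_nd)
    w_minus = np.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus

# hypothetical cells: 10 ignitions (8 near a road), 90 non-ignitions (18 near a road)
near_road = np.array([True] * 8 + [False] * 2 + [True] * 18 + [False] * 72)
ignition = np.array([True] * 10 + [False] * 90)
w_plus, w_minus = weights_of_evidence(near_road, ignition)
```

A positive W+ (and negative W-) indicates the evidence layer raises the posterior log-odds of an event; summing such weights over layers, under a conditional-independence assumption, gives the map of relative risk.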
ERIC Educational Resources Information Center
Loyens, Sofie M. M.; Gijbels, David; Coertjens, Liesje; Cote, Daniel J.
2013-01-01
Problem-based learning (PBL) represents a major development in higher educational practice and is believed to promote deep learning in students. However, empirical findings on the promotion of deep learning in PBL remain unclear. The aim of the present study is to investigate the relationships between students' approaches to learning (SAL) and…
Dudoit, Sandrine; Gilbert, Houston N.; van der Laan, Mark J.
2014-01-01
This article proposes resampling-based empirical Bayes multiple testing procedures for controlling a broad class of Type I error rates, defined as generalized tail probability (gTP) error rates, gTP(q, g) = Pr(g(Vn, Sn) > q), and generalized expected value (gEV) error rates, gEV(g) = E[g(Vn, Sn)], for arbitrary functions g(Vn, Sn) of the numbers of false positives Vn and true positives Sn. Of particular interest are error rates based on the proportion g(Vn, Sn) = Vn/(Vn + Sn) of Type I errors among the rejected hypotheses, such as the false discovery rate (FDR), FDR = E[Vn/(Vn + Sn)]. The proposed procedures offer several advantages over existing methods. They provide Type I error control for general data generating distributions, with arbitrary dependence structures among variables. Gains in power are achieved by deriving rejection regions based on guessed sets of true null hypotheses and null test statistics randomly sampled from joint distributions that account for the dependence structure of the data. The Type I error and power properties of an FDR-controlling version of the resampling-based empirical Bayes approach are investigated and compared to those of widely-used FDR-controlling linear step-up procedures in a simulation study. The Type I error and power trade-off achieved by the empirical Bayes procedures under a variety of testing scenarios allows this approach to be competitive with or outperform the Storey and Tibshirani (2003) linear step-up procedure, as an alternative to the classical Benjamini and Hochberg (1995) procedure. PMID:18932138
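A toy sketch of the resampling idea, estimating the expected number of false positives from simulated null statistics, follows; it is far simpler than the paper's gTP/gEV machinery, and it assumes the number of true nulls is known, which in practice must itself be estimated:

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic z-statistics: 900 true nulls (N(0,1)) and 100 alternatives (N(4,1))
z = np.concatenate([rng.normal(0, 1, 900), rng.normal(4, 1, 100)])

def estimated_fdr(stats, n_null, threshold, n_draws=500):
    """Resampling flavour of FDR estimation: simulate null statistics to
    estimate the expected number of false positives V at the threshold,
    then divide by the observed number of rejections R."""
    R = int(np.sum(stats > threshold))
    null = rng.normal(0, 1, (n_draws, n_null))  # assumed-known null distribution
    V = np.mean(np.sum(null > threshold, axis=1))
    return V / max(R, 1)

fdr_at_3 = estimated_fdr(z, n_null=900, threshold=3.0)
```

With a strong signal, most statistics above the threshold are true positives, so the estimated FDR is small; sweeping the threshold traces out the error-rate/power trade-off the abstract discusses.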
Krieger, Jonathan D
2014-08-01
I present a protocol for creating geometric leaf shape metrics to facilitate widespread application of geometric morphometric methods to leaf shape measurement.
• To quantify circularity, I created a novel shape metric in the form of the vector between a circle and a line, termed geometric circularity. Using leaves from 17 fern taxa, I performed a coordinate-point eigenshape analysis to empirically identify patterns of shape covariation. I then compared the geometric circularity metric to the empirically derived shape space and to the standard metric, the circularity shape factor.
• The geometric circularity metric was consistent with empirical patterns of shape covariation and appeared more biologically meaningful than the standard approach, the circularity shape factor. The protocol described here has the potential to make geometric morphometrics more accessible to plant biologists by generalizing the approach to developing synthetic shape metrics based on classic, qualitative shape descriptors.
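For comparison, the standard metric mentioned above, the circularity shape factor 4πA/P², can be computed directly from outline coordinates; the shapes below are synthetic, not the paper's fern leaves:

```python
import numpy as np

def circularity_shape_factor(x, y):
    """Standard circularity shape factor 4*pi*A / P**2 for a closed outline:
    equal to 1 for a circle and smaller for elongated shapes."""
    # shoelace formula for the enclosed area
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    dx = np.diff(np.append(x, x[0]))
    dy = np.diff(np.append(y, y[0]))
    perimeter = np.sum(np.hypot(dx, dy))
    return 4.0 * np.pi * area / perimeter ** 2

theta = np.linspace(0, 2 * np.pi, 400, endpoint=False)
c_circle = circularity_shape_factor(np.cos(theta), np.sin(theta))
c_ellipse = circularity_shape_factor(3 * np.cos(theta), 0.5 * np.sin(theta))
```

The factor collapses shape to a single number, which is the limitation motivating geometric alternatives grounded in the empirical shape space.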
Hudziak, Jim; Ivanova, Masha Y
2016-04-01
All health is tied to emotional and behavioral health. To improve population health, we need innovative approaches to healthcare that target emotional and behavioral health. The Vermont Family Based Approach (VFBA) is a healthcare paradigm that aims to improve population health by improving emotional and behavioral health. Because the family is a powerful health-promoting social institution, the VFBA also aims to shift the delivery of healthcare to the family level. This article introduces the VFBA, and presents the main empirical findings that informed the approach in the context of the early childhood period. Copyright © 2016 Elsevier Inc. All rights reserved.
Sample Size Estimation in Cluster Randomized Educational Trials: An Empirical Bayes Approach
ERIC Educational Resources Information Center
Rotondi, Michael A.; Donner, Allan
2009-01-01
The educational field has now accumulated an extensive literature reporting on values of the intraclass correlation coefficient, a parameter essential to determining the required size of a planned cluster randomized trial. We propose here a simple simulation-based approach including all relevant information that can facilitate this task. An…
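The intraclass correlation coefficient enters sample-size planning through the design effect; a minimal sketch of that standard calculation (not the article's simulation-based approach) is:

```python
import math

def cluster_trial_size(n_individual, cluster_size, icc):
    """Inflate an individually-randomized sample size by the design effect
    DE = 1 + (m - 1) * ICC used in cluster randomized trials."""
    design_effect = 1.0 + (cluster_size - 1) * icc
    return math.ceil(n_individual * design_effect)

# e.g. 128 students per arm if randomized individually,
# classrooms of 20, assumed ICC = 0.05  ->  design effect 1.95
n_needed = cluster_trial_size(128, 20, 0.05)
```

Even a small ICC nearly doubles the required sample here, which is why reliable published ICC values matter for planning.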
Networking for Leadership, Inquiry, and Systemic Thinking: A New Approach to Inquiry-Based Learning.
ERIC Educational Resources Information Center
Byers, Al; Fitzgerald, Mary Ann
2002-01-01
Points out difficulties with a change from traditional teaching methods to a more inquiry-centered approach. Presents theoretical and empirical foundations for the Networking for Leadership, Inquiry, and Systemic Thinking (NLIST) initiative sponsored by the Council of State Science Supervisors (CSSS) and NASA, describes its progress, and outlines…
Revisiting First Language Acquisition through Empirical and Rational Perspectives
ERIC Educational Resources Information Center
Tahriri, Abdorreza
2012-01-01
Acquisition in general and first language acquisition in particular is a very complex and a multifaceted phenomenon. The way that children acquire a language in a very limited period is astonishing. Various approaches have been proposed so far to account for this extraordinary phenomenon. These approaches are indeed based on various philosophical…
ERIC Educational Resources Information Center
Milford, Jaime L.; Austin, Julia L.; Smith, Jane Ellen
2007-01-01
The Community Reinforcement Approach (CRA) and Community Reinforcement and Family Training (CRAFT) are both highly effective and empirically validated psychosocial approaches to the treatment of addictions whose unique designs may help achieve certain public health objectives. Literature will be reviewed to examine the potential impact of CRA and…
An Empirical Comparison of Heterogeneity Variance Estimators in 12,894 Meta-Analyses
ERIC Educational Resources Information Center
Langan, Dean; Higgins, Julian P. T.; Simmonds, Mark
2015-01-01
Heterogeneity in meta-analysis is most commonly estimated using a moment-based approach described by DerSimonian and Laird. However, this method has been shown to produce biased estimates. Alternative methods to estimate heterogeneity include the restricted maximum likelihood approach and those proposed by Paule and Mandel, Sidik and Jonkman, and…
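The DerSimonian-Laird moment estimator mentioned above is compact enough to state directly; a minimal sketch (standard formula, my own function name):

```python
import numpy as np

def dersimonian_laird_tau2(effects, variances):
    """Moment-based between-study variance estimate (DerSimonian & Laird):
    tau^2 = max(0, (Q - (k-1)) / c), with inverse-variance weights."""
    y = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    ybar = np.sum(w * y) / np.sum(w)            # fixed-effect pooled mean
    q = np.sum(w * (y - ybar) ** 2)             # Cochran's Q statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)  # scaling constant
    return max(0.0, (q - (len(y) - 1)) / c)
```

The truncation at zero is one source of the bias the abstract refers to; the restricted maximum likelihood and Paule-Mandel alternatives avoid some of it at extra computational cost.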
Learning in the Liminal Space: A Semiotic Approach to Threshold Concepts
ERIC Educational Resources Information Center
Land, Ray; Rattray, Julie; Vivian, Peter
2014-01-01
The threshold concepts approach to student learning and curriculum design now informs an empirical research base comprising over 170 disciplinary and professional contexts. It draws extensively on the notion of troublesomeness in a "liminal" space of learning. The latter is a transformative state in the process of learning in which there…
ERIC Educational Resources Information Center
Campbell, Susan; Cannon, Barbara; Ellis, James T.; Lifter, Karen; Luiselli, James K.; Navalta, Carryl P.; Taras, Marie
1998-01-01
Describes a comprehensive continuum of services model for children with autism developed by a human services agency in Massachusetts, which incorporates these and additional empirically based approaches. Service components, methodologies, and program objectives are described, including representative summary data. Best practice approaches toward…
ERIC Educational Resources Information Center
Rebeschi, Lisa M.
2013-01-01
Professional nurses are challenged to provide high quality, evidence-based care in today's increasingly complex healthcare environment. Thus, nurses need to develop an appreciation for life-long learning. Understanding student approach to learning may provide nurse educators with empirical evidence to support specific teaching/learning strategies…
Children and Terrorism-Related News: Training Parents in Coping and Media Literacy
ERIC Educational Resources Information Center
Comer, Jonathan S.; Furr, Jami M.; Beidas, Rinad S.; Weiner, Courtney L.; Kendall, Philip C.
2008-01-01
This study examined associations between televised news regarding risk for future terrorism and youth outcomes and investigated the effects of training mothers in an empirically based approach to addressing such news with children. This approach--Coping and Media Literacy (CML)--emphasized modeling, media literacy, and contingent reinforcement and…
USDA-ARS?s Scientific Manuscript database
Compound-specific isotopic analysis of amino acids (CSIA-AA) has emerged in the last decade as a powerful approach for tracing the origins and fate of nitrogen in ecological and biogeochemical studies. This approach is based on the empirical knowledge that source AAs (i.e., phenylalanine), fractiona...
Evidence-Based Administration for Decision Making in the Framework of Knowledge Strategic Management
ERIC Educational Resources Information Center
Del Junco, Julio Garcia; Zaballa, Rafael De Reyna; de Perea, Juan Garcia Alvarez
2010-01-01
Purpose: This paper seeks to present a model based on evidence-based administration (EBA), which aims to facilitate the creation, transformation and diffusion of knowledge in learning organizations. Design/methodology/approach: A theoretical framework is proposed based on EBA and the case method. Accordingly, an empirical study was carried out in…
Understanding Project-Based Learning in Second Life with a Pedagogy, Training, and Assessment Trio
ERIC Educational Resources Information Center
Jarmon, Leslie; Traphagan, Tomoko; Mayrath, Michael
2008-01-01
This paper presents an empirical study of how Second Life (SL) was utilized for a highly successful project-based graduate interdisciplinary communication course. Researchers found that an integrated threefold approach emphasizing project-based pedagogy, technical training and support, and assessment/research was effective in cultivating and…
Serious Games for Learning: Games-Based Child Sexual Abuse Prevention in Schools
ERIC Educational Resources Information Center
Scholes, Laura; Jones, Christian; Stieler-Hunt, Colleen; Rolfe, Ben
2014-01-01
In spite of research demonstrating conceptual weakness in many child sexual abuse (CSA) prevention programmes and outdated modes of delivery, students continue to participate in a diversity of initiatives. Referring to the development of a games-based approach to CSA prevention in Australia, this paper examines empirically based attributes of…
Mechanisms of Power within a Community-Based Food Security Planning Process
ERIC Educational Resources Information Center
McCullum, Christine; Pelletier, David; Barr, Donald; Wilkins, Jennifer; Habicht, Jean-Pierre
2004-01-01
A community food security movement has begun to address problems of hunger and food insecurity by utilizing a community-based approach. Although various models have been implemented, little empirical research has assessed how power operates within community-based food security initiatives. The purpose of this research was to determine how power…
NASA Astrophysics Data System (ADS)
Jiang, Fan; Zhu, Zhencai; Li, Wei; Zhou, Gongbo; Chen, Guoan
2014-07-01
Accurately identifying faults in rotor-bearing systems by analyzing vibration signals, which are nonlinear and nonstationary, is challenging. To address this issue, a new approach based on ensemble empirical mode decomposition (EEMD) and self-zero space projection analysis is proposed in this paper. This method seeks to identify faults appearing in a rotor-bearing system using simple algebraic calculations and projection analyses. First, EEMD is applied to decompose the collected vibration signals into a set of intrinsic mode functions (IMFs) for features. Second, these extracted features under various mechanical health conditions are used to design a self-zero space matrix according to space projection analysis. Finally, the so-called projection indicators are calculated to identify the rotor-bearing system's faults with simple decision logic. Experiments are implemented to test the reliability and effectiveness of the proposed approach. The results show that this approach can accurately identify faults in rotor-bearing systems.
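The ensemble step of EEMD can be sketched in isolation. This is a hedged sketch assuming some EMD routine is available (passed in as a parameter; the sifting itself and the paper's self-zero space projection are not shown):

```python
import numpy as np

def eemd(signal, emd, n_trials=100, noise_std=0.2, rng=None):
    """Ensemble EMD: average the IMFs obtained from many noise-added
    copies of the signal. `emd` is any routine mapping a 1-D signal to a
    (n_imfs, len(signal)) array; for simplicity this sketch assumes all
    trials yield the same number of IMFs."""
    rng = np.random.default_rng(rng)
    sigma = noise_std * np.std(signal)
    trials = [emd(signal + rng.normal(0.0, sigma, size=len(signal)))
              for _ in range(n_trials)]
    return np.mean(trials, axis=0)  # added noise cancels in the ensemble mean
```

The point of the ensemble is mode-mixing suppression: each noise realization perturbs the sifting differently, and averaging leaves only the structure shared across trials.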
Desired attributes of evidence assessments for evidence-based practices.
Leff, H Stephen; Conley, Jeremy A
2006-11-01
In this paper we describe three approaches to assessing evidence for stakeholders interested in evidence-based practices: narrative reviews, systematic reviews (including meta-analyses), and registries. We then compare the approaches in terms of the degree to which they possess desired attributes of evidence assessments. Our review suggests that hybrid approaches combining the best features of all three should be pursued to further the use of evidence-based practices, and that such hybrids are possible given the capacity of the World Wide Web. We conclude by stressing the need for empirical research on evidence assessments.
Empirical Approaches to the Birthday Problem
ERIC Educational Resources Information Center
Flores, Alfinio; Cauto, Kevin M.
2012-01-01
This article will describe two activities in which students conduct experiments with random numbers so they can see that having at least one repeated birthday in a group of 40 is not unusual. The first empirical approach was conducted by author Cauto in a secondary school methods course. The second empirical approach was used by author Flores with…
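The repeated-birthday experiment the article describes takes only a few lines; a minimal Monte Carlo sketch (not the authors' classroom activities; the exact probability for a group of 40 is about 0.891):

```python
import random

def p_shared_birthday(group_size=40, trials=20000, seed=1):
    """Monte Carlo estimate of P(at least one shared birthday)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        days = [rng.randrange(365) for _ in range(group_size)]
        if len(set(days)) < group_size:   # a repeat occurred
            hits += 1
    return hits / trials

print(p_shared_birthday())   # close to the exact value of ~0.891
```

Students can compare the simulated estimate against the exact product formula 1 - prod((365 - i)/365 for i in range(40)) to see the Monte Carlo error shrink as `trials` grows.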
Tracking Expected Improvements of Decadal Prediction in Climate Services
NASA Astrophysics Data System (ADS)
Suckling, E.; Thompson, E.; Smith, L. A.
2013-12-01
Physics-based simulation models are ultimately expected to provide the best available (decision-relevant) probabilistic climate predictions, as they can capture the dynamics of the Earth System across a range of situations, situations for which observations for the construction of empirical models are scant if not nonexistent. This fact in itself provides neither evidence that predictions from today's Earth System Models will outperform today's empirical models, nor a guide to the space and time scales on which today's model predictions are adequate for a given purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales. The skill of these forecasts is contrasted with that of state-of-the-art climate models, and the challenges faced by each approach are discussed. The focus is on providing decision-relevant probability forecasts for decision support. An empirical model, known as Dynamic Climatology, is shown to be competitive with CMIP5 climate models on decadal-scale probability forecasts. Contrasting the skill of simulation models not only with each other but also with empirical models can reveal the space and time scales on which a generation of simulation models exploits its physical basis effectively. It can also quantify their ability to add information in the formation of operational forecasts. Difficulties (i) of information contamination, (ii) of the interpretation of probabilistic skill, and (iii) of artificial skill complicate each modelling approach, and are discussed. "Physics free" empirical models provide fixed, quantitative benchmarks for the evaluation of ever more complex climate models that are not available from (inter)comparisons restricted to complex models alone. At present, empirical models can also provide a background term for blending in the formation of probability forecasts from ensembles of simulation models.
In weather forecasting this role is filled by the climatological distribution, which can significantly enhance the value of longer lead-time weather forecasts to those who use them. It is suggested that the direct comparison of simulation models with empirical models become a regular component of large model forecast intercomparison and evaluation. This would clarify the extent to which a given generation of state-of-the-art simulation models provides information beyond that available from simpler empirical models. It would also clarify current limitations in using simulation forecasting for decision support. No model-based probability forecast is complete without a quantitative estimate of its own irrelevance; this estimate is likely to increase as a function of lead time. A lack of decision-relevant quantitative skill would not bring the science-based foundation of anthropogenic warming into doubt. Similar levels of skill with empirical models do, however, suggest a clear quantification of limits, as a function of lead time, on the spatial and temporal scales at which decisions based on such model output can be expected to prove maladaptive. Failing to clearly state such weaknesses of a given generation of simulation models, while clearly stating their strengths and their foundation, risks the credibility of science in support of policy in the long term.
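The probabilistic skill comparisons described in this abstract rest on proper scores; a minimal sketch of the standard ensemble estimator of the continuous ranked probability score, CRPS = E|X - y| - (1/2)E|X - X'| (a generic formula, not the study's evaluation code):

```python
import numpy as np

def crps_ensemble(members, obs):
    """Ensemble estimator of the continuous ranked probability score:
    mean |member - obs| minus half the mean pairwise member spread.
    Lower is better; 0 means a perfect deterministic forecast."""
    x = np.asarray(members, dtype=float)
    return (np.mean(np.abs(x - obs))
            - 0.5 * np.mean(np.abs(x[:, None] - x[None, :])))
```

The second term rewards sharpness: a wide but well-centred ensemble and a narrow but biased one can be ranked on a common footing, which is what makes CRPS suitable for benchmarking simulation models against empirical ones.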
Multi-focus image fusion based on window empirical mode decomposition
NASA Astrophysics Data System (ADS)
Qin, Xinqiang; Zheng, Jiaoyue; Hu, Gang; Wang, Jiao
2017-09-01
In order to improve multi-focus image fusion quality, a novel fusion algorithm based on window empirical mode decomposition (WEMD) is proposed. WEMD is an improved form of bidimensional empirical mode decomposition (BEMD): its decomposition process uses an added-window principle, effectively resolving the signal concealment problem. We used WEMD for multi-focus image fusion and formulated different fusion rules for the bidimensional intrinsic mode function (BIMF) components and the residue component. For fusion of the BIMF components, the concept of the Sum-modified-Laplacian was used and a scheme based on visual feature contrast was adopted; for the residue coefficients, pixel values were selected based on local visibility. We carried out four groups of multi-focus image fusion experiments and compared objective evaluation criteria with three other fusion methods. The experimental results show that the proposed fusion approach is effective and fuses multi-focus images better than some traditional methods.
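The Sum-modified-Laplacian (SML) selection rule used above can be sketched on its own, outside the WEMD decomposition. A hedged, numpy-only illustration of per-pixel sharpness-based fusion (window radius and function names are my own):

```python
import numpy as np

def box_filter(a, radius):
    """Sum of a over a (2*radius+1)^2 neighbourhood, edge-padded."""
    p = np.pad(a, radius, mode='edge')
    out = np.zeros_like(a)
    n = 2 * radius + 1
    for dy in range(n):
        for dx in range(n):
            out += p[dy:dy + a.shape[0], dx:dx + a.shape[1]]
    return out

def sum_modified_laplacian(img, radius=2):
    """Per-pixel Sum-modified-Laplacian focus (sharpness) measure."""
    p = np.pad(img.astype(float), 1, mode='edge')
    ml = (np.abs(2 * p[1:-1, 1:-1] - p[:-2, 1:-1] - p[2:, 1:-1])
          + np.abs(2 * p[1:-1, 1:-1] - p[1:-1, :-2] - p[1:-1, 2:]))
    return box_filter(ml, radius)

def fuse(img_a, img_b, radius=2):
    """Per pixel, keep the source image whose local sharpness is higher."""
    mask = (sum_modified_laplacian(img_a, radius)
            >= sum_modified_laplacian(img_b, radius))
    return np.where(mask, img_a, img_b)
```

In the paper this rule is applied to the BIMF components rather than raw pixels; the sketch shows only the focus-measure comparison itself.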
NASA Astrophysics Data System (ADS)
Huang, Shaohua; Wang, Lan; Chen, Weiwei; Lin, Duo; Huang, Lingling; Wu, Shanshan; Feng, Shangyuan; Chen, Rong
2014-09-01
A surface-enhanced Raman spectroscopy (SERS) approach was utilized for urine biochemical analysis with the aim of developing a label-free and non-invasive optical diagnostic method for esophagus cancer detection. SERS spectra were acquired from 31 normal urine samples and 47 malignant esophagus cancer (EC) urine samples. Tentative assignments of urine SERS bands demonstrated esophagus cancer-specific changes, including an increase in the relative amount of urea and a decrease in the percentage of uric acid in normal urine compared with EC urine. An empirical algorithm integrated with linear discriminant analysis (LDA) was employed to identify important urine SERS bands for differentiating between healthy subjects and EC patients. The empirical diagnostic approach, based on the ratios of the SERS peak intensities at 527 to 1002 cm-1 and at 725 to 1002 cm-1 coupled with LDA, yielded a diagnostic sensitivity of 72.3% and specificity of 96.8%, respectively. The area under the receiver operating characteristic (ROC) curve was 0.954, further supporting the performance of the diagnostic algorithm based on peak-intensity ratios combined with LDA. This work demonstrates that urine SERS spectra combined with an empirical algorithm have potential for noninvasive diagnosis of esophagus cancer.
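The two-feature LDA step described above has the classic Fisher form; a hypothetical sketch on synthetic data (not the paper's fitted model; in their setting the two feature columns would be the peak-intensity ratios I527/I1002 and I725/I1002 per sample):

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-class Fisher discriminant. Rows of X0/X1 are feature vectors
    for the two classes (e.g. normal vs. EC ratio features)."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # pooled within-class scatter matrix
    Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
    w = np.linalg.solve(Sw, m1 - m0)     # discriminant direction
    thresh = w @ (m0 + m1) / 2           # midpoint decision threshold
    return w, thresh

def classify(X, w, thresh):
    return (X @ w > thresh).astype(int)  # 1 = class of X1
```

Sweeping `thresh` instead of fixing it at the midpoint traces out the ROC curve whose area the abstract reports.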
Construction of Optimally Reduced Empirical Model by Spatially Distributed Climate Data
NASA Astrophysics Data System (ADS)
Gavrilov, A.; Mukhin, D.; Loskutov, E.; Feigin, A.
2016-12-01
We present an approach to empirical reconstruction of the evolution operator, in stochastic form, from space-distributed time series. The main problem in empirical modeling is choosing appropriate phase variables that can efficiently reduce the dimension of the model at minimal loss of information about the system's dynamics, leading to a more robust model and better reconstruction quality. For this purpose we incorporate two key steps in the model. The first step is a standard preliminary reduction of the observed time series dimension by decomposition via a certain empirical basis (e.g. an empirical orthogonal function basis or its nonlinear or spatio-temporal generalizations). The second step is construction of an evolution operator on the principal components (PCs) - the time series obtained by the decomposition. In this step we introduce a new way of reducing the dimension of the embedding in which the evolution operator is constructed, based on choosing proper combinations of delayed PCs to capture the most significant spatio-temporal couplings. The evolution operator is sought as a nonlinear random mapping parameterized using artificial neural networks (ANN). A Bayesian approach is used to learn the model and to find the optimal hyperparameters: the number of PCs, the dimension of the embedding, and the degree of nonlinearity of the ANN. The results of applying the method to climate data (sea surface temperature, sea level pressure), and their comparison with the same method based on a non-reduced embedding, are presented. The study is supported by the Government of the Russian Federation (agreement #14.Z50.31.0033 with the Institute of Applied Physics of RAS).
A global empirical system for probabilistic seasonal climate prediction
NASA Astrophysics Data System (ADS)
Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.
2015-12-01
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
An empirical system for probabilistic seasonal climate prediction
NASA Astrophysics Data System (ADS)
Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma
2016-04-01
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
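The regression core of such an empirical prediction system is small; a minimal sketch (Gaussian predictive spread taken from the hindcast residuals; the predictor names are illustrative and the paper's physically motivated predictor-selection scheme is not reproduced):

```python
import numpy as np

def fit_empirical_forecast(X, y):
    """Least-squares fit of a predictand (e.g. seasonal temperature
    anomaly) on predictor columns (e.g. CO2-equivalent concentration,
    an ENSO index). Returns coefficients plus the residual standard
    deviation used as the Gaussian forecast spread."""
    A = np.column_stack([np.ones(len(y)), X])       # add intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    sigma = resid.std(ddof=A.shape[1])              # out-of-sample-ish spread
    return coef, sigma

def forecast(coef, sigma, x_new):
    """Predictive mean and standard deviation for new predictor values."""
    return coef[0] + coef[1:] @ np.asarray(x_new, dtype=float), sigma
```

Issued as a normal distribution N(mean, sigma^2), the forecast can then be scored with the same probabilistic metrics (e.g. CRPS) used to validate the hindcasts.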
Predicting the Magnetic Properties of ICMEs: A Pragmatic View
NASA Astrophysics Data System (ADS)
Riley, P.; Linker, J.; Ben-Nun, M.; Torok, T.; Ulrich, R. K.; Russell, C. T.; Lai, H.; de Koning, C. A.; Pizzo, V. J.; Liu, Y.; Hoeksema, J. T.
2017-12-01
The southward component of the interplanetary magnetic field plays a crucial role in being able to successfully predict space weather phenomena. Yet, thus far, it has proven extremely difficult to forecast with any degree of accuracy. In this presentation, we describe an empirically-based modeling framework for estimating Bz values during the passage of interplanetary coronal mass ejections (ICMEs). The model includes: (1) an empirically-based estimate of the magnetic properties of the flux rope in the low corona (including helicity and field strength); (2) an empirically-based estimate of the dynamic properties of the flux rope in the high corona (including direction, speed, and mass); and (3) a physics-based estimate of the evolution of the flux rope during its passage to 1 AU driven by the output from (1) and (2). We compare model output with observations for a selection of events to estimate the accuracy of this approach. Importantly, we pay specific attention to the uncertainties introduced by the components within the framework, separating intrinsic limitations from those that can be improved upon, either by better observations or more sophisticated modeling. Our analysis suggests that current observations/modeling are insufficient for this empirically-based framework to provide reliable and actionable prediction of the magnetic properties of ICMEs. We suggest several paths that may lead to better forecasts.
Retamar, Pilar; Portillo, María M.; López-Prieto, María Dolores; Rodríguez-López, Fernando; de Cueto, Marina; García, María V.; Gómez, María J.; del Arco, Alfonso; Muñoz, Angel; Sánchez-Porto, Antonio; Torres-Tortosa, Manuel; Martín-Aspas, Andrés; Arroyo, Ascensión; García-Figueras, Carolina; Acosta, Federico; Corzo, Juan E.; León-Ruiz, Laura; Escobar-Lara, Trinidad
2012-01-01
The impact of the adequacy of empirical therapy on outcome for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspicion of BSI and optimization of definitive therapy should be implemented. PMID:22005999
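The propensity-score-matched analysis reported above pairs each treated (here, inadequately treated) patient with a control of similar score. A hedged sketch of the matching step only, assuming the scores have already been estimated (greedy 1:1 nearest-neighbour matching with a caliper; names are my own):

```python
import numpy as np

def match_on_propensity(scores_treated, scores_control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.
    Returns (treated_index, control_index) pairs within the caliper;
    each control is used at most once."""
    controls = np.asarray(scores_control, dtype=float)
    used, pairs = set(), []
    for i, s in enumerate(scores_treated):
        d = np.abs(controls - s)
        for j in np.argsort(d):           # nearest available control first
            if j not in used and d[j] <= caliper:
                used.add(j)
                pairs.append((i, int(j)))
                break
    return pairs
```

Outcome contrasts (e.g. 14- and 30-day mortality odds ratios) are then computed within the matched sample, which is what gives the adjusted ORs quoted in the abstract their interpretation.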
Critical Realist Review: Exploring the Real, beyond the Empirical
ERIC Educational Resources Information Center
Edgley, Alison; Stickley, Theodore; Timmons, Stephen; Meal, Andy
2016-01-01
This article defines the "critical realist review", a literature-based methodological approach to critical analysis of health care studies (or any discipline charged with social interventions) that is robust, insightful and essential for the complexities of twenty-first century evidence-based health and social care. We argue that this…
Research into Practice: The Task-Based Approach to Instructed Second Language Acquisition
ERIC Educational Resources Information Center
East, Martin
2017-01-01
This article discusses the phenomenon of task-based language teaching (TBLT) in instructed additional language settings. It begins from the premise that, despite considerable theoretical and empirical support, TBLT remains a contested endeavour. Critics of TBLT argue that, particularly with regard to time-limited foreign language instructional…
Lynn, Steven Jay; Malakataris, Anne; Condon, Liam; Maxwell, Reed; Cleere, Colleen
2012-04-01
In this article, we describe how cognitive hypnotherapy can be used in conjunction with evidence-based practices for the treatment of post-traumatic stress disorder (PTSD). We review cognitive-behavioral interventions for PTSD, including mindfulness and acceptance-based approaches, and contend that (a) empirical support for the use of hypnosis in treating a variety of conditions is considerable; (b) hypnosis is fundamentally a cognitive-behavioral intervention; (c) psychological interventions with a firm footing in cognitive-behavioral therapy (CBT) are well-suited to treat the symptoms of PTSD; and (d) hypnosis can be a useful adjunct to evidence-based cognitive-behavioral approaches, including mindfulness and acceptance-based interventions, for treating PTSD.
ERIC Educational Resources Information Center
Raykov, Tenko; Little, Todd D.
1999-01-01
Describes a method for evaluating results of Procrustean rotation to a target factor pattern matrix in exploratory factor analysis. The approach, based on the bootstrap method, yields empirical approximations of the sampling distributions of: (1) differences between target elements and rotated factor pattern matrices; and (2) the overall…
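The bootstrap machinery underlying such an approach is generic; a minimal sketch of building an empirical sampling distribution by resampling cases (a textbook bootstrap, not the authors' Procrustes-specific procedure):

```python
import numpy as np

def bootstrap_distribution(data, statistic, n_boot=2000, rng=0):
    """Empirical approximation of the sampling distribution of
    `statistic` by resampling rows of `data` with replacement."""
    rng = np.random.default_rng(rng)
    data = np.asarray(data)
    n = len(data)
    return np.array([statistic(data[rng.integers(0, n, n)])
                     for _ in range(n_boot)])
```

In the factor-analytic setting, `statistic` would refit and rotate the factor pattern for each resample and return the element-wise differences from the target matrix, giving the distributions the abstract describes.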
ERIC Educational Resources Information Center
Lim, Lois; Oei, Adam C.
2015-01-01
Despite the widespread use of Orton-Gillingham (OG) based approaches to dyslexia remediation, empirical support documenting its effectiveness is lacking. Recently, Chia and Houghton demonstrated the effectiveness of the OG approach for remediation of dyslexia in Singapore. As a conceptual replication and extension of that research, we report…
The Intimate Correlation of Invitational Education and Effective Classroom Management
ERIC Educational Resources Information Center
Riner, Phillip S.
2003-01-01
Critics of Invitational Education and other self-concept approaches to learning have long argued that there is a lack of empirical data to support the claims that approaches to student instruction based on self-concept theory are central to effective learning. Ellis (2001) examines a number of these analyses where self-concept, self-esteem, and…
ERIC Educational Resources Information Center
Flanagan, Rosemary; Esquivel, Giselle B.
2006-01-01
School psychologists have a critical role in identifying social-emotional problems and psychopathology in youth based on a set of personality-assessment competencies. The development of competencies in assessing personality and psychopathology is complex, requiring a variety of integrated methods and approaches. Given the limited extent and scope…
ERIC Educational Resources Information Center
Kang, Okim; Thomson, Ron I.; Moran, Meghan
2018-01-01
This study compared five research-based intelligibility measures as they were applied to six varieties of English. The objective was to determine which approach to measuring intelligibility would be most reliable for predicting listener comprehension, as measured through a listening comprehension test similar to the Test of English as a Foreign…
What Do We Know and How Well Do We Know It? Identifying Practice-Based Insights in Education
ERIC Educational Resources Information Center
Miller, Barbara; Pasley, Joan
2012-01-01
Knowledge derived from practice forms a significant portion of the knowledge base in the education field, yet is not accessible using existing empirical research methods. This paper describes a systematic, rigorous, grounded approach to collecting and analysing practice-based knowledge using the authors' research in teacher leadership as an…
ERIC Educational Resources Information Center
Thune, Taran; Støren, Liv Anne
2015-01-01
Purpose: The purpose of this paper is to present an empirically based discussion of how cooperation between higher education institutions and work organisations (WOs) can increase graduate learning experiences and employability. Design/methodology/approach: Data are based on an electronic and mail-based graduate survey among Norwegian master's…
Elbogen, Eric B; Fuller, Sara; Johnson, Sally C; Brooks, Stephanie; Kinneer, Patricia; Calhoun, Patrick S; Beckham, Jean C
2010-08-01
Increased media attention to post-deployment violence highlights the need to develop effective models to guide risk assessment among military Veterans. Ideally, a method would help identify which Veterans are most at risk for violence so that it can be determined what could be done to prevent violent behavior. This article suggests how empirical approaches to risk assessment used successfully in civilian populations can be applied to Veterans. A review was conducted of the scientific literature on Veteran populations regarding factors related to interpersonal violence generally and to domestic violence specifically. A checklist was then generated of empirically-supported risk factors for clinicians to consider in practice. To conceptualize how these known risk factors relate to a Veteran's violence potential, risk assessment scholarship was utilized to develop an evidence-based method to guide mental health professionals. The goals of this approach are to integrate science into practice, overcome logistical barriers, and permit more effective assessment, monitoring, and management of violence risk for clinicians working with Veterans, both in Department of Veteran Affairs settings and in the broader community. Research is needed to test the predictive validity of risk assessment models. Ultimately, the use of a systematic, empirical framework could lead to improved clinical decision-making in the area of risk assessment and potentially help prevent violence among Veterans. Published by Elsevier Ltd.
Elbogen, Eric B.; Fuller, Sara; Johnson, Sally C.; Brooks, Stephanie; Kinneer, Patricia; Calhoun, Patrick; Beckham, Jean C.
2010-01-01
Despite increased media attention on violent acts against others committed by military Veterans, few models have been developed to systematically guide violence risk assessment among Veterans. Ideally, a model would identify which Veterans are most at risk for violence and increased attention could then be turned to determining what could be done to prevent violent behavior. This article suggests how empirical approaches to risk assessment used successfully in civilian populations can be applied to Veterans. A review was conducted of the scientific literature on Veteran populations regarding factors related to interpersonal violence generally and to domestic violence specifically. A list was then generated of empirically-supported risk factors for clinicians to consider in practice. To conceptualize how these known risk factors relate to a Veteran’s violence potential, risk assessment scholarship was utilized to develop an evidence-based method to guide mental health professionals. The goals of this approach are to integrate science into practice, overcome logistical barriers, and permit more effective assessment, monitoring, and management of violence risk for clinicians working with Veterans, both in Veteran Administration settings and in the broader community. It is likely that the use of a systematic, empirical framework could lead to improved clinical decision-making in the area of risk assessment, and help reduce violence among Veterans. PMID:20627387
An Analysis of Machine- and Human-Analytics in Classification.
Tam, Gary K L; Kothari, Vivek; Chen, Min
2017-01-01
In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.
Estimating topological properties of weighted networks from limited information
NASA Astrophysics Data System (ADS)
Gabrielli, Andrea; Cimini, Giulio; Garlaschelli, Diego; Squartini, Angelo
A typical problem met when studying complex systems is the limited information available on their topology, which hinders our understanding of their structural and dynamical properties. A paramount example is provided by financial networks, whose data are privacy protected. Yet, the estimation of systemic risk strongly depends on the detailed structure of the interbank network. The resulting challenge is that of using aggregate information to statistically reconstruct a network and correctly predict its higher-order properties. Standard approaches either generate unrealistically dense networks, or fail to reproduce the observed topology by assigning homogeneous link weights. Here we develop a reconstruction method, based on statistical mechanics concepts, that exploits the empirical link density in a highly non-trivial way. Technically, our approach consists in the preliminary estimation of node degrees from empirical node strengths and link density, followed by a maximum-entropy inference based on a combination of empirical strengths and estimated degrees. Our method is successfully tested on the international trade network and the interbank money market, and represents a valuable tool for gaining insights on privacy-protected or partially accessible systems. Acknoweledgement to ``Growthcom'' ICT - EC project (Grant No: 611272) and ``Crisislab'' Italian Project.
Moran, Nikki
2014-01-01
The agenda in music research that is broadly recognized as embodied music cognition has arrived hand-in-hand with a social interpretation of music, focusing on the real-world basis of its performance, and fostering an empirical approach to musician movement regarding the communicative function and potential of those movements. However, embodied cognition emerged from traditional cognitivism, which produced a body of scientific explanation of music-theoretic concepts. The analytical object of this corpus is based on the particular imagined encounter of a listener responding to an idealized “work.” Although this problem of essentialism has been identified within mainstream musicology, the lingering effects may spill over into interdisciplinary, empirical research. This paper defines the situation according to its legacy of individualism, and offers an alternative sketch of musical activity as performance event, a model that highlights the social interaction processes at the heart of musical behavior. I describe some recent empirical work based on interaction-oriented approaches, arguing that this particular focus – on the social interaction process itself – creates a distinctive and promising agenda for further research into embodied music cognition. PMID:25101011
The birth of the empirical turn in bioethics.
Borry, Pascal; Schotsmans, Paul; Dierickx, Kris
2005-02-01
Since its origin, bioethics has attracted the collaboration of few social scientists, and social scientific methods of gathering empirical data have remained unfamiliar to ethicists. Recently, however, the clouded relations between the empirical and normative perspectives on bioethics appear to be changing. Three reasons explain why there was no easy and consistent input of empirical evidence in bioethics. Firstly, interdisciplinary dialogue runs the risk of communication problems and divergent objectives. Secondly, the social sciences were absent partners since the beginning of bioethics. Thirdly, the meta-ethical distinction between 'is' and 'ought' created a 'natural' border between the disciplines. Now, bioethics tends to accommodate more empirical research. Three hypotheses explain this emergence. Firstly, dissatisfaction with a foundationalist interpretation of applied ethics created a stimulus to incorporate empirical research in bioethics. Secondly, clinical ethicists became engaged in empirical research due to their strong integration in the medical setting. Thirdly, the rise of the evidence-based paradigm had an influence on the practice of bioethics. However, a problematic relationship cannot simply and easily evolve into a perfect interaction. A new and positive climate for empirical approaches has arisen, but the original difficulties have not disappeared.
Salloch, Sabine; Schildmann, Jan; Vollmann, Jochen
2012-04-13
The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such linkage between the empirical research and the normative analysis. In the first part of this paper, we will outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) The complete lack of normative analysis, and (2) cryptonormativity and a missing account with regard to the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration will be presented and how these concepts may contribute to improve the linkage between normative and empirical aspects of empirical research in medical ethics will be demonstrated. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis.
2012-01-01
Background The methodology of medical ethics during the last few decades has shifted from a predominant use of normative-philosophical analyses to an increasing involvement of empirical methods. The articles which have been published in the course of this so-called 'empirical turn' can be divided into conceptual accounts of empirical-normative collaboration and studies which use socio-empirical methods to investigate ethically relevant issues in concrete social contexts. Discussion A considered reference to normative research questions can be expected from good quality empirical research in medical ethics. However, a significant proportion of empirical studies currently published in medical ethics lacks such linkage between the empirical research and the normative analysis. In the first part of this paper, we will outline two typical shortcomings of empirical studies in medical ethics with regard to a link between normative questions and empirical data: (1) The complete lack of normative analysis, and (2) cryptonormativity and a missing account with regard to the relationship between 'is' and 'ought' statements. Subsequently, two selected concepts of empirical-normative collaboration will be presented and how these concepts may contribute to improve the linkage between normative and empirical aspects of empirical research in medical ethics will be demonstrated. Based on our analysis, as well as our own practical experience with empirical research in medical ethics, we conclude with a sketch of concrete suggestions for the conduct of empirical research in medical ethics. Summary High quality empirical research in medical ethics is in need of a considered reference to normative analysis. In this paper, we demonstrate how conceptual approaches of empirical-normative collaboration can enhance empirical research in medical ethics with regard to the link between empirical research and normative analysis. PMID:22500496
Evidence-based ethics? On evidence-based practice and the "empirical turn" from normative bioethics
Goldenberg, Maya J
2005-01-01
Background The increase in empirical methods of research in bioethics over the last two decades is typically perceived as a welcomed broadening of the discipline, with increased integration of social and life scientists into the field and ethics consultants into the clinical setting, however it also represents a loss of confidence in the typical normative and analytic methods of bioethics. Discussion The recent incipiency of "Evidence-Based Ethics" attests to this phenomenon and should be rejected as a solution to the current ambivalence toward the normative resolution of moral problems in a pluralistic society. While "evidence-based" is typically read in medicine and other life and social sciences as the empirically-adequate standard of reasonable practice and a means for increasing certainty, I propose that the evidence-based movement in fact gains consensus by displacing normative discourse with aggregate or statistically-derived empirical evidence as the "bottom line". Therefore, along with wavering on the fact/value distinction, evidence-based ethics threatens bioethics' normative mandate. The appeal of the evidence-based approach is that it offers a means of negotiating the demands of moral pluralism. Rather than appealing to explicit values that are likely not shared by all, "the evidence" is proposed to adjudicate between competing claims. Quantified measures are notably more "neutral" and democratic than liberal markers like "species normal functioning". Yet the positivist notion that claims stand or fall in light of the evidence is untenable; furthermore, the legacy of positivism entails the quieting of empirically non-verifiable (or at least non-falsifiable) considerations like moral claims and judgments. As a result, evidence-based ethics proposes to operate with the implicit normativity that accompanies the production and presentation of all biomedical and scientific facts unchecked. 
Summary The "empirical turn" in bioethics signals a need for reconsideration of the methods used for moral evaluation and resolution, however the options should not include obscuring normative content by seemingly neutral technical measure. PMID:16277663
AGENT-BASED MODELING OF INDUSTRIAL ECOSYSTEMS
The objectives of this research are to investigate behavioral and organizational questions associated with environmental regulation of firms, and to test specifically whether a bottom-up approach that highlights principal-agent problems offers new insights and empirical validi...
Design flood hydrograph estimation procedure for small and fully-ungauged basins
NASA Astrophysics Data System (ADS)
Grimaldi, S.; Petroselli, A.
2013-12-01
The Rational Formula is the most applied equation in practical hydrology due to its simplicity and the effective compromise between theory and data availability. Although the Rational Formula is affected by several drawbacks, it is reliable and surprisingly accurate considering the paucity of input information. However, after more than a century, the recent computational, theoretical, and large-scale monitoring progresses compel us to try to suggest a more advanced yet still empirical procedure for estimating peak discharge in small and ungauged basins. In this contribution an alternative empirical procedure (named EBA4SUB - Event Based Approach for Small and Ungauged Basins) based on the common modelling steps: design hyetograph, rainfall excess, and rainfall-runoff transformation, is described. The proposed approach, accurately adapted for the fully-ungauged basin condition, provides a potentially better estimation of the peak discharge, a design hydrograph shape, and, most importantly, reduces the subjectivity of the hydrologist in its application.
Communication: Charge-population based dispersion interactions for molecules and materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stöhr, Martin; Department Chemie, Technische Universität München, Lichtenbergstr. 4, D-85748 Garching; Michelitsch, Georg S.
2016-04-21
We introduce a system-independent method to derive effective atomic C{sub 6} coefficients and polarizabilities in molecules and materials purely from charge population analysis. This enables the use of dispersion-correction schemes in electronic structure calculations without recourse to electron-density partitioning schemes and expands their applicability to semi-empirical methods and tight-binding Hamiltonians. We show that the accuracy of our method is en par with established electron-density partitioning based approaches in describing intermolecular C{sub 6} coefficients as well as dispersion energies of weakly bound molecular dimers, organic crystals, and supramolecular complexes. We showcase the utility of our approach by incorporation of the recentlymore » developed many-body dispersion method [Tkatchenko et al., Phys. Rev. Lett. 108, 236402 (2012)] into the semi-empirical density functional tight-binding method and propose the latter as a viable technique to study hybrid organic-inorganic interfaces.« less
Western municipal water conservation policy: The case of disaggregated demand
NASA Astrophysics Data System (ADS)
Burness, Stuart; Chermak, Janie; Krause, Kate
2005-03-01
We investigate aspects of the felicity of both incentive-based and command and control policies in effecting municipal water conservation goals. When demand can be disaggregated according to uses or users, our results suggest that policy efforts be focused on the submarket wherein demand is more elastic. Under plausible consumer parameters, a household production function approach to water utilization prescribes the nature of demand elasticities in alternative uses and squares nicely with empirical results from the literature. An empirical example illustrates. Overall, given data and other informational limitations, extant institutional structures, and in situ technology, our analysis suggests a predisposition for command and control policies over incentive-based tools.
Linking agent-based models and stochastic models of financial markets
Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene
2012-01-01
It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086
Linking agent-based models and stochastic models of financial markets.
Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H Eugene
2012-05-29
It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that "fat" tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting.
Empirical approaches to metacommunities: a review and comparison with theory.
Logue, Jürg B; Mouquet, Nicolas; Peter, Hannes; Hillebrand, Helmut
2011-09-01
Metacommunity theory has advanced understanding of how spatial dynamics and local interactions shape community structure and biodiversity. Here, we review empirical approaches to metacommunities, both observational and experimental, pertaining to how well they relate to and test theoretical metacommunity paradigms and how well they capture the realities of natural ecosystems. First, we show that the species-sorting and mass-effects paradigms are the most commonly tested and supported paradigms. Second, the dynamics observed can often be ascribed to two or more of the four non-exclusive paradigms. Third, empirical approaches relate only weakly to the concise assumptions and predictions made by the paradigms. Consequently, we suggest major avenues of improvement for empirical metacommunity approaches, including the integration across theoretical approaches and the incorporation of evolutionary and meta-ecosystem dynamics. We hope for metacommunity ecology to thereby bridge existing gaps between empirical and theoretical work, thus becoming a more powerful framework to understand dynamics across ecosystems. Copyright © 2011 Elsevier Ltd. All rights reserved.
Vast Portfolio Selection with Gross-exposure Constraints*
Fan, Jianqing; Zhang, Jingjin; Yu, Ke
2012-01-01
We introduce the large portfolio selection using gross-exposure constraints. We show that with gross-exposure constraint the empirically selected optimal portfolios based on estimated covariance matrices have similar performance to the theoretical optimal ones and there is no error accumulation effect from estimation of vast covariance matrices. This gives theoretical justification to the empirical results in Jagannathan and Ma (2003). We also show that the no-short-sale portfolio can be improved by allowing some short positions. The applications to portfolio selection, tracking, and improvements are also addressed. The utility of our new approach is illustrated by simulation and empirical studies on the 100 Fama-French industrial portfolios and the 600 stocks randomly selected from Russell 3000. PMID:23293404
NASA Astrophysics Data System (ADS)
Hauptmann, S.; Bülk, M.; Schön, L.; Erbslöh, S.; Boorsma, K.; Grasso, F.; Kühn, M.; Cheng, P. W.
2014-12-01
Design load simulations for wind turbines are traditionally based on the blade- element-momentum theory (BEM). The BEM approach is derived from a simplified representation of the rotor aerodynamics and several semi-empirical correction models. A more sophisticated approach to account for the complex flow phenomena on wind turbine rotors can be found in the lifting-line free vortex wake method. This approach is based on a more physics based representation, especially for global flow effects. This theory relies on empirical correction models only for the local flow effects, which are associated with the boundary layer of the rotor blades. In this paper the lifting-line free vortex wake method is compared to a state- of-the-art BEM formulation with regard to aerodynamic and aeroelastic load simulations of the 5MW UpWind reference wind turbine. Different aerodynamic load situations as well as standardised design load cases that are sensitive to the aeroelastic modelling are evaluated in detail. This benchmark makes use of the AeroModule developed by ECN, which has been coupled to the multibody simulation code SIMPACK.
NASA Astrophysics Data System (ADS)
Dash, Y.; Mishra, S. K.; Panigrahi, B. K.
2017-12-01
Prediction of northeast/post monsoon rainfall which occur during October, November and December (OND) over Indian peninsula is a challenging task due to the dynamic nature of uncertain chaotic climate. It is imperative to elucidate this issue by examining performance of different machine leaning (ML) approaches. The prime objective of this research is to compare between a) statistical prediction using historical rainfall observations and global atmosphere-ocean predictors like Sea Surface Temperature (SST) and Sea Level Pressure (SLP) and b) empirical prediction based on a time series analysis of past rainfall data without using any other predictors. Initially, ML techniques have been applied on SST and SLP data (1948-2014) obtained from NCEP/NCAR reanalysis monthly mean provided by the NOAA ESRL PSD. Later, this study investigated the applicability of ML methods using OND rainfall time series for 1948-2014 and forecasted up to 2018. The predicted values of aforementioned methods were verified using observed time series data collected from Indian Institute of Tropical Meteorology and the result revealed good performance of ML algorithms with minimal error scores. Thus, it is found that both statistical and empirical methods are useful for long range climatic projections.
Steiner, Silvan
2018-01-01
The importance of various information sources in decision-making in interactive team sports is debated. While some highlight the role of the perceptual information provided by the current game context, others point to the role of knowledge-based information that athletes have regarding their team environment. Recently, an integrative perspective considering the simultaneous involvement of both of these information sources in decision-making in interactive team sports has been presented. In a theoretical example concerning passing decisions, the simultaneous involvement of perceptual and knowledge-based information has been illustrated. However, no precast method of determining the contribution of these two information sources empirically has been provided. The aim of this article is to bridge this gap and present a statistical approach to estimating the effects of perceptual information and associative knowledge on passing decisions. To this end, a sample dataset of scenario-based passing decisions is analyzed. This article shows how the effects of perceivable team positionings and athletes' knowledge about their fellow team members on passing decisions can be estimated. Ways of transfering this approach to real-world situations and implications for future research using more representative designs are presented.
Steiner, Silvan
2018-01-01
The importance of various information sources in decision-making in interactive team sports is debated. While some highlight the role of the perceptual information provided by the current game context, others point to the role of knowledge-based information that athletes have regarding their team environment. Recently, an integrative perspective considering the simultaneous involvement of both of these information sources in decision-making in interactive team sports has been presented. In a theoretical example concerning passing decisions, the simultaneous involvement of perceptual and knowledge-based information has been illustrated. However, no precast method of determining the contribution of these two information sources empirically has been provided. The aim of this article is to bridge this gap and present a statistical approach to estimating the effects of perceptual information and associative knowledge on passing decisions. To this end, a sample dataset of scenario-based passing decisions is analyzed. This article shows how the effects of perceivable team positionings and athletes' knowledge about their fellow team members on passing decisions can be estimated. Ways of transfering this approach to real-world situations and implications for future research using more representative designs are presented. PMID:29623057
NASA Astrophysics Data System (ADS)
Coppola, Diego; Laiolo, Marco; Cigolini, Corrado
2016-04-01
The rate at which the lava is erupted is a crucial parameter to be monitored during any volcanic eruption. However, its accurate and systematic measurement, throughout the whole duration of an event, remains a big challenge, also for volcanologists working on highly studied and well monitored volcanoes. The thermal approach (also known as thermal proxy) is actually one of most promising techniques adopted during effusive eruptions, since it allows to estimate Time Averaged lava Discharge Rates (TADR) from remote-sensed infrared data acquired several time per day. However, due to the complexity of the physic behind the effusive phenomenon and the difficulty to have field validations, the application of the thermal proxy is still debated and limited to few volcanoes only. Here we present the analysis of MODIS Middle InfraRed data, collected by during several distinct eruptions, in order to show how an alternative, empirical method (called radiant density approach; Coppola et al., 2013) permit to estimate TADRs over a wide range of emplacement styles and lava compositions. We suggest that the simplicity of this empirical approach allows its rapid application during eruptive crisis, and provides the basis for more complex models based on the cooling and spreading processes of the active lava bodies.
NASA Astrophysics Data System (ADS)
Kim, Taeyoun; Hwang, Seho; Jang, Seonghyung
2017-01-01
When finding the "sweet spot" of a shale gas reservoir, it is essential to estimate the brittleness index (BI) and total organic carbon (TOC) of the formation. Particularly, the BI is one of the key factors in determining the crack propagation and crushing efficiency for hydraulic fracturing. There are several methods for estimating the BI of a formation, but most of them are empirical equations that are specific to particular rock types. We estimated the mineralogical BI based on elemental capture spectroscopy (ECS) log and elastic BI based on well log data, and we propose a new method for predicting S-wave velocity (VS) using mineralogical BI and elastic BI. The TOC is related to the gas content of shale gas reservoirs. Since it is difficult to perform core analysis for all intervals of shale gas reservoirs, we make empirical equations for the Horn River Basin, Canada, as well as TOC log using a linear relation between core-tested TOC and well log data. In addition, two empirical equations have been suggested for VS prediction based on density and gamma ray log used for TOC analysis. By applying the empirical equations proposed from the perspective of BI and TOC to another well log data and then comparing predicted VS log with real VS log, the validity of empirical equations suggested in this paper has been tested.
Empirical and semi-analytical models for predicting peak outflows caused by embankment dam failures
NASA Astrophysics Data System (ADS)
Wang, Bo; Chen, Yunliang; Wu, Chao; Peng, Yong; Song, Jiajun; Liu, Wenjun; Liu, Xin
2018-07-01
Prediction of peak discharge of floods has attracted great attention for researchers and engineers. In present study, nine typical nonlinear mathematical models are established based on database of 40 historical dam failures. The first eight models that were developed with a series of regression analyses are purely empirical, while the last one is a semi-analytical approach that was derived from an analytical solution of dam-break floods in a trapezoidal channel. Water depth above breach invert (Hw), volume of water stored above breach invert (Vw), embankment length (El), and average embankment width (Ew) are used as independent variables to develop empirical formulas of estimating the peak outflow from breached embankment dams. It is indicated from the multiple regression analysis that a function using the former two variables (i.e., Hw and Vw) produce considerably more accurate results than that using latter two variables (i.e., El and Ew). It is shown that the semi-analytical approach works best in terms of both prediction accuracy and uncertainty, and the established empirical models produce considerably reasonable results except the model only using El. Moreover, present models have been compared with other models available in literature for estimating peak discharge.
ERIC Educational Resources Information Center
Wells, John G.
2016-01-01
Though not empirically established as an efficacious pedagogy for promoting higher order thinking skills, technological/engineering design-based learning in K-12 STEM education is increasingly embraced as a core instructional method for integrative STEM learning that promotes the development of student critical thinking skills (Honey, Pearson,…
ERIC Educational Resources Information Center
Tulbure, Bogdan T.; Szentagotai, Aurora; Dobrean, Anca; David, Daniel
2012-01-01
Investigating the empirical support of various assessment instruments, the evidence based assessment approach expands the scientific basis of psychotherapy. Starting from Hunsley and Mash's evaluative framework, we critically reviewed the rating scales designed to measure social anxiety or phobia in youth. Thirteen of the most researched social…
Towards an Integration of Research on Teaching and Learning
ERIC Educational Resources Information Center
Svensson, Lennart
2016-01-01
The aim of this article is to present arguments for an integrated empirical research on teaching and learning based on previous research and the phenomenographic research tradition. From 1970 and for some years after, the main focus in phenomenographic research was on students' approaches to and understanding of subject matter. Later, based on…
ERIC Educational Resources Information Center
Piercy, Niall
2013-01-01
The use of experiential learning techniques has become popular in business education. Experiential learning approaches offer major benefits for teaching contemporary management practices such as cross-functional and team-based working. However, there remains relatively little empirical data on the success of experiential pedagogies in supporting…
ERIC Educational Resources Information Center
Moran, Daniel J.
2010-01-01
The evidence-based executive coaching movement suggests translating empirical research into practical methods to help leaders develop a repertoire of crisis resiliency and value-directed change management skills. Acceptance and Commitment Therapy (ACT) is an evidence-based modern cognitive-behavior therapy approach that has been applied to…
ERIC Educational Resources Information Center
Kovac, Velibor Bobo; Lund, Ingrid; Omdal, Heidi
2017-01-01
This study explores the possibility that the concept of learning environment (LE) is understood and interpreted differently by various users, depending on their relative positions in the educational system, institutional affiliation, and cultural heritage. The study employs a qualitative approach and is based on 14 semistructured separate…
ERIC Educational Resources Information Center
Stockard, Jean; Wood, Timothy W.
2017-01-01
Most evaluators have embraced the goal of evidence-based practice (EBP). Yet, many have criticized EBP review systems that prioritize randomized control trials and use various criteria to limit the studies examined. They suggest this could produce policy recommendations based on small, unrepresentative segments of the literature and recommend a…
Integrating the Demonstration Orientation and Standards-Based Models of Achievement Goal Theory
ERIC Educational Resources Information Center
Wynne, Heather Marie
2014-01-01
Achievement goal theory and thus, the empirical measures stemming from the research, are currently divided on two conceptual approaches, namely the reason versus aims-based models of achievement goals. The factor structure and predictive utility of goal constructs from the Patterns of Adaptive Learning Strategies (PALS) and the latest two versions…
Modeling of Kerena Emergency Condenser
NASA Astrophysics Data System (ADS)
Bryk, Rafał; Schmidt, Holger; Mull, Thomas; Wagner, Thomas; Ganzmann, Ingo; Herbst, Oliver
2017-12-01
KERENA is an innovative boiling water reactor concept equipped with several passive safety systems. For the experimental verification of the performance of these systems and for code validation, the Integral Test Stand Karlstein (INKA) was built in Karlstein, Germany. The emergency condenser (EC) system transfers heat from the reactor pressure vessel (RPV) to the core flooding pool in case of a water level decrease in the RPV. The EC is composed of a large number of slightly inclined tubes. During accident conditions, steam enters the tubes and condenses due to the contact of the tubes with cold water on the secondary side. The condensed water then flows back to the RPV due to gravity. In this paper, two approaches to modeling condensation in slightly inclined tubes are compared and verified against experiments. The first approach is based on a flow regime map; depending on the regime, the heat transfer coefficient is calculated according to a specific semi-empirical correlation. The second approach uses a general, fully empirical correlation. The models are developed using the object-oriented Modelica language and the open-source OpenModelica environment. The results are compared with data obtained during a large-scale integral test, simulating a loss of coolant accident, performed at INKA. The comparison shows good agreement. Due to the modularity of the models, both may be used in the future in systems incorporating condensation in horizontal or slightly inclined tubes. Depending on preference, the modeller may choose the one-equation-based approach or the more sophisticated model composed of several exchangeable semi-empirical correlations.
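The "general, fully empirical correlation" route can be illustrated with the widely used Shah (1979) in-tube condensation correlation. This is a sketch only; the fluid properties used in the test values are generic placeholders, not INKA conditions:

```python
# Sketch of a single general-purpose condensation correlation (Shah, 1979),
# the kind of "one-equation" model contrasted above with a flow-regime-map
# approach that would dispatch among several regime-specific correlations.
def dittus_boelter_liquid(m_flux, x, d, mu_l, pr_l, k_l):
    """Single-phase HTC (W/m^2K) of the liquid fraction (Dittus-Boelter)."""
    re_l = m_flux * (1.0 - x) * d / mu_l      # liquid-phase Reynolds number
    return 0.023 * re_l**0.8 * pr_l**0.4 * k_l / d

def shah_condensation_htc(m_flux, x, d, mu_l, pr_l, k_l, p_red):
    """Shah's general correlation for condensation inside tubes.

    m_flux : mass flux (kg/m^2 s), x : vapour quality, d : tube diameter (m),
    p_red : reduced pressure (p / p_critical).
    """
    h_l = dittus_boelter_liquid(m_flux, x, d, mu_l, pr_l, k_l)
    return h_l * ((1.0 - x)**0.8
                  + 3.8 * x**0.76 * (1.0 - x)**0.04 / p_red**0.38)
```

A regime-map model would wrap functions like this in a dispatcher keyed on the predicted flow regime (e.g., stratified vs. annular), which is what makes the correlations exchangeable in a modular Modelica implementation.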
DOT National Transportation Integrated Search
2016-04-01
Advancements in pavement management practice require evaluating the performance of pavement preservation treatments using performance-related characteristics. However, state highway agencies face the challenge of developing performance-based relation...
2005-05-01
Revised Manuscript Received March 3, 2005. ABSTRACT: Base flipping is a highly conserved strategy used by enzymes to gain catalytic access to DNA bases…
The Small World of Psychopathology
Borsboom, Denny; Cramer, Angélique O. J.; Schmittmann, Verena D.; Epskamp, Sacha; Waldorp, Lourens J.
2011-01-01
Background Mental disorders are highly comorbid: people having one disorder are likely to have another as well. We explain empirical comorbidity patterns based on a network model of psychiatric symptoms, derived from an analysis of symptom overlap in the Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV). Principal Findings We show that a) half of the symptoms in the DSM-IV network are connected, b) the architecture of these connections conforms to a small world structure, featuring a high degree of clustering but a short average path length, and c) distances between disorders in this structure predict empirical comorbidity rates. Network simulations of Major Depressive Episode and Generalized Anxiety Disorder show that the model faithfully reproduces empirical population statistics for these disorders. Conclusions In the network model, mental disorders are inherently complex. This explains the limited successes of genetic, neuroscientific, and etiological approaches to unravel their causes. We outline a psychosystems approach to investigate the structure and dynamics of mental disorders. PMID:22114671
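The two small-world diagnostics named in the findings (high clustering, short average path length) can be computed directly from an adjacency structure. The miniature symptom graph below is invented for illustration and is far smaller than the DSM-IV network:

```python
from collections import deque

# Toy symptom graph (invented edges) to illustrate the two metrics.
edges = [("sad", "insomnia"), ("sad", "fatigue"), ("insomnia", "fatigue"),
         ("fatigue", "worry"), ("worry", "irritable"), ("irritable", "restless"),
         ("worry", "restless")]
graph = {}
for u, v in edges:
    graph.setdefault(u, set()).add(v)
    graph.setdefault(v, set()).add(u)

def clustering(g):
    """Mean local clustering: fraction of each node's neighbour pairs
    that are themselves connected."""
    coeffs = []
    for node, nbrs in g.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for u in nbrs for v in nbrs if u < v and v in g[u])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

def avg_path_length(g):
    """Average shortest-path length over connected node pairs (BFS)."""
    total, pairs = 0, 0
    for src in g:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in g[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for n, d in dist.items() if n != src)
        pairs += len(dist) - 1
    return total / pairs
```

A small-world structure is one where clustering stays high while the average path length grows only slowly with network size.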
ERIC Educational Resources Information Center
Yogev, Sara; Brett, Jeanne
This paper offers a conceptual framework for the intersection of work and family roles based on the constructs of work involvement and family involvement. The theoretical and empirical literature on the intersection of work and family roles is reviewed from two analytical approaches. From the individual level of analysis, the literature reviewed…
ERIC Educational Resources Information Center
Archer, Louise; Dawson, Emily; DeWitt, Jennifer; Seakins, Amy; Wong, Billy
2015-01-01
This paper sets out an argument and approach for moving beyond a primarily arts-based conceptualization of cultural capital, as has been the tendency within Bourdieusian approaches to date. We advance the notion that, in contemporary society, scientific forms of cultural and social capital can command a high symbolic and exchange value. Our…
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are very relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in the common practice of Information Systems Development, business modeling activities are still of a mostly empirical nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business-modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
Bacterial clonal diagnostics as a tool for evidence-based empiric antibiotic selection
Tchesnokova, Veronika; Avagyan, Hovhannes; Rechkina, Elena; Chan, Diana; Muradova, Mariya; Haile, Helen Ghirmai; Radey, Matthew; Weissman, Scott; Riddell, Kim; Scholes, Delia; Johnson, James R.; Sokurenko, Evgeni V.
2017-01-01
Despite the known clonal distribution of antibiotic resistance in many bacteria, empiric (pre-culture) antibiotic selection still relies heavily on species-level cumulative antibiograms, resulting in overuse of broad-spectrum agents and excessive antibiotic/pathogen mismatch. Urinary tract infections (UTIs), which account for a large share of antibiotic use, are caused predominantly by Escherichia coli, a highly clonal pathogen. In an observational clinical cohort study of urgent care patients with suspected UTI, we assessed the potential for E. coli clonal-level antibiograms to improve empiric antibiotic selection. A novel PCR-based clonotyping assay was applied to fresh urine samples to rapidly detect E. coli and the urine strain's clonotype. Based on a database of clonotype-specific antibiograms, the acceptability of various antibiotics for empiric therapy was inferred using a 20%, 10%, and 30% allowed resistance threshold. The test's performance characteristics and possible effects on prescribing were assessed. The rapid test identified E. coli clonotypes directly in patients’ urine within 25–35 minutes, with high specificity and sensitivity compared to culture. Antibiotic selection based on a clonotype-specific antibiogram could reduce the relative likelihood of antibiotic/pathogen mismatch by ≥ 60%. Compared to observed prescribing patterns, clonal diagnostics-guided antibiotic selection could safely double the use of trimethoprim/sulfamethoxazole and minimize fluoroquinolone use. In summary, a rapid clonotyping test showed promise for improving empiric antibiotic prescribing for E. coli UTI, including reversing preferential use of fluoroquinolones over trimethoprim/sulfamethoxazole. The clonal diagnostics approach merges epidemiologic surveillance, antimicrobial stewardship, and molecular diagnostics to bring evidence-based medicine directly to the point of care. PMID:28350870
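The allowed-resistance thresholding step described above can be sketched as a simple filter over clonotype-specific antibiograms. The clonotype names and resistance rates below are invented placeholders, not the study's database:

```python
# Hypothetical clonotype-specific antibiograms: resistance rate per drug.
clonotype_antibiograms = {
    "ST131-H30": {"ciprofloxacin": 0.85, "trimethoprim/sulfamethoxazole": 0.35,
                  "nitrofurantoin": 0.05},
    "ST73":      {"ciprofloxacin": 0.04, "trimethoprim/sulfamethoxazole": 0.12,
                  "nitrofurantoin": 0.02},
}

def acceptable_antibiotics(clonotype, threshold=0.20):
    """Antibiotics whose clonotype-specific resistance rate is below the
    allowed-resistance threshold, least resistant first."""
    rates = clonotype_antibiograms[clonotype]
    ok = [(drug, r) for drug, r in rates.items() if r < threshold]
    return [drug for drug, r in sorted(ok, key=lambda t: t[1])]
```

Varying `threshold` (the study's 10%, 20%, and 30% settings) trades broader empiric coverage against the risk of antibiotic/pathogen mismatch.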
NASA Astrophysics Data System (ADS)
Bora, Sanjay; Scherbaum, Frank; Kuehn, Nicolas; Stafford, Peter; Edwards, Benjamin
2016-04-01
The current practice of deriving empirical ground motion prediction equations (GMPEs) involves using ground motions recorded at multiple sites. However, in applications like site-specific (e.g., critical facility) hazard analysis, ground motions obtained from GMPEs need to be adjusted/corrected to the particular site or site condition under investigation. This study presents a complete framework for developing a response spectral GMPE within which the issue of adjusting ground motions is addressed in a manner consistent with the linear system framework. The present approach is a two-step process: the first step consists of deriving two separate empirical models, one for Fourier amplitude spectra (FAS) and the other for a random vibration theory (RVT) optimized duration (Drvto) of ground motion. In the second step, the two models are combined within the RVT framework to obtain full response spectral amplitudes. Additionally, the framework involves a stochastic-model-based extrapolation of individual Fourier spectra to extend the usable frequency limit of the empirically derived FAS model. The stochastic model parameters were determined by inverting the Fourier spectral data using an approach similar to that described in Edwards and Faeh (2013). Comparison of median predicted response spectra from the present approach with those from other regional GMPEs indicates that the present approach can also be used as a stand-alone model. The dataset used for the presented analysis is a subset of the recently compiled RESORCE-2012 database, covering Europe, the Middle East and the Mediterranean region.
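The second step, combining a FAS model with an RVT duration to obtain a peak amplitude, can be caricatured via Parseval's theorem and an asymptotic peak factor. This toy sketch uses a flat spectrum and omits the oscillator transfer function a real response spectral calculation would include:

```python
import math

# Toy RVT combination: a one-sided Fourier amplitude spectrum plus a
# duration yields an rms via Parseval and an expected peak via an
# asymptotic (Cartwright/Longuet-Higgins-style) peak factor.
def rvt_peak(freqs, fas, duration):
    """Expected peak motion from a one-sided FAS sampled on a uniform
    frequency grid and an RVT duration (seconds)."""
    df = freqs[1] - freqs[0]
    m0 = 2.0 * sum(a * a for a in fas) * df                      # 0th spectral moment
    m2 = 2.0 * sum((2 * math.pi * f)**2 * a * a
                   for f, a in zip(freqs, fas)) * df             # 2nd spectral moment
    rms = math.sqrt(m0 / duration)
    n_z = duration * math.sqrt(m2 / m0) / math.pi                # expected zero crossings
    n_z = max(n_z, 1.33)                                         # keep the log positive
    ln2n = 2.0 * math.log(n_z)
    peak_factor = math.sqrt(ln2n) + 0.5772 / math.sqrt(ln2n)
    return peak_factor * rms
```

In a full GMPE framework the FAS would first be multiplied by the single-degree-of-freedom oscillator response at each period before the moments are computed.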
Variable Density Multilayer Insulation for Cryogenic Storage
NASA Technical Reports Server (NTRS)
Hedayat, A.; Brown, T. M.; Hastings, L. J.; Martin, J.
2000-01-01
Two analytical models of foam/Variable Density Multi-Layer Insulation (VD-MLI) system performance are discussed. Both models are one-dimensional and contain three heat transfer mechanisms: conduction through the spacer material, radiation between the shields, and conduction through the gas. One model is based on the methodology developed by McIntosh, while the other is based on the Lockheed semi-empirical approach. All model input variables are based on the Multi-purpose Hydrogen Test Bed (MHTB) geometry and available values for material properties and the empirical solid conduction coefficient. Heat flux predictions are in good agreement with the MHTB data. Predictions are presented for foam/MLI combinations with 30, 45, 60, and 75 MLI layers.
A scale-based approach to interdisciplinary research and expertise in sports.
Ibáñez-Gijón, Jorge; Buekers, Martinus; Morice, Antoine; Rao, Guillaume; Mascret, Nicolas; Laurin, Jérome; Montagne, Gilles
2017-02-01
After more than 20 years since the introduction of ecological and dynamical approaches in sports research, their promising opportunity for interdisciplinary research has not been fulfilled yet. The complexity of the research process and the theoretical and empirical difficulties associated with an integrated ecological-dynamical approach have been the major factors hindering the generalisation of interdisciplinary projects in sports sciences. To facilitate this generalisation, we integrate the major concepts from the ecological and dynamical approaches to study behaviour as a multi-scale process. Our integration gravitates around the distinction between functional (ecological) and execution (organic) scales, and their reciprocal intra- and inter-scale constraints. We propose an (epistemological) scale-based definition of constraints that accounts for the concept of synergies as emergent coordinative structures. To illustrate how we can operationalise the notion of multi-scale synergies we use an interdisciplinary model of locomotor pointing. To conclude, we show the value of this approach for interdisciplinary research in sport sciences, as we discuss two examples of task-specific dimensionality reduction techniques in the context of an ongoing project that aims to unveil the determinants of expertise in basketball free throw shooting. These techniques provide relevant empirical evidence to help bootstrap the challenging modelling efforts required in sport sciences.
Time series regression-based pairs trading in the Korean equities market
NASA Astrophysics Data System (ADS)
Kim, Saejoon; Heo, Jun
2017-07-01
Pairs trading is an instance of statistical arbitrage that relies on heavy quantitative data analysis to profit by capitalising on low-risk trading opportunities provided by anomalies of related assets. A key element in pairs trading is the rule by which open and close trading triggers are defined. This paper investigates the use of time series regression to define this rule, which has previously been identified with fixed threshold-based approaches. Empirical results indicate that our approach may yield significantly increased excess returns compared to those obtained by previous approaches on large-capitalisation stocks in the Korean equities market.
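For context, the fixed threshold-based trigger rule that the paper improves upon can be sketched as z-scored regression residuals of one leg on the other. The two synthetic random-walk series below are stand-ins for a real cointegrated pair:

```python
import numpy as np

# Baseline threshold-based pairs trading sketch (not the paper's
# time-series-regression rule): estimate a hedge ratio by OLS, then
# open/close positions on z-scored spread residuals.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(0, 1, 500)) + 100.0   # leg 1 (synthetic price path)
y = 0.8 * x + rng.normal(0, 1, 500)            # leg 2, cointegrated with leg 1

beta, alpha = np.polyfit(x, y, 1)              # hedge ratio and intercept by OLS
spread = y - (alpha + beta * x)
z = (spread - spread.mean()) / spread.std()

open_long = z < -2.0    # spread unusually low: buy y, sell x
open_short = z > 2.0    # spread unusually high: sell y, buy x
close_pos = np.abs(z) < 0.5
```

The paper's contribution is to replace the fixed +/-2 and 0.5 thresholds with triggers derived from a time series regression fitted to the spread dynamics.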
ERIC Educational Resources Information Center
Widiputera, Ferdi; De Witte, Kristof; Groot, Wim; van den Brink, Henriëtte Maassen
2017-01-01
This paper reviews studies on diversity in higher education institutions and suggests empirical approaches to measure diversity. "Diversity" in this paper refers to the internal and external differences among academic programs and institutions. As the empirical literature is relatively salient about how to measure diversity in higher…
Fight the power: the limits of empiricism and the costs of positivistic rigor.
Indick, William
2002-01-01
A summary of the influence of positivistic philosophy and empiricism on the field of psychology is followed by a critique of the empirical method. The dialectic process is advocated as an alternative method of inquiry. The main advantage of the dialectic method is that it is open to any logical argument, including empirical hypotheses, but unlike empiricism, it does not automatically reject arguments that are not based on observable data. Evolutionary and moral psychology are discussed as examples of important fields of study that could benefit from types of arguments that frequently do not conform to the empirical standards of systematic observation and falsifiability of hypotheses. A dialectic method is shown to be a suitable perspective for those fields of research, because it allows for logical arguments that are not empirical and because it fosters a functionalist perspective, which is indispensable for both evolutionary and moral theories. It is suggested that all psychologists may gain from adopting a dialectic approach, rather than restricting themselves to empirical arguments alone.
Recent Progress in Treating Protein-Ligand Interactions with Quantum-Mechanical Methods.
Yilmazer, Nusret Duygu; Korth, Martin
2016-05-16
We review the first successes and failures of a "new wave" of quantum chemistry-based approaches to the treatment of protein/ligand interactions. These approaches share the use of "enhanced", dispersion (D)- and/or hydrogen-bond (H)-corrected density functional theory (DFT) or semi-empirical quantum mechanical (SQM) methods, in combination with ensemble weighting techniques of some form to capture entropic effects. Benchmark and model system calculations, in comparison to high-level theoretical as well as experimental references, have shown that both DFT-D (dispersion-corrected density functional theory) and SQM-DH (dispersion- and hydrogen-bond-corrected semi-empirical quantum mechanical) methods perform much more accurately than older DFT and SQM approaches and also standard docking methods. In addition, DFT-D might soon become, and SQM-DH already is, fast enough to compute a large number of binding modes of comparably large protein/ligand complexes, thus allowing for a more accurate assessment of entropic effects.
Norlyk, Annelise; Harder, Ingegerd
2010-03-01
This article contributes to the debate about phenomenology as a research approach in nursing by providing a systematic review of what nurse researchers hold as phenomenology in published empirical studies. Based on the assumption that presentations of phenomenological approaches in peer-reviewed journals have consequences for the quality of future research, the aim was to analyze articles presenting phenomenological studies and, in light of the findings, raise a discussion about addressing scientific criteria. The analysis revealed considerable variations, ranging from brief to detailed descriptions of the stated phenomenological approach, and from inconsistencies to methodological clarity and rigor. Variations, apparent inconsistencies, and omissions made it unclear what makes a phenomenological study phenomenological. There is a need for clarifying how the principles of the phenomenological philosophy are implemented in a particular study before publishing. This should include an articulation of methodological keywords of the investigated phenomenon, and how an open attitude was adopted.
Estimation of treatment effect in a subpopulation: An empirical Bayes approach.
Shen, Changyu; Li, Xiaochun; Jeong, Jaesik
2016-01-01
It is well recognized that the benefit of a medical intervention may not be distributed evenly in the target population due to patient heterogeneity, and conclusions based on conventional randomized clinical trials may not apply to every person. Given the increasing cost of randomized trials and difficulties in recruiting patients, there is a strong need to develop analytical approaches to estimate treatment effect in subpopulations. In particular, due to limited sample size for subpopulations and the need for multiple comparisons, standard analysis tends to yield wide confidence intervals of the treatment effect that are often noninformative. We propose an empirical Bayes approach to combine both information embedded in a target subpopulation and information from other subjects to construct confidence intervals of the treatment effect. The method is appealing in its simplicity and tangibility in characterizing the uncertainty about the true treatment effect. Simulation studies and a real data analysis are presented.
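The shrinkage idea behind an empirical Bayes subgroup estimate can be sketched as each subgroup's estimate being pulled toward the overall mean in proportion to its sampling variance. The effect sizes, standard errors, and simple method-of-moments variance estimate below are illustrative, not the authors' exact procedure:

```python
import numpy as np

# Hypothetical subgroup treatment-effect estimates and standard errors.
effects = np.array([0.30, 0.10, 0.55, -0.05])   # subgroup point estimates
se = np.array([0.10, 0.20, 0.25, 0.15])         # their standard errors

# Precision-weighted overall mean pools information across subgroups.
overall = np.average(effects, weights=1.0 / se**2)

# Crude method-of-moments estimate of between-subgroup variance tau^2.
tau2 = max(0.0, np.var(effects) - np.mean(se**2))

# Shrinkage weight: noisy subgroups borrow more from the overall mean.
shrink = tau2 / (tau2 + se**2)
posterior = shrink * effects + (1.0 - shrink) * overall
posterior_sd = np.sqrt(shrink * se**2)
```

The resulting posterior intervals are narrower than the subgroup-only intervals, which is exactly the "noninformative wide confidence interval" problem the approach targets.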
Combining medically assisted treatment and Twelve-Step programming: a perspective and review.
Galanter, Marc
2018-01-01
People with severe substance use disorders require long-term rehabilitative care after the initial treatment. There is, however, a deficit in the availability of such care. This may be due both to inadequate medical coverage and to insufficient use of community-based Twelve-Step programs in many treatment facilities. In order to address this deficit, rehabilitative care for severe substance use disorders could be promoted through collaboration between practitioners of medically assisted treatment, employing medications, and Twelve-Step-oriented practitioners. The aims are to describe the limitations and benefits of applying biomedical approaches and Twelve-Step resources in the rehabilitation of persons with severe substance use disorders, and to assess how the two approaches can be employed together to improve clinical outcomes. Empirical literature focusing on clinical and manpower issues is reviewed with regard to (a) limitations in available treatment options in ambulatory and residential addiction treatment facilities for persons with severe substance use disorders, (b) problems of long-term rehabilitation particular to opioid-dependent persons, associated with the limitations of pharmacologic approaches, (c) the relative effectiveness of biomedical and Twelve-Step approaches in the clinical context, and (d) the potential for enhanced use of these approaches, singly and in combination, to address perceived deficits. The biomedical and Twelve-Step-oriented approaches are based on differing theoretical and empirically grounded models. Research-based opportunities are reviewed for improving addiction rehabilitation resources through enhanced collaboration between practitioners of these two potentially complementary practice models. This can involve medications for both acute and chronic treatment for substances for which such medications are available, and Twelve-Step-based support for abstinence and long-term rehabilitation.
Clinical and Scientific Significance: Criteria for developing evidence-based approaches to combined treatment should be established, and research on evidence-based treatment can be undertaken on this basis to improve clinical outcomes.
How to improve the teaching of clinical reasoning: a narrative review and a proposal.
Schmidt, Henk G; Mamede, Sílvia
2015-10-01
The development of clinical reasoning (CR) in students has traditionally been left to clinical rotations, which, however, often offer limited practice and suboptimal supervision. Medical schools begin to address these limitations by organising pre-clinical CR courses. The purpose of this paper is to review the variety of approaches employed in the teaching of CR and to present a proposal to improve these practices. We conducted a narrative review of the literature on teaching CR. To that end, we searched PubMed and Web of Science for papers published until June 2014. Additional publications were identified in the references cited in the initial papers. We used theoretical considerations to characterise approaches and noted empirical findings, when available. Of the 48 reviewed papers, only 24 reported empirical findings. The approaches to teaching CR were shown to vary on two dimensions. The first pertains to the way the case information is presented. The case is either unfolded to students gradually - the 'serial-cue' approach - or is presented in a 'whole-case' format. The second dimension concerns the purpose of the exercise: is its aim to help students acquire or apply knowledge, or is its purpose to teach students a way of thinking? The most prevalent approach is the serial-cue approach, perhaps because it tries to directly simulate the diagnostic activities of doctors. Evidence supporting its effectiveness is, however, lacking. There is some empirical evidence that whole-case, knowledge-oriented approaches contribute to the improvement of students' CR. However, thinking process-oriented approaches were shown to be largely ineffective. Based on research on how expertise develops in medicine, we argue that students in different phases of their training may benefit from different approaches to the teaching of CR. © 2015 John Wiley & Sons Ltd.
Empirical conversion of the vertical profile of reflectivity from Ku-band to S-band frequency
NASA Astrophysics Data System (ADS)
Cao, Qing; Hong, Yang; Qi, Youcun; Wen, Yixin; Zhang, Jian; Gourley, Jonathan J.; Liao, Liang
2013-02-01
This paper presents an empirical method for converting reflectivity from Ku-band (13.8 GHz) to S-band (2.8 GHz) for several hydrometeor species, which facilitates the incorporation of Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR) measurements into quantitative precipitation estimation (QPE) products from the U.S. Next-Generation Radar (NEXRAD). The development of empirical dual-frequency relations is based on theoretical simulations, which have assumed appropriate scattering and microphysical models for liquid and solid hydrometeors (raindrops, snow, and ice/hail). Particle phase, shape, orientation, and density (especially for snow particles) have been considered in applying the T-matrix method to compute the scattering amplitudes. Gamma particle size distribution (PSD) is utilized to model the microphysical properties in the ice region, melting layer, and raining region of precipitating clouds. The variability of PSD parameters is considered to study the characteristics of dual-frequency reflectivity, especially the variations in radar dual-frequency ratio (DFR). The empirical relations between DFR and Ku-band reflectivity have been derived for particles in different regions within the vertical structure of precipitating clouds. The reflectivity conversion using the proposed empirical relations has been tested using real data collected by TRMM-PR and a prototype polarimetric WSR-88D (Weather Surveillance Radar 88 Doppler) radar, KOUN. The processing and analysis of collocated data demonstrate the validity of the proposed empirical relations and substantiate their practical significance for reflectivity conversion, which is essential to the TRMM-based vertical profile of reflectivity correction approach in improving NEXRAD-based QPE.
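Applying an empirical DFR relation amounts to adding a region-dependent correction to the Ku-band measurement, Zs = Zku + DFR(Zku). The polynomial coefficients below are invented placeholders, not the relations derived in the paper:

```python
# Hypothetical region-dependent DFR polynomials (dB): DFR = a*Zku + b*Zku^2.
# Coefficients are illustrative stand-ins for the derived empirical relations.
dfr_coeffs = {
    "rain": (0.10, 0.005),
    "snow": (0.20, 0.010),
}

def ku_to_s(z_ku_dbz, region):
    """Convert Ku-band reflectivity (dBZ) to S-band for a given region
    of the vertical profile (e.g., rain below the melting layer)."""
    a, b = dfr_coeffs[region]
    dfr = a * z_ku_dbz + b * z_ku_dbz**2
    return z_ku_dbz + dfr
```

A full implementation would carry separate relations for the ice region, the melting layer, and rain, selected by the height of each PR gate relative to the melting layer.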
Box-wing model approach for solar radiation pressure modelling in a multi-GNSS scenario
NASA Astrophysics Data System (ADS)
Tobias, Guillermo; Jesús García, Adrián
2016-04-01
The solar radiation pressure (SRP) force is the largest orbital perturbation after the gravitational effects and the major error source affecting GNSS satellites. A wide range of approaches have been developed over the years for modelling this non-gravitational effect as part of the orbit determination process. These approaches are commonly divided into empirical, semi-analytical and analytical, their main difference lying in the amount of a-priori physical information about the properties of the satellites (materials and geometry) and their attitude. It has been shown in the past that pre-launch analytical models fail to achieve the desired accuracy, mainly due to difficulties in the extrapolation of the in-orbit optical and thermal properties, perturbations in the nominal attitude law, and the aging of the satellite's surfaces, whereas empirical models' accuracies strongly depend on the amount of tracking data used for deriving the models, and their performance degrades as the area-to-mass ratio of the GNSS satellites increases, as happens for upcoming constellations such as BeiDou and Galileo. This paper proposes to use a basic box-wing model for Galileo, complemented with empirical parameters, based on the limited available information about the Galileo satellites' geometry. The satellite is modelled as a box, representing the satellite bus, and a wing, representing the solar panel. The performance of the model is assessed for the GPS, GLONASS and Galileo constellations. The results of the proposed approach have been analyzed over a one-year period. In order to assess the results, two different SRP models have been used: firstly, the proposed box-wing model, and secondly, the new CODE empirical model, ECOM2. The orbit performances of both models are assessed using Satellite Laser Ranging (SLR) measurements, together with the evaluation of orbit prediction accuracy. This comparison shows the advantages and disadvantages of taking the physical interactions between the satellite and solar radiation into account, with respect to a purely empirical model.
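The physical content of a box-wing model can be caricatured as a sum of flat-plate SRP contributions. The areas, mass, and optical coefficients below are illustrative, not actual Galileo values, and a real model resolves force directions rather than collapsing them to magnitudes:

```python
import math

# Toy box-wing SRP sketch: each sunlit flat surface contributes absorbed,
# specularly reflected, and diffusely reflected photon momentum.
P_SUN = 4.56e-6   # solar radiation pressure at 1 AU (N/m^2)

def flat_plate_accel(area, mass, cos_theta, rho_spec, rho_diff):
    """Rough magnitude of SRP acceleration (m/s^2) from one plate, where
    cos_theta is the cosine of the Sun-to-plate-normal angle."""
    if cos_theta <= 0.0:
        return 0.0   # surface not illuminated
    # absorbed + specular + diffuse contributions, collapsed to a magnitude
    return (P_SUN * area / mass) * cos_theta * (1.0 + rho_spec + 2.0 / 3.0 * rho_diff)

# Illustrative 700 kg satellite: one bus face at 30 deg incidence plus a
# Sun-pointing solar panel ("wing").
a_total = (flat_plate_accel(3.0, 700.0, math.cos(math.radians(30)), 0.2, 0.1)
           + flat_plate_accel(11.0, 700.0, 1.0, 0.1, 0.1))
```

The "complemented with empirical parameters" part of the proposed approach corresponds to estimating residual accelerations on top of such a physical a-priori model during orbit determination.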
Chan, R W
2001-09-01
Empirical data on the viscoelastic shear properties of human vocal-fold mucosa (cover) were recently reported at relatively low frequency (0.01-15 Hz). For the data to become relevant to voice production, attempts have been made to parametrize and extrapolate the data to higher frequencies using constitutive modeling [Chan and Titze, J. Acoust. Soc. Am. 107, 565-580 (2000)]. This study investigated the feasibility of an alternative approach for data extrapolation, namely the principle of time-temperature superposition (TTS). TTS is a hybrid theoretical-empirical approach widely used by rheologists to estimate the viscoelastic properties of polymeric systems at time or frequency scales not readily accessible experimentally. It is based on the observation that for many polymers, the molecular configurational changes that occur in a given time scale at a low temperature correspond to those that occur in a shorter time scale at a higher temperature. Using a rotational rheometer, the elastic shear modulus (G') and viscous shear modulus (G'') of vocal-fold cover (superficial layer of lamina propria) tissue samples were measured at 0.01-15 Hz at relatively low temperatures (5 degrees-37 degrees C). Data were empirically shifted according to TTS, yielding composite "master curves" for predicting the magnitude of the shear moduli at higher frequencies at 37 degrees C. Results showed that TTS may be a feasible approach for estimating the viscoelastic shear properties of vocal-fold tissues at frequencies of phonation (on the order of 100-1000 Hz).
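The TTS shift itself is commonly parametrized with the WLF equation. The sketch below uses invented illustrative constants (real C1/C2 values must be fitted to the tissue data) to map a low-temperature measurement frequency to its equivalent at body temperature:

```python
# Sketch of a time-temperature superposition (TTS) frequency shift via
# the WLF equation. C1/C2 below are invented illustrative values, not
# constants fitted to vocal-fold tissue.
C1, C2 = 5.0, 120.0
T_REF = 37.0          # reference temperature (deg C)

def wlf_shift_factor(temp_c):
    """log10 of the horizontal shift a_T applied to data measured at temp_c."""
    return -C1 * (temp_c - T_REF) / (C2 + (temp_c - T_REF))

def reduced_frequency(f_hz, temp_c):
    """Frequency at T_REF equivalent to f_hz measured at temp_c."""
    return f_hz * 10.0 ** wlf_shift_factor(temp_c)
```

With these illustrative constants, a 15 Hz measurement at 5 degrees C maps to a reduced frequency near 1 kHz at 37 degrees C, which is how low-frequency, low-temperature rheometry can be extrapolated toward phonation frequencies.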
ERIC Educational Resources Information Center
Shulha, Lyn M.; Whitmore, Elizabeth; Cousins, J. Bradley; Gilbert, Nathalie; al Hudib, Hind
2016-01-01
This article introduces a set of evidence-based principles to guide evaluation practice in contexts where evaluation knowledge is collaboratively produced by evaluators and stakeholders. The data from this study evolved in four phases: two pilot phases exploring the desirability of developing a set of principles; an online questionnaire survey…
ERIC Educational Resources Information Center
Ousley, Chris
2010-01-01
This study sought to provide empirical evidence regarding the use of spatial analysis in enrollment management to predict persistence and graduation. The research utilized data from the 2000 U.S. Census and applicant records from The University of Arizona to study the spatial distributions of enrollments. Based on the initial results, stepwise…
ERIC Educational Resources Information Center
Holmberg, Carrie
2017-01-01
This study involved empirical investigation of a moves-based conceptualization of teacher practices of planning, enacting, and reflecting on formative assessment (FA) in mathematics classrooms in a high-needs school district in California. A qualitative case study of six middle school mathematics teachers' practices of "posing"…
Sources of Meaningfulness in the Workplace: A Study in the US Hospitality Sector
ERIC Educational Resources Information Center
Dimitrov, Danielle
2012-01-01
Purpose: The purpose of this paper is to explore the sources of meaningfulness at the workplace, according to the perceptions of hospitality employees from different national cultures in one US-based hotel, based on Dimitrov's empirical study about the features of the humane organization. Design/methodology/approach: This was an exploratory…
ERIC Educational Resources Information Center
Rosen, Yigal
2015-01-01
How can activities in which collaborative skills of an individual are measured be standardized? In order to understand how students perform on collaborative problem solving (CPS) computer-based assessment, it is necessary to examine empirically the multi-faceted performance that may be distributed across collaboration methods. The aim of this…
Theoretical and Empirical Base for Implementation Components of Health-Promoting Schools
ERIC Educational Resources Information Center
Samdal, Oddrun; Rowling, Louise
2011-01-01
Purpose: Efforts to create a scientific base for the health-promoting school approach have so far not articulated a clear "Science of Delivery". There is thus a need for systematic identification of clearly operationalised implementation components. To address a next step in the refinement of the health-promoting schools' work, this paper sets out…
ERIC Educational Resources Information Center
Torres, Irene; Simovska, Venka
2017-01-01
Purpose: The purpose of this paper is to contribute to the debate concerning community participation in school-based health education and health promotion, with regard to food and nutrition. Design/methodology/approach: Based on empirical data generated over the course of one year of fieldwork in three rural communities and schools in Ecuador, the…
The Value of Web Log Data in Use-based Design and Testing.
ERIC Educational Resources Information Center
Burton, Mary C.; Walther, Joseph B.
2001-01-01
Suggests Web-based logs contain useful empirical data with which World Wide Web designers and design theorists can assess usability and effectiveness of design choices. Enumerates identification of types of Web server logs, client logs, types and uses of log data, and issues associated with the validity of these data. Presents an approach to…
Louis R. Iverson; Anantha M. Prasad; Stephen N. Matthews; Matthew P. Peters
2011-01-01
We present an approach to modeling potential climate-driven changes in habitat for tree and bird species in the eastern United States. First, we took an empirical-statistical modeling approach, using randomForest, with species abundance data from national inventories combined with soil, climate, and landscape variables, to build abundance-based habitat models for 134...
ERIC Educational Resources Information Center
Tempelman, E.; Pilot, A.
2011-01-01
In 2007, the Faculty of Industrial Design Engineering of the Delft University of Technology introduced a new bachelor program. Based on theories of learning and instruction three design principles were used to develop an approach that aims to make it easier for students to bridge the gap between theoretical design engineering courses and practical…
Pattern-Directed Attention in Uncertain Frequency Detection.
1983-10-14
performance when compared to a single-frequency condition even if the listeners are aware that more than one signal can occur (Creelman, 1960; Green...be missed. On the other hand, the multiple-band approach, introduced by Green (1958) and modified by Creelman (1960), assumes that listeners base...multiple-band approaches (Creelman, 1960; Green, 1961; Macmillan & Schwartz, 1975). In general, the two views are difficult to distinguish empirically, and
NASA Astrophysics Data System (ADS)
Mikeš, Daniel
2010-05-01
Theoretical geology. Present-day geology is mostly empirical in nature. I claim that geology is by nature complex and that the empirical approach is bound to fail. Let us consider the input to be the set of ambient conditions and the output to be the sedimentary rock record. I claim that the output can only be deduced from the input if the relation from input to output is known. The fundamental question is therefore the following: can one predict the output from the input, that is, can one predict the behaviour of a sedimentary system? If one can, then the empirical/deductive method has a chance; if one cannot, then that method is bound to fail. The fundamental problem to solve is therefore the following: how does one predict the behaviour of a sedimentary system? It is interesting to observe that this question is never asked, and many a study is conducted by the empirical/deductive method; it seems that the empirical method has been accepted as appropriate without question. It is, however, easy to argue that a sedimentary system is by nature complex, that several input parameters vary at the same time, and that they can create similar output in the rock record. It follows trivially from these first principles that in such a case the deductive solution cannot be unique. At the same time, several geological methods depart precisely from the assumption that one particular variable is the dictator/driver and that the others are constant, even though the data do not support such an assumption. The method of "sequence stratigraphy" is a typical example of such a dogma. It can easily be argued that all interpretation resulting from a method built on uncertain or wrong assumptions is erroneous. Still, this method has survived for many years, notwithstanding all the criticism it has received. This is just one example from the present-day geological world, and it is not unique.
Even the alternative methods criticising sequence stratigraphy actually depart from the same erroneous assumptions and do not address the fundamental issue that lies at the base of the problem. This problem is straightforward and obvious: a sedimentary system is inherently four-dimensional (three spatial dimensions plus one temporal dimension). Any method using a smaller number of dimensions is bound to fail to describe the evolution of a sedimentary system. It is indicative of the present-day geological world that such fundamental issues are overlooked; the only reason one can point to is the so-called "rationality" of today's society. Simple common sense leads to the conclusion that in this case the empirical method is bound to fail, and that the only method that can solve the problem is the theoretical approach. This reasoning is completely trivial for the traditional exact sciences, such as physics and mathematics, and for applied sciences such as engineering. Not so for geology, a science that was traditionally descriptive and then jumped to empirical science, skipping the stage of theoretical science. I argue that this gap of theoretical geology is left open and needs to be filled. Every discipline in geology lacks a theoretical base. This base can only be built by the theoretical/inductive approach; it cannot be built by the empirical/deductive approach. Once a critical mass of geologists realises this flaw in today's geology, we can start solving the fundamental problems of geology.
Braking the bandwagon: scrutinizing the science and politics of empirically supported therapies.
Hagemoser, Steven D
2009-12-01
Proponents of empirically supported therapies (ESTs) argue that because manualized ESTs have demonstrated efficacy in treating a range of psychological disorders, they should be the treatments of choice. In this article, the author uses a hypothetical treatment for obesity to highlight numerous flaws in EST logic and argues for common factors as a more clinically relevant but empirically challenging approach. The author then explores how political variables may be contributing to the expansion of EST and the resulting restriction of practitioner autonomy. Last, the author argues that EST is best viewed as 1 component of a more comprehensive evidence-based practice framework. The author concludes with some cautionary statements about the perils of equating the EST paradigm with the scientist-practitioner ideal.
NASA Astrophysics Data System (ADS)
Dong, Lieqian; Wang, Deying; Zhang, Yimeng; Zhou, Datong
2017-09-01
Signal enhancement is a necessary step in seismic data processing. In this paper we utilize the complementary ensemble empirical mode decomposition (CEEMD) and complex curvelet transform (CCT) methods to separate signal from random noise and thereby improve the signal-to-noise (S/N) ratio. First, the original noisy data are decomposed into a series of intrinsic mode function (IMF) profiles with the aid of CEEMD. The noisy IMFs are then transformed into the CCT domain. By choosing a different threshold for each IMF profile, based on its noise level, the noise in the original data can be suppressed. Finally, we illustrate the effectiveness of the approach on simulated and field datasets.
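A toy version of the per-IMF thresholding step can be sketched as follows, with an FFT standing in for the complex curvelet transform (a deliberate simplification) and a MAD-based estimate of each component's noise level; none of this reproduces the authors' exact pipeline:

```python
import numpy as np

def threshold_components(components, k=3.0):
    """Soft-threshold each component in a transform domain and sum.
    The FFT stands in for the curvelet transform; the threshold tracks
    each component's own noise level via the MAD estimator."""
    out = np.zeros(len(components[0]))
    for c in components:
        coeffs = np.fft.rfft(c)
        sigma = np.median(np.abs(c - np.median(c))) / 0.6745  # MAD noise scale
        thr = k * sigma * np.sqrt(len(c) / 2.0)               # domain scaling
        mag = np.abs(coeffs)
        shrink = np.maximum(1.0 - thr / np.maximum(mag, 1e-30), 0.0)
        out += np.fft.irfft(coeffs * shrink, n=len(c))
    return out
```

On a noisy sinusoid the strong signal coefficient survives (slightly shrunk) while the weak noise coefficients are zeroed, which is the per-component denoising idea in miniature.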
Educational Cost-Benefit Analysis.
ERIC Educational Resources Information Center
Hough, J. R.
1994-01-01
Educational cost-benefit analysis, as practiced in both industrialized and developing nations, has been much criticized. Manpower planning, the principal alternative, has received even harsher criticism. The two approaches should be combined in empirically based projects that study recent graduates and chart their subsequent employment progress.…
U.S. ENVIRONMENTAL PROTECTION AGENCY'S LANDFILL GAS EMISSION MODEL (LANDGEM)
The paper discusses EPA's available software for estimating landfill gas emissions. This software is based on a first-order decomposition rate equation using empirical data from U.S. landfills. The software provides a relatively simple approach to estimating landfill gas emissions...
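The first-order decomposition rate equation behind LandGEM can be sketched as below; the default k and L0 are illustrative placeholders, not EPA's regulatory values, and the yearly lumping is a simplification of the model's decile scheme:

```python
import math

def landfill_methane(annual_waste, k=0.05, L0=170.0, years=30):
    """First-order decay sketch in the spirit of LandGEM.
    annual_waste[i]: Mg of waste accepted in year i after opening.
    k: decay rate (1/yr); L0: methane generation potential (m^3/Mg).
    Returns estimated methane generation (m^3/yr) for years 0..years-1."""
    rates = []
    for t in range(years):
        # each waste batch contributes k*L0*M*exp(-k*age) once it has aged
        q = sum(k * L0 * m * math.exp(-k * (t - i))
                for i, m in enumerate(annual_waste) if t > i)
        rates.append(q)
    return rates
```

A single year's waste produces a generation curve that peaks the year after placement and then decays exponentially, which is the defining behavior of the first-order model.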
Team Learning in Technology-Mediated Distributed Teams
ERIC Educational Resources Information Center
Andres, Hayward P.; Shipps, Belinda P.
2010-01-01
This study examines technological, educational/learning, and social affordances associated with the facilitation of project-based learning and problem solving in technology-mediated distributed teams. An empirical interpretive research approach using direct observation is used to interpret, evaluate and rate observable manifested behaviors and…
NASA Astrophysics Data System (ADS)
Li, Lingqi; Gottschalk, Lars; Krasovskaia, Irina; Xiong, Lihua
2018-01-01
Reconstruction of missing runoff data is of great importance for resolving the tension between the common occurrence of gaps and the fundamental need for complete time series in reliable hydrological research. The conventional empirical orthogonal functions (EOF) approach has been documented to be useful for interpolating hydrological series based upon spatiotemporal decomposition of runoff variation patterns, without additional measurements (e.g., precipitation, land cover). This study develops a new EOF-based approach (abbreviated as CEOF) that conditions the EOF expansion on the oscillations at the outlet (or any other reference station) of a target basin and creates a set of residual series by removing the dependence on this reference series, in order to redefine the amplitude functions (components). This development allows a transparent hydrological interpretation of the dimensionless components and thereby strengthens their capacity to explain various runoff regimes in a basin. The two approaches are demonstrated in an application to discharge observations from the Ganjiang basin, China. Two alternatives for determining amplitude functions, based on centred and standardised series respectively, are tested. The convergence of the reconstruction at different sites as a function of the number of components, and its relation to the characteristics of each site, are analysed. Results indicate that the CEOF approach offers an efficient way to restore runoff records with only one to four components; it shows a greater advantage in nested large basins than at headwater sites and often performs better than the EOF approach when using standardised series, especially in improving infilling accuracy for low flows. Comparisons against other interpolation methods (i.e., nearest neighbour, linear regression, inverse distance weighting) further confirm the advantage of the EOF-based approaches in avoiding spatial and temporal inconsistencies in estimated series.
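A gap-filling loop in the spirit of the plain EOF approach (not the paper's CEOF refinement) can be sketched with a truncated SVD; the layout and iteration scheme are common choices, assumed here rather than taken from the paper:

```python
import numpy as np

def eof_infill(data, n_modes=1, n_iter=100):
    """Iterative EOF gap filling: data is (time x station) with NaNs at
    missing entries. Gaps start at station means and are refined by
    repeatedly reconstructing with the leading n_modes EOF modes."""
    X = data.copy()
    gaps = np.isnan(X)
    X[gaps] = np.take(np.nanmean(data, axis=0), np.where(gaps)[1])
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
        X[gaps] = recon[gaps]   # only missing entries are updated
    return X
```

For runoff series that are well described by a few spatial modes, the missing entries converge to values consistent with the dominant variation pattern.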
How rational should bioethics be? The value of empirical approaches.
Alvarez, A A
2001-10-01
Rational justification of claims with empirical content calls for empirical, and not only normative philosophical, investigation. Empirical approaches to bioethics are epistemically valuable, i.e., such methods may be necessary in providing and verifying basic knowledge about cultural values and norms. Our assumptions in moral reasoning can be verified or corrected using these methods. Moral arguments can be initiated or adjudicated by data drawn from empirical investigation. One may argue that individualistic informed consent, for example, is not compatible with the Asian communitarian orientation. But this normative claim rests on an empirical assumption that may be contrary to the fact that some Asians do value and argue for informed consent. Is it necessary and factual to neatly characterize some cultures as individualistic and some as communitarian? Empirical investigation can provide a reasonable way to inform such generalizations. In a multi-cultural context, such as in the Philippines, there is a need to investigate the nature of the local ethos before making any appeal to authenticity. Otherwise we may succumb to the same ethical imperialism we are trying hard to resist. Normative claims that involve empirical premises cannot be reasonably verified or evaluated without utilizing empirical methods along with philosophical reflection. The integration of empirical methods into the standard normative approach to moral reasoning should be guided by the epistemic demands of claims arising from cross-cultural discourse in bioethics.
Rector, Neil A; Man, Vincent; Lerman, Bethany
2014-06-01
Cognitive-behavioural therapy (CBT) is an empirically supported treatment for anxiety disorders. CBT treatments are based on disorder-specific protocols that have been developed to target individual anxiety disorders, even though anxiety disorders frequently co-occur and are comorbid with depression. Given the high rates of diagnostic comorbidity, substantial overlap in dimensional symptom ratings, and extensive evidence that the mood and anxiety disorders share a common set of psychological and biological vulnerabilities, transdiagnostic CBT protocols have recently been developed to treat the commonalities among the mood and anxiety disorders. We conducted a selective review of empirical developments in the transdiagnostic CBT treatment of anxiety and depression (2008-2013). Preliminary evidence suggests that theoretically based transdiagnostic CBT approaches lead to large treatment effects on the primary anxiety disorder, considerable reduction of diagnostic comorbidity, and some preliminary effects regarding the impact on the putative, shared psychological mechanisms. However, the empirical literature remains tentative owing to relatively small samples, limited direct comparisons with disorder-specific CBT protocols, and the relative absence of the study of disorder-specific compared with shared mechanisms of action in treatment. We conclude with a treatment conceptualization of the new transdiagnostic interventions as complementary, rather than contradictory, to disorder-specific CBT.
Estimating topological properties of weighted networks from limited information.
Cimini, Giulio; Squartini, Tiziano; Gabrielli, Andrea; Garlaschelli, Diego
2015-10-01
A problem typically encountered when studying complex systems is the limitedness of the information available on their topology, which hinders our understanding of their structure and of the dynamical processes taking place on them. A paramount example is provided by financial networks, whose data are privacy protected: Banks publicly disclose only their aggregate exposure towards other banks, keeping individual exposures towards each single bank secret. Yet, the estimation of systemic risk strongly depends on the detailed structure of the interbank network. The resulting challenge is that of using aggregate information to statistically reconstruct a network and correctly predict its higher-order properties. Standard approaches either generate unrealistically dense networks, or fail to reproduce the observed topology by assigning homogeneous link weights. Here, we develop a reconstruction method, based on statistical mechanics concepts, that makes use of the empirical link density in a highly nontrivial way. Technically, our approach consists in the preliminary estimation of node degrees from empirical node strengths and link density, followed by a maximum-entropy inference based on a combination of empirical strengths and estimated degrees. Our method is successfully tested on the international trade network and the interbank money market, and represents a valuable tool for gaining insights on privacy-protected or partially accessible systems.
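The first stage of the reconstruction, estimating node degrees from strengths and link density, can be sketched with a fitness-style ansatz; the functional form below is a common choice in this literature, presented here as an assumption rather than the authors' exact estimator:

```python
import math
import numpy as np

def degrees_from_strengths(strengths, density):
    """Fitness ansatz: link probability p_ij = z*s_i*s_j / (1 + z*s_i*s_j),
    with z tuned by log-space bisection so the expected link density
    matches the observed one. Returns expected node degrees."""
    s = np.asarray(strengths, dtype=float)
    n = len(s)
    target = density * n * (n - 1) / 2.0   # expected number of links

    def links(z):
        x = z * np.outer(s, s)
        return np.triu(x / (1.0 + x), 1).sum()

    lo, hi = 1e-12, 1e12
    for _ in range(200):                    # links(z) is monotone in z
        mid = math.sqrt(lo * hi)
        if links(mid) < target:
            lo = mid
        else:
            hi = mid
    z = math.sqrt(lo * hi)
    p = z * np.outer(s, s) / (1.0 + z * np.outer(s, s))
    np.fill_diagonal(p, 0.0)
    return p.sum(axis=1)
```

The estimated degrees would then feed a maximum-entropy inference combining strengths and degrees, as the abstract describes; that second stage is omitted here.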
Retooling Predictive Relations for non-volatile PM by Comparison to Measurements
NASA Astrophysics Data System (ADS)
Vander Wal, R. L.; Abrahamson, J. P.
2015-12-01
Non-volatile particulate matter (nvPM) emissions from jet aircraft at cruise altitude are of particular interest for climate and atmospheric processes but are difficult to measure and are normally approximated. To provide such inventory estimates, the present approach is to use measured, ground-based values with scaling to cruise (engine operating) conditions. Several points are raised by this approach. The first is which ground-based values to use. Empirical and semi-empirical approaches, such as the revised first-order approximation (FOA3) and formation-oxidation (FOX) methods, each with embedded assumptions, are available to calculate a ground-based black carbon concentration, CBC. The second is the scaling relation, which can depend upon the ratios of fuel-air equivalence, pressure, and combustor flame temperature. We are using measured ground-based values to evaluate the accuracy of present methods, towards developing alternative methods for estimating CBC by smoke number or via a semi-empirical kinetic method for the specific engine, the CFM56-2C, which is representative of a rich-dome-style combustor and one of the most prevalent engine families in commercial use. Applying scaling relations to measured ground-based values and comparing against measurements at cruise evaluates the accuracy of the current scaling formalism. In partnership with GE Aviation, performing engine cycle deck calculations enables critical comparison between estimated or predicted thermodynamic parameters and true (engine) operational values for the CFM56-2C engine. Such specific comparisons allow differences between predictive estimates for, and measurements of, nvPM to be traced to their origin, as either divergence of input parameters or the functional form of the predictive relations. Such insights will lead to the development of new predictive tools for jet aircraft nvPM emissions. These validated relations can then be extended to alternative fuels with confidence in the operational thermodynamic values and functional form.
Comparisons will then be made between these new predictive relationships and measurements of nvPM from alternative fuels using ground and cruise data, as collected during the NASA-led AAFEX and ACCESS field campaigns, respectively.
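A purely illustrative ground-to-cruise scaling sketch; the ratio form and the activation temperature below are assumptions for exposition only and are not the FOA3 or FOX coefficients:

```python
import math

def cbc_cruise(cbc_ground, p_ratio, phi_ratio, t_fl_ground, t_fl_cruise,
               t_a=20000.0):
    """Scale a ground-based black carbon concentration to cruise using
    pressure and fuel-air equivalence ratios plus an Arrhenius-style
    flame-temperature factor (t_a is a hypothetical activation temperature)."""
    arrhenius = math.exp(t_a / t_fl_ground - t_a / t_fl_cruise)
    return cbc_ground * p_ratio * phi_ratio * arrhenius
```

With all ratios at unity and equal flame temperatures the scaling is the identity, which is the minimal consistency property any such relation must satisfy.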
Shim, Jihyun; Mackerell, Alexander D
2011-05-01
A significant number of drug discovery efforts are based on natural products or high-throughput screens from which compounds showing potential therapeutic effects are identified without knowledge of the target molecule or its 3D structure. In such cases computational ligand-based drug design (LBDD) can accelerate the drug discovery process. LBDD is a general approach to elucidating the relationship of a compound's structure and physicochemical attributes to its biological activity. The resulting structure-activity relationship (SAR) may then act as the basis for the prediction of compounds with improved biological attributes. LBDD methods range from pharmacophore models, which identify the essential features of ligands responsible for their activity, through quantitative structure-activity relationships (QSAR), which yield quantitative estimates of activities based on physicochemical properties, to similarity searching, which explores compounds with similar properties, as well as various combinations of the above. A number of recent LBDD approaches involve the use of multiple conformations of the ligands being studied. One of the basic components used to generate multiple conformations in LBDD is molecular mechanics (MM), which applies an empirical energy function to relate conformation to energies and forces. The collection of conformations for the ligands is then combined with functional data using methods ranging from regression analysis to neural networks, from which the SAR is determined. Accordingly, for effective application of LBDD to SAR determination it is important that the compounds be accurately modelled, such that the appropriate range of conformations accessible to the ligands is identified. Such accurate modelling is largely based on the use of an appropriate empirical force field for the molecules being investigated and on the approaches used to generate the conformations.
The present chapter includes a brief overview of currently used SAR methods in LBDD followed by a more detailed presentation of issues and limitations associated with empirical energy functions and conformational sampling methods.
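As a small illustration of the similarity-searching component of LBDD mentioned above, the Tanimoto coefficient over binary fingerprints can be computed as follows (the bit indices in the test are hypothetical; a real workflow would derive fingerprints with a cheminformatics toolkit):

```python
def tanimoto(bits_a, bits_b):
    """Tanimoto similarity between two binary fingerprints given as
    collections of 'on' bit indices: |A & B| / |A | B|."""
    a, b = set(bits_a), set(bits_b)
    if not a and not b:
        return 0.0
    inter = len(a & b)
    return inter / (len(a) + len(b) - inter)
```

Compounds whose fingerprints share most of their bits score near 1.0 and are ranked as likely to share activity, which is the working hypothesis behind similarity searching.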
Critical Realism and Empirical Bioethics: A Methodological Exposition.
McKeown, Alex
2017-09-01
This paper shows how critical realism can be used to integrate empirical data and philosophical analysis within 'empirical bioethics'. The term empirical bioethics, whilst appearing oxymoronic, simply refers to an interdisciplinary approach to the resolution of practical ethical issues within the biological and life sciences, integrating social scientific, empirical data with philosophical analysis. It seeks to achieve a balanced form of ethical deliberation that is both logically rigorous and sensitive to context, to generate normative conclusions that are practically applicable to the problem, challenge, or dilemma. Since it incorporates both philosophical and social scientific components, empirical bioethics is a field that is consistent with the use of critical realism as a research methodology. The integration of philosophical and social scientific approaches to ethics has been beset with difficulties, not least because of the irreducibly normative, rather than descriptive, nature of ethical analysis and the contested relation between fact and value. However, given that facts about states of affairs inform potential courses of action and their consequences, there is a need to overcome these difficulties and successfully integrate data with theory. Previous approaches have been formulated to overcome obstacles in combining philosophical and social scientific perspectives in bioethical analysis; however each has shortcomings. As a mature interdisciplinary approach critical realism is well suited to empirical bioethics, although it has hitherto not been widely used. Here I show how it can be applied to this kind of research and explain how it represents an improvement on previous approaches.
Identifying Similarities in Cognitive Subtest Functional Requirements: An Empirical Approach
ERIC Educational Resources Information Center
Frisby, Craig L.; Parkin, Jason R.
2007-01-01
In the cognitive test interpretation literature, a Rational/Intuitive, Indirect Empirical, or Combined approach is typically used to construct conceptual taxonomies of the functional (behavioral) similarities between subtests. To address shortcomings of these approaches, the functional requirements for 49 subtests from six individually…
NASA Astrophysics Data System (ADS)
Sternberg, Oren; Bednarski, Valerie R.; Perez, Israel; Wheeland, Sara; Rockway, John D.
2016-09-01
Non-invasive optical techniques for the remote sensing of power quality disturbances (PQD) are part of an emerging technology field typically dominated by radio frequency (RF) and invasive techniques. Algorithms and methods to analyze and address PQD, such as probabilistic neural networks and fully informed particle swarms, have been explored in industry and academia. Such methods are tuned to work with RF equipment and electronics in existing power grids. As both commercial and defense assets are heavily power-dependent, understanding electrical transients and failure events using non-invasive detection techniques is crucial. In this paper we correlate empirical power quality models with the observed optical response. We also empirically demonstrate a first-order approach to mapping household, office and commercial equipment PQD to user functions and stress levels. We employ a physics-based image and signal processing approach that demonstrates non-invasive (remote sensing) techniques for detecting the base frequency associated with the power source and mapping it to the various PQD on a calibrated source.
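The base-frequency detection step can be sketched as a simple FFT peak pick over the sampled (optical or electrical) waveform; the sampling rate and test tone below are illustrative, not the paper's measurement setup:

```python
import numpy as np

def base_frequency(signal, fs):
    """Estimate the dominant (base) frequency of a sampled waveform
    from the peak of its magnitude spectrum. fs: sampling rate in Hz."""
    x = np.asarray(signal, dtype=float)
    spectrum = np.abs(np.fft.rfft(x - x.mean()))   # remove DC first
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs[int(np.argmax(spectrum))]
```

Deviations of the detected base frequency (and of harmonic content around it) from the nominal mains frequency are one observable that disturbance classifiers can work from.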
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dierauf, Timothy; Kurtz, Sarah; Riley, Evan
This paper provides a recommended method for evaluating the AC capacity of a photovoltaic (PV) generating station. It also presents companion guidance on setting the facility's capacity guarantee value. This is a principles-based approach that incorporates fundamental plant design parameters such as loss factors, module coefficients, and inverter constraints. The method has been used to prove contract guarantees for over 700 MW of installed projects. The method is transparent, and the results are deterministic. In contrast, current industry practices incorporate statistical regression, where the empirical coefficients may only characterize the collected data. Though these methods may work well when extrapolation is not required, there are other situations where the empirical coefficients may not adequately model actual performance. The proposed Fundamentals Approach method provides consistent results even where regression methods start to lose fidelity.
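A minimal sketch of the fundamentals-style idea, contrasted with regression: expected power is built from design parameters, and the guarantee metric is a measured-to-expected ratio. All parameter names and values below are illustrative assumptions, not the paper's procedure:

```python
def expected_ac_power(poa, t_cell, p_dc0, gamma=-0.0035, loss=0.14,
                      g_ref=1000.0, t_ref=25.0):
    """Expected AC power (kW) from design parameters: DC nameplate p_dc0
    scaled by plane-of-array irradiance poa (W/m^2), a module temperature
    coefficient gamma (1/degC), and a lumped loss factor."""
    return p_dc0 * (poa / g_ref) * (1.0 + gamma * (t_cell - t_ref)) * (1.0 - loss)

def capacity_ratio(measured_kw, expected_kw):
    """Guarantee metric: summed measured power over summed expected power."""
    return sum(measured_kw) / sum(expected_kw)
```

Because the expected side is computed from fixed design parameters rather than fitted coefficients, the ratio remains meaningful outside the conditions seen during the test period, which is the paper's argument against regression-only methods.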
Olsen, S O
2001-04-01
A theoretical model of involvement in the consumption of food products was tested in a representative survey of Norwegian households for the particular case of consuming seafood as a common family meal. The empirical study is based on a structural equation approach to test the construct validity of the measures and the empirical fit of the theoretical model. Attitudes, negative feelings, social norms and moral obligation proved to be important, reliable and distinct constructs and explained 63% of the variation in seafood involvement. Negative feelings and moral obligation were the most important antecedents of involvement. Both our proposed model and a modified model with seafood involvement as a mediator fit the data well and supported our expectations in a promising way. Copyright 2001 Academic Press.
Modeling Healthcare Processes Using Commitments: An Empirical Evaluation.
Telang, Pankaj R; Kalia, Anup K; Singh, Munindar P
2015-01-01
The two primary objectives of this paper are: (a) to demonstrate how Comma, a business modeling methodology based on commitments, can be applied in healthcare process modeling, and (b) to evaluate the effectiveness of such an approach in producing healthcare process models. We apply the Comma approach to a breast cancer diagnosis process adapted from an HHS committee report, and present the results of an empirical study that compares Comma with a traditional approach based on the HL7 Messaging Standard (Traditional-HL7). Our empirical study involved 47 subjects, and two phases. In the first phase, we partitioned the subjects into two approximately equal groups. We gave each group the same requirements based on a process scenario for breast cancer diagnosis. Members of one group first applied Traditional-HL7 and then Comma, whereas members of the second group first applied Comma and then Traditional-HL7, each on the above-mentioned requirements. Thus, each subject produced two models, each model being a set of UML Sequence Diagrams. In the second phase, we repartitioned the subjects into two groups with approximately equal distributions from both original groups. We developed exemplar Traditional-HL7 and Comma models; we gave one repartitioned group our Traditional-HL7 model and the other repartitioned group our Comma model. We provided the same changed set of requirements to all subjects and asked them to modify the provided exemplar model to satisfy the new requirements. We assessed solutions produced by subjects in both phases with respect to measures of flexibility, time, difficulty, objective quality, and subjective quality. Our study found that Comma is superior to Traditional-HL7 in flexibility and objective quality, as validated via Student's t-test at the 10% level of significance. Comma is a promising new approach for modeling healthcare processes. Further gains could be made through improved tooling and enhanced training of modeling personnel.
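The group comparison reported above relies on Student's t-test; a pooled-variance two-sample t statistic can be computed as follows (the score lists in the test are hypothetical, not the study's data):

```python
import math

def student_t(x, y):
    """Pooled-variance two-sample Student's t statistic (equal-variance
    form), as used to compare scores between two groups of subjects."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    sp2 = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)  # pooled variance
    return (mx - my) / math.sqrt(sp2 * (1.0 / nx + 1.0 / ny))
```

The statistic is then compared against the t distribution with nx + ny - 2 degrees of freedom at the chosen significance level (10% in the study).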
Modeling Healthcare Processes Using Commitments: An Empirical Evaluation
2015-01-01
The two primary objectives of this paper are: (a) to demonstrate how Comma, a business modeling methodology based on commitments, can be applied in healthcare process modeling, and (b) to evaluate the effectiveness of such an approach in producing healthcare process models. We apply the Comma approach on a breast cancer diagnosis process adapted from an HHS committee report, and presents the results of an empirical study that compares Comma with a traditional approach based on the HL7 Messaging Standard (Traditional-HL7). Our empirical study involved 47 subjects, and two phases. In the first phase, we partitioned the subjects into two approximately equal groups. We gave each group the same requirements based on a process scenario for breast cancer diagnosis. Members of one group first applied Traditional-HL7 and then Comma whereas members of the second group first applied Comma and then Traditional-HL7—each on the above-mentioned requirements. Thus, each subject produced two models, each model being a set of UML Sequence Diagrams. In the second phase, we repartitioned the subjects into two groups with approximately equal distributions from both original groups. We developed exemplar Traditional-HL7 and Comma models; we gave one repartitioned group our Traditional-HL7 model and the other repartitioned group our Comma model. We provided the same changed set of requirements to all subjects and asked them to modify the provided exemplar model to satisfy the new requirements. We assessed solutions produced by subjects in both phases with respect to measures of flexibility, time, difficulty, objective quality, and subjective quality. Our study found that Comma is superior to Traditional-HL7 in flexibility and objective quality as validated via Student’s t-test to the 10% level of significance. Comma is a promising new approach for modeling healthcare processes. Further gains could be made through improved tooling and enhanced training of modeling personnel. PMID:26539985
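The study's headline comparison rests on a two-sample t-test of model-quality measures between the two approaches. A minimal sketch of that kind of test (Welch's variant, with entirely hypothetical quality scores; the paper's actual data are not reproduced here):

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical objective-quality scores (0-10) for models from each approach.
comma_scores = [7.1, 6.8, 7.5, 8.0, 6.9, 7.7, 7.3, 8.1]
hl7_scores   = [6.2, 6.5, 5.9, 6.8, 6.1, 6.6, 6.0, 6.7]

t, df = welch_t(comma_scores, hl7_scores)
print(f"t = {t:.2f}, df = {df:.1f}")  # compare |t| against the critical value at alpha = 0.10
```

A significant positive t here would support the first group scoring higher, mirroring how the study's flexibility and objective-quality comparisons are reported.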
ENHANCING TEST SENSITIVITY IN TOXICITY TESTING BY USING A STATISTICAL PERFORMANCE STANDARD
Previous reports have shown that within-test sensitivity can vary markedly among laboratories. Experts have advocated an empirical approach to controlling test variability based on the MSD, control means, and other test acceptability criteria. (The MSD represents the smallest dif...
Misleading Theoretical Assumptions in Hypertext/Hypermedia Research.
ERIC Educational Resources Information Center
Tergan, Sigmar-Olaf
1997-01-01
Reviews basic theoretical assumptions of research on learning with hypertext/hypermedia. Focuses on whether the results of research on hypertext/hypermedia-based learning support these assumptions. Results of empirical studies and theoretical analysis reveal that many research approaches have been misled by inappropriate theoretical assumptions on…
Estimating monetary damages from flooding in the United States under a changing climate
A national-scale analysis of potential changes in monetary damages from flooding under climate change. The approach uses empirically based statistical relationships between historical precipitation and flood damage records from 18 hydrologic regions of the United States, along w...
Global Managers' Career Competencies
ERIC Educational Resources Information Center
Cappellen, Tineke; Janssens, Maddy
2008-01-01
Purpose: This study aims to empirically examine the career competencies of global managers having world-wide coordination responsibility: knowing-why, knowing-how and knowing-whom career competencies. Design/methodology/approach: Based on in-depth interviews with 45 global managers, the paper analyzes career stories from a content analysis…
The community health clinics as a learning context for student nurses.
Makupu, M B; Botes, A
2000-09-01
The purpose of the study was to describe guidelines for improving community health clinics as a learning context conducive to learning. The study began by eliciting the perceptions of student nurses from a nursing college in Gauteng, community sisters from ten community health clinics in the Southern Metropolitan Local Council, and college tutors from a college in Gauteng. The research design and method consisted of a qualitative, exploratory, descriptive and contextual approach, and the design was divided into two phases: phase one was a field/empirical study, and phase two involved conceptualization. In all samples, follow-up focus group interviews were conducted to confirm the findings. To ensure trustworthiness, Lincoln and Guba's model (1985) was implemented, and data analysis followed Tesch's model (1990, in Creswell 1994:155) based on a qualitative approach. The conceptual framework discussed, indicating a body of knowledge, was based on the study and the empirical findings from phase one, to give clear meaning and understanding to the research.
Hulvershorn, Leslie A; Quinn, Patrick D; Scott, Eric L
2015-01-01
The past several decades have seen dramatic growth in empirically supported treatments for adolescent substance use disorders (SUDs), yet even the most well-established approaches struggle to produce large or long-lasting improvements. These difficulties may stem, in part, from the high rates of comorbidity between SUDs and other psychiatric disorders. We critically reviewed the treatment outcome literature for adolescents with co-occurring SUDs and internalizing disorders. Our review identified components of existing treatments that might be included in an integrated, evidence-based approach to the treatment of SUDs and internalizing disorders. An effective program may involve careful assessment, inclusion of parents or guardians, and tailoring of interventions via a modular strategy. The existing literature guides the development of a conceptual evidence-based, modular treatment model targeting adolescents with co-occurring internalizing and SUDs. With empirical study, such a model may better address treatment outcomes for both disorder types in adolescents.
Hulvershorn, Leslie A.; Quinn, Patrick D.; Scott, Eric L.
2016-01-01
Background The past several decades have seen dramatic growth in empirically supported treatments for adolescent substance use disorders (SUDs), yet even the most well-established approaches struggle to produce large or long-lasting improvements. These difficulties may stem, in part, from the high rates of comorbidity between SUDs and other psychiatric disorders. Method We critically reviewed the treatment outcome literature for adolescents with co-occurring SUDs and internalizing disorders. Results Our review identified components of existing treatments that might be included in an integrated, evidence-based approach to the treatment of SUDs and internalizing disorders. An effective program may involve careful assessment, inclusion of parents or guardians, and tailoring of interventions via a modular strategy. Conclusions The existing literature guides the development of a conceptual evidence-based, modular treatment model targeting adolescents with co-occurring internalizing and SUDs. With empirical study, such a model may better address treatment outcomes for both disorder types in adolescents. PMID:25973718
Ricks, Samantha L; Alt, Mary
2016-07-01
The purpose of this tutorial is to provide clinicians with a theoretically motivated and evidence-based approach to teaching adjectives to children who struggle with word learning. Given that there are almost no treatment studies to guide this topic, we have synthesized findings from experimental and theoretical literature to come up with a principles-based approach to treatment. We provide a sample lesson plan, incorporating our 3 theoretical principles, and describe the materials chosen and methods used during treatment and assessment. This approach is theoretically motivated, but it needs to be empirically tested.
Retrieving hydrological connectivity from empirical causality in karst systems
NASA Astrophysics Data System (ADS)
Delforge, Damien; Vanclooster, Marnik; Van Camp, Michel; Poulain, Amaël; Watlet, Arnaud; Hallet, Vincent; Kaufmann, Olivier; Francis, Olivier
2017-04-01
Because of their complexity, karst systems exhibit nonlinear dynamics. Moreover, if one attempts to model a karst, its hidden behavior complicates the choice of the most suitable model. Therefore, both intensive investigation methods and nonlinear data analysis are needed to reveal the underlying hydrological connectivity as a prior for a consistent physically based modelling approach. Convergent Cross Mapping (CCM), a recent method, promises to identify causal relationships between time series belonging to the same dynamical system. The method is based on phase-space reconstruction and is suitable for nonlinear dynamics. As an empirical causation-detection method, it could be used to highlight the hidden complexity of a karst system by revealing its inner hydrological and dynamical connectivity. Hence, if one can link causal relationships to physical processes, the method shows great potential to support physically based model structure selection. We present the results of numerical experiments using karst model blocks combined in different structures to generate time series from actual rainfall series. CCM is applied between the time series to investigate whether the empirical causation detection is consistent with the hydrological connectivity suggested by the karst model.
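The phase-space reconstruction that CCM builds on is time-delay embedding: each observation is paired with its own lagged values to form a point on a "shadow" attractor. A minimal sketch (the embedding dimension E and lag tau are illustrative choices, not values from the study):

```python
def delay_embed(series, E=3, tau=2):
    """Reconstruct a shadow attractor: each point is (x_t, x_{t-tau}, ..., x_{t-(E-1)tau})."""
    start = (E - 1) * tau
    return [tuple(series[t - i * tau] for i in range(E)) for t in range(start, len(series))]

x = [0.1, 0.4, 0.9, 0.3, 0.7, 0.2, 0.8, 0.5, 0.6, 0.0]
points = delay_embed(x, E=3, tau=2)
print(points[0])  # (0.7, 0.9, 0.1): current value first, then the lagged ones
```

Full CCM then asks whether nearest neighbors on one variable's shadow attractor can predict the other variable's values; improving prediction with longer series is taken as evidence of causation.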
Heinonen, Johannes P M; Palmer, Stephen C F; Redpath, Steve M; Travis, Justin M J
2014-01-01
Individual-based models have gained popularity in ecology, and enable simultaneous incorporation of spatial explicitness and population dynamic processes to understand spatio-temporal patterns of populations. We introduce an individual-based model for understanding and predicting spatial hen harrier (Circus cyaneus) population dynamics in Great Britain. The model uses a landscape with habitat, prey and game management indices. The hen harrier population was initialised according to empirical census estimates for 1988/89 and simulated until 2030, and predictions for 1998, 2004 and 2010 were compared to empirical census estimates for respective years. The model produced a good qualitative match to overall trends between 1989 and 2010. Parameter explorations revealed relatively high elasticity in particular to demographic parameters such as juvenile male mortality. This highlights the need for robust parameter estimates from empirical research. There are clearly challenges for replication of real-world population trends, but this model provides a useful tool for increasing understanding of drivers of hen harrier dynamics and focusing research efforts in order to inform conflict management decisions.
NASA Astrophysics Data System (ADS)
Kiafar, Hamed; Babazadeh, Hosssien; Marti, Pau; Kisi, Ozgur; Landeras, Gorka; Karimi, Sepideh; Shiri, Jalal
2017-10-01
Evapotranspiration estimation is of crucial importance in arid and hyper-arid regions, which suffer from water shortage, increasing dryness and heat. A modeling study on cross-station assessment between hyper-arid and humid conditions is reported here. The derived equations estimate ET0 values based on temperature-, radiation-, and mass-transfer-based configurations. Using data from two meteorological stations in a hyper-arid region of Iran and two meteorological stations in a humid region of Spain, different local and cross-station approaches are applied for developing and validating the derived equations. Comparison of the gene expression programming (GEP)-derived equations with corresponding empirical and semi-empirical ET0 estimation equations reveals the superiority of the new formulas. Therefore, the derived models can be successfully applied in these hyper-arid and humid regions, as well as in similar climatic contexts, especially in data-scarce situations. The results also show that, when relying on proper input configurations, cross-station application might be a promising alternative to locally trained models for stations with data scarcity.
John D. Armstrong; Keith H. Nislow
2012-01-01
Modelling approaches for relating discharge to the biology of Atlantic salmon, Salmo salar L., and brown trout, Salmo trutta L., growing in rivers are reviewed. Process-based and empirical models are set within a common framework of input of water flow and output of characteristics of fish, such as growth and survival, which relate directly to population dynamics. A...
Dealing with contaminated datasets: An approach to classifier training
NASA Astrophysics Data System (ADS)
Homenda, Wladyslaw; Jastrzebska, Agnieszka; Rybnik, Mariusz
2016-06-01
The paper presents a novel approach to classification reinforced with a rejection mechanism. The method is based on a two-tier set of classifiers: the first layer classifies elements, and the second layer separates native elements from foreign ones in each distinguished class. The key novelty presented here is the rejection mechanism's training scheme, which follows the philosophy of "one-against-all-other-classes". The proposed method was tested in an empirical study of handwritten digit recognition.
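The two-tier idea can be sketched in a few lines: tier one assigns a class, tier two accepts or rejects the assignment. This is a hypothetical illustration (nearest-centroid classifier plus a per-class acceptance radius), not the classifiers or training scheme used in the paper:

```python
# Tier 1: nearest-centroid classifier. Tier 2: per-class radius check that plays
# the role of the "native vs foreign" separator. All values are made up.
CENTROIDS = {"0": (1.0, 1.0), "1": (5.0, 5.0)}
RADII = {"0": 1.5, "1": 1.5}  # assumed acceptance radius per class

def dist(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

def classify_with_rejection(x):
    label = min(CENTROIDS, key=lambda c: dist(x, CENTROIDS[c]))  # tier 1: pick a class
    if dist(x, CENTROIDS[label]) > RADII[label]:                 # tier 2: native or foreign?
        return "reject"
    return label

print(classify_with_rejection((1.2, 0.8)))  # "0": close enough to class 0's centroid
print(classify_with_rejection((3.0, 3.0)))  # "reject": foreign to both classes
```

The design point is that rejection is decided per class after classification, so each acceptor only needs to separate its own class from everything else.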
Sport fishing: a comparison of three indirect methods for estimating benefits.
Darrell L. Hueth; Elizabeth J. Strong; Roger D. Fight
1988-01-01
Three market-based methods for estimating values of sport fishing were compared by using a common data base. The three approaches were the travel-cost method, the hedonic travel-cost method, and the household-production method. A theoretical comparison of the resulting values showed that the results were not fully comparable in several ways. The comparison of empirical...
Gender in the Teaching Profession: University Students' Views of Teaching as a Career
ERIC Educational Resources Information Center
Tašner, Veronika; Mihelic, Mojca Žveglic; Ceplak, Metka Mencin
2017-01-01
The purpose of our research is to gain a better insight into what encourages young adults, in particular young women, to enter the teaching profession. The empirical part of the article is based on a pilot study of 132 students, with data collected through a questionnaire survey. The research attempts to address the…
ERIC Educational Resources Information Center
Barnhardt, Bradford; Ginns, Paul
2014-01-01
This article orients a recently proposed alienation-based framework for student learning theory (SLT) to the empirical basis of the approaches to learning perspective. The proposed framework makes new macro-level interpretations of an established micro-level theory, across three levels of interpretation: (1) a context-free psychological state…
ERIC Educational Resources Information Center
Blackburn, Greg
2017-01-01
Much has been written about the promise and peril of technology in education. This paper presents an empirical study that explores how technology can play a pivotal role in student learning and how teaching staff can adopt innovative technology-based approaches in the creation of interactive online problem-based learning (PBL) resources, allowing…
ERIC Educational Resources Information Center
Zabit, Mohd Nazir Md
2010-01-01
This review forms the background for exploring and gaining empirical support among lecturers for improving students' critical thinking skills in business education courses in Malaysia, in which the main teaching and learning methodology is Problem-Based Learning (PBL). The PBL educational approach is known to have maximum positive impacts in…
An Adaptive E-Learning System Based on Students' Learning Styles: An Empirical Study
ERIC Educational Resources Information Center
Drissi, Samia; Amirat, Abdelkrim
2016-01-01
Personalized e-learning implementation is recognized as one of the most interesting research areas in the distance web-based education. Since the learning style of each learner is different one must fit e-learning with the different needs of learners. This paper presents an approach to integrate learning styles into adaptive e-learning hypermedia.…
ERIC Educational Resources Information Center
Boulet, Marie-Michele
2004-01-01
Design prescriptions to create web-based courses and sites that are dynamic, easy-to-use, interactive and data-driven, emerge from a "how to do it" approach. Unfortunately, the theory behind these methods, prescriptions, procedures or tools, is rarely provided and the important terms, such as "easy-to-use", to which these…
ERIC Educational Resources Information Center
Bodenmann, Guy; Shantinath, S. D.
2004-01-01
We describe a distress prevention training program for couples and three empirical studies that support its effectiveness. The program, Couples Coping Enhancement Training (CCET), is based both upon stress and coping theory and research on couples. In addition to traditional elements of couples programs (e.g., communication and problem-solving…
ERIC Educational Resources Information Center
Raes, Annelies; Schellens, Tammy
2015-01-01
This study deals with the implementation of a web-based collaborative inquiry (WISE) project in secondary science education and unravels the contribution and challenges of this learning approach to foster students' motivation to learn science, and its relation with student and class-level characteristics. An empirical mixed methods study in 13…
Empire: An Analytical Category for Educational Research
ERIC Educational Resources Information Center
Coloma, Roland Sintos
2013-01-01
In this article Roland Sintos Coloma argues for the relevance of empire as an analytical category in educational research. He points out the silence in mainstream studies of education on the subject of empire, the various interpretive approaches to deploying empire as an analytic, and the importance of indigeneity in research on empire and…
NASA Astrophysics Data System (ADS)
Kang, Sinhang; Lee, Seung-Rae
2018-05-01
Many debris flow spreading analyses have been conducted during recent decades to prevent damage from debris flows. An empirical approach that has been used in various studies on debris flow spreading has advantages such as simple data acquisition and good applicability for large areas. In this study, a GIS-based empirical model that was developed at the University of Lausanne (Switzerland) is used to assess the debris flow susceptibility. Study sites are classified based on the types of soil texture or geological conditions, which can indirectly consider geotechnical or rheological properties, to supplement the weaknesses of Flow-R which neglects local controlling factors. The mean travel angle for each classification is calculated from a debris flow inventory map. The debris flow susceptibility is assessed based on changes in the flow-direction algorithm, an inertial function with a 5-m DEM resolution. A simplified friction-limited model was applied to the runout distance analysis by using the appropriate travel angle for the corresponding classification with a velocity limit of 28 m/s. The most appropriate algorithm combinations that derived the highest average of efficiency and sensitivity for each classification are finally determined by applying a confusion matrix with the efficiency and the sensitivity to the results of the susceptibility assessment. The proposed schemes can be useful for debris flow susceptibility assessment in both the study area and the central region of Korea, which has similar environmental factors such as geological conditions, topography and rainfall characteristics to the study area.
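The simplified friction-limited model referenced above bounds runout with an energy line at the travel angle, capping velocity (28 m/s in the study). A minimal sketch of that geometry, under a flat-runout simplification of our own (the function names and example numbers are illustrative, not Flow-R's implementation):

```python
import math

def max_runout(drop_height_m, travel_angle_deg):
    """Horizontal runout where the energy line (at the travel angle) meets the ground."""
    return drop_height_m / math.tan(math.radians(travel_angle_deg))

def capped_velocity(height_above_energy_line_m, g=9.81, v_max=28.0):
    """Velocity from the height difference to the energy line, capped at v_max."""
    return min(math.sqrt(2 * g * height_above_energy_line_m), v_max)

print(round(max_runout(100.0, 22.0), 1))  # runout in metres for a 100 m drop at a 22-degree travel angle
print(round(capped_velocity(60.0), 1))    # 28.0: sqrt(2 * 9.81 * 60) exceeds the cap
```

A smaller (class-specific) travel angle lengthens the predicted runout, which is why the study calibrates mean travel angles per soil-texture or geological class.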
'Nobody tosses a dwarf!' The relation between the empirical and the normative reexamined.
Leget, Carlo; Borry, Pascal; de Vries, Raymond
2009-05-01
This article discusses the relation between empirical and normative approaches in bioethics. The issue of dwarf tossing, while admittedly unusual, is chosen as a point of departure because it challenges the reader to look with fresh eyes upon several central bioethical themes, including human dignity, autonomy, and the protection of vulnerable people. After an overview of current approaches to the integration of empirical and normative ethics, we consider five ways that the empirical and normative can be brought together to speak to the problem of dwarf tossing: prescriptive applied ethics, theoretical ethics, critical applied ethics, particularist ethics and integrated empirical ethics. We defend a position of critical applied ethics that allows for a two-way relation between empirical and normative theories. Against efforts fully to integrate the normative and the empirical into one synthesis, we propose that the two should stand in tension and relation to one another. The approach we endorse acknowledges that a social practice can and should be judged both by the gathering of empirical data and by normative ethics. Critical applied ethics uses a five stage process that includes: (a) determination of the problem, (b) description of the problem, (c) empirical study of effects and alternatives, (d) normative weighing and (e) evaluation of the effects of a decision. In each stage, we explore the perspective from both the empirical (sociological) and the normative ethical point of view. We conclude by applying our five-stage critical applied ethics to the example of dwarf tossing.
Faris, Allison T.; Seed, Raymond B.; Kayen, Robert E.; Wu, Jiaer
2006-01-01
During the 1906 San Francisco Earthquake, liquefaction-induced lateral spreading and the resultant ground displacements damaged bridges, buried utilities and lifelines, conventional structures, and other developed works. This paper presents an improved engineering tool for the prediction of maximum displacement due to liquefaction-induced lateral spreading. A semi-empirical approach is employed, combining mechanistic understanding and data from laboratory testing with data and lessons from full-scale earthquake field case histories. The principle of the strain potential index, based primarily on correlation of cyclic simple shear laboratory testing results with in-situ Standard Penetration Test (SPT) results, is used as an index to characterize the deformation potential of soils after they liquefy. A Bayesian probabilistic approach is adopted for development of the final predictive model, in order to take fullest advantage of the available data and to deal with the uncertainties intrinsic to the back-analyses of field case histories. A case history from the 1906 San Francisco Earthquake is used to demonstrate the ability of the resultant semi-empirical model to estimate maximum horizontal displacement due to liquefaction-induced lateral spreading.
Novak, Mark; Wootton, J. Timothy; Doak, Daniel F.; Emmerson, Mark; Estes, James A.; Tinker, M. Timothy
2011-01-01
How best to predict the effects of perturbations to ecological communities has been a long-standing goal for both applied and basic ecology. This quest has recently been revived by new empirical data, new analysis methods, and increased computing speed, with the promise that ecologically important insights may be obtainable from a limited knowledge of community interactions. We use empirically based and simulated networks of varying size and connectance to assess two limitations to predicting perturbation responses in multispecies communities: (1) the inaccuracy by which species interaction strengths are empirically quantified and (2) the indeterminacy of species responses due to indirect effects associated with network size and structure. We find that even modest levels of species richness and connectance (∼25 pairwise interactions) impose high requirements for interaction strength estimates because system indeterminacy rapidly overwhelms predictive insights. Nevertheless, even poorly estimated interaction strengths provide greater average predictive certainty than an approach that uses only the sign of each interaction. Our simulations provide guidance in dealing with the trade-offs involved in maximizing the utility of network approaches for predicting dynamics in multispecies communities.
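A standard way to predict responses to sustained (press) perturbations from quantified interaction strengths is the negative inverse of the community (Jacobian) matrix. A two-species sketch with hypothetical interaction strengths (the study's networks are far larger):

```python
def press_response(a11, a12, a21, a22):
    """Long-term press-perturbation responses, -A^{-1}, for a 2x2 community matrix A.
    Entry [i][j] is species i's equilibrium shift per unit sustained increase in
    species j's growth rate."""
    det = a11 * a22 - a12 * a21
    inv = [[a22 / det, -a12 / det], [-a21 / det, a11 / det]]
    return [[-v for v in row] for row in inv]

# Hypothetical strengths: two self-limiting species with one consumer-resource link.
resp = press_response(-1.0, -0.5, 0.4, -1.0)
print(resp)
```

Even in this tiny case the off-diagonal responses mix direct and indirect pathways; with ~25 or more pairwise interactions, the abstract notes, such indirect effects rapidly swamp imprecise strength estimates.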
A Global Classification of Contemporary Fire Regimes
NASA Astrophysics Data System (ADS)
Norman, S. P.; Kumar, J.; Hargrove, W. W.; Hoffman, F. M.
2014-12-01
Fire regimes provide a sensitive indicator of changes in climate and human use, as the concept includes fire extent, season, frequency, and intensity. Fires that occur outside the distribution of one or more aspects of a fire regime may affect ecosystem resilience. However, global-scale data related to these varied aspects of fire regimes are highly inconsistent due to incomplete or inconsistent reporting. In this study, we derive a globally applicable approach to characterizing similar fire regimes using long geophysical time series, namely MODIS hotspots since 2000. K-means non-hierarchical clustering was used to generate empirically based groups that minimized within-cluster variability. Satellite-based fire detections are known to have shortcomings, including under-detection from obscuring smoke, clouds, or dense canopy cover, and from rapid spread rates, as often occurs with flashy fuels or during extreme weather. The resulting regions are nevertheless free from preconceptions: the empirical, data-mining approach used on this relatively uniform data source allows the region structures to emerge from the data themselves. Comparing such an empirical classification to expectations from climate, phenology, land use or development-based models can help us interpret the similarities and differences among places and how they provide different indicators of changes of concern. Classifications can help identify where large infrequent mega-fires are likely to occur ahead of time, such as in the boreal forest and portions of the Interior US West, and where fire reports are incomplete, such as in less industrial countries.
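The clustering step is plain Lloyd's k-means: assign each observation to the nearest center, recompute centers as cluster means, repeat. A one-dimensional sketch with made-up fire-frequency values (the study clusters multivariate hotspot time series):

```python
import statistics

def kmeans_1d(values, centers, iters=20):
    """Lloyd's algorithm on scalars: assign to nearest center, recompute means.
    Empty clusters are simply dropped in this sketch."""
    for _ in range(iters):
        clusters = {c: [] for c in centers}
        for v in values:
            nearest = min(centers, key=lambda c: abs(v - c))
            clusters[nearest].append(v)
        centers = sorted(statistics.mean(vs) for vs in clusters.values() if vs)
    return centers

# Hypothetical per-cell fire-frequency values with two obvious regimes.
data = [0.1, 0.2, 0.15, 0.3, 4.0, 4.2, 3.9, 4.1]
print(kmeans_1d(data, centers=[0.0, 1.0]))  # converges to [0.1875, 4.05]
```

Each iteration can only reduce total within-cluster variance, which is exactly the "groups that minimized within-cluster variability" criterion the abstract describes.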
Hardcastle, Thomas J
2016-01-15
High-throughput data are now commonplace in biological research. Rapidly changing technologies and applications mean that novel methods for detecting differential behaviour that account for a 'large P, small n' setting are required at an increasing rate. The development of such methods is, in general, done on an ad hoc basis, requiring further development cycles and lacking standardization between analyses. We present here a generalized method for identifying differential behaviour within high-throughput biological data through empirical Bayesian methods. This approach is based on our baySeq algorithm for identification of differential expression in RNA-seq data based on a negative binomial distribution, and in paired data based on a beta-binomial distribution. Here we show how the same empirical Bayesian approach can be applied to any parametric distribution, removing the need for lengthy development of novel methods for differently distributed data. Comparisons with existing methods developed to address specific problems in high-throughput biological data show that these generic methods can achieve equivalent or better performance. A number of enhancements to the basic algorithm are also presented to increase flexibility and reduce computational costs. The methods are implemented in the R baySeq (v2) package, available on Bioconductor http://www.bioconductor.org/packages/release/bioc/html/baySeq.html. Contact: tjh48@cam.ac.uk. Supplementary data are available at Bioinformatics online.
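The core empirical Bayesian move is to weigh, per gene, the likelihood of a "no difference" model against a "difference" model and combine them with a prior. A deliberately simplified sketch: Poisson likelihoods stand in for baySeq's negative binomial, group means stand in for its sampled parameter sets, and prior_de is an assumed prior weight (baySeq estimates it empirically across all genes):

```python
import math

def pois_loglik(counts, mu):
    """Poisson log-likelihood of counts under mean mu."""
    return sum(c * math.log(mu) - mu - math.lgamma(c + 1) for c in counts)

def posterior_de(group_a, group_b, prior_de=0.1):
    """Posterior weight of 'different means' vs 'one shared mean' for two count groups."""
    pooled = group_a + group_b
    shared = pois_loglik(pooled, sum(pooled) / len(pooled))
    split = (pois_loglik(group_a, sum(group_a) / len(group_a))
             + pois_loglik(group_b, sum(group_b) / len(group_b)))
    num = prior_de * math.exp(split)
    return num / (num + (1 - prior_de) * math.exp(shared))

print(round(posterior_de([10, 12, 11], [30, 28, 33]), 3))  # near 1: clearly different means
print(round(posterior_de([10, 12, 11], [11, 10, 13]), 3))  # small: consistent with one mean
```

Swapping `pois_loglik` for any other parametric log-likelihood is the generalization the paper describes: the surrounding machinery stays the same.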
Concept Analysis of Spirituality: An Evolutionary Approach.
Weathers, Elizabeth; McCarthy, Geraldine; Coffey, Alice
2016-04-01
The aim of this article is to clarify the concept of spirituality for future nursing research. Previous concept analyses of spirituality have mostly reviewed the conceptual literature, with little consideration of the empirical literature. The literature reviewed in prior concept analyses extends from 1972 to 2005, with no analysis conducted in the past nine years. Rodgers' evolutionary framework was used to review both the theoretical and empirical literature pertaining to spirituality. Evolutionary concept analysis is a formal method of philosophical inquiry in which papers are analyzed to identify attributes, antecedents, and consequences of a concept. The analysis drew on both empirical and conceptual literature. Three defining attributes of spirituality were identified: connectedness, transcendence, and meaning in life. A conceptual definition of spirituality was proposed based on the findings, and four antecedents and five primary consequences of spirituality were identified. Spirituality is a complex concept, and this concept analysis adds some clarification by proposing a definition of spirituality underpinned by both conceptual and empirical research. Furthermore, exemplars of spirituality, based on prior qualitative research, are presented to support the findings. Hence, the findings of this analysis could guide future nursing research on spirituality.
Wonnemann, Meinolf; Frömke, Cornelia; Koch, Armin
2015-01-01
We investigated different evaluation strategies for bioequivalence trials with highly variable drugs (HVDs) with respect to their resulting empirical type I error and empirical power. The classical 'unscaled' crossover design with average bioequivalence evaluation, the Add-on concept of the Japanese guideline, and the current 'scaling' approach of the EMA were compared. Simulation studies were performed under the assumption of single-dose drug administration while varying the underlying intra-individual variability. Inclusion of Add-on subjects following the Japanese concept led to slight increases in the empirical α-error (≈7.5%). For the EMA approach, we noted an unexpected, tremendous increase of the rejection rate at a geometric mean ratio of 1.25. Moreover, we detected error rates slightly above the pre-set limit of 5% even at the proposed 'scaled' bioequivalence limits. With the classical 'unscaled' approach and the Japanese guideline concept, the goal of reduced subject numbers in bioequivalence trials of HVDs cannot be achieved. On the other hand, widening the acceptance range comes at the price that quite a number of products will be accepted as bioequivalent that would not have been accepted in the past. A two-stage design with control of the global α therefore seems the better alternative.
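The empirical type I error in such simulation studies is the fraction of simulated trials that conclude bioequivalence when the true geometric mean ratio sits exactly on the 1.25 acceptance limit. A much-simplified Monte Carlo sketch (parallel log-differences instead of the full 2x2 crossover ANOVA; the hardcoded 1.714 is the t quantile for 23 degrees of freedom, so it matches only the default n_subjects=24; all parameter values are illustrative):

```python
import math
import random
import statistics

def empirical_alpha(n_subjects=24, cv=0.5, gmr=1.25, n_trials=2000, seed=1):
    """Fraction of simulated trials that (wrongly) accept bioequivalence when the
    true geometric mean ratio lies exactly on the 1.25 limit."""
    random.seed(seed)
    sigma = math.sqrt(math.log(1 + cv ** 2))  # within-subject SD on the log scale
    passes = 0
    for _ in range(n_trials):
        # Per-subject log test/reference differences (simplified crossover).
        diffs = [math.log(gmr) + random.gauss(0, sigma * math.sqrt(2))
                 for _ in range(n_subjects)]
        m = statistics.mean(diffs)
        se = statistics.stdev(diffs) / math.sqrt(n_subjects)
        lo, hi = m - 1.714 * se, m + 1.714 * se  # ~90% CI; 1.714 = t(0.95, df=23)
        if math.log(0.8) < lo and hi < math.log(1.25):
            passes += 1
    return passes / n_trials

print(empirical_alpha())  # should stay at or below roughly 0.05
```

Repeating this with scaled acceptance limits in place of the fixed 0.8-1.25 range is, in spirit, how inflation of the rejection rate at a ratio of 1.25 can be detected.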
Defining Campus Violence: A Phenomenological Analysis of Community Stakeholder Perspectives
ERIC Educational Resources Information Center
Mayhew, Matthew J.; Caldwell, Rebecca J.; Goldman, Emily Grey
2011-01-01
The purpose of this study was to derive an empirically based understanding of campus violence. Grounded in a communication paradigm offered by sociolinguistic scholars, we adopted a phenomenological approach for conducting and analyzing 23 interviews from campus community stakeholders, including students, staff, faculty, administrators, and…
Seal coat damage evaluation due to superheavy load moves based on a mechanistic-empirical approach.
DOT National Transportation Integrated Search
2010-03-01
The number of superheavy load (SHL) moves has increased drastically within the past 5 years in : Texas. Along with the increasing SHL moves, the Texas Department of Transportation (TxDOT) has : become increasingly aware of the rising concerns associa...
Outsourcing in Higher Education: An Empirical Examination
ERIC Educational Resources Information Center
Gupta, Atul; Herath, S. Kanthi; Mikouiza, Nathalie C.
2005-01-01
Purpose: To measure the degree of implementation and satisfaction level with the outsourcing initiatives from higher education institutions. Design/methodology/approach: Uses a survey questionnaire to measure the levels of satisfaction with the institutions' services; the questionnaire was based on six factors that are deemed significant in…
Promoting Entrepreneurship among Informatics Engineering Students: Insights from a Case Study
ERIC Educational Resources Information Center
Fernandes, João M.; Afonso, Paulo; Fonte, Victor; Alves, Victor; Ribeiro, António Nestor
2017-01-01
Universities seek to promote entrepreneurship through effective education approaches, which need to be in permanent evolution. Nevertheless, the literature in entrepreneurship education lacks empirical evidence. This article discusses relevant issues related to promoting entrepreneurship in the software field, based on the experience of a…
Goldstein, Naomi E. S.; Kemp, Kathleen A.; Leff, Stephen S.; Lochman, John E.
2014-01-01
The use of manual-based interventions tends to improve client outcomes and promote replicability. With an increasingly strong link between funding and the use of empirically supported prevention and intervention programs, manual development and adaptation have become research priorities. As a result, researchers and scholars have generated guidelines for developing manuals from scratch, but there are no extant guidelines for adapting empirically supported, manualized prevention and intervention programs for use with new populations. Thus, this article proposes step-by-step guidelines for the manual adaptation process. It also describes two adaptations of an extensively researched anger management intervention to exemplify how an empirically supported program was systematically and efficiently adapted to achieve similar outcomes with vastly different populations in unique settings. PMID:25110403
Protein structure refinement using a quantum mechanics-based chemical shielding predictor.
Bratholm, Lars A; Jensen, Jan H
2017-03-01
The accurate prediction of protein chemical shifts using a quantum mechanics (QM)-based method has been the subject of intense research for more than 20 years, but so far empirical methods for chemical shift prediction have proven more accurate. In this paper we show that a QM-based predictor of protein backbone and CB chemical shifts (ProCS15; PeerJ, 2016, 3, e1344) is of comparable accuracy to empirical chemical shift predictors after chemical-shift-based structural refinement that removes small structural errors. We present a method by which quantum chemistry based predictions of isotropic chemical shielding values (ProCS15) can be used to refine protein structures using Markov chain Monte Carlo (MCMC) simulations, relating the chemical shielding values to the experimental chemical shifts probabilistically. Two kinds of MCMC structural refinement simulations were performed using force field geometry optimized X-ray structures as starting points: simulated annealing of the starting structure, and constant-temperature MCMC simulation followed by simulated annealing of a representative ensemble structure. Annealing of the CHARMM structure changes the CA-RMSD by an average of 0.4 Å but lowers the chemical shift RMSD by 1.0 and 0.7 ppm for CA and N, respectively. Conformational averaging has a relatively small effect (0.1-0.2 ppm) on the overall agreement with carbon chemical shifts but lowers the error for nitrogen chemical shifts by 0.4 ppm. If an amino acid specific offset is included, the ProCS15-predicted chemical shifts have RMSD values relative to experiment that are comparable to popular empirical chemical shift predictors. The annealed representative ensemble structures differ in CA-RMSD relative to the initial structures by an average of 2.0 Å, with >2.0 Å differences for six proteins.
In four of the cases, the largest structural differences arise in structurally flexible regions of the protein as determined by NMR, and in the remaining two cases, the large structural change may be due to force field deficiencies. The overall accuracy of the empirical methods is slightly improved by annealing the CHARMM structure with ProCS15, which may suggest that the minor structural changes introduced by ProCS15-based annealing improve the accuracy of the protein structures. Having established that QM-based chemical shift prediction can deliver the same accuracy as empirical shift predictors, we hope this can help increase the accuracy of related approaches, such as QM/MM and linear-scaling methods, and aid in interpreting protein structural dynamics from QM-derived chemical shifts.
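The chemical-shift-driven annealing described above can be caricatured in a few lines: a Metropolis sampler perturbs a structural parameter and scores each move by the misfit between predicted and experimental shifts. The one-parameter "structure", the linear toy predictor, and the cooling schedule below are all stand-ins for illustration, not ProCS15 or the paper's actual protocol:

```python
import math
import random

def anneal(predict, shifts_exp, x0, steps=5000, t0=2.0, seed=7):
    """Metropolis simulated annealing: perturb a structural parameter x,
    score it by the squared misfit between predicted and experimental
    shifts, and accept uphill moves with temperature-dependent probability."""
    random.seed(seed)
    def energy(x):
        return sum((p - s) ** 2 for p, s in zip(predict(x), shifts_exp))
    x, e = x0, energy(x0)
    for i in range(steps):
        t = t0 * (1.0 - i / steps) + 1e-6        # linear cooling schedule
        cand = x + random.gauss(0.0, 0.1)        # small structural perturbation
        ec = energy(cand)
        if ec < e or random.random() < math.exp((e - ec) / t):
            x, e = cand, ec
    return x, e

# Toy "predictor": three shifts depend linearly on one backbone parameter.
predict = lambda x: [120.0 + 0.5 * x, 55.0 - 0.2 * x, 175.0 + 0.1 * x]
exp_shifts = predict(3.0)    # synthetic "experimental" data; the truth is x = 3
x_fit, e_fit = anneal(predict, exp_shifts, x0=0.0)
print(round(x_fit, 2), round(e_fit, 4))
```

As the temperature drops, the sampler settles near the parameter value whose predicted shifts best match the data, which is the essence of the refinement the abstract describes.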
NASA Astrophysics Data System (ADS)
Liu, Xiangli; Cheng, Siwei; Wang, Shouyang; Hong, Yongmiao; Li, Yi
2008-02-01
This study employs a parametric approach based on TGARCH and GARCH models to estimate the VaR of the copper futures market and spot market in China. Considering the short-selling mechanism in the futures market, the paper introduces two new notions: upside VaR and extreme upside risk spillover. Downside VaR and upside VaR are both examined using this approach. We use Kupiec’s backtest [P.H. Kupiec, Techniques for verifying the accuracy of risk measurement models, Journal of Derivatives 3 (1995) 73-84] to test the power of our approaches. In addition, we investigate information spillover effects between the futures market and the spot market by employing a linear Granger causality test and Granger causality tests in mean, volatility, and risk, respectively. Moreover, we investigate the relationship between the futures market and the spot market using a test based on a kernel function. Empirical results indicate that there exist significant two-way spillovers between the futures market and the spot market, and the spillovers from the futures market to the spot market are much more striking.
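The Kupiec backtest cited above has a compact closed form: a likelihood-ratio test of the observed VaR exceedance frequency against the nominal coverage probability. A minimal sketch (the day count and exceedance numbers below are made up for illustration, not taken from the study):

```python
import math

def kupiec_pof(T, failures, p):
    """Kupiec proportion-of-failures likelihood-ratio statistic.
    Under a correct VaR model, failures ~ Binomial(T, p) and the
    statistic is asymptotically chi-square with 1 degree of freedom.
    Assumes 0 < failures < T and 0 < p < 1."""
    x = failures
    phat = x / T
    def loglik(q):
        return (T - x) * math.log(1.0 - q) + x * math.log(q)
    return -2.0 * (loglik(p) - loglik(phat))

# 5 exceedances over 250 trading days against a 99% VaR:
lr = kupiec_pof(T=250, failures=5, p=0.01)
print(round(lr, 3), lr < 3.841)  # 3.841 = 95% chi-square(1) critical value
```

Here the statistic is about 1.96, below the critical value, so the hypothetical model's coverage would not be rejected at the 5% level.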
Hektner, Joel M; Brennan, Alison L; Brotherson, Sean E
2013-09-01
The Nurtured Heart Approach to parenting (NHA; Glasser & Easley, 2008) is summarized and evaluated in terms of its alignment with current theoretical perspectives and empirical evidence in family studies and developmental science. Originally conceived and promoted as a behavior management approach for parents of difficult children (i.e., with behavior disorders), NHA is increasingly offered as a valuable strategy for parents of any children, despite a lack of published empirical support. Parents using NHA are trained to minimize attention to undesired behaviors, provide positive attention and praise for compliance with rules, help children be successful by scaffolding and shaping desired behavior, and establish a set of clear rules and consequences. Many elements of the approach have strong support in the theoretical and empirical literature; however, some of the assumptions are more questionable, such as that negative child behavior can always be attributed to unintentional positive reinforcement by parents responding with negative attention. On balance, NHA appears to promote effective and validated parenting practices, but its effectiveness now needs to be tested empirically. © FPI, Inc.
An empirical approach for estimating stress-coupling lengths for marine-terminating glaciers
Enderlin, Ellyn; Hamilton, Gordon S.; O'Neel, Shad; Bartholomaus, Timothy C.; Morlighem, Mathieu; Holt, John W.
2016-01-01
Here we present a new empirical method to estimate the stress-coupling length (SCL) for marine-terminating glaciers using high-resolution observations. We use the empirically determined periodicity in resistive stress oscillations as a proxy for the SCL. Application of our empirical method to two well-studied tidewater glaciers (Helheim Glacier, SE Greenland, and Columbia Glacier, Alaska, USA) demonstrates that SCL estimates obtained using this approach are consistent with theory (i.e., they can be parameterized as a function of the ice thickness) and with prior, independent SCL estimates. In order to accurately resolve stress variations, we suggest that similar empirical stress-coupling parameterizations be employed in future analyses of glacier dynamics.
NASA Astrophysics Data System (ADS)
Srivastava, R. K.; Panda, R. K.; Halder, Debjani
2017-08-01
The primary objective of this study was to evaluate the performance of the time-domain reflectometry (TDR) technique for daily evapotranspiration estimation of peanut and maize crops in a sub-humid region. Four independent methods were used to estimate crop evapotranspiration (ETc): the soil water balance budgeting approach, the energy balance (Bowen ratio) approach, empirical methods, and the Pan evaporation method. The soil water balance budgeting approach utilized soil moisture measurements made by the gravimetric and TDR methods. The empirical evapotranspiration methods, comprising the combination approach (FAO-56 Penman-Monteith and Penman), the temperature-based approach (Hargreaves-Samani), and the radiation-based approach (Priestley-Taylor, Turc, Abtew), were used to estimate the reference evapotranspiration (ET0). The daily ETc determined by the FAO-56 Penman-Monteith, Priestley-Taylor, Turc, Pan evaporation, and Bowen ratio methods was found to be on par with the ET values derived from the soil water balance budget, while the Abtew, Penman, and Hargreaves-Samani methods were not found to be ideal for the determination of ETc. The study illustrates the in situ applicability of the TDR method, making it possible for a user to choose the best approach to optimum water consumption for a given crop in a sub-humid region. The study suggests that the FAO-56 Penman-Monteith, Turc, and Priestley-Taylor methods can be used for the determination of crop ETc using TDR in comparison to the soil water balance budget.
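As a concrete example of the temperature-based class of empirical methods mentioned, the Hargreaves-Samani equation needs only air temperatures and extraterrestrial radiation (note that it was among the weaker performers in this particular study). The input values below are illustrative, not the study's data:

```python
import math

def hargreaves_samani(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani (1985) reference evapotranspiration ET0 (mm/day).
    ra is extraterrestrial radiation expressed in mm/day of evaporation
    equivalent; temperatures are in degrees Celsius."""
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Example: a warm day with a 10 degree C diurnal temperature range
et0 = hargreaves_samani(t_mean=25.0, t_max=30.0, t_min=20.0, ra=15.0)
print(round(et0, 2))  # about 4.67 mm/day
```

The diurnal temperature range acts as a proxy for radiation and humidity, which is why such temperature-only formulas are attractive when only basic weather data are available.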
Lenas, Petros; Moos, Malcolm; Luyten, Frank P
2009-12-01
Recent advances in developmental biology, systems biology, and network science are converging to poise the heretofore largely empirical field of tissue engineering on the brink of a metamorphosis into a rigorous discipline based on universally accepted engineering principles of quality by design. Failure of more simplistic approaches to the manufacture of cell-based therapies has led to increasing appreciation of the need to imitate, at least to some degree, natural mechanisms that control cell fate and differentiation. The identification of many of these mechanisms, which in general are based on cell signaling pathways, is an important step in this direction. Some well-accepted empirical concepts of developmental biology, such as path-dependence, robustness, modularity, and semiautonomy of intermediate tissue forms, that appear sequentially during tissue development are starting to be incorporated in process design.
Mark Fenn; Charles Driscoll; Quingtao Zhou; Leela Rao; Thomas Meixner; Edith Allen; Fengming Yuan; Timothy Sullivan
2015-01-01
Empirical and dynamic biogeochemical modelling are complementary approaches for determining the critical load (CL) of atmospheric nitrogen (N) or other constituent deposition that an ecosystem can tolerate without causing ecological harm. The greatest benefits are obtained when these approaches are used in combination. Confounding environmental factors can complicate...
Giuliano, Karen K
2003-04-01
The philosophy of Aristotle and its impact on the process of empirical scientific inquiry has been substantial. The influence of the clarity and orderliness of his thinking, when applied to the acquisition of knowledge in nursing, cannot be overstated. Traditional empirical approaches have had and will continue to have an important influence on the development of nursing knowledge through nursing research. However, as nursing is primarily a practice discipline, the transition from empirical and syllogistic reasoning to practice is problematic. Other types of inquiry are essential in the application of nursing knowledge obtained by empirical scientific approaches and to understand how that knowledge can best be used in the care of patients. This paper reviews the strengths and limitations of syllogistic reasoning by applying it to a recently published study on temperature measurement in nursing. It then discusses possible ways that the empirical knowledge gained from that study and confirmed in its reasoning by logical analysis could be used in the daily care of critically ill patients. It concludes by highlighting the utility of broader approaches to knowledge development, including interpretative approaches and contemporary empiricism, as a way to bridge the gap between factual empirical knowledge and the practical application of that knowledge in everyday clinical nursing practice.
Goldbart, Juliet; Chadwick, Darren; Buell, Susan
2014-11-01
People with profound intellectual and multiple disabilities (PMLD) have communication impairments as one defining characteristic. To explore speech and language therapists' (SLTs) decision making in communication interventions for people with PMLD, in terms of the intervention approaches used, the factors informing the decisions to use specific interventions and the extent to which the rationales underpinning these decisions related to the components of evidence based practice (EBP), namely empirical evidence, clinical experience and client/carer views and values. A questionnaire on communication assessment and intervention for people with PMLD was sent to SLTs in the UK to elicit information on: the communication intervention approaches they used; their rationales for their intervention choices; their use of published evidence to inform decision making. Intensive interaction and objects of reference were the communication interventions most often used with people with PMLD, with some differences between children and adults evident. Rationales provided conformed somewhat to the EBP framework, though extension of the existing framework and the addition of practical and organizational considerations led to a revised typology of rationales for decision making. Rationales most frequently related to the empowerment, development and behavioural preferences of the person with PMLD. Empirical research evidence was seldom mentioned by SLTs as informing intervention decision making, leading to very diverse practice. There is a need for further research on the effectiveness of commonly used but under-evaluated interventions. There is also a need to alert SLTs to the evidence base supporting other approaches, particularly switch-based cause-and-effect approaches. © 2014 Royal College of Speech and Language Therapists.
O'Brien, D J; León-Vintró, L; McClean, B
2016-01-01
The use of radiotherapy fields smaller than 3 cm in diameter has resulted in the need for accurate detector correction factors for small field dosimetry. However, published factors do not always agree, and errors introduced by biased reference detectors, inaccurate Monte Carlo models, or experimental errors can be difficult to distinguish. The aim of this study was to provide a robust set of detector correction factors for a range of detectors using numerical, empirical, and semiempirical techniques under the same conditions, and to examine the consistency of these factors between techniques. Empirical detector correction factors were derived from small field output factor measurements for circular field sizes from 3.1 to 0.3 cm in diameter, performed with a 6 MV beam. A PTW 60019 microDiamond detector was used as the reference dosimeter. Numerical detector correction factors for the same fields were derived from calculations with a geant4 Monte Carlo model of the detectors and the Linac treatment head. Semiempirical detector correction factors were derived from the empirical output factors and the numerical dose-to-water calculations. The PTW 60019 microDiamond was found to over-respond at small field sizes, resulting in a bias in the empirical detector correction factors. The over-response was similar in magnitude to that of the unshielded diode. Good agreement was generally found between semiempirical and numerical detector correction factors, except for the PTW 60016 Diode P, where the numerical values showed a greater over-response than the semiempirical values, by 3.7% for a 1.1 cm diameter field and more for smaller fields. Detector correction factors based solely on empirical measurement or numerical calculation are subject to potential bias. A semiempirical approach, combining both empirical and numerical data, provided the most reliable results.
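The semiempirical construction described above amounts to dividing a Monte Carlo dose-to-water output factor by a measured detector output ratio. A sketch of that arithmetic, with illustrative numbers only (not the paper's data):

```python
def output_factor(reading_field, reading_ref):
    """Ratio of detector signal (or dose) in the small field to that
    in the machine-specific reference field."""
    return reading_field / reading_ref

def semiempirical_k(dw_field, dw_ref, m_field, m_ref):
    """Small-field output correction factor: the Monte Carlo dose-to-water
    output factor divided by the measured detector output ratio.
    k > 1 means the detector under-responds in the small field,
    k < 1 means it over-responds."""
    return output_factor(dw_field, dw_ref) / output_factor(m_field, m_ref)

# Illustrative numbers: a diode that over-responds by roughly 4% in a
# small field needs a correction factor of about 0.96.
k = semiempirical_k(dw_field=0.673, dw_ref=1.000, m_field=0.701, m_ref=1.000)
print(round(k, 3))
```

Because the measured ratio and the Monte Carlo dose ratio come from independent sources, a bias in either one (a biased reference detector, an inaccurate model) propagates directly into k, which is the consistency problem the abstract highlights.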
An empirically based model for knowledge management in health care organizations.
Sibbald, Shannon L; Wathen, C Nadine; Kothari, Anita
2016-01-01
Knowledge management (KM) encompasses strategies, processes, and practices that allow an organization to capture, share, store, access, and use knowledge. Ideal KM combines different sources of knowledge to support innovation and improve performance. Despite the importance of KM in health care organizations (HCOs), there has been very little empirical research to describe KM in this context. This study explores KM in HCOs, focusing on the status of current intraorganizational KM. The intention is to provide insight for future studies and model development for effective KM implementation in HCOs. A qualitative methods approach was used to create an empirically based model of KM in HCOs. Methods included (a) qualitative interviews (n = 24) with senior leadership to identify types of knowledge important in these roles plus current information-seeking behaviors/needs and (b) in-depth case study with leaders in new executive positions (n = 2). The data were collected from 10 HCOs. Our empirically based model for KM was assessed for face and content validity. The findings highlight the paucity of formal KM in our sample HCOs. Organizational culture, leadership, and resources are instrumental in supporting KM processes. An executive's knowledge needs are extensive, but knowledge assets are often limited or difficult to acquire as much of the available information is not in a usable format. We propose an empirically based model for KM to highlight the importance of context (internal and external), and knowledge seeking, synthesis, sharing, and organization. Participants who reviewed the model supported its basic components and processes, and potential for incorporating KM into organizational processes. Our results articulate ways to improve KM, increase organizational learning, and support evidence-informed decision-making. 
This research has implications for how to better integrate evidence and knowledge into organizations while considering context and the role of organizational processes.
Equivalent income and fair evaluation of health care.
Fleurbaey, Marc; Luchini, Stéphane; Muller, Christophe; Schokkaert, Erik
2013-06-01
We argue that the economic evaluation of health care (cost-benefit analysis) should respect individual preferences and should incorporate distributional considerations. Relying on individual preferences does not imply subjective welfarism. We propose a particular non-welfarist approach, based on the concept of equivalent income, and show how it helps to define distributional weights. We illustrate the feasibility of our approach with empirical results from a pilot survey. Copyright © 2012 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Aagaard, Jesper
2017-01-01
Over time, phenomenology has become a viable approach to conducting qualitative studies in education. Popular and well-established methods include descriptive and hermeneutic phenomenology. Based on critiques of the essentialism and receptivity of these two methods, however, this article offers a third variation of empirical phenomenology:…
DOT National Transportation Integrated Search
2000-08-01
The report describes a risk-based approach for assessing the implications of higher train speeds on highway-railroad grade crossing safety, and allocating limited resources to best reduce this risk. To predict accident frequency, an existing DOT mode...
Re-Storying an Entrepreneurial Identity: Education, Experience and Self-Narrative
ERIC Educational Resources Information Center
Harmeling, Susan S.
2011-01-01
Purpose: This paper aims to explore the ways in which entrepreneurship education may serve as an identity workspace. Design/methodology/approach: This is a conceptual/theoretical paper based on previously completed empirical work. Findings: The paper makes the connection between worldmaking, experience, action and identity. Practical implications:…
ADHD and Reading Disabilities: A Cluster Analytic Approach for Distinguishing Subgroups.
ERIC Educational Resources Information Center
Bonafina, Marcela A.; Newcorn, Jeffrey H.; McKay, Kathleen E.; Koda, Vivian H.; Halperin, Jeffrey M.
2000-01-01
Using cluster analysis, a study empirically divided 54 children with attention-deficit/hyperactivity disorder (ADHD) based on their Full Scale IQ and reading ability. Clusters had different patterns of cognitive, behavioral, and neurochemical functions, as determined by discrepancies in Verbal-Performance IQ, academic achievement, parent…
Literacy, Competence and Meaning-Making: A Human Sciences Approach
ERIC Educational Resources Information Center
Nikolajeva, Maria
2010-01-01
This semiotically informed article problematizes the concept of literacy as an aesthetic activity rather than reading skills and offers strategies for assessing young readers' understanding of fictional texts. Although not based on empirical research, the essay refers to and theorizes from extensive field studies of children's responses to…
Psychological Research in Educational Technology in China
ERIC Educational Resources Information Center
Ru-De, Liu
2010-01-01
Information and communication technology (ICT) has increasingly been bringing about significant changes in education in an ongoing process. The educational reform is not a mere technological issue but rather is based on an empirical grounding in a psychological research approach to learning and instruction. This paper introduces the research work…
Familia Adelante: A Multi-Risk Prevention Intervention for Latino Families
ERIC Educational Resources Information Center
Cervantes, Richard; Goldbach, Jeremy; Santos, Susana M.
2011-01-01
A comprehensive approach for providing behavioral health services to youth is becoming increasingly emphasized. Latino youth are at increased risk for substance abuse, mental health concerns, unsafe sexual practices and HIV, and these outcomes have been empirically connected to individual, family and community-based stress. Despite this knowledge,…
Assessing the Robustness of Chemical Prioritizations Based on ToxCast Chemical Profiling
A central goal of the U.S. EPA’s ToxCast™ program is to provide empirical, scientific evidence to aid in prioritizing the toxicity testing of thousands of chemicals. The agency has developed a prioritization approach, the Toxicological Prioritization Index (ToxPi™), that calculat...
Developing International Managers: The Contribution of Cultural Experience to Learning
ERIC Educational Resources Information Center
Townsend, Peter; Regan, Padraic; Li, Liang Liang
2015-01-01
Purpose: The purpose of this paper is to evaluate cultural experience as a learning strategy for developing international managers. Design/methodology/approach: Using an integrated framework, two quantitative studies, based on empirical methodology, are conducted. Study 1, with an undergraduate sample situated in the Asia Pacific, aimed to examine…
Modelling Diffusion of a Personalized Learning Framework
ERIC Educational Resources Information Center
Karmeshu; Raman, Raghu; Nedungadi, Prema
2012-01-01
A new modelling approach for the diffusion of personalized learning as an educational process innovation in a social group comprising adopter-teachers is proposed. An empirical analysis regarding the perception of 261 adopter-teachers from 18 schools in India about a particular personalized learning framework has been made. Based on this analysis,…
E. Gregory McPherson; Paula J. Peper
2012-01-01
This paper describes three long-term tree growth studies conducted to evaluate tree performance because repeated measurements of the same trees produce critical data for growth model calibration and validation. Several empirical and process-based approaches to modeling tree growth are reviewed. Modeling is more advanced in the fields of forestry and...
Beginning Student Teachers' Teacher Identities Based on Their Practical Theories
ERIC Educational Resources Information Center
Stenberg, Katariina; Karlsson, Liisa; Pitkaniemi, Harri; Maaranen, Katriina
2014-01-01
In this article, we investigate first-year student teachers' teacher identities through their practical theories and ask what these practical theories reveal about their emerging teacher identities. This study approaches teacher identity from a dialogical viewpoint where identity is constructed through various positions. The empirical part of this…
Visual resources and the public: an empirical approach
Rachel Kaplan
1979-01-01
Visual resource management systems incorporate many assumptions about how people see the landscape. While these assumptions are not articulated, they nonetheless affect the decision process. Problems inherent in some of these assumptions are examined. Extensive research based on people's preference ratings of different settings provides insight into people's...
Mixture toxicology in the 21st century: Pathway-based concepts and tools
The past decade has witnessed notable evolution of approaches focused on predicting chemical hazards and risks in the absence of empirical data from resource-intensive in vivo toxicity tests. In silico models, in vitro high-throughput toxicity assays, and short-term in vivo tests...
Entrepreneurship Education in Italian Universities: Trend, Situation and Opportunities
ERIC Educational Resources Information Center
Iacobucci, Donato; Micozzi, Alessandra
2012-01-01
Purpose: The purpose of this paper is to provide an analysis of the present situation and recent evolution of entrepreneurship education in Italian universities and to discuss whether these courses and curricula match the demand for entrepreneurial competences. Design/methodology/approach: The empirical analysis is based on a census of…
A Contingent Analysis of the Relationship between IS Implementation Strategies and IS Success.
ERIC Educational Resources Information Center
Kim, Sang-Hoon; Lee, Jinjoo
1991-01-01
Considers approaches to dealing with user attitudes toward newly implemented information systems (IS), and suggests that behavioral management strategies relevant to IS fall into three categories: (1) empirical/rational; (2) normative/reeducative; and (3) power/coercive, based on "planned change" theories. An integrative contingent model…
Neuromyths among Teachers and Student Teachers
ERIC Educational Resources Information Center
Tardif, Eric; Doudin, Pierre-André; Meylan, Nicolas
2015-01-01
Many so-called brain-based educational approaches have been strongly criticized for their lack of empirical support and occasionally for their use of pseudoscientific concepts. As a result, several authors use the term neuromyths to refer to false beliefs or misinterpretations regarding neuroscientific facts. We surveyed both teachers and student teachers…
Organizational Commitment, Knowledge Management Interventions, and Learning Organization Capacity
ERIC Educational Resources Information Center
Massingham, Peter; Diment, Kieren
2009-01-01
Purpose: The purpose of this paper is to examine the relationship between organizational commitment and knowledge management initiatives in developing learning organization capacity (LOC). Design/methodology/approach: This is an empirical study based on a single case study, using partial least squares (PLS) analysis. Findings: The strategic…
Evidence Valued and Used by Health Promotion Practitioners
ERIC Educational Resources Information Center
Li, V.; Carter, S. M.; Rychetnik, L.
2015-01-01
The use of evidence has become a foundational part of health promotion practice. Although there is a general consensus that adopting an evidence-based approach is necessary for practice, disagreement remains about what types of evidence practitioners should use to guide their work. An empirical understanding of how practitioners conceptualize and…
Counseling Practice: In Defense of Passive Modes of Professional Engagement
ERIC Educational Resources Information Center
Hansen, James T.
2010-01-01
Historically, passive ideologies of counseling have regularly morphed into active approaches. The author contends that professional power and status are the underlying motives for this ideological transition. Based on empirical findings and recent philosophical developments, a case is made for the counseling profession to revalue passive…
Data layer integration for the national map of the united states
Usery, E.L.; Finn, M.P.; Starbuck, M.
2009-01-01
The integration of geographic data layers in multiple raster and vector formats, from many different organizations and at a variety of resolutions and scales, is a significant problem for The National Map of the United States being developed by the U.S. Geological Survey. Our research has examined data integration from a layer-based approach for five of The National Map data layers: digital orthoimages, elevation, land cover, hydrography, and transportation. An empirical approach has included visual assessment by a set of respondents with statistical analysis to establish the meaning of various types of integration. A separate theoretical approach with established hypotheses tested against actual data sets has resulted in an automated procedure for integration of specific layers and is being tested. The empirical analysis has established resolution bounds on meanings of integration with raster datasets and distance bounds for vector data. The theoretical approach has used a combination of theories on cartographic transformation and generalization, such as Töpfer's radical law, and additional research concerning optimum viewing scales for digital images to establish a set of guiding principles for integrating data of different resolutions.
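Of the cartographic generalization theories mentioned, Töpfer's radical law has a simple closed form relating feature counts at two map scales. A sketch (the feature counts and scales below are illustrative, not The National Map's layers):

```python
import math

def radical_law(n_source, scale_source, scale_target):
    """Töpfer's radical law of selection: expected number of features
    retained when generalizing from a source map scale to a smaller
    (larger-denominator) target scale. Scales are given as the
    denominators of their representative fractions."""
    return n_source * math.sqrt(scale_source / scale_target)

# Generalizing 100 hydrography features from 1:25,000 to 1:100,000:
print(round(radical_law(100, 25000, 100000)))  # 50 features retained
```

Halving the feature count for a four-fold scale reduction is the kind of quantitative bound that can anchor the resolution-dependent integration rules the abstract describes.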
Peterson, Kathryn M; Piazza, Cathleen C; Volkert, Valerie M
2016-09-01
Treatments of pediatric feeding disorders based on applied behavior analysis (ABA) have the most empirical support in the research literature (Volkert & Piazza, 2012); however, professionals often recommend, and caregivers often use, treatments that have limited empirical support. In the current investigation, we compared a modified sequential oral sensory approach (M-SOS; Benson, Parke, Gannon, & Muñoz, 2013) to an ABA approach for the treatment of the food selectivity of 6 children with autism. We randomly assigned 3 children to ABA and 3 children to M-SOS and compared the effects of treatment in a multiple baseline design across novel, healthy target foods. We used a multielement design to assess treatment generalization. Consumption of target foods increased for children who received ABA, but not for children who received M-SOS. We subsequently implemented ABA with the children for whom M-SOS was not effective and observed a potential treatment generalization effect during ABA when M-SOS preceded ABA. © 2016 Society for the Experimental Analysis of Behavior.
Driscoll, Mary A; Kerns, Robert D
Chronic pain is a significant public health concern. For many, chronic pain is associated with declines in physical functioning and increases in emotional distress. Additionally, the socioeconomic burden associated with costs of care, lost wages and declines in productivity are significant. A large and growing body of research continues to support the biopsychosocial model as the predominant framework for conceptualizing the experience of chronic pain and its multiple negative impacts. The model also informs a widely accepted and empirically supported approach for the optimal management of chronic pain. This chapter briefly articulates the historical foundations of the biopsychosocial model of chronic pain followed by a relatively detailed discussion of an empirically informed, integrated, multimodal and interdisciplinary treatment approach. The role of mental health professionals, especially psychologists, in the management of chronic pain is particularly highlighted.
A Model-Based Approach to Support Validation of Medical Cyber-Physical Systems.
Silva, Lenardo C; Almeida, Hyggo O; Perkusich, Angelo; Perkusich, Mirko
2015-10-30
Medical Cyber-Physical Systems (MCPS) are context-aware, life-critical systems with patient safety as the main concern, demanding rigorous processes for validation to guarantee user requirement compliance and specification-oriented correctness. In this article, we propose a model-based approach for early validation of MCPS, focusing on promoting reusability and productivity. It enables system developers to build MCPS formal models based on a library of patient and medical device models, and simulate the MCPS to identify undesirable behaviors at design time. Our approach has been applied to three different clinical scenarios to evaluate its reusability potential for different contexts. We have also validated our approach through an empirical evaluation with developers to assess productivity and reusability. Finally, our models have been formally verified considering functional and safety requirements and model coverage.
Parenthood and Worrying About Climate Change: The Limitations of Previous Approaches.
Ekholm, Sara; Olofsson, Anna
2017-02-01
The present study considers the correlation between parenthood and worry about the consequences of climate change. Two approaches to gauging people's perceptions of the risks of climate change are compared: the classic approach, which measures risk perception, and the emotion-based approach, which measures feelings toward a risk object. The empirical material is based on a questionnaire-based survey of 3,529 people in Sweden, of whom 1,376 answered, giving a response rate of 39%. The results show that the correlation of parenthood and climate risk is significant when the emotional aspect is raised, but not when respondents were asked to do cognitive estimates of risk. Parenthood proves significant in all three questions that measure feelings, demonstrating that it is a determinant that serves to increase worry about climate change. © 2016 Society for Risk Analysis.
Louis R. Iverson; Anantha M. Prasad; Stephen N. Matthews; Matthew P. Peters
2010-01-01
Climate change will likely cause impacts that are species specific and significant; modeling is critical to better understand potential changes in suitable habitat. We use empirical, abundance-based habitat models utilizing decision tree-based ensemble methods to explore potential changes of 134 tree species habitats in the eastern United States (http://www.nrs.fs.fed....
Attention focussing and anomaly detection in real-time systems monitoring
NASA Technical Reports Server (NTRS)
Doyle, Richard J.; Chien, Steve A.; Fayyad, Usama M.; Porta, Harry J.
1993-01-01
In real-time monitoring situations, more information is not necessarily better. When faced with complex emergency situations, operators can experience information overload and a compromising of their ability to react quickly and correctly. We describe an approach to focusing operator attention in real-time systems monitoring based on a set of empirical and model-based measures for determining the relative importance of sensor data.
ERIC Educational Resources Information Center
Aleong, Chandra
2007-01-01
This paper discusses whether there are differences in performance based on differences in strategy. First, an attempt was made to determine whether the institution had a strategy and, if so, whether it followed a particular model. Major models of strategy include the industry analysis approach, the resource-based view (RBV) model and the more recent,…
Kepner, Gordon R
2010-04-13
The numerous natural phenomena that exhibit saturation behavior, e.g., ligand binding and enzyme kinetics, have been approached, to date, via empirical and particular analyses. This paper presents a mechanism-free, and assumption-free, second-order differential equation, designed only to describe a typical relationship between the variables governing these phenomena. It develops a mathematical model for this relation, based solely on the analysis of the typical experimental data plot and its saturation characteristics. Its utility complements the traditional empirical approaches. For the general saturation curve, described in terms of its independent (x) and dependent (y) variables, a second-order differential equation is obtained that applies to any saturation phenomena. It shows that the driving factor for the basic saturation behavior is the probability of the interactive site being free, which is described quantitatively. Solving the equation relates the variables in terms of the two empirical constants common to all these phenomena, the initial slope of the data plot and the limiting value at saturation. A first-order differential equation for the slope emerged that led to the concept of the effective binding rate at the active site and its dependence on the calculable probability the interactive site is free. These results are illustrated using specific cases, including ligand binding and enzyme kinetics. This leads to a revised understanding of how to interpret the empirical constants, in terms of the variables pertinent to the phenomenon under study. The second-order differential equation revealed the basic underlying relations that describe these saturation phenomena, and the basic mathematical properties of the standard experimental data plot. It was shown how to integrate this differential equation, and define the common basic properties of these phenomena. 
The results regarding the importance of the slope and the new perspectives on the empirical constants governing the behavior of these phenomena led to an alternative perspective on saturation behavior kinetics. Their essential commonality was revealed by this analysis, based on the second-order differential equation.
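The two empirical constants the abstract identifies — the initial slope of the data plot and the limiting value at saturation — fully determine the familiar rectangular-hyperbola solution y = ax/(1 + ax/y_max). A minimal numeric sketch of that curve (the parameter values are hypothetical, not taken from the paper):

```python
def saturation(x, a, y_max):
    """Rectangular-hyperbola saturation curve with initial slope `a`
    and limiting value `y_max` (the Michaelis-Menten form)."""
    return a * x / (1.0 + a * x / y_max)

a, y_max = 2.0, 10.0
small = saturation(1e-6, a, y_max)   # ~ a * x in the linear regime
large = saturation(1e9, a, y_max)    # approaches y_max at saturation
p_free = 1.0 - large / y_max         # fraction of interactive sites free -> 0
```

At small x the curve reduces to y ≈ ax, and the fraction 1 − y/y_max plays the role of the probability that the interactive site is free, which the paper identifies as the driving factor of saturation behavior.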
Scudder, Ashley; Herschell, Amy D
2015-08-01
In order to make evidence-based treatments (EBTs) available to a large number of children and families, developers and expert therapists have used their experience and expertise to train community-based therapists in EBTs. Understanding the current training practices of treatment experts may be one method for establishing best practices for training community-based therapists prior to comprehensive empirical examinations of training practices. A qualitative study was conducted using surveys and phone interviews to identify the specific procedures used by treatment experts to train therapists in, and implement, an evidence-based treatment in community settings. Twenty-three doctoral-level clinical psychologists were identified to participate because of their expertise in conducting and training Parent-Child Interaction Therapy. Semi-structured qualitative interviews were completed by phone, later transcribed verbatim, and analyzed using thematic coding. The de-identified data were coded by two independent qualitative data researchers and then compared for consistency of interpretation. The themes that emerged following the final coding were used to construct a training protocol to be empirically tested. The goal of this paper is not only to describe the current state of training practices for a particular EBT, Parent-Child Interaction Therapy, but also to illustrate the use of expert opinion as the best available evidence in preparation for empirical evaluation.
“Nobody tosses a dwarf!” The relation between the empirical and normative reexamined
Leget, C.; Borry, P.; De Vries, R.
2009-01-01
This article discusses the relation between empirical and normative approaches in bioethics. The issue of dwarf tossing, while admittedly unusual, is chosen as a point of departure because it challenges the reader to look upon several central bioethical themes – including human dignity, autonomy, and the protection of vulnerable people – with fresh eyes. After an overview of current approaches to the integration of empirical and normative ethics, we consider five ways that the empirical and normative can be brought together to speak to the problem of dwarf tossing: prescriptive applied ethics, theorist ethics, critical applied ethics, particularist ethics and integrated empirical ethics. We defend a position of critical applied ethics that allows for a two-way relation between empirical and normative theories. The approach we endorse acknowledges that a social practice can and should be judged both by the gathering of empirical data and by normative ethics. Critical applied ethics uses a five-stage process that includes: (a) determination of the problem, (b) description of the problem, (c) empirical study of effects and alternatives, (d) normative weighing and (e) evaluation of the effects of a decision. In each stage, we explore the perspective from both the empirical (sociological) and the normative ethical poles which, in our view, should operate as the two independent foci of the ellipse that is called bioethics. We conclude by applying our five-stage critical applied ethics to the example of dwarf tossing. PMID:19338523
The causes of international labor migrations--a demand-determined approach.
Straubhaar, T
1986-01-01
The author first studies the reasons why people migrate using a neoclassical approach concerning income differentials. He tests this approach empirically and demonstrates its limits. A demand-determination approach based on human capital theory is then outlined to overcome these limits and to take into account restrictive immigration controls. Migration from Italy, Spain, Greece, Portugal, and Turkey to the European Community destination countries is examined. It is concluded that "the demand for immigrants in the destination country is the decisive condition for the phenomenon of international labor migration, and the supply of migration-willing workers is only a necessary condition." excerpt
Empirical intrinsic geometry for nonlinear modeling and time series filtering.
Talmon, Ronen; Coifman, Ronald R
2013-07-30
In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.
Empirical predictions of hypervelocity impact damage to the space station
NASA Technical Reports Server (NTRS)
Rule, W. K.; Hayashida, K. B.
1991-01-01
A family of user-friendly, DOS PC-based Microsoft BASIC programs written to provide spacecraft designers with empirical predictions of space debris damage to orbiting spacecraft is described. The spacecraft wall configuration is assumed to consist of multilayer insulation (MLI) placed between a Whipple-style bumper and the pressure wall. Predictions are based on data sets of experimental results obtained by simulating debris impacts on spacecraft using light gas guns on Earth. A module of the program facilitates the creation of the database of experimental results used by the damage prediction modules of the code. The user has the choice of three different prediction modules to predict damage to the bumper, the MLI, and the pressure wall. One prediction module is based on fitting low-order polynomials through subsets of the experimental data. Another fits functions based on nondimensional parameters through the data. The last is a unique approach based on weighting the experimental data according to distance from the design point.
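The first prediction module — low-order polynomials fitted through subsets of the experimental data — can be sketched in miniature with a degree-1 least-squares fit. The velocity and depth values below are invented for illustration, not taken from the report:

```python
def linefit(xs, ys):
    """Least-squares straight line (the simplest low-order polynomial fit),
    mirroring the report's polynomial-fit prediction module in miniature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical light-gas-gun data: impact velocity (km/s) vs. pit depth (mm).
velocity = [3.0, 4.0, 5.0, 6.0, 7.0]
depth = [1.2, 1.9, 2.6, 3.3, 4.0]
slope, intercept = linefit(velocity, depth)
estimate = slope * 5.5 + intercept   # interpolated damage at an untested velocity
```

The actual programs interpolate within the data set nearest the design point; the weighting-by-distance module the abstract mentions would replace the uniform least-squares weights with distance-dependent ones.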
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenn, Mark E.; Driscoll, Charles; Zhou, Qingtao
2015-01-01
Empirical and dynamic biogeochemical modelling are complementary approaches for determining the critical load (CL) of atmospheric nitrogen (N) or other constituent deposition that an ecosystem can tolerate without causing ecological harm. The greatest benefits are obtained when these approaches are used in combination. Confounding environmental factors can complicate the determination of empirical CLs across depositional gradients, while the experimental application of N amendments for estimating the CL does not realistically mimic the effects of chronic atmospheric N deposition. Biogeochemical and vegetation simulation models can provide CL estimates and valuable ecosystem response information, allowing for past and future scenario testing with various combinations of environmental factors, pollutants, pollutant control options, land management, and ecosystem response parameters. Even so, models are fundamentally gross simplifications of the real ecosystems they attempt to simulate. Empirical approaches are vital as a check on simulations and CL estimates, to parameterize models, and to elucidate mechanisms and responses under real world conditions. In this chapter, we provide examples of empirical and modelled N CL approaches in ecosystems from three regions of the United States: mixed conifer forest, desert scrub and pinyon-juniper woodland in California; alpine catchments in the Rocky Mountains; and lakes in the Adirondack region of New York state.
NASA Astrophysics Data System (ADS)
Poderoso, Charie
Science education reforms in U.S. schools emphasize the importance of students' construction of knowledge through inquiry. Organizations such as the National Science Foundation (NSF), the National Research Council (NRC), and the American Association for the Advancement of Science (AAAS) have demonstrated a commitment to searching for solutions and renewed efforts to improve science education. One suggestion for science education reform in U.S. schools was a transition from traditional, didactic, textbook-based instruction to inquiry-based instructional programs. While inquiry has shown evidence for improved student learning in science, what is needed is empirical evidence of those inquiry-based practices that affect student outcomes in a local context. This study explores the relationship between instructional programs and curricular changes affecting student outcomes in the Santa Ana Unified School District (SAUSD): it provides evidence related to achievement and attitudes. SAUSD employs two approaches to teaching in the middle school science classrooms: traditional and inquiry-based. The Leadership and Assistance for Science Education Reform (LASER) program is an inquiry-based science program that utilizes resources for implementation of the University of California Berkeley's Lawrence Hall of Science Education for Public Understanding Program (SEPUP) to support inquiry-based teaching and learning. Findings in this study provide empirical support related to outcomes of seventh-grade students, N = 328, in the LASER and traditional science programs in SAUSD.
NASA Astrophysics Data System (ADS)
Teal, Lorna R.; Marras, Stefano; Peck, Myron A.; Domenici, Paolo
2018-02-01
Models are useful tools for predicting the impact of global change on species distribution and abundance. As ectotherms, fish are being challenged to adapt or track changes in their environment, either in time through a phenological shift or in space by a biogeographic shift. Past modelling efforts have largely been based on correlative Species Distribution Models, which use known occurrences of species across landscapes of interest to define sets of conditions under which species are likely to maintain populations. The practical advantages of this correlative approach are its simplicity and the flexibility in terms of data requirements. However, effective conservation management requires models that make projections beyond the range of available data. One way to deal with such an extrapolation is to use a mechanistic approach based on physiological processes underlying climate change effects on organisms. Here we illustrate two approaches for developing physiology-based models to characterize fish habitat suitability. (i) Aerobic Scope Models (ASM) are based on the relationship between environmental factors and aerobic scope (defined as the difference between maximum and standard (basal) metabolism). This approach is based on experimental data collected by using a number of treatments that allow a function to be derived to predict aerobic metabolic scope from the stressor/environmental factor(s). This function is then integrated with environmental (oceanographic) data of current and future scenarios. For any given species, this approach allows habitat suitability maps to be generated at various spatiotemporal scales. The strength of the ASM approach relies on the estimate of relative performance when comparing, for example, different locations or different species. (ii) Dynamic Energy Budget (DEB) models are based on first principles including the idea that metabolism is organised in the same way within all animals. 
The (standard) DEB model aims to describe empirical relationships which can be found consistently within physiological data across the animal kingdom. The advantages of the DEB models are that they make use of the generalities found in terms of animal physiology and can therefore be applied to species for which little data or empirical observations are available. In addition, the limitations as well as useful potential refinements of these and other physiology-based modelling approaches are discussed. Inclusion of the physiological response of various life stages and modelling the patterns of extreme events observed in nature are suggested for future work.
A Market-Based Approach to Multi-factory Scheduling
NASA Astrophysics Data System (ADS)
Vytelingum, Perukrishnen; Rogers, Alex; MacBeth, Douglas K.; Dutta, Partha; Stranjak, Armin; Jennings, Nicholas R.
In this paper, we report on the design of a novel market-based approach for decentralised scheduling across multiple factories. Specifically, because of the limitations of scheduling in a centralised manner - which requires a center to have complete and perfect information for optimality and the truthful revelation of potentially commercially private preferences to that center - we advocate an informationally decentralised approach that is both agile and dynamic. In particular, this work adopts a market-based approach for decentralised scheduling by considering the different stakeholders representing different factories as self-interested, profit-motivated economic agents that trade resources for the scheduling of jobs. The overall schedule of these jobs is then an emergent behaviour of the strategic interaction of these trading agents bidding for resources in a market based on limited information and their own preferences. Using a simple (zero-intelligence) bidding strategy, we empirically demonstrate that our market-based approach achieves a lower bound efficiency of 84%. This represents a trade-off between a reasonable level of efficiency (compared to a centralised approach) and the desirable benefits of a decentralised solution.
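The zero-intelligence strategy the authors evaluate can be sketched as a toy double auction: traders quote random prices constrained only by their private valuations or costs, and efficiency is the realized share of the maximum gains from trade. This is a generic ZI sketch under invented valuations, not the authors' multi-factory scheduler:

```python
import random

def zi_market(buyer_vals, seller_costs, rounds=2000, seed=0):
    """Zero-intelligence trading: each round a random buyer bids uniformly
    below its valuation and a random seller asks uniformly above its cost
    (values assumed in [0, 1]); a trade clears when bid >= ask.
    Returns realized surplus / maximum possible surplus (efficiency)."""
    rng = random.Random(seed)
    buyers = sorted(buyer_vals, reverse=True)   # untraded valuations
    sellers = sorted(seller_costs)              # untraded costs
    surplus = 0.0
    for _ in range(rounds):
        if not buyers or not sellers:
            break
        b = rng.randrange(len(buyers))
        s = rng.randrange(len(sellers))
        bid = rng.uniform(0.0, buyers[b])
        ask = rng.uniform(sellers[s], 1.0)
        if bid >= ask:                          # implies valuation >= cost
            surplus += buyers[b] - sellers[s]   # gains from this trade
            buyers.pop(b)
            sellers.pop(s)
    # Maximum surplus: match highest valuations with lowest costs.
    best = sum(v - c for v, c in zip(sorted(buyer_vals, reverse=True),
                                     sorted(seller_costs)) if v > c)
    return surplus / best

eff = zi_market([0.9, 0.8, 0.7, 0.6, 0.3], [0.1, 0.2, 0.3, 0.4, 0.7])
```

Because bids never exceed valuations and asks never fall below costs, every cleared trade has non-negative surplus, so efficiency lies in [0, 1]; the paper's 84% lower bound refers to its own multi-factory setting, not this toy.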
Empirical Bayes estimation of proportions with application to cowbird parasitism rates
Link, W.A.; Hahn, D.C.
1996-01-01
Bayesian models provide a structure for studying collections of parameters such as are considered in the investigation of communities, ecosystems, and landscapes. This structure allows for improved estimation of individual parameters, by considering them in the context of a group of related parameters. Individual estimates are differentially adjusted toward an overall mean, with the magnitude of their adjustment based on their precision. Consequently, Bayesian estimation allows for a more credible identification of extreme values in a collection of estimates. Bayesian models regard individual parameters as values sampled from a specified probability distribution, called a prior. The requirement that the prior be known is often regarded as an unattractive feature of Bayesian analysis and may be the reason why Bayesian analyses are not frequently applied in ecological studies. Empirical Bayes methods provide an alternative approach that incorporates the structural advantages of Bayesian models while requiring a less stringent specification of prior knowledge. Rather than requiring that the prior distribution be known, empirical Bayes methods require only that it be in a certain family of distributions, indexed by hyperparameters that can be estimated from the available data. This structure is of interest per se, in addition to its value in allowing for improved estimation of individual parameters; for example, hypotheses regarding the existence of distinct subgroups in a collection of parameters can be considered under the empirical Bayes framework by allowing the hyperparameters to vary among subgroups. Though empirical Bayes methods have been applied in a variety of contexts, they have received little attention in the ecological literature. We describe the empirical Bayes approach in application to estimation of proportions, using data obtained in a community-wide study of cowbird parasitism rates for illustration. 
Since observed proportions based on small sample sizes are heavily adjusted toward the mean, extreme values among empirical Bayes estimates identify those species for which there is the greatest evidence of extreme parasitism rates. Applying a subgroup analysis to our data on cowbird parasitism rates, we conclude that parasitism rates for Neotropical Migrants as a group are no greater than those of Resident/Short-distance Migrant species in this forest community. Our data and analyses demonstrate that the parasitism rates for certain Neotropical Migrant species are remarkably low (Wood Thrush and Rose-breasted Grosbeak) while those for others are remarkably high (Ovenbird and Red-eyed Vireo).
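The shrinkage described above — observed proportions adjusted toward an overall mean in proportion to their precision — can be sketched with a beta-binomial empirical Bayes estimator. A minimal sketch, assuming a method-of-moments fit of the Beta prior (a crude choice, since raw-proportion moments also include binomial sampling noise); the nest counts are hypothetical, not the study's data:

```python
def eb_beta_binomial(counts, totals):
    """Empirical Bayes shrinkage of binomial proportions: fit a
    Beta(alpha, beta) prior to the observed proportions by the method
    of moments, then return each posterior mean
    (x_i + alpha) / (n_i + alpha + beta)."""
    props = [x / n for x, n in zip(counts, totals)]
    m = sum(props) / len(props)                              # prior mean
    v = sum((p - m) ** 2 for p in props) / (len(props) - 1)  # prior variance
    common = m * (1 - m) / v - 1                             # alpha + beta
    alpha, beta = m * common, (1 - m) * common
    return [(x + alpha) / (n + alpha + beta) for x, n in zip(counts, totals)]

# Hypothetical parasitized / monitored nest counts for four species.
parasitized = [1, 5, 9, 2]
monitored = [4, 20, 12, 30]
shrunk = eb_beta_binomial(parasitized, monitored)
```

Each estimate is a precision-weighted average of the raw proportion and the prior mean, so small-sample species (n = 4) are pulled hardest toward the overall mean, exactly the adjustment the abstract describes.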
An EQT-cDFT approach to determine thermodynamic properties of confined fluids.
Mashayak, S Y; Motevaselian, M H; Aluru, N R
2015-06-28
We present a continuum-based approach to predict the structure and thermodynamic properties of confined fluids at multiple length-scales, ranging from a few angstroms to macro-meters. The continuum approach is based on the empirical potential-based quasi-continuum theory (EQT) and classical density functional theory (cDFT). EQT is a simple and fast approach to predict inhomogeneous density and potential profiles of confined fluids. We use EQT potentials to construct a grand potential functional for cDFT. The EQT-cDFT-based grand potential can be used to predict various thermodynamic properties of confined fluids. In this work, we demonstrate the EQT-cDFT approach by simulating Lennard-Jones fluids, namely, methane and argon, confined inside slit-like channels of graphene. We show that the EQT-cDFT can accurately predict the structure and thermodynamic properties, such as density profiles, adsorption, local pressure tensor, surface tension, and solvation force, of confined fluids as compared to the molecular dynamics simulation results.
An Empirical Model and Ethnic Differences in Cultural Meanings Via Motives for Suicide.
Chu, Joyce; Khoury, Oula; Ma, Johnson; Bahn, Francesca; Bongar, Bruce; Goldblum, Peter
2017-10-01
The importance of cultural meanings via motives for suicide - what is considered acceptable to motivate suicide - has been advocated as a key step in understanding and preventing development of suicidal behaviors. There have been limited systematic empirical attempts to establish different cultural motives ascribed to suicide across ethnic groups. We used a mixed methods approach and grounded theory methodology to guide the analysis of qualitative data querying for meanings via motives for suicide among 232 Caucasians, Asian Americans, and Latino/a Americans with a history of suicide attempts, ideation, intent, or plan. We used subsequent logistic regression analyses to examine ethnic differences in suicide motive themes. This inductive approach of generating theory from data yielded an empirical model of 6 cultural meanings via motives for suicide themes: intrapersonal perceptions, intrapersonal emotions, intrapersonal behavior, interpersonal, mental health/medical, and external environment. Logistic regressions showed ethnic differences in intrapersonal perceptions (low endorsement by Latino/a Americans) and external environment (high endorsement by Latino/a Americans) categories. Results advance suicide research and practice by establishing 6 empirically based cultural motives for suicide themes that may represent a key intermediary step in the pathway toward suicidal behaviors. Clinicians can use these suicide meanings via motives to guide their assessment and determination of suicide risk. Emphasis on environmental stressors rather than negative perceptions like hopelessness should be considered with Latino/a clients. © 2017 Wiley Periodicals, Inc.
A method of reflexive balancing in a pragmatic, interdisciplinary and reflexive bioethics.
Ives, Jonathan
2014-07-01
In recent years there has been a wealth of literature arguing the need for empirical and interdisciplinary approaches to bioethics, based on the premise that an empirically informed ethical analysis is more grounded, contextually sensitive and therefore more relevant to clinical practice than an 'abstract' philosophical analysis. Bioethics has (arguably) always been an interdisciplinary field, and the rise of 'empirical' (bio)ethics need not be seen as an attempt to give a new name to the longstanding practice of interdisciplinary collaboration, but can perhaps best be understood as a substantive attempt to engage with the nature of that interdisciplinarity and to articulate the relationship between the many different disciplines (some of them empirical) that contribute to the field. It can also be described as an endeavour to explain how different disciplinary approaches can be integrated to effectively answer normative questions in bioethics, and fundamental to that endeavour is the need to think about how a robust methodology can be articulated that successfully marries apparently divergent epistemological and metaethical perspectives with method. This paper proposes 'Reflexive Bioethics' (RB) as a methodology for interdisciplinary and empirical bioethics, which utilizes a method of 'Reflexive Balancing' (RBL). RBL has been developed in response to criticisms of various forms of reflective equilibrium, and is built upon a pragmatic characterization of Bioethics and a 'quasi-moral foundationalism', which allows RBL to avoid some of the difficulties associated with RE and yet retain the flexible egalitarianism that makes it intuitively appealing to many. © 2013 John Wiley & Sons Ltd.
Dependence and risk assessment for oil prices and exchange rate portfolios: A wavelet based approach
NASA Astrophysics Data System (ADS)
Aloui, Chaker; Jammazi, Rania
2015-10-01
In this article, we propose a wavelet-based approach to accommodate the stylized facts and complex structure of financial data, caused by frequent and abrupt changes of markets and noises. Specifically, we show how the combination of both continuous and discrete wavelet transforms with traditional financial models helps improve portfolio's market risk assessment. In the empirical stage, three wavelet-based models (wavelet-EGARCH with dynamic conditional correlations, wavelet-copula, and wavelet-extreme value) are considered and applied to crude oil price and US dollar exchange rate data. Our findings show that the wavelet-based approach provides an effective and powerful tool for detecting extreme moments and improving the accuracy of VaR and Expected Shortfall estimates of oil-exchange rate portfolios after noise is removed from the original data.
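The general recipe — denoise returns with a wavelet transform, then estimate risk on the smoothed series — can be sketched with a one-level Haar transform and a plain historical VaR, standing in for the EGARCH/copula/extreme-value models the authors pair with wavelets. All data below are synthetic:

```python
import math
import random

def haar_denoise(xs, thresh):
    """One-level Haar wavelet transform (even-length input): keep the
    approximation coefficients, zero out detail coefficients smaller than
    `thresh` (hard thresholding), then invert."""
    approx = [(a + b) / math.sqrt(2) for a, b in zip(xs[::2], xs[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(xs[::2], xs[1::2])]
    detail = [d if abs(d) >= thresh else 0.0 for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
    return out

def historical_var(returns, level=0.95):
    """Historical Value-at-Risk: the loss at the (1 - level) empirical quantile."""
    return -sorted(returns)[int((1 - level) * len(returns))]

rng = random.Random(1)
returns = [rng.gauss(0.0, 0.02) for _ in range(512)]   # synthetic daily returns
smooth = haar_denoise(returns, thresh=0.01)
var95 = historical_var(smooth, 0.95)
```

With the threshold at zero the transform reconstructs the input exactly; raising it removes small high-frequency "noise" coefficients before the quantile is taken, which is the denoising step the abstract credits with improving VaR accuracy.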
Long-memory and the sea level-temperature relationship: a fractional cointegration approach.
Ventosa-Santaulària, Daniel; Heres, David R; Martínez-Hernández, L Catalina
2014-01-01
Through thermal expansion of oceans and melting of land-based ice, global warming is very likely contributing to the sea level rise observed during the 20th century. The amount by which further increases in global average temperature could affect sea level is only known with large uncertainties due to the limited capacity of physics-based models to predict sea levels from global surface temperatures. Semi-empirical approaches have been implemented to estimate the statistical relationship between these two variables providing an alternative measure on which to base potentially disrupting impacts on coastal communities and ecosystems. However, only a few of these semi-empirical applications had addressed the spurious inference that is likely to be drawn when one nonstationary process is regressed on another. Furthermore, it has been shown that spurious effects are not eliminated by stationary processes when these possess strong long memory. Our results indicate that both global temperature and sea level indeed present the characteristics of long memory processes. Nevertheless, we find that these variables are fractionally cointegrated when sea-ice extent is incorporated as an instrumental variable for temperature which in our estimations has a statistically significant positive impact on global sea level.
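Long memory of the kind tested above is commonly summarized by the Hurst exponent. A crude rescaled-range (R/S) estimator — not the fractional-cointegration machinery of the paper — can be sketched as follows, on synthetic series:

```python
import itertools
import math
import random
import statistics

def rescaled_range(xs):
    """R/S statistic: range of the cumulative mean-adjusted sums,
    divided by the standard deviation of the series."""
    m = statistics.fmean(xs)
    cum, run = 0.0, [0.0]
    for x in xs:
        cum += x - m
        run.append(cum)
    return (max(run) - min(run)) / statistics.pstdev(xs)

def hurst(xs, sizes=(16, 32, 64, 128)):
    """Crude Hurst-exponent estimate: slope of log(mean R/S) vs. log(n)
    over non-overlapping windows. H ~ 0.5 for white noise; H > 0.5
    indicates long memory."""
    pts = []
    for n in sizes:
        rs = [rescaled_range(xs[i:i + n]) for i in range(0, len(xs) - n + 1, n)]
        pts.append((math.log(n), math.log(statistics.fmean(rs))))
    mx = statistics.fmean(x for x, _ in pts)
    my = statistics.fmean(y for _, y in pts)
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))

rng = random.Random(0)
noise = [rng.gauss(0.0, 1.0) for _ in range(1024)]
walk = list(itertools.accumulate(noise))   # integrated series: strong persistence
h_noise = hurst(noise)
h_walk = hurst(walk)
```

The integrated series should score well above the white noise, illustrating the kind of persistence the authors detect in both global temperature and sea level before testing for fractional cointegration.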
NASA Astrophysics Data System (ADS)
Attard, Guillaume; Rossier, Yvan; Eisenlohr, Laurent
2017-09-01
In a previous paper published in Journal of Hydrology, it was shown that underground structures are responsible for a mixing process between shallow and deep groundwater that can favour the spreading of urban contamination. In this paper, the impact of underground structures on the intrinsic vulnerability of urban aquifers was investigated. A sensitivity analysis was performed using a 2D deterministic modelling approach based on the reservoir theory generalized to hydrodispersive systems to better understand this mixing phenomenon and the mixing-affected zone (MAZ) caused by underground structures. It was shown that the maximal extent of the MAZ caused by an underground structure is reached approximately 20 years after construction. Consequently, underground structures represent a long-term threat to deep aquifer reservoirs. Regarding the construction process, draining operations have a major impact and favour large-scale mixing between shallow and deep groundwater; dewatering should therefore be reduced and enclosed as much as possible. The role played by underground structures' dimensions was assessed: the obstruction of the first aquifer layer caused by construction has the greatest influence on the MAZ. The cumulative impact of several underground structures was assessed, and it was shown that the total MAZ area increases linearly with underground structure density. The roles played by material properties and the hydraulic gradient were also assessed: hydraulic conductivity, anisotropy, and porosity have the strongest influence on the development of the MAZ. Finally, an empirical law was derived to estimate the MAZ caused by an underground structure in a bi-layered aquifer under unconfined conditions. This empirical law, based on the results of the sensitivity analysis developed in this paper, allows for the estimation of MAZ dimensions under known material properties and underground structure dimensions.
This empirical law can help urban planners assess the area of influence of underground structures and protect urban strategic reservoirs.
Henriques, D. A.; Ladbury, J. E.; Jackson, R. M.
2000-01-01
The prediction of binding energies from the three-dimensional (3D) structure of a protein-ligand complex is an important goal of biophysics and structural biology. Here, we critically assess the use of empirical, solvent-accessible surface area-based calculations for predicting the binding of the Src-SH2 domain to a series of tyrosyl phosphopeptides based on the high-affinity ligand from the hamster middle T antigen (hmT), where the residue in the pY+3 position has been changed. Two other peptides based on the C-terminal regulatory site of the Src protein and the platelet-derived growth factor receptor (PDGFR) are also investigated. Here, we take into account the effects of proton linkage on binding, and test five different surface area-based models that include different treatments of the contributions of conformational change and protein solvation. These differences relate to the treatment of conformational flexibility in the peptide ligand and the inclusion of proximal ordered solvent molecules in the surface area calculations. This allowed the calculation of a range of thermodynamic state functions (deltaCp, deltaS, deltaH, and deltaG) directly from structure. Comparison with the experimentally derived data shows little agreement for the interaction of the Src-SH2 domain and the range of tyrosyl phosphopeptides. Furthermore, the adoption of the different models to treat conformational change and solvation has a dramatic effect on the calculated thermodynamic functions, making the predicted binding energies highly model dependent. While empirical, solvent-accessible surface area-based calculations are becoming widely adopted to interpret thermodynamic data, this study highlights potential problems with the application and interpretation of this type of approach. There is undoubtedly some agreement between predicted and experimentally determined thermodynamic parameters; however, the tolerance of this approach is not sufficient to make it ubiquitously applicable.
PMID:11106171
A framework for a teaching toolkit in entrepreneurship education.
Fellnhofer, Katharina
2017-01-01
Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a 'learning-through-real-multimedia-entrepreneurial-narratives' pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society.
Liu, Hai-Ying; Skjetne, Erik; Kobernus, Mike
2013-11-04
We propose a new approach to assess the impact of traffic-related air pollution on public health by mapping personal trajectories using mobile phone tracking technology in an urban environment. Although this approach is not based on any empirical studies, we believe that this method has great potential and deserves serious attention. Mobile phone tracking technology makes it feasible to generate millions of personal trajectories and thereby cover a large fraction of an urban population. Through analysis, personal trajectories can be associated not only with persons but also with vehicles, vehicle types, vehicle speeds, vehicle emission rates, and sources of vehicle emissions. Pollution levels can be estimated by dispersion models from calculated traffic emissions. Traffic pollution exposure to individuals can be estimated based on the exposure along the individual human trajectories in the estimated pollution concentration fields by utilizing modelling tools. By data integration, one may identify trajectory patterns of particularly exposed human groups. The approach of personal trajectories may open a new paradigm in understanding urban dynamics and new perspectives in population-wide empirical public health research. This new approach can be further applied to individual commuter route planning, land use planning, urban traffic network planning, and used by authorities to formulate air pollution mitigation policies and regulations.
PMID:24188173
Parto Dezfouli, Mohammad Ali; Dezfouli, Mohsen Parto; Rad, Hamidreza Saligheh
2014-01-01
Proton magnetic resonance spectroscopy ((1)H-MRS) is a non-invasive diagnostic tool for measuring biochemical changes in the human body. Acquired (1)H-MRS signals may be corrupted by a wideband baseline signal generated by macromolecules. Recently, several methods have been developed for the correction of such baseline signals; however, most of them cannot estimate the baseline in complex overlapped signals. In this study, a novel automatic baseline correction method is proposed for (1)H-MRS spectra based on ensemble empirical mode decomposition (EEMD). This investigation was applied to both simulated data and in-vivo (1)H-MRS human brain signals. Results justify the efficiency of the proposed method in removing the baseline from (1)H-MRS signals.
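EEMD, as used in this abstract, averages empirical mode decompositions over noise-perturbed copies of the signal. As a hedged illustration of the core sifting idea only (not the authors' method: real implementations use cubic-spline envelopes and a stopping criterion, and EEMD adds the noise ensemble on top), a minimal EMD with crude piecewise-linear envelopes might look like:

```python
import math

def local_extrema(x):
    """Indices of local maxima and minima of a sequence."""
    maxi, mini = [], []
    for i in range(1, len(x) - 1):
        if x[i - 1] < x[i] >= x[i + 1]:
            maxi.append(i)
        elif x[i - 1] > x[i] <= x[i + 1]:
            mini.append(i)
    return maxi, mini

def interp_envelope(x, idx):
    """Piecewise-linear envelope through the points (idx, x[idx]),
    held flat beyond the outermost extrema (a crude spline substitute)."""
    n = len(x)
    if not idx:
        return [0.0] * n
    pts = [(0, x[idx[0]])] + [(i, x[i]) for i in idx] + [(n - 1, x[idx[-1]])]
    env = [0.0] * n
    for (i0, y0), (i1, y1) in zip(pts, pts[1:]):
        span = max(i1 - i0, 1)
        for j in range(i0, i1 + 1):
            env[j] = y0 + (y1 - y0) * (j - i0) / span
    return env

def emd(x, max_imfs=6, sift_passes=8):
    """Minimal EMD: repeatedly sift out oscillatory components (IMFs)
    until the residual has too few extrema to continue."""
    residual = list(x)
    imfs = []
    for _ in range(max_imfs):
        maxi, mini = local_extrema(residual)
        if len(maxi) + len(mini) < 4:
            break
        h = list(residual)
        for _ in range(sift_passes):  # fixed pass count instead of an SD criterion
            hi, lo = local_extrema(h)
            if len(hi) < 2 or len(lo) < 2:
                break
            upper = interp_envelope(h, hi)
            lower = interp_envelope(h, lo)
            h = [v - (u + w) / 2 for v, u, w in zip(h, upper, lower)]
        imfs.append(h)
        residual = [r - v for r, v in zip(residual, h)]
    return imfs, residual

# Illustrative use: a fast tone riding on a slow one; the slow part
# and any wideband baseline end up in later IMFs and the residual.
t = [i / 100.0 for i in range(400)]
x = [math.sin(2 * math.pi * 5 * u) + 0.5 * math.sin(2 * math.pi * 0.5 * u) for u in t]
imfs, res = emd(x)
```

By construction the IMFs and residual sum back to the input, which is what lets baseline removal work by simply discarding the slowest components.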
Determination of a Limited Scope Network's Lightning Detection Efficiency
NASA Technical Reports Server (NTRS)
Rompala, John T.; Blakeslee, R.
2008-01-01
This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground-based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD, together with information regarding site signal detection thresholds, the type of solution algorithm used, and range attenuation, to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, this modeling technique gives an estimate of the number, strength, and distribution of events going undetected. This approach leads to a variety of event density contour maps. This application is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented. A new method for producing an analytical representation of the empirical PCD is also introduced.
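The probability-of-solution idea in this abstract can be sketched with a toy Monte Carlo: draw peak currents from an assumed PCD, attenuate with range, and count trials in which enough sites exceed their thresholds to form a solution. Everything here is an illustrative assumption (the log-normal PCD, the 1/r attenuation, and the four-site solution requirement typical of time-of-arrival algorithms), not the paper's formulation.

```python
import random

def site_detects(peak_current_ka, distance_km, threshold_ka, ref_km=100.0):
    """A site detects a flash if the range-attenuated signal exceeds its
    threshold; 1/r attenuation beyond a reference range is assumed."""
    attenuated = peak_current_ka * ref_km / max(distance_km, ref_km)
    return attenuated >= threshold_ka

def detection_efficiency(flash_xy, sites, n_trials=10000, min_sites=4, seed=0):
    """Monte Carlo detection efficiency at one location: draw peak
    currents from an assumed log-normal PCD and count trials in which
    enough sites detect the flash to produce a solution."""
    rng = random.Random(seed)
    fx, fy = flash_xy
    hits = 0
    for _ in range(n_trials):
        peak = rng.lognormvariate(3.0, 0.6)  # illustrative PCD, ~20 kA median
        n_det = sum(
            site_detects(peak, ((fx - sx) ** 2 + (fy - sy) ** 2) ** 0.5, thr)
            for sx, sy, thr in sites
        )
        if n_det >= min_sites:
            hits += 1
    return hits / n_trials

# Illustrative use: a flash inside a square of four sensitive sites.
near = [(0.0, 0.0, 5.0), (0.0, 50.0, 5.0), (50.0, 0.0, 5.0), (50.0, 50.0, 5.0)]
de = detection_efficiency((25.0, 25.0), near)
```

Sweeping `flash_xy` over a grid and contouring the result reproduces the kind of detection efficiency map the abstract describes.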
Wang, Huaqing; Li, Ruitong; Tang, Gang; Yuan, Hongfang; Zhao, Qingliang; Cao, Xi
2014-01-01
A compound fault signal usually contains multiple characteristic signals and strong confounding noise, which makes it difficult to separate weak fault signals from it through conventional means such as FFT-based envelope detection, wavelet transform, or empirical mode decomposition individually. In order to improve the diagnosis of compound faults in rolling bearings via signal separation, the present paper proposes a new method to identify compound faults from measured mixed signals, based on the ensemble empirical mode decomposition (EEMD) method and the independent component analysis (ICA) technique. With this approach, a vibration signal is first decomposed into intrinsic mode functions (IMFs) by the EEMD method to obtain multichannel signals. Then, according to a cross-correlation criterion, the corresponding IMFs are selected as the input matrix for ICA. Finally, the compound faults can be separated effectively by executing the ICA method, which makes the fault features more easily extracted and more clearly identified. Experimental results validate the effectiveness of the proposed method in separating compound faults, which works not only for the outer race defect but also for the roller defect and the unbalance fault of the experimental system. PMID:25289644
The Further Evolution of Cooperation
NASA Astrophysics Data System (ADS)
Axelrod, Robert; Dion, Douglas
1988-12-01
Axelrod's model of the evolution of cooperation was based on the iterated Prisoner's Dilemma. Empirical work following this approach has helped establish the prevalence of cooperation based on reciprocity. Theoretical work has led to a deeper understanding of the role of other factors in the evolution of cooperation: the number of players, the range of possible choices, variation in the payoff structure, noise, the shadow of the future, population dynamics, and population structure.
Extending Theory-Based Quantitative Predictions to New Health Behaviors.
Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O
2016-04-01
Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions poorly predicted effect sizes for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports the necessity of strengthening and revising theory with empirical data.
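The core move in this approach is to count a theory-based prediction as "confirmed" when it falls inside the confidence interval of the observed effect size, rather than testing against a nil null. A minimal sketch of that check (using the common large-sample variance approximation for Cohen's d; the function names and numbers are illustrative, not the paper's):

```python
import math

def effect_size_ci(d, n1, n2, z=1.96):
    """Approximate 95% CI for Cohen's d between two independent groups,
    using the standard large-sample variance approximation."""
    se = math.sqrt((n1 + n2) / (n1 * n2) + d * d / (2 * (n1 + n2)))
    return d - z * se, d + z * se

def prediction_confirmed(predicted_d, observed_d, n1, n2):
    """A theory-based prediction counts as confirmed if it lies inside
    the confidence interval of the observed effect size."""
    lo, hi = effect_size_ci(observed_d, n1, n2)
    return lo <= predicted_d <= hi

# Illustrative use: a predicted d of 0.5 against an observed d of 0.55.
ok = prediction_confirmed(0.5, 0.55, n1=200, n2=200)
```

Note the logic inverts NHST: narrow intervals (large samples) make confirmation *harder*, which is the property that makes the approach a genuine test of theory.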
Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches
NASA Technical Reports Server (NTRS)
Farassat, Fereidoun; Casper, Jay H.
2006-01-01
In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on recent high-quality acoustic databases is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need to develop further for this method to become useful. Nonetheless, the authors propose that the method based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.
Collaborative Practice: A Model of Successful Working in Schools
ERIC Educational Resources Information Center
James, C. R.; Dunning, G.; Connolly, M.; Elliott, T.
2007-01-01
Purpose: The purpose of this paper is to develop the notion of collaborative practice from theoretical and empirical bases. Design/methodology/approach: The research analysed the concepts of collaboration, reflective practice and the primary task. It also examined the ways of working of 18 primary schools in Wales where the level of student…
ERIC Educational Resources Information Center
Fidalgo, Angel M.; Ferreres, Doris; Muniz, Jose
2004-01-01
Sample-size restrictions limit the contingency table approaches based on asymptotic distributions, such as the Mantel-Haenszel (MH) procedure, for detecting differential item functioning (DIF) in many practical applications. Within this framework, the present study investigated the power and Type I error performance of empirical and inferential…
ERIC Educational Resources Information Center
Mock, Carol
Empirical hypotheses about organizational change are compared with actual case studies of change and leadership at the University of California (UC) system. The hypotheses are based on the sociological literature on complex organizations and are derived from three perspectives: (1) rational choice and analytic approaches, (2) cognitive…
Adolescents, Alcohol, and Substance Abuse: Reaching Teens through Brief Interventions.
ERIC Educational Resources Information Center
Monti, Peter M., Ed.; Colby, Suzanne M., Ed.; O'Leary, Tracy A., Ed.
This publication reviews a variety of empirically supported approaches to dealing with alcohol and drug problems in adolescents. Its focus is to provide motivationally based brief interventions that can be delivered in a variety of contexts, address key developmental considerations, and draw on the latest knowledge about the processes of addictive…
ERIC Educational Resources Information Center
Tropper, Natalie; Leiss, Dominik; Hänze, Martin
2015-01-01
Empirical findings show that students have manifold difficulties when dealing with mathematical modeling problems. Accordingly, approaches for supporting students in modeling-based learning environments have to be investigated. In the research presented here, we adopted a scaffolding perspective on teaching modeling with the aim of both providing…
Validity of the Learning Portfolio: Analysis of a Portfolio Proposal for the University
ERIC Educational Resources Information Center
Gregori-Giralt, Eva; Menéndez-Varela, José Luis
2015-01-01
Validity is a central issue in portfolio-based assessment. This empirical study used a quantitative approach to analyse the validity of the inferences drawn from a disciplinary course work portfolio assessment comprising profession-specific and learning competencies. The study also examined the problems involved in the development of the…
Computer-Assisted CBT for Child Anxiety: The Coping Cat CD-ROM
ERIC Educational Resources Information Center
Khanna, Muniya S.; Kendall, Philip C.
2008-01-01
Empirical data support the efficacy of cognitive-behavioral therapy (CBT) for child anxiety, but there is need and merit in the development and evaluation of cost-effective and transportable CBT approaches. Relatedly, a widely endorsed goal is the dissemination of evidence-based treatments from research clinics to community settings.…
History Teaching, Conflict and the Legacy of the Past
ERIC Educational Resources Information Center
McCully, Alan
2012-01-01
The article examines the utility of enquiry based, multi-perspective history teaching in divided societies and those emerging from conflict. Using findings from Northern Ireland as an example, it concludes that, while empirical research is required in a range of conflict settings, an enquiry approach, placing an emphasis on the examination of…
ERIC Educational Resources Information Center
Lucas, Nance; Goodman, Fallon R.
2015-01-01
The emerging fields of positive psychology and positive organizational scholarship (POS) contribute new perspectives and approaches for leadership education and leadership development in higher education. While there are emerging empirical studies in these new fields, little connection has been made to the intellectual and practical applications…
Does Inquiry Based Learning Affect Students' Beliefs and Attitudes towards Mathematics?
ERIC Educational Resources Information Center
McGregor, Darren
2014-01-01
Ill-structured tasks presented in an inquiry learning environment have the potential to affect students' beliefs and attitudes towards mathematics. This empirical research followed a Design Experiment approach to explore how aspects of using ill-structured tasks may have affected students' beliefs and attitudes. Results showed this task type and…
Alignment 2.0: Strategic Use of New Internet Technologies in Government
ERIC Educational Resources Information Center
Meijer, Albert; Thaens, Marcel
2010-01-01
This paper challenges the view that strategies for using Web 2.0 should primarily be based upon technological characteristics. The value of the organizational strategic alignment approach for developing specific operational Web 2.0 strategies for government organizations is explored both theoretically and empirically. On the basis of a review of…
An Empirical Investigation of Change in MCAT Scores upon Retest.
ERIC Educational Resources Information Center
Hynes, Kevin; Givner, Nathaniel
1980-01-01
An investigation of Medical College Admission Test (MCAT) retest scores indicates that limited retest improvement may result when initial scores are fairly low or below what might be predicted based on grade point averages. However, when initial scores approach the national, standardized MCAT mean, or are above what might be predicted, significant…
Factors Affecting University Students' Use of Moodle: An Empirical Study Based on TAM
ERIC Educational Resources Information Center
Essel, Daniel Danso; Wilson, Osafo Apeanti
2017-01-01
Higher education institutions are faced with the complex challenges of serving increased enrollment levels within tight budgets. This challenge is prompting many universities to explore new approaches including the use of Learning Management Systems (LMS) such as Moodle for delivering courses to help extend teaching and learning beyond the…
Applying Generalizability Theory To Evaluate Treatment Effect in Single-Subject Research.
ERIC Educational Resources Information Center
Lefebvre, Daniel J.; Suen, Hoi K.
An empirical investigation of methodological issues associated with evaluating treatment effect in single-subject research (SSR) designs is presented. This investigation: (1) conducted a generalizability (G) study to identify the sources of systematic and random measurement error (SRME); (2) used an analytic approach based on G theory to integrate…
Estimating Uncertainty in N2O Emissions from US Cropland Soils
USDA-ARS?s Scientific Manuscript database
A Monte Carlo analysis was combined with an empirically-based approach to quantify uncertainties in soil N2O emissions from US croplands estimated with the DAYCENT simulation model. Only a subset of croplands was simulated in the Monte Carlo analysis which was used to infer uncertainties across the ...
Development and Validation of an Observation System for Analyzing Teaching Roles.
ERIC Educational Resources Information Center
Southwell, Reba K.; Webb, Jeaninne N.
The construction and validation of a theoretically based sign system for the analysis of teaching roles in childhood education is described. Theoretical and empirical approaches to validation were developed. In the first, the general concept of teacher role was identified as a viable construct for investigating characteristic patterns of classroom…
ERIC Educational Resources Information Center
Ulvund, Stein Erik
1982-01-01
Argues that in analyzing effects of early experience on development of cognitive competence, theoretical analyses as well as empirical investigations should be based on a transactional model of development. Shows that the optimal stimulation hypothesis, particularly the enhancement prediction, seems to represent a transactional approach to the study of…
ERIC Educational Resources Information Center
DeLyser, Dydia; Potter, Amy E.
2013-01-01
This article describes experiential-learning approaches to conveying the work and rewards involved in qualitative research. Seminar students interviewed one another, transcribed or took notes on those interviews, shared those materials to create a set of empirical materials for coding, developed coding schemes, and coded the materials using those…
A Common Factors Approach to Supporting University Students Experiencing Psychological Distress
ERIC Educational Resources Information Center
Surette, Tanya E.; Shier, Micheal L.
2017-01-01
This study empirically assessed the applicability of the common factors model to students accessing university-based counseling (n = 102). Participants rated symptoms of depression, anxiety, and somatization at intake and discharge. Therapists kept detailed session notes on client factors and therapy process variables. Data were analyzed utilizing…
ERIC Educational Resources Information Center
Poitras, Eric G.; Lajoie, Susanne P.; Doleck, Tenzin; Jarrell, Amanda
2016-01-01
Learner modeling, a challenging and complex endeavor, is an important and oft-studied research theme in computer-supported education. From this perspective, Educational Data Mining (EDM) research has focused on modeling and comprehending various dimensions of learning in computer-based learning environments (CBLE). Researchers and designers are…
Parent-Centered Intervention: A Practical Approach for Preventing Drug Abuse in Hispanic Adolescents
ERIC Educational Resources Information Center
Tapia, Maria I.; Schwartz, Seth J.; Prado, Guillermo; Lopez, Barbara; Pantin, Hilda
2006-01-01
Objective: The objective of the present article is to review and discuss Familias Unidas, an empirically supported, family-based, culturally specific drug abuse and HIV prevention intervention for Hispanic immigrant adolescents and their families. Method: The authors focus on engagement and retention as well as on intervention delivery.…
Putting Empirical Knowledge to Work: Linking Research and Programming on Marital Quality
ERIC Educational Resources Information Center
Adler-Baeder, Francesca; Higginbotham, Brian; Lamke, Leanne
2004-01-01
When selecting a marriage education curriculum, educators can turn to programs that have been evaluated for effectiveness; however, few curricula have undergone such study. An alternative approach, consistent with best practices, is to ensure a research base for program content. A translation process model is offered as an initial attempt to…
ERIC Educational Resources Information Center
Jennings, Jerry L.; Apsche, Jack A.; Blossom, Paige; Bayles, Corliss
2013-01-01
Although mindfulness has become a mainstream methodology in mental health treatment, it is a relatively new approach with adolescents, and perhaps especially youth with sexual behavior problems. Nevertheless, clinical experience and several empirical studies are available to show the effectiveness of a systematic mindfulness-based methodology for…
Cognitive Therapy for Obsessive-Compulsive Disorder: A Case Example
ERIC Educational Resources Information Center
Chosak, Anne; Marques, Luana; Fama, Jeanne; Renaud, Stefanie; Wilhelm, Sabine
2009-01-01
Cognitive therapy for OCD is an empirically validated alternative to the more widely used and validated behavioral therapy for OCD. The cognitive approach is based on the premise that belief systems contribute importantly to the development and maintenance of all types of OCD. By identifying and challenging maladaptive thoughts, beliefs, and core…
Identifying Threshold Concepts: Case Study of an Open Catchment Hydraulics Course
ERIC Educational Resources Information Center
Knight, D. B.; Callaghan, D. P.; Baldock, T. E.; Meyer, J. H. F.
2014-01-01
The Threshold Concept Framework is used to initiate a dialogue on an empirically supported pedagogy that focuses on students' conceptual understanding required for solving application-based problems. The present paper uses a triangulation approach to identify the threshold concept in a third-year undergraduate civil engineering course on open…
ERIC Educational Resources Information Center
Oordt, Mark S.; Jobes, David A.; Fonseca, Vincent P.; Schmidt, Steven M.
2009-01-01
Remarkably little systematic research has studied the effects of clinical suicidology training on changing practitioner attitudes and behaviors. In the current study we investigated whether training in an empirically-based assessment and treatment approach to suicidal patients administered through a continuing education workshop could meaningfully…
Constrained range expansion and climate change assessments
Yohay Carmel; Curtis H. Flather
2006-01-01
Modeling the future distribution of keystone species has proved to be an important approach to assessing the potential ecological consequences of climate change (Loehle and LeBlanc 1996; Hansen et al. 2001). Predictions of range shifts are typically based on empirical models derived from simple correlative relationships between climatic characteristics of occupied and...
Social Learning Theory: Toward a Unified Approach of Pediatric Procedural Pain
ERIC Educational Resources Information Center
Page, Lynn Olson; Blanchette, Jennifer A.
2009-01-01
Undermanaged procedural pain has been shown to have short and long term effects on children. While significant progress regarding empirically supported treatments has been made, theoretical bases for the development and management of procedural pain are lacking. This paper examines the role of social learning theory in our current understanding of…
The Treatment of Eating Disorder Clients in a Community-Based Partial Hospitalization Program.
ERIC Educational Resources Information Center
Levitt, John L.; Sansone, Randy A.
2003-01-01
Outlines a multi-faceted treatment approach to eating disorders within a partial hospital program that is affiliated with a community mental health hospital. Although empirical confirmation is not currently available, initial clinical impressions indicate that the program is facilitating the recovery of these difficult-to-treat individuals.…
Beauty in the Context of Particular Lives
ERIC Educational Resources Information Center
Rautio, Pauliina
2010-01-01
This paper is based on empirical research by the author into the everyday lives of people living in a small village. Everyday life is approached as a subjective process in time and space, experienced by particular people in a particular environmental and social context. The data of this research has been collected through correspondence in which…
ERIC Educational Resources Information Center
Confer, Jacob Russell
2013-01-01
The symptoms, assessment, and treatments of Post Traumatic Stress Disorder (PTSD) have been empirically investigated to the extent that there is a breadth of valid and reliable instruments investigating this psychopathological syndrome. There, too, exists a substantial evidence base for various treatment models demonstrating effectiveness in…
ERIC Educational Resources Information Center
Schultze-Krumbholz, Anja; Göbel, Kristin; Scheithauer, Herbert; Brighi, Antonella; Guarini, Annalisa; Tsorbatzoudis, Haralambos; Barkoukis, Vassilis; Pyzalski, Jacek; Plichta, Piotr; Del Rey, Rosario; Casas, José A.; Thompson, Fran; Smith, Peter K.
2015-01-01
In recently published studies on cyberbullying, students are frequently categorized into distinct (cyber)bully and (cyber)victim clusters based on theoretical assumptions and arbitrary cut-off scores adapted from traditional bullying research. The present study identified involvement classes empirically using latent class analysis (LCA), to…
Barriers to Adoption of Technology-Mediated Distance Education in Higher-Education Institutions
ERIC Educational Resources Information Center
Chen, Baiyun
2009-01-01
The purpose of the study was to empirically investigate the institutional approach to distance education, and examine whether the factors of concerns for program cost and faculty participation could statistically predict adoption of technology-mediated distance education (TMDE) among higher-education institutions. It is elusive to base the…
ERIC Educational Resources Information Center
Nesic, Sasa; Gasevic, Dragan; Jazayeri, Mehdi; Landoni, Monica
2011-01-01
Semantic web technologies have been applied to many aspects of learning content authoring including semantic annotation, semantic search, dynamic assembly, and personalization of learning content. At the same time, social networking services have started to play an important role in the authoring process by supporting authors' collaborative…
Migration, Remittances and Educational Outcomes: The Case of Haiti
ERIC Educational Resources Information Center
Bredl, Sebastian
2011-01-01
This paper empirically investigates how migration and the receipt of remittances affect educational outcomes in Haiti. Based on a theoretical approach it tries to disentangle the effects of both phenomena that have mostly been jointly modeled in previous literature. The results suggest that remittances play an important role for poor households in…
ERIC Educational Resources Information Center
Caison, Amy L.
2007-01-01
This study empirically explores the comparability of traditional survey-based retention research methodology with an alternative approach that relies on data commonly available in institutional student databases. Drawing on Tinto's [Tinto, V. (1993). "Leaving College: Rethinking the Causes and Cures of Student Attrition" (2nd Ed.), The University…
Behavioral Profiles in 4-5 Year-Old Children: Normal and Pathological Variants
ERIC Educational Resources Information Center
Larsson, Jan-Olov; Bergman, Lars R.; Earls, Felton; Rydelius, Per-Anders
2004-01-01
Normal and psychopathological patterns of behavior symptoms in preschool children were described by a classification approach using cluster analysis. The behavior of 406 children, average age 4 years 9 months, from the general population was evaluated at home visits. Seven clusters were identified based on empirically defined dimensions:…
Writing Instruction in First Grade: An Observational Study
ERIC Educational Resources Information Center
Coker, David L., Jr.; Farley-Ripple, Elizabeth; Jackson, Allison F.; Wen, Huijing; MacArthur, Charles A.; Jennings, Austin S.
2016-01-01
As schools work to meet the ambitious Common Core State Standards in writing (Common Core State Standards Initiative, 2010), instructional approaches are likely to be examined. However, there is little research that describes the current state of instruction. This study was designed to expand the empirical base on writing instruction in first…
ERIC Educational Resources Information Center
Moses, Tim
2011-01-01
The purpose of this study was to consider the relationships of prediction, measurement, and scaling invariance when these invariances were simultaneously evaluated in psychometric test data. An approach was developed to evaluate prediction, measurement, and scaling invariance based on linear and nonlinear prediction, measurement, and scaling…
NASA Astrophysics Data System (ADS)
Ren, Wenyi; Cao, Qizhi; Wu, Dan; Jiang, Jiangang; Yang, Guoan; Xie, Yingge; Wang, Guodong; Zhang, Sheqi
2018-01-01
Many observers using interference imaging spectrometers are plagued by the fringe-like pattern (FP) that occurs at optical wavelengths in the red and near-infrared region. It complicates data processing tasks such as spectrum calibration and information retrieval. An adaptive method based on bi-dimensional empirical mode decomposition was developed to suppress the nonlinear FP in a polarization interference imaging spectrometer. The FP and the corrected interferogram were separated effectively. Meanwhile, the stripes introduced by the CCD mosaic were suppressed. The nonlinear interferogram background removal and the spectrum distortion correction were implemented as well. This provides an alternative method to adaptively suppress the nonlinear FP without prior experimental data or knowledge. The approach is potentially a powerful tool in the fields of Fourier transform spectroscopy, holographic imaging, optical measurement based on moiré fringes, etc.
A theory-based approach to understanding suicide risk in shelter-seeking women.
Wolford-Clevenger, Caitlin; Smith, Phillip N
2015-04-01
Women seeking shelter from intimate partner violence are at an increased risk for suicide ideation and attempts compared to women in the general population. Control-based violence, which is common among shelter-seeking women, may play a pivotal role in the development of suicide ideation and attempts. Current risk assessment and management practices for shelter-seeking women are limited by the lack of an empirically grounded understanding of increased risk in this population. We argue that in order to more effectively promote risk assessment and management, an empirically supported theory that is sensitive to the experiences of shelter-seeking women is needed. Such a theory-driven approach has the benefits of identifying and prioritizing targetable areas for intervention. Here, we review the evidence for the link between coercive control and suicide ideation and attempts from the perspective of Baumeister's escape theory of suicide. This theory has the potential to explain the role of coercive control in the development of suicide ideation and eventual attempts in shelter-seeking women. Implications for suicide risk assessment and prevention in domestic violence shelters are discussed. © The Author(s) 2014.
The complexity of child protection recurrence: The case for a systems approach.
Jenkins, Brian Q; Tilbury, Clare; Mazerolle, Paul; Hayes, Hennessey
2017-01-01
Research on child protection recurrence has found consistent child, family, and case characteristics associated with repeated involvement with the child protection system. Despite the considerable body of empirical research, knowledge about why recurrence occurs, and what can be done to reduce it, is limited. This paper reviews the empirical literature and analyses the approaches of prior recurrence research. Four related conceptual challenges are identified: (1) a tendency to conflate child protection recurrence with repeated child maltreatment; (2) uncertainty about how best to operationalize and measure child protection recurrence in research; (3) inconsistency between prevailing explanations for the most frequently observed patterns of recurrence; and (4) difficulty in developing coherent strategies to address child protection recurrence based on research. Addressing these challenges requires a greater consideration of the effects of decision-making in the child protection system on recurrence. This paper proposes a methodology based in systems theory and drawing on existing administrative data to examine the characteristics of the child protection system that may also produce recurrence. Copyright © 2016 Elsevier Ltd. All rights reserved.
[Quantity versus quality: a review on current methodological dispute in health services research].
Sikorski, Claudia; Glaesmer, Heide; Bramesfeld, Anke
2010-10-01
The aim of this study was to determine the percentage of qualitative and quantitative research papers on health services research in two German journals. All publications of the two journals were reviewed; only empirical research papers were included. It was then assessed whether they dealt with health services research and what methodology was used to collect and analyse data. About half of all published empirical papers dealt with health services research. Of those, slightly over 20% used qualitative methods at least partially. Ordered by topic, qualitative data collection and analysis is especially common in the fields of phenomenology, treatment determinants and treatment outcome. Purely qualitative methodology is still rather seldom used in health services research. Attempts to include both quantitative and qualitative approaches are limited to sequential designs, lowering the independent value of both approaches. The concept of triangulation offers the possibility of overcoming paradigm-based dichotomies. However, the choice of methodology ought to be based primarily on the research question. © Georg Thieme Verlag KG Stuttgart · New York.
Creasy, Arch; Reck, Jason; Pabst, Timothy; Hunter, Alan; Barker, Gregory; Carta, Giorgio
2018-05-29
A previously developed empirical interpolation (EI) method is extended to predict highly overloaded multicomponent elution behavior on a cation exchange (CEX) column based on batch isotherm data. Instead of a fully mechanistic model, the EI method employs an empirically modified multicomponent Langmuir equation to correlate two-component adsorption isotherm data at different salt concentrations. Piecewise cubic interpolating polynomials are then used to predict competitive binding at intermediate salt concentrations. The approach is tested for the separation of monoclonal antibody monomer and dimer mixtures by gradient elution on the cation exchange resin Nuvia HR-S. Adsorption isotherms are obtained over a range of salt concentrations with varying monomer and dimer concentrations. Coupled with a lumped kinetic model, the interpolated isotherms predict the column behavior for highly overloaded conditions. Predictions based on the EI method showed good agreement with experimental elution curves for protein loads up to 40 mg/mL column or about 50% of the column binding capacity. The approach can be extended to other chromatographic modalities and to more than two components. This article is protected by copyright. All rights reserved.
An objective analysis of the dynamic nature of field capacity
NASA Astrophysics Data System (ADS)
Twarakavi, Navin K. C.; Sakai, Masaru; Šimůnek, Jirka
2009-10-01
Field capacity is one of the most commonly used, and yet poorly defined, soil hydraulic properties. Traditionally, field capacity has been defined as the amount of soil moisture after excess water has drained away and the rate of downward movement has materially decreased. Unfortunately, this qualitative definition does not lend itself to an unambiguous quantitative approach for estimation. Because of the vagueness in defining what constitutes "drainage of excess water" from a soil, the estimation of field capacity has often been based upon empirical guidelines. These empirical guidelines are either time, pressure, or flux based. In this paper, we developed a numerical approach to estimate field capacity using a flux-based definition. The resulting approach was implemented on the soil parameter data set used by Schaap et al. (2001), and the estimated field capacity was compared to traditional definitions of field capacity. The developed modeling approach was implemented using the HYDRUS-1D software with the capability of simultaneously estimating field capacity for multiple soils with soil hydraulic parameter data. The Richards equation was used in conjunction with the van Genuchten-Mualem model to simulate variably saturated flow in a soil. Using the modeling approach to estimate field capacity also resulted in additional information such as (1) the pressure head, at which field capacity is attained, and (2) the drainage time needed to reach field capacity from saturated conditions under nonevaporative conditions. We analyzed the applicability of the modeling-based approach to estimate field capacity on real-world soils data. We also used the developed method to create contour diagrams showing the variation of field capacity with texture. It was found that using benchmark pressure heads to estimate field capacity from the retention curve leads to inaccurate results. 
Finally, a simple analytical equation was developed to predict field capacity from soil hydraulic parameter information. The analytical equation was found to be effective in its ability to predict field capacities.
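The retention-curve side of the approach above can be sketched numerically. The snippet below evaluates the van Genuchten (1980) retention function at the common -330 cm "field capacity" benchmark head; the parameter values are illustrative loam-like numbers, not taken from the study (which argues precisely that such benchmark heads can mislead):

```python
import numpy as np

def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Water content theta(h) for pressure head h (negative, cm)
    using the van Genuchten (1980) retention curve."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)  # effective saturation
    return theta_r + (theta_s - theta_r) * se

# Illustrative loam-like parameters (hypothetical, not from the study)
theta_r, theta_s, alpha, n = 0.078, 0.43, 0.036, 1.56

# Water content at the traditional -330 cm field-capacity benchmark
fc_330 = van_genuchten_theta(-330.0, theta_r, theta_s, alpha, n)
```

The study's flux-based definition instead requires solving the Richards equation (e.g., in HYDRUS-1D); the retention curve alone, as shown here, is what the benchmark-pressure-head shortcut relies on.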
Modeling the erythemal surface diffuse irradiance fraction for Badajoz, Spain
NASA Astrophysics Data System (ADS)
Sanchez, Guadalupe; Serrano, Antonio; Cancillo, María Luisa
2017-10-01
Despite its important role in human health and numerous biological processes, the diffuse component of erythemal ultraviolet irradiance (UVER) is scarcely measured at standard radiometric stations and therefore needs to be estimated. This study proposes and compares 10 empirical models to estimate the UVER diffuse fraction. These models are inspired by mathematical expressions originally used to estimate the total diffuse fraction, but here they are applied to the UVER case and tested against experimental measurements. In addition to adapting the various independent variables involved in these models to the UVER range, the total ozone column has been added in order to account for its strong impact on the attenuation of ultraviolet radiation. The proposed models are fitted to experimental measurements and validated against an independent subset. The best-performing model (RAU3) is based on a model proposed by Ruiz-Arias et al. (2010) and shows an r2 of 0.91 and a relative root-mean-square error (rRMSE) of 6.1 %. This entirely empirical model performs better than previous semi-empirical approaches and needs no additional information from physically based models. This study extends previous research to the ultraviolet range and provides reliable empirical models to accurately estimate the UVER diffuse fraction.
Linardon, Jake; Fairburn, Christopher G; Fitzsimmons-Craft, Ellen E; Wilfley, Denise E; Brennan, Leah
2017-12-01
Although third-wave behaviour therapies are being increasingly used for the treatment of eating disorders, their efficacy is largely unknown. This systematic review and meta-analysis aimed to examine the empirical status of these therapies. Twenty-seven studies met full inclusion criteria. Only 13 randomized controlled trials (RCT) were identified, most on binge eating disorder (BED). Pooled within- (pre-post change) and between-groups effect sizes were calculated for the meta-analysis. Large pre-post symptom improvements were observed for all third-wave treatments, including dialectical behaviour therapy (DBT), schema therapy (ST), acceptance and commitment therapy (ACT), mindfulness-based interventions (MBI), and compassion-focused therapy (CFT). Third-wave therapies were not superior to active comparisons generally, or to cognitive-behaviour therapy (CBT) in RCTs. Based on our qualitative synthesis, none of the third-wave therapies meet established criteria for an empirically supported treatment for particular eating disorder subgroups. Until further RCTs demonstrate the efficacy of third-wave therapies for particular eating disorder subgroups, the available data suggest that CBT should retain its status as the recommended treatment approach for bulimia nervosa (BN) and BED, and the front running treatment for anorexia nervosa (AN) in adults, with interpersonal psychotherapy (IPT) considered a strong empirically-supported alternative. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bora, S. S.; Scherbaum, F.; Kuehn, N. M.; Stafford, P.; Edwards, B.
2014-12-01
In a probabilistic seismic hazard assessment (PSHA) framework, it remains a challenge to adjust ground motion prediction equations (GMPEs) for application in different seismological environments. In this context, this study presents a complete framework for developing a response spectral GMPE that is easily adjustable to different seismological conditions and does not suffer from the technical problems associated with adjustment in the response spectral domain. Essentially, the approach consists of an empirical FAS (Fourier Amplitude Spectrum) model and a ground-motion duration model, which are combined within the random vibration theory (RVT) framework to obtain the full response spectral ordinates. Additionally, the FAS of individual acceleration records are extrapolated beyond the frequency range defined by the data using the stochastic FAS model obtained by inversion, as described in Edwards & Fäh (2013). To that end, an empirical duration model, tuned to optimize the fit between RVT-based and observed response spectral ordinates at each oscillator frequency, is derived. Although the main motive of the presented approach was to address the adjustability issues of response spectral GMPEs, comparison of median predicted response spectra with other regional models indicates that the presented approach can also be used as a stand-alone model. Besides that, a significantly lower aleatory variability (σ < 0.5 in log units) at shorter periods, in comparison to other regional models, makes it a potentially viable alternative to classical regression-based GMPEs (on response spectral ordinates) for seismic hazard studies in the near future. The dataset used for the presented analysis is a subset of the recently compiled database RESORCE-2012, covering Europe, the Middle East, and the Mediterranean region.
NASA Astrophysics Data System (ADS)
Van doninck, Jasper; Tuomisto, Hanna
2017-06-01
Biodiversity mapping in extensive tropical forest areas poses a major challenge for the interpretation of Landsat images, because floristically clearly distinct forest types may show little difference in reflectance. In such cases, the effects of the bidirectional reflectance distribution function (BRDF) can be sufficiently strong to cause erroneous image interpretation and classification. Since the opening of the Landsat archive in 2008, several BRDF normalization methods for Landsat have been developed. The simplest of these consist of an empirical view angle normalization, whereas more complex approaches apply the semi-empirical Ross-Li BRDF model and the MODIS MCD43-series of products to normalize directional Landsat reflectance to standard view and solar angles. Here we quantify the effect of surface anisotropy on Landsat TM/ETM+ images over old-growth Amazonian forests, and evaluate five angular normalization approaches. Even for the narrow swath of the Landsat sensors, we observed directional effects in all spectral bands. Those normalization methods that are based on removing the surface reflectance gradient as observed in each image were adequate to normalize TM/ETM+ imagery to nadir viewing, but were less suitable for multitemporal analysis when the solar vector varied strongly among images. Approaches based on the MODIS BRDF model parameters successfully reduced directional effects in the visible bands, but removed only half of the systematic errors in the infrared bands. The best results were obtained when the semi-empirical BRDF model was calibrated using pairs of Landsat observations. This method produces a single set of BRDF parameters, which can then be used to operationally normalize Landsat TM/ETM+ imagery over Amazonian forests to nadir viewing and a standard solar configuration.
Linear dynamical modes as new variables for data-driven ENSO forecast
NASA Astrophysics Data System (ADS)
Gavrilov, Andrey; Seleznev, Aleksei; Mukhin, Dmitry; Loskutov, Evgeny; Feigin, Alexander; Kurths, Juergen
2018-05-01
A new data-driven model for analysis and prediction of spatially distributed time series is proposed. The model is based on a linear dynamical mode (LDM) decomposition of the observed data which is derived from a recently developed nonlinear dimensionality reduction approach. The key point of this approach is its ability to take into account simple dynamical properties of the observed system by means of revealing the system's dominant time scales. The LDMs are used as new variables for empirical construction of a nonlinear stochastic evolution operator. The method is applied to the sea surface temperature anomaly field in the tropical belt where the El Nino Southern Oscillation (ENSO) is the main mode of variability. The advantage of LDMs versus traditionally used empirical orthogonal function decomposition is demonstrated for this data. Specifically, it is shown that the new model has a competitive ENSO forecast skill in comparison with the other existing ENSO models.
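The traditional empirical orthogonal function (EOF) decomposition that the LDM approach is compared against can be sketched as an SVD of the centered data matrix. The toy "spatial field" below is synthetic, standing in for the SST anomaly data; it is not the LDM method itself, only the baseline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spatially distributed time series: 200 time steps x 50 grid points,
# built from two known spatial patterns plus weak noise.
t = np.linspace(0.0, 20.0, 200)
x = np.linspace(0.0, 1.0, 50)
X = (np.outer(np.sin(t), np.cos(2 * np.pi * x))
     + 0.5 * np.outer(np.cos(0.5 * t), np.sin(4 * np.pi * x))
     + 0.05 * rng.standard_normal((200, 50)))

# EOF decomposition: SVD of the centered (time-mean removed) data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = U * s          # principal component time series
eofs = Vt            # spatial patterns (EOFs), one per row

# Variance fraction captured by the two leading EOFs
explained = (s[:2] ** 2).sum() / (s ** 2).sum()
```

EOFs rank patterns purely by explained variance; the abstract's point is that LDMs instead account for the system's dominant time scales, which plain SVD ignores.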
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boero, Riccardo; Edwards, Brian Keith
Economists use computable general equilibrium (CGE) models to assess how economies react and self-organize after changes in policies, technology, and other exogenous shocks. CGE models are equation-based, empirically calibrated, and inspired by Neoclassical economic theory. The focus of this work was to validate the National Infrastructure Simulation and Analysis Center (NISAC) CGE model and apply it to the problem of assessing the economic impacts of severe events. We used the 2012 Hurricane Sandy event as our validation case. In particular, this work first introduces the model and then describes the validation approach and the empirical data available for studying the event of focus. Shocks to the model are then formalized and applied. Finally, model results and limitations are presented and discussed, pointing out both the model's degree of accuracy and the assessed total damage caused by Hurricane Sandy.
Inferring causal molecular networks: empirical assessment through a community-based effort.
Hill, Steven M; Heiser, Laura M; Cokelaer, Thomas; Unger, Michael; Nesser, Nicole K; Carlin, Daniel E; Zhang, Yang; Sokolov, Artem; Paull, Evan O; Wong, Chris K; Graim, Kiley; Bivol, Adrian; Wang, Haizhou; Zhu, Fan; Afsari, Bahman; Danilova, Ludmila V; Favorov, Alexander V; Lee, Wai Shing; Taylor, Dane; Hu, Chenyue W; Long, Byron L; Noren, David P; Bisberg, Alexander J; Mills, Gordon B; Gray, Joe W; Kellen, Michael; Norman, Thea; Friend, Stephen; Qutub, Amina A; Fertig, Elana J; Guan, Yuanfang; Song, Mingzhou; Stuart, Joshua M; Spellman, Paul T; Koeppl, Heinz; Stolovitzky, Gustavo; Saez-Rodriguez, Julio; Mukherjee, Sach
2016-04-01
It remains unclear whether causal, rather than merely correlational, relationships in molecular networks can be inferred in complex biological settings. Here we describe the HPN-DREAM network inference challenge, which focused on learning causal influences in signaling networks. We used phosphoprotein data from cancer cell lines as well as in silico data from a nonlinear dynamical model. Using the phosphoprotein data, we scored more than 2,000 networks submitted by challenge participants. The networks spanned 32 biological contexts and were scored in terms of causal validity with respect to unseen interventional data. A number of approaches were effective, and incorporating known biology was generally advantageous. Additional sub-challenges considered time-course prediction and visualization. Our results suggest that learning causal relationships may be feasible in complex settings such as disease states. Furthermore, our scoring approach provides a practical way to empirically assess inferred molecular networks in a causal sense.
Landslide Hazard Probability Derived from Inherent and Dynamic Determinants
NASA Astrophysics Data System (ADS)
Strauch, Ronda; Istanbulluoglu, Erkan
2016-04-01
Landslide hazard research has typically been conducted independently from hydroclimate research. We unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach combines an empirical inherent landslide probability with a numerical dynamic probability, generated by combining routed recharge from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model run in a Monte Carlo simulation. Landslide hazard mapping is advanced by adjusting the dynamic model of stability with an empirically-based scalar representing the inherent stability of the landscape, creating a probabilistic quantitative measure of geohazard prediction at a 30-m resolution. Climatology, soil, and topography control the dynamic nature of hillslope stability and the empirical information further improves the discriminating ability of the integrated model. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex, a rugged terrain with nearly 2,700 m (9,000 ft) of vertical relief, covering 2757 sq km (1064 sq mi) in northern Washington State, U.S.A.
Hybrid BEM/empirical approach for scattering of correlated sources in rocket noise prediction
NASA Astrophysics Data System (ADS)
Barbarino, Mattia; Adamo, Francesco P.; Bianco, Davide; Bartoccini, Daniele
2017-09-01
Empirical models such as the Eldred standard model are commonly used for rocket noise prediction. Such models directly provide a definition of the Sound Pressure Level through the quadratic pressure term of uncorrelated sources. In this paper, an improvement of the Eldred standard model has been formulated. The new formulation contains an explicit expression for the acoustic pressure of each noise source, in terms of amplitude and phase, in order to investigate source correlation effects and to propagate them through a wave equation. In particular, the correlation effects between adjacent and non-adjacent sources have been modeled and analyzed. The noise prediction obtained with the revised Eldred-based model has then been used to formulate a hybrid empirical/BEM (Boundary Element Method) approach that allows an evaluation of scattering effects. In the framework of the European Space Agency funded programme VECEP (VEga Consolidation and Evolution Programme), these models have been applied to the prediction of the aeroacoustic loads of the VEGA (Vettore Europeo di Generazione Avanzata - Advanced Generation European Carrier Rocket) launch vehicle at lift-off, and the results have been compared with experimental data.
Enhancing the Impact of Family Justice Centers via Motivational Interviewing: An Integrated Review.
Simmons, Catherine A; Howell, Kathryn H; Duke, Michael R; Beck, J Gayle
2016-12-01
The Family Justice Center (FJC) model is an approach to assisting survivors of intimate partner violence (IPV) that focuses on integration of services under one roof and co-location of staff members from a range of multidisciplinary agencies. Even though the FJC model is touted as a best practice strategy to help IPV survivors, empirical support for the effectiveness of this approach is scarce. The current article consolidates this small yet promising body of empirically based literature in a clinically focused review. Findings point to the importance of integrating additional resources into the FJC model to engage IPV survivors who have ambivalent feelings about whether to accept help, leave the abusive relationship, and/or participate in criminal justice processes to hold the offender accountable. One such resource, motivational interviewing (MI), holds promise in aiding IPV survivors with these decisions, but empirical investigation into how MI can be incorporated into the FJC model has yet to be published. This article, therefore, also integrates the body of literature supporting the FJC model with the body of literature supporting MI with IPV survivors. Implications for practice, policy, and research are incorporated throughout this review. © The Author(s) 2015.
Novak, M.; Wootton, J.T.; Doak, D.F.; Emmerson, M.; Estes, J.A.; Tinker, M.T.
2011-01-01
How best to predict the effects of perturbations to ecological communities has been a long-standing goal for both applied and basic ecology. This quest has recently been revived by new empirical data, new analysis methods, and increased computing speed, with the promise that ecologically important insights may be obtainable from a limited knowledge of community interactions. We use empirically based and simulated networks of varying size and connectance to assess two limitations to predicting perturbation responses in multispecies communities: (1) the inaccuracy by which species interaction strengths are empirically quantified and (2) the indeterminacy of species responses due to indirect effects associated with network size and structure. We find that even modest levels of species richness and connectance (≈25 pairwise interactions) impose high requirements for interaction strength estimates because system indeterminacy rapidly overwhelms predictive insights. Nevertheless, even poorly estimated interaction strengths provide greater average predictive certainty than an approach that uses only the sign of each interaction. Our simulations provide guidance in dealing with the trade-offs involved in maximizing the utility of network approaches for predicting dynamics in multispecies communities. © 2011 by the Ecological Society of America.
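The press-perturbation machinery this line of work builds on can be sketched with the classic community-matrix result (net effects as the negative inverse of the interaction matrix). The matrix below is a random toy example, not data from the study, and the noise level is an arbitrary stand-in for empirical estimation error:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy community (Jacobian) matrix for 5 species: random off-diagonal
# interaction strengths, self-limitation on the diagonal (illustrative).
n = 5
A = 0.3 * rng.standard_normal((n, n))
np.fill_diagonal(A, -1.0)

# Net long-term response to a sustained (press) perturbation:
# the classic negative-inverse result from community-matrix theory.
net = -np.linalg.inv(A)

# Mimic empirical error in the interaction-strength estimates and ask
# how often the *sign* of each predicted net effect survives the noise.
A_est = A + 0.05 * rng.standard_normal((n, n))
net_est = -np.linalg.inv(A_est)
sign_agreement = np.mean(np.sign(net) == np.sign(net_est))
```

Because `net` involves a matrix inverse, small errors in `A` propagate through every indirect pathway, which is the indeterminacy the abstract describes growing with richness and connectance.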
NASA Astrophysics Data System (ADS)
Butler, Samuel D.; Marciniak, Michael A.
2014-09-01
Since the development of the Torrance-Sparrow bidirectional reflectance distribution function (BRDF) model in 1967, several BRDF models have been created. Previous attempts to categorize BRDF models have relied upon somewhat vague descriptors, such as empirical, semi-empirical, and experimental. Our approach is instead to categorize BRDF models based on functional form: microfacet normal distribution, geometric attenuation, directional-volumetric and Fresnel terms, and cross section conversion factor. Several popular microfacet models are compared to a standardized notation for a microfacet BRDF model. A library of microfacet model components is developed, allowing for the creation of unique microfacet models driven by experimentally measured BRDFs.
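The component-wise decomposition described above can be illustrated by assembling a specular microfacet BRDF from interchangeable D, G, and F terms. The particular choices below (Beckmann distribution, Torrance-Sparrow geometric attenuation, Schlick Fresnel) are standard textbook components, not necessarily the exact library entries from the paper:

```python
import math

def beckmann_d(cos_h, roughness):
    """Beckmann microfacet normal distribution D(cos_h)."""
    c2 = cos_h * cos_h
    t2 = (1.0 - c2) / c2              # tan^2 of the half-vector angle
    a2 = roughness * roughness
    return math.exp(-t2 / a2) / (math.pi * a2 * c2 * c2)

def schlick_f(cos_vh, f0):
    """Schlick approximation to the Fresnel reflectance term."""
    return f0 + (1.0 - f0) * (1.0 - cos_vh) ** 5

def torrance_sparrow_g(cos_i, cos_o, cos_h, cos_vh):
    """Torrance-Sparrow geometric attenuation (shadowing/masking)."""
    return min(1.0,
               2.0 * cos_h * cos_o / cos_vh,
               2.0 * cos_h * cos_i / cos_vh)

def microfacet_brdf(cos_i, cos_o, cos_h, cos_vh, roughness, f0):
    """Assemble D, G, F into the standard specular microfacet form."""
    d = beckmann_d(cos_h, roughness)
    g = torrance_sparrow_g(cos_i, cos_o, cos_h, cos_vh)
    f = schlick_f(cos_vh, f0)
    return d * g * f / (4.0 * cos_i * cos_o)

# Near-normal incidence example (cosines are with respect to the surface
# normal, except cos_vh, which is between the view and half vectors)
val = microfacet_brdf(cos_i=0.95, cos_o=0.95, cos_h=1.0, cos_vh=0.95,
                      roughness=0.3, f0=0.04)
```

Swapping any one of `beckmann_d`, `torrance_sparrow_g`, or `schlick_f` for another component yields a different named model, which is the library idea the abstract describes.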
Aluminum Pitting Corrosion in Halide Media: A Quantum Model and Empirical Evidence
NASA Astrophysics Data System (ADS)
Lashgari, Mohsen; Kianpour, Effat; Mohammadi, Esmaeil
2013-12-01
The phenomenon of localized damage to the aluminum oxide surface in the presence of halide anions was scrutinized at the atomistic level, through the cluster approach and density functional theory. The phenomenon was also investigated empirically through Tafel polarization plots and scanning electron microscopy. The distinct behavior witnessed in the fluoride medium was explained through the hard-soft acid-base principle. The atomistic investigations revealed the greatest potency for chloride entrance into the metal oxide lattice, which rationalizes the severity of the damage. The interaction of halide anions with the oxide surface, causing some displacements in the positions of Al atoms, provides a mechanistic insight into the phenomenon.
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
This report describes a modeling and simulation approach for disturbance patterns representative of the environment experienced by a digital system in an electromagnetic reverberation chamber. The disturbance is modeled by a multi-variate statistical distribution based on empirical observations. Extended versions of the Rejection Sampling and Inverse Transform Sampling techniques are developed to generate multi-variate random samples of the disturbance. The results show that Inverse Transform Sampling returns samples with higher fidelity relative to the empirical distribution. This work is part of an ongoing effort to develop a resilience assessment methodology for complex safety-critical distributed systems.
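The basic inverse transform sampling idea can be sketched in its univariate textbook form (the report's extended multi-variate version is not reproduced here): invert the empirical CDF of observed data at uniform random quantiles. The gamma-distributed "disturbance amplitudes" below are a synthetic stand-in for the empirical observations:

```python
import numpy as np

rng = np.random.default_rng(42)

def inverse_transform_sample(observations, size):
    """Draw samples from the empirical distribution of `observations`
    by inverting its empirical CDF at uniform random quantiles,
    with linear interpolation between order statistics."""
    sorted_obs = np.sort(observations)
    cdf = np.arange(1, len(sorted_obs) + 1) / len(sorted_obs)
    u = rng.uniform(size=size)
    return np.interp(u, cdf, sorted_obs)

# Synthetic stand-in for empirically observed disturbance amplitudes
obs = rng.gamma(shape=2.0, scale=1.5, size=5000)
samples = inverse_transform_sample(obs, size=5000)
```

Because `np.interp` clamps at the endpoints, generated samples never fall outside the observed range, one reason the technique tracks the empirical distribution closely.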
The Empirical Distribution of Singletons for Geographic Samples of DNA Sequences.
Cubry, Philippe; Vigouroux, Yves; François, Olivier
2017-01-01
Rare variants are important for drawing inference about past demographic events in a species history. A singleton is a rare variant for which genetic variation is carried by a unique chromosome in a sample. How singletons are distributed across geographic space provides a local measure of genetic diversity that can be measured at the individual level. Here, we define the empirical distribution of singletons in a sample of chromosomes as the proportion of the total number of singletons that each chromosome carries, and we present a theoretical background for studying this distribution. Next, we use computer simulations to evaluate the potential for the empirical distribution of singletons to provide a description of genetic diversity across geographic space. In a Bayesian framework, we show that the empirical distribution of singletons leads to accurate estimates of the geographic origin of range expansions. We apply the Bayesian approach to estimating the origin of the cultivated plant species Pennisetum glaucum [L.] R. Br. (pearl millet) in Africa, and find support for range expansion having started from Northern Mali. Overall, we report that the empirical distribution of singletons is a useful measure to analyze results of sequencing projects based on large scale sampling of individuals across geographic space.
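The definition above (each chromosome's share of the total singleton count) translates directly into code. The tiny 0/1 haplotype matrix below is an invented example, not data from the study:

```python
import numpy as np

def singleton_distribution(haplotypes):
    """Empirical distribution of singletons for a 0/1 haplotype matrix
    (rows = chromosomes, columns = variant sites): the proportion of
    the total number of singletons carried by each chromosome."""
    counts = haplotypes.sum(axis=0)
    singleton_sites = counts == 1                 # allele seen exactly once
    per_chrom = haplotypes[:, singleton_sites].sum(axis=1)
    return per_chrom / per_chrom.sum()

# Tiny invented example: 4 chromosomes, 6 variant sites.
# Sites 0, 1, 2, and 4 are singletons; sites 3 and 5 are shared.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 0, 0, 1, 0, 0],
              [0, 0, 1, 0, 0, 1],
              [0, 0, 0, 0, 1, 1]])
dist = singleton_distribution(H)
```

Here chromosome 0 carries two of the four singletons, so `dist` is `[0.5, 0, 0.25, 0.25]`; mapping these proportions onto sampling locations gives the spatial signal the paper exploits.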
Daniel Goodman’s empirical approach to Bayesian statistics
Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina
2016-01-01
Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated the construction of a prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combines comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. He argued that if risk assessment and decision-making were based on a true representation of our knowledge and uncertainty, they could be an exact science, despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior in a recent study of Steller sea lion spatial use patterns in Alaska.
NASA Astrophysics Data System (ADS)
Murtazina, M. Sh; Avdeenko, T. V.
2018-05-01
The state of the art and the progress in the application of semantic technologies in the field of scientific and research activity are analyzed. Even an elementary empirical comparison shows that semantic search engines are superior in all respects to conventional search technologies. However, semantic information technologies are insufficiently used in the field of scientific and research activity in Russia. In the present paper, an approach to the construction of an ontological model of a knowledge base is proposed. The ontological model is based on an upper-level ontology and the RDF mechanism for linking several domain ontologies. The ontological model is implemented in the Protégé environment.
NASA Astrophysics Data System (ADS)
Priyono, Wena, Made; Rahardjo, Boedi
2017-09-01
Experts and practitioners agree that the quality of higher education in Indonesia needs to be improved significantly and continuously. The low quality of university graduates is caused by many factors, one of which is the poor quality of learning. Today's instruction process tends to place great emphasis only on delivering knowledge. To avoid the pitfalls of such instruction, e.g. passive learning, Civil Engineering students should be given more opportunities to interact with others and actively participate in the learning process. Based on a number of theoretical and empirical studies, one appropriate strategy to overcome the aforementioned problem is to develop and implement an activity-based learning approach.
Development of Alabama traffic factors for use in mechanistic-empirical pavement design.
DOT National Transportation Integrated Search
2015-02-01
The pavement engineering community is moving toward design practices that use mechanistic-empirical (M-E) approaches to the design and analysis of pavement structures. This effort is embodied in the Mechanistic-Empirical Pavement Design Guide (MEPD...
A balanced scorecard approach in assessing IT value in healthcare sector: an empirical examination.
Wu, Ing-Long; Kuo, Yi-Zu
2012-12-01
The healthcare sector is human-based and knowledge-intensive. Massive IT investments are necessary to maintain competitiveness in this sector, and their justification is a major concern of senior management. Empirical studies examining IT value have found inconclusive results, with little or no improvement in productivity, and little research has been conducted in the healthcare sector. The balanced scorecard (BSC) strikes a balance between financial and non-financial measures and has been applied to evaluating organization-based performance. Moreover, healthcare organizations often frame their performance goals in terms of customer satisfaction in addition to financial performance. This research thus proposed a new hierarchical structure for the BSC, placing both finance and customer at the top, internal process next, and learning and growth at the bottom. Empirical examination confirmed the importance of the new BSC structure in assessing IT investments: learning and growth is the initial driver for reaching both customer and financial performance through the mediator of internal process. This can provide deep insight into effectively managing IT resources in hospitals.
NASA Astrophysics Data System (ADS)
Emami Niri, Mohammad; Amiri Kolajoobi, Rasool; Khodaiy Arbat, Mohammad; Shahbazi Raz, Mahdi
2018-06-01
Seismic wave velocities, along with petrophysical data, provide valuable information during the exploration and development stages of oil and gas fields. The compressional-wave velocity (VP) is acquired using conventional acoustic logging tools in many drilled wells, but the shear-wave velocity (VS) is recorded using advanced logging tools in only a limited number of wells, mainly because of the high operational costs. In addition, laboratory measurements of seismic velocities on core samples are expensive and time consuming, so alternative methods are often used to estimate VS. To date, several empirical correlations that predict VS from well logging measurements and petrophysical data such as VP, porosity and density have been proposed. However, these empirical relations can only be used in limited cases. The use of intelligent systems and optimization algorithms is an inexpensive, fast and efficient approach for predicting VS. In this study, in addition to the widely used Greenberg–Castagna empirical method, we implement three relatively recently developed metaheuristic algorithms to construct linear and nonlinear models for predicting VS: teaching–learning based optimization, imperialist competitive and artificial bee colony algorithms. We demonstrate the applicability and performance of these algorithms for predicting VS using conventional well logs in two field data examples, a sandstone formation from an offshore oil field and a carbonate formation from an onshore oil field. We compared the VS estimated by each of the employed metaheuristic approaches with observed VS and also with values predicted by the Greenberg–Castagna relations. The results indicate that, for both the sandstone and carbonate case studies, all three implemented metaheuristic algorithms are more efficient and reliable than the empirical correlation for predicting VS. The results also show that, in both case studies, the artificial bee colony algorithm performs slightly better in VS prediction than the two other approaches employed.
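For context, the Greenberg–Castagna relations mentioned above are polynomial fits of VS to VP per pure lithology, with velocities in km/s. The sketch below is a hedged illustration: the coefficients are quoted from the commonly cited 1992 regressions and should be verified against the original publication before any real use, and the function name is ours.

```python
# Greenberg-Castagna pure-lithology Vs-Vp relations (velocities in km/s).
# Coefficients as commonly quoted for the 1992 regressions; verify against
# the original paper before production use.
GC_COEFFS = {                      # Vs = a2*Vp**2 + a1*Vp + a0
    "sandstone": (0.0,      0.80416, -0.85588),
    "limestone": (-0.05508, 1.01677, -1.03049),
    "dolomite":  (0.0,      0.58321, -0.07775),
    "shale":     (0.0,      0.76969, -0.86735),
}

def vs_greenberg_castagna(vp_kms, lithology="sandstone"):
    """Predict shear-wave velocity (km/s) from compressional-wave velocity."""
    a2, a1, a0 = GC_COEFFS[lithology]
    return a2 * vp_kms ** 2 + a1 * vp_kms + a0

print(round(vs_greenberg_castagna(4.0, "sandstone"), 3))  # → 2.361
```

Mixed lithologies are handled in the original method by averaging the pure-lithology predictions, which is omitted here.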
Barkun, Alan N; Crott, Ralph; Fallone, Carlo A; Kennedy, Wendy A; Lachaine, Jean; Levinton, Carey; Armstrong, David; Chiba, Naoki; Thomson, Alan; Veldhuyzen van Zanten, Sander; Sinclair, Paul; Escobedo, Sergio; Chakraborty, Bijan; Smyth, Sandra; White, Robert; Kalra, Helen; Nevin, Krista
2010-08-01
The cost-effectiveness of initial strategies in managing Canadian patients with uninvestigated upper gastrointestinal symptoms remains controversial. To assess the cost-effectiveness of six management approaches to uninvestigated upper gastrointestinal symptoms in the Canadian setting. The present study analyzed data from four randomized trials assessing homogeneous and complementary populations of Canadian patients with uninvestigated upper gastrointestinal symptoms with comparable outcomes. Symptom-free months, quality-adjusted life-years (QALYs) and direct costs in Canadian dollars of two management approaches based on the Canadian Dyspepsia Working Group (CanDys) Clinical Management Tool, and four additional strategies (two empirical antisecretory agents, and two prompt endoscopy) were examined and compared. Prevalence data, probabilities, utilities and costs were included in a Markov model, while sensitivity analysis used Monte Carlo simulations. Incremental cost-effectiveness ratios and cost-effectiveness acceptability curves were determined. Empirical omeprazole cost $226 per QALY ($49 per symptom-free month) per patient. CanDys omeprazole and endoscopy approaches were more effective than empirical omeprazole, but more costly. Alternatives using H2-receptor antagonists were less effective than those using a proton pump inhibitor. No significant differences were found for most incremental cost-effectiveness ratios. As willingness to pay (WTP) thresholds rose from $226 to $24,000 per QALY, empirical antisecretory approaches were less likely to be the most cost-effective choice, with CanDys omeprazole progressively becoming a more likely option. For WTP values ranging from $24,000 to $70,000 per QALY, the most clinically relevant range, CanDys omeprazole was the most cost-effective strategy (32% to 46% of the time), with prompt endoscopy-proton pump inhibitor favoured at higher WTP values.
Although no strategy was the indisputably cost-effective option, CanDys omeprazole may be the strategy of choice over a clinically relevant range of WTP assumptions in the initial management of Canadian patients with uninvestigated dyspepsia.
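The willingness-to-pay reasoning above can be made concrete with a net-monetary-benefit calculation, which is the standard way cost-effectiveness acceptability is ranked. The cost and QALY figures below are invented for illustration and are not the study's estimates:

```python
def net_monetary_benefit(strategy, wtp):
    """NMB = WTP * QALYs - cost. At a given willingness-to-pay (WTP)
    threshold, the most cost-effective strategy maximizes NMB."""
    return wtp * strategy["qaly"] - strategy["cost"]

def best_strategy(strategies, wtp):
    return max(strategies, key=lambda s: net_monetary_benefit(s, wtp))

# Hypothetical figures for illustration only (not the trial data)
strategies = [
    {"name": "empirical omeprazole", "cost": 226.0, "qaly": 1.00},
    {"name": "CanDys omeprazole",    "cost": 900.0, "qaly": 1.02},
]
print(best_strategy(strategies, wtp=5_000)["name"])   # → empirical omeprazole
print(best_strategy(strategies, wtp=50_000)["name"])  # → CanDys omeprazole
```

As the WTP threshold rises, a costlier but slightly more effective strategy overtakes the cheaper one, which is exactly the crossover behavior the abstract reports.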
2017-01-01
The accurate prediction of protein chemical shifts using a quantum mechanics (QM)-based method has been the subject of intense research for more than 20 years but so far empirical methods for chemical shift prediction have proven more accurate. In this paper we show that a QM-based predictor of a protein backbone and CB chemical shifts (ProCS15, PeerJ, 2016, 3, e1344) is of comparable accuracy to empirical chemical shift predictors after chemical shift-based structural refinement that removes small structural errors. We present a method by which quantum chemistry based predictions of isotropic chemical shielding values (ProCS15) can be used to refine protein structures using Markov Chain Monte Carlo (MCMC) simulations, relating the chemical shielding values to the experimental chemical shifts probabilistically. Two kinds of MCMC structural refinement simulations were performed using force field geometry optimized X-ray structures as starting points: simulated annealing of the starting structure and constant temperature MCMC simulation followed by simulated annealing of a representative ensemble structure. Annealing of the CHARMM structure changes the CA-RMSD by an average of 0.4 Å but lowers the chemical shift RMSD by 1.0 and 0.7 ppm for CA and N. Conformational averaging has a relatively small effect (0.1–0.2 ppm) on the overall agreement with carbon chemical shifts but lowers the error for nitrogen chemical shifts by 0.4 ppm. If an amino acid specific offset is included the ProCS15 predicted chemical shifts have RMSD values relative to experiments that are comparable to popular empirical chemical shift predictors. The annealed representative ensemble structures differ in CA-RMSD relative to the initial structures by an average of 2.0 Å, with >2.0 Å difference for six proteins. 
In four of the cases, the largest structural differences arise in structurally flexible regions of the protein as determined by NMR, and in the remaining two cases, the large structural change may be due to force field deficiencies. The overall accuracy of the empirical methods is slightly improved by annealing the CHARMM structure with ProCS15, which may suggest that the minor structural changes introduced by ProCS15-based annealing improve the accuracy of the protein structures. Having established that QM-based chemical shift prediction can deliver the same accuracy as empirical shift predictors, we hope this can help increase the accuracy of related approaches such as QM/MM or linear-scaling approaches, or aid in interpreting protein structural dynamics from QM-derived chemical shifts. PMID:28451325
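The simulated-annealing MCMC refinement described above follows the standard Metropolis scheme: propose a perturbation, accept it with probability exp(-ΔE/T), and cool the temperature. A generic toy sketch follows; the quadratic "energy" is a stand-in for the probabilistic chemical-shift agreement term, and none of this is the ProCS15 code:

```python
import math
import random

def anneal(energy, x0, step, n_steps, t0=1.0, cooling=0.999, seed=0):
    """Generic Metropolis simulated annealing: propose a random perturbation,
    accept it with probability exp(-dE/T), and cool T geometrically.
    Returns the best point and energy encountered."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        e_new = energy(x_new)
        # Always accept improvements; accept worsenings with Boltzmann probability
        if e_new < e or rng.random() < math.exp(-(e_new - e) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling
    return best_x, best_e

# Toy "chemical-shift disagreement" with its minimum at x = 2.0
x_best, e_best = anneal(lambda x: (x - 2.0) ** 2, x0=10.0, step=0.5, n_steps=5000)
print(e_best < 0.01)
```

In the paper's setting the state is a full protein conformation and the energy combines the force field with the shift-likelihood term, but the acceptance logic is the same.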
Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose
2017-01-01
Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error for LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches from a previous report and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate impacting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause for decrease in power or inflated false positive rate although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis compared to LRT. We proposed a three-step covariate modeling approach for population PK analysis to utilize the advantages of EBEs while overcoming their shortcomings, which allows not only markedly reducing the run time for population PK analysis, but also providing more accurate covariate tests.
NASA Astrophysics Data System (ADS)
Farzaneh, Saeed; Forootan, Ehsan
2018-03-01
The computerized ionospheric tomography is a method for imaging the Earth's ionosphere using a sounding technique and computing the slant total electron content (STEC) values from data of the global positioning system (GPS). The most common approach for ionospheric tomography is the voxel-based model, in which (1) the ionosphere is divided into voxels, (2) the STEC is then measured along (many) satellite signal paths, and finally (3) an inversion procedure is applied to reconstruct the electron density distribution of the ionosphere. In this study, a computationally efficient approach is introduced, which improves the inversion procedure of step 3. Our proposed method combines the empirical orthogonal function and the spherical Slepian base functions to describe the vertical and horizontal distribution of electron density, respectively. Thus, it can be applied on regional and global case studies. Numerical application is demonstrated using the ground-based GPS data over South America. Our results are validated against ionospheric tomography obtained from the constellation observing system for meteorology, ionosphere, and climate (COSMIC) observations and the global ionosphere map estimated by international centers, as well as by comparison with STEC derived from independent GPS stations. Using the proposed approach, we find that while using 30 GPS measurements in South America, one can achieve accuracy comparable to that of COSMIC data within the reported accuracy (1 × 10^11 el/cm^3) of the product. Comparisons with real observations of two GPS stations indicate that the absolute difference is less than 2 TECU (where 1 total electron content unit, TECU, is 10^16 electrons/m^2).
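Step 3 above, the inversion, can be illustrated on a toy voxel model: the STEC along a ray is the sum over voxels of path length times electron density, giving a linear system to invert. The two-voxel, two-ray sketch below uses invented numbers; real tomography involves many rays and regularized least squares rather than a direct inverse:

```python
def invert_2x2(A, y):
    """Solve the 2x2 linear system A x = y by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    x0 = (y[0] * A[1][1] - y[1] * A[0][1]) / det
    x1 = (A[0][0] * y[1] - A[1][0] * y[0]) / det
    return [x0, x1]

# Hypothetical path-length matrix (rows: rays, columns: voxels) and
# "true" voxel electron densities, in arbitrary consistent units.
A = [[100.0, 50.0],
     [30.0, 120.0]]
densities = [2.0, 1.0]
# Forward model: measured STEC per ray is the path-length-weighted sum
stec = [A[i][0] * densities[0] + A[i][1] * densities[1] for i in range(2)]
print(invert_2x2(A, stec))  # → [2.0, 1.0]
```

Expressing the density field in a compact basis, as the paper does with empirical orthogonal functions and spherical Slepian functions, shrinks the number of unknowns in this system and is what makes the inversion efficient.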
Segmenting hospitals for improved management strategy.
Malhotra, N K
1989-09-01
The author presents a conceptual framework for the a priori and clustering-based approaches to segmentation and evaluates them in the context of segmenting institutional health care markets. An empirical study is reported in which the hospital market is segmented on three state-of-being variables. The segmentation approach also takes into account important organizational decision-making variables. The sophisticated Thurstone Case V procedure is employed. Several marketing implications for hospitals, other health care organizations, hospital suppliers, and donor publics are identified.
A framework for a teaching toolkit in entrepreneurship education
Fellnhofer, Katharina
2017-01-01
Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a ‘learning-through-real-multimedia-entrepreneurial-narratives’ pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society. PMID:28680372
The Care Dialog: the "ethics of care" approach and its importance for clinical ethics consultation.
Schuchter, Patrick; Heller, Andreas
2018-03-01
Ethics consultation in institutions of the healthcare system has been given a standard form based on three pillars: education, the development of guidelines and concrete ethics consultation in case conferences. The spread of ethics committees, which perform these tasks on an organizational level, is a remarkable historic achievement. At the same time it cannot be denied that modern ethics consultation neglects relevant aspects of care ethics approaches. In our essay we present an "ethics of care" approach as well as an empirical pilot project, "Ethics from the bottom up", which organizes ethics consultation based on this focus. The findings and philosophy of the project are discussed insofar as they are relevant to ethics consultation in the healthcare system.
Probabilistic empirical prediction of seasonal climate: evaluation and potential applications
NASA Astrophysics Data System (ADS)
Dieppois, B.; Eden, J.; van Oldenborgh, G. J.
2017-12-01
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a new evaluation of an established empirical system used to predict seasonal climate across the globe. Forecasts for surface air temperature, precipitation and sea level pressure are produced by the KNMI Probabilistic Empirical Prediction (K-PREP) system every month and disseminated via the KNMI Climate Explorer (climexp.knmi.nl). K-PREP is based on multiple linear regression and built on physical principles to the fullest extent with predictive information taken from the global CO2-equivalent concentration, large-scale modes of variability in the climate system and regional-scale information. K-PREP seasonal forecasts for the period 1981-2016 will be compared with corresponding dynamically generated forecasts produced by operational forecast systems. While there are many regions of the world where empirical forecast skill is extremely limited, several areas are identified where K-PREP offers comparable skill to dynamical systems. We discuss two key points in the future development and application of the K-PREP system: (a) the potential for K-PREP to provide a more useful basis for reference forecasts than those based on persistence or climatology, and (b) the added value of including K-PREP forecast information in multi-model forecast products, at least for known regions of good skill. 
We also discuss the potential development of stakeholder-driven applications of the K-PREP system, including empirical forecasts for circumboreal fire activity.
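Since K-PREP is built on multiple linear regression, a single-predictor ordinary-least-squares sketch conveys the core idea of an empirical forecast. The CO2-equivalent and temperature-anomaly numbers below are invented for illustration and are not K-PREP data:

```python
def ols_fit(xs, ys):
    """Ordinary least-squares slope and intercept for a single predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Invented illustrative data: CO2-equivalent concentration (ppm) as the
# predictor, seasonal surface temperature anomaly (deg C) as the predictand.
co2 = [350, 360, 370, 380, 390, 400]
anomaly = [0.10, 0.18, 0.25, 0.34, 0.40, 0.49]

slope, intercept = ols_fit(co2, anomaly)
forecast = slope * 410 + intercept      # extrapolated anomaly at 410 ppm
print(round(forecast, 2))  # → 0.56
```

K-PREP uses several such predictors at once (CO2-equivalent concentration, large-scale climate modes, regional information), so the fit is multivariate, but each term enters linearly as here.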
Wavelet-bounded empirical mode decomposition for measured time series analysis
NASA Astrophysics Data System (ADS)
Moore, Keegan J.; Kurt, Mehmet; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.
2018-01-01
Empirical mode decomposition (EMD) is a powerful technique for separating the transient responses of nonlinear and nonstationary systems into finite sets of nearly orthogonal components, called intrinsic mode functions (IMFs), which represent the dynamics on different characteristic time scales. However, a deficiency of EMD is the mixing of two or more components in a single IMF, which can drastically affect the physical meaning of the empirical decomposition results. In this paper, we present a new approach based on EMD, designated as wavelet-bounded empirical mode decomposition (WBEMD), which is a closed-loop, optimization-based solution to the problem of mode mixing. The optimization routine relies on maximizing the isolation of an IMF around a characteristic frequency. This isolation is measured by fitting a bounding function around the IMF in the frequency domain and computing the area under this function. It follows that a large (small) area corresponds to a poorly (well) separated IMF. An optimization routine is developed based on this result with the objective of minimizing the bounding-function area and with the masking signal parameters serving as free parameters, such that a well-separated IMF is extracted. As examples of the application of WBEMD, we apply the proposed method first to a stationary, two-component signal, and then to the numerically simulated response of a cantilever beam with an essentially nonlinear end attachment. We find that WBEMD vastly improves upon EMD and that the extracted sets of IMFs provide insight into the underlying physics of the response of each system.
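The bounding-function-area idea can be approximated with a simple spectral-concentration measure: a well-separated IMF puts its energy near a single frequency, while a mode-mixed component spreads it out. The sketch below uses spectral variance computed from a direct DFT as a crude stand-in for the paper's bounding-function area; it is not the actual WBEMD objective:

```python
import cmath
import math

def spectral_spread(signal, fs):
    """Proxy for IMF separation: the magnitude-weighted variance of the
    spectrum around its mean frequency. A well-separated component
    concentrates energy near one frequency, giving a small spread."""
    n = len(signal)
    mags, freqs = [], []
    for k in range(n // 2):                    # direct DFT, positive bins only
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s))
        freqs.append(k * fs / n)
    total = sum(mags)
    mean_f = sum(f * m for f, m in zip(freqs, mags)) / total
    return sum(m * (f - mean_f) ** 2 for f, m in zip(freqs, mags)) / total

fs = 128
t = [i / fs for i in range(256)]
pure = [math.sin(2 * math.pi * 10 * ti) for ti in t]                 # one tone
mixed = [math.sin(2 * math.pi * 10 * ti)
         + math.sin(2 * math.pi * 40 * ti) for ti in t]              # two tones
print(spectral_spread(pure, fs) < spectral_spread(mixed, fs))  # → True
```

In WBEMD a bounding envelope is fitted around the IMF's spectrum and its area minimized over the masking-signal parameters; any such concentration measure plays the same role of penalizing mode mixing.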
Morcke, Anne Mette; Dornan, Tim; Eika, Berit
2013-10-01
Outcome-based or competency-based education (OBE) is so firmly established in undergraduate medical education that it might not seem necessary to ask why it was included in recommendations for the future, like the Flexner centenary report. Uncritical acceptance may not, however, deliver its greatest benefits. Our aim was to explore the underpinnings of OBE: its historical origins, theoretical basis, and empirical evidence of its effects in order to answer the question: How can predetermined learning outcomes influence undergraduate medical education? This literature review had three components: A review of historical landmarks in the evolution of OBE; a review of conceptual frameworks and theories; and a systematic review of empirical publications from 1999 to 2010 that reported data concerning the effects of learning outcomes on undergraduate medical education. OBE had its origins in behaviourist theories of learning. It is tightly linked to the assessment and regulation of proficiency, but less clearly linked to teaching and learning activities. Over time, there have been cycles of advocacy for, then criticism of, OBE. A recurring critique concerns the place of complex personal and professional attributes as "competencies". OBE has been adopted by consensus in the face of weak empirical evidence. OBE, which has been advocated for over 50 years, can contribute usefully to defining requisite knowledge and skills, and blueprinting assessments. Its applicability to more complex aspects of clinical performance is not clear. OBE, we conclude, provides a valuable approach to some, but not all, important aspects of undergraduate medical education.
NASA Astrophysics Data System (ADS)
Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang
2017-09-01
Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI) conducted by the International Atomic Energy Agency (IAEA) on 18-20 November, 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's functions approaches) to the engineering application of simulated ground motion for the analysis of seismic response of structures. These contributions include applications to real earthquakes and description of current practice to assess seismic hazard in terms of nuclear safety in low seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models to evaluate and understand the physical causes of observed and empirical data, as well as to predict ground motion beyond the range of recorded data. Relevant importance is given on the validation and verification of the models by comparing synthetic results with observed data and empirical models.
Semi-empirical master curve concept describing the rate capability of lithium insertion electrodes
NASA Astrophysics Data System (ADS)
Heubner, C.; Seeba, J.; Liebmann, T.; Nickol, A.; Börner, S.; Fritsch, M.; Nikolowski, K.; Wolter, M.; Schneider, M.; Michaelis, A.
2018-03-01
A simple semi-empirical master curve concept, describing the rate capability of porous insertion electrodes for lithium-ion batteries, is proposed. The model is based on the evaluation of the time constants of lithium diffusion in the liquid electrolyte and the solid active material. This theoretical approach is successfully verified by comprehensive experimental investigations of the rate capability of a large number of porous insertion electrodes with various active materials and design parameters. It turns out that the rate capability of all investigated electrodes follows a simple master curve governed by the time constant of the rate-limiting process. We demonstrate that the master curve concept can be used to determine optimum design criteria meeting specific requirements in terms of maximum gravimetric capacity for a desired rate capability. The model further reveals practical limits of the electrode design, attesting the empirically well-known and inevitable tradeoff between energy and power density.
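The time-constant comparison at the heart of the master curve can be sketched directly: the characteristic diffusion time is tau = L^2 / D, and the slower of the electrolyte and solid-state processes limits the rate capability. The lengths and diffusivities below are hypothetical, order-of-magnitude values, not the paper's measurements:

```python
def diffusion_time_constant(length_m, diffusivity_m2_per_s):
    """Characteristic diffusion time: tau = L^2 / D."""
    return length_m ** 2 / diffusivity_m2_per_s

# Hypothetical order-of-magnitude values: a 50 um electrode thickness with
# a liquid-electrolyte salt diffusivity, vs. 5 um active-material particles
# with a solid-state lithium diffusivity.
tau_electrolyte = diffusion_time_constant(50e-6, 3e-10)   # ~8 s
tau_solid = diffusion_time_constant(5e-6, 1e-14)          # ~2500 s

# The process with the larger time constant limits the rate capability
limiting = max(tau_electrolyte, tau_solid)
print(limiting == tau_solid)  # → True
```

Plotting capacity retention against a C-rate scaled by this limiting time constant is what collapses differently designed electrodes onto one master curve.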
De Carli, Pietro; Tagini, Angela; Sarracino, Diego; Santona, Alessandra; Bonalda, Valentina; Cesari, Paola Elena; Parolin, Laura
2018-01-01
The authors discuss the issue of intergenerational transmission of parenting from an empirical and psychoanalytic perspective. After presenting a framework to explain their conception of parenting, they describe intergenerational transmission of parenting as a key to interpreting and eventually changing parenting behaviors. Then they present (1) the empirical approach aimed at determining whether there is actually a stability across generations that contributes to harsh parenting and eventually maltreatment and (2) the psychoanalytic thinking that seeks to explain the continuity in terms of representations and clinical phenomena. The authors also discuss the relationship between the attachment and the caregiving systems and hypothesize a common base for the two systems in childhood experience. Finally, they propose the psychoanalytic perspective as a fruitful theoretical framework to integrate the evidence for the neurophysiological mediators and moderators of intergenerational transmission. Psychoanalytically informed research can provide clinically relevant insights and hypotheses to be tested.
Ecological Forecasting in Chesapeake Bay: Using a Mechanistic-Empirical Modelling Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, C. W.; Hood, Raleigh R.; Long, Wen
The Chesapeake Bay Ecological Prediction System (CBEPS) automatically generates daily nowcasts and three-day forecasts of several environmental variables, such as sea-surface temperature and salinity, the concentrations of chlorophyll, nitrate, and dissolved oxygen, and the likelihood of encountering several noxious species, including harmful algal blooms and water-borne pathogens, for the purpose of monitoring the Bay's ecosystem. While the physical and biogeochemical variables are forecast mechanistically using the Regional Ocean Modeling System configured for the Chesapeake Bay, the species predictions are generated using a novel mechanistic-empirical approach, whereby real-time output from the coupled physical-biogeochemical model drives multivariate empirical habitat models of the target species. The predictions, in the form of digital images, are available via the World Wide Web to interested groups to guide recreational, management, and research activities. Though full validation of the integrated forecasts for all species is still a work in progress, we argue that the mechanistic-empirical approach can be used to generate a wide variety of short-term ecological forecasts, and that it can be applied in any marine system where sufficient data exist to develop empirical habitat models. This paper provides an overview of this system, its predictions, and the approach taken.
Development of a new model for short period ocean tidal variations of Earth rotation
NASA Astrophysics Data System (ADS)
Schuh, Harald
2015-08-01
Within project SPOT (Short Period Ocean Tidal variations in Earth rotation) we develop a new high-frequency Earth rotation model based on empirical ocean tide models. The main purpose of the SPOT model is its application to space geodetic observations such as GNSS and VLBI. We consider an empirical ocean tide model, which does not require hydrodynamic ocean modeling to determine ocean tidal angular momentum. We use here the EOT11a model of Savcenko & Bosch (2012), which is extended for some additional minor tides (e.g. M1, J1, T2). As empirical tidal models do not provide ocean tidal currents, which are required for the computation of oceanic relative angular momentum, we implement an approach first published by Ray (2001) to estimate ocean tidal current velocities for all tides considered in the extended EOT11a model. The approach itself is tested by application to tidal heights from hydrodynamic ocean tide models, which also provide tidal current velocities. Based on the tidal heights and the associated current velocities, the oceanic tidal angular momentum (OTAM) is calculated. For the computation of the related short-period variation of Earth rotation, we have re-examined the Euler-Liouville equation for an elastic Earth model with a liquid core. The focus here is on the consistent calculation of the elastic Love numbers and associated Earth model parameters, which are considered in the Euler-Liouville equation for diurnal and sub-diurnal periods in the frequency domain.
A place-based model of local activity spaces: individual place exposure and characteristics
NASA Astrophysics Data System (ADS)
Hasanzadeh, Kamyar; Laatikainen, Tiina; Kyttä, Marketta
2018-01-01
Researchers have long hypothesized relationships between mobility, urban context, and health. Despite the ample amount of discussion, the empirical findings corroborating such associations remain marginal in the literature. It is increasingly believed that the weakness of the observed associations can be largely explained by the common misspecification of the geographical context. Researchers coming from different fields have developed a wide range of methods for estimating the extents of these geographical contexts. In this article, we argue that no single approach has yet been sufficiently capable of capturing the complexity of human mobility patterns. Subsequently, we discuss that a better understanding of individual activity spaces is possible through a spatially sensitive estimation of place exposure. Following this discussion, we take an integrative person- and place-based approach to create an individualized residential exposure model (IREM) to estimate the local activity spaces (LAS) of individuals. This model is created using data collected through public participation GIS. Following a brief comparison of IREM with other commonly used LAS models, the article continues by presenting an empirical study of aging citizens in the Helsinki area to demonstrate the usability of the proposed framework. In this study, we identify the main dimensions of LASs and seek their associations with the socio-demographic characteristics of individuals and their location in the region. The promising results from the comparisons and the interesting findings from the empirical part suggest both a methodological and a conceptual improvement in capturing the complexity of local activity spaces.
An evaluation of multiple trauma severity indices created by different index development strategies.
Gustafson, D H; Fryback, D G; Rose, J H; Prokop, C T; Detmer, D E; Rossmeissl, J C; Taylor, C M; Alemi, F; Carnazzo, A J
1983-07-01
Evaluation of the effectiveness of emergency trauma care systems is complicated by the need to adjust for the widely variable case mix found in trauma patient populations. Several strategies have been advanced for constructing severity indices that can control for these population differences. This article describes a validity and reliability comparison of trauma severity indices developed under three different approaches: 1) use of a multi-attribute utility (MAU) model; 2) an actuarial approach relying on empirical data bases; and 3) an "ad hoc" approach. Seven criteria were identified to serve as standards of comparison for four different indices. The study's findings indicate that the index developed using the MAU theory approach associates most closely with physician judgments of trauma severity. When correlated with a morbidity outcome measure, the MAU-based index shows higher levels of agreement than the other indices. The index development approach based on the principles of MAU theory has several advantages and appears to be a powerful tool in the creation of effective severity indices.
NASA Astrophysics Data System (ADS)
Wang, Ten-See
1993-07-01
Excessive base heating has been a problem for many launch vehicles. For certain designs such as the direct dump of turbine exhaust in the nozzle section and at the nozzle lip of the Space Transportation Systems Engine (STME), the potential burning of the turbine exhaust in the base region has caused tremendous concern. Two conventional approaches have been considered for predicting the base environment: (1) the empirical approach, and (2) the experimental approach. The empirical approach uses a combination of data correlations and semi-theoretical calculations. It works best for linear problems with simple physics and geometry. However, it is highly suspect when complex geometry and flow physics are involved, especially when the subject falls outside the historical database. The experimental approach is often used to establish a database for engineering analysis. However, it is qualitative at best for base flow problems. Other criticisms include the inability to simulate the forebody boundary layer correctly, the interference effect from tunnel walls, and the inability to scale all pertinent parameters. Furthermore, there is a contention that the information extrapolated from subscale tests with combustion is not conservative. One potential alternative to the conventional methods is computational fluid dynamics (CFD), which has none of the above restrictions and is becoming more feasible due to maturing algorithms and advancing computer technology. It provides more details of the flowfield and is limited only by computer resources. However, it has its share of criticisms as a predictive tool for the base environment. One major concern is that CFD has not been extensively tested for base flow problems. It is therefore imperative that CFD be assessed and benchmarked satisfactorily for base flows. In this study, the turbulent base flowfield of an experimental investigation of a four-engine clustered nozzle is numerically benchmarked using a pressure-based CFD method.
Since cold air was the medium, accurate prediction of the base pressure distributions at high altitudes is the primary goal. Other factors that may influence the numerical results, such as the effects of grid density, turbulence model, differencing scheme, and boundary conditions, are also addressed.
NASA Technical Reports Server (NTRS)
Wang, Ten-See
1993-01-01
Excessive base heating has been a problem for many launch vehicles. For certain designs such as the direct dump of turbine exhaust in the nozzle section and at the nozzle lip of the Space Transportation Systems Engine (STME), the potential burning of the turbine exhaust in the base region has caused tremendous concern. Two conventional approaches have been considered for predicting the base environment: (1) the empirical approach, and (2) the experimental approach. The empirical approach uses a combination of data correlations and semi-theoretical calculations. It works best for linear problems with simple physics and geometry. However, it is highly suspect when complex geometry and flow physics are involved, especially when the subject falls outside the historical database. The experimental approach is often used to establish a database for engineering analysis. However, it is qualitative at best for base flow problems. Other criticisms include the inability to simulate the forebody boundary layer correctly, the interference effect from tunnel walls, and the inability to scale all pertinent parameters. Furthermore, there is a contention that the information extrapolated from subscale tests with combustion is not conservative. One potential alternative to the conventional methods is computational fluid dynamics (CFD), which has none of the above restrictions and is becoming more feasible due to maturing algorithms and advancing computer technology. It provides more details of the flowfield and is limited only by computer resources. However, it has its share of criticisms as a predictive tool for the base environment. One major concern is that CFD has not been extensively tested for base flow problems. It is therefore imperative that CFD be assessed and benchmarked satisfactorily for base flows. In this study, the turbulent base flowfield of an experimental investigation of a four-engine clustered nozzle is numerically benchmarked using a pressure-based CFD method.
Since cold air was the medium, accurate prediction of the base pressure distributions at high altitudes is the primary goal. Other factors that may influence the numerical results, such as the effects of grid density, turbulence model, differencing scheme, and boundary conditions, are also addressed. Preliminary results of the computed base pressure agreed reasonably well with the measurements. Basic base flow features such as the reverse jet, wall jet, recompression shock, and static pressure field in the plane of impingement have been captured.
New approaches in agent-based modeling of complex financial systems
NASA Astrophysics Data System (ADS)
Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei
2017-12-01
Agent-based modeling is a powerful simulation technique for understanding the collective behavior and microscopic interactions in complex financial systems. Recently, the concept of determining the key parameters of agent-based models from empirical data, instead of setting them artificially, was suggested. We first review several agent-based models and the new approaches to determining the key model parameters from historical market data. Based on the agents' behaviors with heterogeneous personal preferences and interactions, these models are successful in explaining the microscopic origin of the temporal and spatial correlations of financial markets. We then present a novel paradigm combining big-data analysis with agent-based modeling. Specifically, from internet query and stock market data, we extract the information driving forces and develop an agent-based model to simulate the dynamic behaviors of complex financial systems.
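The calibration idea here (fixing key parameters of an agent-based model from market data rather than by hand) can be illustrated with a toy herding model. This is a minimal sketch, not any of the reviewed models: the agent rule, the herding parameter, and the price-impact constant are all illustrative assumptions.

```python
import numpy as np

def simulate_market(n_agents=200, n_steps=500, herding=0.3, seed=0):
    """Toy agent-based market: each agent buys (+1), sells (-1), or holds (0);
    the buy probability is tilted by the previous step's order imbalance
    (herding), and the log-price moves with the current imbalance."""
    rng = np.random.default_rng(seed)
    log_price = np.zeros(n_steps + 1)
    imbalance = 0.0
    for t in range(n_steps):
        p_buy = float(np.clip(0.25 + herding * imbalance, 0.05, 0.45))
        p_sell = float(np.clip(0.25 - herding * imbalance, 0.05, 0.45))
        p_hold = 1.0 - p_buy - p_sell
        actions = rng.choice([1.0, -1.0, 0.0], size=n_agents,
                             p=[p_buy, p_sell, p_hold])
        imbalance = actions.mean()
        log_price[t + 1] = log_price[t] + 0.1 * imbalance
    return np.exp(log_price)

prices = simulate_market()
returns = np.diff(np.log(prices))
# a key parameter such as `herding` could then be calibrated by matching
# return statistics (e.g. autocorrelation, kurtosis) to empirical market data
print(len(prices), returns.std())
```

Calibration in the empirical spirit of the reviewed papers would wrap this simulator in an optimizer that searches `herding` (and the price-impact constant) to reproduce observed return statistics.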
A Comprehensive Approach to the Patient at End of Life: Assessment of Multidimensional Suffering
Wachholtz, Amy B.; Fitch, Christina E.; Makowski, Suzana; Tjia, Jennifer
2016-01-01
Pain is a multidimensional, complex experience. There are many challenges in identifying and meeting the needs of patients experiencing pain. Evaluation of pain from a bio-psycho-social-spiritual framework is particularly germane for patients approaching the end of life. This review explores the relation between the psychospiritual dimensions of suffering and the experience of physical pain, and how to assess and treat pain in a multidimensional framework. A review of empirical data on the relation between pain and suffering as well as interdisciplinary evidence-based approaches to alleviate suffering are provided. PMID:27043799
Measuring Learning Gain: Comparing Anatomy Drawing Screencasts and Paper-Based Resources
ERIC Educational Resources Information Center
Pickering, James D.
2017-01-01
The use of technology-enhanced learning (TEL) resources is now a common tool across a variety of healthcare programs. Despite this popular approach to curriculum delivery there remains a paucity in empirical evidence that quantifies the change in learning gain. The aim of the study was to measure the changes in learning gain observed with anatomy…
ERIC Educational Resources Information Center
Kulik, Anastasia; Neyaskina, Yuliya; Frizen, Marina; Shiryaeva, Olga; Surikova, Yana
2016-01-01
This article presents the results of a detailed empirical research project aimed at studying quality of life in the context of extreme climatic, geographical, and specific sociocultural living conditions. Our research is based on a methodological approach including social, economic, ecological, and psychological characteristics and reflecting…
A Co-Citation Network of Young Children's Learning with Technology
ERIC Educational Resources Information Center
Tang, Kai-Yu; Li, Ming-Chaun; Hsin, Ching-Ting; Tsai, Chin-Chung
2016-01-01
This paper used a novel literature review approach--co-citation network analysis--to illuminate the latent structure of 87 empirical papers in the field of young children's learning with technology (YCLT). Based on the document co-citation analysis, a total of 206 co-citation relationships among the 87 papers were identified and then graphically…
Supporting Literacy Across the Sunshine State: A Study of Florida Middle School Reading Coaches
ERIC Educational Resources Information Center
Marsh, Julie A.; McCombs, Jennifer Sloan; Lockwood, J. R.; Martorell, Francisco; Gershwin, Daniel; Naftel, Scott; Le, Vi-Nhuan; Shea, Molly; Barney, Heather; Crego, Al
2008-01-01
Although literacy skills needed to engage in the economy and public life have grown, the literacy skills of many adolescents remain low. One popular approach to improving student literacy is using school-based reading coaches; however, there is little empirical evidence regarding the nature of coaching and its effectiveness in changing teacher…
The Impact of Formal and Informal Learning on Students' Improvisational Processes
ERIC Educational Resources Information Center
Augustyniak, Sylvana
2014-01-01
This article is based on my PhD empirical study, which was conducted using a qualitative and holistic approach. It examined how students used formal and informal strategies, styles, and situations while improvising and composing for the research task. Eighteen research groups made up of a total of 40 males and nine females participated in…
ERIC Educational Resources Information Center
Rolle, R. Anthony
2016-01-01
Little is known about the educational productivity of public schooling organizations when examined outside of market-based, cost-minimization frameworks. The purpose of this research was to extend the literature that supports the appropriateness of measuring levels of the economic efficiency of public schools via an alternative approach, utilizing…
NASA Astrophysics Data System (ADS)
Ganiev, R. F.; Reviznikov, D. L.; Rogoza, A. N.; Slastushenskiy, Yu. V.; Ukrainskiy, L. E.
2017-03-01
We describe a comprehensive approach to investigating nonlinear wave processes in the human cardiovascular system, based on a combination of high-precision pulse wave measurement methods, mathematical methods for processing the empirical data, and direct numerical modeling of hemodynamic processes in an arterial tree.
ERIC Educational Resources Information Center
Böhm, Stephan; Constantine, Georges Philip
2016-01-01
Purpose: This paper aims to focus on contextualized features for mobile language learning apps. The scope of this paper is to explore students' perceptions of contextualized mobile language learning. Design/Methodology/Approach: An extended Technology Acceptance Model was developed to analyze the effect of contextual app features on students'…
Appraising the reliability of visual impact assessment methods
Nickolaus R. Feimer; Kenneth H. Craik; Richard C. Smardon; Stephen R.J. Sheppard
1979-01-01
This paper presents the research approach and selected results of an empirical investigation aimed at evaluating selected observer-based visual impact assessment (VIA) methods. The VIA methods under examination were chosen to cover a range of VIA methods currently in use in both applied and research settings. Variations in three facets of VIA methods were...
The Meaning of Working among Professional Employees in Germany, Poland and Russia
ERIC Educational Resources Information Center
Kuchinke, K. Peter; Ardichvili, Alexandre; Borchert, Margret; Rozanski, Andrzej
2009-01-01
Purpose: The purpose of this paper is to report the results of an empirical study of the meaning of working, individual level work outcomes, and job and career satisfaction, among professional level employees in business organizations in Russia, Poland, and Germany. Design/methodology/approach: The theoretical framework for the study was based on…
ERIC Educational Resources Information Center
Gagnon, Joseph Calvin; Maccini, Paula
2007-01-01
A random sample of 167 secondary special and general educators who taught math to students with emotional and behavioral disorders (EBD) and learning disabilities (LD) responded to a mail survey. The survey examined teacher perceptions of (a) definition of math; (b) familiarity with course topics; (c) effectiveness of methods courses; (d)…
ERIC Educational Resources Information Center
Witte, T. C. H.; Jansen, E. P. W. A.
2015-01-01
This study makes a contribution to the development of empirically based, domain-specific teaching standards that are acknowledged by the professional community of teachers and which, therefore, have a good chance of being successfully implemented and used for professional development purposes. It was prompted by the resistance on the part of many…
ERIC Educational Resources Information Center
Alameda-Lawson, Tania; Lawson, Michael A.; Lawson, Hal A.
2010-01-01
Social workers have pivotal roles to play in facilitating collective parent involvement in economically poor school communities. Using a community-based, participatory, and empowerment-oriented approach to social work practice and research, this study provides empirical support for this claim. It examines the narratives of 17 economically poor…
ERIC Educational Resources Information Center
Menéndez-Varela, José-Luis; Gregori-Giralt, Eva
2016-01-01
Rubrics have attained considerable importance in the authentic and sustainable assessment paradigm; nevertheless, few studies have examined their contribution to validity, especially outside the domain of educational studies. This empirical study used a quantitative approach to analyse the validity of a rubrics-based performance assessment. Raters…
ERIC Educational Resources Information Center
Heuston, Edward Benjamin Hull
2010-01-01
Academic learning time (ALT) has long had the theoretical underpinnings sufficient to claim a causal relationship with academic achievement, but to this point empirical evidence has been lacking. This dearth of evidence has existed primarily due to difficulties associated with operationalizing ALT in traditional educational settings. Recent…
ERIC Educational Resources Information Center
Cotos, Elena
2010-01-01
This dissertation presents an innovative approach to the development and empirical evaluation of Automated Writing Evaluation (AWE) technology used for teaching and learning. It introduces IADE (Intelligent Academic Discourse Evaluator), a new web-based AWE program that analyzes research article Introduction sections and generates immediate,…
ERIC Educational Resources Information Center
Neirotti, Paolo; Paolucci, Emilio
2013-01-01
We explore the relationship between training and innovation using key insights from the resource-based approach, organizational learning and labour studies. By using data from 304 large enterprises in Italy, the study highlights a twofold role of training in favouring technological and organizational changes. First, training plays a role in…
An Empirical Study on Business English Teaching and Development in China--A Needs Analysis Approach
ERIC Educational Resources Information Center
Guiyu, Dai; Yang, Liu
2016-01-01
This paper first reviews the developmental history and status quo of the Business English Program in China. Then, based on the theory of needs analysis, it analyzes 226 questionnaires completed by Business English Program students at Guangdong University of Foreign Studies to investigate the problems encountered and the current situation of Business English…
ERIC Educational Resources Information Center
Arroyo, Andrew T.; Gasman, Marybeth
2014-01-01
This conceptual study builds an institution-focused, non-Eurocentric, theoretical framework of black college student success. Specifically, the study synthesizes the relevant empirical research on the contributions historically black colleges and universities (HBCUs) have made for black student success, leading to an original model that all…
ERIC Educational Resources Information Center
Pargament, Kenneth I.; Sweeney, Patrick J.
2011-01-01
This article describes the development of the spiritual fitness component of the Army's Comprehensive Soldier Fitness (CSF) program. Spirituality is defined in the human sense as the journey people take to discover and realize their essential selves and higher order aspirations. Several theoretically and empirically based reasons are articulated…
ERIC Educational Resources Information Center
Sammalisto, Kaisu; Arvidsson, Karin
2005-01-01
Purpose: This study of environment management systems implementation in Swedish universities contributes to the dialogue about the role of management systems as tools in developing sustainability in higher education. Design/methodology/approach: The empirical study is based on Government directives that make environmental management systems…
Sandak, Billie; Huss, Ephrat; Sarid, Orly; Harel, David
2015-01-01
Art therapy, as well as other arts-based therapies and interventions, is used to reduce pain, stress, depression, breathlessness and other symptoms in a wide variety of serious and chronic diseases, such as cancer, Alzheimer's and schizophrenia. Arts-based approaches are also known to contribute to one's well-being and quality of life. However, much research is required, since the mechanisms by which these non-pharmacological treatments exert their therapeutic and psychosocial effects are not adequately understood. A typical clinical setting utilizing the arts consists of the creation work itself, such as the artwork, as well as the therapist and the patient, all of which constitute a rich and dynamic environment of occurrences. The underlying complex, simultaneous and interwoven processes of this setting are often considered intractable to human observers, and as a consequence are usually interpreted subjectively and described verbally, which affects their subsequent analysis and understanding. We introduce a computational research method for elucidating and analyzing emergent expressive and social behaviors, aiming to understand how arts-based approaches operate. Our methodology, which centers on the visual language of Statecharts and tools for its execution, enables rigorous qualitative and quantitative tracking, analysis and documentation of the underlying creation and interaction processes. It also enables one to carry out exploratory, hypothesis-generating and knowledge-discovery investigations that are empirically based. Furthermore, we illustrate our method's use in a proof-of-principle study, applying it to a real-world artwork investigation with human participants. We explore individual and collective emergent behaviors impacted by diverse drawing tasks, yielding significant gender and age hypotheses, which may account for variation factors in response to art use.
We also discuss how to gear our research method toward systematic and mechanistic investigations, as we wish to provide broad empirical evidence for the uptake of arts-based approaches, also aiming to ameliorate their use in clinical settings. PMID:26061736
NASA Technical Reports Server (NTRS)
Stecker, Floyd W.
2012-01-01
We calculate the intensity and photon spectrum of the intergalactic background light (IBL) as a function of redshift using an approach based on observational data obtained in different wavelength bands, from local to deep galaxy surveys. Our empirically based approach allows us, for the first time, to obtain a completely model-independent determination of the IBL and to quantify its uncertainties. Using our results on the IBL, we then place upper and lower limits on the opacity of the universe to gamma-rays, independent of previous constraints.
Shear velocity criterion for incipient motion of sediment
Simoes, Francisco J.
2014-01-01
The prediction of incipient motion has had great importance to the theory of sediment transport. The most commonly used methods are based on the concept of critical shear stress and employ an approach similar, or identical, to the Shields diagram. An alternative method that uses the movability number, defined as the ratio of the shear velocity to the particle’s settling velocity, was employed in this study. A large amount of experimental data were used to develop an empirical incipient motion criterion based on the movability number. It is shown that this approach can provide a simple and accurate method of computing the threshold condition for sediment motion.
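The movability number described above is simply the ratio of the shear velocity to the particle's settling velocity, with motion flagged when it exceeds a critical value. A minimal sketch follows; the critical value 0.2 is purely illustrative, not the empirical criterion fitted in the study.

```python
def movability_number(shear_velocity, settling_velocity):
    """Movability number: ratio of shear velocity u* to the particle's
    settling velocity w_s (both in m/s)."""
    return shear_velocity / settling_velocity

def incipient_motion(shear_velocity, settling_velocity, critical=0.2):
    """Flag incipient motion when the movability number exceeds a critical
    value; critical=0.2 here is an illustrative placeholder, not the
    paper's fitted threshold."""
    return movability_number(shear_velocity, settling_velocity) > critical

print(movability_number(0.03, 0.1))  # about 0.3
print(incipient_motion(0.03, 0.1))   # True
print(incipient_motion(0.01, 0.1))   # False
```

In practice the critical movability number is itself an empirical function of particle and flow properties, which is what the large experimental data set in the study was used to establish.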
Research on Group Decision-Making Mechanism of Internet Emergency Management
NASA Astrophysics Data System (ADS)
Xie, Kefan; Chen, Gang; Qian, Wu; Shi, Zhao
With the development of information technology, the internet has become ubiquitous, and internet emergencies have a strong influence on people's lives. This article offers a short history of internet emergency management. It discusses the definition, characteristics, and factors of internet emergency management. A group decision-making mechanism for internet emergencies is presented based on this discussion. The authors establish a so-called Rough Set Scenario Flow Graph (RSSFG) of the group decision-making mechanism of internet emergency management and conduct an empirical analysis based on the RSSFG approach. The experimental results confirm that this approach is effective in internet emergency decision-making.
Empirical validation of an agent-based model of wood markets in Switzerland
Hilty, Lorenz M.; Lemm, Renato; Thees, Oliver
2018-01-01
We present an agent-based model of wood markets and show our efforts to validate this model using empirical data from different sources, including interviews, workshops, experiments, and official statistics. Own surveys closed gaps where data was not available. Our approach to model validation used a variety of techniques, including the replication of historical production amounts, prices, and survey results, as well as a historical case study of a large sawmill entering the market and becoming insolvent only a few years later. Validating the model using this case provided additional insights, showing how the model can be used to simulate scenarios of resource availability and resource allocation. We conclude that the outcome of the rigorous validation qualifies the model to simulate scenarios concerning resource availability and allocation in our study region. PMID:29351300
New robust statistical procedures for the polytomous logistic regression models.
Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro
2018-05-17
This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real-life examples are presented to justify the requirement of suitable robust statistical procedures in place of likelihood-based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article is further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.
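The minimum density power divergence estimator can be sketched numerically: for a softmax (baseline-category) logistic model, one minimizes the empirical density power divergence, which reduces to the negative log-likelihood as the tuning parameter α tends to 0. The synthetic data, α = 0.5, and parameterization below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from scipy.optimize import minimize

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def dpd_objective(theta, X, y, n_classes, alpha=0.5):
    """Empirical density power divergence (up to a theta-free constant) for a
    softmax-logistic model; alpha -> 0 recovers the negative log-likelihood,
    i.e. the maximum likelihood estimator."""
    W = theta.reshape(X.shape[1], n_classes - 1)
    logits = np.hstack([X @ W, np.zeros((X.shape[0], 1))])  # last class = baseline
    p = softmax(logits)
    if alpha == 0:
        return -np.log(p[np.arange(len(y)), y]).mean()
    return (np.sum(p ** (1 + alpha), axis=1)
            - (1 + 1 / alpha) * p[np.arange(len(y)), y] ** alpha).mean()

# synthetic 3-class data with a few mislabeled (outlying) observations
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(300), rng.standard_normal(300)])
true_W = np.array([[0.5, -0.5], [2.0, -2.0]])
probs = softmax(np.hstack([X @ true_W, np.zeros((300, 1))]))
y = np.array([rng.choice(3, p=row) for row in probs])
y[:15] = 2  # label contamination the robust fit should downweight

theta0 = np.zeros(X.shape[1] * 2)
fit = minimize(dpd_objective, theta0, args=(X, y, 3, 0.5), method="BFGS")
print(fit.x.round(2))
```

Sweeping α trades efficiency for robustness, which is exactly what the article's data-driven tuning-parameter selection addresses.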
NASA Technical Reports Server (NTRS)
Sebok, Angelia; Wickens, Christopher; Sargent, Robert
2015-01-01
One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience and developing computational models to predict operator performance in complex situations offer potential methods to address this challenge. Two concerns with modeling operator performance are that models need to be realistic and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.
FONAGY, PETER
2003-01-01
The paper discusses the precarious position of psychoanalysis, a therapeutic approach which historically has defined itself by freedom from constraint and counted treatment length not in terms of number of sessions but in terms of years, in today's era of empirically validated treatments and brief structured interventions. The evidence that exists for the effectiveness of psychoanalysis as a treatment for psychological disorder is reviewed. The evidence base is significant and growing, but less than might meet criteria for an empirically based therapy. The author goes on to argue that the absence of evidence may be symptomatic of the epistemic difficulties that psychoanalysis faces in the context of 21st century psychiatry, and examines some of the philosophical problems faced by psychoanalysis as a model of the mind. Finally some changes necessary in order to ensure a future for psychoanalysis and psychoanalytic therapies within psychiatry are suggested. PMID:16946899
LD-SPatt: large deviations statistics for patterns on Markov chains.
Nuel, G
2004-01-01
Statistics on Markov chains are widely used for the study of patterns in biological sequences. Statistics on these models can be computed through several approaches; central limit theorem (CLT) methods producing Gaussian approximations are among the most popular. Unfortunately, in order to assess a pattern of interest, these methods have to deal with tail-distribution events, where the CLT approximation is especially poor. In this paper, we propose a new approach based on large deviations theory to assess pattern statistics. We first recall theoretical results for empirical mean (level 1) as well as empirical distribution (level 2) large deviations on Markov chains. Then, we present applications of these results, focusing on numerical issues. LD-SPatt is the name of GPL software implementing these algorithms. We compare this approach to several existing ones in terms of complexity and reliability and show that the large deviations approximations are more reliable than the Gaussian approximations in absolute values as well as in terms of ranking, and are at least as reliable as compound Poisson approximations. We then discuss some further possible improvements and applications of this new method.
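The tail-distribution problem can be made concrete with a small Monte Carlo experiment: simulate pattern counts in an order-1 Markov chain and compare the empirical tail to the CLT (Gaussian) estimate. This is only an illustration of the motivation, not the LD-SPatt algorithm itself; the transition matrix, pattern, and uniform initial state are arbitrary assumptions.

```python
import numpy as np
from math import erfc, sqrt

def count_pattern(seq, pattern):
    """Count (possibly overlapping) occurrences of pattern in seq."""
    k = len(pattern)
    return sum(1 for i in range(len(seq) - k + 1)
               if tuple(seq[i:i + k]) == pattern)

def simulate_counts(P, n, pattern, n_sims=500, seed=0):
    """Monte Carlo distribution of pattern counts in an order-1 Markov chain
    with a uniform initial state (illustrative choice)."""
    rng = np.random.default_rng(seed)
    m = P.shape[0]
    counts = np.empty(n_sims, dtype=int)
    for s in range(n_sims):
        seq = [int(rng.integers(m))]
        for _ in range(n - 1):
            seq.append(int(rng.choice(m, p=P[seq[-1]])))
        counts[s] = count_pattern(seq, pattern)
    return counts

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])  # two-state transition matrix
counts = simulate_counts(P, n=100, pattern=(0, 1))
mu, sd = counts.mean(), counts.std()

# Gaussian estimate of the tail P(N >= mu + 3*sd) vs the empirical tail;
# in the far tail the Gaussian estimate degrades, which is what motivates
# large-deviation rate functions
t = mu + 3 * sd
gauss_tail = 0.5 * erfc((t - mu) / (sd * sqrt(2)))
emp_tail = (counts >= t).mean()
print(gauss_tail, emp_tail)
```

Large deviations theory replaces the Gaussian tail with an exponential rate that stays accurate far from the mean, which is the regime relevant to pattern significance.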
Roy, Kakoli; Chen, Zhuo Adam; Crawford, Carol A Gotway
2009-11-01
An organization's workforce--or human capital--is its most valuable asset. The 2002 President's Management Agenda emphasizes the importance of strategic human capital management by requiring all federal agencies to improve performance by enhancing personnel and compensation systems. In response to these directives, the Centers for Disease Control and Prevention (CDC) drafted its strategic human capital management plan to ensure that it is aligned strategically to support the agency's mission and its health protection goals. In this article, we explore the personnel economics literature to draw lessons from research studies that can help CDC enhance its human capital management and planning. To do so, we focus on topics that are of practical importance and empirical relevance to CDC's internal workforce and personnel needs with an emphasis on identifying promising research issues or methodological approaches. The personnel economics literature is rich with theoretically sound and empirically rigorous approaches for shaping an evidence-based approach to human capital management that can enhance incentives to attract, retain, and motivate a talented federal public health workforce, thereby promoting the culture of high-performance government.
Empirical modeling of dynamic behaviors of pneumatic artificial muscle actuators.
Wickramatunge, Kanchana Crishan; Leephakpreeda, Thananchai
2013-11-01
Pneumatic Artificial Muscle (PAM) actuators yield muscle-like mechanical actuation with a high force-to-weight ratio, a soft and flexible structure, and adaptable compliance for rehabilitation and prosthetic appliances for the disabled, as well as humanoid robots or machines. The present study develops empirical models of PAM actuators, that is, a PAM coupled with pneumatic control valves, in order to describe their dynamic behaviors for practical control design and usage. Empirical modeling is an efficient approach to computer-based modeling based on observations of real behaviors. The different dynamic behaviors of each PAM actuator are due not only to the structures of the PAM actuators themselves, but also to the variations in their material properties introduced during manufacturing. To overcome these difficulties, the proposed empirical models are derived experimentally from the real physical behaviors of the PAM actuators being implemented. In case studies, the simulated results, which show good agreement with experimental results, demonstrate that the proposed methodology can be applied to describe the dynamic behaviors of real PAM actuators. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
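Empirical modeling of this kind typically amounts to regressing observed actuator outputs on operating conditions. The sketch below fits a linear-in-parameters surrogate F(P, L) to synthetic observations standing in for measured PAM data; the functional form, units, and coefficients are illustrative assumptions, not the study's model.

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic stand-ins for measurements: force F [N] as a function of
# supply pressure P [Pa] and contraction L [m] (illustrative ground truth)
P = rng.uniform(1e5, 5e5, 200)
L = rng.uniform(0.0, 0.05, 200)
F = 1e-3 * P * (1 - L / 0.06) + rng.normal(0, 5.0, 200)

# least-squares fit of an empirical surrogate F ~ a0 + a1*P + a2*L + a3*P*L
X = np.column_stack([np.ones_like(P), P, L, P * L])
coef, *_ = np.linalg.lstsq(X, F, rcond=None)

pred = X @ coef
rmse = np.sqrt(np.mean((pred - F) ** 2))
print(coef.round(4), rmse)
```

A dynamic empirical model, as in the study, would add lagged states (e.g. previous pressure and length) to the regressors so the fitted map captures the actuator's time behavior.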
NASA Astrophysics Data System (ADS)
Réveillet, Marion; Six, Delphine; Vincent, Christian; Rabatel, Antoine; Dumont, Marie; Lafaysse, Matthieu; Morin, Samuel; Vionnet, Vincent; Litt, Maxime
2018-04-01
This study focuses on simulations of the seasonal and annual surface mass balance (SMB) of Saint-Sorlin Glacier (French Alps) for the period 1996-2015 using the detailed SURFEX/ISBA-Crocus snowpack model. The model is forced by SAFRAN meteorological reanalysis data, adjusted with automatic weather station (AWS) measurements to ensure that simulations of all the energy balance components, in particular turbulent fluxes, are accurately represented with respect to the measured energy balance. Results indicate good model performance for the simulation of summer SMB when using meteorological forcing adjusted with in situ measurements. Model performance however strongly decreases without in situ meteorological measurements. The sensitivity of the model to meteorological forcing indicates a strong sensitivity to wind speed, higher than the sensitivity to ice albedo. Compared to an empirical approach, the model exhibited better performance for simulations of snow and firn melting in the accumulation area and similar performance in the ablation area when forced with meteorological data adjusted with nearby AWS measurements. When such measurements were not available close to the glacier, the empirical model performed better. Our results suggest that simulations of the evolution of future mass balance using an energy balance model require very accurate meteorological data. Given the uncertainties in the temporal evolution of the relevant meteorological variables and glacier surface properties in the future, empirical approaches based on temperature and precipitation could be more appropriate for simulations of glaciers in the future.
A Review of Singapore Principals' Leadership Qualities, Styles, and Roles
ERIC Educational Resources Information Center
Ng, David Foo Seong; Nguyen, Dong Thanh; Wong, Benjamin Koon Siak; Choy, William Kim Weng
2015-01-01
Purpose: The purpose of this paper is to present a review of empirical studies on principal leadership in Singapore. It seeks to provide a general picture of Singapore principals' leadership qualities, styles, and roles. Design/methodology/approach: This is a systematic review of empirical studies, using a "bounded" approach with a focus…
Development of Mathematical Literacy: Results of an Empirical Study
ERIC Educational Resources Information Center
Kaiser, Gabriele; Willander, Torben
2005-01-01
In the paper the results of an empirical study, which has evaluated the development of mathematical literacy in an innovative teaching programme, are presented. The theoretical approach of mathematical literacy relies strongly on applications and modelling and the study follows the approach of R. Bybee, who develops a theoretical concept of…
ERIC Educational Resources Information Center
Jackson, Duncan J. R.; Cooper-Thomas, Helena D.; van Gelderen, Marco; Davis, Jane
2010-01-01
Competencies represent an important and popular topic in human resource development. Despite this popularity, a divide exists between practitioner approaches to developmental competency measures and the empirical scrutiny of such approaches. However, the scarce empirical studies on competency measures have begun to bridge this gap. In the present…
Music Preferences and Their Relationship to Behaviors, Beliefs, and Attitudes toward Aggression
ERIC Educational Resources Information Center
Devlin, James M.; Seidel, Steven
2009-01-01
The content of violence within media has increased significantly over the years and has been studied through a diverse range of empirical research. Within these empirical efforts, the area of music violence has been approached only through randomized experiments, which presses the need to explore the alternative…
Lead Slowing-Down Spectrometry Time Spectral Analysis for Spent Fuel Assay: FY11 Status Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulisek, Jonathan A.; Anderson, Kevin K.; Bowyer, Sonya M.
2011-09-30
Developing a method for the accurate, direct, and independent assay of the fissile isotopes in bulk materials (such as used fuel) from next-generation domestic nuclear fuel cycles is a goal of the Office of Nuclear Energy, Fuel Cycle R&D, Material Protection and Control Technology (MPACT) Campaign. To meet this goal, MPACT supports a multi-institutional collaboration, of which PNNL is a part, to study the feasibility of Lead Slowing Down Spectroscopy (LSDS). This technique is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic masses in used fuel with an uncertainty considerably lower than the approximately 10% typical of today's confirmatory assay methods. This document is a progress report for FY2011 PNNL analysis and algorithm development. Progress made by PNNL in FY2011 continues to indicate the promise of LSDS analysis and algorithms applied to used fuel. PNNL developed an empirical model based on calibration of the LSDS to responses generated from well-characterized used fuel. The empirical model accounts for self-shielding effects using empirical basis vectors calculated from the singular value decomposition (SVD) of a matrix containing the true self-shielding functions of the used fuel assembly models. The potential for the direct and independent assay of the sum of the masses of 239Pu and 241Pu to within approximately 3% over a wide used fuel parameter space was demonstrated. Also in FY2011, PNNL continued to develop an analytical model. These efforts included the addition of six more non-fissile absorbers in the analytical shielding function and the treatment of the non-uniformity of the neutron flux across the LSDS assay chamber. A hybrid analytical-empirical approach was developed to determine the mass of total Pu (sum of the masses of 239Pu, 240Pu, and 241Pu), which is an important quantity in safeguards.
Results using this hybrid method were of approximately the same accuracy as those of the pure empirical approach. In addition, the hybrid approach determined total Pu with much better accuracy than the pure analytical approach. In FY2012, PNNL will continue efforts to optimize its empirical model and minimize its reliance on calibration data. In addition, PNNL will continue to develop the analytical model, considering effects such as neutron scattering in the fuel and cladding, as well as neutrons streaming through gaps between fuel pins in the fuel assembly.
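The SVD step described in this report can be illustrated generically: the columns of a matrix hold a family of functions, and the leading left singular vectors form an empirical basis onto which a new function from the same family can be projected. The synthetic exponential curves below stand in for the real self-shielding functions, which are not reproduced here.

```python
# Sketch of building empirical basis vectors by SVD: columns of M are
# (synthetic) self-shielding-like functions; the leading left singular
# vectors give a low-dimensional empirical basis for the whole family.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
# Synthetic family of smooth decaying curves (stand-ins for real functions)
M = np.column_stack([np.exp(-a * x) for a in rng.uniform(0.5, 5.0, size=20)])

U, s, Vt = np.linalg.svd(M, full_matrices=False)
basis = U[:, :3]                 # first 3 empirical basis vectors

f = np.exp(-2.2 * x)             # a new curve from the same family
coeffs = basis.T @ f             # projection coefficients
f_hat = basis @ coeffs           # low-rank reconstruction

rel_err = np.linalg.norm(f - f_hat) / np.linalg.norm(f)
```

Because the singular values of such smooth families decay quickly, a handful of basis vectors typically captures the family accurately, which is what makes the empirical-basis representation efficient.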
Barman, Linda; Silén, Charlotte; Bolander Laksov, Klara
2014-12-01
This paper reports on how teachers within health sciences education translate outcome-based education (OBE) into practice when they design courses. The study is an empirical contribution to the debate about outcome- and competency-based approaches in health sciences education. A qualitative method was used to study how teachers from 14 different study programmes designed courses before and after OBE was implemented. Using an interpretative approach, analysis of documents and interviews was carried out. The findings show that teachers enacted OBE either to design for more competency-oriented teaching-learning, or to further detail knowledge and thus move towards reductionism. Teachers mainly understood the outcome-based framework as useful to support students' learning, although the demand for accountability created tension and became a bureaucratic hindrance to design for development of professional competence. The paper shows variations of how teachers enacted the same outcome-based framework for instructional design. These differences can add a richer understanding of how outcome- or competency-based approaches relate to teaching-learning at a course level.
NASA Technical Reports Server (NTRS)
Ingold, T.; Schmid, B.; Maetzler, C.; Demoulin, P.; Kaempfer, N.
2000-01-01
A Sun photometer (18 channels between 300 and 1024 nm) has been used for measuring the columnar content of atmospheric water vapor (CWV) by solar transmittance measurements in absorption bands with channels centered at 719, 817, and 946 nm. The observable is the band-weighted transmittance function defined by the spectral absorption of water vapor and the spectral features of solar irradiance and system response. The transmittance function is approximated by a three-parameter model. Its parameters are determined from MODTRAN and LBLRTM simulations or empirical approaches using CWV data of a dual-channel microwave radiometer (MWR) or a Fourier transform spectrometer (FTS). Data acquired over a 2-year period during 1996-1998 at two different sites in Switzerland, Bern (560 m above sea level (asl)) and Jungfraujoch (3580 m asl), were compared to MWR, radiosonde (RS), and FTS retrievals. At the low-altitude station with an average CWV amount of 15 mm, the LBLRTM approach (based on recently corrected line intensities) leads to negligible biases at 719 and 946 nm if compared to an average of MWR, RS, and GPS retrievals. However, at 817 nm an overestimate of 2.7 to 4.3 mm (18-29%) remains. At the high-altitude station with an average CWV amount of 1.4 mm, the LBLRTM approaches overestimate the CWV by 1.0, 1.4, and 0.1 mm (58, 76, and 3%) at 719, 817, and 946 nm, compared to the FTS instrument. At the low-altitude station, CWV estimates based on empirical approaches agree with the MWR within 0.4 mm (2.5% of the mean); at the high-altitude site, with a factor of 10 less water vapor, the agreement of the Sun photometer (SPM) with the FTS is 0.0 to 0.2 mm (1 to 9% of the mean CWV there). Sensitivity analyses show that for the conditions met at the two stations, with CWV ranging from 0.2 to 30 mm, the retrieval errors are smallest if the 946 nm channel is used.
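The abstract does not spell out the three-parameter transmittance model, but a commonly used parametric form for water-vapor band transmittance is T_w = exp(-a (m u)^b), with u the CWV and m the air mass; given fitted coefficients, CWV is retrieved by inverting T_w. The sketch below uses that form with invented coefficients purely for illustration.

```python
# Sketch of a parametric band-transmittance model and its inversion to
# retrieve columnar water vapor (CWV). Coefficients A, B are hypothetical,
# channel-dependent values, not those derived in the paper.
import math

A, B = 0.6, 0.55  # hypothetical band coefficients for one channel

def transmittance(cwv_mm, airmass, a=A, b=B):
    """Band-weighted water-vapor transmittance T_w = exp(-a * (m*u)**b)."""
    return math.exp(-a * (airmass * cwv_mm) ** b)

def retrieve_cwv(t_measured, airmass, a=A, b=B):
    """Invert T_w = exp(-a * (m*u)**b) for the water vapor column u."""
    return (-math.log(t_measured) / a) ** (1.0 / b) / airmass

u_true, m = 15.0, 1.5                 # 15 mm CWV observed at airmass 1.5
t = transmittance(u_true, m)          # forward model
u_hat = retrieve_cwv(t, m)            # inversion recovers 15.0 mm
```

In practice the coefficients would be fitted per channel from radiative-transfer simulations (MODTRAN/LBLRTM) or empirically against MWR/FTS data, as the abstract describes.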
Evaluation of chiller modeling approaches and their usability for fault detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sreedharan, Priya
Selecting the model is an important and essential step in model-based fault detection and diagnosis (FDD). Several factors must be considered in model evaluation, including accuracy, training data requirements, calibration effort, generality, and computational requirements. All modeling approaches fall somewhere between pure first-principles models and empirical models. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor compression air conditioning units, commonly known as chillers. Three different models were studied: two based on first principles and a third that is empirical in nature. The first-principles models are the Gordon and Ng Universal Chiller model (2nd generation) and a modified version of the ASHRAE Primary Toolkit model. The DOE-2 chiller model as implemented in CoolTools{trademark} was selected for the empirical category. The models were compared in terms of their ability to reproduce the observed performance of an older chiller operating in a commercial building and a newer chiller in a laboratory. The DOE-2 and Gordon-Ng models were calibrated by linear regression, while a direct-search method was used to calibrate the Toolkit model. The CoolTools package contains a library of calibrated DOE-2 curves for a variety of different chillers and was used to calibrate the building chiller to the DOE-2 model. All three models displayed similar levels of accuracy. Of the first-principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit model may have advantages when refrigerant temperature measurements are also available.
The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.
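The advantage the abstract notes for models that are linear in their parameters is that calibration reduces to ordinary least squares. The sketch below shows that calibration pattern with a generic regressor form and synthetic data; it is not the actual Gordon-Ng or DOE-2 model.

```python
# Sketch of calibrating a model that is linear in its parameters by ordinary
# least squares. The regressors (x1, x2) and coefficients are illustrative
# stand-ins, e.g. temperature proxies in a chiller performance model.
import numpy as np

rng = np.random.default_rng(1)
n = 40
x1 = rng.uniform(275.0, 285.0, n)   # hypothetical evaporator-side temperature
x2 = rng.uniform(295.0, 310.0, n)   # hypothetical condenser-side temperature
true_c = np.array([2.0, -0.01, 0.015])
y = true_c[0] + true_c[1] * x1 + true_c[2] * x2   # noiseless synthetic output

A = np.column_stack([np.ones(n), x1, x2])         # design matrix
c_hat, *_ = np.linalg.lstsq(A, y, rcond=None)     # least-squares calibration
```

Linearity in the parameters also makes standard uncertainty estimates (from the covariance of the least-squares fit) straightforward, which is the robustness advantage cited for the Gordon-Ng model.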
Atay, Christina; Conway, Erin R.; Angus, Daniel; Wiles, Janet; Baker, Rosemary; Chenery, Helen J.
2015-01-01
The progressive neuropathology involved in dementia frequently causes a gradual decline in communication skills. Communication partners who are unaware of the specific communication problems faced by people with dementia (PWD) can inadvertently challenge their conversation partner, leading to distress and a reduced flow of information between speakers. Previous research has produced an extensive literature base recommending strategies to facilitate conversational engagement in dementia. However, empirical evidence for the beneficial effects of these strategies on conversational dynamics is sparse. This study uses a time-efficient computational discourse analysis tool called Discursis to examine the link between specific communication behaviours and content-based conversational engagement in 20 conversations between PWD living in residential aged-care facilities and care staff members. Conversations analysed here were baseline conversations recorded before staff members underwent communication training. Care staff members spontaneously exhibited a wide range of facilitative and non-facilitative communication behaviours, which were coded for analysis of conversation dynamics within these baseline conversations. A hybrid approach combining manual coding and automated Discursis metric analysis provides two sets of novel insights. Firstly, this study revealed nine communication behaviours that, if used by the care staff member in a given turn, significantly increased the appearance of subsequent content-based engagement in the conversation by PWD. Secondly, the current findings reveal alignment between human- and computer-generated labelling of communication behaviour for 8 out of the total 22 behaviours under investigation. The approach demonstrated in this study provides an empirical procedure for the detailed evaluation of content-based conversational engagement associated with specific communication behaviours. PMID:26658135
Identifying Thresholds for Ecosystem-Based Management
Samhouri, Jameal F.; Levin, Phillip S.; Ainsworth, Cameron H.
2010-01-01
Background One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. Methodology/Principal Findings To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically-validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically-tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. Conclusions/Significance For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management. PMID:20126647
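The core idea of a utility threshold, a pressure level where small changes produce large changes in an ecosystem attribute, can be illustrated by locating the point of maximum curvature in a pressure-response curve. The detection rule and data below are a simplified sketch, not the ecosystem-model analysis used in the paper.

```python
# Sketch of nonlinearity detection for a utility threshold: find the pressure
# level where the response curve bends most sharply (largest absolute second
# difference). The response data here are synthetic.

def utility_threshold(pressure, response):
    """Pressure level at the maximum absolute second difference of response."""
    curv = [abs(response[i - 1] - 2 * response[i] + response[i + 1])
            for i in range(1, len(response) - 1)]
    return pressure[1 + curv.index(max(curv))]

# Synthetic saturating response: flat, then a sharp decline beyond pressure 5
pressure = [p * 0.5 for p in range(21)]                       # 0.0 .. 10.0
response = [1.0 if p < 5.0 else max(0.0, 1.0 - (p - 5.0)) for p in pressure]

threshold = utility_threshold(pressure, response)             # -> 5.0
```

Real analyses would add smoothing and uncertainty estimation (e.g., bootstrapping the threshold location), as the abstract notes for incorporating uncertainty.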
Philosophy Pursued through Empirical Research: Introduction to the Special Issue
ERIC Educational Resources Information Center
Wilson, Terri S.; Santoro, Doris A.
2015-01-01
Many scholars have pursued philosophical inquiry through empirical research. These empirical projects have been shaped--to varying degrees and in different ways--by philosophical questions, traditions, frameworks and analytic approaches. This issue explores the methodological challenges and opportunities involved in these kinds of projects. In…
Counselor Training: Empirical Findings and Current Approaches
ERIC Educational Resources Information Center
Buser, Trevor J.
2008-01-01
The literature on counselor training has included attention to cognitive and interpersonal skill development and has reported on empirical findings regarding the relationship of training with client outcomes. This article reviews the literature on each of these topics and discusses empirical and theoretical underpinnings of recently developed…
Fault displacement hazard assessment for nuclear installations based on IAEA safety standards
NASA Astrophysics Data System (ADS)
Fukushima, Y.
2016-12-01
In the IAEA safety standard NS-R-3, surface fault displacement hazard assessment (FDHA) is required for the siting of nuclear installations. If any capable faults exist at a candidate site, IAEA recommends the consideration of alternative sites. However, owing to progress in palaeoseismological investigations, capable faults may be found at existing sites. In such a case, IAEA recommends evaluating the safety using probabilistic FDHA (PFDHA), an empirical approach based on a still quite limited database. A basic and crucial improvement is therefore to enlarge the database. In 2015, IAEA produced TecDoc-1767 on palaeoseismology as a reference for the identification of capable faults. Another publication, IAEA Safety Report 85 on ground motion simulation based on fault rupture modelling, provides an annex introducing recent PFDHAs and fault displacement simulation methodologies. The IAEA expanded the FDHA project to cover both the probabilistic approach and physics-based fault rupture modelling. The first approach needs a refinement of the empirical methods by building a worldwide database, and the second needs to shift from a kinematic to a dynamic scheme. The two approaches can complement each other, since simulated displacements can fill the gaps of a sparse database and geological observations can be used to calibrate the simulations. The IAEA supported a workshop in October 2015 to discuss the existing databases with the aim of creating a common worldwide database, and a consensus on a unified database was reached. The next milestone is to fill the database with as many fault rupture data sets as possible. Another IAEA working group held a workshop in November 2015 to discuss state-of-the-art PFDHA as well as simulation methodologies. The two groups joined a consultancy meeting in February 2016, shared information, identified issues, discussed goals and outputs, and scheduled future meetings. We may now aim at coordinating activities across the whole set of FDHA tasks.
ERIC Educational Resources Information Center
Breyer, F. Jay; Rupp, André A.; Bridgeman, Brent
2017-01-01
In this research report, we present an empirical argument for the use of a contributory scoring approach for the 2-essay writing assessment of the analytical writing section of the "GRE"® test in which human and machine scores are combined for score creation at the task and section levels. The approach was designed to replace a currently…
Exploring work-related issues on corporate sustainability.
Brunoro, C M; Bolis, I; Sznelwar, L I
2015-01-01
In a research project about work-related issues and corporate sustainability conducted in Brazil, the goal was to better understand how work-related issues were addressed in the corporate context. In particular, some specific initiatives serve as guides to organizational decisions, which makes them performance indicators in the context of corporate sustainability. 1) To explore the presence of work-related issues and their origins in the corporate sustainability approach, analyzing a) corporate disclosures; b) sustainability guidelines that are identified as relevant in corporate disclosures; c) documents that are related to sustainable development and also identified as key documents for these guidelines and initiatives. 2) To present the contributions of activity-centered ergonomics and psychodynamics of work to work-related issues in a corporate sustainability approach. An exploratory study based on multiple sources of evidence was performed from 2012 to 2013, including interviews with companies engaged in corporate sustainability and document analysis using the content analysis approach. Work-related issues have been present since the earliest sustainable development documents. It is feasible to construct an empirical framework for work-related issues and corporate sustainability approaches. 1) Although some authors argue that corporate sustainability has its roots in the environmental dimension only, there is strong empirical evidence showing that social dimension aspects such as work-related issues have been present since the beginning. 2) Some indicators should be redesigned to translate the reality of some workplaces more precisely, particularly those related to organizational design and mental health.
Artemiou, Elpida; Adams, Cindy L; Toews, Lorraine; Violato, Claudio; Coe, Jason B
2014-01-01
We determined the Web-based configurations that are applied to teach medical and veterinary communication skills, evaluated their effectiveness, and suggested future educational directions for Web-based communication teaching in veterinary education. We performed a systematic search of CAB Abstracts, MEDLINE, Scopus, and ERIC limited to articles published in English between 2000 and 2012. The review focused on medical or veterinary undergraduate to clinical- or residency-level students. We selected studies for which the study population was randomized to the Web-based learning (WBL) intervention with a post-test comparison with another WBL or non-WBL method and that reported at least one empirical outcome. Two independent reviewers completed relevancy screening, data extraction, and synthesis of results using Kirkpatrick and Kirkpatrick's framework. The search retrieved 1,583 articles, and 10 met the final inclusion criteria. We identified no published articles on Web-based communication platforms in veterinary medicine; however, publications summarized from human medicine demonstrated that WBL provides a potentially reliable and valid approach for teaching and assessing communication skills. Student feedback on the use of virtual patients for teaching clinical communication skills has been positive, though evidence has suggested that practice with virtual patients prompted lower relation-building responses. Empirical outcomes indicate that WBL is a viable method for expanding the approach to teaching history taking and possibly to additional tasks of the veterinary medical interview.
NASA Astrophysics Data System (ADS)
Leopold-Wildburger, Ulrike; Pickl, Stefan
2008-10-01
In our research we use experiments to study human behavior in a simulation environment based on a simple Lotka-Volterra predator-prey ecology. The aim is to study the influence of participants' harvesting strategies and of certain personality traits derived from [1] on the outcome in terms of sustainability and economic performance. This approach is embedded in a research program that aims to develop and understand interactive resource planning processes. We present the general framework as well as the new decision support system EXPOSIM. The key element is the combination of experimental design, analytical understanding of time-discrete systems (especially Lotka-Volterra systems), and economic performance. In the first part, the general role of laboratory experiments is discussed. The second part summarizes the concept of sustainable development, which is taken from [18]. As we use Lotka-Volterra systems as the basis for our simulations, a theoretical framework is described afterwards; it is possible to determine optimal behavior for those systems. In the empirical setting, the subjects are put in the position of a decision-maker: they are able to model the environment in such a way that harvesting can be observed. We suggest an experimental setting that might lead to new insights in an anticipatory sense.
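A time-discrete Lotka-Volterra system with a harvesting decision each step, the kind of simulation environment this abstract describes, can be sketched in a few lines. The parameter values and the fixed harvesting policy below are illustrative, not taken from EXPOSIM.

```python
# Sketch of a time-discrete predator-prey system with harvesting: at each
# step the prey grows logistically, is reduced by predation and by the
# participant's harvest, and the predator responds to prey availability.

def step(prey, pred, harvest, a=1.1, b=0.4, c=0.1, d=0.4):
    """One discrete Lotka-Volterra step with a harvest taken from the prey."""
    new_prey = prey + a * prey * (1 - prey) - b * prey * pred - harvest
    new_pred = pred + c * prey * pred - d * pred
    return max(new_prey, 0.0), max(new_pred, 0.0)   # populations stay >= 0

prey, pred = 0.5, 0.5
trajectory = [(prey, pred)]
for _ in range(20):
    prey, pred = step(prey, pred, harvest=0.02)     # fixed harvesting policy
    trajectory.append((prey, pred))
```

In an experiment, the `harvest` argument would be the participant's decision each round, and sustainability could be scored from whether the trajectory avoids collapse.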
NASA Astrophysics Data System (ADS)
Strauch, R. L.; Istanbulluoglu, E.
2017-12-01
We develop a landslide hazard modeling approach that integrates a data-driven statistical model and a probabilistic process-based shallow landslide model for mapping the probability of landslide initiation, transport, and deposition at regional scales. The empirical model integrates the influence of seven site attribute (SA) classes: elevation, slope, curvature, aspect, land use-land cover, lithology, and topographic wetness index, on over 1,600 observed landslides using a frequency ratio (FR) approach. A susceptibility index is calculated by adding FRs for each SA on a grid-cell basis. Using landslide observations, we relate the susceptibility index to an empirically derived probability of landslide impact. This probability is combined with results from a physically based model to produce an integrated probabilistic map. Slope was key in landslide initiation, while deposition was linked to lithology and elevation. Vegetation transition from forest to alpine vegetation and barren land cover with lower root cohesion leads to a higher frequency of initiation. Aspect effects are likely linked to differences in root cohesion and moisture controlled by solar insolation and snow. We demonstrate the model in the North Cascades of Washington, USA, and identify locations of high and low probability of landslide impacts that can be used by land managers in their design, planning, and maintenance.
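The frequency ratio (FR) calculation named in this abstract is simple to state: for each class of a site attribute, FR is the share of landslides falling in that class divided by the share of total area in that class, and a cell's susceptibility index sums the FRs of its classes. The class breaks and counts below are invented for illustration.

```python
# Sketch of the frequency-ratio (FR) susceptibility calculation: FR > 1 means
# landslides are over-represented in that class relative to its area share.

def frequency_ratio(landslide_counts, area_counts):
    """FR per class: (landslides_in_class / total_landslides)
                     / (area_in_class / total_area)."""
    total_ls = sum(landslide_counts.values())
    total_area = sum(area_counts.values())
    return {cls: (landslide_counts[cls] / total_ls)
                 / (area_counts[cls] / total_area)
            for cls in landslide_counts}

# Hypothetical slope classes (degrees) with landslide counts and cell counts
ls = {"0-15": 10, "15-30": 60, "30-45": 30}
area = {"0-15": 5000, "15-30": 3000, "30-45": 2000}
fr = frequency_ratio(ls, area)
# A grid cell's susceptibility index = sum of the FRs of its attribute classes
```

Here the 15-30 degree class gets FR = 0.6/0.3 = 2.0, flagging it as over-represented; repeating this for each of the seven site attributes and summing per cell yields the susceptibility index the abstract describes.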
Bacterial meningitis - principles of antimicrobial treatment.
Jawień, Miroslaw; Garlicki, Aleksander M
2013-01-01
Bacterial meningitis is associated with significant morbidity and mortality despite the availability of effective antimicrobial therapy. The management approach to patients with suspected or proven bacterial meningitis includes emergent cerebrospinal fluid analysis and initiation of appropriate antimicrobial and adjunctive therapies. The choice of empirical antimicrobial therapy is based on the patient's age and underlying disease status; once the infecting pathogen is isolated, antimicrobial therapy can be modified for optimal treatment. Successful treatment of bacterial meningitis requires the knowledge on epidemiology including prevalence of antimicrobial resistant pathogens, pathogenesis of meningitis, pharmacokinetics and pharmacodynamics of antimicrobial agents. The emergence of antibiotic-resistant bacterial strains in recent years has necessitated the development of new strategies for empiric antimicrobial therapy for bacterial meningitis.
Semi-empirical quantum evaluation of peptide - MHC class II binding
NASA Astrophysics Data System (ADS)
González, Ronald; Suárez, Carlos F.; Bohórquez, Hugo J.; Patarroyo, Manuel A.; Patarroyo, Manuel E.
2017-01-01
Peptide presentation by the major histocompatibility complex (MHC) is a key process for triggering a specific immune response. Studying peptide-MHC (pMHC) binding from a structure-based approach has potential for reducing the costs of investigation into vaccine development. This study used two semi-empirical quantum chemistry methods (PM7 and FMO-DFTB) to compute the binding energies of peptides bound to HLA-DR1 and HLA-DR2. We found that key stabilising water molecules involved in the peptide binding mechanism were required for finding high correlation with IC50 experimental values. Our proposal is computationally non-intensive and is a reliable alternative for studying pMHC binding interactions.
Advances in Landslide Nowcasting: Evaluation of a Global and Regional Modeling Approach
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia Bach; Peters-Lidard, Christa; Adler, Robert; Hong, Yang; Kumar, Sujay; Lerner-Lam, Arthur
2011-01-01
The increasing availability of remotely sensed data offers a new opportunity to address landslide hazard assessment at larger spatial scales. A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that may experience landslide activity. This system combines a calculation of static landslide susceptibility with satellite-derived rainfall estimates and uses a threshold approach to generate a set of nowcasts that classify potentially hazardous areas. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale near real-time landslide hazard assessment efforts, it requires several modifications before it can be fully realized as an operational tool. This study draws upon a prior work's recommendations to develop a new approach for considering landslide susceptibility and hazard at the regional scale. This case study calculates a regional susceptibility map using remotely sensed and in situ information and a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America. The susceptibility map is evaluated with a regional rainfall intensity-duration triggering threshold, and results are compared with the global algorithm framework for the same event. Evaluation of this regional system suggests that this empirically based approach provides one plausible way to address some of the data and resolution issues identified in the global assessment. The presented methodology is straightforward to implement, improves upon the global approach, and allows results to be transferred between regions. The results also highlight several remaining challenges, including the empirical nature of the algorithm framework and adequate information for algorithm validation. Conclusions suggest that integrating additional triggering factors such as soil moisture may help to improve algorithm performance.
The regional algorithm scenario represents an important step forward in advancing regional and global-scale landslide hazard assessment.
Wavelet analysis for wind fields estimation.
Leite, Gladeston C; Ushizima, Daniela M; Medeiros, Fátima N S; de Lima, Gilson G
2010-01-01
Wind field analysis from synthetic aperture radar images allows the estimation of wind direction and speed based on image descriptors. In this paper, we propose a framework to automate wind direction retrieval based on wavelet decomposition associated with spectral processing. We extend existing undecimated wavelet transform approaches, by including à trous with B(3) spline scaling function, in addition to other wavelet bases as Gabor and Mexican-hat. The purpose is to extract more reliable directional information, when wind speed values range from 5 to 10 ms(-1). Using C-band empirical models, associated with the estimated directional information, we calculate local wind speed values and compare our results with QuikSCAT scatterometer data. The proposed approach has potential application in the evaluation of oil spills and wind farms.
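The à trous ("with holes") wavelet decomposition with the B3 spline scaling kernel that this paper extends can be sketched in 1-D; the image case applies the same dilated kernel along rows and columns. The signal below is synthetic, and the sketch shows only the decomposition step, not the spectral wind-direction retrieval.

```python
# Sketch of the a trous wavelet transform with the B3 spline kernel
# [1,4,6,4,1]/16: at each scale the kernel is dilated by 2**scale, the
# wavelet plane is the difference of successive smoothings, and the signal
# is exactly the sum of all planes plus the final smooth residual.

B3 = [1/16, 4/16, 6/16, 4/16, 1/16]

def smooth(signal, scale):
    """Convolve with the B3 kernel dilated by 2**scale (mirror boundaries)."""
    n, step = len(signal), 2 ** scale
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in enumerate(B3):
            j = i + (k - 2) * step
            j = abs(j) if j < 0 else (2 * n - 2 - j if j >= n else j)
            acc += w * signal[j]
        out.append(acc)
    return out

def a_trous(signal, levels):
    """Return wavelet planes w_1..w_levels and the final smooth residual."""
    planes, current = [], list(signal)
    for j in range(levels):
        smoothed = smooth(current, j)
        planes.append([c - s for c, s in zip(current, smoothed)])
        current = smoothed
    return planes, current

sig = [float(i % 8) for i in range(64)]        # synthetic periodic signal
planes, residual = a_trous(sig, 3)
# Exact reconstruction: signal = sum of wavelet planes + residual
recon = [sum(vals) + r for vals, r in zip(zip(*planes), residual)]
```

The exact-reconstruction property (no decimation, redundant planes at each scale) is what makes the à trous transform convenient for extracting directional structure at selected scales.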
Estimating Casualties for Large Earthquakes Worldwide Using an Empirical Approach
Jaiswal, Kishor; Wald, David J.; Hearne, Mike
2009-01-01
We developed an empirical country- and region-specific earthquake vulnerability model to be used as a candidate for post-earthquake fatality estimation by the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is based on past fatal earthquakes (earthquakes causing one or more deaths) in individual countries where at least four fatal earthquakes occurred during the catalog period (since 1973). Because only a few dozen countries have experienced four or more fatal earthquakes since 1973, we propose a new global regionalization scheme based on idealization of countries that are expected to have similar susceptibility to future earthquake losses given the existing building stock, its vulnerability, and other socioeconomic characteristics. The fatality estimates obtained using an empirical country- or region-specific model will be used along with other selected engineering risk-based loss models for generation of automated earthquake alerts. These alerts could potentially benefit the rapid-earthquake-response agencies and governments for better response to reduce earthquake fatalities. Fatality estimates are also useful to stimulate earthquake preparedness planning and disaster mitigation. The proposed model has several advantages as compared with other candidate methods, and the country- or region-specific fatality rates can be readily updated when new data become available.
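PAGER-type empirical fatality models commonly express the fatality rate as a lognormal function of shaking intensity and apply it to the population exposed in each intensity bin. The sketch below uses that general form; the theta/beta values and exposure numbers are invented, not fitted country parameters.

```python
# Sketch of an empirical fatality estimate: a country-specific fatality rate
# as a lognormal function of Modified Mercalli Intensity (MMI), summed over
# population exposure per intensity bin. Parameters here are hypothetical.
import math

def fatality_rate(mmi, theta=12.0, beta=0.2):
    """Lognormal-CDF fatality rate: Phi(ln(MMI/theta) / beta)."""
    return 0.5 * (1.0 + math.erf(math.log(mmi / theta) / (beta * math.sqrt(2))))

def expected_fatalities(exposure_by_mmi):
    """Sum of (exposed population x fatality rate) over intensity bins."""
    return sum(pop * fatality_rate(mmi) for mmi, pop in exposure_by_mmi.items())

exposure = {6: 500_000, 7: 200_000, 8: 50_000, 9: 10_000}  # hypothetical
est = expected_fatalities(exposure)
```

Calibrating theta and beta against the catalog of past fatal earthquakes in each country or region is what makes the model "empirical" in the sense this abstract describes.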
Campos, G W
1998-01-01
This paper describes a new health care management method. A triangular confrontation system was constructed, based on a theoretical review, empirical facts observed from health services, and the researcher's knowledge, jointly analyzed. This new management model was termed 'health-team-focused collegiate management', entailing several original organizational concepts: production unity, matrix-based reference team, collegiate management system, cogovernance, and product/production interface.
Wolfe, C R
2001-02-01
Analogy and metaphor are figurative forms of communication that help people integrate new information with prior knowledge to facilitate comprehension and appropriate inferences. The novelty and versatility of the Web place cognitive burdens on learners that can be overcome through the use of analogies and metaphors. This paper explores three uses of figurative communication as design elements in Web-based learning environments and provides empirical illustrations of each. First, extended analogies can be used as the basis of cover stories that create an analogy between the learner's position and a hypothetical situation. The Dragonfly Web pages make extensive use of analogous cover stories in the design of interactive decision-making games. Feedback from visitors, patterns of usage, and external reviews provide evidence of effectiveness. A second approach is visual analogies based on the principles of ecological psychology. An empirical example suggests that visual analogies are most effective when there is a one-to-one correspondence between the base and visual target analogs. The use of learner-generated analogies is a third approach. Data from an offline study with undergraduate science students are presented indicating that generating analogies is associated with significant improvements in the ability to place events in natural history on a time line. It is concluded that cyberspace itself might form the basis of the next guiding metaphor of mind.
Bridging Empirical and Physical Approaches for Landslide Monitoring and Early Warning
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Kumar, Sujay; Harrison, Ken
2011-01-01
Rainfall-triggered landslides typically occur and are evaluated at local scales, using slope-stability models to calculate coincident changes in driving and resisting forces at the hillslope level in order to anticipate slope failures. Over larger areas, detailed high-resolution landslide modeling is often infeasible due to difficulties in quantifying the complex interaction between rainfall infiltration and surface materials, as well as the dearth of available in situ soil and rainfall estimates and accurate landslide validation data. This presentation will discuss how satellite precipitation and surface information can be applied within a landslide hazard assessment framework to improve landslide monitoring and early warning by considering two disparate approaches to landslide hazard assessment: an empirical landslide forecasting algorithm and a physical slope-stability model. The goal of this research is to advance near real-time landslide hazard assessment and early warning at larger spatial scales. This is done by employing high-resolution surface and precipitation information within a probabilistic framework to provide more physically based grounding to empirical landslide triggering thresholds. The empirical landslide forecasting tool, running in near real-time at http://trmm.nasa.gov, considers potential landslide activity at the global scale and relies on Tropical Rainfall Measuring Mission (TRMM) precipitation data and surface products to provide a near real-time picture of where landslides may be triggered. The physical approach considers how rainfall infiltration on a hillslope affects the in situ hydro-mechanical processes that may lead to slope failure. Evaluation of these empirical and physical approaches is performed within the Land Information System (LIS), a high-performance land surface model processing and data assimilation system developed within the Hydrological Sciences Branch at NASA's Goddard Space Flight Center.
LIS provides the capabilities to quantify uncertainty from model inputs and to calculate probabilistic estimates for slope failures. Results indicate that remote sensing data can provide many of the spatiotemporal requirements for accurate landslide monitoring and early warning; however, higher-resolution precipitation inputs will help to better identify small-scale precipitation forcings that contribute to significant landslide triggering. Future missions, such as the Global Precipitation Measurement (GPM) mission, will provide more frequent and extensive estimates of precipitation at the global scale, which will serve as key inputs to significantly advance the accuracy of landslide hazard assessment, particularly over larger spatial scales.
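The empirical triggering thresholds mentioned in the abstract above are commonly expressed as a power-law relation between rainfall intensity and storm duration. The sketch below illustrates that general idea only; the function name and coefficient values are illustrative assumptions, not parameters of the TRMM-based algorithm described in the abstract.

```python
# Minimal sketch of an empirical rainfall intensity-duration (I-D)
# landslide-triggering threshold of the common power-law form
# I = a * D**b, with illustrative placeholder coefficients.

def exceeds_threshold(intensity_mm_hr: float, duration_hr: float,
                      a: float = 10.0, b: float = -0.5) -> bool:
    """Return True if the observed mean rainfall intensity exceeds the
    empirical triggering threshold for a storm of the given duration."""
    threshold = a * duration_hr ** b  # threshold intensity in mm/hr
    return intensity_mm_hr > threshold

# A 4-hour storm at 6 mm/hr vs. a threshold of 10 * 4**-0.5 = 5 mm/hr
print(exceeds_threshold(6.0, 4.0))  # True
```

A probabilistic framework such as the one described above would replace the single fixed coefficients with distributions and report an exceedance probability rather than a binary flag.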