Sample records for empirically pre-defined statistical

  1. An Empirical Study of Presage Variables in the Teaching-Learning of Statistics, in the Light of Research on Competencies

    ERIC Educational Resources Information Center

    Rodriguez, Clemente; Gutierrez-Perez, Jose; Pozo, Teresa

    2010-01-01

    Introduction: This research seeks to determine the influence exercised by a set of presage and process variables (students' pre-existing opinion towards statistics, their dedication to mastery of statistics content, assessment of the teaching materials, and the teacher's effort in the teaching of statistics) on students' resolution of activities…

  2. The Thurgood Marshall School of Law Empirical Findings: A Report of the Statistical Analysis of the July 2010 TMSL Texas Bar Results

    ERIC Educational Resources Information Center

    Kadhi, Tau; Holley, D.

    2010-01-01

    The following report gives the statistical findings of the July 2010 TMSL Bar results. Procedures: Data is pre-existing and was given to the Evaluator by email from the Registrar and Dean. Statistical analyses were run using SPSS 17 to address the following research questions: 1. What are the statistical descriptors of the July 2010 overall TMSL…

  3. Meta-Analysis and Cost Comparison of Empirical versus Pre-Emptive Antifungal Strategies in Hematologic Malignancy Patients with High-Risk Febrile Neutropenia.

    PubMed

    Fung, Monica; Kim, Jane; Marty, Francisco M; Schwarzinger, Michaël; Koo, Sophia

    2015-01-01

    Invasive fungal disease (IFD) causes significant morbidity and mortality in hematologic malignancy patients with high-risk febrile neutropenia (FN). These patients therefore often receive empirical antifungal therapy. Diagnostic test-guided pre-emptive antifungal therapy has been evaluated as an alternative treatment strategy in these patients. We conducted an electronic search for literature comparing empirical versus pre-emptive antifungal strategies in FN among adult hematologic malignancy patients. We systematically reviewed 9 studies, including randomized-controlled trials, cohort studies, and feasibility studies. Random- and fixed-effects models were used to generate pooled relative risk estimates of IFD detection, IFD-related mortality, overall mortality, and rates and duration of antifungal therapy. Heterogeneity was measured via Cochran's Q test, the I² statistic, and the between-study τ². Incorporating these parameters and direct costs of drugs and diagnostic testing, we constructed a comparative costing model for the two strategies. We conducted probabilistic sensitivity analysis on pooled estimates and one-way sensitivity analyses on other key parameters with uncertain estimates. Nine published studies met inclusion criteria. Compared to empirical antifungal therapy, pre-emptive strategies were associated with significantly lower antifungal exposure (RR 0.48, 95% CI 0.27-0.85) and duration, without an increase in IFD-related mortality (RR 0.82, 95% CI 0.36-1.87) or overall mortality (RR 0.95, 95% CI 0.46-1.99). The pre-emptive strategy cost $324 less (95% credible interval -$291.88 to $418.65 pre-emptive compared to empirical) than the empirical approach per FN episode. However, the cost difference was influenced by relatively small changes in costs of antifungal therapy and diagnostic testing. Compared to empirical antifungal therapy, pre-emptive antifungal therapy in patients with high-risk FN may decrease antifungal use without increasing mortality. We demonstrate a state of economic equipoise between empirical and diagnostic-directed pre-emptive antifungal treatment strategies, influenced by small changes in cost of antifungal therapy and diagnostic testing, in the current literature. This work emphasizes the need for optimization of existing fungal diagnostic strategies, development of more efficient diagnostic strategies, and less toxic and more cost-effective antifungals.
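
    As a rough illustration of the pooling and heterogeneity calculations named in this abstract (inverse-variance pooled relative risks, Cochran's Q, I²), here is a minimal Python sketch. The per-study relative risks and confidence intervals are hypothetical placeholders, not the nine studies analysed in the paper, and the fixed-effect pooling shown is only one of the two model types the authors used.

    ```python
    # A minimal sketch of inverse-variance (fixed-effect) pooling of relative risks with
    # Cochran's Q and the I² heterogeneity statistic. The per-study numbers are
    # hypothetical placeholders, not the paper's data.
    import numpy as np

    rr    = np.array([0.45, 0.60, 0.40])   # hypothetical study-level relative risks
    ci_lo = np.array([0.25, 0.35, 0.20])   # hypothetical lower 95% CI bounds
    ci_hi = np.array([0.81, 1.03, 0.80])   # hypothetical upper 95% CI bounds

    log_rr = np.log(rr)
    se     = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SE of log-RR from CI width
    w      = 1.0 / se**2                                     # inverse-variance weights

    pooled_log = np.sum(w * log_rr) / np.sum(w)
    pooled_se  = np.sqrt(1.0 / np.sum(w))
    ci = np.exp(pooled_log + np.array([-1.96, 1.96]) * pooled_se)

    Q  = np.sum(w * (log_rr - pooled_log) ** 2)              # Cochran's Q
    df = len(rr) - 1
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0      # I² in percent

    print(f"pooled RR {np.exp(pooled_log):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), "
          f"Q = {Q:.2f}, I² = {I2:.1f}%")
    ```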

  4. The Thurgood Marshall School of Law Empirical Findings: A Report of the Statistical Analysis of the February 2010 TMSL Texas Bar Results

    ERIC Educational Resources Information Center

    Kadhi, T.; Holley, D.; Rudley, D.; Garrison, P.; Green, T.

    2010-01-01

    The following report gives the statistical findings of the 2010 Thurgood Marshall School of Law (TMSL) Texas Bar results. This data was pre-existing and was given to the Evaluator by email from the Dean. Then, in-depth statistical analyses were run using SPSS 17 to address the following questions: 1. What are the statistical descriptors of the…

  5. The Thurgood Marshall School of Law Empirical Findings: A Report of the Watson-Glaser for the 2009-2010 Test Takers

    ERIC Educational Resources Information Center

    Kadhi, T.; Palasota, A.; Holley, D.; Rudley, D.

    2010-01-01

    The following report gives the statistical findings of the 2009-2010 Watson-Glaser test. Data is pre-existing and was given to the Evaluator by email from the Director, Center for Legal Pedagogy. Statistical analyses were run using SPSS 17 to address the following questions: 1. What are the statistical descriptors of the Watson-Glaser results of…

  6. Evaluation of Regression Models of Balance Calibration Data Using an Empirical Criterion

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert; Volden, Thomas R.

    2012-01-01

    An empirical criterion for assessing the significance of individual terms of regression models of wind tunnel strain gage balance outputs is evaluated. The criterion is based on the percent contribution of a regression model term. It considers a term to be significant if its percent contribution exceeds the empirical threshold of 0.05%. The criterion has the advantage that it can easily be computed using the regression coefficients of the gage outputs and the load capacities of the balance. First, a definition of the empirical criterion is provided. Then, it is compared with an alternate statistical criterion that is widely used in regression analysis. Finally, calibration data sets from a variety of balances are used to illustrate the connection between the empirical and the statistical criterion. A review of these results indicated that the empirical criterion seems to be suitable for a crude assessment of the significance of a regression model term, because the boundary between a significant and an insignificant term cannot be defined very well. Therefore, regression model term reduction should only be performed by using the more universally applicable statistical criterion.

  7. An empirical approach to sufficient similarity in dose-responsiveness: Utilization of statistical distance as a similarity measure.

    EPA Science Inventory

    Using statistical equivalence testing logic and mixed model theory an approach has been developed, that extends the work of Stork et al (JABES,2008), to define sufficient similarity in dose-response for chemical mixtures containing the same chemicals with different ratios ...

  8. Data mining of tree-based models to analyze freeway accident frequency.

    PubMed

    Chang, Li-Yen; Chen, Wen-Chieh

    2005-01-01

    Statistical models, such as Poisson or negative binomial regression models, have been employed to analyze vehicle accident frequency for many years. However, these models have their own model assumptions and pre-defined underlying relationship between dependent and independent variables. If these assumptions are violated, the model could lead to erroneous estimation of accident likelihood. Classification and Regression Tree (CART), one of the most widely applied data mining techniques, has been commonly employed in business administration, industry, and engineering. CART does not require any pre-defined underlying relationship between target (dependent) variable and predictors (independent variables) and has been shown to be a powerful tool, particularly for dealing with prediction and classification problems. This study collected the 2001-2002 accident data of National Freeway 1 in Taiwan. A CART model and a negative binomial regression model were developed to establish the empirical relationship between traffic accidents and highway geometric variables, traffic characteristics, and environmental factors. The CART findings indicated that the average daily traffic volume and precipitation variables were the key determinants for freeway accident frequencies. By comparing the prediction performance between the CART and the negative binomial regression models, this study demonstrates that CART is a good alternative method for analyzing freeway accident frequencies.
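
    For readers unfamiliar with the two model families being compared, the following sketch fits a regression tree and a negative binomial GLM to synthetic count data. The predictors (average daily traffic, precipitation) and their coefficients are assumptions chosen only to mimic the flavour of the study; this is not the Taiwan National Freeway 1 dataset or the authors' model specification.

    ```python
    # Illustrative comparison: a CART-style regression tree with no pre-defined functional
    # form versus a negative binomial GLM with a log-linear form, on synthetic count data.
    import numpy as np
    import statsmodels.api as sm
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    n = 500
    adt    = rng.uniform(10_000, 120_000, n)          # hypothetical average daily traffic
    precip = rng.gamma(2.0, 2.0, n)                   # hypothetical precipitation
    mu = np.exp(-3.0 + 2.5e-5 * adt + 0.05 * precip)  # assumed "true" accident rate
    accidents = rng.poisson(mu)                       # synthetic accident counts

    X = np.column_stack([adt, precip])

    # Negative binomial regression: assumes a pre-defined log-linear relationship
    nb = sm.GLM(accidents, sm.add_constant(X),
                family=sm.families.NegativeBinomial()).fit()

    # CART-style regression tree: learns splits without a pre-defined functional form
    tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=25).fit(X, accidents)

    print(nb.params)                   # fitted intercept and slopes
    print(tree.feature_importances_)   # which predictor dominates the splits
    ```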

  9. Balancing geo-privacy and spatial patterns in epidemiological studies.

    PubMed

    Chen, Chien-Chou; Chuang, Jen-Hsiang; Wang, Da-Wei; Wang, Chien-Min; Lin, Bo-Cheng; Chan, Ta-Chien

    2017-11-08

    To balance the protection of geo-privacy and the accuracy of spatial patterns, we developed a geo-spatial tool (GeoMasker) intended to mask the residential locations of patients or cases in a geographic information system (GIS). To elucidate the effects of geo-masking parameters, we applied 2010 dengue epidemic data from Taiwan, testing the tool's performance in an empirical situation. The similarity of pre- and post-spatial patterns was measured by D statistics under a 95% confidence interval. In the empirical study, different magnitudes of anonymisation (estimated K-anonymity ≥10 and ≥100) were achieved and different degrees of agreement on the pre- and post-patterns were evaluated. The application is beneficial for public health workers and researchers when processing data with individuals' spatial information.

  10. The Thurgood Marshall School of Law Empirical Findings: A Report of the Relationship between Graduate GPAs and First-Time Texas Bar Scores of February 2010 and July 2009

    ERIC Educational Resources Information Center

    Kadhi, T.; Holley, D.; Palasota, A.

    2010-01-01

    The following report gives descriptive and correlational statistical findings of the Grade Point Averages (GPAs) of the February 2010 and July 2009 TMSL First Time Texas Bar Test Takers to their TMSL Final GPA. Data was pre-existing and was given to the Evaluator by email from the Dean and Registrar. Statistical analyses were run using SPSS 17 to…

  11. The Thurgood Marshall School of Law Empirical Findings: A Report of the 2012 Friday Academy Attendance and Statistical Comparisons of 1L GPA (Predicted and Actual)

    ERIC Educational Resources Information Center

    Kadhi, T.; Rudley, D.; Holley, D.; Krishna, K.; Ogolla, C.; Rene, E.; Green, T.

    2010-01-01

    The following report of descriptive statistics addresses the attendance of the 2012 class and the average Actual and Predicted 1L Grade Point Averages (GPAs). Correlational and Inferential statistics are also run on the variables of Attendance (Y/N), Attendance Number of Times, Actual GPA, and Predictive GPA (Predictive GPA is defined as the Index…

  12. Evaluating the effect of intraoperative peritoneal lavage on bacterial culture in dogs with suspected septic peritonitis.

    PubMed

    Swayne, Seanna L; Brisson, Brigitte; Weese, J Scott; Sears, William

    2012-09-01

    This pilot study describes the effect of intraoperative peritoneal lavage (IOPL) on bacterial counts and outcome in clinical cases of septic peritonitis. Intraoperative samples were cultured before and after IOPL. Thirty-three dogs with presumed septic peritonitis on the basis of cytology were managed surgically during the study period. Positive pre-lavage bacterial cultures were found in 14 cases, 13 of which were a result of intestinal leakage. The post-lavage cultures showed fewer isolates in 9 cases and in 1 case became negative. The number of dogs with a decrease in the concentration of bacteria cultured from pre-lavage to post-lavage samples was not statistically significant. There was no significant effect of the change in pre- to post-lavage culture, single versus multiple types of bacteria, selection of an appropriate empiric antimicrobial on survival or the need for subsequent surgery.

  13. Content Analysis of Chemistry Curricula in Germany Case Study: Chemical Reactions

    ERIC Educational Resources Information Center

    Timofte, Roxana S.

    2015-01-01

    Curriculum-assessment alignment is a well known foundation for good practice in educational assessment, for items' curricular validity purposes. Nowadays instruments are designed to measure pupils' competencies in one or more areas of competence. Sub-competence areas could be defined theoretically and statistical analysis of empirical data by…

  14. Risk and utility in portfolio optimization

    NASA Astrophysics Data System (ADS)

    Cohen, Morrel H.; Natoli, Vincent D.

    2003-06-01

    Modern portfolio theory (MPT) addresses the problem of determining the optimum allocation of investment resources among a set of candidate assets. In the original mean-variance approach of Markowitz, volatility is taken as a proxy for risk, conflating uncertainty with risk. There have been many subsequent attempts to alleviate that weakness which, typically, combine utility and risk. We present here a modification of MPT based on the inclusion of separate risk and utility criteria. We define risk as the probability of failure to meet a pre-established investment goal. We define utility as the expectation of a utility function with positive and decreasing marginal value as a function of yield. The emphasis throughout is on long investment horizons for which risk-free assets do not exist. Analytic results are presented for a Gaussian probability distribution. Risk-utility relations are explored via empirical stock-price data, and an illustrative portfolio is optimized using the empirical data.
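
    The risk definition quoted in this abstract (probability of failing to meet a pre-established investment goal) has a simple closed form under the Gaussian assumption the authors mention. The sketch below evaluates it for hypothetical parameter values; the numbers are illustrative, not taken from the paper.

    ```python
    # A minimal sketch: risk as the probability of failing to meet a pre-established goal,
    # evaluated in closed form for a Gaussian model of cumulative log-return. All parameter
    # values are hypothetical.
    import numpy as np
    from scipy.stats import norm

    mu, sigma = 0.07, 0.18      # assumed annual mean and volatility of portfolio log-return
    horizon   = 20              # investment horizon in years
    goal      = 1.0             # target cumulative log-return (about 2.7x growth)

    mean_T = mu * horizon
    std_T  = sigma * np.sqrt(horizon)
    risk   = norm.cdf((goal - mean_T) / std_T)   # P(cumulative log-return < goal)
    print(f"probability of failing to meet the goal: {risk:.1%}")
    ```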

  15. Exploring the relationship between time management skills and the academic achievement of African engineering students - a case study

    NASA Astrophysics Data System (ADS)

    Swart, Arthur James; Lombard, Kobus; de Jager, Henk

    2010-03-01

    Poor academic success by African engineering students is currently experienced in many higher educational institutions, contributing to lower financial subsidies by local governments. One of the contributing factors to this low academic success may be the poor time management skills of these students. This article endeavours to explore this relationship by means of a theoretical literature review and an empirical study. Numerous studies have been conducted in this regard, but with mixed results. The case study of this article involves a design module termed Design Projects III, where the empirical study incorporated an ex post facto study involving a pre-experimental/exploratory design using descriptive statistics. The results of this study were subjected to various statistical tests, which indicated no statistically significant relationship between time management skills and the academic achievement of African engineering students.

  16. Modeling noisy resonant system response

    NASA Astrophysics Data System (ADS)

    Weber, Patrick Thomas; Walrath, David Edwin

    2017-02-01

    In this paper, a theory-based model replicating empirical acoustic resonant signals is presented and studied to understand sources of noise present in acoustic signals. Statistical properties of empirical signals are quantified and a noise amplitude parameter, which models frequency and amplitude-based noise, is created, defined, and presented. This theory-driven model isolates each phenomenon and allows for parameters to be independently studied. Using seven independent degrees of freedom, this model will accurately reproduce qualitative and quantitative properties measured from laboratory data. Results are presented and demonstrate success in replicating qualitative and quantitative properties of experimental data.

  17. An Application of Structural Equation Modeling for Developing Good Teaching Characteristics Ontology

    ERIC Educational Resources Information Center

    Phiakoksong, Somjin; Niwattanakul, Suphakit; Angskun, Thara

    2013-01-01

    Ontology is a knowledge representation technique which aims to make knowledge explicit by defining the core concepts and their relationships. The Structural Equation Modeling (SEM) is a statistical technique which aims to explore the core factors from empirical data and estimates the relationship between these factors. This article presents an…

  18. Application of Fuzzy Reasoning for Filtering and Enhancement of Ultrasonic Images

    NASA Technical Reports Server (NTRS)

    Sacha, J. P.; Cios, K. J.; Roth, D. J.; Berke, L.; Vary, A.

    1994-01-01

    This paper presents a new type of an adaptive fuzzy operator for detection of isolated abnormalities, and enhancement of raw ultrasonic images. Fuzzy sets used in decision rules are defined for each image based on empirical statistics of the color intensities. Examples of the method are also presented in the paper.

  19. Modeling stock price dynamics by continuum percolation system and relevant complex systems analysis

    NASA Astrophysics Data System (ADS)

    Xiao, Di; Wang, Jun

    2012-10-01

    In this work, the continuum percolation system is developed to model a random stock price process. Recent empirical research has demonstrated various statistical features of stock price changes; a financial model aiming to understand price fluctuations needs to define a mechanism for the formation of the price, in an attempt to reproduce and explain this set of empirical facts. The continuum percolation model is usually referred to as a random coverage process or a Boolean model; the local interaction or influence among traders is constructed by the continuum percolation, and a cluster of continuum percolation is applied to define the cluster of traders sharing the same opinion about the market. We investigate and analyze the statistical behaviors of normalized returns of the price model by several analysis methods, including power-law tail distribution analysis, chaotic behavior analysis and Zipf analysis. Moreover, we consider the daily returns of the Shanghai Stock Exchange Composite Index from January 1997 to July 2011, and comparisons of return behaviors between the actual data and the simulation data are exhibited.

  20. An empirical evaluation of genetic distance statistics using microsatellite data from bear (Ursidae) populations.

    PubMed

    Paetkau, D; Waits, L P; Clarkson, P L; Craighead, L; Strobeck, C

    1997-12-01

    A large microsatellite data set from three species of bear (Ursidae) was used to empirically test the performance of six genetic distance measures in resolving relationships at a variety of scales ranging from adjacent areas in a continuous distribution to species that diverged several million years ago. At the finest scale, while some distance measures performed extremely well, statistics developed specifically to accommodate the mutational processes of microsatellites performed relatively poorly, presumably because of the relatively higher variance of these statistics. At the other extreme, no statistic was able to resolve the close sister relationship of polar bears and brown bears from more distantly related pairs of species. This failure is most likely due to constraints on allele distributions at microsatellite loci. At intermediate scales, both within continuous distributions and in comparisons to insular populations of late Pleistocene origin, it was not possible to define the point where linearity was lost for each of the statistics, except that it is clearly lost after relatively short periods of independent evolution. All of the statistics were affected by the amount of genetic diversity within the populations being compared, significantly complicating the interpretation of genetic distance data.

  1. An Empirical Evaluation of Genetic Distance Statistics Using Microsatellite Data from Bear (Ursidae) Populations

    PubMed Central

    Paetkau, D.; Waits, L. P.; Clarkson, P. L.; Craighead, L.; Strobeck, C.

    1997-01-01

    A large microsatellite data set from three species of bear (Ursidae) was used to empirically test the performance of six genetic distance measures in resolving relationships at a variety of scales ranging from adjacent areas in a continuous distribution to species that diverged several million years ago. At the finest scale, while some distance measures performed extremely well, statistics developed specifically to accommodate the mutational processes of microsatellites performed relatively poorly, presumably because of the relatively higher variance of these statistics. At the other extreme, no statistic was able to resolve the close sister relationship of polar bears and brown bears from more distantly related pairs of species. This failure is most likely due to constraints on allele distributions at microsatellite loci. At intermediate scales, both within continuous distributions and in comparisons to insular populations of late Pleistocene origin, it was not possible to define the point where linearity was lost for each of the statistics, except that it is clearly lost after relatively short periods of independent evolution. All of the statistics were affected by the amount of genetic diversity within the populations being compared, significantly complicating the interpretation of genetic distance data. PMID:9409849

  2. Worrying trends in econophysics

    NASA Astrophysics Data System (ADS)

    Gallegati, Mauro; Keen, Steve; Lux, Thomas; Ormerod, Paul

    2006-10-01

    Econophysics has already made a number of important empirical contributions to our understanding of the social and economic world. These fall mainly into the areas of finance and industrial economics, where in each case there is a large amount of reasonably well-defined data. More recently, Econophysics has also begun to tackle other areas of economics where data is much more sparse and much less reliable. In addition, econophysicists have attempted to apply the theoretical approach of statistical physics to try to understand empirical findings. Our concerns are fourfold. First, a lack of awareness of work that has been done within economics itself. Second, resistance to more rigorous and robust statistical methodology. Third, the belief that universal empirical regularities can be found in many areas of economic activity. Fourth, the theoretical models which are being used to explain empirical phenomena. The latter point is of particular concern. Essentially, the models are based upon models of statistical physics in which energy is conserved in exchange processes. There are examples in economics where the principle of conservation may be a reasonable approximation to reality, such as primitive hunter-gatherer societies. But in the industrialised capitalist economies, income is most definitely not conserved. The process of production and not exchange is responsible for this. Models which focus purely on exchange and not on production cannot by definition offer a realistic description of the generation of income in the capitalist, industrialised economies.

  3. Statistical wave climate projections for coastal impact assessments

    NASA Astrophysics Data System (ADS)

    Camus, P.; Losada, I. J.; Izaguirre, C.; Espejo, A.; Menéndez, M.; Pérez, J.

    2017-09-01

    Global multimodel wave climate projections are obtained at 1.0° × 1.0° scale from 30 Coupled Model Intercomparison Project Phase 5 (CMIP5) global circulation model (GCM) realizations. A semi-supervised weather-typing approach based on a characterization of the ocean wave generation areas and the historical wave information from the recent GOW2 database are used to train the statistical model. This framework is also applied to obtain high resolution projections of coastal wave climate and coastal impacts such as port operability and coastal flooding. Regional projections are estimated using the collection of weather types at spacing of 1.0°. This assumption is feasible because the predictor is defined based on the wave generation area and the classification is guided by the local wave climate. The assessment of future changes in coastal impacts is based on direct downscaling of indicators defined by empirical formulations (total water level for coastal flooding and number of hours per year with overtopping for port operability). Global multimodel projections of the significant wave height and peak period are consistent with changes obtained in previous studies. Statistical confidence of expected changes is obtained due to the large number of GCMs used to construct the ensemble. The proposed methodology is shown to be flexible enough to project wave climate at different spatial scales. Regional changes in additional variables such as wave direction or other statistics can be estimated from the future empirical distribution with extreme values restricted to high percentiles (i.e., 95th, 99th percentiles). The statistical framework can also be applied to evaluate regional coastal impacts integrating changes in storminess and sea level rise.

  4. Heterogeneity in chronic fatigue syndrome - empirically defined subgroups from the PACE trial.

    PubMed

    Williams, T E; Chalder, T; Sharpe, M; White, P D

    2017-06-01

    Chronic fatigue syndrome is likely to be a heterogeneous condition. Previous studies have empirically defined subgroups using combinations of clinical and biological variables. We aimed to explore the heterogeneity of chronic fatigue syndrome. We used baseline data from the PACE trial, which included 640 participants with chronic fatigue syndrome. Variable reduction, using a combination of clinical knowledge and principal component analyses, produced a final dataset of 26 variables for 541 patients. Latent class analysis was then used to empirically define subgroups. The most statistically significant and clinically recognizable model comprised five subgroups. The largest, 'core' subgroup (33% of participants), had relatively low scores across all domains and good self-efficacy. A further three subgroups were defined by: the presence of mood disorders (21%); the presence of features of other functional somatic syndromes (such as fibromyalgia or irritable bowel syndrome) (21%); or by many symptoms - a group which combined features of both of the above (14%). The smallest 'avoidant-inactive' subgroup was characterized by physical inactivity, belief that symptoms were entirely physical in nature, and fear that they indicated harm (11%). Differences in the severity of fatigue and disability provided some discriminative validation of the subgroups. In addition to providing further evidence for the heterogeneity of chronic fatigue syndrome, the subgroups identified may aid future research into the important aetiological factors of specific subtypes of chronic fatigue syndrome and the development of more personalized treatment approaches.

  5. Impact of orbit modeling on DORIS station position and Earth rotation estimates

    NASA Astrophysics Data System (ADS)

    Štěpánek, Petr; Rodriguez-Solano, Carlos Javier; Hugentobler, Urs; Filler, Vratislav

    2014-04-01

    The high precision of estimated station coordinates and Earth rotation parameters (ERP) obtained from satellite geodetic techniques is based on the precise determination of the satellite orbit. This paper focuses on the analysis of the impact of different orbit parameterizations on the accuracy of station coordinates and the ERPs derived from DORIS observations. In a series of experiments the DORIS data from the complete year 2011 were processed with different orbit model settings. First, the impact of precise modeling of the non-conservative forces on geodetic parameters was compared with results obtained with an empirical-stochastic modeling approach. Second, the temporal spacing of drag scaling parameters was tested. Third, the impact of estimating once-per-revolution harmonic accelerations in cross-track direction was analyzed. And fourth, two different approaches for solar radiation pressure (SRP) handling were compared, namely adjusting SRP scaling parameter or fixing it on pre-defined values. Our analyses confirm that the empirical-stochastic orbit modeling approach, which does not require satellite attitude information and macro models, results for most of the monitored station parameters in comparable accuracy as the dynamical model that employs precise non-conservative force modeling. However, the dynamical orbit model leads to a reduction of the RMS values for the estimated rotation pole coordinates by 17% for x-pole and 12% for y-pole. The experiments show that adjusting atmospheric drag scaling parameters each 30 min is appropriate for DORIS solutions. Moreover, it was shown that the adjustment of cross-track once-per-revolution empirical parameter increases the RMS of the estimated Earth rotation pole coordinates. With recent data it was however not possible to confirm the previously known high annual variation in the estimated geocenter z-translation series as well as its mitigation by fixing the SRP parameters on pre-defined values.

  6. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    PubMed

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  7. Statistical Power for the Comparative Regression Discontinuity Design With a Pretest No-Treatment Control Function: Theory and Evidence From the National Head Start Impact Study.

    PubMed

    Tang, Yang; Cook, Thomas D

    2018-01-01

    The basic regression discontinuity design (RDD) has less statistical power than a randomized control trial (RCT) with the same sample size. Adding a no-treatment comparison function to the basic RDD creates a comparative RDD (CRD); and when this function comes from the pretest value of the study outcome, a CRD-Pre design results. We use a within-study comparison (WSC) to examine the power of CRD-Pre relative to both basic RDD and RCT. We first build the theoretical foundation for power in CRD-Pre, then derive the relevant variance formulae, and finally compare them to the theoretical RCT variance. We conclude from this theoretical part of this article that (1) CRD-Pre's power gain depends on the partial correlation between the pretest and posttest measures after conditioning on the assignment variable, (2) CRD-Pre is less responsive than basic RDD to how the assignment variable is distributed and where the cutoff is located, and (3) under a variety of conditions, the efficiency of CRD-Pre is very close to that of the RCT. Data from the National Head Start Impact Study are then used to construct RCT, RDD, and CRD-Pre designs and to compare their power. The empirical results indicate (1) a high level of correspondence between the predicted and obtained power results for RDD and CRD-Pre relative to the RCT, and (2) power levels in CRD-Pre and RCT that are very close. The study is unique among WSCs for its focus on the correspondence between RCT and observational study standard errors rather than means.

  8. Empirically defining rapid response to intensive treatment to maximize prognostic utility for bulimia nervosa and purging disorder.

    PubMed

    MacDonald, Danielle E; Trottier, Kathryn; McFarlane, Traci; Olmsted, Marion P

    2015-05-01

    Rapid response (RR) to eating disorder treatment has been reliably identified as a predictor of post-treatment and sustained remission, but its definition has varied widely. Although signal detection methods have been used to empirically define RR thresholds in outpatient settings, RR to intensive treatment has not been investigated. This study investigated the optimal definition of RR to day hospital treatment for bulimia nervosa and purging disorder. Participants were 158 patients who completed ≥6 weeks of day hospital treatment. Receiver operating characteristic (ROC) analysis was used to create four definitions of RR that could differentiate between remission and nonremission at the end of treatment. Definitions were based on binge/vomit episode frequency or percent reduction from pre-treatment, during either the first four or first two weeks of treatment. All definitions were associated with higher remission rates in rapid compared to nonrapid responders. Only one definition (i.e., ≤3 episodes in the first four weeks of treatment) predicted sustained remission (versus relapse) at 6- and 12-month follow-up. These findings provide an empirically derived definition of RR to intensive eating disorder treatment, and provide further evidence that early change is an important prognostic indicator. Copyright © 2015 Elsevier Ltd. All rights reserved.
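
    The ROC step described in this abstract can be illustrated with a short sketch that derives a cut-point from synthetic data. The arrays are hypothetical, and choosing the threshold by Youden's J is an assumption for illustration, not necessarily the criterion used in the paper.

    ```python
    # A sketch of defining a rapid-response cut-point empirically from an ROC curve.
    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(1)
    n = 158
    episodes_4wk = rng.poisson(4, n)   # hypothetical binge/vomit episodes in first 4 weeks
    # hypothetical end-of-treatment remission, loosely tied to early change
    remission = (rng.random(n) < 1.0 / (1.0 + 0.4 * episodes_4wk)).astype(int)

    # ROC analysis: scores must be "higher = more likely remitted", so negate episode counts
    fpr, tpr, thr = roc_curve(remission, -episodes_4wk)
    best = np.argmax(tpr - fpr)                      # Youden's J (assumed selection rule)
    print(f"rapid response defined as <= {-thr[best]:.0f} episodes in the first 4 weeks")
    ```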

  9. Simulating Pre-Asymptotic, Non-Fickian Transport Although Doing Simple Random Walks - Supported By Empirical Pore-Scale Velocity Distributions and Memory Effects

    NASA Astrophysics Data System (ADS)

    Most, S.; Jia, N.; Bijeljic, B.; Nowak, W.

    2016-12-01

    Pre-asymptotic characteristics are almost ubiquitous when analyzing solute transport processes in porous media. These pre-asymptotic aspects are caused by spatial coherence in the velocity field and by its heterogeneity. For the Lagrangian perspective of particle displacements, the causes of pre-asymptotic, non-Fickian transport are skewed velocity distribution, statistical dependencies between subsequent increments of particle positions (memory) and dependence between the x, y and z-components of particle increments. Valid simulation frameworks should account for these factors. We propose a particle tracking random walk (PTRW) simulation technique that can use empirical pore-space velocity distributions as input, enforces memory between subsequent random walk steps, and considers cross dependence. Thus, it is able to simulate pre-asymptotic non-Fickian transport phenomena. Our PTRW framework contains an advection/dispersion term plus a diffusion term. The advection/dispersion term produces time-series of particle increments from the velocity CDFs. These time series are equipped with memory by enforcing that the CDF values of subsequent velocities change only slightly. The latter is achieved through a random walk on the axis of CDF values between 0 and 1. The virtual diffusion coefficient for that random walk is our only fitting parameter. Cross-dependence can be enforced by constraining the random walk to certain combinations of CDF values between the three velocity components in x, y and z. We will show that this modelling framework is capable of simulating non-Fickian transport by comparison with a pore-scale transport simulation and we analyze the approach to asymptotic behavior.

  10. Sleep state classification using pressure sensor mats.

    PubMed

    Baran Pouyan, M; Nourani, M; Pompeo, M

    2015-08-01

    Sleep state detection is valuable in assessing patient's sleep quality and in-bed general behavior. In this paper, a novel classification approach of sleep states (sleep, pre-wake, wake) is proposed that uses only surface pressure sensors. In our method, a mobility metric is defined based on successive pressure body maps. Then, suitable statistical features are computed based on the mobility metric. Finally, a customized random forest classifier is employed to identify various classes including a new class for pre-wake state. Our algorithm achieves 96.1% and 88% accuracies for two (sleep, wake) and three (sleep, pre-wake, wake) class identification, respectively.
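
    A minimal sketch of the pipeline this abstract describes follows: a mobility metric computed from successive pressure body maps, simple statistical features, and a random-forest classifier for sleep / pre-wake / wake. The simulated pressure maps and the exact feature set are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(2)

    def mobility(frames):
        # mobility metric: mean absolute change between successive pressure maps
        return np.array([np.mean(np.abs(frames[t + 1] - frames[t]))
                         for t in range(len(frames) - 1)])

    def features(frames):
        m = mobility(frames)
        return [m.mean(), m.std(), m.max(), np.percentile(m, 90)]

    def make_epoch(noise):
        # synthetic epoch: 30 frames of a 64x32 pressure map; more noise = more movement
        base = rng.random((64, 32))
        return np.stack([base + noise * rng.standard_normal((64, 32)) for _ in range(30)])

    X, y = [], []
    for label, noise in {"sleep": 0.02, "pre-wake": 0.08, "wake": 0.20}.items():
        for _ in range(40):
            X.append(features(make_epoch(noise)))
            y.append(label)

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print(clf.score(X, y))   # in-sample only; a real study would cross-validate
    ```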

  11. Testing the non-unity of rate ratio under inverse sampling.

    PubMed

    Tang, Man-Lai; Liao, Yi Jie; Ng, Hong Keung Tony; Chan, Ping Shing

    2007-08-01

    Inverse sampling is considered to be a more appropriate sampling scheme than the usual binomial sampling scheme when subjects arrive sequentially, when the underlying response of interest is acute, and when maximum likelihood estimators of some epidemiologic indices are undefined. In this article, we study various statistics for testing non-unity rate ratios in case-control studies under inverse sampling. These include the Wald, unconditional score, likelihood ratio and conditional score statistics. Three methods (the asymptotic, conditional exact, and Mid-P methods) are adopted for P-value calculation. We evaluate the performance of different combinations of test statistics and P-value calculation methods in terms of their empirical sizes and powers via Monte Carlo simulation. In general, the asymptotic score and conditional score tests are preferable because their actual type I error rates are well controlled around the pre-chosen nominal level and their powers are comparatively the largest. The exact version of the Wald test is recommended if one wants to control the actual type I error rate at or below the pre-chosen nominal level. If larger power is expected and fluctuations of size around the pre-chosen nominal level are allowed, then the Mid-P version of the Wald test is a desirable alternative. We illustrate the methodologies with a real example from a heart disease study. (c) 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

  12. Signal Processing Methods for Liquid Rocket Engine Combustion Stability Assessments

    NASA Technical Reports Server (NTRS)

    Kenny, R. Jeremy; Lee, Erik; Hulka, James R.; Casiano, Matthew

    2011-01-01

    The J2X Gas Generator engine design specifications include dynamic, spontaneous, and broadband combustion stability requirements. These requirements are verified empirically based on high-frequency chamber pressure measurements and analyses. Dynamic stability is determined from the dynamic pressure response due to an artificial perturbation of the combustion chamber pressure (bomb testing), and spontaneous and broadband stability are determined from the dynamic pressure responses during steady operation starting at specified power levels. J2X Workhorse Gas Generator testing included bomb tests with multiple hardware configurations and operating conditions, including a configuration used explicitly for the engine verification test series. This work covers signal processing techniques developed at Marshall Space Flight Center (MSFC) to help assess engine design stability requirements. Dynamic stability assessments were performed following both the CPIA 655 guidelines and an MSFC in-house developed statistical approach. The statistical approach was developed to better verify when the dynamic pressure amplitudes corresponding to a particular frequency returned to pre-bomb characteristics. This was accomplished by first determining the statistical characteristics of the pre-bomb dynamic levels. The pre-bomb statistical characterization provided 95% coverage bounds; these bounds were used as a quantitative measure to determine when the post-bomb signal returned to pre-bomb conditions. The time for post-bomb levels to acceptably return to pre-bomb levels was compared to the dominant frequency-dependent time recommended by CPIA 655. Results for multiple test configurations, including stable and unstable configurations, were reviewed. Spontaneous stability was assessed using two processes: 1) characterization of the ratio of the peak response amplitudes to the excited chamber acoustic mode amplitudes and 2) characterization of the variability of the peak response's frequency over the test duration. This characterization process assists in evaluating the discreteness of a signal as well as the stability of the chamber response. Broadband stability was assessed using a running root-mean-square evaluation. These techniques were also employed, in a comparative analysis, on available Fastrac data, and these results are presented here.
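
    The statistical approach outlined in this abstract lends itself to a small toy illustration: estimate 95% coverage bounds from the pre-bomb portion of a dynamic pressure signal and track a running RMS until it falls back inside those bounds. Everything below (sample rate, transient shape, window lengths) is an assumed toy setup, not J2X data or the MSFC implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    fs = 10_000                                   # assumed sample rate, Hz
    t = np.arange(0, 2.0, 1 / fs)
    p = 0.05 * rng.standard_normal(t.size)        # synthetic dynamic pressure (noise floor)
    p[t > 1.0] += 0.5 * np.exp(-(t[t > 1.0] - 1.0) / 0.05)   # decaying "bomb" transient

    def running_rms(x, win):
        return np.sqrt(np.convolve(x**2, np.ones(win) / win, mode="same"))

    rms = running_rms(p, win=fs // 100)           # 10 ms running RMS

    lo, hi = np.percentile(rms[t < 1.0], [2.5, 97.5])   # 95% coverage bounds, pre-bomb
    back = t[(t > 1.0) & (rms <= hi)]
    print("first post-bomb time back inside pre-bomb bounds:",
          back[0] if back.size else "not within record")
    ```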

  13. Bootstrapping under constraint for the assessment of group behavior in human contact networks

    NASA Astrophysics Data System (ADS)

    Tremblay, Nicolas; Barrat, Alain; Forest, Cary; Nornberg, Mark; Pinton, Jean-François; Borgnat, Pierre

    2013-11-01

    The increasing availability of time- and space-resolved data describing human activities and interactions gives insights into both static and dynamic properties of human behavior. In practice, nevertheless, real-world data sets can often be considered as only one realization of a particular event. This highlights a key issue in social network analysis: the statistical significance of estimated properties. In this context, we focus here on the assessment of quantitative features of specific subset of nodes in empirical networks. We present a method of statistical resampling based on bootstrapping groups of nodes under constraints within the empirical network. The method enables us to define acceptance intervals for various null hypotheses concerning relevant properties of the subset of nodes under consideration in order to characterize by a statistical test its behavior as “normal” or not. We apply this method to a high-resolution data set describing the face-to-face proximity of individuals during two colocated scientific conferences. As a case study, we show how to probe whether colocating the two conferences succeeded in bringing together the two corresponding groups of scientists.
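
    A simplified sketch of the resampling idea in this abstract follows: compare a statistic computed on one specific group of nodes against a bootstrap distribution obtained by drawing equal-sized groups from the same empirical network. Restricting the resampling only to group size is a stand-in for the constrained bootstrap developed in the paper; the contact matrix is synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_nodes = 200
    group = np.arange(20)                          # hypothetical group of interest

    w = rng.poisson(1.0, (n_nodes, n_nodes))       # synthetic contact-duration matrix
    w = np.triu(w, 1)
    w = w + w.T                                    # symmetric, no self-contacts

    def within_group_contact(nodes):
        sub = w[np.ix_(nodes, nodes)]
        return sub.sum() / 2                       # total contact within the group

    observed = within_group_contact(group)
    boot = np.array([within_group_contact(rng.choice(n_nodes, size=group.size, replace=False))
                     for _ in range(2000)])

    lo, hi = np.percentile(boot, [2.5, 97.5])      # acceptance interval under the null
    verdict = "within" if lo <= observed <= hi else "outside"
    print(f"observed = {observed:.0f}, acceptance interval = [{lo:.0f}, {hi:.0f}] ({verdict})")
    ```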

  14. Time series regression-based pairs trading in the Korean equities market

    NASA Astrophysics Data System (ADS)

    Kim, Saejoon; Heo, Jun

    2017-07-01

    Pairs trading is an instance of statistical arbitrage that relies on heavy quantitative data analysis to profit by capitalising low-risk trading opportunities provided by anomalies of related assets. A key element in pairs trading is the rule by which open and close trading triggers are defined. This paper investigates the use of time series regression to define the rule which has previously been identified with fixed threshold-based approaches. Empirical results indicate that our approach may yield significantly increased excess returns compared to ones obtained by previous approaches on large capitalisation stocks in the Korean equities market.
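
    An illustrative regression-based trading rule of the kind discussed in this abstract is sketched below: regress one asset on its pair, form the residual spread, and open or close positions when the spread's z-score crosses pre-defined bands. The price series and the 2 / 0.5 z-score thresholds are assumptions for illustration, not the paper's calibrated rule.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 500
    common = np.cumsum(rng.standard_normal(n))             # shared random-walk factor
    a = 50 + common + 0.5 * rng.standard_normal(n)         # hypothetical stock A
    b = 30 + 0.8 * common + 0.5 * rng.standard_normal(n)   # hypothetical stock B

    beta, alpha = np.polyfit(b, a, 1)        # time series regression: a ~ alpha + beta * b
    spread = a - (alpha + beta * b)
    z = (spread - spread.mean()) / spread.std()

    open_long  = z < -2.0    # buy A / sell B when the spread is unusually low
    open_short = z > 2.0     # sell A / buy B when the spread is unusually high
    close_pos  = np.abs(z) < 0.5
    print(open_long.sum(), open_short.sum(), close_pos.sum())
    ```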

  15. Dynamic Fracture Properties of Rocks Subjected to Static Pre-load Using Notched Semi-circular Bend Method

    NASA Astrophysics Data System (ADS)

    Chen, Rong; Li, Kang; Xia, Kaiwen; Lin, Yuliang; Yao, Wei; Lu, Fangyun

    2016-10-01

    A dynamic load superposed on a static pre-load is a key problem in deep underground rock engineering projects. Based on a modified split Hopkinson pressure bar test system, the notched semi-circular bend (NSCB) method is selected to investigate the fracture initiation toughness of rocks subjected to pre-load. In this study, a two-dimensional ANSYS finite element simulation model is developed to calculate the dimensionless stress intensity factor. Three groups of NSCB specimens are tested under a pre-load of 0, 37, and 74% of the maximum static load and with the loading rate ranging from 0 to 60 GPa m^(1/2) s^(-1). The results show that under a given pre-load, the fracture initiation toughness of rock increases with the loading rate, resembling the typical rate dependence of materials. Furthermore, the dynamic rock fracture toughness decreases with the static pre-load at a given loading rate. The total fracture toughness, defined as the sum of the dynamic fracture toughness and the initial stress intensity factor calculated from the pre-load, increases with the pre-load at a given loading rate. An empirical equation is used to represent the effect of loading rate and pre-load force, and the results show that this equation can depict the trend of the experimental data.

  16. Homeopathy in chronic sinusitis: a prospective multi-centric observational study.

    PubMed

    Nayak, Chaturbhuja; Singh, Vikram; Singh, V P; Oberai, Praveen; Roja, Varanasi; Shitanshu, Shashi Shekhar; Sinha, M N; Deewan, Deepti; Lakhera, B C; Ramteke, Sunil; Kaushik, Subhash; Sarkar, Sarabjit; Mandal, N R; Mohanan, P G; Singh, J R; Biswas, Sabyasachi; Mathew, Georgekutty

    2012-04-01

    The primary objective was to ascertain the therapeutic usefulness of homeopathic medicine in the management of chronic sinusitis (CS). Multicentre observational study at Institutes and Units of the Central Council for Research in Homoeopathy, India. Symptoms were assessed using the chronic sinusitis assessment score (CSAS). 17 pre-defined homeopathic medicines were shortlisted for prescription on the basis of repertorisation for the pathological symptoms of CS. Regimes and adjustment of regimes in the event of a change of symptoms were pre-defined. The follow-up period was 6 months. Statistical analysis was done using SPSS version 16. 628 patients suffering from CS confirmed on X-ray were enrolled from eight Institutes and Units of the Central Council for Research in Homoeopathy. All 550 patients with at least one follow-up assessment were analyzed. There was a statistically significant reduction in CSAS (P = 0.0001, Friedman test) after 3 and 6 months of treatment. Radiological appearances also improved. A total of 13 of the 17 pre-defined medicines were prescribed in the 550 patients; Sil. (55.2% of 210), Calc. (62.5% of 98), Lyc. (69% of 55), Phos. (66.7% of 45) and Kali iod. (65% of 40) were found to be the most useful, showing marked improvement. Four of the 17 medicines were never prescribed. No complications were observed during treatment. Homeopathic treatment may be effective for CS patients. Controlled trials are required for further validation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. A Classification and Analysis of National Contract Management Journal Articles from 1966 Through 1989

    DTIC Science & Technology

    1991-06-01

    THEORETICAL, NORMATIVE, EMPIRICAL, INDUCTIVE [24] "New Approaches for Quantifying Risk and Determining Sharing Arrangements," Raymond S. Lieber, pp...the negotiation process can be enhanced by quantifying risk using statistical methods. The article discusses two approaches which allow the...in Incentive Contracting," Melvin W. Lifson, pp. 59-80. The purpose of this article is to suggest a general approach for defining and quantifying

  18. Menzerath-Altmann Law: Statistical Mechanical Interpretation as Applied to a Linguistic Organization

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2014-10-01

    The distribution behavior described by the empirical Menzerath-Altmann law is frequently encountered during the self-organization of linguistic and non-linguistic natural organizations at various structural levels. This study presents a statistical mechanical derivation of the law based on the analogy between the classical particles of a statistical mechanical organization and the distinct words of a textual organization. The derived model, a transformed (generalized) form of the Menzerath-Altmann model, was termed as the statistical mechanical Menzerath-Altmann model. The derived model allows interpreting the model parameters in terms of physical concepts. We also propose that many organizations presenting the Menzerath-Altmann law behavior, whether linguistic or not, can be methodically examined by the transformed distribution model through the properly defined structure-dependent parameter and the energy associated states.
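
    For orientation, the classical empirical form of the Menzerath-Altmann law, of which the model derived in this paper is a transformed (generalized) version, is commonly written as below; this is the standard textbook statement, not the paper's statistical mechanical reparameterization.

    ```latex
    % Classical Menzerath-Altmann law: the mean size y of the constituents of a construct
    % decreases with the size x of the construct (e.g. mean syllable length versus word
    % length in syllables); a, b and c are empirical fitting parameters.
    y(x) = a \, x^{b} \, e^{-c x}
    ```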

  19. Compounding approach for univariate time series with nonstationary variances

    NASA Astrophysics Data System (ADS)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.

  20. Compounding approach for univariate time series with nonstationary variances.

    PubMed

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
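
    A minimal sketch of the windowing step described in the two records above follows: decompose a nonstationary time series into short windows, estimate the local variance in each window, and inspect the empirical distribution of those variances (the candidate parameter distribution for the compounding approach). The signal is synthetic and the window length is an arbitrary choice.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    T, win = 20_000, 100
    vol = np.exp(0.5 * np.sin(np.linspace(0, 20, T)))   # slowly varying volatility (assumed)
    x = vol * rng.standard_normal(T)                    # locally Gaussian, globally nonstationary

    # local variances from non-overlapping windows of length `win`
    local_var = x[: T - T % win].reshape(-1, win).var(axis=1)

    print("mean / std of local variances:", local_var.mean(), local_var.std())
    print("5th, 50th, 95th percentiles:", np.percentile(local_var, [5, 50, 95]))
    ```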

  1. Does peer learning or higher levels of e-learning improve learning abilities? A randomized controlled trial

    PubMed Central

    Worm, Bjarne Skjødt; Jensen, Kenneth

    2013-01-01

    Background and aims The fast development of e-learning and social forums demands that we update our understanding of e-learning and peer learning. We aimed to investigate if higher, pre-defined levels of e-learning or social interaction in web forums improved students’ learning ability. Methods One hundred and twenty Danish medical students were randomized to six groups all with 20 students (eCases level 1, eCases level 2, eCases level 2+, eTextbook level 1, eTextbook level 2, and eTextbook level 2+). All students participated in a pre-test; Group 1 participated in an interactive case-based e-learning program, while Group 2 was presented with textbook material electronically. The 2+ groups were able to discuss the material between themselves in a web forum. The subject was head injury and associated treatment and observation guidelines in the emergency room. Following the e-learning, all students completed a post-test. Pre- and post-tests both consisted of 25 questions randomly chosen from a pool of 50 different questions. Results All students concluded the study with comparable pre-test results. Students at Level 2 (in both groups) improved statistically significantly compared to students at level 1 (p>0.05). There was no statistically significant difference between level 2 and level 2+. However, level 2+ was associated with statistically significantly greater student satisfaction than the rest of the students (p>0.05). Conclusions This study applies a new way of comparing different types of e-learning using a pre-defined level division and the possibility of peer learning. Our findings show that higher levels of e-learning do in fact provide better results when compared with the same type of e-learning at lower levels. While social interaction in web forums increases student satisfaction, learning ability does not seem to change. Both findings are relevant when designing new e-learning materials. PMID:24229729

  2. Does peer learning or higher levels of e-learning improve learning abilities? A randomized controlled trial.

    PubMed

    Worm, Bjarne Skjødt; Jensen, Kenneth

    2013-01-01

    Background and aims The fast development of e-learning and social forums demands that we update our understanding of e-learning and peer learning. We aimed to investigate if higher, pre-defined levels of e-learning or social interaction in web forums improved students' learning ability. Methods One hundred and twenty Danish medical students were randomized to six groups all with 20 students (eCases level 1, eCases level 2, eCases level 2+, eTextbook level 1, eTextbook level 2, and eTextbook level 2+). All students participated in a pre-test; Group 1 participated in an interactive case-based e-learning program, while Group 2 was presented with textbook material electronically. The 2+ groups were able to discuss the material between themselves in a web forum. The subject was head injury and associated treatment and observation guidelines in the emergency room. Following the e-learning, all students completed a post-test. Pre- and post-tests both consisted of 25 questions randomly chosen from a pool of 50 different questions. Results All students concluded the study with comparable pre-test results. Students at Level 2 (in both groups) improved statistically significantly compared to students at level 1 (p>0.05). There was no statistically significant difference between level 2 and level 2+. However, level 2+ was associated with statistically significantly greater student satisfaction than the rest of the students (p>0.05). Conclusions This study applies a new way of comparing different types of e-learning using a pre-defined level division and the possibility of peer learning. Our findings show that higher levels of e-learning do in fact provide better results when compared with the same type of e-learning at lower levels. While social interaction in web forums increases student satisfaction, learning ability does not seem to change. Both findings are relevant when designing new e-learning materials.

  3. Statistical microeconomics and commodity prices: theory and empirical results.

    PubMed

    Baaquie, Belal E

    2016-01-13

    A review is made of the statistical generalization of microeconomics by Baaquie (Baaquie 2013 Phys. A 392, 4400-4416. (doi:10.1016/j.physa.2013.05.008)), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is given by the unequal time correlation function and is modelled by the Feynman path integral based on an action functional. The correlation functions of the model are defined using the path integral. The existence of the action functional for commodity prices that was postulated to exist in Baaquie (Baaquie 2013 Phys. A 392, 4400-4416. (doi:10.1016/j.physa.2013.05.008)) has been empirically ascertained in Baaquie et al. (Baaquie et al. 2015 Phys. A 428, 19-37. (doi:10.1016/j.physa.2015.02.030)). The model's action functionals for different commodities have been empirically determined and calibrated using the unequal time correlation functions of the market commodity prices using a perturbation expansion (Baaquie et al. 2015 Phys. A 428, 19-37. (doi:10.1016/j.physa.2015.02.030)). Nine commodities drawn from the energy, metal and grain sectors are empirically studied and their auto-correlation for up to 300 days is described by the model to an accuracy of R² > 0.90, using only six parameters. © 2015 The Author(s).

  4. Empirical Investigation of Job Applicants' Reactions to Taking a Pre-Employment Honesty Test.

    ERIC Educational Resources Information Center

    Jones, John W.; Joy, Dennis

    Employee theft is widespread and difficult to detect. Many companies have attempted to control the employee theft problem through pre-employment screening. The use of paper-and-pencil honesty tests in this process has become increasingly common. These two studies empirically investigated job applicants' (N=450) reactions to taking a pre-employment…

  5. The Feeling of Agency: Empirical Indicators for a Pre-Reflective Level of Action Awareness

    PubMed Central

    David, Nicole; Stenzel, Anna; Schneider, Till R.; Engel, Andreas K.

    2011-01-01

    The sense of agency has been defined as the sense that I am the author of my own actions. This sense, however, is usually not reflected upon but instead pre-reflectively experienced. Experimental approaches usually measure the sense of agency by judgments or verbal reports, despite evidence that the sense of agency is not sufficiently assessed on such a reflective level. Here we sought to identify non-verbal measures of the sense of agency, particularly testing the relevance of physiological activity such as skin conductance and heart rate. Manipulating the visual feedback to an executed movement, we investigated how well physiological activity and other movement parameters differed between real and false feedback (i.e., between actual agency and non-agency), and how they related to accuracy of agency judgments. Skin conductance and heart rate did not differ between agency and non-agency situations; neither did they inform agency judgments. In contrast, movement onsets – particularly, discrepancies between feedback and movement onsets – were related to agency judgments. Overall, our results indicate weak visceral–somatic associations with the sense of agency. Thus, physiological activity did not prove to be an empirical indicator for the feeling of agency. PMID:21779268

  6. Indirect medical education and disproportionate share adjustments to Medicare inpatient payment rates.

    PubMed

    Nguyen, Nguyen Xuan; Sheingold, Steven H

    2011-11-04

    The indirect medical education (IME) and disproportionate share hospital (DSH) adjustments to Medicare's prospective payment rates for inpatient services are generally intended to compensate hospitals for patient care costs related to teaching activities and care of low income populations. These adjustments were originally established based on the statistical relationships between IME and DSH and hospital costs. Due to a variety of policy considerations, the legislated levels of these adjustments may have deviated over time from these "empirically justified levels," or simply, "empirical levels." In this paper, we estimate the empirical levels of IME and DSH using 2006 hospital data and 2009 Medicare final payment rules. Our analyses suggest that the empirical level for IME would be much smaller than under current law, about one-third to one-half. Our analyses also support the DSH adjustment prescribed by the Affordable Care Act of 2010 (ACA), about one-quarter of the pre-ACA level. For IME, the estimates imply an increase in costs of 1.88% for each 10% increase in teaching intensity. For DSH, the estimates imply that costs would rise by 0.52% for each 10% increase in the low-income patient share for large urban hospitals. Public Domain.
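
    To make the quoted relationship concrete, the fragment below converts an elasticity-style regression coefficient into the "% cost increase per 10% increase in teaching intensity" form used above. The coefficient value is illustrative, chosen only so that the output reproduces the 1.88% figure; it is not the paper's estimate.

        # Illustrative elasticity-style coefficient, chosen so a 10% rise in teaching
        # intensity maps to roughly a 1.88% rise in cost per case.
        beta = 0.195

        # Percent increase in cost per case implied by a 10% increase in teaching intensity.
        pct_increase = (1.10 ** beta - 1) * 100
        print(f"{pct_increase:.2f}% cost increase per 10% increase in teaching intensity")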

  7. Global properties of physically interesting Lorentzian spacetimes

    NASA Astrophysics Data System (ADS)

    Nawarajan, Deloshan; Visser, Matt

    Under normal circumstances most members of the general relativity community focus almost exclusively on the local properties of spacetime, such as the locally Euclidean structure of the manifold and the Lorentzian signature of the metric tensor. When combined with the classical Einstein field equations this gives an extremely successful empirical model of classical gravity and classical matter — at least as long as one does not ask too many awkward questions about global issues, (such as global topology and global causal structure). We feel however that this is a tactical error — even without invoking full-fledged “quantum gravity” we know that the standard model of particle physics is also an extremely good representation of some parts of empirical reality; and we had better be able to carry over all the good features of the standard model of particle physics — at least into the realm of semi-classical quantum gravity. Doing so gives us some interesting global features that spacetime should possess: On physical grounds spacetime should be space-orientable, time-orientable, and spacetime-orientable, and it should possess a globally defined tetrad (vierbein, or in general a globally defined vielbein/n-bein). So on physical grounds spacetime should be parallelizable. This strongly suggests that the metric is not the fundamental physical quantity; a very good case can be made for the tetrad being more fundamental than the metric. Furthermore, a globally-defined “almost complex structure” is almost unavoidable. Ideas along these lines have previously been mooted, but much is buried in the pre-arXiv literature and is either forgotten or inaccessible. We shall revisit these ideas taking a perspective very much based on empirical physical observation.

  8. Empirical Testing of an Algorithm for Defining Somatization in Children

    PubMed Central

    Eisman, Howard D.; Fogel, Joshua; Lazarovich, Regina; Pustilnik, Inna

    2007-01-01

    Introduction A previous article proposed an algorithm for defining somatization in children by classifying them into three categories: well, medically ill, and somatizer; the authors suggested further empirical validation of the algorithm (Postilnik et al., 2006). We use the Child Behavior Checklist (CBCL) to provide this empirical validation. Method Parents of children seen in pediatric clinics completed the CBCL (n=126). The physicians of these children completed specially-designed questionnaires. The sample comprised 62 boys and 64 girls (age range 2 to 15 years). Classification categories included: well (n=53), medically ill (n=55), and somatizer (n=18). Analysis of variance (ANOVA) was used for statistical comparisons. Discriminant function analysis was conducted with the CBCL subscales. Results There were significant differences between the classification categories for the somatic complaints (p<0.001), social problems (p=0.004), thought problems (p=0.01), attention problems (p=0.006), and internalizing (p=0.003) subscales and also total (p=0.001), and total-t (p=0.001) scales of the CBCL. Discriminant function analysis showed that 78% of somatizers and 66% of well were accurately classified, while only 35% of medically ill were accurately classified. Conclusion The somatization classification algorithm proposed by Postilnik et al. (2006) shows promise for classification of children and adolescents with somatic symptoms. PMID:18421368
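
    A minimal sketch of the discriminant-function step, using scikit-learn's LinearDiscriminantAnalysis on synthetic CBCL-like subscale scores. The class sizes mirror the abstract, but the scores, class separations, and resulting accuracies are invented.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)

        # Synthetic subscale scores (5 subscales) for three classes: well, medically ill, somatizer.
        X = np.vstack([rng.normal(50, 8, (53, 5)),
                       rng.normal(54, 8, (55, 5)),
                       rng.normal(62, 8, (18, 5))])
        y = np.array([0] * 53 + [1] * 55 + [2] * 18)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        pred = lda.predict(X)
        for label, name in enumerate(["well", "medically ill", "somatizer"]):
            mask = y == label
            print(name, f"{(pred[mask] == label).mean():.0%} correctly classified")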

  9. Defining window-boundaries for genomic analyses using smoothing spline techniques

    DOE PAGES

    Beissinger, Timothy M.; Rosa, Guilherme J.M.; Kaeppler, Shawn M.; ...

    2015-04-17

    High-density genomic data is often analyzed by combining information over windows of adjacent markers. Interpretation of data grouped in windows versus at individual locations may increase statistical power, simplify computation, reduce sampling noise, and reduce the total number of tests performed. However, use of adjacent marker information can result in over- or under-smoothing, undesirable window boundary specifications, or highly correlated test statistics. We introduce a method for defining windows based on statistically guided breakpoints in the data, as a foundation for the analysis of multiple adjacent data points. This method involves first fitting a cubic smoothing spline to the data and then identifying the inflection points of the fitted spline, which serve as the boundaries of adjacent windows. This technique does not require prior knowledge of linkage disequilibrium, and therefore can be applied to data collected from individual or pooled sequencing experiments. Moreover, in contrast to existing methods, an arbitrary choice of window size is not necessary, since these are determined empirically and allowed to vary along the genome.
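
    A minimal sketch of the boundary-finding idea: fit a cubic smoothing spline to a signal indexed by marker position, then take the positions where the second derivative of the fitted spline changes sign (its inflection points) as window boundaries. The signal, smoothing parameter, and grid are all illustrative.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        rng = np.random.default_rng(3)
        pos = np.arange(1000.0)                                        # marker positions
        signal = np.sin(pos / 60.0) + rng.normal(0, 0.3, pos.size)     # noisy per-marker statistic

        spline = UnivariateSpline(pos, signal, k=3, s=pos.size * 0.1)  # cubic smoothing spline
        second = spline.derivative(n=2)(pos)

        # Inflection points: sign changes of the second derivative define window boundaries.
        boundaries = pos[:-1][np.sign(second[:-1]) != np.sign(second[1:])]
        print("window boundaries:", boundaries)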

  10. Statistical power as a function of Cronbach alpha of instrument questionnaire items.

    PubMed

    Heo, Moonseong; Kim, Namhee; Faith, Myles S

    2015-10-14

    In countless clinical trials, measurements of outcomes rely on instrument questionnaire items, which, however, often suffer from measurement error problems that in turn affect the statistical power of study designs. The Cronbach alpha or coefficient alpha, here denoted by C(α), can be used as a measure of internal consistency of parallel instrument items that are developed to measure a target unidimensional outcome construct. The scale score for the target construct is often represented by the sum of the item scores. However, power functions based on C(α) have been lacking for various study designs. We formulate a statistical model for parallel items to derive power functions as a function of C(α) under several study designs. To this end, we adopt a fixed true score variance assumption as opposed to the usual fixed total variance assumption. That assumption is critical and practically relevant to show that smaller measurement errors are associated with higher inter-item correlations, and thus that greater C(α) is associated with greater statistical power. We compare the derived theoretical statistical power with empirical power obtained through Monte Carlo simulations for the following comparisons: one-sample comparison of pre- and post-treatment mean differences, two-sample comparison of pre-post mean differences between groups, and two-sample comparison of mean differences between groups. It is shown that C(α) is the same as a test-retest correlation of the scale scores of parallel items, which enables testing the significance of C(α). Closed-form power functions and sample size determination formulas are derived in terms of C(α), for all of the aforementioned comparisons. Power functions are shown to be an increasing function of C(α), regardless of the comparison of interest. The derived power functions are well validated by simulation studies that show that the magnitudes of theoretical power are virtually identical to those of the empirical power. Regardless of research designs or settings, in order to increase statistical power, the development and use of instruments with greater C(α), or equivalently with greater inter-item correlations, is crucial for trials that intend to use questionnaire items for measuring research outcomes. Further development of the power functions for binary or ordinal item scores and under more general item correlation structures reflecting more real-world situations would be a valuable future study.
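
    A minimal simulation in the spirit of the validation described above: generate parallel items with a fixed true score variance, sum them into a scale score, and estimate the empirical power of a two-sample comparison. Smaller error SDs give higher inter-item correlations, hence higher C(alpha) and higher power. All parameter values are illustrative, not the paper's.

        import numpy as np
        from scipy import stats

        def empirical_power(k=10, sigma_e=1.5, n=50, delta=0.5, n_sim=2000, seed=4):
            """Two-sample comparison of scale scores built from k parallel items.
            Items: X_j = T + e_j with fixed true score variance Var(T) = 1 and error SD
            sigma_e; group 2 has its true score shifted by delta."""
            rng = np.random.default_rng(seed)
            rho = 1.0 / (1.0 + sigma_e ** 2)                 # inter-item correlation
            alpha_c = k * rho / (1 + (k - 1) * rho)          # Cronbach's alpha for parallel items
            hits = 0
            for _ in range(n_sim):
                t0, t1 = rng.normal(0, 1, n), rng.normal(delta, 1, n)
                s0 = (t0[:, None] + rng.normal(0, sigma_e, (n, k))).sum(axis=1)
                s1 = (t1[:, None] + rng.normal(0, sigma_e, (n, k))).sum(axis=1)
                hits += stats.ttest_ind(s0, s1).pvalue < 0.05
            return alpha_c, hits / n_sim

        for sigma_e in (2.0, 1.0, 0.5):   # smaller error -> larger alpha -> more power
            a, p = empirical_power(sigma_e=sigma_e)
            print(f"sigma_e={sigma_e}: alpha={a:.2f}, empirical power={p:.2f}")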

  11. Statistical framework and noise sensitivity of the amplitude radial correlation contrast method.

    PubMed

    Kipervaser, Zeev Gideon; Pelled, Galit; Goelman, Gadi

    2007-09-01

    A statistical framework for the amplitude radial correlation contrast (RCC) method, which integrates a conventional pixel threshold approach with cluster-size statistics, is presented. The RCC method uses functional MRI (fMRI) data to group neighboring voxels in terms of their degree of temporal cross correlation and compares coherences in different brain states (e.g., stimulation OFF vs. ON). By defining the RCC correlation map as the difference between two RCC images, the map distribution of two OFF states is shown to be normal, enabling the definition of the pixel cutoff. The empirical cluster-size null distribution obtained after the application of the pixel cutoff is used to define a cluster-size cutoff that allows 5% false positives. Assuming that the fMRI signal equals the task-induced response plus noise, an analytical expression of amplitude-RCC dependency on noise is obtained and used to define the pixel threshold. In vivo and ex vivo data obtained during rat forepaw electric stimulation are used to fine-tune this threshold. Calculating the spatial coherences within in vivo and ex vivo images shows enhanced coherence in the in vivo data, but no dependency on the anesthesia method, magnetic field strength, or depth of anesthesia, strengthening the generality of the proposed cutoffs. Copyright (c) 2007 Wiley-Liss, Inc.
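
    A minimal sketch of the cluster-size thresholding idea: build an empirical null distribution of the largest supra-threshold cluster from difference maps of two OFF states and take its 95th percentile as the cluster-size cutoff. The 1-D "maps", the pixel cutoff, and the map size are stand-ins; real RCC maps are images and the pixel cutoff would come from the fitted normal distribution described above.

        import numpy as np

        rng = np.random.default_rng(5)

        def max_cluster_size(mask):
            """Largest run of consecutive supra-threshold pixels (1-D stand-in for a cluster)."""
            best = run = 0
            for v in mask:
                run = run + 1 if v else 0
                best = max(best, run)
            return best

        pixel_cutoff = 2.0   # e.g. taken from the normal fit of OFF-OFF difference maps
        null_sizes = [max_cluster_size(rng.normal(0, 1, 5000) > pixel_cutoff)
                      for _ in range(1000)]
        cluster_cutoff = np.percentile(null_sizes, 95)   # allows ~5% false-positive clusters
        print("cluster-size cutoff:", cluster_cutoff)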

  12. Neonatal heart rate prediction.

    PubMed

    Abdel-Rahman, Yumna; Jeremic, Aleksander; Tan, Kenneth

    2009-01-01

    Technological advances have caused a decrease in the number of infant deaths. Pre-term infants now have a substantially increased chance of survival. One of the mechanisms that is vital to saving the lives of these infants is continuous monitoring and early diagnosis. With continuous monitoring, huge amounts of data are collected, with much information embedded in them. By using statistical analysis, this information can be extracted and used to aid diagnosis and to understand development. In this study we have a large dataset containing over 180 pre-term infants whose heart rates were recorded over the length of their stay in the Neonatal Intensive Care Unit (NICU). We test two types of models, empirical Bayesian and autoregressive moving average. We then attempt to predict future values. The autoregressive moving average model showed better results but required more computation.
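
    A minimal sketch of the autoregressive moving average forecasting step using statsmodels. The heart-rate series, the ARMA order, and the forecast horizon are illustrative, not those of the study.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(6)
        # Synthetic heart-rate series (beats per minute) standing in for NICU monitoring data.
        hr = 150 + np.cumsum(rng.normal(0, 0.5, 500)) * 0.1 + rng.normal(0, 2, 500)

        model = ARIMA(hr[:-10], order=(2, 0, 1))   # ARMA(2,1), i.e. no differencing (d = 0)
        fit = model.fit()
        forecast = fit.forecast(steps=10)          # predict the next 10 samples
        print(np.round(forecast, 1))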

  13. Methodological framework for the ergonomic design of children's playground equipment: a Serbian experience.

    PubMed

    Grozdanovic, Miroljub; Jekic, Savko; Stojiljkovic, Evica

    2014-01-01

    Adequate application of the static and dynamic anthropometric measures of pre-school children in ergonometric design of children's playground equipment should eliminate all dangers and difficulties in their use. Possibilities of injuries, insecure movements, uncomfortable positions and other dangerous situations may be minimized, and the safety and health protection of pre-school children increased. Children's playground represents a significant space of activity for pre-school children. Therefore, it is necessary to apply ergonomic principles which contribute to the adjustment of the playground elements to children's anatomic features. Based on the results presented in this paper, new constructions were designed and new playgrounds were installed in Serbia. Participants were children from three pre-school age groups: Junior age group (3-4 years of age, 17 children), Medium age group (4-5 years of age, 22 children), and Senior age group (5-6 years of age, 26 children). Thirty-one static anthropometric measures (12 in standing position, 11 in sitting position, 7 related to dimensions of the hand, foot and head, with body weight and shoe size) and 15 dynamic anthropometric measures (7 in standing position, 6 in sitting position and 2 dimensions of foot and hand) were defined for the study. Measurements were taken using an anthropometer and a flexible measuring tape. Equations for ergonomic design of children's playground elements were also defined. Basic statistical data of static and dynamic anthropometric measurements of the pre-school children are presented in this paper, as well as the statistical calculation of the corrective anthropometric measurements. Measurements were performed in "Poletarac" kindergarten, part of the pre-school institution "Radost" in Cacak. Elements of playground equipment in "Bambi" kindergarten in Kragujevac (the Indian tent "wigwam", gate-house, swing and carousel) were designed and built using these parameters. Based on the obtained results, several playgrounds were designed, manufactured and equipped with the appropriate items.

  14. oPOSSUM: identification of over-represented transcription factor binding sites in co-expressed genes

    PubMed Central

    Ho Sui, Shannan J.; Mortimer, James R.; Arenillas, David J.; Brumm, Jochen; Walsh, Christopher J.; Kennedy, Brian P.; Wasserman, Wyeth W.

    2005-01-01

    Targeted transcript profiling studies can identify sets of co-expressed genes; however, identification of the underlying functional mechanism(s) is a significant challenge. Established methods for the analysis of gene annotations, particularly those based on the Gene Ontology, can identify functional linkages between genes. Similar methods for the identification of over-represented transcription factor binding sites (TFBSs) have been successful in yeast, but extension to human genomics has largely proved ineffective. Creation of a system for the efficient identification of common regulatory mechanisms in a subset of co-expressed human genes promises to break a roadblock in functional genomics research. We have developed an integrated system that searches for evidence of co-regulation by one or more transcription factors (TFs). oPOSSUM combines a pre-computed database of conserved TFBSs in human and mouse promoters with statistical methods for identification of sites over-represented in a set of co-expressed genes. The algorithm successfully identified mediating TFs in control sets of tissue-specific genes and in sets of co-expressed genes from three transcript profiling studies. Simulation studies indicate that oPOSSUM produces few false positives using empirically defined thresholds and can tolerate up to 50% noise in a set of co-expressed genes. PMID:15933209
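
    The over-representation step at the heart of this kind of analysis can be sketched with a one-sided Fisher's exact test comparing, for one TF, the number of genes with a conserved binding site in the co-expressed set against a background set. The counts are invented, and this is only one common formulation; it is not claimed to match oPOSSUM's exact statistics.

        from scipy.stats import fisher_exact

        # Hypothetical counts of genes with >= 1 conserved site for a given TF.
        hits_coexpr, n_coexpr = 18, 40          # co-expressed gene set
        hits_backgr, n_backgr = 900, 15000      # background gene set

        table = [[hits_coexpr, n_coexpr - hits_coexpr],
                 [hits_backgr, n_backgr - hits_backgr]]
        odds, p = fisher_exact(table, alternative="greater")
        print(f"odds ratio = {odds:.2f}, one-sided p = {p:.2e}")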

  15. Spatio-temporal distribution of Oklahoma earthquakes: Exploring relationships using a nearest-neighbor approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog’s inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity) where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.

  16. Empirical microeconomics action functionals

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Du, Xin; Tanputraman, Winson

    2015-06-01

    A statistical generalization of microeconomics has been made in Baaquie (2013), where the market price of every traded commodity, at each instant of time, is considered to be an independent random variable. The dynamics of commodity market prices is modeled by an action functional, and the focus of this paper is to empirically determine the action functionals for different commodities. The correlation functions of the model are defined using a Feynman path integral. The model is calibrated using the unequal time correlation of the market commodity prices as well as their cubic and quartic moments using a perturbation expansion. The consistency of the perturbation expansion is verified by a numerical evaluation of the path integral. Nine commodities drawn from the energy, metal and grain sectors are studied and their market behavior is described by the model to an accuracy of over 90% using only six parameters. The paper empirically establishes the existence of the action functional for commodity prices that was postulated to exist in Baaquie (2013).

  17. Prediction of episodic acidification in North-eastern USA: An empirical/mechanistic approach

    USGS Publications Warehouse

    Davies, T.D.; Tranter, M.; Wigington, P.J.; Eshleman, K.N.; Peters, N.E.; Van Sickle, J.; DeWalle, David R.; Murdoch, Peter S.

    1999-01-01

    Observations from the US Environmental Protection Agency's Episodic Response Project (ERP) in the North-eastern United States are used to develop an empirical/mechanistic scheme for prediction of the minimum values of acid neutralizing capacity (ANC) during episodes. An acidification episode is defined as a hydrological event during which ANC decreases. The pre-episode ANC is used to index the antecedent condition, and the stream flow increase reflects how much the relative contributions of sources of waters change during the episode. As much as 92% of the total variation in the minimum ANC in individual catchments can be explained (with levels of explanation >70% for nine of the 13 streams) by a multiple linear regression model that includes pre-episode ANC and change in discharge as independent variables. The predictive scheme is demonstrated to be regionally robust, with the regional variance explained ranging from 77 to 83%. The scheme is not successful for each ERP stream, and reasons are suggested for the individual failures. The potential for applying the predictive scheme to other watersheds is demonstrated by testing the model with data from the Panola Mountain Research Watershed in the South-eastern United States, where the variance explained by the model was 74%. The model can also be utilized to assess 'chemically new' and 'chemically old' water sources during acidification episodes.
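
    A minimal sketch of the regression scheme described above: ordinary least squares of the minimum episodic ANC on pre-episode ANC and the change in discharge, using statsmodels. The data are synthetic placeholders, not ERP observations.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        n = 60
        pre_anc = rng.uniform(0, 200, n)        # pre-episode ANC (antecedent condition)
        d_q = rng.uniform(0.1, 5.0, n)          # increase in discharge during the episode
        min_anc = 5 + 0.8 * pre_anc - 12 * d_q + rng.normal(0, 10, n)

        X = sm.add_constant(np.column_stack([pre_anc, d_q]))
        fit = sm.OLS(min_anc, X).fit()
        print(fit.params, f"R^2 = {fit.rsquared:.2f}")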

  18. An empirical approach to the stopping power of solids and gases for ions from 3Li to 18Ar - Part II

    NASA Astrophysics Data System (ADS)

    Paul, Helmut; Schinner, Andreas

    2002-10-01

    This paper is a continuation of the work presented in Nucl. Instr. and Meth. Phys. Res. B 179 (2001) 299. Its aim is to produce a table of stopping powers by fitting empirical stopping values. Our database has been increased and we use a better fit function. As before, we treat solid and gaseous targets separately, but we now obtain results also for H 2 and He targets. Using an improved version of our program MSTAR, we can calculate the stopping power for any ion (3⩽ Z1⩽18) at specific energies from 0.001 to 1000 MeV/nucleon and for any element, mixture or compound contained in ICRU Report 49. MSTAR is available on the internet; it can be used standalone or built into other programs as a subroutine. Using a statistical program for comparing our fits with the experimental data, we find that MSTAR represents the data within 2% at high energy and within up to 20% (25% for gases) at the lowest energies. Fitting errors are 40-110% larger than experimental errors given by the authors. For some gas targets, MSTAR describes the data better than Ziegler's program TRIM.

  19. Status of pre-processing of waste electrical and electronic equipment in Germany and its influence on the recovery of gold.

    PubMed

    Chancerel, Perrine; Bolland, Til; Rotter, Vera Susanne

    2011-03-01

    Waste electrical and electronic equipment (WEEE) contains gold in low concentrations that are nonetheless relevant from an environmental and economic point of view. After collection, WEEE is pre-processed in order to generate appropriate material fractions that are sent to the subsequent end-processing stages (recovery, reuse or disposal). The goal of this research is to quantify the overall recovery rates of pre-processing technologies used in Germany for the reference year 2007. To achieve this goal, facilities operating in Germany were listed and classified according to the technology they apply. Information on their processing capacity was gathered by evaluating statistical databases. Based on a literature review of experimental results for gold recovery rates of different pre-processing technologies, the German overall recovery rate of gold at the pre-processing level was quantified depending on the characteristics of the treated WEEE. The results reveal that, depending on the equipment group, pre-processing recovery rates of gold of 29 to 61% are achieved in Germany. Some practical recommendations to reduce the losses during pre-processing could be formulated. Defining mass-based recovery targets in the legislation does not set incentives to recover trace elements. Instead, the priorities for recycling could be defined based on other parameters like the environmental impacts of the materials. The implementation of measures to reduce the gold losses would also improve the recovery of several other non-ferrous metals like tin, nickel, and palladium.

  20. On the Helicity in 3D-Periodic Navier-Stokes Equations II: The Statistical Case

    NASA Astrophysics Data System (ADS)

    Foias, Ciprian; Hoang, Luan; Nicolaenko, Basil

    2009-09-01

    We study the asymptotic behavior of the statistical solutions to the Navier-Stokes equations using the normalization map [9]. It is then applied to the study of mean energy, mean dissipation rate of energy, and mean helicity of the spatial periodic flows driven by potential body forces. The statistical distribution of the asymptotic Beltrami flows is also investigated. We connect our mathematical analysis with the empirical theory of decaying turbulence. With appropriate mathematically defined ensemble averages, the Kolmogorov universal features are shown to be transient in time. We provide an estimate for the time interval in which those features may still be present. Our collaborator and friend Basil Nicolaenko passed away in September of 2007, after this work was completed. Honoring his contribution and friendship, we dedicate this article to him.

  1. Empirical study on human acupuncture point network

    NASA Astrophysics Data System (ADS)

    Li, Jian; Shen, Dan; Chang, Hui; He, Da-Ren

    2007-03-01

    Chinese medical theory is ancient and profound; however, it remains confined to a qualitative and imprecise understanding. The effect of Chinese acupuncture in clinical practice is unique and effective, and the human acupuncture points play a mysterious and special role; however, there is still no modern scientific understanding of human acupuncture points. For this reason, we intend to use complex network theory, one of the frontiers of statistical physics, to describe the human acupuncture points and their connections. In the network, nodes are defined as the acupuncture points, and two nodes are connected by an edge when they are used for a medical treatment of a common disease. A disease is defined as an act. Some statistical properties have been obtained. The results certify that the degree distribution, act degree distribution, and the dependence of the clustering coefficient on both of them obey an SPL distribution function, which interpolates between a power law and an exponential decay. The results may be helpful for understanding Chinese medical theory.
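
    A minimal sketch of the network construction described above, using networkx: acupuncture points are nodes, two points are linked whenever they are both used to treat some disease (an "act"), and the act degree of a point is the number of diseases it is used for. The point and disease lists here are invented examples.

        from itertools import combinations
        from collections import Counter
        import networkx as nx

        # Hypothetical treatments: disease (act) -> acupuncture points used for it.
        treatments = {
            "headache": ["LI4", "GB20", "LR3"],
            "insomnia": ["HT7", "SP6", "GB20"],
            "back pain": ["BL23", "GB30", "LI4"],
        }

        G = nx.Graph()
        act_degree = Counter()                        # number of diseases each point is used for
        for disease, points in treatments.items():
            act_degree.update(points)
            G.add_edges_from(combinations(points, 2))

        print("degrees:", dict(G.degree()))
        print("act degrees:", dict(act_degree))
        print("clustering:", nx.clustering(G))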

  2. Challenges and solutions to pre- and post-randomization subgroup analyses.

    PubMed

    Desai, Manisha; Pieper, Karen S; Mahaffey, Ken

    2014-01-01

    Subgroup analyses are commonly performed in the clinical trial setting with the purpose of illustrating that the treatment effect was consistent across different patient characteristics or identifying characteristics that should be targeted for treatment. There are statistical issues involved in performing subgroup analyses, however. These have been given considerable attention in the literature for analyses where subgroups are defined by a pre-randomization feature. Although subgroup analyses are often performed with subgroups defined by a post-randomization feature--including analyses that estimate the treatment effect among compliers--discussion of these analyses has been neglected in the clinical literature. Such analyses pose a high risk of presenting biased descriptions of treatment effects. We summarize the challenges of doing all types of subgroup analyses described in the literature. In particular, we emphasize issues with post-randomization subgroup analyses. Finally, we provide guidelines on how to proceed across the spectrum of subgroup analyses.

  3. Limit order book and its modeling in terms of Gibbs Grand-Canonical Ensemble

    NASA Astrophysics Data System (ADS)

    Bicci, Alberto

    2016-12-01

    In the domain of so-called Econophysics some attempts have already been made to apply the theory of thermodynamics and statistical mechanics to economics and financial markets. In this paper a similar approach is made from a different perspective, trying to model the limit order book and price formation process of a given stock by the Grand-Canonical Gibbs Ensemble for the bid and ask orders. The application of Bose-Einstein statistics to this ensemble then allows one to derive the distribution of the sell and buy orders as a function of price. As a consequence we can define in a meaningful way expressions for the temperatures of the ensembles of bid orders and of ask orders, which are a function of minimum bid, maximum ask and closure prices of the stock as well as of the exchanged volume of shares. It is demonstrated that the difference between the ask and bid orders temperatures can be related to the VAO (Volume Accumulation Oscillator), an indicator empirically defined in Technical Analysis of stock markets. Furthermore the derived distributions for aggregate bid and ask orders can be subject to well defined validations against real data, giving a falsifiable character to the model.

  4. On the meaning of the weighted alternative free-response operating characteristic figure of merit.

    PubMed

    Chakraborty, Dev P; Zhai, Xuetong

    2016-05-01

    The free-response receiver operating characteristic (FROC) method is being increasingly used to evaluate observer performance in search tasks. Data analysis requires definition of a figure of merit (FOM) quantifying performance. While a number of FOMs have been proposed, the recommended one, namely, the weighted alternative FROC (wAFROC) FOM, is not well understood. The aim of this work is to clarify the meaning of this FOM by relating it to the empirical area under a proposed wAFROC curve. The weighted wAFROC FOM is defined in terms of a quasi-Wilcoxon statistic that involves weights, coding the clinical importance, assigned to each lesion. A new wAFROC curve is proposed, the y-axis of which incorporates the weights, giving more credit for marking clinically important lesions, while the x-axis is identical to that of the AFROC curve. An expression is derived relating the area under the empirical wAFROC curve to the wAFROC FOM. Examples are presented with small numbers of cases showing how AFROC and wAFROC curves are affected by correct and incorrect decisions and how the corresponding FOMs credit or penalize these decisions. The wAFROC, AFROC, and inferred ROC FOMs were applied to three clinical data sets involving multiple reader FROC interpretations in different modalities. It is shown analytically that the area under the empirical wAFROC curve equals the wAFROC FOM. This theorem is the FROC analog of a well-known theorem developed in 1975 for ROC analysis, which gave meaning to a Wilcoxon statistic based ROC FOM. A similar equivalence applies between the area under the empirical AFROC curve and the AFROC FOM. The examples show explicitly that the wAFROC FOM gives equal importance to all diseased cases, regardless of the number of lesions, a desirable statistical property not shared by the AFROC FOM. Applications to the clinical data sets show that the wAFROC FOM yields results comparable to that using the AFROC FOM. The equivalence theorem gives meaning to the weighted AFROC FOM, namely, it is identical to the empirical area under weighted AFROC curve.

  5. Estimating earthquake location and magnitude from seismic intensity data

    USGS Publications Warehouse

    Bakun, W.H.; Wentworth, C.M.

    1997-01-01

    Analysis of Modified Mercalli intensity (MMI) observations for a training set of 22 California earthquakes suggests a strategy for bounding the epicentral region and moment magnitude M from MMI observations only. We define an intensity magnitude MI that is calibrated to be equal in the mean to M. MI = mean (Mi), where Mi = (MMIi + 3.29 + 0.0206 * Δi)/1.68 and Δi is the epicentral distance (km) of observation MMIi. The epicentral region is bounded by contours of rms [MI] = rms (MI - Mi) - rms0 (MI - Mi), where rms is the root mean square, rms0 (MI - Mi) is the minimum rms over a grid of assumed epicenters, and empirical site corrections and a distance weighting function are used. Empirical contour values for bounding the epicenter location and empirical bounds for M estimated from MI appropriate for different levels of confidence and different quantities of intensity observations are tabulated. The epicentral region bounds and MI obtained for an independent test set of western California earthquakes are consistent with the instrumental epicenters and moment magnitudes of these earthquakes. The analysis strategy is particularly appropriate for the evaluation of pre-1900 earthquakes for which the only available data are a sparse set of intensity observations.
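
    The quoted relation translates directly into code. The sketch below computes the per-observation magnitudes Mi and the intensity magnitude MI from MMI values and epicentral distances; the example observations are invented, and the grid search over trial epicenters is not shown.

        import numpy as np

        def intensity_magnitude(mmi, dist_km):
            """Mi = (MMI_i + 3.29 + 0.0206 * Delta_i) / 1.68; MI = mean(Mi)."""
            mi = (np.asarray(mmi, float) + 3.29 + 0.0206 * np.asarray(dist_km, float)) / 1.68
            return mi, mi.mean()

        mi, MI = intensity_magnitude([6, 5, 4, 3], [10, 40, 80, 150])
        print(np.round(mi, 2), f"MI = {MI:.2f}")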

  6. Comparing extracorporeal cardiopulmonary resuscitation with conventional cardiopulmonary resuscitation: A meta-analysis.

    PubMed

    Kim, Su Jin; Kim, Hyun Jung; Lee, Hee Young; Ahn, Hyeong Sik; Lee, Sung Woo

    2016-06-01

    The objective was to determine whether extracorporeal cardiopulmonary resuscitation (ECPR), when compared with conventional cardiopulmonary resuscitation (CCPR), improves outcomes in adult patients, and to determine appropriate conditions that can predict good survival outcome in ECPR patients through a meta-analysis. We searched the relevant literature of comparative studies between ECPR and CCPR in adults, from the MEDLINE, EMBASE, and Cochrane databases. The baseline information and outcome data (survival, good neurologic outcome at discharge, at 3-6 months, and at 1 year after arrest) were extracted. Beneficial effect of ECPR on outcome was analyzed according to time interval, location of arrest (out-of-hospital cardiac arrest (OHCA) and in-hospital cardiac arrest (IHCA)), and pre-defined population inclusion criteria (witnessed arrest, initial shockable rhythm, cardiac etiology of arrest and CPR duration) by using Review Manager 5.3. Cochran's Q test and I(2) were calculated. 10 of 1583 publications were included. Although survival to discharge did not show clear superiority in OHCA, ECPR showed statistically improved survival and good neurologic outcome as compared to CCPR, especially at 3-6 months after arrest. In the subgroup of patients with pre-defined inclusion criteria, the pooled meta-analysis found similar results in studies with pre-defined criteria. Survival and good neurologic outcome tended to be superior in the ECPR group at 3-6 months after arrest. The effect of ECPR on survival to discharge in OHCA was not clearly shown. As ECPR showed better outcomes than CCPR in studies with pre-defined criteria, strict indication criteria should be considered when implementing ECPR. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
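
    A minimal sketch of the pooling arithmetic behind such a meta-analysis: inverse-variance fixed-effect pooling of log relative risks, plus Cochran's Q and I(2). The per-study event counts are invented and do not correspond to the included publications.

        import numpy as np

        # Invented per-study counts: (events_ecpr, n_ecpr, events_ccpr, n_ccpr).
        studies = [(12, 60, 8, 120), (20, 85, 15, 170), (9, 40, 10, 95)]

        log_rr, var = [], []
        for a, n1, c, n2 in studies:
            log_rr.append(np.log((a / n1) / (c / n2)))
            var.append(1 / a - 1 / n1 + 1 / c - 1 / n2)    # variance of the log relative risk
        log_rr, var = np.array(log_rr), np.array(var)

        w = 1 / var                                        # inverse-variance weights
        pooled = np.sum(w * log_rr) / np.sum(w)
        se = np.sqrt(1 / np.sum(w))
        Q = np.sum(w * (log_rr - pooled) ** 2)             # Cochran's Q
        I2 = max(0.0, (Q - (len(studies) - 1)) / Q) * 100  # I^2 heterogeneity (%)
        lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
        print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), Q = {Q:.2f}, I2 = {I2:.0f}%")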

  7. Nonparametric spirometry reference values for Hispanic Americans.

    PubMed

    Glenn, Nancy L; Brown, Vanessa M

    2011-02-01

    Recent literature cites ethnic origin as a major factor in developing pulmonary function reference values. Extensive studies established reference values for European and African Americans, but not for Hispanic Americans. The Third National Health and Nutrition Examination Survey defines Hispanic as individuals of Spanish-speaking cultures. While no group was excluded from the target population, sample size requirements only allowed inclusion of individuals who identified themselves as Mexican Americans. This research constructs nonparametric reference value confidence intervals for Hispanic American pulmonary function. The method is applicable to all ethnicities. We use empirical likelihood confidence intervals to establish normal ranges for reference values. Its major advantage: it is model-free, but shares asymptotic properties of model-based methods. Statistical comparisons indicate that empirical likelihood interval lengths are comparable to normal theory intervals. Power and efficiency studies agree with previously published theoretical results.

  8. Conceptualisations of infinity by primary pre-service teachers

    NASA Astrophysics Data System (ADS)

    Date-Huxtable, Elizabeth; Cavanagh, Michael; Coady, Carmel; Easey, Michael

    2018-05-01

    As part of the Opening Real Science: Authentic Mathematics and Science Education for Australia project, an online mathematics learning module embedding conceptual thinking about infinity in science-based contexts, was designed and trialled with a cohort of 22 pre-service teachers during 1 week of intensive study. This research addressed the question: "How do pre-service teachers conceptualise infinity mathematically?" Participants argued the existence of infinity in a summative reflective task, using mathematical and empirical arguments that were coded according to five themes: definition, examples, application, philosophy and teaching; and 17 codes. Participants' reflections were differentiated as to whether infinity was referred to as an abstract (A) or a real (R) concept or whether both (B) codes were used. Principal component analysis of the reflections, using frequency of codings, revealed that A and R codes occurred at different frequencies in three groups of reflections. Distinct methods of argument were associated with each group of reflections: mathematical numerical examples and empirical measurement comparisons characterised arguments for infinity as an abstract concept, geometric and empirical dynamic examples and belief statements characterised arguments for infinity as a real concept and empirical measurement and mathematical examples and belief statements characterised arguments for infinity as both an abstract and a real concept. An implication of the results is that connections between mathematical and empirical applications of infinity may assist pre-service teachers to contrast finite with infinite models of the world.

  9. Learning algebra through MCREST strategy in junior high school students

    NASA Astrophysics Data System (ADS)

    Siregar, Nurfadilah; Kusumah, Yaya S.; Sabandar, J.; Dahlan, J. A.

    2017-09-01

    The aims of this paper are to describe the use of the MCREST strategy in learning algebra and to obtain empirical evidence on the effect of the MCREST strategy, especially on reasoning ability. Eighth-grade students at one school in Cimahi City were chosen as the sample of this study. Using a pre-test and post-test control group design, the data were then analyzed with descriptive and inferential statistics. The results of this study show that the students who were taught with the MCREST strategy obtained better results on the reasoning ability test than the students who received direct instruction. This indicates that the MCREST strategy has a positive impact on learning algebra.

  10. A Simulation Study of Threats to Validity in Quasi-Experimental Designs: Interrelationship between Design, Measurement, and Analysis.

    PubMed

    Holgado-Tello, Fco P; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A

    2016-01-01

    The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) A multistate model in the control condition (pre-test); and (2) A single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both the models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity.
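
    A minimal illustration of one of the simulated threats, range restriction: selecting cases on the observed pre-test attenuates the correlation between two indicators of the same latent trait. The numbers and the selection rule are invented, and a full SEM fit (e.g. with a package such as semopy) is not shown here.

        import numpy as np

        rng = np.random.default_rng(8)
        n = 5000
        trait = rng.normal(0, 1, n)                 # latent trait
        pre = trait + rng.normal(0, 0.6, n)         # pre-test indicator
        post = trait + rng.normal(0, 0.6, n)        # post-test indicator

        full_r = np.corrcoef(pre, post)[0, 1]
        keep = pre > np.quantile(pre, 0.6)          # range restriction: keep top 40% on the pre-test
        restricted_r = np.corrcoef(pre[keep], post[keep])[0, 1]
        print(f"full-range r = {full_r:.2f}, range-restricted r = {restricted_r:.2f}")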

  11. A Simulation Study of Threats to Validity in Quasi-Experimental Designs: Interrelationship between Design, Measurement, and Analysis

    PubMed Central

    Holgado-Tello, Fco. P.; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A.

    2016-01-01

    The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) A multistate model in the control condition (pre-test); and (2) A single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both the models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity. PMID:27378991

  12. Statistical Methods in Ai: Rare Event Learning Using Associative Rules and Higher-Order Statistics

    NASA Astrophysics Data System (ADS)

    Iyer, V.; Shetty, S.; Iyengar, S. S.

    2015-07-01

    Rare event learning has not been actively researched until recently, due to the unavailability of algorithms that deal with big samples. The research addresses spatio-temporal streams from multi-resolution sensors to find actionable items from the perspective of real-time algorithms. This computing framework is independent of the number of input samples, application domain, and labelled or label-less streams. A sampling overlap algorithm such as Brooks-Iyengar is used for dealing with noisy sensor streams. We extend the existing noise pre-processing algorithms using Data-Cleaning trees. Pre-processing with an ensemble of trees using bagging and multi-target regression showed robustness to random noise and missing data. As spatio-temporal streams are highly statistically correlated, we prove that temporal window based sampling from sensor data streams converges after n samples using Hoeffding bounds, which can be used for fast prediction of new samples in real time. The Data-Cleaning tree model uses a nonparametric node splitting technique, which can be learned in an iterative way that scales linearly in memory consumption for any size of input stream. The improved task based ensemble extraction is compared with non-linear computation models using various SVM kernels for speed and accuracy. We show, using empirical datasets, that the explicit rule learning computation is linear in time and is only dependent on the number of leaves present in the tree ensemble. The use of unpruned trees (t) in our proposed ensemble always yields a minimum number (m) of leaves, keeping pre-processing computation to n × t log m compared to N^2 for the Gram matrix. We also show that the task based feature induction yields higher Quality of Data (QoD) in the feature space compared to kernel methods using the Gram matrix.
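
    The Hoeffding-bound argument referenced above gives a distribution-free sample size for window-based estimates: for values bounded in [0, 1], n >= ln(2/delta) / (2 * eps^2) samples guarantee that the window mean is within eps of its expectation with probability at least 1 - delta. The eps and delta below are arbitrary examples.

        import math

        def hoeffding_n(eps, delta):
            """Samples needed so that P(|mean - E[mean]| > eps) <= delta for [0, 1]-bounded data."""
            return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

        print(hoeffding_n(eps=0.05, delta=0.05))   # e.g. 738 samples per temporal window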

  13. Sexual dysfunctions in women.

    PubMed

    Meston, Cindy M; Bradford, Andrea

    2007-01-01

    In this article, we summarize the definition, etiology, assessment, and treatment of sexual dysfunctions in women. Although the Diagnostic and Statistical Manual of Mental Disorders, fourth edition (DSM-IV-TR) is our guiding framework for classifying and defining women's sexual dysfunctions, we draw special attention to recent discussion in the literature criticizing the DSM-IV-TR diagnostic criteria and their underlying assumptions. Our review of clinical research on sexual dysfunction summarizes psychosocial and biomedical management approaches, with a critical examination of the empirical support for commonly prescribed therapies and limitations of recent clinical trials.

  14. Is it beneficial to approximate pre-failure topography to predict landslide susceptibility with empirical models?

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Schmaltz, Elmar; Glade, Thomas

    2017-04-01

    Empirical landslide susceptibility maps spatially depict the areas where future slope failures are likely due to specific environmental conditions. The underlying statistical models are based on the assumption that future landsliding is likely to occur under similar circumstances (e.g. topographic conditions, lithology, land cover) as past slope failures. This principle is operationalized by applying a supervised classification approach (e.g. a regression model with a binary response: landslide presence/absence) that enables discrimination between conditions that favored past landslide occurrences and the circumstances typical for landslide absences. The derived empirical relation is then transferred to each spatial unit of an area. Literature reveals that the specific topographic conditions representative for landslide presences are frequently extracted from derivatives of digital terrain models at locations where past landslides were mapped. The underlying morphology-based landslide identification becomes possible due to the fact that the topography at a specific locality usually changes after landslide occurrence (e.g. hummocky surface, concave and steep scarp). In a strict sense, this implies that topographic predictors used within conventional statistical landslide susceptibility models relate to post-failure topographic conditions - and not to the required pre-failure situation. This study examines the assumption that models calibrated on the basis of post-failure topographies may not be appropriate to predict future landslide locations, because (i) post-failure and pre-failure topographic conditions may differ and (ii) areas where future landslides will occur do not yet exhibit such a distinct post-failure morphology. The study was conducted for an area located in the Walgau region (Vorarlberg, western Austria), where a detailed inventory consisting of shallow landslides was available. The methodology comprised multiple systematic comparisons of models generated on the basis of post-failure conditions (i.e. the standard approach) with models based on an approximated pre-failure topography. Pre-failure topography was approximated by (i) erasing the area of mapped landslide polygons within a digital terrain model and (ii) filling these "empty" areas by interpolating elevation points located outside the mapped landslides. Landslide presence information was extracted from the respective landslide scarp locations while an equal number of randomly sampled points represented landslide absences. After an initial exploratory data analysis, mixed-effects logistic regression was applied to model landslide susceptibility on the basis of two predictor sets (post-failure versus pre-failure predictors). Furthermore, all analyses were separately conducted for five different modelling resolutions to elaborate the suspicion that the degree of generalization of topographic parameters may also play a role in how the respective models may differ. Model evaluation was conducted by means of multiple procedures (i.e. odds ratios, k-fold cross validation, permutation-based variable importance, difference maps of predictions). The results revealed that models based on highest resolutions (e.g. 1 m, 2.5 m) and post-failure topography performed best from a purely quantitative perspective.
    A confrontation of models (post-failure versus pre-failure based models) based on an identical modelling resolution revealed that validation results, modelled relationships as well as the prediction pattern tended to converge with a decreasing raster resolution. Based on the results, we concluded that an approximation of pre-failure topography does not significantly contribute to improved landslide susceptibility models when (i) the underlying inventory consists of small landslide features and (ii) the models are based on coarse raster resolutions (e.g. 25 m). However, when modelling with high raster resolutions is envisaged (e.g. 1 m, 2.5 m) or the inventory mainly consists of larger events, a reconstruction of pre-failure conditions might be highly expedient, even though conventional validation results might indicate an opposite tendency. Finally, we recommend considering that topographic predictors highly useful to detect past slope movements (e.g. roughness) are not necessarily valuable to predict future slope instabilities.
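
    A minimal sketch of the pre-failure topography approximation described above: cells inside a mapped landslide polygon are blanked and re-estimated from the surrounding terrain with scipy's griddata. The grid, the circular "polygon" mask, and the terrain are synthetic; real work would use the mapped polygon geometries and an actual digital terrain model.

        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(9)
        ny, nx = 100, 100
        yy, xx = np.mgrid[0:ny, 0:nx]
        dtm = 500 + 0.5 * xx + 0.3 * yy + rng.normal(0, 0.5, (ny, nx))   # synthetic terrain

        inside = (xx - 50) ** 2 + (yy - 50) ** 2 < 15 ** 2   # stand-in for a mapped landslide polygon

        pre_failure = dtm.copy()
        pre_failure[inside] = griddata(
            points=np.column_stack([xx[~inside], yy[~inside]]),
            values=dtm[~inside],
            xi=np.column_stack([xx[inside], yy[inside]]),
            method="linear",
        )
        print(float(np.abs(pre_failure - dtm)[inside].mean()))   # size of the adjustment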

  15. The Medical Duty Officer: An Attempt to Mitigate the Ambulance At-Hospital Interval

    PubMed Central

    Halliday, Megan H.; Bouland, Andrew J.; Lawner, Benjamin J.; Comer, Angela C.; Ramos, Daniel C.; Fletcher, Mark

    2016-01-01

    Introduction A lack of coordination between emergency medical services (EMS), emergency departments (ED) and systemwide management has contributed to extended ambulance at-hospital times at local EDs. In an effort to improve communication within the local EMS system, the Baltimore City Fire Department (BCFD) placed a medical duty officer (MDO) in the fire communications bureau. It was hypothesized that any real-time intervention suggested by the MDO would be manifested in a decrease in the EMS at-hospital time. Methods The MDO was implemented on November 11, 2013. A senior EMS paramedic was assigned to the position and was placed in the fire communication bureau from 9 a.m. to 9 p.m., seven days a week. We defined the pre-intervention period as August 2013 – October 2013 and the post-intervention period as December 2013 – February 2014. We also compared the post-intervention period to the “seasonal match control” one year earlier to adjust for seasonal variation in EMS volume. The MDO was tasked with the prospective management of city EMS resources through intensive monitoring of unit availability and hospital ED traffic. The MDO could suggest alternative transport destinations in the event of ED crowding. We collected and analyzed data from BCFD computer-aided dispatch (CAD) system for the following: ambulance response times, ambulance at-hospital interval, hospital diversion and alert status, and “suppression wait time” (defined as the total time suppression units remained on scene until ambulance arrival). The data analysis used a pre/post intervention design to examine the MDO impact on the BCFD EMS system. Results There were a total of 15,567 EMS calls during the pre-intervention period, 13,921 in the post-intervention period and 14,699 in the seasonal match control period one year earlier. The average at-hospital time decreased by 1.35 minutes from pre- to post-intervention periods and 4.53 minutes from the pre- to seasonal match control, representing a statistically significant decrease in this interval. There was also a statistically significant decrease in hospital alert time (approximately 1,700 hour decrease pre- to post-intervention periods) and suppression wait time (less than one minute decrease from pre- to post- and pre- to seasonal match control periods). The decrease in ambulance response time was not statistically significant. Conclusion Proactive deployment of a designated MDO was associated with a small, contemporaneous reduction in at-hospital time within an urban EMS jurisdiction. This project emphasized the importance of better communication between EMS systems and area hospitals as well as uniform reporting of variables for future iterations of this and similar projects. PMID:27625737

  16. Comparison between volatility return intervals of the S&P 500 index and two common models

    NASA Astrophysics Data System (ADS)

    Vodenska-Chitkushev, I.; Wang, F. Z.; Weber, P.; Yamasaki, K.; Havlin, S.; Stanley, H. E.

    2008-01-01

    We analyze the S&P 500 index data for the 13-year period, from January 1, 1984 to December 31, 1996, with one data point every 10 min. For this database, we study the distribution and clustering of volatility return intervals, which are defined as the time intervals between successive volatilities above a certain threshold q. We find that the long memory in the volatility leads to a clustering of above-median as well as below-median return intervals. In addition, it turns out that the short return intervals form larger clusters compared to the long return intervals. When comparing the empirical results to the ARMA-FIGARCH and fBm models for volatility, we find that the fBm model predicts scaling better than the ARMA-FIGARCH model, which is consistent with the argument that both ARMA-FIGARCH and fBm capture the long-term dependence in return intervals to a certain extent, but only fBm accounts for the scaling. We perform the Student's t-test to compare the empirical data with the shuffled records, ARMA-FIGARCH and fBm. We analyze separately the clusters of above-median return intervals and the clusters of below-median return intervals for different thresholds q. We find that the empirical data are statistically different from the shuffled data for all thresholds q. Our results also suggest that the ARMA-FIGARCH model is statistically different from the S&P 500 for intermediate q for both above-median and below-median clusters, while fBm is statistically different from S&P 500 for small and large q for above-median clusters and for small q for below-median clusters. Neither model can fully explain the entire regime of q studied.
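
    A minimal sketch of the central quantity: return intervals between successive exceedances of a volatility threshold q, here computed on a synthetic heavy-tailed return series rather than the S&P 500 data, with absolute returns as a crude volatility proxy.

        import numpy as np

        rng = np.random.default_rng(10)
        returns = rng.standard_t(df=4, size=20000) * 0.01   # heavy-tailed synthetic returns
        vol = np.abs(returns)                               # simple volatility proxy

        q = np.quantile(vol, 0.95)                          # threshold defining "high volatility"
        exceed_times = np.flatnonzero(vol > q)
        intervals = np.diff(exceed_times)                   # return intervals between exceedances

        median = np.median(intervals)
        print("mean interval:", intervals.mean(),
              "fraction above the median:", (intervals > median).mean())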

  17. Exploring the impact of mindfulness meditation training in pre-licensure and post graduate nurses.

    PubMed

    Sanko, Jill; Mckay, Mary; Rogers, Scott

    2016-10-01

    The complex, high-stress, technologically laden healthcare environment compromises providers' ability to be fully present in the moment, especially during patient interactions. This "pulling away" of attention (mindlessness) from the present moment creates an environment where decision making can take place in the absence of thoughtful, deliberate engagement in the task at hand. Mindfulness can be cultivated through a variety of mindfulness practices. Few schools of nursing or hospitals offer mindfulness training, despite study findings supporting its effectiveness in improving levels of mindfulness and perceived connections with patients and families. A mindfulness program developed for this study and tailored to nursing was used to provide the mindfulness training. Pre- and post-training assessments were completed and included administration of the Freiburg Mindfulness Inventory (FMI) and the Defining Issues Test (DIT) of moral judgment, version 2. A statistically significant improvement in the FMI scores (p=0.003) was found. The pre-licensure group did not show a statistically significant improvement in their FMI scores from pre to post training (p=0.281); however, the post-graduate group did (p=0.004). Statistically significant pre-post score changes were found in two schemas of the DIT-2 (P [Post conventional] score, p=0.039 and N2 [Maintaining norms] score, p=0.032). Mindfulness training improves mindfulness and some aspects of ethical decision making in the groups studied as part of this project. The findings of this study are promising and further demonstrate the merits of a mindfulness practice; however, aspects of mindfulness training would need to be addressed prior to launching a full-scale attempt to incorporate this into a work life or some other quality improvement program. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Personality traits in patients with Parkinson's disease: assessment and clinical implications.

    PubMed

    Poletti, Michele; Bonuccelli, Ubaldo

    2012-06-01

    This study reviews empirical evidence on the association between personality traits and Parkinson's disease (PD), with a twofold aim. First, to better identify non-motor symptoms, such as affective symptoms and personality changes, that could help to define the pre-motor phase of PD; second, to better understand the neurobiological bases of personality traits, a goal that is not fully accomplished by a purely anatomical approach. A literature review of studies of personality traits in PD patients was conducted in July 2011 in the electronic databases ISI Web of Knowledge, Medline and PsychInfo. We found that the existence of a characteristic premorbid personality profile of PD patients is not supported by robust empirical evidence, mainly due to the methodological bias of retrospective personality assessment; PD patients do, however, present a personality profile of low novelty seeking and high harm avoidance. We concluded that the definition of a pre-motor phase of PD, based on non-motor symptoms, should search for the presence of concomitant affective disorders and for a positive psychiatric history of affective disorders rather than for a typical personality profile or personality changes. The low novelty seeking profile is probably related to the dopaminergic deficit, while the high harm avoidance profile is probably associated with the presence of affective disorders. Clinical implications of these findings, in regard to personality assessment and pharmacological treatments in PD, are also discussed.

  19. Synchrony in Dyadic Psychotherapy Sessions

    NASA Astrophysics Data System (ADS)

    Ramseyer, Fabian; Tschacher, Wolfgang

    Synchrony is a multi-faceted concept used in diverse domains such as physics, biology, and the social sciences. This chapter reviews some of the evidence of nonverbal synchrony in human communication, with a main focus on the role of synchrony in the psychotherapeutic setting. Nonverbal synchrony describes coordinated behavior of patient and therapist. Its association with empathy, rapport and the therapeutic relationship has been pointed out repeatedly, yet close evaluation of empirical studies suggests that the evidence remains inconclusive. Particularly in naturalistic studies, research with quantitative measures of synchrony is still lacking. We introduce a new empirical approach for the study of synchrony in psychotherapies under field conditions: Motion Energy Analysis (MEA). This is a video-based algorithm that quantifies the amount of movement in freely definable regions of interest. Our statistical analysis detects synchrony on a global level, irrespective of the specific body parts moving. Synchrony thus defined can be considered as a general measure of movement coordination between interacting individuals. Data from a sequence of N = 21 therapy sessions taken from one psychotherapy dyad show a high positive relationship between synchrony and the therapeutic bond. Nonverbal synchrony can thus be considered a promising concept for research on the therapeutic alliance. Further areas of application are discussed.

  20. Spatio-temporal distribution of Oklahoma earthquakes: Exploring relationships using a nearest-neighbor approach: Nearest-neighbor analysis of Oklahoma

    DOE PAGES

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    2017-06-24

    Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog’s inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity) where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.

  1. Spatiotemporal distribution of Oklahoma earthquakes: Exploring relationships using a nearest-neighbor approach

    NASA Astrophysics Data System (ADS)

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    2017-07-01

    Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog's inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity) where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
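
    The abstract above does not spell out the proximity metric; a common choice in this literature is a Zaliapin-style space-time-magnitude nearest-neighbor distance, sketched below on a synthetic catalog. The parameter values (b-value, fractal dimension) and the catalog itself are illustrative assumptions, not the authors' exact formulation.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 500
      t = np.sort(rng.uniform(0.0, 3650.0, n))                 # event times (days)
      x, y = rng.uniform(0, 200, n), rng.uniform(0, 200, n)    # epicenters (km)
      m = rng.exponential(1.0 / np.log(10), n) + 2.5           # Gutenberg-Richter-like magnitudes

      b, d_f = 1.0, 1.6                                        # assumed b-value and fractal dimension

      def nearest_neighbor_eta(i):
          """Proximity of event i to its closest earlier (parent) event."""
          dt = t[i] - t[:i]                                    # positive time lags (days)
          dr = np.hypot(x[i] - x[:i], y[i] - y[:i])            # epicentral distances (km)
          eta = dt * dr**d_f * 10.0 ** (-b * m[:i])
          return eta.min()

      etas = np.array([nearest_neighbor_eta(i) for i in range(1, n)])
      # Small eta suggests a clustered (triggered) event; large eta suggests background seismicity.
      print("median log10(eta):", round(np.median(np.log10(etas)), 2))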

  2. Spatio-temporal distribution of Oklahoma earthquakes: Exploring relationships using a nearest-neighbor approach: Nearest-neighbor analysis of Oklahoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog’s inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity) where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.

  3. The role of empirical Bayes methodology as a leading principle in modern medical statistics.

    PubMed

    van Houwelingen, Hans C

    2014-11-01

    This paper reviews and discusses the role of Empirical Bayes methodology in medical statistics in the last 50 years. It gives some background on the origin of the empirical Bayes approach and its link with the famous Stein estimator. The paper describes the application in four important areas in medical statistics: disease mapping, health care monitoring, meta-analysis, and multiple testing. It ends with a warning that the application of the outcome of an empirical Bayes analysis to the individual "subjects" is a delicate matter that should be handled with prudence and care. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
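
    As a minimal illustration of the empirical Bayes idea discussed above (and its link to Stein-type shrinkage), the sketch below shrinks noisy unit-level estimates toward a pooled mean in a normal-normal model, with the prior mean and variance estimated from the data themselves. The synthetic data and the method-of-moments estimator are illustrative assumptions, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(2)
      k = 50
      theta = rng.normal(0.0, 1.0, k)            # true unit-level effects (unknown in practice)
      s = rng.uniform(0.5, 1.5, k)               # known sampling standard errors
      y = rng.normal(theta, s)                   # observed estimates (e.g., per-unit rates)

      # Method-of-moments estimates of the prior mean and variance
      mu_hat = np.average(y, weights=1.0 / s**2)
      tau2_hat = max(0.0, np.var(y, ddof=1) - np.mean(s**2))

      # Posterior (shrunken) estimates: precision-weighted compromise between y_i and the pooled mean
      shrink = tau2_hat / (tau2_hat + s**2)
      theta_eb = mu_hat + shrink * (y - mu_hat)

      print("mean squared error, raw estimates      :", round(np.mean((y - theta) ** 2), 3))
      print("mean squared error, EB-shrunk estimates:", round(np.mean((theta_eb - theta) ** 2), 3))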

  4. The development of ensemble theory. A new glimpse at the history of statistical mechanics

    NASA Astrophysics Data System (ADS)

    Inaba, Hajime

    2015-12-01

    This paper investigates the history of statistical mechanics from the viewpoint of the development of the ensemble theory from 1871 to 1902. In 1871, Ludwig Boltzmann introduced a prototype model of an ensemble that represents a polyatomic gas. In 1879, James Clerk Maxwell defined an ensemble as copies of systems of the same energy. Inspired by H.W. Watson, he called his approach "statistical". Boltzmann and Maxwell regarded the ensemble theory as a much more general approach than the kinetic theory. In the 1880s, influenced by Hermann von Helmholtz, Boltzmann made use of ensembles to establish thermodynamic relations. In Elementary Principles in Statistical Mechanics of 1902, Josiah Willard Gibbs tried to get his ensemble theory to mirror thermodynamics, including thermodynamic operations in its scope. Thermodynamics played the role of a "blind guide". His theory of ensembles can be characterized as more mathematically oriented than Einstein's theory proposed in the same year. Mechanical, empirical, and statistical approaches to foundations of statistical mechanics are presented. Although it was formulated in classical terms, the ensemble theory provided an infrastructure still valuable in quantum statistics because of its generality.

  5. Evaluating the impact of an integrated multidisciplinary head & neck competency-based anatomy & radiology teaching approach in radiation oncology: a prospective cohort study

    PubMed Central

    2014-01-01

    Background Modern radiation oncology demands a thorough understanding of gross and cross-sectional anatomy for diagnostic and therapeutic applications. Complex anatomic sites present challenges for learners and are not well-addressed in traditional postgraduate curricula. A multidisciplinary team (MDT) based head-and-neck gross and radiologic anatomy program for radiation oncology trainees was developed, piloted, and empirically assessed for efficacy and learning outcomes. Methods Four site-specific MDT head-and-neck seminars were implemented, each involving a MDT delivering didactic and case-based instruction, supplemented by cadaveric presentations. There was no dedicated contouring instruction. Pre- and post-testing were performed to assess knowledge, and ability to apply knowledge to the clinical setting as defined by accuracy of contouring. Paired analyses of knowledge pretests and posttests were performed by Wilcoxon matched-pair signed-rank test. Results Fifteen post-graduate trainees participated. A statistically significant (p < 0.001) mean absolute improvement of 4.6 points (17.03%) was observed between knowledge pretest and posttest scores. Contouring accuracy was analyzed quantitatively by comparing spatial overlap of participants’ pretest and posttest contours with a gold standard through the dice similarity coefficient. A statistically significant improvement in contouring accuracy was observed for 3 out of 20 anatomical structures. Qualitative and quantitative feedback revealed that participants were more confident at contouring and were enthusiastic towards the seminars. Conclusions MDT seminars were associated with improved knowledge scores and resident satisfaction; however, increased gross and cross-sectional anatomic knowledge did not translate into improvements in contouring accuracy. Further research should evaluate the impact of hands-on contouring sessions in addition to dedicated instructional sessions to develop competencies. PMID:24969509

  6. Evaluating the impact of an integrated multidisciplinary head & neck competency-based anatomy & radiology teaching approach in radiation oncology: a prospective cohort study.

    PubMed

    D'Souza, Leah; Jaswal, Jasbir; Chan, Francis; Johnson, Marjorie; Tay, Keng Yeow; Fung, Kevin; Palma, David

    2014-06-26

    Modern radiation oncology demands a thorough understanding of gross and cross-sectional anatomy for diagnostic and therapeutic applications. Complex anatomic sites present challenges for learners and are not well-addressed in traditional postgraduate curricula. A multidisciplinary team (MDT) based head-and-neck gross and radiologic anatomy program for radiation oncology trainees was developed, piloted, and empirically assessed for efficacy and learning outcomes. Four site-specific MDT head-and-neck seminars were implemented, each involving a MDT delivering didactic and case-based instruction, supplemented by cadaveric presentations. There was no dedicated contouring instruction. Pre- and post-testing were performed to assess knowledge, and ability to apply knowledge to the clinical setting as defined by accuracy of contouring. Paired analyses of knowledge pretests and posttests were performed by Wilcoxon matched-pair signed-rank test. Fifteen post-graduate trainees participated. A statistically significant (p < 0.001) mean absolute improvement of 4.6 points (17.03%) was observed between knowledge pretest and posttest scores. Contouring accuracy was analyzed quantitatively by comparing spatial overlap of participants' pretest and posttest contours with a gold standard through the dice similarity coefficient. A statistically significant improvement in contouring accuracy was observed for 3 out of 20 anatomical structures. Qualitative and quantitative feedback revealed that participants were more confident at contouring and were enthusiastic towards the seminars. MDT seminars were associated with improved knowledge scores and resident satisfaction; however, increased gross and cross-sectional anatomic knowledge did not translate into improvements in contouring accuracy. Further research should evaluate the impact of hands-on contouring sessions in addition to dedicated instructional sessions to develop competencies.
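
    A hedged sketch of the two quantitative pieces mentioned above: the Dice similarity coefficient comparing a trainee contour with a gold-standard contour, and a Wilcoxon matched-pair signed-rank test on pre/post knowledge scores. All arrays are synthetic placeholders, not study data.

      import numpy as np
      from scipy.stats import wilcoxon

      def dice(a, b):
          """Dice similarity coefficient of two binary masks."""
          a, b = a.astype(bool), b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      rng = np.random.default_rng(3)
      gold = rng.random((64, 64)) > 0.7                  # gold-standard structure mask
      trainee = gold ^ (rng.random((64, 64)) > 0.95)     # trainee contour with some random errors
      print("Dice similarity coefficient:", round(dice(gold, trainee), 3))

      pre = rng.normal(18.0, 4.0, 15)                    # 15 trainees, pretest knowledge scores
      post = pre + rng.normal(4.6, 2.0, 15)              # posttest scores, mean gain ~4.6 points
      stat, p = wilcoxon(pre, post)
      print("Wilcoxon signed-rank p-value:", round(p, 4))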

  7. Empirical likelihood-based tests for stochastic ordering

    PubMed Central

    BARMI, HAMMOU EL; MCKEAGUE, IAN W.

    2013-01-01

    This paper develops an empirical likelihood approach to testing for the presence of stochastic ordering among univariate distributions based on independent random samples from each distribution. The proposed test statistic is formed by integrating a localized empirical likelihood statistic with respect to the empirical distribution of the pooled sample. The asymptotic null distribution of this test statistic is found to have a simple distribution-free representation in terms of standard Brownian bridge processes. The approach is used to compare the lengths of rule of Roman Emperors over various historical periods, including the “decline and fall” phase of the empire. In a simulation study, the power of the proposed test is found to improve substantially upon that of a competing test due to El Barmi and Mukerjee. PMID:23874142

  8. Approximating Long-Term Statistics Early in the Global Precipitation Measurement Era

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas; Kirschbaum, Dalia B.; Huffman, George J.; Adler, Robert F.

    2017-01-01

    Long-term precipitation records are vital to many applications, especially the study of extreme events. The Tropical Rainfall Measuring Mission (TRMM) has served this need, but TRMM's successor mission, Global Precipitation Measurement (GPM), does not yet provide a long-term record. Quantile mapping, the conversion of values across paired empirical distributions, offers a simple, established means to approximate such long-term statistics, but only within appropriately defined domains. This method was applied to a case study in Central America, demonstrating that quantile mapping between TRMM and GPM data maintains the performance of a real-time landslide model. Use of quantile mapping could bring the benefits of the latest satellite-based precipitation dataset to existing user communities such as those for hazard assessment, crop forecasting, numerical weather prediction, and disease tracking.
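
    The quantile-mapping step described above can be sketched in a few lines: each value of the shorter record is mapped through its empirical quantile onto the distribution of the longer record within a fixed domain. The gamma-distributed series below are synthetic stand-ins for the GPM-era and TRMM-era rainfall data.

      import numpy as np

      rng = np.random.default_rng(4)
      long_record = rng.gamma(shape=0.8, scale=6.0, size=20_000)   # stands in for the long (TRMM-era) record
      new_record = rng.gamma(shape=0.9, scale=5.0, size=2_000)     # stands in for the short (GPM-era) record

      def quantile_map(x, source_sample, target_sample):
          """Map x onto the target distribution via its empirical quantile in the source sample."""
          q = np.searchsorted(np.sort(source_sample), x, side="right") / len(source_sample)
          return np.quantile(target_sample, np.clip(q, 0.0, 1.0))

      mapped = quantile_map(new_record, new_record, long_record)
      print("99th percentile, raw new record   :", round(np.percentile(new_record, 99), 2))
      print("99th percentile, mapped new record:", round(np.percentile(mapped, 99), 2))
      print("99th percentile, long record      :", round(np.percentile(long_record, 99), 2))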

  9. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    PubMed

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    In the present paper, the novel software GTest is introduced, designed for testing the normality of a user-specified empirical distribution. It has been implemented with two unusual characteristics: the first is the user option of selecting four different versions of the normality test, each of them suited to a specific dataset or goal, and the second is the inferential paradigm that informs the output of such tests: it is basically graphical and intrinsically self-explanatory. The concept of inference-by-eye is an emerging inferential approach which will find successful application in the near future due to the growing need to widen the audience of users of statistical methods to people with informal statistical skills. For instance, the latest European regulation concerning environmental issues introduced strict protocols for data handling (data quality assurance, outlier detection, etc.) and information exchange (areal statistics, trend detection, etc.) between regional and central environmental agencies. Therefore, more and more frequently, laboratory and field technicians will be requested to utilize complex software applications for subjecting data coming from monitoring, surveying or laboratory activities to specific statistical analyses. Unfortunately, inferential statistics, which actually influence the decisional processes for the correct management of environmental resources, are often implemented in a way that expresses their outcomes in numerical form with brief comments in strict statistical jargon (degrees of freedom, level of significance, accepted/rejected H0, etc.). The interpretation of such outcomes is therefore often difficult for people with limited statistical knowledge. In this framework, the paradigm of visual inference can contribute to filling this gap, providing outcomes in self-explanatory graphical forms with a brief comment in common language. The difficulties experienced by colleagues, and their request for an effective tool to address them, motivated us to adopt the inference-by-eye paradigm and to implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have the robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any statistical quantitative approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.
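
    The following sketch illustrates the kind of graphical normality check described above: a normal Q-Q plot with a pointwise acceptance band around the theoretical quantiles, built here from Monte Carlo envelopes of standard-normal order statistics at a chosen alpha. This is an illustration of the idea, not the GTest (VBA) implementation.

      import numpy as np
      import matplotlib.pyplot as plt
      from scipy import stats

      rng = np.random.default_rng(5)
      data = rng.normal(10.0, 2.0, 80)                   # empirical sample to be tested for normality
      n, alpha, n_sim = len(data), 0.05, 2000

      # Monte Carlo envelope of standard-normal order statistics
      sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
      lo = np.quantile(sims, alpha / 2, axis=0)
      hi = np.quantile(sims, 1 - alpha / 2, axis=0)

      probs = (np.arange(1, n + 1) - 0.5) / n
      theo = stats.norm.ppf(probs)                       # theoretical standard-normal quantiles
      z = np.sort((data - data.mean()) / data.std(ddof=1))

      print("sample falls entirely within the acceptance region:", bool(np.all((z >= lo) & (z <= hi))))

      plt.plot(theo, z, "o", label="empirical quantiles")
      plt.fill_between(theo, lo, hi, alpha=0.3, label="95% acceptance region")
      plt.xlabel("theoretical quantiles")
      plt.ylabel("standardized sample quantiles")
      plt.legend()
      plt.savefig("qq_acceptance.png")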

  10. Reconstruction of stochastic temporal networks through diffusive arrival times

    NASA Astrophysics Data System (ADS)

    Li, Xun; Li, Xiang

    2017-06-01

    Temporal networks have opened a new dimension in the definition and quantification of complex interacting systems. Our ability to identify and reproduce time-resolved interaction patterns is, however, limited by the restricted access to empirical individual-level data. Here we propose an inverse modelling method based on first-arrival observations of the diffusion process taking place on temporal networks. We describe an efficient coordinate-ascent implementation for inferring stochastic temporal networks that builds in particular but not exclusively on the null model assumption of mutually independent interaction sequences at the dyadic level. The results of benchmark tests applied on both synthesized and empirical network data sets confirm the validity of our algorithm, showing the feasibility of statistically accurate inference of temporal networks from only moderate-sized samples of diffusion cascades. Our approach provides an effective and flexible scheme for the temporally augmented inverse problems of network reconstruction and has potential in a broad variety of applications.

  11. Large-displacement statistics of the rightmost particle of the one-dimensional branching Brownian motion.

    PubMed

    Derrida, Bernard; Meerson, Baruch; Sasorov, Pavel V

    2016-04-01

    Consider a one-dimensional branching Brownian motion and rescale the coordinate and time so that the rates of branching and diffusion are both equal to 1. If X1(t) is the position of the rightmost particle of the branching Brownian motion at time t, the empirical velocity c of this rightmost particle is defined as c = X1(t)/t. Using the Fisher-Kolmogorov-Petrovsky-Piscounov equation, we evaluate the probability distribution P(c,t) of this empirical velocity c in the long-time t limit for c > 2. It is already known that, for a single seed particle, P(c,t) ~ exp[-(c²/4 - 1)t] up to a prefactor that can depend on c and t. Here we show how to determine this prefactor. The result can be easily generalized to the case of multiple seed particles and to branching random walks associated with other traveling-wave equations.

  12. Empirical Investigations of the Opportunity Limits of Automatic Residential Electric Load Shaping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cruickshank, Robert F.; Henze, Gregor P.; Balaji, Rajagopalan

    Residential electric load shaping is often modeled as infrequent, utility-initiated, short-duration deferral of peak demand through direct load control. In contrast, modeled herein is the potential for frequent, transactive, intraday, consumer-configurable load shaping for storage-capable thermostatically controlled electric loads (TCLs), including refrigerators, freezers, and hot water heaters. Unique to this study are 28 months of 15-minute-interval observations of usage in 101 homes in the Pacific Northwest United States that specify exact start, duration, and usage patterns of approximately 25 submetered loads per home. The magnitudes of the load shift from voluntarily-participating TCL appliances are aggregated to form hourly upper and lower load-shaping limits for the coordination of electrical generation, transmission, distribution, storage, and demand. Empirical data are statistically analyzed to define metrics that help quantify load-shaping opportunities.

  13. Empirical Investigations of the Opportunity Limits of Automatic Residential Electric Load Shaping: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cruickshank, Robert F.; Henze, Gregor P.; Balaji, Rajagopalan

    Residential electric load shaping is often modeled as infrequent, utility-initiated, short-duration deferral of peak demand through direct load control. In contrast, modeled herein is the potential for frequent, transactive, intraday, consumer-configurable load shaping for storage-capable thermostatically controlled electric loads (TCLs), including refrigerators, freezers, and hot water heaters. Unique to this study are 28 months of 15-minute-interval observations of usage in 101 homes in the Pacific Northwest United States that specify exact start, duration, and usage patterns of approximately 25 submetered loads per home. The magnitudes of the load shift from voluntarily-participating TCL appliances are aggregated to form hourly upper and lower load-shaping limits for the coordination of electrical generation, transmission, distribution, storage, and demand. Empirical data are statistically analyzed to define metrics that help quantify load-shaping opportunities.

  14. Reconstruction of stochastic temporal networks through diffusive arrival times

    PubMed Central

    Li, Xun; Li, Xiang

    2017-01-01

    Temporal networks have opened a new dimension in the definition and quantification of complex interacting systems. Our ability to identify and reproduce time-resolved interaction patterns is, however, limited by the restricted access to empirical individual-level data. Here we propose an inverse modelling method based on first-arrival observations of the diffusion process taking place on temporal networks. We describe an efficient coordinate-ascent implementation for inferring stochastic temporal networks that builds in particular but not exclusively on the null model assumption of mutually independent interaction sequences at the dyadic level. The results of benchmark tests applied on both synthesized and empirical network data sets confirm the validity of our algorithm, showing the feasibility of statistically accurate inference of temporal networks from only moderate-sized samples of diffusion cascades. Our approach provides an effective and flexible scheme for the temporally augmented inverse problems of network reconstruction and has potential in a broad variety of applications. PMID:28604687

  15. When the Single Matters more than the Group (II): Addressing the Problem of High False Positive Rates in Single Case Voxel Based Morphometry Using Non-parametric Statistics.

    PubMed

    Scarpazza, Cristina; Nichols, Thomas E; Seramondi, Donato; Maumet, Camille; Sartori, Giuseppe; Mechelli, Andrea

    2016-01-01

    In recent years, an increasing number of studies have used Voxel Based Morphometry (VBM) to compare a single patient with a psychiatric or neurological condition of interest against a group of healthy controls. However, the validity of this approach critically relies on the assumption that the single patient is drawn from a hypothetical population with a normal distribution and variance equal to that of the control group. In a previous investigation, we demonstrated that the family-wise false positive error rate (i.e., the proportion of statistical comparisons yielding at least one false positive) in single case VBM is much higher than expected (Scarpazza et al., 2013). Here, we examine whether the use of non-parametric statistics, which does not rely on the assumptions of normal distribution and equal variance, would enable the investigation of single subjects with good control of false positive risk. We empirically estimated false positive rates (FPRs) in single case non-parametric VBM, by performing 400 statistical comparisons between a single disease-free individual and a group of 100 disease-free controls. The impact of smoothing (4, 8, and 12 mm) and type of pre-processing (Modulated, Unmodulated) was also examined, as these factors have been found to influence FPRs in previous investigations using parametric statistics. The 400 statistical comparisons were repeated using two independent, freely available data sets in order to maximize the generalizability of the results. We found that the family-wise error rate was 5% for increases and 3.6% for decreases in one data set; and 5.6% for increases and 6.3% for decreases in the other data set (5% nominal). Further, these results were not dependent on the level of smoothing and modulation. Therefore, the present study provides empirical evidence that single case VBM studies with non-parametric statistics are not susceptible to high false positive rates. The critical implication of this finding is that VBM can be used to characterize neuroanatomical alterations in individual subjects as long as non-parametric statistics are employed.
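
    A hedged sketch of how an empirical family-wise false positive rate can be estimated for single-case comparisons with non-parametric (permutation, max-statistic) inference, in the spirit of the study above. The tiny synthetic "images", the Crawford-Howell-style single-case t statistic, and the simulation sizes are illustrative assumptions, not the authors' pipeline.

      import numpy as np

      rng = np.random.default_rng(6)
      n_controls, n_voxels, n_perm, alpha = 30, 500, 200, 0.05

      def max_t(case_img, ctrl_imgs):
          """Max absolute Crawford-Howell-style single-case t statistic over voxels."""
          mu, sd = ctrl_imgs.mean(axis=0), ctrl_imgs.std(axis=0, ddof=1)
          t = (case_img - mu) / (sd * np.sqrt(1.0 + 1.0 / len(ctrl_imgs)))
          return np.abs(t).max()

      def single_case_fwe_p(case, controls):
          """FWE-corrected p-value via max-statistic permutation (case exchangeable with controls)."""
          observed = max_t(case, controls)
          pool = np.vstack([case, controls])
          count = 0
          for _ in range(n_perm):
              idx = rng.permutation(len(pool))
              if max_t(pool[idx[0]], pool[idx[1:]]) >= observed:
                  count += 1
          return (count + 1) / (n_perm + 1)

      n_tests, false_pos = 50, 0
      for _ in range(n_tests):
          data = rng.standard_normal((n_controls + 1, n_voxels))   # null data: no real single-case effect
          false_pos += single_case_fwe_p(data[0], data[1:]) < alpha

      print("empirical family-wise false positive rate:", false_pos / n_tests)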

  16. Remote Sensing of Evapotranspiration and Carbon Uptake at Harvard Forest

    NASA Technical Reports Server (NTRS)

    Min, Qilong; Lin, Bing

    2005-01-01

    A land surface vegetation index, defined as the difference of microwave land surface emissivity at 19 and 37 GHz, was calculated for a heavily forested area in north central Massachusetts. The microwave emissivity difference vegetation index (EDVI) was estimated from satellite SSM/I measurements at the defined wavelengths and used to estimate land surface turbulent fluxes. Narrowband visible and infrared measurements and broadband solar radiation observations were used in the EDVI retrievals and turbulent flux estimations. The EDVI values represent physical properties of crown vegetation such as vegetation water content of crown canopies. The collocated land surface turbulent and radiative fluxes were empirically linked together by the EDVI values. The EDVI values are statistically sensitive to evapotranspiration fractions (EF) with a correlation coefficient (R) greater than 0.79 under all-sky conditions. For clear skies, EDVI estimates exhibit a stronger relationship with EF than normalized difference vegetation index (NDVI). Furthermore, the products of EDVI and input energy (solar and photosynthetically-active radiation) are statistically significantly correlated to evapotranspiration (R=0.95) and CO2 uptake flux (R=0.74), respectively.

  17. Statistics, gymnastics and the origins of sport science in Belgium (and Europe).

    PubMed

    Delheye, Pascal

    2014-01-01

    This paper analyses the introduction of statistics in the field of gymnastics and its effect on the institutionalisation of physical education as a fully fledged academic discipline. Soon after Belgian independence, Adolphe Quetelet's research already resulted in large-scale anthropometric statistics - indeed, he developed an index that is still being used and is better known under the name of the body mass index. His insights were applied by promoters of gymnastics who wanted to make physical education more scientific. Thus, Clément Lefébure, director of the Ecole Normale de Gymnastique et d'Escrime in Brussels, set up a comparative experiment (with pre- and post-test measurements) by which he intended to show that the 'rational' method of Swedish gymnastics produced much better results than the 'empirical' method of Belgian/German Turnen. Lefébure's experiment, which was cited internationally but which was also strongly contested by opponents, was one of the factors that led to Swedish gymnastics being officially institutionalised in 1908 at the newly founded Higher Institute of Physical Education of the State University of Ghent, the first institute in the world where students could obtain a doctoral degree in physical education. Although it rested actually on very weak scientific foundations, the bastion of Swedish gymnastics built in Belgium in that pre-war period collapsed only in the 1960s. From then on, sport science could develop fully within the institutes for physical education.

  18. Pre-main-sequence isochrones - II. Revising star and planet formation time-scales

    NASA Astrophysics Data System (ADS)

    Bell, Cameron P. M.; Naylor, Tim; Mayne, N. J.; Jeffries, R. D.; Littlefair, S. P.

    2013-09-01

    We have derived ages for 13 young (<30 Myr) star-forming regions and find that they are up to a factor of 2 older than the ages typically adopted in the literature. This result has wide-ranging implications, including that circumstellar discs survive longer (≃ 10-12 Myr) and that the average Class I lifetime is greater (≃1 Myr) than currently believed. For each star-forming region, we derived two ages from colour-magnitude diagrams. First, we fitted models of the evolution between the zero-age main sequence and terminal-age main sequence to derive a homogeneous set of main-sequence ages, distances and reddenings with statistically meaningful uncertainties. Our second age for each star-forming region was derived by fitting pre-main-sequence stars to new semi-empirical model isochrones. For the first time (for a set of clusters younger than 50 Myr), we find broad agreement between these two ages, and since these are derived from two distinct mass regimes that rely on different aspects of stellar physics, it gives us confidence in the new age scale. This agreement is largely due to our adoption of empirical colour-Teff relations and bolometric corrections for pre-main-sequence stars cooler than 4000 K. The revised ages for the star-forming regions in our sample are: ~2 Myr for NGC 6611 (Eagle Nebula; M 16), IC 5146 (Cocoon Nebula), NGC 6530 (Lagoon Nebula; M 8) and NGC 2244 (Rosette Nebula); ~6 Myr for σ Ori, Cep OB3b and IC 348; ≃10 Myr for λ Ori (Collinder 69); ≃11 Myr for NGC 2169; ≃12 Myr for NGC 2362; ≃13 Myr for NGC 7160; ≃14 Myr for χ Per (NGC 884); and ≃20 Myr for NGC 1960 (M 36).

  19. Simulation, identification and statistical variation in cardiovascular analysis (SISCA) - A software framework for multi-compartment lumped modeling.

    PubMed

    Huttary, Rudolf; Goubergrits, Leonid; Schütte, Christof; Bernhard, Stefan

    2017-08-01

    It has not yet been possible to obtain modeling approaches suitable for covering a wide range of real-world scenarios in cardiovascular physiology because many of the system parameters are uncertain or even unknown. Natural variability and statistical variation of cardiovascular system parameters in healthy and diseased conditions are characteristic features for understanding cardiovascular diseases in more detail. This paper presents SISCA, a novel software framework for cardiovascular system modeling, and its MATLAB implementation. The framework defines a multi-model statistical ensemble approach for dimension-reduced, multi-compartment models and focuses on statistical variation, system identification and patient-specific simulation based on clinical data. We also discuss a data-driven modeling scenario as a use case example. The dataset considered originated from routine clinical examinations and comprised typical pre- and post-surgery clinical data from a patient diagnosed with coarctation of the aorta. We conducted patient- and disease-specific pre/post-surgery modeling by adapting a validated nominal multi-compartment model with respect to structure and parametrization, using metadata and MRI geometry. In both models, the simulation reproduced measured pressures and flows fairly well with respect to stenosis and stent treatment, including the pre-treatment phase shift of the pulse wave across the stenosis. However, with post-treatment data showing unrealistic phase shifts and other more obvious inconsistencies within the dataset, the methods and results we present suggest that conditioning and uncertainty management of routine clinical data sets need significantly more attention to obtain reasonable results in patient-specific cardiovascular modeling. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Grain boundary oxidation and an analysis of the effects of pre-oxidation on subsequent fatigue life

    NASA Technical Reports Server (NTRS)

    Oshida, Y.; Liu, H. W.

    1986-01-01

    The effects of preoxidation on subsequent fatigue life were studied. Surface oxidation and grain boundary oxidation of a nickel-base superalloy (TAZ-8A) were studied at 600 to 1000 C for 10 to 1000 hours in air. Surface oxides were identified and the kinetics of surface oxidation was discussed. Grain boundary oxide penetration and morphology were studied. Pancake-type grain boundary oxide penetrates deeper and is larger in size; it is therefore more detrimental to fatigue life than cone-type grain boundary oxide. Oxide penetration depth, a_m, is related to oxidation temperature, T, and exposure time, t, by an empirical relation of the Arrhenius type. Effects of T and t on the statistical variation of a_m were analyzed according to the Weibull distribution function. Once the oxide is cracked, it serves as a fatigue crack nucleus. Statistical variation of the remaining fatigue life, after the formation of an oxide crack of a critical length, is related directly to the statistical variation of grain boundary oxide penetration depth.
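
    The two empirical ingredients named above can be sketched as follows: an Arrhenius-type relation for grain-boundary oxide penetration depth a_m(T, t) and a Weibull fit to its scatter at one exposure condition. The prefactor, time exponent, activation energy, and the synthetic scatter are illustrative assumptions, not the values reported for TAZ-8A.

      import numpy as np
      from scipy import stats

      R = 8.314                                  # gas constant, J/(mol K)
      A, n_exp, Q = 3.0e6, 0.5, 1.5e5            # assumed prefactor, time exponent, activation energy (J/mol)

      def oxide_depth(T_kelvin, t_hours):
          """Arrhenius-type empirical relation a_m = A * t**n * exp(-Q / (R T)), arbitrary depth units."""
          return A * t_hours**n_exp * np.exp(-Q / (R * T_kelvin))

      mean_depth = oxide_depth(1273.0, 100.0)    # e.g. 1000 C, 100 h exposure
      print("nominal penetration depth a_m:", round(float(mean_depth), 2))

      # Synthetic scatter of measured depths at this condition, fitted by a Weibull distribution
      rng = np.random.default_rng(7)
      samples = mean_depth * rng.weibull(2.0, 200)
      shape_w, loc_w, scale_w = stats.weibull_min.fit(samples, floc=0.0)
      print("fitted Weibull shape (modulus):", round(shape_w, 2))
      print("fitted Weibull scale          :", round(scale_w, 2))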

  1. Culture of science: strange history of the methodological thinking in psychology.

    PubMed

    Toomela, Aaro

    2007-03-01

    In pre-World-War-II psychology, two directions in methodological thought, the German-Austrian and the North American ways, could be differentiated. After the war, the German-Austrian methodological orientation was largely abandoned. Compared to the pre-WWII German-Austrian psychology, modern mainstream psychology is more concerned with accumulation of facts than with general theory. Furthermore, the focus on qualitative data, in addition to quantitative data, is rarely visible. Only external (physical or statistical) rather than psychological controls are taken into account in empirical studies. Fragments, rather than wholes, and relationships are studied, and single cases that contradict group data are not analyzed. Instead of complex psychological types, simple trait differences are studied, and prediction is not followed by thorough analysis of the whole situation. Last (but not least), data are not systematically related to complex theory. These limits have hindered the growth of knowledge in the behavioral sciences. A new return to an updated version of the German-Austrian methodological trajectory is suggested.

  2. Prediction of rainfall anomalies during the dry to wet transition season over the Southern Amazonia using machine learning tools

    NASA Astrophysics Data System (ADS)

    Shan, X.; Zhang, K.; Zhuang, Y.; Fu, R.; Hong, Y.

    2017-12-01

    Seasonal prediction of rainfall during the dry-to-wet transition season in austral spring (September-November) over southern Amazonia is central to improving crop planting and fire mitigation in that region. Previous studies have identified the key large-scale atmospheric dynamic and thermodynamic pre-conditions during the dry season (June-August) that influence the rainfall anomalies during the dry to wet transition season over Southern Amazonia. Based on these key pre-conditions during the dry season, we have evaluated several statistical models and developed a Neural Network based statistical prediction system to predict rainfall during the dry to wet transition for Southern Amazonia (5-15°S, 50-70°W). Multivariate Empirical Orthogonal Function (EOF) Analysis is applied to the following four fields during JJA from the ECMWF Reanalysis (ERA-Interim) spanning 1979 to 2015: geopotential height at 200 hPa, surface relative humidity, convective inhibition energy (CIN) index and convective available potential energy (CAPE), to filter out noise and highlight the most coherent spatial and temporal variations. The first 10 EOF modes are retained as inputs to the statistical models, accounting for at least 70% of the total variance in the predictor fields. We have tested several linear and non-linear statistical methods. While regularized Ridge Regression and Lasso Regression can generally capture the spatial pattern and magnitude of rainfall anomalies, we found that the Neural Network performs best, with an accuracy greater than 80%, as expected from the non-linear dependence of the rainfall on the large-scale atmospheric thermodynamic conditions and circulation. Further tests of various prediction skill metrics and hindcasts also suggest that this Neural Network approach provides significantly better seasonal prediction skill than dynamic predictions and regression-based statistical predictions. Thus, this statistical prediction system shows potential to improve real-time seasonal rainfall predictions in the future.
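
    A minimal sketch of the prediction pipeline described above: EOF (PCA) reduction of dry-season predictor fields followed by a small neural-network regression onto transition-season rainfall anomalies, evaluated with leave-one-out cross-validation. All arrays are synthetic stand-ins for the ERA-Interim fields and rainfall record, and the network size is an assumption.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import LeaveOneOut, cross_val_predict

      rng = np.random.default_rng(8)
      n_years, n_grid = 37, 400                              # e.g. 1979-2015, flattened predictor grid points
      signal = rng.standard_normal(n_years)                  # synthetic large-scale mode driving rainfall
      X = np.outer(signal, rng.standard_normal(n_grid)) + rng.standard_normal((n_years, n_grid))
      y = signal + 0.5 * rng.standard_normal(n_years)        # synthetic SON rainfall anomaly

      pca = PCA(n_components=10)                             # retain the leading 10 EOF modes, as in the abstract
      pcs = pca.fit_transform(X)
      print("variance explained by 10 EOFs:", round(pca.explained_variance_ratio_.sum(), 2))

      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
      y_hat = cross_val_predict(model, pcs, y, cv=LeaveOneOut())
      print("leave-one-out correlation with observed anomalies:", round(np.corrcoef(y, y_hat)[0, 1], 2))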

  3. Effects of local geology on ground motion in the San Francisco Bay region, California—A continued study

    USGS Publications Warehouse

    Gibbs, James F.; Borcherdt, Roger D.

    1974-01-01

    Measurements of ground motion generated by nuclear explosions in Nevada have been completed for 99 locations in the San Francisco Bay region, California. The seismograms, Fourier amplitude spectra, spectral amplification curves for the signal, and the Fourier amplitude spectra of the seismic noise are presented for 60 locations. Analog amplifications, based on the maximum signal amplitude, are computed for an additional 39 locations. The recordings of the nuclear explosions show marked amplitude variations which are consistently related to the local geologic conditions of the recording site. The average spectral amplifications observed for vertical and horizontal ground motions are, respectively: (1, 1) for granite, (1.5, 1.6) for the Franciscan Formation, (2.3, 2.3) for other pre-Tertiary and Tertiary rocks, (3.0, 2.7) for the Santa Clara Formation, (3.3, 4.4) for older bay sediments, and (3.7, 11.3) for younger bay mud. Spectral amplification curves define predominant ground frequencies for younger bay mud sites and for some older bay sediment sites. The predominant frequencies for most sites were not clearly defined by the amplitude spectra computed from the seismic background noise. The intensities ascribed to various sites in the San Francisco Bay region for the California earthquake of April 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the intensities for 917 sites on Franciscan rocks generally decrease with the logarithm of distance as Intensity = 2.69 - 1.90 log (Distance km). For sites on other geologic units, intensity increments, derived from this empirical relation, correlate strongly with the Average Horizontal Spectral Amplifications (AHSA) according to the empirical relation Intensity Increment = 0.27 + 2.70 log(AHSA). Average intensity increments predicted for various geologic units are -0.3 for granite, 0.2 for Franciscan Formation, 0.6 for other pre-Tertiary and Tertiary bedrock, 0.8 for Santa Clara Formation, 1.3 for older bay sediments, and 2.4 for younger bay mud. These empirical relations, together with detailed geologic maps, delineate areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.
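
    To make the two quoted empirical relations concrete, the small example below evaluates them at illustrative inputs (a hypothetical source distance and a hypothetical AHSA value); the intensity scale is the one used in the source report, and the resulting numbers are for illustration only.

      import numpy as np

      def franciscan_intensity(distance_km):
          """Intensity on Franciscan rock as a function of distance, per the quoted relation."""
          return 2.69 - 1.90 * np.log10(distance_km)

      def intensity_increment(ahsa):
          """Intensity increment for another geologic unit from its AHSA, per the quoted relation."""
          return 0.27 + 2.70 * np.log10(ahsa)

      distance_km, ahsa = 5.0, 4.4              # hypothetical site distance and AHSA value
      print("bedrock intensity at 5 km :", round(franciscan_intensity(distance_km), 2))
      print("increment for AHSA = 4.4  :", round(intensity_increment(ahsa), 2))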

  4. Assessing the interruption of the transmission of human helminths with mass drug administration alone: optimizing the design of cluster randomized trials.

    PubMed

    Anderson, Roy; Farrell, Sam; Turner, Hugo; Walson, Judd; Donnelly, Christl A; Truscott, James

    2017-02-17

    A method is outlined for the use of an individual-based stochastic model of parasite transmission dynamics to assess different designs for a cluster randomized trial in which mass drug administration (MDA) is employed in attempts to eliminate the transmission of soil-transmitted helminths (STH) in defined geographic locations. The hypothesis to be tested is: Can MDA alone interrupt the transmission of STH species in defined settings? Clustering is at a village level and the choice of clusters of villages is stratified by transmission intensity (low, medium and high) and parasite species mix (either Ascaris, Trichuris or hookworm dominant). The methodological approach first uses an age-structured deterministic model to predict the MDA coverage required for treating pre-school aged children (Pre-SAC), school aged children (SAC) and adults (Adults) to eliminate transmission (crossing the breakpoint in transmission created by sexual mating in dioecious helminths) with 3 rounds of annual MDA. Stochastic individual-based models are then used to calculate the positive and negative predictive values (PPV and NPV, respectively, for observing elimination or the bounce back of infection) for a defined prevalence of infection 2 years post the cessation of MDA. For the arm only involving the treatment of Pre-SAC and SAC, the failure rate is predicted to be very high (particularly for hookworm-infected villages) unless transmission intensity is very low (R0, or the effective reproductive number R, just above unity in value). The calculations are designed to consider various trial arms and stratifications; namely, community-based treatment and Pre-SAC and SAC only treatment (the two arms of the trial), different STH transmission settings of low, medium and high, and different STH species mixes. Results are considered in the light of the complications introduced by the choice of statistic to define success or failure, varying adherence to treatment, migration and parameter uncertainty.

  5. Empirical Correction to the Likelihood Ratio Statistic for Structural Equation Modeling with Many Variables.

    PubMed

    Yuan, Ke-Hai; Tian, Yubin; Yanagihara, Hirokazu

    2015-06-01

    Survey data typically contain many variables. Structural equation modeling (SEM) is commonly used in analyzing such data. The most widely used statistic for evaluating the adequacy of a SEM model is T_ML, a slight modification to the likelihood ratio statistic. Under the normality assumption, T_ML approximately follows a chi-square distribution when the number of observations (N) is large and the number of items or variables (p) is small. However, in practice, p can be rather large while N is always limited due to not having enough participants. Even with a relatively large N, empirical results show that T_ML rejects the correct model too often when p is not too small. Various corrections to T_ML have been proposed, but they are mostly heuristic. Following the principle of the Bartlett correction, this paper proposes an empirical approach to correct T_ML so that the mean of the resulting statistic approximately equals the degrees of freedom of the nominal chi-square distribution. Results show that empirically corrected statistics follow the nominal chi-square distribution much more closely than previously proposed corrections to T_ML, and they control type I errors reasonably well whenever N ≥ max(50, 2p). The formulations of the empirically corrected statistics are further used to predict type I errors of T_ML as reported in the literature, and they perform well.
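
    The principle described above can be illustrated with a toy Bartlett-type correction: simulate the likelihood-ratio statistic under a simple covariance hypothesis, estimate its mean, and rescale so that the corrected statistic has mean equal to the nominal degrees of freedom. The covariance test used below (H0: Sigma = I with known zero mean) is a stand-in for a fitted SEM model, and all sizes are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(9)

      def lr_stat_identity_cov(X):
          """LR statistic for H0: Sigma = I (mean known to be zero): n * (tr(S) - log|S| - p)."""
          n, p = X.shape
          S = (X.T @ X) / n
          sign, logdet = np.linalg.slogdet(S)
          return n * (np.trace(S) - logdet - p)

      n, p, n_rep = 60, 15, 2000
      df = p * (p + 1) // 2                      # degrees of freedom of this covariance test
      t_sim = np.array([lr_stat_identity_cov(rng.standard_normal((n, p))) for _ in range(n_rep)])

      c_hat = t_sim.mean() / df                  # empirical Bartlett-type correction factor
      print("mean T / df before correction:", round(t_sim.mean() / df, 3))
      print("mean (T / c_hat) / df after  :", round((t_sim / c_hat).mean() / df, 3))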

  6. Using antibiograms to improve antibiotic prescribing in skilled nursing facilities.

    PubMed

    Furuno, Jon P; Comer, Angela C; Johnson, J Kristie; Rosenberg, Joseph H; Moore, Susan L; MacKenzie, Thomas D; Hall, Kendall K; Hirshon, Jon Mark

    2014-10-01

    Antibiograms have effectively improved antibiotic prescribing in acute-care settings; however, their effectiveness in skilled nursing facilities (SNFs) is currently unknown. To develop SNF-specific antibiograms and identify opportunities to improve antibiotic prescribing. Cross-sectional and pretest-posttest study among residents of 3 Maryland SNFs. Antibiograms were created using clinical culture data from a 6-month period in each SNF. We also used admission clinical culture data from the acute care facility primarily associated with each SNF for transferred residents. We manually collected all data from medical charts, and antibiograms were created using WHONET software. We then used a pretest-posttest study to evaluate the effectiveness of an antibiogram on changing antibiotic prescribing practices in a single SNF. Appropriate empirical antibiotic therapy was defined as an empirical antibiotic choice that sufficiently covered the infecting organism, considering antibiotic susceptibilities. We reviewed 839 patient charts from SNF and acute care facilities. During the initial assessment period, 85% of initial antibiotic use in the SNFs was empirical, and thus only 15% of initial antibiotics were based on culture results. Fluoroquinolones were the most frequently used empirical antibiotics, accounting for 54.5% of initial prescribing instances. Among patients with available culture data, only 35% of empirical antibiotic prescribing was determined to be appropriate. In the single SNF in which we evaluated antibiogram effectiveness, the prevalence of appropriate antibiotic prescribing increased from 32% to 45% after antibiogram implementation; however, this increase was not statistically significant. Implementation of antibiograms may be effective in improving empirical antibiotic prescribing in SNFs.

  7. Maximum likelihood: Extracting unbiased information from complex networks

    NASA Astrophysics Data System (ADS)

    Garlaschelli, Diego; Loffredo, Maria I.

    2008-07-01

    The choice of free parameters in network models is subjective, since it depends on what topological properties are being monitored. However, we show that the maximum likelihood (ML) principle indicates a unique, statistically rigorous parameter choice, associated with a well-defined topological feature. We then find that, if the ML condition is incompatible with the built-in parameter choice, network models turn out to be intrinsically ill defined or biased. To overcome this problem, we construct a class of safely unbiased models. We also propose an extension of these results that leads to the fascinating possibility to extract, only from topological data, the “hidden variables” underlying network organization, making them “no longer hidden.” We test our method on World Trade Web data, where we recover the empirical gross domestic product using only topological information.

  8. Liver Stiffness Decreases Rapidly in Response to Successful Hepatitis C Treatment and Then Plateaus.

    PubMed

    Chekuri, Sweta; Nickerson, Jillian; Bichoupan, Kian; Sefcik, Roberta; Doobay, Kamini; Chang, Sanders; DelBello, David; Harty, Alyson; Dieterich, Douglas T; Perumalswami, Ponni V; Branch, Andrea D

    2016-01-01

    To investigate the impact of a sustained virological response (SVR) to hepatitis C virus (HCV) treatment on liver stiffness (LS). LS, measured by transient elastography (FibroScan), demographic and laboratory data of patients treated with interferon (IFN)-containing or IFN-free regimens who had an SVR24 (undetectable HCV viral load 24 weeks after the end of treatment) were analyzed using two-tailed paired t-tests, Mann-Whitney Wilcoxon Signed-rank tests and linear regression. Two time intervals were investigated: pre-treatment to SVR24 and SVR24 to the end of follow-up. LS scores ≥ 12.5 kPa indicated LS-defined cirrhosis. A p-value below 0.05 was considered statistically significant. The median age of the patients (n = 100) was 60 years [IQR (interquartile range) 54-64]; 72% were male; 60% were Caucasian; and 42% had cirrhosis pre-treatment according to the FibroScan measurement. The median LS score dropped from 10.40 kPa (IQR: 7.25-18.60) pre-treatment to 7.60 kPa (IQR: 5.60-12.38) at SVR24, p < 0.01. Among the 42 patients with LS-defined cirrhosis pre-treatment, 25 (60%) of patients still had LS scores ≥ 12.5 kPa at SVR24, indicating the persistence of cirrhosis. The median change in LS was similar in patients receiving IFN-containing and IFN-free regimens: -1.95 kPa (IQR: -5.75 to -0.38) versus -2.40 kPa (IQR: -7.70 to -0.23), p = 0.74. Among 56 patients with a post-SVR24 LS measurement, the LS score changed by an additional -0.90 kPa (IQR: -2.98 to 0.5) during a median follow-up time of 1.17 (IQR: 0.88-1.63) years, which was not a statistically significant decrease (p = 0.99). LS decreased from pre-treatment to SVR24, but did not decrease significantly during additional follow-up. Earlier treatment may be needed to reduce the burden of liver disease.

  9. Estimates of natural salinity and hydrology in a subtropical estuarine ecosystem: implications for Greater Everglades restoration

    USGS Publications Warehouse

    Marshall, Frank E.; Wingard, G. Lynn; Pitts, Patrick A.

    2014-01-01

    Disruption of the natural patterns of freshwater flow into estuarine ecosystems occurred in many locations around the world beginning in the twentieth century. To effectively restore these systems, establishing a pre-alteration perspective allows managers to develop science-based restoration targets for salinity and hydrology. This paper describes a process to develop targets based on natural hydrologic functions by coupling paleoecology and regression models using the subtropical Greater Everglades Ecosystem as an example. Paleoecological investigations characterize the circa 1900 CE (pre-alteration) salinity regime in Florida Bay based on molluscan remains in sediment cores. These paleosalinity estimates are converted into time series estimates of paleo-based salinity, stage, and flow using numeric and statistical models. Model outputs are weighted using the mean square error statistic and then combined. Results indicate that, in the absence of water management, salinity in Florida Bay would be about 3 to 9 salinity units lower than current conditions. To achieve this target, upstream freshwater levels must be about 0.25 m higher than indicated by recent observed data, with increased flow inputs to Florida Bay between 2.1 and 3.7 times existing flows. This flow deficit is comparable to the average volume of water currently being diverted from the Everglades ecosystem by water management. The products (paleo-based Florida Bay salinity and upstream hydrology) provide estimates of pre-alteration hydrology and salinity that represent target restoration conditions. This method can be applied to any estuarine ecosystem with available paleoecologic data and empirical and/or model-based hydrologic data.

  10. Is Project Based Learning More Effective than Direct Instruction in School Science Classrooms? An Analysis of the Empirical Research Evidence

    NASA Astrophysics Data System (ADS)

    Dann, Clifford

    An increasingly loud call by parents, school administrators, teachers, and even business leaders for "authentic learning", emphasizing both group-work and problem solving, has led to growing enthusiasm for inquiry-based learning over the past decade. Although "inquiry" can be defined in many ways, a curriculum called "project-based learning" has recently emerged as the inquiry practice-of-choice with roots in the educational constructivism that emerged in the mid-twentieth century. Often, project-based learning is framed as an alternative instructional strategy to direct instruction for maximizing student content knowledge. This study investigates the empirical evidence for such a comparison while also evaluating the overall quality of the available studies in the light of accepted standards for educational research. Specifically, this thesis investigates what the body of quantitative research says about the efficacy of project-based learning vs. direct instruction when considering student acquisition of content knowledge in science classrooms. Further, existing limitations of the research pertaining to project based learning and secondary school education are explored. The thesis concludes with a discussion of where and how we should focus our empirical efforts in the future. The research revealed that the available empirical research contains flaws in both design and instrumentation. In particular, randomization is poor amongst all the studies considered. The empirical evidence indicates that project-based learning curricula improved student content knowledge but that, while the results were statistically significant, increases in raw test scores were marginal.

  11. Antibiotic stewardship in the newborn surgical patient: A quality improvement project in the neonatal intensive care unit.

    PubMed

    Walker, Sarah; Datta, Ankur; Massoumi, Roxanne L; Gross, Erica R; Uhing, Michael; Arca, Marjorie J

    2017-12-01

    There is significant diversity in the utilization of antibiotics for neonates undergoing surgical procedures. Our institution standardized antibiotic administration for surgical neonates, whereby no empiric antibiotics were given to infants with surgical conditions postnatally and antibiotics were given for no more than 72 hours perioperatively. We compared the time periods before and after implementation of the antibiotic protocol in an institutional review board-approved, retrospective review of neonates with congenital surgical conditions who underwent surgical correction within 30 days after birth. Surgical site infection at 30 days was the primary outcome, and development of hospital-acquired infections or multidrug-resistant organisms were secondary outcomes. One hundred forty-eight infants underwent surgical procedures pre-protocol, and 127 underwent procedures post-protocol implementation. Surgical site infection rates were similar pre- and post-protocol, 14% and 9% respectively (P = .21). The incidence of hospital-acquired infections (13.7% vs 8.7%, P = .205) and multidrug-resistant organisms (4.7% vs 1.6%, P = .143) was similar between the 2 periods. Elimination of empiric postnatal antibiotics did not statistically change rates of surgical site infection, hospital-acquired infections, or multidrug-resistant organisms. Limiting the duration of perioperative antibiotic prophylaxis to no more than 72 hours after surgery did not increase the rate of surgical site infection, hospital-acquired infections, or multidrug-resistant organisms. Median antibiotic days were decreased with antibiotic standardization for surgical neonates. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Long-term outcomes following laparoscopic adjustable gastric banding: postoperative psychological sequelae predict outcome at 5-year follow-up.

    PubMed

    Scholtz, Samantha; Bidlake, Louise; Morgan, John; Fiennes, Alberic; El-Etar, Ashraf; Lacey, John Hubert; McCluskey, Sara

    2007-09-01

    NICE guidelines state that patients with psychological contra-indications should not be considered for bariatric surgery, including Laparoscopic Adjustable Gastric Banding (LAGB) as treatment of morbid obesity, although no consistent correlation between psychiatric illness and long-term outcome in LAGB has been established. This is, to our knowledge, the first study to evaluate long-term outcomes of LAGB for a full range of DSM-IV defined psychiatric and eating disorders, and it forms part of a research portfolio developed by the authors aimed at defining psychological predictors of bariatric surgery in the short-, medium- and long-term. Case notes of 37 subjects operated on between April 1997 and June 2000, who had undergone structured clinical interview during pre-surgical assessment to yield diagnoses of mental and eating disorders according to DSM-IV criteria, were analyzed according to a set of operationally defined criteria. Statistical analysis was carried out to compare the psychiatric profiles of those with a poor outcome and those considered to have a good outcome. In this group of mainly female, Caucasian subjects, ranging in age from 27 to 60 years, one-third were diagnosed with a mental disorder according to DSM-IV criteria. The development of postoperative DSM-IV defined binge eating disorder (BED) or depression strongly predicted poor surgical outcome, but pre-surgical psychiatric factors alone did not. Although pre-surgical psychiatric assessment alone cannot predict outcome, an absence of preoperative psychiatric illness should not reassure surgeons, who should remain mindful of postoperative psychiatric sequelae, particularly BED. The importance of providing an integrated biopsychosocial model of care in bariatric teams is highlighted.

  13. Automated PET-only quantification of amyloid deposition with adaptive template and empirically pre-defined ROI

    NASA Astrophysics Data System (ADS)

    Akamatsu, G.; Ikari, Y.; Ohnishi, A.; Nishida, H.; Aita, K.; Sasaki, M.; Yamamoto, Y.; Sasaki, M.; Senda, M.

    2016-08-01

    Amyloid PET is useful for early and/or differential diagnosis of Alzheimer’s disease (AD). Quantification of amyloid deposition using PET has been employed to improve diagnosis and to monitor AD therapy, particularly in research. Although MRI is often used for segmentation of gray matter and for spatial normalization into standard Montreal Neurological Institute (MNI) space, where the region-of-interest (ROI) template is defined, 3D MRI is not always available in clinical practice. The purpose of this study was to examine the feasibility of PET-only amyloid quantification with an adaptive template and a pre-defined standard ROI template that had been empirically generated from typical cases. A total of 68 subjects who underwent brain 11C-PiB PET were examined. The 11C-PiB images were non-linearly spatially normalized to the standard MNI T1 atlas using the same transformation parameters as the MRI-based normalization. The automatic-anatomical-labeling-ROI (AAL-ROI) template was applied to the PET images. All voxel values were normalized by the mean value of the cerebellar cortex to generate SUVR-scaled images. Eleven typical positive images and eight typical negative images were normalized and averaged, respectively, and used as the positive and negative templates. Positive and negative masks, consisting of voxels with SUVR ⩾ 1.7, were extracted from both templates. The empirical PiB-prone ROI (EPP-ROI) was generated by subtracting the negative mask from the positive mask. The 11C-PiB image of each subject was non-rigidly normalized to the positive and negative templates, and the one with the higher cross-correlation was adopted. The EPP-ROI was then inversely transformed to individual PET images. We evaluated differences in SUVR between the standard MRI-based method and the PET-only method, and additionally evaluated whether the PET-only method would correctly categorize 11C-PiB scans as positive or negative. Significant correlation was observed between the SUVRs obtained with the AAL-ROI and those with the EPP-ROI when MRI-based normalization was used, with the latter providing higher SUVRs. When the EPP-ROI was used, the MRI-based and PET-only methods provided almost identical SUVRs. All 11C-PiB scans were correctly categorized into positive and negative using a cutoff value of 1.7 when compared to visual interpretation. The 11C-PiB SUVRs were 2.30 ± 0.24 and 1.25 ± 0.11 for the positive and negative images, respectively. A PET-only amyloid quantification method with adaptive templates and the EPP-ROI can provide accurate, robust and simple amyloid quantification without MRI.
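
    The SUVR scaling and mask arithmetic described above reduce to a few array operations. The sketch below is a minimal illustration under assumed inputs (arrays already in template space, a hypothetical cerebellar mask, and the 1.7 cutoff quoted in the abstract); it is not the authors' implementation.

    ```python
    import numpy as np

    def suvr_scale(pet_img, cerebellum_mask):
        """Scale a spatially normalized PET image by mean cerebellar-cortex uptake."""
        return pet_img / pet_img[cerebellum_mask].mean()

    def build_epp_roi(positive_template_suvr, negative_template_suvr, cutoff=1.7):
        """Empirical PiB-prone ROI: voxels above the cutoff in the positive template
        that are not above the cutoff in the negative template."""
        positive_mask = positive_template_suvr >= cutoff
        negative_mask = negative_template_suvr >= cutoff
        return positive_mask & ~negative_mask

    # Hypothetical usage with arrays already normalized to template space:
    # suvr_img = suvr_scale(pib_img, cereb_mask)
    # epp_roi = build_epp_roi(pos_template, neg_template)
    # regional_suvr = suvr_img[epp_roi].mean()
    ```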

  14. Epistasis and the Structure of Fitness Landscapes: Are Experimental Fitness Landscapes Compatible with Fisher’s Geometric Model?

    PubMed Central

    Blanquart, François; Bataillon, Thomas

    2016-01-01

    The fitness landscape defines the relationship between genotypes and fitness in a given environment and underlies fundamental quantities such as the distribution of selection coefficients and the magnitude and type of epistasis. A better understanding of variation in landscape structure across species and environments is thus necessary to understand and predict how populations will adapt. An increasing number of experiments investigate the properties of fitness landscapes by identifying mutations, constructing genotypes with combinations of these mutations, and measuring the fitness of these genotypes. Yet these empirical landscapes represent a very small sample of the vast space of all possible genotypes, and this sample is often biased by the protocol used to identify mutations. Here we develop a rigorous statistical framework based on Approximate Bayesian Computation to address these concerns and use this flexible framework to fit a broad class of phenotypic fitness models (including Fisher’s model) to 26 empirical landscapes representing nine diverse biological systems. Despite uncertainty owing to the small size of most published empirical landscapes, the inferred landscapes have similar structure in similar biological systems. Surprisingly, goodness-of-fit tests reveal that this class of phenotypic models, which has been successful so far in interpreting experimental data, is plausible in only three of nine biological systems. More precisely, although Fisher’s model was able to explain several statistical properties of the landscapes—including the mean and SD of selection and epistasis coefficients—it was often unable to explain the full structure of fitness landscapes. PMID:27052568
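
    The core of an Approximate Bayesian Computation analysis is a rejection step that compares simulated to observed summary statistics. The following is a generic sketch of ABC rejection sampling, not the authors' pipeline; the prior sampler, landscape simulator, summary statistics and tolerance are all placeholders to be supplied by the user.

    ```python
    import numpy as np

    def abc_rejection(observed_stats, sample_prior, simulate_stats, distance,
                      n_draws=100_000, tolerance=0.1):
        """Minimal ABC rejection sampler: keep parameter draws whose simulated
        summary statistics lie within `tolerance` of the observed statistics."""
        accepted = []
        for _ in range(n_draws):
            theta = sample_prior()            # draw model parameters from the prior
            stats = simulate_stats(theta)     # simulate a landscape and summarize it
            if distance(stats, observed_stats) < tolerance:
                accepted.append(theta)
        return np.array(accepted)             # approximate posterior sample
    ```

    In practice, the tolerance and the choice of summary statistics control the trade-off between acceptance rate and the quality of the posterior approximation.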

  15. Psychosocial functioning in the context of diagnosis: assessment and theoretical issues.

    PubMed

    Ro, Eunyoe; Clark, Lee Anna

    2009-09-01

    Psychosocial functioning is an important focus of attention in the revision of the Diagnostic and Statistical Manual of Mental Disorders. Researchers and clinicians are converging upon the opinion that psychometrically strong, comprehensive assessment of individuals' functioning is needed to characterize disorder fully. Also shared is the realization that existing theory and research in this domain have critical shortcomings. The authors urge that the field reexamine the empirical evidence and address theoretical issues to guide future development of the construct and its measurement. The authors first discuss several theoretical issues relevant to the conceptualization and assessment of functioning: (a) definitions of functioning, (b) the role of functioning in defining disorder, and (c) understanding functioning within environmental contexts. The authors then present data regarding empirical domains of psychosocial functioning and their interrelations. Self-reported data on multiple domains of psychosocial functioning were collected from 429 participants. Factor-analytic results (promax rotation) suggest a 4-factor structure of psychosocial functioning: Well-Being, Basic Functioning, Self-Mastery, and Interpersonal and Social Relationships. Finally, the authors propose an integration of theory and empirical findings, which they believe will better incorporate psychosocial functioning into future diagnostic systems. Copyright 2009 APA, all rights reserved.

  16. Pretransplant soluble CD30 level has limited effect on acute rejection, but affects graft function in living donor kidney transplantation.

    PubMed

    Kim, Myoung Soo; Kim, Hae Jin; Kim, Soon Il; Ahn, Hyung Joon; Ju, Man Ki; Kim, Hyun Jung; Jeon, Kyung Ock; Kim, Yu Seun

    2006-12-27

    Serum soluble CD30 (sCD30) levels might be a useful marker of immunologic status in pre-transplant (Tx) recipients. We retrospectively correlated preTx sCD30 levels (high versus low) with postTx graft survival, incidence of acute rejection, and graft function using stored preTx serum. Of 254 recipients who underwent kidney Tx, 120 recipients were enrolled under uniform criteria (living donor, age >25 years, viral hepatitis free, diabetes free). PreTx sCD30 was not significantly associated with differences in graft survival rate during 47.5+/-11.4 months of follow-up (P = 0.5901). High sCD30 (> or =115 U/ml) was associated with a higher incidence of clinically or pathologically defined acute rejection than low sCD30, but the difference was not statistically significant (33.9% vs. 22.4%, P = 0.164). The response rate to antirejection therapy in patients with high sCD30 was inferior to that in patients with low sCD30, but this difference also was not statistically significant (33.3% vs. 7.7%, P = 0.087). However, mean serum creatinine levels in high sCD30 patients at one month, one year, and three years postTx were significantly different from those with low sCD30 (P < 0.05). In multiple regression analysis, acute rejection episodes, donor age, kidney weight/recipient body weight ratio, and preTx sCD30 levels were independent variables affecting the serum creatinine level three years postTx. PreTx sCD30 level has a limited effect on the incidence of acute rejection and response to antirejection treatment, but inversely and independently affects serum creatinine level after living donor kidney transplantation.

  17. Statistical similarities of pre-earthquake electromagnetic emissions to biological and economic extreme events

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos

    2014-05-01

    A phenomenon is considered "complex" when the phenomenological laws that describe the global behavior of the system are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those used for economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation for universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that the observables of all three dynamical systems can be analyzed in the frame of nonextensive statistical mechanics, and that the frequency-size relations of appropriately defined "events" preceding the extreme event in each of these different systems present striking quantitative similarities. It is also demonstrated that, for the considered systems, the nonextensive parameter q increases as the extreme event approaches, indicating that the strength of the long-memory / long-range interactions between the constituents of the system increases as the dynamics evolve toward the extreme event.
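
    In nonextensive (Tsallis) statistics the frequency-size relation of such "events" is commonly described by a q-exponential decay, and the parameter q is estimated by fitting that form to the empirical distribution. The snippet below is a self-contained sketch of such a fit on synthetic data; the functional form is the standard q-exponential, while the data values and starting guesses are invented for illustration only.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def q_exponential(x, amplitude, x0, q):
        """Tsallis q-exponential used for frequency-size relations:
        f(x) = A * [1 - (1 - q) * x / x0]^(1 / (1 - q))."""
        base = 1.0 - (1.0 - q) * x / x0
        return amplitude * np.power(np.clip(base, 1e-12, None), 1.0 / (1.0 - q))

    # Synthetic frequency-size data generated with q = 1.4 (illustration only)
    sizes = np.linspace(1.0, 50.0, 40)
    freqs = q_exponential(sizes, amplitude=100.0, x0=10.0, q=1.4)

    params, _ = curve_fit(q_exponential, sizes, freqs, p0=(100.0, 10.0, 1.3))
    amplitude_hat, x0_hat, q_hat = params
    print(f"estimated nonextensive parameter q = {q_hat:.2f}")
    ```

    Tracking q in successive windows of a precursory time series is then what reveals the increase toward the extreme event reported above.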

  18. Assessment of rockfall susceptibility by integrating statistical and physically-based approaches

    NASA Astrophysics Data System (ADS)

    Frattini, Paolo; Crosta, Giovanni; Carrara, Alberto; Agliardi, Federico

    In Val di Fassa (Dolomites, Eastern Italian Alps) rockfalls constitute the most significant gravity-induced natural disaster that threatens both the inhabitants of the valley, who are few, and the thousands of tourists who populate the area in summer and winter. To assess rockfall susceptibility, we developed an integrated statistical and physically-based approach that aimed to predict both the susceptibility to onset and the probability that rockfalls will attain specific reaches. Through field checks and multi-temporal aerial photo-interpretation, we prepared a detailed inventory of both rockfall source areas and associated scree-slope deposits. Using an innovative technique based on GIS tools and a 3D rockfall simulation code, grid cells pertaining to the rockfall source-area polygons were classified as active or inactive, based on the state of activity of the associated scree-slope deposits. The simulation code allows one to link each source grid cell with scree deposit polygons by calculating the trajectory of each simulated launch of blocks. By means of discriminant analysis, we then identified the mix of environmental variables that best identifies grid cells with low or high susceptibility to rockfalls. Among these variables, structural setting, land use, and morphology were the most important factors that led to the initiation of rockfalls. We developed 3D simulation models of the runout distance, intensity and frequency of rockfalls, whose source grid cells corresponded either to the geomorphologically-defined source polygons (geomorphological scenario) or to study area grid cells with slope angle greater than an empirically-defined value of 37° (empirical scenario). For each scenario, we assigned to the source grid cells either a fixed or a variable onset susceptibility; the latter was derived from the discriminant model group (active/inactive) membership probabilities. Comparison of these four models indicates that the geomorphological scenario with variable onset susceptibility appears to be the most realistic model. Nevertheless, political and legal issues seem to guide local administrators, who tend to select the more conservative empirically-based scenario as a land-planning tool.
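
    The variable onset susceptibility described above is simply the posterior group-membership probability from the discriminant model. A minimal sketch of that step, assuming a hypothetical predictor matrix of environmental variables and active/inactive labels derived from the scree-deposit inventory (scikit-learn is used for illustration; it is not the software used in the study):

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    # Hypothetical predictors for source grid cells (e.g. slope, structural setting,
    # land use and morphology encoded numerically) and active/inactive labels
    X = rng.normal(size=(500, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

    lda = LinearDiscriminantAnalysis().fit(X, y)

    # Probability of belonging to the "active" group, usable as a variable
    # onset susceptibility assigned to each source grid cell
    onset_susceptibility = lda.predict_proba(X)[:, 1]
    ```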

  19. Clinical target volume delineation in glioblastomas: pre-operative versus post-operative/pre-radiotherapy MRI

    PubMed Central

    Farace, P; Giri, M G; Meliadò, G; Amelio, D; Widesott, L; Ricciardi, G K; Dall'Oglio, S; Rizzotti, A; Sbarbati, A; Beltramello, A; Maluta, S; Amichetti, M

    2011-01-01

    Objectives: Delineation of clinical target volume (CTV) is still controversial in glioblastomas. In order to assess the differences in volume and shape of the radiotherapy target, the use of pre-operative vs post-operative/pre-radiotherapy T1 and T2 weighted MRI was compared. Methods: 4 CTVs were delineated in 24 patients pre-operatively and post-operatively using T1 contrast-enhanced (T1PRECTV and T1POSTCTV) and T2 weighted images (T2PRECTV and T2POSTCTV). Pre-operative MRI examinations were performed the day before surgery, whereas post-operative examinations were acquired 1 month after surgery and before chemoradiation. A concordance index (CI) was defined as the ratio between the overlapping and composite volumes. Results: The volumes of T1PRECTV and T1POSTCTV were not statistically different (248 ± 88 vs 254 ± 101), although volume differences >100 cm3 were observed in 6 out of 24 patients. A marked increase due to tumour progression was shown in three patients. Three patients showed a decrease because of a reduced mass effect. A significant reduction occurred between pre-operative and post-operative T2 volumes (139 ± 68 vs 78 ± 59). Lack of concordance was observed between T1PRECTV and T1POSTCTV (CI = 0.67 ± 0.09), T2PRECTV and T2POSTCTV (CI = 0.39 ± 0.20) and comparing the portion of the T1PRECTV and T1POSTCTV not covered by that defined on T2PRECTV images (CI = 0.45 ± 0.16 and 0.44 ± 0.17, respectively). Conclusion: Using T2 MRI, huge variations can be observed in peritumoural oedema, which are probably due to steroid treatment. Using T1 MRI, brain shifts after surgery and possible progressive enhancing lesions produce substantial differences in CTVs. Our data support the use of post-operative/pre-radiotherapy T1 weighted MRI for planning purposes. PMID:21045069
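
    The concordance index defined above (overlapping volume divided by composite, i.e. union, volume) is straightforward to compute for two voxelized target volumes. A minimal sketch, assuming binary CTV masks on a common grid (hypothetical inputs, not the study's planning software):

    ```python
    import numpy as np

    def concordance_index(ctv_a, ctv_b):
        """CI = overlapping volume / composite (union) volume for two binary masks;
        1.0 means identical volumes, 0.0 means no overlap."""
        overlap = np.logical_and(ctv_a, ctv_b).sum()
        composite = np.logical_or(ctv_a, ctv_b).sum()
        return overlap / composite

    # Hypothetical usage with pre- and post-operative CTVs resampled to one grid:
    # ci_t1 = concordance_index(t1_pre_ctv, t1_post_ctv)
    ```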

  20. Menopause and age-driven changes in blood level of fat- and water-soluble vitamins.

    PubMed

    Wiacek, M; Zubrzycki, I Z; Bojke, O; Kim, H-J

    2013-12-01

    The purpose of this cross-sectional study was to assess the association of the menopausal transition with changes in blood vitamin levels. The study group comprised women aged 17-85 years from the Third National Health and Nutrition Examination Survey (NHANES), which was conducted between 1988 and 1994, and from the NHANES surveys conducted between 1999 and 2006. Menopausal status was defined using the time since the last period: < 2, 2-12, and > 12 months for the pre-, peri-, and postmenopause, respectively. A data-cleaning technique employing serum follicle stimulating hormone activity resulted in pre-, peri- and postmenopausal samples encompassing the following age brackets: 17-50, 42-51, and 46-85 years. Statistical inference was performed using non-parametric techniques. Significant increases in vitamin A and vitamin E concentrations across all phases of the menopausal transition were observed. There was a gradual decrease in the vitamin C concentration across all stages of the menopause but a fairly stable concentration of vitamin B12. There was a statistically significant increase in vitamin D between the pre- and postmenopause. Body mass index correlated negatively with serum vitamin concentration in the pre- and postmenopause. Vitamin A should be supplemented in postmenopausal women to decrease the risk of bone fracture. The daily diet should be supplemented with vitamin B12, to avoid possible neurological symptoms due to vitamin B12 deficiency, and with vitamin D to decrease the risk of developing secondary hyperparathyroidism. Due to an adverse influence on serum vitamin concentration, body mass index should be monitored in pre- and postmenopausal women.

  1. Slovenian Pre-Service Teachers' Prototype Biography

    ERIC Educational Resources Information Center

    Lipovec, Alenka; Antolin, Darja

    2014-01-01

    In this article we apply narrative methodology to the study of pre-service elementary teachers' school-time memories connected to mathematics education. In the first phase of our empirical study we asked 214 Slovenian pre-service teachers to write their mathematical autobiographies. On the basis of the mathematical autobiographies we constructed a…

  2. Sanov and central limit theorems for output statistics of quantum Markov chains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horssen, Merlijn van, E-mail: merlijn.vanhorssen@nottingham.ac.uk; Guţă, Mădălin, E-mail: madalin.guta@nottingham.ac.uk

    2015-02-15

    In this paper, we consider the statistics of repeated measurements on the output of a quantum Markov chain. We establish a large deviations result analogous to Sanov’s theorem for the multi-site empirical measure associated to finite sequences of consecutive outcomes of a classical stochastic process. Our result relies on the construction of an extended quantum transition operator (which keeps track of previous outcomes) in terms of which we compute moment generating functions, and whose spectral radius is related to the large deviations rate function. As a corollary to this, we obtain a central limit theorem for the empirical measure. Such higher level statistics may be used to uncover critical behaviour such as dynamical phase transitions, which are not captured by lower level statistics such as the sample mean. As a step in this direction, we give an example of a finite system whose level-1 (empirical mean) rate function is independent of a model parameter while the level-2 (empirical measure) rate is not.

  3. The Effects of Pre-Operative Enteral Nutrition from Nasal Feeding Tubes on Gastric Outlet Obstruction.

    PubMed

    Chen, Zhi-Hua; Lin, Su-Yong; Dai, Qi-Bao; Hua, Jin; Chen, Shao-Qin

    2017-04-10

    We examined gastric outlet obstruction (GOO) patients who received two weeks of strengthening pre-operative enteral nutrition therapy (pre-EN) through a nasal-jejunal feeding tube placed under a gastroscope to evaluate the feasibility and potential benefit of pre-EN compared to parenteral nutrition (PN). In this study, 68 patients confirmed to have GOO with upper-gastrointestinal contrast and who accepted the operation were randomized into an EN group and a PN group. The differences in nutritional status, immune function, post-operative complications, weight of patients, first bowel sound and first flatus time, pull tube time, length of hospital stay (LOH), and cost of hospitalization between pre-operation and post-operation were all recorded. Statistical analyses were performed using the chi-square test and t-test; statistical significance was defined as p < 0.05. The success rate of the placement was 91.18% (three out of 31 cases). After pre-EN, the levels of weight, albumin (ALB), prealbumin (PA), and transferrin (TNF) in the EN group were significantly increased by pre-operation day compared to admission day, but were not significantly increased in the PN group; the weights in the EN group were significantly increased compared to the PN group by pre-operation day and day of discharge; total protein (TP), ALB, PA, and TNF of the EN group were significantly increased compared to the PN group on pre-operation and post-operative days one and three. The levels of CD3+, CD4+/CD8+, IgA, and IgM in the EN group were higher than those of the PN group at pre-operation and post-operation; the EN group had a significantly lower incidence of poor wound healing, peritoneal cavity infection, and pneumonia, and a shorter first bowel sound time, first flatus time, and post-operation hospital stay than the PN group. Pre-EN through a nasal-jejunal feeding tube placed under a gastroscope in GOO patients was safe, feasible, and beneficial to nutrition status, immune function, and gastrointestinal function, and sped up recovery, while not increasing the cost of hospitalization.

  4. Modeling Tropical Cyclone Storm Surge and Wind Induced Risk Along the Bay of Bengal Coastline Using a Statistical Copula

    NASA Astrophysics Data System (ADS)

    Bushra, N.; Trepanier, J. C.; Rohli, R. V.

    2017-12-01

    High winds, torrential rain, and storm surges from tropical cyclones (TCs) cause massive destruction to property and cost the lives of many people. The coastline of the Bay of Bengal (BoB) ranks as one of the most susceptible to TC storm surges in the world due to low-lying elevation and a high frequency of occurrence. Bangladesh suffers the most due to its geographical setting and population density. Various models have been developed to predict storm surge in this region but none of them quantify statistical risk with empirical data. This study describes the relationship and dependency between empirical TC storm surge and peak reported wind speed at the BoB using a bivariate statistical copula and data from 1885-2011. An Archimedean, Gumbel copula with margins defined by the empirical distributions is specified as the most appropriate choice for the BoB. The model provides return periods for pairs of TC storm surge and peak wind along the BoB coastline. The BoB can expect a TC with peak reported winds of at least 24 m s-1 and surge heights of at least 4.0 m, on average, once every 3.2 years, with a quartile pointwise confidence interval of 2.7-3.8 years. In addition, the BoB can expect peak reported winds of 62 m s-1 and surge heights of at least 8.0 m, on average, once every 115.4 years, with a quartile pointwise confidence interval of 55.8-381.1 years. The purpose of the analysis is to increase the understanding of these dangerous TC characteristics to reduce fatalities and monetary losses into the future. Application of the copula will mitigate future threats of storm surge impacts on coastal communities of the BoB.
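
    The Archimedean (Gumbel) copula with empirical margins has a closed form, so joint exceedance return periods of the kind quoted above can be computed directly. The sketch below is illustrative only: the dependence parameter theta, the annual storm count, and the non-exceedance probabilities are assumed values, not those estimated in the study.

    ```python
    import numpy as np

    def gumbel_copula_cdf(u, v, theta):
        """Gumbel (Archimedean) copula C(u, v) for dependence parameter theta >= 1."""
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

    def joint_return_period_years(u, v, theta, storms_per_year):
        """Return period of both variables being exceeded together:
        P(U > u, V > v) = 1 - u - v + C(u, v)."""
        p_joint_exceed = 1.0 - u - v + gumbel_copula_cdf(u, v, theta)
        return 1.0 / (storms_per_year * p_joint_exceed)

    # Assumed example: empirical non-exceedance probabilities of a given peak wind (u)
    # and surge height (v), with theta and the annual TC count chosen for illustration
    print(joint_return_period_years(u=0.60, v=0.70, theta=2.0, storms_per_year=5.0))
    ```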

  5. Pre- and Post-equinox ROSINA production rates calculated using a realistic empirical coma model derived from AMPS-DSMC simulations of comet 67P/Churyumov-Gerasimenko

    NASA Astrophysics Data System (ADS)

    Hansen, Kenneth; Altwegg, Kathrin; Berthelier, Jean-Jacques; Bieler, Andre; Calmonte, Ursina; Combi, Michael; De Keyser, Johan; Fiethe, Björn; Fougere, Nicolas; Fuselier, Stephen; Gombosi, Tamas; Hässig, Myrtha; Huang, Zhenguang; Le Roy, Lena; Rubin, Martin; Tenishev, Valeriy; Toth, Gabor; Tzou, Chia-Yu

    2016-04-01

    We have previously used results from the AMPS DSMC (Adaptive Mesh Particle Simulator Direct Simulation Monte Carlo) model to create an empirical model of the near-comet coma (<400 km) of comet 67P/Churyumov-Gerasimenko for its pre-equinox orbit. In this work we extend the empirical model to the post-equinox, post-perihelion time period. In addition, we extend the coma model to significantly greater distances from the comet (~100,000-1,000,000 km). The empirical model characterizes the neutral coma in a comet-centered, sun-fixed reference frame as a function of heliocentric distance, radial distance from the comet, local time and declination. Furthermore, we have generalized the model beyond application to 67P by replacing the heliocentric distance parameterizations and mapping them to production rates. With this approach, the model becomes significantly more general and can be applied to any comet. The model is a significant improvement over simpler empirical models, such as the Haser model. For 67P, the DSMC results are, of course, a more accurate representation of the coma at any given time, but the advantage of a mean-state, empirical model is its ease and speed of use. One application of the empirical model is to de-trend the spacecraft motion from the ROSINA COPS and DFMS data (Rosetta Orbiter Spectrometer for Ion and Neutral Analysis, Comet Pressure Sensor, Double Focusing Mass Spectrometer). The ROSINA instrument measures the neutral coma density at a single point, and the measured value is influenced by the location of the spacecraft relative to the comet and the comet-sun line. Using the empirical coma model we can correct for the position of the spacecraft and compute a total production rate based on the single-point measurement. We will present the coma production rate as a function of heliocentric distance both pre- and post-equinox and perihelion.

  6. Benchmarking test of empirical root water uptake models

    NASA Astrophysics Data System (ADS)

    dos Santos, Marcos Alex; de Jong van Lier, Quirijn; van Dam, Jos C.; Freire Bezerra, Andre Herman

    2017-01-01

    Detailed physical models describing root water uptake (RWU) are an important tool for the prediction of RWU and crop transpiration, but the hydraulic parameters involved are hardly ever available, making them less attractive for many studies. Empirical models are more readily used because of their simplicity and the associated lower data requirements. The purpose of this study is to evaluate the capability of some empirical models to mimic the RWU distribution under varying environmental conditions as predicted from numerical simulations with a detailed physical model. A review of some empirical models used as sub-models in ecohydrological models is presented, and alternative empirical RWU models are proposed. All these empirical models are analogous to the standard Feddes model, but differ in how RWU is partitioned over depth or how the transpiration reduction function is defined. The parameters of the empirical models are determined by inverse modelling of simulated depth-dependent RWU. The performance of the empirical models and their optimized empirical parameters depend on the scenario. The standard empirical Feddes model only performs well in scenarios with low root length density R, i.e. for scenarios with low RWU compensation. For medium and high R, the Feddes RWU model cannot properly mimic the root uptake dynamics predicted by the physical model. The Jarvis RWU model in combination with the Feddes reduction function (JMf) only provides good predictions for low and medium R scenarios; for high R, it cannot mimic the uptake patterns predicted by the physical model. Incorporating a newly proposed reduction function into the Jarvis model improved RWU predictions. Regarding the ability of the models to predict plant transpiration, all models accounting for compensation show good performance. The Akaike information criterion (AIC) indicates that the Jarvis (2010) model (JMII), with no empirical parameters to be estimated, is the best model. The proposed models are better at predicting RWU patterns similar to those of the physical model. The statistical indices point to them as the best alternatives for mimicking RWU predictions of the physical model.
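
    For readers unfamiliar with the Feddes-type reduction function referenced throughout, the sketch below shows the standard piecewise-linear stress function alpha(h) and how it enters an uncompensated uptake term. The threshold pressure heads are illustrative placeholders (they are crop- and soil-specific), and this is the generic textbook form, not the parameterization fitted in the study.

    ```python
    import numpy as np

    def feddes_alpha(h, h1=-1.0, h2=-25.0, h3=-400.0, h4=-8000.0):
        """Piecewise-linear Feddes stress function alpha(h) in [0, 1].
        h is the soil pressure head (cm, negative); h1 > h2 > h3 > h4 are
        illustrative thresholds only."""
        h = np.atleast_1d(np.asarray(h, dtype=float))
        alpha = np.zeros_like(h)
        wet_ramp = (h <= h1) & (h > h2)      # reduction near saturation
        optimal = (h <= h2) & (h >= h3)      # no reduction
        dry_ramp = (h < h3) & (h >= h4)      # reduction towards wilting
        alpha[wet_ramp] = (h1 - h[wet_ramp]) / (h1 - h2)
        alpha[optimal] = 1.0
        alpha[dry_ramp] = (h[dry_ramp] - h4) / (h3 - h4)
        return alpha

    # Uncompensated uptake per layer: S_i = alpha(h_i) * b_i * T_pot, where b_i is a
    # normalized root density distribution and T_pot the potential transpiration.
    ```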

  7. IT product competition Network

    NASA Astrophysics Data System (ADS)

    Xu, Xiu-Lian; Zhou, Lei; Shi, Jian-Jun; Wang, Yong-Li; Feng, Ai-Xia; He, Da-Ren

    2008-03-01

    With ongoing technical development, competition among IT products has become increasingly fierce in recent years. Factories that produce the same IT product must continuously improve their product quality in order to capture a larger share of the sales market. We suggest using a complex network description for IT product competition. In the network the factories are defined as nodes, and two nodes are connected by a link if they produce a common IT product; each edge therefore represents a sales competition relationship. 2121 factories and 265 products have been investigated. Some statistical properties, such as the degree distribution, node strength distribution, assortativity, and node degree correlation, have been obtained empirically.
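
    Constructing such a competition network from a product-to-factory mapping is a simple projection: every pair of factories sharing a product gets an edge. The following is a small illustrative sketch with invented factory and product names (networkx is used for convenience; the statistics listed above can then be read off the resulting graph):

    ```python
    import networkx as nx
    from itertools import combinations

    # Hypothetical mapping from IT products to the factories producing them
    product_to_factories = {
        "router_x": ["factory_a", "factory_b", "factory_c"],
        "switch_y": ["factory_b", "factory_d"],
        "modem_z": ["factory_a", "factory_d"],
    }

    G = nx.Graph()
    for factories in product_to_factories.values():
        for f1, f2 in combinations(factories, 2):      # competing factory pairs
            if G.has_edge(f1, f2):
                G[f1][f2]["weight"] += 1               # number of shared products
            else:
                G.add_edge(f1, f2, weight=1)

    degrees = dict(G.degree())                          # degree distribution input
    strengths = dict(G.degree(weight="weight"))         # node strength distribution
    assortativity = nx.degree_assortativity_coefficient(G)
    ```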

  8. Deep learning with word embeddings improves biomedical named entity recognition.

    PubMed

    Habibi, Maryam; Weber, Leon; Neves, Mariana; Wiegandt, David Luis; Leser, Ulf

    2017-07-15

    Text mining has become an important tool for biomedical research. The most fundamental text-mining task is the recognition of biomedical named entities (NER), such as genes, chemicals and diseases. Current NER methods rely on pre-defined features which try to capture the specific surface properties of entity types, properties of the typical local context, background knowledge, and linguistic information. State-of-the-art tools are entity-specific, as dictionaries and empirically optimal feature sets differ between entity types, which makes their development costly. Furthermore, features are often optimized for a specific gold standard corpus, which makes extrapolation of quality measures difficult. We show that a completely generic method based on deep learning and statistical word embeddings [called long short-term memory network-conditional random field (LSTM-CRF)] outperforms state-of-the-art entity-specific NER tools, and often by a large margin. To this end, we compared the performance of LSTM-CRF on 33 data sets covering five different entity classes with that of best-of-class NER tools and an entity-agnostic CRF implementation. On average, F1-score of LSTM-CRF is 5% above that of the baselines, mostly due to a sharp increase in recall. The source code for LSTM-CRF is available at https://github.com/glample/tagger and the links to the corpora are available at https://corposaurus.github.io/corpora/ . habibima@informatik.hu-berlin.de. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  9. Deep learning with word embeddings improves biomedical named entity recognition

    PubMed Central

    Habibi, Maryam; Weber, Leon; Neves, Mariana; Wiegandt, David Luis; Leser, Ulf

    2017-01-01

    Abstract Motivation: Text mining has become an important tool for biomedical research. The most fundamental text-mining task is the recognition of biomedical named entities (NER), such as genes, chemicals and diseases. Current NER methods rely on pre-defined features which try to capture the specific surface properties of entity types, properties of the typical local context, background knowledge, and linguistic information. State-of-the-art tools are entity-specific, as dictionaries and empirically optimal feature sets differ between entity types, which makes their development costly. Furthermore, features are often optimized for a specific gold standard corpus, which makes extrapolation of quality measures difficult. Results: We show that a completely generic method based on deep learning and statistical word embeddings [called long short-term memory network-conditional random field (LSTM-CRF)] outperforms state-of-the-art entity-specific NER tools, and often by a large margin. To this end, we compared the performance of LSTM-CRF on 33 data sets covering five different entity classes with that of best-of-class NER tools and an entity-agnostic CRF implementation. On average, F1-score of LSTM-CRF is 5% above that of the baselines, mostly due to a sharp increase in recall. Availability and implementation: The source code for LSTM-CRF is available at https://github.com/glample/tagger and the links to the corpora are available at https://corposaurus.github.io/corpora/. Contact: habibima@informatik.hu-berlin.de PMID:28881963

  10. Primarily Statistics: Developing an Introductory Statistics Course for Pre-Service Elementary Teachers

    ERIC Educational Resources Information Center

    Green, Jennifer L.; Blankenship, Erin E.

    2013-01-01

    We developed an introductory statistics course for pre-service elementary teachers. In this paper, we describe the goals and structure of the course, as well as the assessments we implemented. Additionally, we use example course work to demonstrate pre-service teachers' progress both in learning statistics and as novice teachers. Overall, the…

  11. Three Empirical Strategies for Teaching Statistics

    ERIC Educational Resources Information Center

    Marson, Stephen M.

    2007-01-01

    This paper employs a three-step process to analyze three empirically supported strategies for teaching statistics to BSW students. The strategies included: repetition, immediate feedback, and use of original data. First, each strategy is addressed through the literature. Second, the application of employing each of the strategies over the period…

  12. The Effects of Pre-Lecture Quizzes on Test Anxiety and Performance in a Statistics Course

    ERIC Educational Resources Information Center

    Brown, Michael J.; Tallon, Jennifer

    2015-01-01

    The purpose of our study was to examine the effects of pre-lecture quizzes in a statistics course. Students (N = 70) from 2 sections of an introductory statistics course served as participants in this study. One section completed pre-lecture quizzes whereas the other section did not. Completing pre-lecture quizzes was associated with improved exam…

  13. The Impact of Human Capital on the Cost of Air Force Acquisition Programs

    DTIC Science & Technology

    2007-03-01

    inspection as well as an empirical test of the data. To empirically test for heteroskedasticity, the Breusch-Pagan/Cook-Weisberg test was used. With a p... (table-of-contents fragments: Pre-Specification Tests; Post-Specification Tests; Regression Results)
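
    The Breusch-Pagan/Cook-Weisberg test mentioned in the fragment above checks whether regression residual variance depends on the regressors. A hedged, self-contained sketch using statsmodels on synthetic data (the cost and human-capital variables here are invented placeholders, not the report's data):

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_breuschpagan

    rng = np.random.default_rng(0)

    # Hypothetical program-cost regression on human-capital predictors
    X = sm.add_constant(rng.normal(size=(200, 3)))
    y = X @ np.array([1.0, 0.5, -0.3, 0.8]) + rng.normal(size=200)

    ols_fit = sm.OLS(y, X).fit()

    # Breusch-Pagan / Cook-Weisberg test on the OLS residuals
    lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(ols_fit.resid, X)
    print(f"LM p-value = {lm_pvalue:.3f}")  # a small p-value indicates heteroskedasticity
    ```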

  14. Structural Model of the Effects of Cognitive and Affective Factors on the Achievement of Arabic-Speaking Pre-Service Teachers in Introductory Statistics

    ERIC Educational Resources Information Center

    Nasser, Fadia M.

    2004-01-01

    This study examined the extent to which statistics and mathematics anxiety, attitudes toward mathematics and statistics, motivation and mathematical aptitude can explain the achievement of Arabic speaking pre-service teachers in introductory statistics. Complete data were collected from 162 pre-service teachers enrolled in an academic…

  15. Reframing Serial Murder Within Empirical Research.

    PubMed

    Gurian, Elizabeth A

    2017-04-01

    Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.

  16. Statistical analysis of modeling error in structural dynamic systems

    NASA Technical Reports Server (NTRS)

    Hasselman, T. K.; Chrostowski, J. D.

    1990-01-01

    The paper presents a generic statistical model of the (total) modeling error for conventional space structures in their launch configuration. Modeling error is defined as the difference between analytical prediction and experimental measurement. It is represented by the differences between predicted and measured real eigenvalues and eigenvectors. Comparisons are made between pre-test and post-test models. Total modeling error is then subdivided into measurement error, experimental error and 'pure' modeling error, and comparisons made between measurement error and total modeling error. The generic statistical model presented in this paper is based on the first four global (primary structure) modes of four different structures belonging to the generic category of Conventional Space Structures (specifically excluding large truss-type space structures). As such, it may be used to evaluate the uncertainty of predicted mode shapes and frequencies, sinusoidal response, or the transient response of other structures belonging to the same generic category.

  17. The magnetic nature of umbra-penumbra boundary in sunspots

    NASA Astrophysics Data System (ADS)

    Jurčák, J.; Rezaei, R.; González, N. Bello; Schlichenmaier, R.; Vomlel, J.

    2018-03-01

    Context. Sunspots are the longest-known manifestation of solar activity, and their magnetic nature has been known for more than a century. Despite this, the boundary between umbrae and penumbrae, the two fundamental sunspot regions, has hitherto been solely defined by an intensity threshold. Aim. Here, we aim at studying the magnetic nature of umbra-penumbra boundaries in sunspots of different sizes, morphologies, evolutionary stages, and phases of the solar cycle. Methods: We used a sample of 88 scans of the Hinode/SOT spectropolarimeter to infer the magnetic field properties at the umbral boundaries. We defined these umbra-penumbra boundaries by an intensity threshold and performed a statistical analysis of the magnetic field properties on these boundaries. Results: We statistically prove that the umbra-penumbra boundary in stable sunspots is characterised by an invariant value of the vertical magnetic field component: the vertical component of the magnetic field strength does not depend on the umbra size, its morphology, or the phase of the solar cycle. With statistical Bayesian inference, we find that the strength of the vertical magnetic field component is, with a likelihood of 99%, in the range of 1849-1885 G, with the most probable value of 1867 G. In contrast, the magnetic field strength and inclination averaged along individual boundaries are found to be dependent on the umbral size: the larger the umbra, the stronger and more horizontal the magnetic field at its boundary. Conclusions: The umbra and penumbra of sunspots are separated by a boundary that has hitherto been defined by an intensity threshold. We now unveil the empirical law of the magnetic nature of the umbra-penumbra boundary in stable sunspots: it is an invariant vertical component of the magnetic field.

  18. Detection of Person Misfit in Computerized Adaptive Tests with Polytomous Items.

    ERIC Educational Resources Information Center

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    2002-01-01

    Compared the nominal and empirical null distributions of the standardized log-likelihood statistic for polytomous items for paper-and-pencil (P&P) and computerized adaptive tests (CATs). Results show that the empirical distribution of the statistic differed from the assumed standard normal distribution for both P&P tests and CATs. Also…

  19. The inhibitory potential of the condensed-tannin-rich fraction of Plathymenia reticulata Benth. (Fabaceae) against Bothrops atrox envenomation.

    PubMed

    de Moura, Valéria Mourão; da Silva, Wania Cristina Rodrigues; Raposo, Juliana D A; Freitas-de-Sousa, Luciana A; Dos-Santos, Maria Cristina; de Oliveira, Ricardo Bezerra; Veras Mourão, Rosa Helena

    2016-05-13

    Ethnobotanical studies have shown that Plathymenia reticulata Benth. (Fabaceae) has been widely used in cases of snake envenomation, particularly in Northern Brazil. In light of this, the aim of this study was to evaluate the inhibitory potential of the condensed-tannin-rich fraction obtained from the bark of P. reticulata against the main biological activities induced by Bothrops atrox venom (BaV). The chemical composition of the aqueous extract of P. reticulata (AEPr) was first investigated by thin-layer chromatography (TLC) and the extract was then fractionated by column chromatography on Sephadex LH-20. This yielded five main fractions (Pr1, Pr2, Pr3, Pr4 and Pr5), which were analyzed by colorimetry to determine their concentrations of total phenolics, total tannins and condensed tannins and to assess their potential for blocking the phospholipase activity of BaV. The Pr5 fraction was defined as the fraction rich in condensed tannins (CTPr), and its inhibitory potential against the activities of the venom was evaluated. CTPr was evaluated in different in vivo and in vitro experimental protocols. The in vivo protocols consisted of (1) pre-incubation (venom:CTPr, w/w), (2) pre-treatment (orally administered) and (3) post-treatment (orally administered) to evaluate the effect on the hemorrhagic and edematogenic activities of BaV; in the in vitro protocol the effect on phospholipase and coagulant activity using pre-incubation in both tests was evaluated. There was statistically significant inhibition (p<0.05) of hemorrhagic activity by CTPr when the pre-incubation protocol was used [55% (1:5, w/w) and 74% (1:10, w/w)] and when pre-treatment with doses of 50 and 100mg/kg was used (19% and 13%, respectively). However, for the concentrations tested, there was no statistically significant inhibition in the group subjected to post-treatment administered orally. CTPr blocked 100% of phospholipase activity and 63.3% (1:10, w/w) of coagulant activity when it was pre-incubated with BaV. There was a statistically significant reduction (p<0.05) in edema induced by BaV in the oral protocols. Maximum inhibition was 95% (pre-treatment). Our findings indicate that CTPr could be a good source of natural inhibitors of the components of snake venom responsible for inducing local inflammation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Increasing the relevance of GCM simulations for Climate Services

    NASA Astrophysics Data System (ADS)

    Smith, L. A.; Suckling, E.

    2012-12-01

    The design and interpretation of model simulations for climate services differ significantly from experimental design for the advancement of the fundamental research on predictability that underpins it. Climate services consider the best sources of information available today; this calls for a frank evaluation of model skill in the face of statistical benchmarks defined by empirical models. The fact that physical simulation models are thought to provide the only reliable method for extrapolating into conditions not previously observed has no bearing on whether or not today's simulation models outperform empirical models. Evidence on the length scales at which today's simulation models fail to outperform empirical benchmarks is presented; it is illustrated that this occurs even on global scales in decadal prediction. At all timescales considered thus far (as of July 2012), predictions based on simulation models are improved by blending with the output of statistical models. Blending is shown to be more interesting in the climate context than in the weather context, where blending with a history-based climatology is straightforward. As GCMs improve and as the Earth's climate moves further from that of the last century, the skill of simulation models and their relevance to climate services are expected to increase. Examples from both seasonal and decadal forecasting will be used to discuss a third approach that may increase the role of current GCMs more quickly. Specifically, aspects of the experimental design in previous hindcast experiments are shown to hinder the use of GCM simulations for climate services. Alternative designs are proposed. The value of revisiting Thompson's classic approach to improving weather forecasting in the fifties in the context of climate services is discussed.

  1. Pre-emptive effect of ibuprofen versus placebo on pain relief and success rates of medical abortion: a double-blind, randomized, controlled study.

    PubMed

    Avraham, Sarit; Gat, Itai; Duvdevani, Nir-Ram; Haas, Jigal; Frenkel, Yair; Seidman, Daniel S

    2012-03-01

    To determine the efficacy of pre-emptive administration of the nonsteroidal anti-inflammatory drug (NSAID) ibuprofen vs. a placebo on pain relief during medical abortion and to evaluate whether NSAIDs interfere with the action of misoprostol. Prospective, double-blind, randomized, controlled study. University-affiliated tertiary hospital. Sixty-one women who underwent first-trimester termination of pregnancy. Patients received 600 mg mifepristone orally, followed by 400 μg oral misoprostol 2 days later. They were randomized to receive pre-emptively two tablets of 400 mg ibuprofen orally or a placebo, when taking the misoprostol. The patients completed a questionnaire about side effects and pain score and returned for an ultrasound follow-up examination 10-14 days after the medical abortion. Significant pain, assessed by the need for additional analgesia, and failure rates, defined by a need for surgical intervention. Pre-emptive ibuprofen treatment was found to be more effective than a placebo in pain prevention, as determined by a significantly lower need for additional analgesia: 11 of 29 (38%) vs. 25 of 32 (78%), respectively. Treatment failure rate was not statistically different between the ibuprofen and placebo groups: 4 of 28 (14.2%) vs. 3 of 31 (9.7%), respectively. History of menstrual pain was predictive for the need of additional analgesia. Pre-emptive use of ibuprofen had a statistically significant beneficial effect on the need for pain relief during a mifepristone and misoprostol regimen for medical abortion. Ibuprofen did not adversely affect the outcome of medical abortion. NCT00997074. Copyright © 2012 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  2. Pre-test probability of obstructive coronary stenosis in patients undergoing coronary CT angiography: Comparative performance of the Modified Diamond-Forrester algorithm versus methods incorporating cardiovascular risk factors.

    PubMed

    Ferreira, António Miguel; Marques, Hugo; Tralhão, António; Santos, Miguel Borges; Santos, Ana Rita; Cardoso, Gonçalo; Dores, Hélder; Carvalho, Maria Salomé; Madeira, Sérgio; Machado, Francisco Pereira; Cardim, Nuno; de Araújo Gonçalves, Pedro

    2016-11-01

    Current guidelines recommend the use of the Modified Diamond-Forrester (MDF) method to assess the pre-test likelihood of obstructive coronary artery disease (CAD). We aimed to compare the performance of the MDF method with two contemporary algorithms derived from multicenter trials that additionally incorporate cardiovascular risk factors: the calculator-based 'CAD Consortium 2' method, and the integer-based CONFIRM score. We assessed 1069 consecutive patients without known CAD undergoing coronary CT angiography (CCTA) for stable chest pain. Obstructive CAD was defined as the presence of coronary stenosis ≥50% on 64-slice dual-source CT. The three methods were assessed for calibration, discrimination, net reclassification, and changes in proposed downstream testing based upon calculated pre-test likelihoods. The observed prevalence of obstructive CAD was 13.8% (n=147). Overestimations of the likelihood of obstructive CAD were 140.1%, 9.8%, and 18.8%, respectively, for the MDF, CAD Consortium 2 and CONFIRM methods. The CAD Consortium 2 showed greater discriminative power than the MDF method, with a C-statistic of 0.73 vs. 0.70 (p<0.001), while the CONFIRM score did not (C-statistic 0.71, p=0.492). Reclassification of pre-test likelihood using the 'CAD Consortium 2' or CONFIRM scores resulted in a net reclassification improvement of 0.19 and 0.18, respectively, which would change the diagnostic strategy in approximately half of the patients. Newer risk factor-encompassing models allow for a more precise estimation of pre-test probabilities of obstructive CAD than the guideline-recommended MDF method. Adoption of these scores may improve disease prediction and change the diagnostic pathway in a significant proportion of patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
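
    Two of the metrics used above, the C-statistic and the net reclassification improvement (NRI), are simple to compute once each model's pre-test probabilities and the observed presence of obstructive CAD are available. The sketch below is a generic illustration with hypothetical arrays and an assumed single reclassification threshold; it does not reproduce the paper's category boundaries.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    def net_reclassification_improvement(y, p_old, p_new, threshold=0.15):
        """Two-category NRI at an assumed threshold: upward reclassification should
        occur in patients with disease, downward reclassification in those without."""
        up = (p_new >= threshold) & (p_old < threshold)
        down = (p_new < threshold) & (p_old >= threshold)
        events, non_events = (y == 1), (y == 0)
        nri_events = up[events].mean() - down[events].mean()
        nri_non_events = down[non_events].mean() - up[non_events].mean()
        return nri_events + nri_non_events

    # Hypothetical usage, with y the observed >=50% stenosis indicator and p_old / p_new
    # the pre-test probabilities from the two scores being compared:
    # c_old, c_new = roc_auc_score(y, p_old), roc_auc_score(y, p_new)   # C-statistics
    # nri = net_reclassification_improvement(y, p_old, p_new)
    ```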

  3. Initial therapeutic strategy of invasive candidiasis for intensive care unit patients: a retrospective analysis from the China-SCAN study.

    PubMed

    Cui, Na; Wang, Hao; Su, Longxiang; Qiu, Haibo; Li, Ruoyu; Liu, Dawei

    2017-01-23

    To investigate the impact of initial antifungal therapeutic strategies on the prognosis of invasive Candida infections (ICIs) in intensive care units (ICUs) in China. A total of 306 patients with proven ICIs in the China-SCAN study were analyzed retrospectively. Empiric, pre-emptive, and targeted therapy were adopted based on starting criteria including clinical, microbiological, and other conventional prediction rules. The primary outcome was hospital mortality and the secondary endpoints were duration days in ICU and duration days in hospital. The global responses (clinical and microbiological) at the end of the empirical therapy were also assessed. A total of 268/306 (87.6%) ICI patients received antifungal therapy, including 142/268 (53.0%) initial empirical therapy, 53/268 (19.8%) initial pre-emptive therapy, and 73/268 (27.2%) initial targeted therapy. Compared with initial empirical antifungal therapy and targeted antifungal therapy, patients with initial pre-emptive antifungal therapy had significantly less clinical remission [11/53 (21.2%) vs. 61/142 (43.3%) vs. 22/73 (30.1%), P = 0.009], higher ICU [26/53 (57.8%) vs. 42/142 (32.2%) vs. 27/73 (43.5%), P = 0.008] and hospital mortality [27/53 (60.0%) vs. 43/142 (32.8%) vs. 29/73 (46.8%), P = 0.004], and more microbiological persistence [9/53 (17.0%) vs. 6/142 (4.2%) vs. 9/73 (12.3%), P = 0.011]. Kaplan-Meier survival analysis revealed that ICI patients with initial pre-emptive antifungal therapy and targeted antifungal therapy had reduced hospital duration compared with patients with initial empirical antifungal therapy after confirmation of fungal infection (log-rank test: P = 0.021). Multivariate regression analysis provided evidence that initial empirical antifungal therapy was an independent predictor of decreased hospital mortality in ICI patients on ICU admission and at ICI diagnosis (odds ratio 0.327, 95% confidence interval 0.160-0.667, P = 0.002; odds ratio 0.351, 95% confidence interval 0.168-0.735, P = 0.006). The initial therapeutic strategy for invasive candidiasis was independently associated with hospital mortality. Prompt empirical antifungal therapy could be critical to decrease early hospital mortality. Clinicaltrials.gov NCT01253954 (retrospective registration date: December 3, 2010).

  4. Statistical field theory of futures commodity prices

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Yu, Miao

    2018-02-01

    The statistical theory of commodity prices was formulated by Baaquie (2013). Further empirical studies of single (Baaquie et al., 2015) and multiple commodity prices (Baaquie et al., 2016) have provided strong evidence in support of the primary assumptions of the statistical formulation. In this paper, the model for spot prices (Baaquie, 2013) is extended to futures commodity prices using a statistical field theory of futures commodity prices. The futures prices are modeled as a two-dimensional statistical field and a nonlinear Lagrangian is postulated. Empirical studies provide clear evidence in support of the model, with many nontrivial features of the model finding unexpected support from market data.

  5. Lagrangian single-particle turbulent statistics through the Hilbert-Huang transform.

    PubMed

    Huang, Yongxiang; Biferale, Luca; Calzavarini, Enrico; Sun, Chao; Toschi, Federico

    2013-04-01

    The Hilbert-Huang transform is applied to analyze single-particle Lagrangian velocity data from numerical simulations of hydrodynamic turbulence. The velocity trajectory is described in terms of a set of intrinsic mode functions C(i)(t) and of their instantaneous frequency ω(i)(t). On the basis of this decomposition we define the ω-conditioned statistical moments of the C(i) modes, named q-order Hilbert spectra (HS). We show that such quantities have enhanced scaling properties as compared to traditional Fourier transform- or correlation-based (structure functions) statistical indicators, thus providing better insights into the turbulent energy transfer process. We present clear empirical evidence that the energylike quantity, i.e., the second-order HS, displays a linear scaling in time in the inertial range, as expected from a dimensional analysis. We also measure high-order moment scaling exponents in a direct way, without resorting to the extended self-similarity procedure. This leads to an estimate of the Lagrangian structure function exponents which are consistent with the multifractal prediction in the Lagrangian frame as proposed by Biferale et al. [Phys. Rev. Lett. 93, 064502 (2004)].
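
    The instantaneous frequency ω_i(t) used to condition the statistics above is obtained from the analytic signal of each intrinsic mode function. The sketch below, assuming the empirical mode decomposition has already been performed upstream, shows that step and an ω-conditioned second-order moment in the spirit of the Hilbert spectra described in the abstract; function names and binning are illustrative only.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def instantaneous_frequency(mode, dt):
        """Instantaneous frequency of one intrinsic mode function C_i(t),
        from the unwrapped phase of its analytic signal."""
        phase = np.unwrap(np.angle(hilbert(mode)))
        return np.gradient(phase, dt) / (2.0 * np.pi)

    def second_order_hilbert_spectrum(mode, dt, freq_bins):
        """Frequency-conditioned second moment of the mode amplitude
        (a sketch of a q = 2 Hilbert spectrum)."""
        amp_sq = np.abs(hilbert(mode)) ** 2
        freq = instantaneous_frequency(mode, dt)
        bin_idx = np.digitize(freq, freq_bins)
        return np.array([amp_sq[bin_idx == k].mean() if np.any(bin_idx == k) else np.nan
                         for k in range(1, len(freq_bins))])
    ```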

  6. Evaluation and parameterization of ATCOR3 topographic correction method for forest cover mapping in mountain areas

    NASA Astrophysics Data System (ADS)

    Balthazar, Vincent; Vanacker, Veerle; Lambin, Eric F.

    2012-08-01

    A topographic correction of optical remote sensing data is necessary to improve the quality of quantitative forest cover change analyses in mountainous terrain. The implementation of semi-empirical correction methods requires the calibration of model parameters that are empirically defined. This study develops a method to improve the performance of topographic corrections for forest cover change detection in mountainous terrain through an iterative tuning method of model parameters based on a systematic evaluation of the performance of the correction. The latter was based on: (i) the general matching of reflectances between sunlit and shaded slopes and (ii) the occurrence of abnormal reflectance values, qualified as statistical outliers, in very low illuminated areas. The method was tested on Landsat ETM+ data for rough (Ecuadorian Andes) and very rough mountainous terrain (Bhutan Himalayas). Compared to a reference level (no topographic correction), the ATCOR3 semi-empirical correction method resulted in a considerable reduction of dissimilarities between reflectance values of forested sites in different topographic orientations. Our results indicate that optimal parameter combinations depend on the site, sun elevation and azimuth, and spectral conditions. We demonstrate that the results of relatively simple topographic correction methods can be greatly improved through a feedback loop between parameter tuning and evaluation of the performance of the correction model.
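
    ATCOR3 itself is proprietary, so the sketch below only illustrates the generic feedback loop the abstract describes, using a simple C-correction as a hypothetical stand-in; the reflectance and illumination arrays, masks, and thresholds are synthetic placeholders, while the evaluation criteria mirror the two listed above (sunlit/shaded matching and outliers in poorly illuminated areas).

    ```python
    # Parameter-tuning feedback loop for a semi-empirical topographic correction.
    import numpy as np

    def c_correction(reflectance, cos_i, cos_z, c):
        """Semi-empirical topographic correction with a tunable parameter c."""
        return reflectance * (cos_z + c) / (cos_i + c)

    def evaluate(corrected, sunlit, shaded, low_illum):
        """(i) sunlit/shaded mean-reflectance mismatch and (ii) fraction of
        statistical outliers in very low illuminated areas."""
        mismatch = abs(corrected[sunlit].mean() - corrected[shaded].mean())
        low = corrected[low_illum]
        mu, sd = low.mean(), low.std()
        outlier_frac = np.mean(np.abs(low - mu) > 3 * sd)
        return mismatch + 10.0 * outlier_frac      # ad-hoc combined score

    def tune(reflectance, cos_i, cos_z, sunlit, shaded, low_illum,
             c_grid=np.linspace(0.05, 2.0, 40)):
        """Iterate over candidate parameter values and keep the best-scoring one."""
        scores = [evaluate(c_correction(reflectance, cos_i, cos_z, c),
                           sunlit, shaded, low_illum) for c in c_grid]
        best = int(np.argmin(scores))
        return c_grid[best], scores[best]

    # Tiny synthetic example
    rng = np.random.default_rng(1)
    cos_i = rng.uniform(0.05, 1.0, 10000)     # cosine of local illumination angle
    cos_z = 0.8                                # cosine of solar zenith angle
    true = 0.25                                # "flat terrain" reflectance
    refl = true * (cos_i + 0.4) / (cos_z + 0.4)  # simulated illumination effect
    sunlit, shaded, low_illum = cos_i > 0.7, cos_i < 0.3, cos_i < 0.1
    print(tune(refl, cos_i, cos_z, sunlit, shaded, low_illum))
    ```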

  7. Van Allen Probes Observations of Plasmasphere Refilling Inside and Outside the Plasmapause

    NASA Astrophysics Data System (ADS)

    De Pascuale, S.; Kletzing, C.; Kurth, W. S.; Jordanova, V. K.

    2017-12-01

    We survey several geomagnetic storms observed by the Van Allen Probes to determine the rate of plasmasphere refilling following the initial erosion of the plasmapause region. The EMFISIS instrument on board the spacecraft provides near-equatorial in situ electron density measurements, which are accurate to within 10% in the detectable range 2 < L < 6. Two-dimensional plasmasphere density simulations, providing global context for the local observations, are driven by the incident solar wind electric field as a proxy for geomagnetic activity. The simulations utilize a semi-empirical model of convection and a semi-empirical model of ionospheric outflow to dynamically evolve plasmaspheric densities. We find that at high L the plasmasphere undergoes a density depletion of orders of magnitude (from 100s down to 10s of cm-3) in response to a geomagnetic event and recovers to pre-storm levels over many days. At low L, where densities are of the order of 1000s cm-3, and within the plasmapause, the plasmasphere loses density by a factor of 2 to 3 (from about 3000 to 1000 cm-3), producing a depletion that can persist over weeks during sustained geomagnetic activity. We describe the impact of these results on the challenge of defining a saturated quiet state of the plasmasphere.

  8. Solar-terrestrial predictions proceedings. Volume 4: Prediction of terrestrial effects of solar activity

    NASA Technical Reports Server (NTRS)

    Donnelly, R. E. (Editor)

    1980-01-01

    Papers on the prediction of ionospheric and radio propagation conditions, based primarily on empirical or statistical relations, are discussed. Predictions of sporadic E, spread F, and scintillations generally involve statistical or empirical methods. The correlation between solar activity and terrestrial seismic activity and the possible relation between solar activity and biological effects are also discussed.

  9. An Empirical Investigation of Methods for Assessing Item Fit for Mixed Format Tests

    ERIC Educational Resources Information Center

    Chon, Kyong Hee; Lee, Won-Chan; Ansley, Timothy N.

    2013-01-01

    Empirical information regarding performance of model-fit procedures has been a persistent need in measurement practice. Statistical procedures for evaluating item fit were applied to real test examples that consist of both dichotomously and polytomously scored items. The item fit statistics used in this study included PARSCALE's G²,…

  10. Empirical performance of interpolation techniques in risk-neutral density (RND) estimation

    NASA Astrophysics Data System (ADS)

    Bahaludin, H.; Abdullah, M. H.

    2017-03-01

    The objective of this study is to evaluate the empirical performance of interpolation techniques in risk-neutral density (RND) estimation. Firstly, the empirical performance is evaluated by using statistical analysis based on the implied mean and the implied variance of the RND. Secondly, the interpolation performance is measured based on pricing error. We propose using the leave-one-out cross-validation (LOOCV) pricing error for interpolation selection purposes. The statistical analyses indicate that there are statistical differences between the interpolation techniques: second-order polynomial, fourth-order polynomial and smoothing spline. The results of the LOOCV pricing error show that interpolation using the fourth-order polynomial provides the best fit to option prices, as it has the lowest error.
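
    As a rough illustration of the LOOCV pricing-error idea, the sketch below fits polynomials of different orders to a hypothetical implied-volatility smile and scores each by its leave-one-out error; the conversion of the interpolated smile back to option prices and RNDs used in the study is omitted, and the strike/volatility values are made up.

    ```python
    # Leave-one-out cross-validation error for polynomial interpolation selection.
    import numpy as np

    def loocv_error(strikes, values, order):
        """Mean squared leave-one-out error of a polynomial interpolation."""
        errors = []
        for i in range(len(strikes)):
            mask = np.arange(len(strikes)) != i
            coeffs = np.polyfit(strikes[mask], values[mask], order)
            pred = np.polyval(coeffs, strikes[i])
            errors.append((pred - values[i]) ** 2)
        return float(np.mean(errors))

    # Hypothetical smile: implied volatility vs. strike
    strikes = np.array([80, 85, 90, 95, 100, 105, 110, 115, 120], dtype=float)
    ivols = np.array([0.32, 0.29, 0.27, 0.25, 0.24, 0.245, 0.255, 0.27, 0.29])

    for order in (2, 4):
        print(f"order {order}: LOOCV error = {loocv_error(strikes, ivols, order):.6f}")
    ```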

  11. Testing Transitivity of Preferences on Two-Alternative Forced Choice Data

    PubMed Central

    Regenwetter, Michel; Dana, Jason; Davis-Stober, Clintin P.

    2010-01-01

    As Duncan Luce and other prominent scholars have pointed out on several occasions, testing algebraic models against empirical data raises difficult conceptual, mathematical, and statistical challenges. Empirical data often result from statistical sampling processes, whereas algebraic theories are nonprobabilistic. Many probabilistic specifications lead to statistical boundary problems and are subject to nontrivial order-constrained statistical inference. The present paper discusses Luce's challenge for a particularly prominent axiom: Transitivity. The axiom of transitivity is a central component in many algebraic theories of preference and choice. We offer the most complete solution currently available to the challenge in the case of transitivity of binary preference on the theory side and two-alternative forced choice on the empirical side, explicitly for up to five, and implicitly for up to seven, choice alternatives. We also discuss the relationship between our proposed solution and weak stochastic transitivity. We recommend abandoning the latter as a model of transitive individual preferences. PMID:21833217

  12. Enhancing predictive accuracy and reproducibility in clinical evaluation research: Commentary on the special section of the Journal of Evaluation in Clinical Practice.

    PubMed

    Bryant, Fred B

    2016-12-01

    This paper introduces a special section of the current issue of the Journal of Evaluation in Clinical Practice that includes a set of 6 empirical articles showcasing a versatile, new machine-learning statistical method, known as optimal data (or discriminant) analysis (ODA), specifically designed to produce statistical models that maximize predictive accuracy. As this set of papers clearly illustrates, ODA offers numerous important advantages over traditional statistical methods-advantages that enhance the validity and reproducibility of statistical conclusions in empirical research. This issue of the journal also includes a review of a recently published book that provides a comprehensive introduction to the logic, theory, and application of ODA in empirical research. It is argued that researchers have much to gain by using ODA to analyze their data. © 2016 John Wiley & Sons, Ltd.

  13. A support vector machine based test for incongruence between sets of trees in tree space

    PubMed Central

    2012-01-01

    Background The increased use of multi-locus data sets for phylogenetic reconstruction has increased the need to determine whether a set of gene trees significantly deviates from the phylogenetic patterns of other genes. Such unusual gene trees may have been influenced by other evolutionary processes such as selection, gene duplication, or horizontal gene transfer. Results Motivated by this problem, we propose a nonparametric goodness-of-fit test for two empirical distributions of gene trees, and we developed the software GeneOut to estimate a p-value for the test. Our approach maps trees into a multi-dimensional vector space and then applies support vector machines (SVMs) to measure the separation between two sets of pre-defined trees. We use a permutation test to assess the significance of the SVM separation. To demonstrate the performance of GeneOut, we applied it to the comparison of gene trees simulated within different species trees across a range of species tree depths. Applied directly to sets of simulated gene trees with large sample sizes, GeneOut was able to detect very small differences between two sets of gene trees generated under different species trees. Our statistical test can also incorporate tree reconstruction into its test framework through a variety of phylogenetic optimality criteria. When applied to DNA sequence data simulated from different sets of gene trees, results in the form of receiver operating characteristic (ROC) curves indicated that GeneOut performed well in the detection of differences between sets of trees with different distributions in a multi-dimensional space. Furthermore, it controlled false positive and false negative rates very well, indicating a high degree of accuracy. Conclusions The non-parametric nature of our statistical test provides fast and efficient analyses, and makes it an applicable test for any scenario where evolutionary or other factors can lead to trees with different multi-dimensional distributions. The software GeneOut is freely available under the GNU public license. PMID:22909268
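
    The following is a minimal sketch of the SVM-separation permutation test described above, assuming the trees have already been mapped to vectors (synthetic arrays stand in for them) and using scikit-learn; it is not the GeneOut implementation itself, and cross-validated accuracy is used here as a convenient separation statistic.

    ```python
    # Permutation test on the separation of two sets of tree vectors by a linear SVM.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def separation(X, y):
        """Cross-validated accuracy of a linear SVM as a separation statistic."""
        return cross_val_score(SVC(kernel="linear", C=1.0), X, y, cv=5).mean()

    def permutation_pvalue(X, y, n_perm=200, seed=0):
        rng = np.random.default_rng(seed)
        observed = separation(X, y)
        null = [separation(X, rng.permutation(y)) for _ in range(n_perm)]
        pval = (np.sum(np.asarray(null) >= observed) + 1) / (n_perm + 1)
        return observed, pval

    # Two synthetic "sets of trees" already embedded in a vector space
    rng = np.random.default_rng(42)
    set_a = rng.normal(0.0, 1.0, size=(40, 10))
    set_b = rng.normal(0.4, 1.0, size=(40, 10))
    X = np.vstack([set_a, set_b])
    y = np.array([0] * 40 + [1] * 40)
    print(permutation_pvalue(X, y))
    ```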

  14. Evaluation of excitation functions of proton and deuteron induced reactions on enriched tellurium isotopes with special relevance to the production of iodine-124.

    PubMed

    Aslam, M N; Sudár, S; Hussain, M; Malik, A A; Shah, H A; Qaim, S M

    2010-09-01

    Cross-section data for the production of medically important radionuclide (124)I via five proton and deuteron induced reactions on enriched tellurium isotopes were evaluated. The nuclear model codes, STAPRE, EMPIRE and TALYS, were used for consistency checks of the experimental data. Recommended excitation functions were derived using a well-defined statistical procedure. Therefrom integral yields were calculated. The various production routes of (124)I were compared. Presently the (124)Te(p,n)(124)I reaction is the method of choice; however, the (125)Te(p,2n)(124)I reaction also appears to have great potential.

  15. Free and Open Source Tools (FOSTs): An Empirical Investigation of Pre-Service Teachers' Competencies, Attitudes, and Pedagogical Intentions

    ERIC Educational Resources Information Center

    Asing-Cashman, Joyce G.; Gurung, Binod; Limbu, Yam B.; Rutledge, David

    2014-01-01

    This study examines the digital native pre-service teachers' (DNPSTs) perceptions of their competency, attitude, and pedagogical intention to use free and open source tools (FOSTs) in their future teaching. Participants were 294 PSTs who responded to pre-course surveys at the beginning of an educational technology course. Using the structural…

  16. Effectiveness of integrative and instrumental reminiscence therapies on depression symptoms reduction in institutionalized older adults: an empirical study.

    PubMed

    Karimi, H; Dolatshahee, B; Momeni, K; Khodabakhshi, A; Rezaei, M; Kamrani, A A

    2010-09-01

    Reminiscence therapy is a psychological intervention which is specifically designed to address issues of particular relevance to older adults, such as depression. The latest approach to the research on therapeutic utility of reminiscence is gaining popularity among researchers and practitioners, and has yielded promising results. Specifying different types of reminiscence is a crucial component of the approach. The aim of this study was to examine the therapeutic effectiveness of integrative and instrumental types of reminiscence for the treatment of depression in institutionalized older adults dwelling in a nursing home. The study employed a three-group pre-post-test design with random allocation to instrumental or integrative reminiscence or an active social discussion control condition. Twenty-nine institutionalized older adults (12 men and 17 women) with depressive symptoms varying from mild to severe constituted the sample. The interventions were implemented in a short-form group format. Analysis of changes from pre-test to post-test revealed that integrative reminiscence therapy led to statistically significant reduction in symptoms of depression in contrast with the control group. Although instrumental reminiscence therapy also reduced depressive symptoms, this improvement was not statistically significant compared to the control group. This study provides additional support for the effectiveness of integrative reminiscence therapy as an intervention for depressed older adults living in residential care settings. This study also provides support for the hypothesis that certain types of reminiscence produce their own specific effects.

  17. Familiarizing Students with the Empirically Supported Treatment Approaches for Childhood Problems.

    ERIC Educational Resources Information Center

    Wilkins, Victoria; Chambliss, Catherine

    The clinical research literature exploring the efficacy of particular treatment approaches is reviewed with the intent to facilitate the training of counseling students. Empirically supported treatments (ESTs) are defined operationally as evidence-based treatments following the listing of empirically validated psychological treatments reported by…

  18. Using exploratory regression to identify optimal driving factors for cellular automaton modeling of land use change.

    PubMed

    Feng, Yongjiu; Tong, Xiaohua

    2017-09-22

    Defining transition rules is an important issue in cellular automaton (CA)-based land use modeling because these models incorporate highly correlated driving factors. Multicollinearity among correlated driving factors may produce negative effects that must be eliminated from the modeling. Using exploratory regression under pre-defined criteria, we identified all possible combinations of factors from the candidate factors affecting land use change. Three combinations that incorporate five driving factors meeting pre-defined criteria were assessed. With the selected combinations of factors, three logistic regression-based CA models were built to simulate dynamic land use change in Shanghai, China, from 2000 to 2015. For comparative purposes, a CA model with all candidate factors was also applied to simulate the land use change. Simulations using the three CA models with multicollinearity eliminated performed better (with accuracy improvements of about 3.6%) than the model incorporating all candidate factors. Our results showed that not all candidate factors are necessary for accurate CA modeling and that the simulations were not sensitive to changes in statistically non-significant driving factors. We conclude that exploratory regression is an effective method to search for the optimal combinations of driving factors, leading to better land use change models that are devoid of multicollinearity. We suggest identification of dominant factors and elimination of multicollinearity before building land change models, making it possible to simulate more realistic outcomes.
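
    A minimal sketch of exploratory regression in this spirit is given below: it enumerates combinations of hypothetical candidate factors, fits a logistic regression for each, and keeps only combinations whose coefficients are all significant and whose variance inflation factors stay below a chosen threshold. Column names, thresholds, and data are illustrative assumptions, not the study's, and statsmodels is assumed to be available.

    ```python
    # Exploratory regression: screen factor combinations by significance and VIF.
    from itertools import combinations

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    def passes_criteria(df, change, factors, p_max=0.05, vif_max=7.5):
        X = sm.add_constant(df[list(factors)])
        model = sm.Logit(change, X).fit(disp=0)
        if (model.pvalues.drop("const") > p_max).any():
            return False, None
        vifs = [variance_inflation_factor(X.values, i) for i in range(1, X.shape[1])]
        if max(vifs) > vif_max:
            return False, None
        return True, model

    def exploratory_regression(df, change, candidates, k=5):
        """Return all k-factor combinations that satisfy the criteria."""
        kept = []
        for combo in combinations(candidates, k):
            ok, model = passes_criteria(df, change, combo)
            if ok:
                kept.append((combo, model.prsquared))
        return sorted(kept, key=lambda item: item[1], reverse=True)

    # Synthetic example with hypothetical driving factors
    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame(rng.normal(size=(n, 7)),
                      columns=["dist_road", "dist_center", "slope", "elev",
                               "dist_water", "pop_density", "noise"])
    logit = 1.5 * df.dist_road - 1.0 * df.slope + 0.8 * df.pop_density
    change = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)
    print(exploratory_regression(df, change, df.columns, k=3)[:3])
    ```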

  19. Two statistics for evaluating parameter identifiability and error reduction

    USGS Publications Warehouse

    Doherty, John; Hunt, Randall J.

    2009-01-01

    Two statistics are presented that can be used to rank input parameters utilized by a model in terms of their relative identifiability based on a given or possible future calibration dataset. Identifiability is defined here as the capability of model calibration to constrain parameters used by a model. Both statistics require that the sensitivity of each model parameter be calculated for each model output for which there are actual or presumed field measurements. Singular value decomposition (SVD) of the weighted sensitivity matrix is then undertaken to quantify the relation between the parameters and observations that, in turn, allows selection of calibration solution and null spaces spanned by unit orthogonal vectors. The first statistic presented, "parameter identifiability", is quantitatively defined as the direction cosine between a parameter and its projection onto the calibration solution space. This varies between zero and one, with zero indicating complete non-identifiability and one indicating complete identifiability. The second statistic, "relative error reduction", indicates the extent to which the calibration process reduces error in estimation of a parameter from its pre-calibration level where its value must be assigned purely on the basis of prior expert knowledge. This is more sophisticated than identifiability, in that it takes greater account of the noise associated with the calibration dataset. Like identifiability, it has a maximum value of one (which can only be achieved if there is no measurement noise). Conceptually it can fall to zero; and even below zero if a calibration problem is poorly posed. An example, based on a coupled groundwater/surface-water model, is included that demonstrates the utility of the statistics. © 2009 Elsevier B.V.
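
    Following the abstract's description, the sketch below computes the identifiability statistic as the direction cosine between each parameter axis and its projection onto a solution space spanned by the leading right singular vectors of a weighted sensitivity matrix. The Jacobian, weights, and singular-value cutoff are synthetic assumptions rather than the paper's worked example.

    ```python
    # Parameter identifiability from the SVD of a weighted sensitivity (Jacobian) matrix.
    import numpy as np

    def identifiability(jacobian, weights, sv_ratio=1e-2):
        """Return one identifiability value per parameter (0 = none, 1 = full)."""
        Q = np.diag(weights)                     # observation weight matrix
        U, s, Vt = np.linalg.svd(Q @ jacobian, full_matrices=False)
        k = int(np.sum(s > sv_ratio * s[0]))     # dimension of the solution space
        V_sol = Vt[:k].T                         # columns span the solution space
        # Norm of the projection of each unit parameter vector onto the solution
        # space = direction cosine between the parameter axis and its projection.
        return np.sqrt(np.sum(V_sol ** 2, axis=1))

    rng = np.random.default_rng(3)
    n_obs, n_par = 50, 8
    J = rng.normal(size=(n_obs, n_par))
    J[:, -2:] *= 1e-4                            # two nearly insensitive parameters
    w = np.ones(n_obs)
    print(np.round(identifiability(J, w), 3))
    ```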

  20. Monitoring the North Atlantic using ocean colour data

    NASA Astrophysics Data System (ADS)

    Fuentes-Yaco, C.; Caverhill, C.; Maass, H.; Porter, C.; White, GN, III

    2016-04-01

    The Remote Sensing Unit (RSU) at the Bedford Institute of Oceanography (BIO) has been monitoring the North Atlantic using ocean colour products for decades. Optical sensors used include CZCS, POLDER, SeaWiFS, MODIS/Aqua and MERIS. The monitoring area is defined by the Atlantic Zone Monitoring Program (AZMP), but certain products extend into Arctic waters and all Canadian waters, including the Pacific coast. RSU provides Level 3 images for various products in several formats and a range of temporal and spatial resolutions. Basic statistics for pre-defined areas of interest are compiled for each product. Climatologies and anomaly maps are also routinely produced, and custom products are delivered by request. RSU is involved in the generation of Level 4 products, such as characterizing the phenology of spring and fall phytoplankton blooms, computing primary production, using ocean colour to aid in EBSA (Ecologically and Biologically Significant Area) definition and developing habitat suitability maps. Upcoming operational products include maps of diatom distribution, biogeochemical province boundaries, and products from sensors such as VIIRS (Visible Infrared Imaging Radiometer Suite), OLCI (Ocean Land Colour Instrument), and the PACE (Pre-Aerosol, Clouds and ocean Ecosystem) hyperspectral microsatellite mission.

  1. Teaching Singing in the Russian Empire Educational Institutions: Importance and Results

    ERIC Educational Resources Information Center

    Molchanova, Violetta S.; Artemova, Svetlana F.; Balaniuk, Leonid L.

    2018-01-01

    The article deals with the system of Singing lessons in the educational institutions of the Russian Empire. Attention is paid to the historical and educational significance of musical and choral training in schools, the difficulties and methodological approaches in teaching. Pre-revolutionary, Soviet and modern scientific literature was used as…

  2. An Empirical Test of Mnemonic Devices to Improve Learning in Elementary Accounting

    ERIC Educational Resources Information Center

    Laing, Gregory Kenneth

    2010-01-01

    The author empirically examined the use of mnemonic devices to enhance learning in first-year accounting at university. The experiment was conducted on three groups using learning strategy application as a between-participants factor. The means of the scores from pre- and posttests were analyzed using Student's t test. No significant…

  3. The Socratic Method: Empirical Assessment of a Psychology Capstone Course

    ERIC Educational Resources Information Center

    Burns, Lawrence R.; Stephenson, Paul L.; Bellamy, Katy

    2016-01-01

    Although students make some epistemological progress during college, most graduate without developing meaning-making strategies that reflect an understanding that knowledge is socially constructed. Using a pre-test-post-test design and a within-subjects 2 × 2 mixed-design ANOVA, this study reports on empirical findings which support the Socratic…

  4. Modeling thrombin generation: plasma composition based approach.

    PubMed

    Brummel-Ziedins, Kathleen E; Everse, Stephen J; Mann, Kenneth G; Orfeo, Thomas

    2014-01-01

    Thrombin has multiple functions in blood coagulation and its regulation is central to maintaining the balance between hemorrhage and thrombosis. Empirical and computational methods that capture thrombin generation can provide advancements to current clinical screening of the hemostatic balance at the level of the individual. In any individual, procoagulant and anticoagulant factor levels together act to generate a unique coagulation phenotype (net balance) that is reflective of the sum of its developmental, environmental, genetic, nutritional and pharmacological influences. Defining such thrombin phenotypes may provide a means to track disease progression pre-crisis. In this review we briefly describe thrombin function, methods for assessing thrombin dynamics as a phenotypic marker, computationally derived thrombin phenotypes versus determined clinical phenotypes, the boundaries of normal range thrombin generation using plasma composition based approaches and the feasibility of these approaches for predicting risk.

  5. Studying Weather and Climate Extremes in a Non-stationary Framework

    NASA Astrophysics Data System (ADS)

    Wu, Z.

    2010-12-01

    The study of weather and climate extremes often uses the theory of extreme values. Such a detection method has a major problem: to obtain the probability distribution of extremes, one has to implicitly assume the Earth’s climate is stationary over a long period within which the climatology is defined. While such detection makes some sense in a purely statistical view of stationary processes, it can lead to misleading statistical properties of weather and climate extremes caused by long-term climate variability and change, and may also cause enormous difficulty in attributing and predicting these extremes. To alleviate this problem, here we report a novel non-stationary framework for studying weather and climate extremes. In this new framework, the weather and climate extremes will be defined as timescale-dependent quantities derived from the anomalies with respect to non-stationary climatologies of different timescales. With this non-stationary framework, the non-stationary and nonlinear nature of the climate system will be taken into account; and the attribution and the prediction of weather and climate extremes can then be separated into 1) the change of the statistical properties of the weather and climate extremes themselves and 2) the background climate variability and change. The new non-stationary framework will use the ensemble empirical mode decomposition (EEMD) method, which is a recent major improvement of the Hilbert-Huang Transform for time-frequency analysis. Using this tool, we will adaptively decompose various weather and climate data from observations and climate models in terms of the components of the various natural timescales contained in the data. With such decompositions, the non-stationary statistical properties (both spatial and temporal) of weather and climate anomalies and of their corresponding climatologies will be analyzed and documented.
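
    A minimal sketch of the decomposition step is shown below, assuming the third-party PyEMD package: a synthetic series is split into modes with EEMD, a slowly varying climatology is rebuilt from the lowest-frequency components, and extremes are defined against the resulting anomalies. The synthetic series and the choice of which modes form the climatology are illustrative only.

    ```python
    # EEMD-based non-stationary climatology and anomaly/extreme definition.
    import numpy as np
    from PyEMD import EEMD

    rng = np.random.default_rng(7)
    t = np.arange(0, 100, 0.1)
    series = (0.02 * t                      # slow trend ("climate change")
              + np.sin(2 * np.pi * t)       # annual-like cycle
              + 0.5 * rng.standard_normal(t.size))

    eemd = EEMD(trials=50, noise_width=0.2)
    imfs = eemd.eemd(series, t)             # components ordered from fast to slow

    # Non-stationary climatology: sum of the slowest components; anomaly is the rest.
    n_slow = 2
    climatology = imfs[-n_slow:].sum(axis=0)
    anomaly = series - climatology

    # Timescale-dependent extremes: exceedances of an anomaly threshold.
    threshold = 2.0 * anomaly.std()
    extremes = np.where(np.abs(anomaly) > threshold)[0]
    print(f"{extremes.size} extreme points out of {t.size}")
    ```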

  6. Sexual selection and mate choice.

    PubMed

    Andersson, Malte; Simmons, Leigh W

    2006-06-01

    The past two decades have seen extensive growth of sexual selection research. Theoretical and empirical work has clarified many components of pre- and postcopulatory sexual selection, such as aggressive competition, mate choice, sperm utilization and sexual conflict. Genetic mechanisms of mate choice evolution have been less amenable to empirical testing, but molecular genetic analyses can now be used for incisive experimentation. Here, we highlight some of the currently debated areas in pre- and postcopulatory sexual selection. We identify where new techniques can help estimate the relative roles of the various selection mechanisms that might work together in the evolution of mating preferences and attractive traits, and in sperm-egg interactions.

  7. Summer drought predictability over Europe: empirical versus dynamical forecasts

    NASA Astrophysics Data System (ADS)

    Turco, Marco; Ceglar, Andrej; Prodhomme, Chloé; Soret, Albert; Toreti, Andrea; Doblas-Reyes Francisco, J.

    2017-08-01

    Seasonal climate forecasts could be an important planning tool for farmers, government and insurance companies that can lead to better and timely management of seasonal climate risks. However, seasonal climate forecasts are often under-used because potential users are not well aware of the capabilities and limitations of these products. This study aims at assessing the merits and caveats of a statistical empirical method, the ensemble streamflow prediction system (ESP, an ensemble based on reordering historical data), and an operational dynamical forecast system, the European Centre for Medium-Range Weather Forecasts System 4 (S4), in predicting summer drought in Europe. Droughts are defined using the Standardized Precipitation Evapotranspiration Index for the month of August integrated over 6 months. Both systems show useful and mostly comparable deterministic skill. We argue that this source of predictability is mostly attributable to the observed initial conditions. S4 shows higher skill only in terms of its ability to probabilistically identify drought occurrence. Thus, currently, both approaches provide useful information, and ESP represents a computationally fast alternative to dynamical prediction applications for drought prediction.

  8. Reading the News: The Statistical Preparation of Pre-Service Secondary Mathematics Teachers

    ERIC Educational Resources Information Center

    Chesler, Joshua

    2015-01-01

    Undergraduate mathematics programs must prepare teachers for the challenges of teaching statistical thinking as advocated in standards documents and statistics education literature. This study investigates the statistical thinking of pre-service secondary mathematics teachers at the end of their undergraduate educations. Although all had completed…

  9. Are Physics-Based Simulators Ready for Prime Time? Comparisons of RSQSim with UCERF3 and Observations.

    NASA Astrophysics Data System (ADS)

    Milner, K. R.; Shaw, B. E.; Gilchrist, J. J.; Jordan, T. H.

    2017-12-01

    Probabilistic seismic hazard analysis (PSHA) is typically performed by combining an earthquake rupture forecast (ERF) with a set of empirical ground motion prediction equations (GMPEs). ERFs have typically relied on observed fault slip rates and scaling relationships to estimate the rate of large earthquakes on pre-defined fault segments, either ignoring or relying on expert opinion to set the rates of multi-fault or multi-segment ruptures. Version 3 of the Uniform California Earthquake Rupture Forecast (UCERF3) is a significant step forward, replacing expert opinion and fault segmentation with an inversion approach that matches observations better than prior models while incorporating multi-fault ruptures. UCERF3 is a statistical model, however, and doesn't incorporate the physics of earthquake nucleation, rupture propagation, and stress transfer. We examine the feasibility of replacing UCERF3, or components therein, with physics-based rupture simulators such as the Rate-State Earthquake Simulator (RSQSim), developed by Dieterich & Richards-Dinger (2010). RSQSim simulations on the UCERF3 fault system produce catalogs of seismicity that match long term rates on major faults, and produce remarkable agreement with UCERF3 when carried through to PSHA calculations. Averaged over a representative set of sites, the RSQSim-UCERF3 hazard-curve differences are comparable to the small differences between UCERF3 and its predecessor, UCERF2. The hazard-curve agreement between the empirical and physics-based models provides substantial support for the PSHA methodology. RSQSim catalogs include many complex multi-fault ruptures, which we compare with the UCERF3 rupture-plausibility metrics as well as recent observations. Complications in generating physically plausible kinematic descriptions of multi-fault ruptures have thus far prevented us from using UCERF3 in the CyberShake physics-based PSHA platform, which replaces GMPEs with deterministic ground motion simulations. RSQSim produces full slip/time histories that can be directly implemented as sources in CyberShake, without relying on the conditional hypocenter and slip distributions needed for the UCERF models. We also compare RSQSim with time-dependent PSHA calculations based on multi-fault renewal models.

  10. Towards representation of a perceptual color manifold using associative memory for color constancy.

    PubMed

    Seow, Ming-Jung; Asari, Vijayan K

    2009-01-01

    In this paper, we propose the concept of a manifold of color perception through the empirical observation that the center-surround properties of images in a perceptually similar environment define a manifold in a high dimensional space. Such a manifold representation can be learned using a novel recurrent neural network based learning algorithm. Unlike the conventional recurrent neural network model, in which the memory is stored in an attractive fixed point at discrete locations in the state space, the dynamics of the proposed learning algorithm represent memory as a nonlinear line of attraction. The region of convergence around the nonlinear line is defined by the statistical characteristics of the training data. This learned manifold can then be used as a basis for color correction of images whose color perception differs from the learned color perception. Experimental results show that the proposed recurrent neural network learning algorithm is capable of successfully color-balancing the lighting variations in images captured in different environments.

  11. Investigating the Distribution of Medical Services among Socioeconomic Groups in Texas

    NASA Astrophysics Data System (ADS)

    Daniel, A.; Zhao, Ph D., S.; O'Keefe, Ph D., CRNP, RN, L.

    2016-12-01

    The Environmental Justice (EJ) literature generally focuses on negative environmental externalities and disamenities found around certain types of demographic conditions such as poor and ethnic groups. This study aims to identify any relationships among environmental risks, communities, and access to hospital services. Community demographic variables will be defined by census tracts and units based on a geographic information system, such as buffer tools. Empirical analyses of the relationships between demographics and environmental burdens take a prominent position in the large EJ literature. However, there is a dearth of research regarding exposed communities and access to hospitals for medical services. Leveraging a dataset that combines hospital locations, pollution sources, and demographic information, the authors will analyze whether different social groups (defined by gender, age, income, and education level) have equal access to hospitals. The research team consists of researchers from Earth system science, public policy, and nursing, and adopts an interdisciplinary approach including ArcGIS analysis and statistical modeling. This project also bridges the literature of health, air pollution, and environmental policy.

  12. The Building Game: From Enumerative Combinatorics to Conformational Diffusion

    NASA Astrophysics Data System (ADS)

    Johnson-Chyzhykov, Daniel; Menon, Govind

    2016-08-01

    We study a discrete attachment model for the self-assembly of polyhedra called the building game. We investigate two distinct aspects of the model: (i) enumerative combinatorics of the intermediate states and (ii) a notion of Brownian motion for the polyhedral linkage defined by each intermediate that we term conformational diffusion. The combinatorial configuration space of the model is computed for the Platonic, Archimedean, and Catalan solids of up to 30 faces, and several novel enumerative results are generated. These represent the most exhaustive computations of this nature to date. We further extend the building game to include geometric information. The combinatorial structure of each intermediate yields a system of constraints specifying a polyhedral linkage and its moduli space. We use a random walk to simulate a reflected Brownian motion in each moduli space. Empirical statistics of the random walk may be used to define the rates of transition for a Markov process modeling the process of self-assembly.

  13. Quantifying memory in complex physiological time-series.

    PubMed

    Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.

  14. Data-driven modeling of background and mine-related acidity and metals in river basins

    USGS Publications Warehouse

    Friedel, Michael J

    2013-01-01

    A novel application of self-organizing map (SOM) and multivariate statistical techniques is used to model the nonlinear interaction among basin mineral-resources, mining activity, and surface-water quality. First, the SOM is trained using sparse measurements from 228 sample sites in the Animas River Basin, Colorado. The model performance is validated by comparing stochastic predictions of basin-alteration assemblages and mining activity at 104 independent sites. The SOM correctly predicts (>98%) the predominant type of basin hydrothermal alteration and presence (or absence) of mining activity. Second, application of the Davies–Bouldin criteria to k-means clustering of SOM neurons identified ten unique environmental groups. Median statistics of these groups define a nonlinear water-quality response along the spatiotemporal hydrothermal alteration-mining gradient. These results reveal that it is possible to differentiate among the continuum between inputs of background and mine-related acidity and metals, and it provides a basis for future research and empirical model development.
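
    The sketch below illustrates the two-stage SOM-plus-k-means workflow with the Davies-Bouldin criterion described above, using the minisom package as a stand-in SOM implementation and synthetic data in place of the Animas River Basin measurements; grid size and variable count are arbitrary choices.

    ```python
    # Train a SOM, then cluster its codebook vectors with k-means, choosing k by
    # the Davies-Bouldin index (lower is better).
    import numpy as np
    from minisom import MiniSom
    from sklearn.cluster import KMeans
    from sklearn.metrics import davies_bouldin_score

    rng = np.random.default_rng(11)
    # Hypothetical site-by-variable matrix (e.g., pH, sulfate, dissolved metals, ...)
    data = np.vstack([rng.normal(0, 1, (120, 6)),
                      rng.normal(3, 1, (108, 6))])

    som = MiniSom(8, 8, data.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
    som.train_random(data, 5000)
    neurons = som.get_weights().reshape(-1, data.shape[1])   # 64 codebook vectors

    scores = {}
    for k in range(2, 11):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(neurons)
        scores[k] = davies_bouldin_score(neurons, labels)

    best_k = min(scores, key=scores.get)
    print("Davies-Bouldin scores:", {k: round(v, 2) for k, v in scores.items()})
    print("selected number of environmental groups:", best_k)
    ```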

  15. How to hit HIV where it hurts

    NASA Astrophysics Data System (ADS)

    Chakraborty, Arup

    No medical procedure has saved more lives than vaccination. But, today, some pathogens have evolved which have defied successful vaccination using the empirical paradigms pioneered by Pasteur and Jenner. One characteristic of many pathogens for which successful vaccines do not exist is that they present themselves in various guises. HIV is an extreme example because of its high mutability. This highly mutable virus can evade natural or vaccine induced immune responses, often by mutating at multiple sites linked by compensatory interactions. I will describe first how by bringing to bear ideas from statistical physics (e.g., maximum entropy models, Hopfield models, Feynman variational theory) together with in vitro experiments and clinical data, the fitness landscape of HIV is beginning to be defined with explicit account for collective mutational pathways. I will describe how this knowledge can be harnessed for vaccine design. Finally, I will describe how ideas at the intersection of evolutionary biology, immunology, and statistical physics can help guide the design of strategies that may be able to induce broadly neutralizing antibodies.

  16. Quantifying Memory in Complex Physiological Time-Series

    PubMed Central

    Shirazi, Amir H.; Raoufy, Mohammad R.; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R.; Amodio, Piero; Jafari, G. Reza; Montagnese, Sara; Mani, Ali R.

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of “memory length” was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are ‘forgotten’ quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations. PMID:24039811

  17. Mathematics pre-service teachers’ statistical reasoning about meaning

    NASA Astrophysics Data System (ADS)

    Kristanto, Y. D.

    2018-01-01

    This article offers a descriptive qualitative analysis of 3 second-year pre-service teachers’ statistical reasoning about the mean. Twenty-six pre-service teachers were tested using an open-ended problem in which they were expected to analyze a method for finding the mean of a data set. Three of their test results were selected for analysis. The results suggest that the pre-service teachers did not use context to develop their interpretation of the mean. Therefore, this article also offers strategies to promote statistical reasoning about the mean that use various contexts.

  18. Revising Star and Planet Formation Timescales

    NASA Astrophysics Data System (ADS)

    Bell, Cameron P. M.; Naylor, Tim; Mayne, N. J.; Jeffries, R. D.; Littlefair, S. P.

    2013-07-01

    We have derived ages for 13 young (<30 Myr) star-forming regions and find that they are up to a factor of 2 older than the ages typically adopted in the literature. This result has wide-ranging implications, including that circumstellar discs survive longer (≃ 10-12 Myr) and that the average Class I lifetime is greater (≃1 Myr) than currently believed. For each star-forming region, we derived two ages from colour-magnitude diagrams. First, we fitted models of the evolution between the zero-age main sequence and terminal-age main sequence to derive a homogeneous set of main-sequence ages, distances and reddenings with statistically meaningful uncertainties. Our second age for each star-forming region was derived by fitting pre-main-sequence stars to new semi-empirical model isochrones. For the first time (for a set of clusters younger than 50 Myr), we find broad agreement between these two ages, and since these are derived from two distinct mass regimes that rely on different aspects of stellar physics, it gives us confidence in the new age scale. This agreement is largely due to our adoption of empirical colour-Teff relations and bolometric corrections for pre-main-sequence stars cooler than 4000 K. The revised ages for the star-forming regions in our sample are: 2 Myr for NGC 6611 (Eagle Nebula; M 16), IC 5146 (Cocoon Nebula), NGC 6530 (Lagoon Nebula; M 8) and NGC 2244 (Rosette Nebula); 6 Myr for σ Ori, Cep OB3b and IC 348; ≃10 Myr for λ Ori (Collinder 69); ≃11 Myr for NGC 2169; ≃12 Myr for NGC 2362; ≃13 Myr for NGC 7160; ≃14 Myr for χ Per (NGC 884); and ≃20 Myr for NGC 1960 (M 36).

  19. A Critical Assessment of Ages Derived Using Pre-Main-Sequence Isochrones in Colour-Magnitude Diagrams

    NASA Astrophysics Data System (ADS)

    Bell, Cameron P. M.

    2012-11-01

    In this thesis a critical assessment of the ages derived using theoretical pre-main-sequence (pre-MS) stellar evolutionary models is presented by comparing the predictions to the low-mass pre-MS population of 14 young star-forming regions (SFRs) in colour-magnitude diagrams (CMDs). Deriving pre-MS ages requires precise distances and estimates of the reddening. Therefore, the main-sequence (MS) members of the SFRs have been used to derive a self-consistent set of statistically robust ages, distances and reddenings with associated uncertainties using a maximum-likelihood fitting statistic and MS evolutionary models. A photometric method for de-reddening individual stars - known as the Q-method - in regions where the extinction is spatially variable has been updated and is presented. The effects of both the model dependency and the SFR composition on these derived parameters are also discussed. The problem of calibrating photometric observations of red pre-MS stars is examined and it is shown that using observations of MS stars to transform the data into a standard photometric system can introduce significant errors in the position of the pre-MS locus in CMD space. Hence, it is crucial that precise photometric studies - especially of pre-MS objects - be carried out in the natural photometric system of the observations. This therefore requires a robust model of the system responses for the instrument used, and thus the calculated responses for the Wide-Field Camera on the Isaac Newton Telescope are presented. These system responses have been tested using standard star observations and have been shown to be a good representation of the photometric system. A benchmark test for the pre-MS evolutionary models is performed by comparing them to a set of well-calibrated CMDs of the Pleiades in the wavelength regime 0.4-2.5 μm. The masses predicted by these models are also tested against dynamical masses using a sample of MS binaries by calculating the system magnitude in a given photometric bandpass. This analysis shows that for Teff ≤ 4000 K the models systematically overestimate the flux by a factor of 2 at 0.5 μm, though this decreases with wavelength, becoming negligible at 2.2 μm. Thus before the pre-MS models are used to derive ages, a recalibration of the models is performed by incorporating an empirical colour-Teff relation and bolometric corrections based on the Ks-band luminosity of Pleiades members, with theoretical corrections for the dependence on the surface gravity (log g). The recalibrated pre-MS model isochrones are used to derive ages from the pre-MS populations of the SFRs. These ages are then compared with the MS derivations, thus providing a powerful diagnostic tool with which to discriminate between the different pre-MS age scales that arise from a much stronger model dependency in the pre-MS regime. The revised ages assigned to each of the 14 SFRs are up to a factor two older than previous derivations, a result with wide-ranging implications, including that circumstellar discs survive longer and that the average Class II lifetime is greater than currently believed.

  20. Statistical validation and an empirical model of hydrogen production enhancement found by utilizing passive flow disturbance in the steam-reformation process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Paul A.; Liao, Chang-hsien

    2007-11-15

    A passive flow disturbance has been proven to enhance the conversion of fuel in a methanol-steam reformer. This study presents a statistical validation of the experiment based on a standard 2^k factorial experiment design and the resulting empirical model of the enhanced hydrogen-producing process. A factorial experiment design was used to statistically analyze the effects and interactions of various input factors in the experiment. Three input factors, namely the number of flow disturbers, catalyst size, and reactant flow rate, were investigated for their effects on the fuel conversion in the steam-reformation process. Based on the experimental results, an empirical model was developed and further evaluated with an uncertainty analysis and interior point data. (author)
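
    For readers unfamiliar with 2^k factorial analysis, the sketch below shows how main effects, two-factor interactions, and a simple empirical regression model can be computed from a coded 2^3 design; the factor names echo the abstract, but the response values are hypothetical and not the study's data.

    ```python
    # Effects and a least-squares empirical model from a coded 2^3 factorial design.
    import itertools

    import numpy as np

    factors = ["flow_disturbers", "catalyst_size", "flow_rate"]
    # Full 2^3 design in standard order, coded -1/+1
    design = np.array(list(itertools.product([-1, 1], repeat=3)))
    # Hypothetical measured fuel conversion (%) for each of the 8 runs
    y = np.array([62.0, 70.5, 64.1, 73.2, 58.3, 66.0, 60.2, 69.1])

    # Effect of a term = mean(response at +1) - mean(response at -1)
    def effect(column):
        return y[column > 0].mean() - y[column < 0].mean()

    main_effects = {name: effect(design[:, i]) for i, name in enumerate(factors)}
    interactions = {f"{factors[i]}*{factors[j]}": effect(design[:, i] * design[:, j])
                    for i, j in itertools.combinations(range(3), 2)}

    # Equivalent empirical model via least squares on the coded design
    X = np.column_stack([np.ones(8), design,
                         design[:, 0] * design[:, 1],
                         design[:, 0] * design[:, 2],
                         design[:, 1] * design[:, 2]])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)

    print("main effects:", {k: round(v, 2) for k, v in main_effects.items()})
    print("two-factor interactions:", {k: round(v, 2) for k, v in interactions.items()})
    print("regression coefficients (each equals half the corresponding effect):",
          np.round(coeffs, 2))
    ```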

  1. Pressure monitoring during lipofilling procedures.

    PubMed

    Klein, S M; Prantl, L; Geis, S; Eisenmann-Klein, M; Dolderer, J; Felthaus, O; Loibl, M; Heine, N

    2014-01-01

    Grafting of autologous lipoaspirate for various clinical applications has become a common procedure in clinical practice. With an estimated mortality rate of 10-15 percent, fat embolism is among the most severe complications to be expected after lipofilling therapies. The aim of this study was to determine the level of interstitial pressure after the injection of defined volumes of lipoaspirate into the subcutaneous tissue of female breasts. It was hypothesized that interstitial pressure levels exceed the physiologic capillary pressure during lipofilling procedures and hence increase the potential risk for fat embolism. It was further investigated whether external tissue expansion has the potential to significantly reduce interstitial tissue pressure. Interstitial pressure was monitored in 36 female patients who underwent autologous fat injections into the breast. Measurements were conducted with a sensor needle connected to a pressure transducer (LogiCal Pressure Monitoring Kit, Smiths Medical Int. Ltd., UK). Patients were divided into 4 subcohorts differing in their pre-treatment regimen or local tissue conditions. Pre-treatment consisted of tissue expansion achieved with the Brava™ (Brava LLC, Miami, Fla., USA) vacuum chamber. The increases in interstitial pressure after injection volumes of 100 ml (p = 0.006), 200 ml (p = 0.000), and between 100 ml and 200 ml (p = 0.004), respectively, were significant in non-mastectomized patients without pre-treatment. Patients pre-treated with Brava™ did not show such statistically significant differences in interstitial pressures before and after the injection of 100 ml and 200 ml of lipoaspirate (p = 0.178). The differences in interstitial pressure in mastectomized patients between 0 ml and 100 ml (p = 0.003), as well as between 0 ml and 200 ml (p = 0.028), were significant. Interstitial pressures did not differ significantly between pre-treated patients and patients without pre-treatment in the mastectomized cohort. During lipofilling procedures, interstitial pressures are reached that exceed pressure limits defined as hazardous for fat embolism. To date it is unknown what pressure levels need to be considered critical for complications in soft tissue interventions. Furthermore, the results indicate higher interstitial pressures for patients who had undergone mastectomy, whereas pre-treatment with external tissue expansion seemed to diminish pressure values.

  2. Statistical shear lag model - unraveling the size effect in hierarchical composites.

    PubMed

    Wei, Xiaoding; Filleter, Tobin; Espinosa, Horacio D

    2015-05-01

    Numerous experimental and computational studies have established that the hierarchical structures encountered in natural materials, such as the brick-and-mortar structure observed in sea shells, are essential for achieving defect tolerance. Due to this hierarchy, the mechanical properties of natural materials have a different size dependence compared to that of typical engineered materials. This study aimed to explore size effects on the strength of bio-inspired staggered hierarchical composites and to define the influence of the geometry of constituents in their outstanding defect tolerance capability. A statistical shear lag model is derived by extending the classical shear lag model to account for the statistics of the constituents' strength. A general solution emerges from rigorous mathematical derivations, unifying the various empirical formulations for the fundamental link length used in previous statistical models. The model shows that the staggered arrangement of constituents grants composites a unique size effect on mechanical strength in contrast to homogenous continuous materials. The model is applied to hierarchical yarns consisting of double-walled carbon nanotube bundles to assess its predictive capabilities for novel synthetic materials. Interestingly, the model predicts that yarn gauge length does not significantly influence the yarn strength, in close agreement with experimental observations. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  3. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
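
    SProCoP itself is R code run from the Skyline interface; the sketch below only illustrates the underlying control-chart idea in Python. Thresholds are derived empirically from baseline QC runs, later runs are flagged when they fall outside them, and a Pareto-style summary ranks metrics by variability. All metric values below are hypothetical.

    ```python
    # Simple Shewhart-style control chart with empirically derived limits,
    # plus a Pareto-style ranking of QC metrics by coefficient of variation.
    import numpy as np

    def control_limits(baseline, n_sd=3.0):
        mu, sd = np.mean(baseline), np.std(baseline, ddof=1)
        return mu - n_sd * sd, mu + n_sd * sd

    def flag_out_of_control(values, limits):
        lo, hi = limits
        return [(i, v) for i, v in enumerate(values) if not lo <= v <= hi]

    # Hypothetical QC metric: retention time (min) of a standard peptide
    baseline_runs = [22.1, 22.0, 22.3, 21.9, 22.2, 22.1, 22.0, 22.2]
    monitored_runs = [22.1, 22.2, 22.0, 23.4, 22.1, 21.9, 24.0]

    limits = control_limits(baseline_runs)
    print("control limits:", np.round(limits, 2))
    print("out-of-control runs:", flag_out_of_control(monitored_runs, limits))

    # Pareto-style summary: rank several hypothetical metrics by variability
    metrics = {
        "retention_time": monitored_runs,
        "peak_fwhm": [0.21, 0.22, 0.20, 0.35, 0.22, 0.21, 0.30],
        "precursor_area": [1.0e6, 1.1e6, 0.9e6, 0.4e6, 1.0e6, 1.05e6, 0.5e6],
    }
    cv = {name: np.std(v, ddof=1) / np.mean(v) for name, v in metrics.items()}
    for name, value in sorted(cv.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: CV = {value:.2%}")
    ```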

  4. A prostate MRI atlas of biochemical failures following cancer treatment

    NASA Astrophysics Data System (ADS)

    Rusu, Mirabela; Kurhanewicz, John; Tewari, Ashutosh; Madabhushi, Anant

    2014-03-01

    Radical prostatectomy (RP) and radiation therapy (RT) are the most common treatment options for prostate cancer (PCa). Despite advancements in radiation delivery and surgical procedures, RP and RT can result in failure rates as high as 40% and >25%, respectively. Treatment failure is characterized by biochemical recurrence (BcR), which is defined in terms of prostate specific antigen (PSA) concentrations and its variation following treatment. PSA is expected to decrease following treatment, thereby its presence in even small concentrations (e.g 0.2 ng/ml for surgery or 2 ng/ml over the nadir PSA for radiation therapy) is indicative of treatment failure. Early identification of treatment failure may enable the use of more aggressive or neo-adjuvant therapies. Moreover, predicting failure prior to treatment may spare the patient from a procedure that is unlikely to be successful. Our goal is to identify differences on pre-treatment MRI between patients who have BcR and those who remain disease-free at 5 years post-treatment. Specifically, we focus on (1) identifying statistically significant differences in MRI intensities, (2) establishing morphological differences in prostatic anatomic structures, and (3) comparing these differences with the natural variability of prostatic structures. In order to attain these objectives, we use an anatomically constrained registration framework to construct BcR and non-BcR statistical atlases based on the pre-treatment magnetic resonance images (MRI) of the prostate. The patients included in the atlas either underwent RP or RT and were followed up for at least 5 years. The BcR atlas was constructed from a combined population of 12 pre-RT 1.5 Tesla (T) MRI and 33 pre-RP 3T MRI from patients with BcR within 5 years of treatment. Similarly, the non-BcR atlas was built based on a combined cohort of 20 pre-RT 1.5T MRI and 41 pre-RP 3T MRI from patients who remain disease-free 5 years post treatment. We chose the atlas framework as it enables the mapping of MR images from all subjects into the same canonical space, while constructing both an imaging and a morphological statistical atlas. Such co-registration allowed us to perform voxel-by-voxel comparisons of MRI intensities and capsule and central gland morphology to identify statistically significant differences between the BcR and non-BcR patient populations. To assess whether the morphological differences are valid, we performed an additional experiment where we constructed sub-population atlases by randomly sampling RT patients to construct the BcR and non-BcR atlases. Following these experiments we observed that: (1) statistically significant MRI intensity differences exist between BcR and non-BcR patients, specifically on the border of the central gland; (2) statistically significant morphological differences are visible in the prostate and central gland, specifically in the proximity of the apex, and (3) the differences between the BcR and non-BcR cohorts in terms of shape appeared to be consistent across these sub-population atlases as observed in our RT atlases.

  5. Couriers in the Inca Empire: Getting Your Message Across. [Lesson Plan].

    ERIC Educational Resources Information Center

    2002

    This lesson shows how the Inca communicated across the vast stretches of their mountain realm, the largest empire of the pre-industrial world. The lesson explains how couriers carried messages along mountain-ridge roads, up and down stone steps, and over chasm-spanning footbridges. It states that couriers could pass a message from Quito (Ecuador)…

  6. A teaching-learning sequence on a socio-scientific issue: analysis and evaluation of its implementation in the classroom*

    NASA Astrophysics Data System (ADS)

    Vázquez-Alonso, Ángel; Aponte, Abdiel; Manassero-Mas, María-Antonia; Montesano, Marisa

    2016-07-01

    This study examines the effectiveness of a teaching-learning sequence (TLS) to improve the understanding of the influences and interactions between a technology (mining) and society. The aim of the study is also to show the possibility of both teaching and assessing the most innovative issues and aspects of scientific competence and their impact on the understanding of the nature of science. The methodology used a quasi-experimental, pre-post-test design with a control group, with pre-post-test differences as the empirical indicators of improved understanding. Improvements were modest, as the empirical differences (pre-post and experimental-control group) were not large, but the experimental group scored more highly than the control group. The areas that showed improvement were identified. The paper includes the TLS itself and the standardized assessment tools that are functional and transferable to other researchers and teachers.

  7. Cluster-level statistical inference in fMRI datasets: The unexpected behavior of random fields in high dimensions.

    PubMed

    Bansal, Ravi; Peterson, Bradley S

    2018-06-01

    Identifying regional effects of interest in MRI datasets usually entails testing a priori hypotheses across many thousands of brain voxels, requiring control of false positive findings across these multiple hypothesis tests. Recent studies have suggested that parametric statistical methods may have incorrectly modeled functional MRI data, thereby leading to higher false positive rates than their nominal rates. Nonparametric methods for statistical inference when conducting multiple statistical tests, in contrast, are thought to produce false positives at the nominal rate, which has thus led to the suggestion that previously reported studies should reanalyze their fMRI data using nonparametric tools. To understand better why parametric methods may yield excessive false positives, we assessed their performance when applied both to simulated datasets of 1D, 2D, and 3D Gaussian Random Fields (GRFs) and to 710 real-world, resting-state fMRI datasets. We showed that both the simulated 2D and 3D GRFs and the real-world data contain a small percentage (<6%) of very large clusters (on average 60 times larger than the average cluster size), which were not present in 1D GRFs. These unexpectedly large clusters were deemed statistically significant using parametric methods, leading to empirical familywise error rates (FWERs) as high as 65%: the high empirical FWERs were not a consequence of parametric methods failing to model spatial smoothness accurately, but rather of these very large clusters that are inherently present in smooth, high-dimensional random fields. In fact, when discounting these very large clusters, the empirical FWER for parametric methods was 3.24%. Furthermore, even an empirical FWER of 65% would yield on average less than one of those very large clusters in each brain-wide analysis. Nonparametric methods, in contrast, estimated distributions from those large clusters and therefore, by construction, rejected the large clusters as false positives at the nominal FWERs. Those rejected clusters were outlying values in the distribution of cluster size but could not be distinguished from true positive findings without further analyses, including assessing whether fMRI signal in those regions correlates with other clinical, behavioral, or cognitive measures. Rejecting the large clusters, however, significantly reduced the statistical power of nonparametric methods in detecting true findings compared with parametric methods, which would have detected most true findings that are essential for making valid biological inferences in MRI data. Parametric analyses, in contrast, detected most true findings while generating relatively few false positives: on average, less than one of those very large clusters would be deemed a true finding in each brain-wide analysis. We therefore recommend the continued use of parametric methods that model nonstationary smoothness for cluster-level, familywise control of false positives, particularly when using a Cluster Defining Threshold of 2.5 or higher, and subsequently assessing rigorously the biological plausibility of the findings, even for large clusters. Finally, because nonparametric methods yielded a large reduction in statistical power to detect true positive findings, we conclude that the modest reduction in false positive findings that nonparametric analyses afford does not warrant a re-analysis of previously published fMRI studies using nonparametric techniques. Copyright © 2018 Elsevier Inc. All rights reserved.
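
    A minimal sketch (not the authors' code) of the central phenomenon reported above: smooth, high-dimensional Gaussian random fields occasionally contain suprathreshold clusters far larger than the average cluster, which a cluster-size criterion will flag as significant. The field size, smoothness, and cluster-defining threshold below are arbitrary illustrative choices.

      # Simulate smooth 2D random fields and inspect suprathreshold cluster sizes.
      import numpy as np
      from scipy.ndimage import gaussian_filter, label

      rng = np.random.default_rng(0)

      def cluster_sizes_2d(shape=(256, 256), sigma=3.0, cdt=2.5):
          """Simulate one smooth 2D field; return sizes of clusters above the CDT."""
          field = gaussian_filter(rng.standard_normal(shape), sigma)
          field /= field.std()                 # re-standardize after smoothing
          labels, n = label(field > cdt)       # connected suprathreshold clusters
          return np.bincount(labels.ravel())[1:] if n else np.array([])

      sizes = np.concatenate([cluster_sizes_2d() for _ in range(200)])
      print("mean cluster size:", sizes.mean())
      print("max / mean ratio :", sizes.max() / sizes.mean())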

  8. Catalogue of isolated emission episodes in gamma-ray bursts from Fermi, Swift and BATSE

    NASA Astrophysics Data System (ADS)

    Charisi, M.; Márka, S.; Bartos, I.

    2015-04-01

    We report a comprehensive catalogue of emission episodes within long gamma-ray bursts (GRBs) that are separated by a quiescent period during which gamma-ray emission falls below the background level. We use a fully automated identification method for an unbiased, large-scale and expandable search. We examine a comprehensive sample of long GRBs from the BATSE (Burst and Transient Source Experiment), Swift and Fermi missions, assembling a total searched set of 2710 GRBs, the largest catalogue of isolated emission episodes so far. Our search extends out to [-1000 s, 750 s] around the burst trigger, expanding the covered time interval beyond previous studies and far beyond the nominal durations (T90) of most bursts. We compare our results to previous works by identifying pre-peak emission (or precursors), defined as isolated emission periods prior to the episode with the highest peak luminosity of the burst. We also systematically search for similarly defined periods after the burst's peak emission. We find that the pre-peak and post-peak emission periods are statistically similar, possibly indicating a common origin. For the analysed GRBs, we identify 24 per cent to have more than one isolated emission episode, with 11 per cent having at least one pre-peak event and 15 per cent having at least one post-peak event. We identify GRB activity significantly beyond their T90, which can be important for understanding the central engine activity as well as, e.g. gravitational-wave searches.

  9. Sexual Knowledge and Attitude among Girls Who are Getting Married Based on the Information from Yas Pre-marriage Counseling Center.

    PubMed

    Baghersad, Zahra; Fahami, Fariba; Beigi, Marjan; Hasanzadeh, Akbar

    2017-01-01

    High prevalence of sexual dysfunction results from inadequate knowledge or inappropriate attitude toward the natural phenomenon of sexual desire. This study aimed to define sexual knowledge and attitude among girls who were getting married and referred to Yas pre-marriage counseling center. This research was a descriptive analytical study. The information of 165 girls who were about to get married was collected through convenience sampling using a researcher-made questionnaire. Data were analyzed using SPSS version 16 software. Inferential statistical methods and the Pearson correlation were used for data analysis. Results showed that the mean scores of sexual knowledge and attitude among the participants were 57.42 and 69.02, respectively. There was a significant association between the mean scores of sexual knowledge and sexual attitude ( P < 0.001, r = 0.63). Results showed that the participants had relatively appropriate knowledge and attitude toward sexual relationships.

  10. An empirical method to determine inadequacy of dietary water.

    PubMed

    Armstrong, Lawrence E; Johnson, Evan C; McKenzie, Amy L; Muñoz, Colleen X

    2016-01-01

    The physiological regulation of total body water and fluid concentrations is complex and dynamic. The human daily water requirement varies because of differences in body size, dietary solute load, exercise, and activities. Although chronically concentrated urine increases the risk of renal diseases, an empirical method to determine inadequate daily water consumption has not been described for any demographic group; instead, statistical analyses are applied to estimate nutritional guidelines (i.e., adequate intake). This investigation describes a novel empirical method to determine the 24-h total fluid intake (TFI; TFI = water + beverages + moisture in food) and 24-h urine volume, which correspond to inadequate 24-h water intake (defined as urine osmolality of 800 mOsm/kg; U800). Healthy young women (mean ± standard deviation; age, 20 ± 2 y, mass, 60.8 ± 11.7 kg; n = 28) were observed for 7 consecutive days. A 24-h urine sample was analyzed for volume and osmolality. Diet records were analyzed to determine 24-h TFI. For these 28 healthy young women, the U800 corresponded to a TFI ≥2.4 L/d (≥39 mL/kg/d) and a urine volume ≥1.3 L/d. The U800 method could be employed to empirically determine 24-h TFI and 24-h urine volumes that correspond to inadequate water intake in diverse demographic groups, residents of specific geographic regions, and individuals who consume specialized diets or experience large daily water turnover. Because laboratory expertise and instrumentation are required, this technique provides greatest value in research and clinical settings. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Characterization of the Geosynchronous Plasma Environment for the SENSER/RROE Optical Instrument

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodroffe, Jesse Richard

    2016-11-08

    In this report, we summarize available research in order to characterize expected rates of particle incidence on the SENSER/RROE optical instrument. We first investigate the “normal” background levels using data from statistical studies of spacecraft in geosynchronous orbit and empirical models. We then consider “worst case” scenarios based on event studies in which extreme fluxes have been observed. We use these data to define “maximum” rates of particle incidence. We then consider how incident particles will actually produce counts in the instrument by considering the effects of screening by the instrument housing and the possibility of direct particle access to the housing, with rates for both primary access and secondary electron generation.

  12. The Bootstrap, the Jackknife, and the Randomization Test: A Sampling Taxonomy.

    PubMed

    Rodgers, J L

    1999-10-01

    A simple sampling taxonomy is defined that shows the differences between and relationships among the bootstrap, the jackknife, and the randomization test. Each method has as its goal the creation of an empirical sampling distribution that can be used to test statistical hypotheses, estimate standard errors, and/or create confidence intervals. Distinctions between the methods can be made based on the sampling approach (with replacement versus without replacement) and the sample size (replacing the whole original sample versus replacing a subset of the original sample). The taxonomy is useful for teaching the goals and purposes of resampling schemes. An extension of the taxonomy implies other possible resampling approaches that have not previously been considered. Univariate and multivariate examples are presented.
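
    A minimal sketch of the taxonomy on made-up data, applying all three schemes to one statistic (the mean of paired differences): the bootstrap resamples n values with replacement, the jackknife uses leave-one-out subsets without replacement, and the randomization test reassigns signs without replacement under the null hypothesis of a zero mean difference.

      # Bootstrap, jackknife, and randomization test for the mean of paired differences.
      import numpy as np

      rng = np.random.default_rng(1)
      d = rng.normal(0.4, 1.0, 40)          # hypothetical paired differences
      n, theta = d.size, d.mean()

      # Bootstrap standard error: resample n values WITH replacement
      boot = np.array([rng.choice(d, n, replace=True).mean() for _ in range(5000)])
      se_boot = boot.std(ddof=1)

      # Jackknife standard error: leave-one-out subsets WITHOUT replacement
      jack = np.array([np.delete(d, i).mean() for i in range(n)])
      se_jack = np.sqrt((n - 1) / n * np.sum((jack - jack.mean()) ** 2))

      # Randomization (sign-flip) test: under H0 the signs of d are exchangeable
      flips = np.array([(d * rng.choice([-1, 1], n)).mean() for _ in range(5000)])
      p_value = np.mean(np.abs(flips) >= abs(theta))

      print(f"mean={theta:.3f}  SE(boot)={se_boot:.3f}  SE(jack)={se_jack:.3f}  p(rand)={p_value:.4f}")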

  13. Golden Proportion Analysis of Dental-Skeletal Patterns of Class II and III Patients Pre and Post Orthodontic-orthognathic Treatment.

    PubMed

    Bragatto, Fernanda P; Chicarelli, Mariliani; Kasuya, Amanda Vb; Takeshita, Wilton M; Iwaki-Filho, Liogi; Iwaki, Lilian Cv

    2016-09-01

    The golden proportion has been used in dentistry in an attempt to improve facial function and, possibly, esthetics by simplifying the diagnosis of facial and dental disharmony. The aim of this study is to analyze pre- and postoperative cephalometric tracings of lateral cephalograms of patients with class II and III deformities submitted to orthognathic surgery, and verify if the 13 dental-skeletal patterns (ratios), as defined by Ricketts, moved closer to or further away from the golden proportion. A total of 110 lateral cephalometric radiographs, 55 obtained preoperatively and 55 postoperatively, were analyzed using Dolphin Imaging software. Radiographs analysis demonstrated that ratios 1, 3, 4, 5, 7, 8, 9, 10, and 13 remained statistically different from the golden proportion postoperatively. Ratio 12 was the only one to move closer to the golden number, while the opposite happened with ratio 6, which moved further away after the surgery. Ratios 2 and 11 kept statistically similar to the golden proportion both pre and postoperatively. It may be concluded that orthognathic surgery had little effect on the proportions studied, and that the golden proportion was not present in the majority of the ratios analyzed neither before nor after surgery. Determine whether the facial patterns approach the golden ratio after surgical correction. Also determine whether the golden ratio may be a standard to guide the surgical treatment of patients with skeletal patterns of type II and III.

  14. Empirical Bayes scan statistics for detecting clusters of disease risk variants in genetic studies.

    PubMed

    McCallum, Kenneth J; Ionita-Laza, Iuliana

    2015-12-01

    Recent developments of high-throughput genomic technologies offer an unprecedented detailed view of the genetic variation in various human populations, and promise to lead to significant progress in understanding the genetic basis of complex diseases. Despite this tremendous advance in data generation, it remains very challenging to analyze and interpret these data due to their sparse and high-dimensional nature. Here, we propose novel applications and new developments of empirical Bayes scan statistics to identify genomic regions significantly enriched with disease risk variants. We show that the proposed empirical Bayes methodology can be substantially more powerful than existing scan statistics methods especially so in the presence of many non-disease risk variants, and in situations when there is a mixture of risk and protective variants. Furthermore, the empirical Bayes approach has greater flexibility to accommodate covariates such as functional prediction scores and additional biomarkers. As proof-of-concept we apply the proposed methods to a whole-exome sequencing study for autism spectrum disorders and identify several promising candidate genes. © 2015, The International Biometric Society.

  15. The generalized 20/80 law using probabilistic fractals applied to petroleum field size

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    Fractal properties of the Pareto probability distribution are used to generalize "the 20/80 law." The 20/80 law is a heuristic law that has evolved over the years into the following rule of thumb for many populations: 20 percent of the population accounts for 80 percent of the total value. The general p100/q100 law in probabilistic form is defined with q as a function of p, where p is the population proportion and q is the proportion of total value. Using the Pareto distribution, the p100/q100 law in fractal form is derived with the parameter q being a fractal, where q unexpectedly possesses the scale invariance property. The 20/80 law is a special case of the p100/q100 law in fractal form. The p100/q100 law in fractal form is applied to petroleum field-size data to obtain p and q such that p100% of the oil fields greater than any specified scale or size in a geologic play account for q100% of the total oil of the fields. The theoretical percentages of total resources of oil using the fractal q are extremely close to the empirical percentages from the data using the statistic q. Also, the empirical scale invariance property of the statistic q for the petroleum field-size data is in excellent agreement with the theoretical scale invariance property of the fractal q. © 1995 Oxford University Press.
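
    A minimal sketch (not taken from the paper) of the q-as-a-function-of-p relation implied by a Pareto size distribution: for shape parameter alpha > 1, the largest fraction p of the population accounts for the share q = p^(1 - 1/alpha) of total value, independent of the minimum size, which is the scale-invariance property. The simulation parameters are arbitrary, and the heavy tail makes the Monte Carlo check noisy.

      # Pareto p100/q100 relation and a Monte Carlo check on simulated "field sizes".
      import numpy as np

      def q_of_p(p, alpha):
          """Share of total value held by the top fraction p under a Pareto(alpha)."""
          return p ** (1.0 - 1.0 / alpha)

      alpha = 1.161        # shape implied by the classical 20/80 rule of thumb
      print("theoretical q(0.20):", q_of_p(0.20, alpha))   # ~0.80

      rng = np.random.default_rng(2)
      sizes = 1.0 * (1.0 - rng.random(200_000)) ** (-1.0 / alpha)   # Pareto with x_m = 1
      sizes.sort()
      top20 = sizes[int(0.8 * sizes.size):]
      # Heavy tail: the empirical share fluctuates around the theoretical value
      print("empirical  q(0.20):", top20.sum() / sizes.sum())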

  16. Empirical parameterization of setup, swash, and runup

    USGS Publications Warehouse

    Stockdon, H.F.; Holman, R.A.; Howd, P.A.; Sallenger, A.H.

    2006-01-01

    Using shoreline water-level time series collected during 10 dynamically diverse field experiments, an empirical parameterization for extreme runup, defined by the 2% exceedence value, has been developed for use on natural beaches over a wide range of conditions. Runup, the height of discrete water-level maxima, depends on two dynamically different processes: time-averaged wave setup and total swash excursion, each of which is parameterized separately. Setup at the shoreline was best parameterized using a dimensional form of the more common Iribarren-based setup expression that includes foreshore beach slope, offshore wave height, and deep-water wavelength. Significant swash can be decomposed into the incident and infragravity frequency bands. Incident swash is also best parameterized using a dimensional form of the Iribarren-based expression. Infragravity swash is best modeled dimensionally using offshore wave height and wavelength and shows no statistically significant linear dependence on either foreshore or surf-zone slope. On infragravity-dominated dissipative beaches, the magnitudes of both setup and swash, modeling both incident and infragravity frequency components together, are dependent only on offshore wave height and wavelength. Statistics of predicted runup averaged over all sites indicate a −17 cm bias and an rms error of 38 cm; the mean observed runup elevation for all experiments was 144 cm. On intermediate and reflective beaches with complex foreshore topography, the use of an alongshore-averaged beach slope in practical applications of the runup parameterization may result in a relative runup error equal to 51% of the fractional variability between the measured and the averaged slope.
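
    The abstract does not quote the fitted coefficients; the sketch below uses the coefficient values commonly cited from Stockdon et al. (2006) and should be treated as an assumption rather than a definitive statement of the parameterization. Consult the paper for the exact expressions.

      # Empirical 2% exceedence runup from offshore wave height H0 (m), peak
      # period T (s), and foreshore beach slope beta_f (coefficients assumed).
      import math

      G = 9.81  # gravitational acceleration, m/s^2

      def runup_2pct(H0, T, beta_f):
          """2% exceedence runup elevation (m) above still water level."""
          L0 = G * T ** 2 / (2.0 * math.pi)          # deep-water wavelength
          hl = math.sqrt(H0 * L0)
          iribarren = beta_f / math.sqrt(H0 / L0)    # surf-similarity parameter
          if iribarren < 0.3:                        # dissipative, infragravity-dominated
              return 0.043 * hl
          setup = 0.35 * beta_f * hl                 # time-averaged wave setup
          swash = math.sqrt(H0 * L0 * (0.563 * beta_f ** 2 + 0.004))  # incident + infragravity
          return 1.1 * (setup + swash / 2.0)

      print(f"R2% ~ {runup_2pct(H0=2.0, T=10.0, beta_f=0.08):.2f} m")  # example inputs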

  17. Geospace environment modeling 2008--2009 challenge: Dst index

    USGS Publications Warehouse

    Rastätter, L.; Kuznetsova, M.M.; Glocer, A.; Welling, D.; Meng, X.; Raeder, J.; Wittberger, M.; Jordanova, V.K.; Yu, Y.; Zaharia, S.; Weigel, R.S.; Sazykin, S.; Boynton, R.; Wei, H.; Eccles, V.; Horton, W.; Mays, M.L.; Gannon, J.

    2013-01-01

    This paper reports the metrics-based results of the Dst index part of the 2008–2009 GEM Metrics Challenge. The 2008–2009 GEM Metrics Challenge asked modelers to submit results for four geomagnetic storm events and five different types of observations that can be modeled by statistical, climatological or physics-based models of the magnetosphere-ionosphere system. We present the results of 30 model settings that were run at the Community Coordinated Modeling Center and at the institutions of various modelers for these events. To measure the performance of each of the models against the observations, we use comparisons of 1 hour averaged model data with the Dst index issued by the World Data Center for Geomagnetism, Kyoto, Japan, and direct comparison of 1 minute model data with the 1 minute Dst index calculated by the United States Geological Survey. The latter index can be used to calculate spectral variability of model outputs in comparison to the index. We find that model rankings vary widely by skill score used. None of the models consistently perform best for all events. We find that empirical models perform well in general. Magnetohydrodynamics-based models of the global magnetosphere with inner magnetosphere physics (ring current model) included and stand-alone ring current models with properly defined boundary conditions perform well and are able to match or surpass results from empirical models. Unlike in similar studies, the statistical models used in this study found their challenge in the weakest events rather than the strongest events.

  18. Trajectory of social isolation following hip fracture: an analysis of the English Longitudinal Study of Ageing (ELSA) cohort.

    PubMed

    Smith, Toby O; Dainty, Jack R; MacGregor, Alex

    2018-01-01

    Social isolation is defined as a lack of meaningful and sustained communication or interactions with social networks. There is limited understanding of the prevalence of social isolation and loneliness in people following hip fracture and no previous understanding of how this changes over time. The aim was to determine the prevalence and trajectory of social isolation and loneliness before a hip fracture, during the recovery phase and a minimum of 2 years post-hip fracture in an English population. Data were from the English Longitudinal Study of Ageing (ELSA) cohort (2004/5-2014/15). The sample comprised 215 participants who had sustained a hip fracture. Measures of social isolation and loneliness were analysed through multilevel modelling to determine their trajectories during three time intervals (pre-fracture; interval at hip fracture and recovery; minimum 2 years post-fracture). The prevalence of social isolation and loneliness was determined pre- and post-fracture. The prevalence of social isolation was 19% post-hip fracture and of loneliness 13% post-hip fracture. There was no statistically significant change in social isolation pre-fracture compared to a minimum of 2 years post-fracture (P = 0.78). Similarly, there was no statistically significant change in loneliness pre-fracture compared to a minimum of 2 years post-fracture (P = 0.12). This analysis has determined that whilst social isolation and loneliness do not change over time following hip fracture, they remain a significant problem for this population. Interventions are required to address these physical and psychological health needs. This is important as they may have short and longer term health benefits for people post-hip fracture. © The Author 2017. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For permissions, please email: journals.permissions@oup.com

  19. Mach 10 Stage Separation Analysis for the X-43A

    NASA Technical Reports Server (NTRS)

    Tartabini, Paul V.; Bose, David M.; Thornblom, Mark N.; Lien, J. P.; Martin, John G.

    2007-01-01

    This paper describes the pre-flight stage separation analysis that was conducted in support of the final flight of the X-43A. In that flight, which occurred less than eight months after the successful Mach 7 flight, the X-43A Research Vehicle attained a peak speed of Mach 9.6. Details are provided on how the lessons learned from the Mach 7 flight affected separation modeling and how adjustments were made to account for the increased flight Mach number. Also, the procedure for defining the feedback loop closure and feed-forward parameters employed in the separation control logic is described, and their effect on separation performance is explained. In addition, the range and nominal values of these parameters, which were included in the Mission Data Load, are presented. Once updates were made, the nominal pre-flight trajectory and Monte Carlo statistical results were determined and stress tests were performed to ensure system robustness. During flight the vehicle performed within the uncertainty bounds predicted in the pre-flight analysis and ultimately set the world record for airbreathing powered flight.

  20. Teaching Integrity in Empirical Research: A Protocol for Documenting Data Management and Analysis

    ERIC Educational Resources Information Center

    Ball, Richard; Medeiros, Norm

    2012-01-01

    This article describes a protocol the authors developed for teaching undergraduates to document their statistical analyses for empirical research projects so that their results are completely reproducible and verifiable. The protocol is guided by the principle that the documentation prepared to accompany an empirical research project should be…

  1. Development and Validation of an Instrument to Measure Indonesian Pre-Service Teachers' Conceptions of Statistics

    ERIC Educational Resources Information Center

    Idris, Khairiani; Yang, Kai-Lin

    2017-01-01

    This article reports the results of a mixed-methods approach to develop and validate an instrument to measure Indonesian pre-service teachers' conceptions of statistics. First, a phenomenographic study involving a sample of 44 participants uncovered six categories of conceptions of statistics. Second, an instrument of conceptions of statistics was…

  2. Statistical Analysis of Categorical Time Series of Atmospheric Elementary Circulation Mechanisms - Dzerdzeevski Classification for the Northern Hemisphere

    PubMed Central

    Brenčič, Mihael

    2016-01-01

    Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899–2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of a change in the frequency of ECM categories; before 1986, the appearance of ECMs was more diverse, and afterwards fewer ECMs appeared. The statistical approach applied to these categorical climatic time series opens up new potential insights for future studies of climate variability and change. PMID:27116375

  3. Statistical Analysis of Categorical Time Series of Atmospheric Elementary Circulation Mechanisms - Dzerdzeevski Classification for the Northern Hemisphere.

    PubMed

    Brenčič, Mihael

    2016-01-01

    Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899-2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories, as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, periodicity interchanges with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of a change in the frequency of ECM categories; before 1986, the appearance of ECMs was more diverse, and afterwards fewer ECMs appeared. The statistical approach applied to these categorical climatic time series opens up new potential insights for future studies of climate variability and change.

  4. Universal avalanche statistics and triggering close to failure in a mean-field model of rheological fracture

    NASA Astrophysics Data System (ADS)

    Baró, Jordi; Davidsen, Jörn

    2018-03-01

    The hypothesis of critical failure relates the presence of an ultimate stability point in the structural constitutive equation of materials to a divergence of characteristic scales in the microscopic dynamics responsible for deformation. Avalanche models involving critical failure have determined common universality classes for stick-slip processes and fracture. However, not all empirical failure processes exhibit the trademarks of criticality. The rheological properties of materials introduce dissipation, usually reproduced in conceptual models as a hardening of the coarse grained elements of the system. Here, we investigate the effects of transient hardening on (i) the activity rate and (ii) the statistical properties of avalanches. We find the explicit representation of transient hardening in the presence of generalized viscoelasticity and solve the corresponding mean-field model of fracture. In the quasistatic limit, the accelerated energy release is invariant with respect to rheology and the avalanche propagation can be reinterpreted in terms of a stochastic counting process. A single universality class can be defined from such analogy, and all statistical properties depend only on the distance to criticality. We also prove that interevent correlations emerge due to the hardening—even in the quasistatic limit—that can be interpreted as "aftershocks" and "foreshocks."

  5. Record statistics of financial time series and geometric random walks

    NASA Astrophysics Data System (ADS)

    Sabir, Behlool; Santhanam, M. S.

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of selected stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of geometric random walk series is in good agreement with that obtained from empirical stock data.
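
    A minimal sketch (not the authors' code) of how record ages can be computed for a geometric random walk: a record occurs whenever the series exceeds all previous values, and the age of a record is the number of steps until the next record. The drift, volatility, and series length below are arbitrary.

      # Record ages of a simulated geometric random walk.
      import numpy as np

      rng = np.random.default_rng(3)

      def record_ages(series):
          """Return the ages (in steps) of all completed records in a series."""
          ages, last, running_max = [], 0, -np.inf
          for t, x in enumerate(series):
              if x > running_max:
                  if running_max > -np.inf:
                      ages.append(t - last)
                  running_max, last = x, t
          return np.array(ages)

      # Geometric random walk: multiplicative Gaussian steps
      steps = rng.normal(loc=0.0005, scale=0.02, size=100_000)
      prices = 100.0 * np.exp(np.cumsum(steps))

      ages = record_ages(prices)
      print(f"{ages.size} completed records; median age {np.median(ages):.0f}, max age {ages.max()}")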

  6. [Clinical outcomes of single-level lumbar spondylolisthesis by minimally invasive transforaminal lumbar interbody fusion with bilateral tubular channels].

    PubMed

    Zeng, Z L; Jia, L; Yu, Y; Xu, W; Hu, X; Zhan, X H; Jia, Y W; Wang, J J; Cheng, L M

    2017-04-01

    Objective: To evaluate the clinical effectiveness of minimally invasive transforaminal lumbar interbody fusion (MIS-TLIF) with bilateral Spotlight tubular channels for the treatment of single-level lumbar spondylolisthesis. Methods: A total of 21 patients with lumbar spondylolisthesis who underwent MIS-TLIF via bilateral Spotlight tubular channels from October 2014 to November 2015 were retrospectively analyzed. The 21 patients included 11 males and 10 females aged 35 to 82 years (mean age 60.7 years). In terms of spondylolisthesis category, there were 18 cases of degenerative spondylolisthesis and 3 cases of isthmic spondylolisthesis. With respect to spondylolisthesis degree, 17 cases were grade I and 4 cases were grade II. By level, 17 cases were at L(4-5) and 4 cases at L(5)-S(1). Operation duration, blood loss, postoperative drainage and intraoperative exposure time were recorded. Functional improvement was defined as an improvement in the Oswestry Disability Index (ODI), and the Visual Analog Scale (VAS) was employed pre- and post-operation (at 3 months and the last follow-up) to evaluate low back and leg pain. Furthermore, to evaluate the recovery of the intervertebral foramen and of lumbar sagittal curvature, the average height of the intervertebral space, the Cobb angles of the lumbar spine and the operative segments, and the spondylolisthesis index were measured. At the last follow-up, intervertebral fusion was assessed using the Siepe evaluation criteria and the clinical outcome was assessed using the MacNab scale. Radiographic and functional outcomes were compared pre- and post-operation using the paired t test to determine the effectiveness of MIS-TLIF. Statistical significance was defined as P <0.05. Results: All patients underwent a successful MIS-TLIF surgery. The operation time (235.2±30.2) mins, intraoperative blood loss (238.1±130.3) ml, postoperative drainage (95.7±57.1) ml and intraoperative radiation exposure (47.1±8.8) were recorded. Lumbar VAS ( t =11.1, P <0.01) and leg VAS ( t =17.8, P <0.01) differed significantly between the 3-month post-operative follow-up and pre-operation. VAS scores also differed significantly between the final follow-up and pre-operation, and between the final follow-up and the 3-month follow-up ( P <0.01). At the final follow-up, there were significant differences compared with pre-operation in ODI scores ( t =30.1, P <0.01). Comparing the 3-month follow-up with pre-operation, significant differences ( P <0.05) were found in the mean height of the intervertebral space ( t =-10.9, P <0.01), the Cobb angle of the lumbar spine ( t =-2.4, P <0.05), the Cobb angle of the operative segments ( t =-5.2, P <0.01) and the spondylolisthesis index ( t =17.1, P <0.01). Similar significant differences were found between the final follow-up and pre-operation: mean height of the intervertebral space ( t =-10.5, P <0.01), Cobb angle of the lumbar spine ( t =-2.7, P <0.05), Cobb angle of the operative segments ( t =-4.2, P <0.01) and spondylolisthesis index ( t =18.6, P <0.01). All spondylolisthetic vertebrae were completely restored. Lastly, at the last follow-up, 12 cases of grade 1 and 7 cases of grade 2 fusion were present as determined by the Siepe evaluation criteria. 
MacNab scale assessment classified the clinical outcome as excellent in 17 patients, good in 3 and fair in 1. Conclusion: MIS-TLIF with bilateral Spotlight tubular channels is a safe and effective approach for single-segment lumbar spondylolisthesis.

  7. The Impact of Acculturative Stress and Daily Hassles on Pre-Adolescent Psychological Adjustment: Examining Anxiety Symptoms

    ERIC Educational Resources Information Center

    Suarez-Morales, Lourdes; Lopez, Barbara

    2009-01-01

    Acculturative stress in relation to anxiety symptoms has not been examined empirically in young Hispanic populations. The present study, conducted with 138 pre-adolescent Hispanic youngsters, investigated this relationship. The findings suggested that acculturative stress was related to physiological, concentration, and worrisome symptoms of…

  8. Pre-Service Teachers' Beliefs and Other Predictors of Pupil Control Ideologies

    ERIC Educational Resources Information Center

    Rideout, Glenn W.; Morton, Larry L.

    2007-01-01

    Purpose: This study aims to examine a variety of demographic, experiential, and philosophical orientation variables that may be predictive of pupil control ideologies (PCI) for teacher candidates at the beginning of a pre-service program. In particular, it sets out to provide empirically grounded generalizations regarding the relationship between…

  9. Formative Assessment and Academic Achievement in Pre-Graduate Students of Health Sciences

    ERIC Educational Resources Information Center

    Carrillo-de-la-Pena, Maria T.; Bailles, Eva; Caseras, Xavier; Martinez, Alvar; Ortet, Generos; Perez, Jorge

    2009-01-01

    Although educational experts recommend the use of formative assessment, there is a dearth of empirical studies on its impact on academic achievement. In this research the authors analyse to what extent participation and performance in formative assessment are associated with positive academic outcomes of pre-graduate students of health sciences. A…

  10. Pre-Service Teachers' Attitude towards Information and Communication Technology Usage: A Ghanaian Survey

    ERIC Educational Resources Information Center

    Gyamfi, Stephen Adu

    2017-01-01

    This study employed the Technology Acceptance Model (TAM) to empirically investigate factors that influence Ghanaian pre-service teachers' attitudes towards Information and Communication Technology (ICT) usage. To achieve this aim, the study extended the TAM framework by adding leadership support and job relevance as exogenous variables. Data were…

  11. The Use of Pre-Reading Activities in Reading Skills Achievement in Preschool Education

    ERIC Educational Resources Information Center

    Osei, Aboagye Michael; Liang, Qing Jing; Natalia, Ihnatushchenko; Stephen, Mensah Abrampah

    2016-01-01

    Although a wealth of empirical research has covered the crucial, indispensable role reading skills play in the development of individuals' mental faculties through the acquisition of knowledge in a particular language, scientific works on the assessment of the relationship(s) between pre-reading activities (consisting of games, puzzle…

  12. Poisson-process generalization for the trading waiting-time distribution in a double-auction mechanism

    NASA Astrophysics Data System (ADS)

    Cincotti, Silvano; Ponta, Linda; Raberto, Marco; Scalas, Enrico

    2005-05-01

    In this paper, empirical analyses and computational experiments are presented on high-frequency data for a double-auction (book) market. The main objective of the paper is to generalize the order waiting-time process in order to properly model this empirical evidence. The empirical study is performed on the best bid and best ask data of 7 U.S. financial markets, for 30-stock time series. In particular, statistical properties of trading waiting times have been analyzed and the quality of fits is evaluated by suitable statistical tests, i.e., comparing empirical distributions with theoretical models. Starting from the statistical studies on real data, attention has been focused on the reproducibility of such results in an artificial market. The computational experiments have been performed within the Genoa Artificial Stock Market. In the market model, heterogeneous agents trade one risky asset in exchange for cash. Agents have zero intelligence and issue random limit or market orders depending on their budget constraints. The price is cleared by means of a limit order book. The order generation is modelled with a renewal process. Based on empirical trading estimates, the distribution of waiting times between two consecutive orders is modelled by a mixture of exponential processes. Results show that the empirical waiting-time distribution can be considered as a generalization of a Poisson process. Moreover, the renewal process can approximate real data and implementation in the artificial stock market can reproduce the trading activity in a realistic way.
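
    A minimal sketch on synthetic data (not the authors' market data) of the point that a mixture of exponential waiting times departs from a plain Poisson process: a Kolmogorov-Smirnov test against a single fitted exponential rejects the mixture but not genuinely exponential data. The component weights and rates are arbitrary.

      # Compare exponential-mixture waiting times with Poisson-process waiting times.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      n = 20_000

      # Waiting times from a two-component exponential mixture
      comp = rng.random(n) < 0.7
      mix = np.where(comp, rng.exponential(1.0, n), rng.exponential(8.0, n))

      # Waiting times from a single exponential with the same mean (a Poisson process)
      pois = rng.exponential(mix.mean(), n)

      for name, w in [("mixture", mix), ("poisson", pois)]:
          ks = stats.kstest(w, "expon", args=(0, w.mean()))   # scale fitted by the mean
          print(f"{name:8s}  mean={w.mean():.2f}  KS p-value={ks.pvalue:.3g}")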

  13. Use of an evidence-based algorithm for patients with traumatic hemothorax reduces need for additional interventions.

    PubMed

    Dennis, Bradley M; Gondek, Stephen P; Guyer, Richard A; Hamblin, Susan E; Gunter, Oliver L; Guillamondegui, Oscar D

    2017-04-01

    Concerted management of the traumatic hemothorax is ill-defined. Surgical management of specific hemothoraces may be beneficial. A comprehensive strategy to delineate appropriate patients for additional procedures does not exist. We developed an evidence-based algorithm for hemothorax management. We hypothesize that the use of this algorithm will decrease additional interventions. A pre-/post-study was performed on all patients admitted to our trauma service with traumatic hemothorax from August 2010 to September 2013. An evidence-based management algorithm was initiated for the management of retained hemothoraces. Patients with length of stay (LOS) less than 24 hours or admitted during an implementation phase were excluded. Study data included age, Injury Severity Score, Abbreviated Injury Scale chest, mechanism of injury, ventilator days, intensive care unit (ICU) LOS, total hospital LOS, and interventions required. Our primary outcome was number of patients requiring more than 1 intervention. Secondary outcomes were empyema rate, number of patients requiring specific additional interventions, 28-day ventilator-free days, 28-day ICU-free days, hospital LOS, and all-cause 6-month readmission rate. Standard statistical analysis was performed for all data. Six hundred forty-two patients (326 pre and 316 post) met the study criteria. There were no demographic differences between the groups. The number of patients requiring more than 1 intervention was significantly reduced (49 pre vs. 28 post, p = 0.02). Number of patients requiring VATS decreased (27 pre vs. 10 post, p < 0.01). Number of catheters placed by interventional radiology increased (2 pre vs. 10 post, p = 0.02). Intrapleural thrombolytic use, open thoracotomy, empyema, and 6-month readmission rates were unchanged. The "post" group had more ventilator-free days (median, 23.9 vs. 22.5, p = 0.04), but ICU and hospital LOS were unchanged. Using an evidence-based hemothorax algorithm reduced the number of patients requiring additional interventions without increasing complication rates. Defined criteria for surgical intervention allow for more appropriate utilization of resources. Therapeutic study, level IV.

  14. The effects of academic grouping on student performance in science

    NASA Astrophysics Data System (ADS)

    Scoggins, Sally Smykla

    The current action research study explored how student placement in heterogeneous or homogeneous classes in seventh-grade science affected students' eighth-grade Science State of Texas Assessment of Academic Readiness (STAAR) scores, and how ability grouping affected students' scores based on race and socioeconomic status. The population included all eighth-grade students in the target district who took the regular eighth-grade science STAAR over four academic school years. The researcher ran three statistical tests: a t-test for independent samples, a one-way between subjects analysis of variance (ANOVA) and a two-way between subjects ANOVA. The results showed no statistically significant difference between eighth-grade Pre-AP students from seventh-grade Pre-AP classes and eighth-grade Pre-AP students from heterogeneous seventh-grade classes and no statistically significant difference between Pre-AP students' scores based on socioeconomic status. There was no statistically significant interaction between socioeconomic status and the seventh-grade science classes. The scores between regular eighth-grade students who were in heterogeneous seventh-grade classes were statistically significantly higher than the scores of regular eighth-grade students who were in regular seventh-grade classes. The results also revealed that the scores of students who were White were statistically significantly higher than the scores of students who were Black and Hispanic. Black and Hispanic scores did not differ significantly. Further results indicated that the STAAR Level II and Level III scores were statistically significantly higher for the Pre-AP eighth-grade students who were in heterogeneous seventh-grade classes than the STAAR Level II and Level III scores of Pre-AP eighth-grade students who were in Pre-AP seventh-grade classes.

  15. Item Selection and Pre-equating with Empirical Item Characteristic Curves.

    ERIC Educational Resources Information Center

    Livingston, Samuel A.

    An empirical item characteristic curve shows the probability of a correct response as a function of the student's total test score. These curves can be estimated from large-scale pretest data. They enable test developers to select items that discriminate well in the score region where decisions are made. A similar set of curves can be used to…

  16. Nutritional status of pediatric patients with congenital heart disease: pre- and post cardiac surgery.

    PubMed

    Ratanachu-Ek, Suntaree; Pongdara, Aujjimavadee

    2011-08-01

    Malnutrition is common in infants and children with congenital heart disease (CHD). Cardiac surgery has improved patient survival and nutritional status. The objective was to evaluate the impact of cardiac surgery on the nutritional status of pediatric patients with CHD. A prospective cohort study was conducted in pediatric patients with CHD admitted for cardiac surgery at Queen Sirikit National Institute of Child Health (QSNICH), Bangkok, from August 1st, 2002 to 2003. Demographic data and cardiac and related problems were obtained before operation. Anthropometry was performed at presentation and post cardiac surgery. Nutritional status was assessed by Z-scores of weight for age (ZWA), weight for height (ZWH) and height for age (ZHA). Malnutrition was defined as a Z-score < −2 and compared pre- and post-operation using the Chi-square test. The paired t-test was used to compare mean Z-scores, and a p-value < 0.05 was considered statistically significant. Of the 161 pediatric patients with CHD undergoing cardiac surgery, 41% were male and 59% female. Patients' age ranged from 1 month to 15 years. The related problems included low birth weight (28%) and feeding problems (58%). The most common CHD was ventricular septal defect (29%). The nutritional status of the patients before surgery was defined as normal in 57%, malnutrition in 40% and over-nutrition in 3%. Malnutrition included underweight 28%, wasting 22% and stunting 16%. Post cardiac surgery, the means of ZWA, ZWH and ZHA were significantly increased, and the prevalence of underweight and wasting decreased to 17% and 6%, respectively, a statistically significant change from baseline (p < 0.05). Malnutrition was found in 40% of pediatric patients with CHD, and cardiac surgery had a significant positive effect on weight gain and nutritional status.

  17. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study.

    PubMed

    Egbewale, Bolaji E; Lewis, Martyn; Sim, Julius

    2014-04-09

    Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. 126 hypothetical trial scenarios were evaluated (126,000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power.

  18. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    PubMed Central

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304
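
    A minimal simulation sketch (arbitrary parameter values, not the study's full design) of the comparison made above: when the groups are imbalanced at baseline, ANOVA on post-test scores and change-score analysis (CSA) give treatment-effect estimates biased in opposite directions, whereas ANCOVA, which regresses the post-test on group and pre-test, recovers the true effect.

      # Bias of ANOVA, CSA, and ANCOVA estimates under baseline imbalance.
      import numpy as np

      rng = np.random.default_rng(5)
      true_effect, rho, imbalance, n = 0.5, 0.6, 0.3, 200

      def one_trial():
          g = np.repeat([0, 1], n)                            # control / treatment
          pre = rng.normal(0, 1, 2 * n) + imbalance * g       # imbalance induced deliberately
          post = rho * pre + true_effect * g + rng.normal(0, np.sqrt(1 - rho ** 2), 2 * n)
          anova = post[g == 1].mean() - post[g == 0].mean()
          csa = (post - pre)[g == 1].mean() - (post - pre)[g == 0].mean()
          X = np.column_stack([np.ones(2 * n), g, pre])       # ANCOVA design matrix
          ancova = np.linalg.lstsq(X, post, rcond=None)[0][1]
          return anova, csa, ancova

      est = np.array([one_trial() for _ in range(2000)])
      for name, col in zip(["ANOVA", "CSA", "ANCOVA"], est.T):
          print(f"{name:7s} mean estimate {col.mean():+.3f}  (true effect {true_effect})")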

  19. Cheap but accurate calculation of chemical reaction rate constants from ab initio data, via system-specific, black-box force fields

    NASA Astrophysics Data System (ADS)

    Steffen, Julien; Hartke, Bernd

    2017-10-01

    Building on the recently published quantum-mechanically derived force field (QMDFF) and its empirical valence bond extension, EVB-QMDFF, it is now possible to generate a reliable potential energy surface for any given elementary reaction step in an essentially black box manner. This requires a limited and pre-defined set of reference data near the reaction path and generates an accurate approximation of the reference potential energy surface, on and off the reaction path. This intermediate representation can be used to generate reaction rate data, with far better accuracy and reliability than with traditional approaches based on transition state theory (TST) or variational extensions thereof (VTST), even if those include sophisticated tunneling corrections. However, the additional expense at the reference level remains very modest. We demonstrate all this for three arbitrarily chosen example reactions.

  20. Forecasting inundation from debris flows that grow during travel, with application to the Oregon Coast Range, USA

    USGS Publications Warehouse

    Reid, Mark E.; Coe, Jeffrey A.; Brien, Dianne

    2016-01-01

    Many debris flows increase in volume as they travel downstream, enhancing their mobility and hazard. Volumetric growth can result from diverse physical processes, such as channel sediment entrainment, stream bank collapse, adjacent landsliding, hillslope erosion and rilling, and coalescence of multiple debris flows; incorporating these varied phenomena into physics-based debris-flow models is challenging. As an alternative, we embedded effects of debris-flow growth into an empirical/statistical approach to forecast potential inundation areas within digital landscapes in a GIS framework. Our approach used an empirical debris-growth function to account for the effects of growth phenomena. We applied this methodology to a debris-flow-prone area in the Oregon Coast Range, USA, where detailed mapping revealed areas of erosion and deposition along paths of debris flows that occurred during a large storm in 1996. Erosion was predominant in stream channels with slopes > 5°. Using pre- and post-event aerial photography, we derived upslope contributing area and channel-length growth factors. Our method reproduced the observed inundation patterns produced by individual debris flows; it also generated reproducible, objective potential inundation maps for entire drainage networks. These maps better matched observations than those using previous methods that focus on proximal or distal regions of a drainage network.

  1. A maximally selected test of symmetry about zero.

    PubMed

    Laska, Eugene; Meisner, Morris; Wanderling, Joseph

    2012-11-20

    The problem of testing symmetry about zero has a long and rich history in the statistical literature. We introduce a new test that sequentially discards observations whose absolute value is below increasing thresholds defined by the data. McNemar's statistic is obtained at each threshold and the largest is used as the test statistic. We obtain the exact distribution of this maximally selected McNemar and provide tables of critical values and a program for computing p-values. Power is compared with the t-test, the Wilcoxon Signed Rank Test and the Sign Test. The new test, MM, is slightly less powerful than the t-test and Wilcoxon Signed Rank Test for symmetric normal distributions with nonzero medians and substantially more powerful than all three tests for asymmetric mixtures of normal random variables with or without zero medians. The motivation for this test derives from the need to appraise the safety profile of new medications. If pre and post safety measures are obtained, then under the null hypothesis, the variables are exchangeable and the distribution of their difference is symmetric about a zero median. Large pre-post differences are the major concern of a safety assessment. The discarded small observations are not particularly relevant to safety and can reduce power to detect important asymmetry. The new test was utilized on data from an on-road driving study performed to determine if a hypnotic, a drug used to promote sleep, has next day residual effects. Copyright © 2012 John Wiley & Sons, Ltd.
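
    A minimal sketch of a maximally selected McNemar-type statistic in the spirit described above, using a sign-flip permutation distribution for the p-value rather than the authors' exact distribution and tables; the threshold grid and the data are illustrative assumptions.

      # Maximally selected McNemar-type test of symmetry about zero (permutation p-value).
      import numpy as np

      rng = np.random.default_rng(6)

      def max_mcnemar(d, n_thresholds=20):
          """Largest McNemar chi-square over data-driven thresholds on |d|."""
          d = d[d != 0]
          thresholds = np.quantile(np.abs(d), np.linspace(0, 0.8, n_thresholds))
          chi2s = []
          for c in thresholds:
              keep = np.abs(d) >= c                 # discard small |differences|
              pos, neg = np.sum(d[keep] > 0), np.sum(d[keep] < 0)
              if pos + neg > 0:
                  chi2s.append((pos - neg) ** 2 / (pos + neg))
          return max(chi2s)

      def sign_flip_pvalue(d, n_perm=2000):
          """Under symmetry about zero, the signs of d are exchangeable."""
          obs = max_mcnemar(d)
          null = [max_mcnemar(d * rng.choice([-1, 1], d.size)) for _ in range(n_perm)]
          return np.mean(np.array(null) >= obs)

      # Hypothetical pre/post safety differences: symmetric noise plus a few large shifts
      d = np.concatenate([rng.normal(0, 0.2, 180), rng.normal(2.0, 0.5, 20)])
      print("permutation p-value:", sign_flip_pvalue(d))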

  2. The costs of the soviet empire.

    PubMed

    Wolf, C

    1985-11-29

    A comprehensive framework is developed and applied to estimate the economic costs incurred by the Soviet Union in acquiring, maintaining, and expanding its empire. The terms "empire" and "costs" are explicitly defined. Between 1971 and 1980, the average ratio between empire costs and Soviet gross national product was about 3.5 percent; as a ratio to Soviet military spending, empire costs averaged about 28 percent. The burden imposed on Soviet economic growth by empire costs is also considered, as well as rates of change in these costs, and the important political, military, and strategic benefits associated by the Soviet leadership with maintenance and expansion of the empire. Prospective empire costs and changes in Soviet economic constraints resulting from the declining performance of the domestic economy are also considered.

  3. Statistical approaches for the definition of landslide rainfall thresholds and their uncertainty using rain gauge and satellite data

    NASA Astrophysics Data System (ADS)

    Rossi, M.; Luciani, S.; Valigi, D.; Kirschbaum, D.; Brunetti, M. T.; Peruccacci, S.; Guzzetti, F.

    2017-05-01

    Models for forecasting rainfall-induced landslides are mostly based on the identification of empirical rainfall thresholds obtained by exploiting rain gauge data. Despite their increased availability, satellite rainfall estimates are scarcely used for this purpose. Satellite data should be useful in ungauged and remote areas, or should provide a significant spatial and temporal reference in gauged areas. In this paper, the analysis of the reliability of rainfall thresholds based on remotely sensed rainfall and rain gauge data for the prediction of landslide occurrence is carried out. To date, the estimation of the uncertainty associated with the empirical rainfall thresholds is mostly based on a bootstrap resampling of the rainfall duration and the cumulated event rainfall pairs (D,E) characterizing rainfall events responsible for past failures. This estimation does not consider the measurement uncertainty associated with D and E. In the paper, we propose (i) a new automated procedure to reconstruct the ED conditions responsible for landslide triggering and their uncertainties, and (ii) three new methods to identify rainfall thresholds for possible landslide occurrence, exploiting rain gauge and satellite data. In particular, the proposed methods are based on Least Square (LS), Quantile Regression (QR) and Nonlinear Least Square (NLS) statistical approaches. We applied the new procedure and methods to define empirical rainfall thresholds and their associated uncertainties in the Umbria region (central Italy) using both rain-gauge measurements and satellite estimates. We finally validated the thresholds and tested the effectiveness of the different threshold definition methods with independent landslide information. Among these, the NLS method performed best in calculating thresholds over the full range of rainfall durations. We found that the thresholds obtained from satellite data are lower than those obtained from rain gauge measurements. This is in agreement with the literature, where satellite rainfall data underestimate the "ground" rainfall registered by rain gauges.

  4. Statistical Approaches for the Definition of Landslide Rainfall Thresholds and their Uncertainty Using Rain Gauge and Satellite Data

    NASA Technical Reports Server (NTRS)

    Rossi, M.; Luciani, S.; Valigi, D.; Kirschbaum, D.; Brunetti, M. T.; Peruccacci, S.; Guzzetti, F.

    2017-01-01

    Models for forecasting rainfall-induced landslides are mostly based on the identification of empirical rainfall thresholds obtained by exploiting rain gauge data. Despite their increased availability, satellite rainfall estimates are scarcely used for this purpose. Satellite data should be useful in ungauged and remote areas, or should provide a significant spatial and temporal reference in gauged areas. In this paper, the analysis of the reliability of rainfall thresholds based on remotely sensed rainfall and rain gauge data for the prediction of landslide occurrence is carried out. To date, the estimation of the uncertainty associated with the empirical rainfall thresholds is mostly based on a bootstrap resampling of the rainfall duration and the cumulated event rainfall pairs (D,E) characterizing rainfall events responsible for past failures. This estimation does not consider the measurement uncertainty associated with D and E. In the paper, we propose (i) a new automated procedure to reconstruct the ED conditions responsible for landslide triggering and their uncertainties, and (ii) three new methods to identify rainfall thresholds for possible landslide occurrence, exploiting rain gauge and satellite data. In particular, the proposed methods are based on Least Square (LS), Quantile Regression (QR) and Nonlinear Least Square (NLS) statistical approaches. We applied the new procedure and methods to define empirical rainfall thresholds and their associated uncertainties in the Umbria region (central Italy) using both rain-gauge measurements and satellite estimates. We finally validated the thresholds and tested the effectiveness of the different threshold definition methods with independent landslide information. Among these, the NLS method performed best in calculating thresholds over the full range of rainfall durations. We found that the thresholds obtained from satellite data are lower than those obtained from rain gauge measurements. This is in agreement with the literature, where satellite rainfall data underestimate the 'ground' rainfall registered by rain gauges.
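
    A minimal sketch (synthetic data, not the authors' procedure for handling measurement uncertainty) of the threshold-definition idea using one of the named approaches, quantile regression: a power-law threshold E = alpha * D^gamma is fitted at a low quantile of the (duration, cumulated event rainfall) pairs in log-log space. The data-generating parameters are arbitrary.

      # Fit a low-quantile power-law rainfall threshold by quantile regression.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)

      # Synthetic (D, E) pairs for landslide-triggering rainfall events
      D = 10 ** rng.uniform(0, 3, 500)                       # duration, hours
      logE = 0.4 + 0.55 * np.log10(D) + rng.normal(0, 0.25, D.size)
      X = sm.add_constant(np.log10(D))

      fit = sm.QuantReg(logE, X).fit(q=0.05)                 # 5% quantile in log-log space
      alpha, gamma = 10 ** fit.params[0], fit.params[1]
      print(f"threshold: E = {alpha:.2f} * D^{gamma:.2f}  (synthetic units)")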

  5. Modeling conflict and error in the medial frontal cortex.

    PubMed

    Mayer, Andrew R; Teshiba, Terri M; Franco, Alexandre R; Ling, Josef; Shane, Matthew S; Stephen, Julia M; Jung, Rex E

    2012-12-01

    Despite intensive study, the role of the dorsal medial frontal cortex (dMFC) in error monitoring and conflict processing remains actively debated. The current experiment manipulated conflict type (stimulus conflict only or stimulus and response selection conflict) and utilized a novel modeling approach to isolate error and conflict variance during a multimodal numeric Stroop task. Specifically, hemodynamic response functions resulting from two statistical models that either included or isolated variance arising from relatively few error trials were directly contrasted. Twenty-four participants completed the task while undergoing event-related functional magnetic resonance imaging on a 1.5-Tesla scanner. Response times monotonically increased based on the presence of pure stimulus or stimulus and response selection conflict. Functional results indicated that dMFC activity was present during trials requiring response selection and inhibition of competing motor responses, but absent during trials involving pure stimulus conflict. A comparison of the different statistical models suggested that relatively few error trials contributed to a disproportionate amount of variance (i.e., activity) throughout the dMFC, but particularly within the rostral anterior cingulate gyrus (rACC). Finally, functional connectivity analyses indicated that an empirically derived seed in the dorsal ACC/pre-SMA exhibited strong connectivity (i.e., positive correlation) with prefrontal and inferior parietal cortex but was anti-correlated with the default-mode network. An empirically derived seed from the rACC exhibited the opposite pattern, suggesting that sub-regions of the dMFC exhibit different connectivity patterns with other large scale networks implicated in internal mentations such as daydreaming (default-mode) versus the execution of top-down attentional control (fronto-parietal). Copyright © 2011 Wiley Periodicals, Inc.

  6. Modeling species-abundance relationships in multi-species collections

    USGS Publications Warehouse

    Peng, S.; Yin, Z.; Ren, H.; Guo, Q.

    2003-01-01

    Species-abundance relationship is one of the most fundamental aspects of community ecology. Since Motomura first developed the geometric series model to describe the feature of community structure, ecologists have developed many other models to fit the species-abundance data in communities. These models can be classified into empirical and theoretical ones, including (1) statistical models, i.e., negative binomial distribution (and its extension), log-series distribution (and its extension), geometric distribution, lognormal distribution, Poisson-lognormal distribution, (2) niche models, i.e., geometric series, broken stick, overlapping niche, particulate niche, random assortment, dominance pre-emption, dominance decay, random fraction, weighted random fraction, composite niche, Zipf or Zipf-Mandelbrot model, and (3) dynamic models describing community dynamics and restrictive function of environment on community. These models have different characteristics and fit species-abundance data in various communities or collections. Among them, log-series distribution, lognormal distribution, geometric series, and broken stick model have been most widely used.
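
    As a small illustration of how such niche-based models generate rank-abundance predictions, the sketch below computes the expected relative abundance of the i-th ranked species under the geometric series (niche pre-emption) and broken stick models; the pre-emption fraction k and the number of species S are arbitrary illustrative values, not taken from any particular community.

      import numpy as np

      def geometric_series(S, k):
          # Expected relative abundance of the i-th ranked species under niche
          # pre-emption with fraction k, normalised over S species.
          i = np.arange(1, S + 1)
          p = k * (1 - k) ** (i - 1)
          return p / p.sum()

      def broken_stick(S):
          # MacArthur's broken-stick expectation for the i-th ranked species:
          # p_i = (1/S) * sum_{j=i..S} 1/j.
          ranks = np.arange(1, S + 1)
          return np.array([np.sum(1.0 / np.arange(r, S + 1)) for r in ranks]) / S

      S = 10
      print(geometric_series(S, k=0.4))
      print(broken_stick(S))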

  7. Changes in Manipulative Peak Force Modulation and Time to Peak Thrust among First-Year Chiropractic Students Following a 12-Week Detraining Period.

    PubMed

    Starmer, David J; Guist, Brett P; Tuff, Taylor R; Warren, Sarah C; Williams, Matthew G R

    2016-05-01

    The purpose of this study was to analyze differences in peak force modulation and time-to-peak thrust in posterior-to-anterior (PA) high-velocity-low-amplitude (HVLA) manipulations in first-year chiropractic students prior to and following a 12-week detraining period. Chiropractic students (n=125) performed 2 thrusts prior to and following a 12-week detraining period: total peak force targets were 400 and 600 N, on a force-sensing table using a PA hand contact of the participant's choice (bilateral hypothenar, bilateral thenar, or cross bilateral). Force modulation was compared to defined target total peak force values of 600 and 400 N, and time-to-peak thrust was compared between data sets using 2-tailed paired t-tests. Total peak force for the 600 N intensity varied by 124.11 ± 65.77 N during the pre-test and 123.29 ± 61.43 N during the post-test compared to the defined target of 600 N (P = .90); total peak force for the 400 N intensity varied by 44.91 ± 34.67 N during the pre-test and 44.60 ± 32.63 N during the post-test compared to the defined target of 400 N (P = .57). Time-to-peak thrust for the 400 N total peak force was 137.094 ± 42.47 milliseconds during the pre-test and 125.385 ± 37.46 milliseconds during the post-test (P = .0004); time-to-peak thrust for the 600 N total peak force was 136.835 ± 40.48 milliseconds during the pre-test and 125.385 ± 33.78 milliseconds during the post-test (P = .03). The results indicated no drop-off in the ability to modulate force for either thrust intensity, but did indicate a statistically significant change in time-to-peak thrust for the 400 N total peak force thrust intensity in first-year chiropractic students following a 12-week detraining period. Copyright © 2016 National University of Health Sciences. Published by Elsevier Inc. All rights reserved.
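
    The pre/post comparisons described above rest on two-tailed paired t-tests. The short sketch below shows such a comparison in Python on synthetic data with roughly the reported means and spreads; the numbers are illustrative stand-ins, not the study's measurements.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      # Synthetic time-to-peak-thrust values (ms) for the same 125 students before
      # and after detraining; purely illustrative, not the study's data.
      pre = rng.normal(137.0, 42.0, 125)
      post = rng.normal(125.0, 37.0, 125)

      res = stats.ttest_rel(pre, post)   # two-tailed paired t-test
      print(res.statistic, res.pvalue)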

  8. Applying multimedia design principles enhances learning in medical education.

    PubMed

    Issa, Nabil; Schuller, Mary; Santacaterina, Susan; Shapiro, Michael; Wang, Edward; Mayer, Richard E; DaRosa, Debra A

    2011-08-01

    The Association of American Medical Colleges' Institute for Improving Medical Education's report entitled 'Effective Use of Educational Technology' called on researchers to study the effectiveness of multimedia design principles. These principles were empirically shown to result in superior learning when used with college students in laboratory studies, but have not been studied with undergraduate medical students as participants. A pre-test/post-test control group design was used, in which the traditional-learning group received a lecture on shock using traditionally designed slides and the modified-design group received the same lecture using slides modified in accord with Mayer's principles of multimedia design. Participants included Year 3 medical students at a private, midwestern medical school progressing through their surgery clerkship during the academic year 2009-2010. The medical school divides students into four groups; each group attends the surgery clerkship during one of the four quarters of the academic year. Students in the second and third quarters served as the modified-design group (n=91) and students in the fourth-quarter clerkship served as the traditional-design group (n=39). Both student cohorts had similar levels of pre-lecture knowledge. Both groups showed significant improvements in retention (p<0.0001), transfer (p<0.05) and total scores (p<0.0001) between the pre- and post-tests. Repeated-measures ANOVA showed significantly greater improvements in retention (F=10.2, p=0.0016) and total scores (F=7.13, p=0.0081) for those students instructed using principles of multimedia design compared with those instructed using the traditional design. Multimedia design principles are easy to implement and result in improved short-term retention among medical students, but empirical research is still needed to determine how these principles affect transfer of learning. Further research on applying the principles of multimedia design to medical education is needed to verify the impact it has on the long-term learning of medical students, as well as its impact on other forms of multimedia instructional programmes used in the education of medical students. © Blackwell Publishing Ltd 2011.

  9. Impact of Structured Group Activities on Pre-Service Teachers' Beliefs about Classroom Motivation: An Exploratory Study

    ERIC Educational Resources Information Center

    Mansfield, Caroline F.; Volet, Simone E.

    2014-01-01

    Pre-service teachers' beliefs about classroom motivation, and how these beliefs may be developed during initial teacher preparation, is a relatively new aspect of enquiry in the fields of motivation and teacher education. An empirical study, grounded in a social constructivist perspective, was designed to examine the impact of providing…

  10. Collaborative Research Projects in the Technology-Enhanced Language Classroom: Preservice and In-Service Teachers Exchange Knowledge about Technology

    ERIC Educational Resources Information Center

    Schmid, Euline Cutrim; Hegelheimer, Volker

    2014-01-01

    This paper presents research findings of a longitudinal empirical case study that investigated an innovative Computer Assisted Language Learning (CALL) professional development program for pre-service English as Foreign Language (EFL) teachers. The conceptualization of the program was based on the assumption that pre-service language teachers…

  11. Naive Theory of Biology: The Pre-School Child's Explanation of Death

    ERIC Educational Resources Information Center

    Vlok, Milandre; de Witt, Marike W.

    2012-01-01

    This article explains the naive theory of biology that the pre-school child uses to explain the cause of death. The empirical investigation showed that the young participants do use a naive theory of biology to explain function and do make reference to "vitalistic causality" in explaining organ function. Furthermore, most of these…

  12. Learning to Take the Tablet: How Pre-Service Teachers use iPads to Facilitate their Learning

    ERIC Educational Resources Information Center

    Pegrum, Mark; Howitt, Christine; Striepe, Michelle

    2013-01-01

    Mobile handheld devices are spreading rapidly in education. iPads, especially, are increasingly being adopted by different educational sectors, but there is currently little empirical evidence on whether, or how, they facilitate student learning. This paper reports on how iPads contributed to pre-service teachers' learning, including their…

  13. The Chinese Number Naming System and Its Impact on the Arithmetic Performance of Pre-Schoolers in Hong Kong

    ERIC Educational Resources Information Center

    Ng, Sharon Sui Ngan

    2012-01-01

    Asian children, including Chinese children, perform better than their English-speaking peers in cross-national mathematics studies. This superior Asian performance is attributed to several factors including cultural beliefs, educational systems and practices, and the Chinese number naming system. Given the limited empirical evidence on pre-school…

  14. Effects of Real-Time Visual Feedback on Pre-Service Teachers' Singing

    ERIC Educational Resources Information Center

    Leong, S.; Cheng, L.

    2014-01-01

    This pilot study focuses on the use of real-time visual feedback technology (VFT) in vocal training. The empirical research has two aims: to ascertain the effectiveness of the real-time visual feedback software "Sing & See" in the vocal training of pre-service music teachers and the teachers' perspective on their experience with…

  15. The Effects of Online Communities of Practice on Pre-Service Teachers' Critical Thinking Dispositions

    ERIC Educational Resources Information Center

    Ekici, Didem Inel

    2017-01-01

    This empirical study attempted to investigate the effect of using online communities of practice in teacher education on pre-service teachers' critical thinking dispositions. California Critical Thinking Disposition Inventory and the comments posted to the online community of practice were used as the data collection tools. Results showed that…

  16. Vehicular headways on signalized intersections: theory, models, and reality

    NASA Astrophysics Data System (ADS)

    Krbálek, Milan; Šleis, Jiří

    2015-01-01

    We discuss statistical properties of vehicular headways measured on signalized crossroads. On the basis of mathematical approaches, we formulate theoretical and empirically inspired criteria for the acceptability of theoretical headway distributions. Sequentially, the multifarious families of statistical distributions (commonly used to fit real-road headway statistics) are confronted with these criteria, and with original empirical time clearances gauged among neighboring vehicles leaving signal-controlled crossroads after a green signal appears. Using three different numerical schemes, we demonstrate that an arrangement of vehicles on an intersection is a consequence of the general stochastic nature of queueing systems, rather than a consequence of traffic rules, driver estimation processes, or decision-making procedures.

  17. The beta distribution: A statistical model for world cloud cover

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1973-01-01

    Much work has been performed in developing empirical global cloud cover models. This investigation was made to determine an underlying theoretical statistical distribution to represent worldwide cloud cover. The beta distribution, with its probability density function, is given to represent the variability of this random variable. It is shown that the beta distribution possesses the versatile statistical characteristics necessary to assume the wide variety of shapes exhibited by cloud cover. A total of 160 representative empirical cloud cover distributions were investigated and the conclusion was reached that this study provides sufficient statistical evidence to accept the beta probability distribution as the underlying model for world cloud cover.
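
    As an illustration of the kind of fit described above, the sketch below fits a beta distribution to synthetic cloud-cover fractions by maximum likelihood and checks the fit with a Kolmogorov-Smirnov test; the shape parameters and the sample are illustrative assumptions, not the study's data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      # Synthetic cloud-cover fractions in (0, 1), standing in for station observations.
      cover = rng.beta(a=0.6, b=0.9, size=500)

      # Maximum-likelihood fit of a beta distribution with the support fixed to [0, 1].
      a_hat, b_hat, loc, scale = stats.beta.fit(cover, floc=0, fscale=1)
      print(f"fitted shape parameters: a = {a_hat:.2f}, b = {b_hat:.2f}")

      # Goodness of fit against the fitted distribution.
      print(stats.kstest(cover, "beta", args=(a_hat, b_hat)))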

  18. Explorations in Statistics: the Bootstrap

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…
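
    A minimal sketch of the bootstrap idea described above: resample the data with replacement, recompute the statistic on each resample, and take the spread of those replicates as an empirical estimate of the statistic's variability. The data and the chosen statistic here are arbitrary illustrations.

      import numpy as np

      def bootstrap_se(sample, stat=np.mean, n_boot=2000, seed=0):
          # Empirical standard error of `stat`: resample with replacement and
          # recompute the statistic on each bootstrap sample.
          rng = np.random.default_rng(seed)
          n = len(sample)
          reps = [stat(rng.choice(sample, size=n, replace=True)) for _ in range(n_boot)]
          return np.std(reps, ddof=1)

      data = np.array([4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2, 4.4])
      print(bootstrap_se(data))                      # bootstrap SE of the sample mean
      print(data.std(ddof=1) / np.sqrt(len(data)))   # textbook SE, for comparison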

  19. The Use of Intravenous Antibiotics at the Onset of Neutropenia in Patients Receiving Outpatient-Based Hematopoietic Stem Cell Transplants

    PubMed Central

    Hamadah, Aziz; Schreiber, Yoko; Toye, Baldwin; McDiarmid, Sheryl; Huebsch, Lothar; Bredeson, Christopher; Tay, Jason

    2012-01-01

    Empirical antibiotics at the onset of febrile neutropenia are one of several strategies for management of bacterial infections in patients undergoing Hematopoietic Stem Cell Transplant (HSCT) (empiric strategy). Our HSCT program aims to perform HSCT in an outpatient setting, where an empiric antibiotic strategy was employed. HSCT recipients began receiving intravenous antibiotics at the onset of neutropenia in the absence of fever as part of our institutional policy from 01 Jan 2009 (intravenous Prophylactic strategy). A prospective study was conducted to compare two consecutive cohorts [Year 2008 (Empiric strategy) vs. Year 2009 (Prophylactic strategy)] of patients receiving HSCT. There were 238 HSCTs performed between 01 Jan 2008 and 31 Dec 2009, with 127 and 111 in the earlier and later cohorts respectively. Infection-related mortality pre-engraftment was similar with a prophylactic compared to an empiric strategy (3.6% vs. 7.1%; p = 0.24), but reduced among recipients of autologous HSCT (0% vs. 6.8%; p = 0.03). Microbiologically documented infections, bloodstream infections, and clinically documented infections pre-engraftment were reduced in those receiving a prophylactic compared to an empiric strategy (11.7% vs. 28.3%, p = 0.001; 9.9% vs. 24.4%, p = 0.003; and 18.2% vs. 33.9%, p = 0.007, respectively). The prophylactic use of intravenous once-daily ceftriaxone in patients receiving outpatient-based HSCT is safe and may be particularly effective in patients receiving autologous HSCT. Further studies are warranted to study the impact of this Prophylactic strategy in an outpatient-based HSCT program. PMID:23029441

  20. Optimizing exoplanet transit searches

    NASA Astrophysics Data System (ADS)

    Herrero, E.; Ribas, I.; Jordi, C.

    2013-05-01

    Exoplanet searches using the transit technique are nowadays providing a great number of findings. Most exoplanet transit detection programs that are currently underway are focused on large catalogs of stars with no pre-selection. This necessarily makes such surveys quite inefficient, because huge amounts of data are processed for a relatively low transiting planet yield. In this work we investigate a method to increase the efficiency of a targeted exoplanet search with the transit technique by pre-selecting a subset of candidates from large catalogs of stars. Assuming spin-orbit alignment, this can be done by considering stars that have a higher probability of being oriented nearly equator-on (inclination close to 90°). We use activity-rotation velocity relations for low-mass stars to study the dependence of the position in the activity-v sin(i) diagram on the stellar axis inclination. We compose a catalog of G-, K-, M-type main sequence simulated stars using isochrones, an isotropic inclination distribution and empirical relations to obtain their rotation periods and activity indexes. The activity-v sin(i) diagram is then populated and statistics are applied to trace the areas containing the higher ratio of stars with inclinations above 80°. Similar statistics are applied to stars from real catalogs with log(R'_{HK}) and v sin(i) data to find their probability of being equator-on. We present the method used to generate the simulated star catalog and the subsequent statistics to find the highly inclined stars from real catalogs using the activity-v sin(i) diagram. Several catalogs from the literature are analysed and a subsample of stars with the highest probability of being equator-on is presented. Assuming spin-orbit alignment, the efficiency of an exoplanet transit search in the resulting subsample of probably highly inclined stars is estimated to be two to three times higher than with a global search with no pre-selection.
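
    For context on the quoted 80° cut: under isotropic spin-axis orientations the inclination satisfies p(i) ∝ sin i (equivalently, cos i is uniform), so the prior fraction of stars seen within 10° of equator-on is cos 80° ≈ 0.17. The sketch below checks that value by Monte Carlo; it only illustrates the isotropic baseline that an activity-v sin(i) pre-selection is meant to improve upon, not the paper's selection method.

      import numpy as np

      rng = np.random.default_rng(0)

      # Isotropic spin-axis orientations: cos(i) is uniform on [0, 1].
      cos_i = rng.uniform(0.0, 1.0, 1_000_000)
      inclination = np.degrees(np.arccos(cos_i))

      frac_mc = np.mean(inclination > 80.0)    # Monte Carlo estimate
      frac_exact = np.cos(np.radians(80.0))    # analytic value, about 0.17
      print(frac_mc, frac_exact)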

  1. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    PubMed

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
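
    A toy version of such an update, under the simplifying assumption of a normal-normal model, is sketched below: a prior mean and variance are estimated from the other scans' statistics at the same locus, and the current scan's statistic is shrunk toward that prior. The variable names, the crude method-of-moments prior and the single-locus framing are illustrative assumptions, not the authors' method.

      import numpy as np

      def eb_update(z_current, se_current, z_other):
          # Estimate a normal prior from the other scans' statistics (method of
          # moments; a crude estimate that still contains their sampling noise),
          # then shrink the current scan's statistic toward that prior.
          prior_mean = np.mean(z_other)
          prior_var = max(np.var(z_other, ddof=1), 1e-8)
          w = prior_var / (prior_var + se_current ** 2)   # weight on the current scan
          post_mean = w * z_current + (1 - w) * prior_mean
          post_var = w * se_current ** 2
          return post_mean, post_var

      # Linkage statistic at one locus from the current scan, plus three related scans.
      print(eb_update(z_current=3.1, se_current=1.0, z_other=np.array([1.8, 2.4, 2.1])))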

  2. The Impact of the Temporal Distribution of Communicating Civilizations on Their Detectability

    NASA Astrophysics Data System (ADS)

    Balbi, Amedeo

    2018-01-01

    We used a statistical model to investigate the detectability (defined by the requirement that causal contact has been initiated with us) of communicating civilizations within a volume of the Universe surrounding our location. If the civilizations are located in our galaxy, the detectability requirement imposes a strict constraint on their epoch of appearance and their communicating life span. This, in turn, implies that our ability to gather empirical evidence of the fraction of civilizations within range of detection strongly depends on the specific features of their temporal distribution. Our approach illuminates aspects of the problem that can escape the standard treatment based on the Drake equation. Therefore, it might provide the appropriate framework for future studies dealing with the evolutionary aspects of the search for extraterrestrial intelligence (SETI).

  3. A main sequence for quasars

    NASA Astrophysics Data System (ADS)

    Marziani, Paola; Dultzin, Deborah; Sulentic, Jack W.; Del Olmo, Ascensión; Negrete, C. A.; Martínez-Aldama, Mary L.; D'Onofrio, Mauro; Bon, Edi; Bon, Natasa; Stirpe, Giovanna M.

    2018-03-01

    The last 25 years saw a major step forward in the analysis of optical and UV spectroscopic data of large quasar samples. Multivariate statistical approaches have led to the definition of systematic trends in observational properties that are the basis of physical and dynamical modeling of quasar structure. We discuss the empirical correlates of the so-called “main sequence” associated with the quasar Eigenvector 1, its governing physical parameters and several implications on our view of the quasar structure, as well as some luminosity effects associated with the virialized component of the line emitting regions. We also briefly discuss quasars in a segment of the main sequence that includes the strongest FeII emitters. These sources show a small dispersion around a well-defined Eddington ratio value, a property which makes them potential Eddington standard candles.

  4. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrode and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of their reliability. A general reliability model for Ni-BaTiO3 MLCC is developed and discussed. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to the external stresses, and an empirical function that defines contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.

  5. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrode and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of the MLCCs' reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed in this paper. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.
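
    A hedged sketch of how the three ingredients listed in the two records above might combine in practice is given below, using a Weibull characteristic life together with a Prokopowicz-Vaskas-type voltage/temperature acceleration; the exponent, activation energy and the placeholder structural factor are illustrative assumptions, not values from the paper.

      import math

      def acceleration_factor(V_use, T_use, V_test, T_test, n=3.0, Ea=1.1):
          # Prokopowicz-Vaskas-type voltage/temperature acceleration; n and Ea (eV)
          # are illustrative, and k is Boltzmann's constant in eV/K.
          k = 8.617e-5
          return (V_test / V_use) ** n * math.exp((Ea / k) * (1.0 / T_use - 1.0 / T_test))

      def use_level_life(t63_test_hours, V_use, T_use, V_test, T_test, struct_factor=1.0):
          # Scale the characteristic (63.2%) Weibull life measured at test conditions
          # back to use conditions; struct_factor is a hypothetical placeholder for the
          # construction term (layer count, dielectric thickness, grain size, chip size).
          return t63_test_hours * acceleration_factor(V_use, T_use, V_test, T_test) * struct_factor

      # Example: life measured at 2x rated voltage and 125 C, projected to
      # 0.5x rated voltage and 45 C (temperatures in kelvin).
      print(use_level_life(500.0, V_use=0.5, T_use=318.15, V_test=2.0, T_test=398.15))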

  6. Resampling-Based Empirical Bayes Multiple Testing Procedures for Controlling Generalized Tail Probability and Expected Value Error Rates: Focus on the False Discovery Rate and Simulation Study

    PubMed Central

    Dudoit, Sandrine; Gilbert, Houston N.; van der Laan, Mark J.

    2014-01-01

    Summary This article proposes resampling-based empirical Bayes multiple testing procedures for controlling a broad class of Type I error rates, defined as generalized tail probability (gTP) error rates, gTP(q, g) = Pr(g(Vn, Sn) > q), and generalized expected value (gEV) error rates, gEV(g) = E[g(Vn, Sn)], for arbitrary functions g(Vn, Sn) of the numbers of false positives Vn and true positives Sn. Of particular interest are error rates based on the proportion g(Vn, Sn) = Vn/(Vn + Sn) of Type I errors among the rejected hypotheses, such as the false discovery rate (FDR), FDR = E[Vn/(Vn + Sn)]. The proposed procedures offer several advantages over existing methods. They provide Type I error control for general data generating distributions, with arbitrary dependence structures among variables. Gains in power are achieved by deriving rejection regions based on guessed sets of true null hypotheses and null test statistics randomly sampled from joint distributions that account for the dependence structure of the data. The Type I error and power properties of an FDR-controlling version of the resampling-based empirical Bayes approach are investigated and compared to those of widely-used FDR-controlling linear step-up procedures in a simulation study. The Type I error and power trade-off achieved by the empirical Bayes procedures under a variety of testing scenarios allows this approach to be competitive with or outperform the Storey and Tibshirani (2003) linear step-up procedure, as an alternative to the classical Benjamini and Hochberg (1995) procedure. PMID:18932138
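
    The linear step-up comparator mentioned above (Benjamini and Hochberg, 1995) is compact enough to state in a few lines; the sketch below is a generic implementation applied to arbitrary p-values, not the resampling-based empirical Bayes procedure proposed in the paper.

      import numpy as np

      def benjamini_hochberg(pvals, q=0.05):
          # Classic linear step-up procedure: reject the k smallest p-values, where
          # k is the largest index with p_(k) <= (k/m) * q.
          p = np.asarray(pvals, float)
          m = p.size
          order = np.argsort(p)
          thresh = q * np.arange(1, m + 1) / m
          below = p[order] <= thresh
          k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
          rejected = np.zeros(m, dtype=bool)
          rejected[order[:k]] = True
          return rejected

      pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.32, 0.44]
      print(benjamini_hochberg(pvals, q=0.05))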

  7. The Inevitability of Ethnocentrism Revisited: Ethnocentrism Diminishes As Mobility Increases.

    PubMed

    De, Soham; Gelfand, Michele J; Nau, Dana; Roos, Patrick

    2015-12-08

    Nearly all major conflicts across the globe, both current and historical, are characterized by individuals defining themselves and others by group membership. This existence of group-biased behavior (in-group favoring and out-group hostile) has been well established empirically, and has been shown to be an inevitable outcome in many evolutionary studies. Thus it is puzzling that statistics show violence and out-group conflict declining dramatically over the past few centuries of human civilization. Using evolutionary game-theoretic models, we solve this puzzle by showing for the first time that out-group hostility is dramatically reduced by mobility. Technological and societal advances over the past centuries have greatly increased the degree to which humans change physical locations, and our results show that in highly mobile societies, one's choice of action is more likely to depend on what individual one is interacting with, rather than the group to which the individual belongs. Our empirical analysis of archival data verifies that contexts with high residential mobility indeed have less out-group hostility than those with low mobility. This work suggests that, in fact, group-biased behavior that discriminates against out-groups is not inevitable after all.

  8. The Inevitability of Ethnocentrism Revisited: Ethnocentrism Diminishes As Mobility Increases

    PubMed Central

    De, Soham; Gelfand, Michele J.; Nau, Dana; Roos, Patrick

    2015-01-01

    Nearly all major conflicts across the globe, both current and historical, are characterized by individuals defining themselves and others by group membership. This existence of group-biased behavior (in-group favoring and out-group hostile) has been well established empirically, and has been shown to be an inevitable outcome in many evolutionary studies. Thus it is puzzling that statistics show violence and out-group conflict declining dramatically over the past few centuries of human civilization. Using evolutionary game-theoretic models, we solve this puzzle by showing for the first time that out-group hostility is dramatically reduced by mobility. Technological and societal advances over the past centuries have greatly increased the degree to which humans change physical locations, and our results show that in highly mobile societies, one’s choice of action is more likely to depend on what individual one is interacting with, rather than the group to which the individual belongs. Our empirical analysis of archival data verifies that contexts with high residential mobility indeed have less out-group hostility than those with low mobility. This work suggests that, in fact, group-biased behavior that discriminates against out-groups is not inevitable after all. PMID:26644192

  9. Five Factor Model personality disorder scales: An introduction to a special section on assessment of maladaptive variants of the five factor model.

    PubMed

    Bagby, R Michael; Widiger, Thomas A

    2018-01-01

    The Five-Factor Model (FFM) is a dimensional model of general personality structure, consisting of the domains of neuroticism (or emotional instability), extraversion versus introversion, openness (or unconventionality), agreeableness versus antagonism, and conscientiousness (or constraint). The FFM is arguably the most commonly researched dimensional model of general personality structure. However, a notable limitation of existing measures of the FFM has been a lack of coverage of its maladaptive variants. A series of self-report inventories has been developed to assess for the maladaptive personality traits that define Diagnostic and Statistical Manual of Mental Disorders (fifth edition; DSM-5) Section II personality disorders (American Psychiatric Association [APA], 2013) from the perspective of the FFM. In this paper, we provide an introduction to this Special Section, presenting the rationale and empirical support for these measures and placing them in the historical context of the recent revision to the APA diagnostic manual. This introduction is followed by 5 papers that provide further empirical support for these measures and address current issues within the personality assessment literature. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. Beta Atomic Contacts: Identifying Critical Specific Contacts in Protein Binding Interfaces

    PubMed Central

    Liu, Qian; Kwoh, Chee Keong; Hoi, Steven C. H.

    2013-01-01

    Specific binding between proteins plays a crucial role in molecular functions and biological processes. Protein binding interfaces and their atomic contacts are typically defined by simple criteria, such as distance-based definitions that only use some threshold of spatial distance in previous studies. These definitions neglect the nearby atomic organization of contact atoms, and thus detect predominant contacts which are interrupted by other atoms. It is questionable whether such kinds of interrupted contacts are as important as other contacts in protein binding. To tackle this challenge, we propose a new definition called beta (β) atomic contacts. Our definition, founded on the β-skeletons in computational geometry, requires that there is no other atom in the contact spheres defined by two contact atoms; this sphere is similar to the van der Waals spheres of atoms. The statistical analysis on a large dataset shows that β contacts are only a small fraction of conventional distance-based contacts. To empirically quantify the importance of β contacts, we design βACV, an SVM classifier with β contacts as input, to classify homodimers from crystal packing. We found that our βACV is able to achieve the state-of-the-art classification performance superior to SVM classifiers with distance-based contacts as input. Our βACV also outperforms several existing methods when being evaluated on several datasets in previous works. The promising empirical performance suggests that β contacts can truly identify critical specific contacts in protein binding interfaces. β contacts thus provide a new model for more precise description of atomic organization in protein quaternary structures than distance-based contacts. PMID:23630569
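
    The core emptiness test behind such a definition can be sketched in a few lines: a candidate contact between two atoms is rejected if any third atom falls inside the sphere whose diameter is the segment joining them (the β = 1 member of the β-skeleton family). The cutoff and coordinates below are illustrative assumptions, not the paper's parameterization.

      import numpy as np

      def is_beta_contact(a, b, atoms, cutoff=5.0):
          # Candidate contact: atoms a and b within a distance cutoff (angstroms).
          # Emptiness test (beta = 1): no third atom may lie inside the sphere whose
          # diameter is the segment a-b.
          a, b = np.asarray(a, float), np.asarray(b, float)
          if np.linalg.norm(a - b) > cutoff:
              return False
          centre = (a + b) / 2.0
          radius = np.linalg.norm(a - b) / 2.0
          for c in atoms:
              c = np.asarray(c, float)
              if np.allclose(c, a) or np.allclose(c, b):
                  continue
              if np.linalg.norm(c - centre) < radius:
                  return False   # contact interrupted by an intervening atom
          return True

      atoms = [(0, 0, 0), (4, 0, 0), (2, 0.5, 0)]
      print(is_beta_contact(atoms[0], atoms[1], atoms))   # False: third atom intervenes
      print(is_beta_contact(atoms[0], atoms[2], atoms))   # True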

  11. Improved characterization of truck traffic volumes and axle loads for mechanistic-empirical pavement design.

    DOT National Transportation Integrated Search

    2012-12-01

    The recently developed mechanistic-empirical pavement design guide (MEPDG) requires a multitude of traffic inputs to be defined for the design of pavement structures, including the initial two-way annual average daily truck traffic (AADTT), direc...

  12. 'Whose failure counts?' A critical reflection on definitions of failure for community health volunteers providing HIV self-testing in a community-based HIV/TB intervention study in urban Malawi.

    PubMed

    Sambakunsi, Rodrick; Kumwenda, Moses; Choko, Augustine; Corbett, Elizabeth L; Desmond, Nicola Ann

    2015-12-01

    The category of community health worker applied within the context of health intervention trials has been promoted as a cost-effective approach to meeting study objectives across large populations, relying on the promotion of the concept of 'community belonging' to encourage altruistic volunteerism from community members to promote health. This community-based category of individuals is recruited to facilitate externally driven priorities defined by large research teams, outside of the target research environment. An externally defined intervention is then 'brought to' the community through locally recruited community volunteers who form a bridge between the researchers and participants. The specific role of these workers is context-driven and responsive to the needs of the intervention. This paper is based on the findings from an annual evaluation of community health worker performance employed as community counsellors to deliver semi-supervised HIV self-testing (HIVST) at community level of a large HIV/TB intervention trial conducted in urban Blantyre, Malawi. A performance evaluation was conducted to appraise individual service delivery and assess achievements in meeting pre-defined targets for uptake of HIVST with the aim of improving overall uptake of HIVST. Through an empirical 'evaluation of the evaluation' this paper critically reflects on the position of the community volunteer through the analytical lens of 'failure', exploring the tensions in communication and interpretation of intervention delivery between researchers and community volunteers and the differing perspectives on defining failure. It is concluded that community interventions should be developed in collaboration with the population and that information guiding success should be clearly defined.

  13. ‘Whose failure counts?’ A critical reflection on definitions of failure for community health volunteers providing HIV self-testing in a community-based HIV/TB intervention study in urban Malawi

    PubMed Central

    Sambakunsi, Rodrick; Kumwenda, Moses; Choko, Augustine; Corbett, Elizabeth L.; Desmond, Nicola Ann

    2015-01-01

    The category of community health worker applied within the context of health intervention trials has been promoted as a cost-effective approach to meeting study objectives across large populations, relying on the promotion of the concept of ‘community belonging’ to encourage altruistic volunteerism from community members to promote health. This community-based category of individuals is recruited to facilitate externally driven priorities defined by large research teams, outside of the target research environment. An externally defined intervention is then ‘brought to’ the community through locally recruited community volunteers who form a bridge between the researchers and participants. The specific role of these workers is context-driven and responsive to the needs of the intervention. This paper is based on the findings from an annual evaluation of community health worker performance employed as community counsellors to deliver semi-supervised HIV self-testing (HIVST) at community level of a large HIV/TB intervention trial conducted in urban Blantyre, Malawi. A performance evaluation was conducted to appraise individual service delivery and assess achievements in meeting pre-defined targets for uptake of HIVST with the aim of improving overall uptake of HIVST. Through an empirical ‘evaluation of the evaluation’ this paper critically reflects on the position of the community volunteer through the analytical lens of ‘failure’, exploring the tensions in communication and interpretation of intervention delivery between researchers and community volunteers and the differing perspectives on defining failure. It is concluded that community interventions should be developed in collaboration with the population and that information guiding success should be clearly defined. PMID:26762610

  14. Exploring the Role of Gratitude in the Professional Experience of Pre-Service Teachers

    ERIC Educational Resources Information Center

    Howells, Kerry; Cumming, Jessie

    2012-01-01

    The burgeoning body of empirical research in the area of positive psychology points to the beneficial effect of gratitude on factors that could be relevant to some of the challenges faced by pre-service teachers. Although the topic of gratitude has also long been discussed in other fields, there has been a noticeable absence of mention in teacher…

  15. Life Stories of Pre-Service Teachers: Bids, Invitations, Resistance and Redemption, and the Hidden Alliance of Blessers

    ERIC Educational Resources Information Center

    Porter, Thomas Alan

    2017-01-01

    The purpose of this study is to examine how pre-service teachers' negative and/or positive life stories inform their future teaching practices. This study used Gee's (2000) theoretical work on identity, particularly his concepts of bids and invitations; and McAdams and Bowman's (2001) empirical study from life story research, focusing on the…

  16. When "Teaching a Class of Daemons, Dragons and Trainee Teachers"--Learning the Pedagogy of the Virtual Classroom

    ERIC Educational Resources Information Center

    Woollard, John

    2012-01-01

    Virtual worlds can offer opportunities to further extend the experience, skills and understanding of professionals, in this case pre-service teachers. Based on the empirical evidence provided by professional, pre-service teachers, this paper describes the social and emotional aspects of being and learning in a virtual world and the implications…

  17. Awareness, Openness and Eco-Friendly (AOE) Model Teaches Pre-Service Teachers on How to Be Eco-Friendly

    ERIC Educational Resources Information Center

    Jena, Ananta Kumar

    2012-01-01

    This paper studied the empirical pattern to observe the overall attitude of pre service teachers' of different training colleges towards environmental education and practice. Environmental education is a continuous lifelong process, starts at the preschool level and continues up to adulthood via all levels of education. In this context, to know…

  18. Using Participatory Action Research to Develop a Course Module on Education for Sustainable Development in Pre-Service Chemistry Teacher Education

    ERIC Educational Resources Information Center

    Burmeister, Mareike; Eilks, Ingo

    2013-01-01

    This paper describes the development of a course module on sustainability issues and Education for Sustainable Development in German pre-service chemistry teacher education. The module was inspired by empirical research findings about the knowledge base of student teachers. It was created and cyclically refined using Participatory Action Research.…

  19. From empirical Bayes to full Bayes : methods for analyzing traffic safety data.

    DOT National Transportation Integrated Search

    2004-10-24

    Traffic safety engineers are among the early adopters of Bayesian statistical tools for analyzing crash data. As in many other areas of application, empirical Bayes methods were their first choice, perhaps because they represent an intuitively ap...

  20. Efficient association study design via power-optimized tag SNP selection

    PubMed Central

    Han, Buhm; Kang, Hyun Min; Seo, Myeong Seong; Zaitlen, Noah; Eskin, Eleazar

    2008-01-01

    Discovering statistical correlation between causal genetic variation and clinical traits through association studies is an important method for identifying the genetic basis of human diseases. Since fully resequencing a cohort is prohibitively costly, genetic association studies take advantage of local correlation structure (or linkage disequilibrium) between single nucleotide polymorphisms (SNPs) by selecting a subset of SNPs to be genotyped (tag SNPs). While many current association studies are performed using commercially available high-throughput genotyping products that define a set of tag SNPs, choosing tag SNPs remains an important problem for both custom follow-up studies as well as designing the high-throughput genotyping products themselves. The most widely used tag SNP selection method optimizes over the correlation between SNPs (r2). However, tag SNPs chosen based on an r2 criterion do not necessarily maximize the statistical power of an association study. We propose a study design framework that chooses SNPs to maximize power and efficiently measures the power through empirical simulation. Empirical results based on the HapMap data show that our method gains considerable power over a widely used r2-based method, or equivalently reduces the number of tag SNPs required to attain the desired power of a study. Our power-optimized 100k whole genome tag set provides equivalent power to the Affymetrix 500k chip for the CEU population. For the design of custom follow-up studies, our method provides up to twice the power increase using the same number of tag SNPs as r2-based methods. Our method is publicly available via web server at http://design.cs.ucla.edu. PMID:18702637

  1. Evaluation of Theoretical and Empirical Characteristics of the Communication, Language, and Statistics Survey (CLASS)

    ERIC Educational Resources Information Center

    Wagler, Amy E.; Lesser, Lawrence M.

    2018-01-01

    The interaction between language and the learning of statistical concepts has been receiving increased attention. The Communication, Language, And Statistics Survey (CLASS) was developed in response to the need to focus on dynamics of language in light of the culturally and linguistically diverse environments of introductory statistics classrooms.…

  2. The role of presence in virtual reality exposure therapy

    PubMed Central

    Price, Matthew; Anderson, Page

    2013-01-01

    A growing body of literature suggests that virtual reality is a successful tool for exposure therapy in the treatment of anxiety disorders. Virtual reality (VR) researchers posit the construct of presence, defined as the interpretation of an artificial stimulus as if it were real, to be a presumed factor that enables anxiety to be felt during virtual reality exposure therapy (VRE). However, a handful of empirical studies on the relation between presence and anxiety in VRE have yielded mixed findings. The current study tested the following hypotheses about the relation between presence and anxiety in VRE with a clinical sample of fearful flyers: (1) presence is related to in-session anxiety; (2) presence mediates the extent that pre-existing (pre-treatment) anxiety is experienced during exposure with VR; (3) presence is positively related to the amount of phobic elements included within the virtual environment; (4) presence is related to treatment outcome. Results supported presence as a factor that contributes to the experience of anxiety in the virtual environment as well as a relation between presence and the phobic elements, but did not support a relation between presence and treatment outcome. The study suggests that presence may be a necessary but insufficient requirement for successful VRE. PMID:17145164

  3. From Constraints to Resolution Rules Part II : chains, braids, confluence and T&E

    NASA Astrophysics Data System (ADS)

    Berthier, Denis

    In this Part II, we apply the general theory developed in Part I to a detailed analysis of the Constraint Satisfaction Problem (CSP). We show how specific types of resolution rules can be defined. In particular, we introduce the general notions of a chain and a braid. As in Part I, these notions are illustrated in detail with the Sudoku example - a problem known to be NP-complete and which is therefore typical of a broad class of hard problems. For Sudoku, we also show how far one can go in "approximating" a CSP with a resolution theory and we give an empirical statistical analysis of how the various puzzles, corresponding to different sets of entries, can be classified along a natural scale of complexity. For any CSP, we also prove the confluence property of some Resolution Theories based on braids and we show how it can be used to define different resolution strategies. Finally, we prove that, in any CSP, braids have the same solving capacity as Trial-and-Error (T&E) with no guessing, and we comment on this result in the Sudoku case.

  4. Microsatellites: Evolutionary and methodological background and empirical applications at individual, population, and phylogenetic levels

    USGS Publications Warehouse

    Scribner, Kim T.; Pearce, John M.; Baker, Allan J.

    2000-01-01

    The recent proliferation and greater accessibility of molecular genetic markers have led to a growing appreciation of the ecological and evolutionary inferences that can be drawn from molecular characterizations of individuals and populations (Burke et al. 1992, Avise 1994). Different techniques have the ability to target DNA sequences which have different patterns of inheritance, different modes and rates of evolution and, concomitantly, different levels of variation. In the quest for 'the right marker for the right job', microsatellites have been widely embraced as the marker of choice for many empirical genetic studies. The proliferation of microsatellite loci for various species, and the voluminous literature compiled in very few years on their evolution and use in various research applications, exemplify their growing importance as a research tool in the biological sciences. The ability to define allelic states based on variation at the nucleotide level has afforded unparalleled opportunities to document the actual mutational process and rates of evolution at individual microsatellite loci. The scrutiny to which these loci have been subjected has resulted in data that raise issues pertaining to assumptions formerly stated, but largely untestable for other marker classes. Indeed, this is an active arena for theoretical and empirical work. Given the extensive and ever-increasing literature on various statistical methodologies and cautionary notes regarding the uses of microsatellites, some consideration should be given to the unique characteristics of these loci when determining how and under what conditions they can be employed.

  5. Empirical study of recent Chinese stock market

    NASA Astrophysics Data System (ADS)

    Jiang, J.; Li, W.; Cai, X.; Wang, Qiuping A.

    2009-05-01

    We investigate the statistical properties of the empirical data taken from the Chinese stock market during the period from January 2006 to July 2007. By using the methods of detrended fluctuation analysis (DFA) and calculating correlation coefficients, we find evidence of strong correlations among different stock types, stock index, stock volume turnover, A share (B share) seat number, and GDP per capita. In addition, we study the behavior of “volatility”, which is here defined as the difference in the number of new accounts between two consecutive days. It is shown that the empirical power-law of the number of aftershock events exceeding the selected threshold is analogous to the Omori law originally observed in geophysics. Furthermore, we find that the cumulative distributions of stock return, trade volume and trade number are all exponential-like, which does not belong to the universality class of such distributions found by Xavier Gabaix et al. [Xavier Gabaix, Parameswaran Gopikrishnan, Vasiliki Plerou, H. Eugene Stanley, Nature, 423 (2003)] for major western markets. Through this comparison, we conclude that, for both developed and emerging stock markets, the “cubic law of returns” holds only for long-term absolute returns, while in the short term the distributions are exponential-like. Specifically, the distributions of both trade volume and trade number display distinct decaying behaviors in two separate regimes. Lastly, the scaling behavior of the relation between dispersion and the mean monthly trade value for each administrative area in China is analyzed.

  6. Automatic Dynamic Aircraft Modeler (ADAM) for the Computer Program NASTRAN

    NASA Technical Reports Server (NTRS)

    Griffis, H.

    1985-01-01

    Large general purpose finite element programs require users to develop large quantities of input data. General purpose pre-processors are used to decrease the effort required to develop structural models. Further reduction of effort can be achieved by specific application pre-processors. Automatic Dynamic Aircraft Modeler (ADAM) is one such application specific pre-processor. General purpose pre-processors use points, lines and surfaces to describe geometric shapes. Specifying that ADAM is used only for aircraft structures allows generic structural sections, wing boxes and bodies, to be pre-defined. Hence with only gross dimensions, thicknesses, material properties and pre-defined boundary conditions a complete model of an aircraft can be created.

  7. The Effects of Using Animations on Sixth Grade Students' Academic Success in Turkish Grammar Learning

    ERIC Educational Resources Information Center

    Gün, Mesut

    2016-01-01

    The purpose of this empirical study is to determine how and to what extent the use of animations impacts auditory acquisition, one of the key learning fields in 6th grade grammar, as measured by students' academic success and completion rates. By using a pre-test and post-test design, this empirical study randomly divided a group of Turkish 6th…

  8. The empirical status of the third-wave behaviour therapies for the treatment of eating disorders: A systematic review.

    PubMed

    Linardon, Jake; Fairburn, Christopher G; Fitzsimmons-Craft, Ellen E; Wilfley, Denise E; Brennan, Leah

    2017-12-01

    Although third-wave behaviour therapies are being increasingly used for the treatment of eating disorders, their efficacy is largely unknown. This systematic review and meta-analysis aimed to examine the empirical status of these therapies. Twenty-seven studies met full inclusion criteria. Only 13 randomized controlled trials (RCT) were identified, most on binge eating disorder (BED). Pooled within- (pre-post change) and between-groups effect sizes were calculated for the meta-analysis. Large pre-post symptom improvements were observed for all third-wave treatments, including dialectical behaviour therapy (DBT), schema therapy (ST), acceptance and commitment therapy (ACT), mindfulness-based interventions (MBI), and compassion-focused therapy (CFT). Third-wave therapies were not superior to active comparisons generally, or to cognitive-behaviour therapy (CBT) in RCTs. Based on our qualitative synthesis, none of the third-wave therapies meet established criteria for an empirically supported treatment for particular eating disorder subgroups. Until further RCTs demonstrate the efficacy of third-wave therapies for particular eating disorder subgroups, the available data suggest that CBT should retain its status as the recommended treatment approach for bulimia nervosa (BN) and BED, and the front running treatment for anorexia nervosa (AN) in adults, with interpersonal psychotherapy (IPT) considered a strong empirically-supported alternative. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. A precursor of market crashes: Empirical laws of Japan's internet bubble

    NASA Astrophysics Data System (ADS)

    Kaizoji, T.

    2006-03-01

    In this paper, we quantitatively investigate the properties of a statistical ensemble of stock prices. We focus attention on the relative price, defined as X(t) = S(t)/S(0), where S(0) is the stock price at the onset time of the bubble. We selected approximately 3200 stocks traded on the Japanese Stock Exchange, and formed a statistical ensemble of daily relative prices for each trading day in the 3-year period from January 4, 1999 to December 28, 2001, corresponding to the period in which the internet bubble formed and crashed in the Japanese stock market. We found that the upper tail of the complementary cumulative distribution function of the ensemble of relative prices in the high-price range is well described by a power-law distribution, P(S > x) ~ x^(-α), with an exponent that moves over time. Furthermore, we found that as the power-law exponent α approached two, the bubble burst. It is reasonable to suppose that this indicates the internet bubble is about to burst.
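
    A tail exponent of this kind can be estimated, for example, with the Hill estimator applied to the largest observations. The sketch below does this on synthetic Pareto-distributed relative prices with a known exponent; the tail fraction and the data are illustrative assumptions, not the paper's estimation procedure.

      import numpy as np

      def hill_exponent(x, tail_fraction=0.05):
          # Hill estimator of alpha in P(X > x) ~ x^(-alpha), computed from the
          # largest `tail_fraction` of the observations.
          x = np.sort(np.asarray(x, float))
          k = max(int(tail_fraction * x.size), 2)
          threshold = x[-(k + 1)]
          return k / np.sum(np.log(x[-k:] / threshold))

      # Synthetic relative prices with a Pareto-like upper tail (alpha = 2), standing
      # in for the ensemble of X(t) = S(t)/S(0); purely illustrative.
      rng = np.random.default_rng(0)
      x = (1.0 - rng.uniform(size=50_000)) ** (-1.0 / 2.0)
      print(hill_exponent(x))   # should be close to 2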

  10. Correcting intensity loss errors in the absence of texture-free reference samples during pole figure measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saleh, Ahmed A., E-mail: asaleh@uow.edu.au

    Even with the use of X-ray polycapillary lenses, sample tilting during pole figure measurement results in a decrease in the recorded X-ray intensity. The magnitude of this error is affected by the sample size and/or the finite detector size. These errors can be typically corrected by measuring the intensity loss as a function of the tilt angle using a texture-free reference sample (ideally made of the same alloy as the investigated material). Since texture-free reference samples are not readily available for all alloys, the present study employs an empirical procedure to estimate the correction curve for a particular experimental configuration. It involves the use of real texture-free reference samples that pre-exist in any X-ray diffraction laboratory to first establish the empirical correlations between X-ray intensity, sample tilt and their Bragg angles and thereafter generate correction curves for any Bragg angle. It will be shown that the empirically corrected textures are in very good agreement with the experimentally corrected ones. - Highlights: • Sample tilting during X-ray pole figure measurement leads to intensity loss errors. • Texture-free reference samples are typically used to correct the pole figures. • An empirical correction procedure is proposed in the absence of reference samples. • The procedure relies on reference samples that pre-exist in any texture laboratory. • Experimentally and empirically corrected textures are in very good agreement.
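
    In practice the correction amounts to dividing the measured intensities by a relative-intensity (defocusing) curve as a function of tilt, whether that curve comes from a texture-free reference or from an empirical estimate as proposed above. The sketch below interpolates such a curve and applies it; all numbers are illustrative, not the paper's calibration.

      import numpy as np

      # Relative intensity loss vs. tilt angle chi (degrees), as would be measured on a
      # texture-free reference sample or estimated empirically; values are illustrative.
      chi_ref = np.array([0, 15, 30, 45, 60, 70])
      rel_int = np.array([1.00, 0.99, 0.95, 0.85, 0.65, 0.45])

      def correct_pole_figure(chi, counts):
          # Divide measured counts by the interpolated defocusing factor at each tilt.
          factor = np.interp(chi, chi_ref, rel_int)
          return counts / factor

      chi = np.array([0, 20, 40, 55, 65])
      counts = np.array([1000.0, 980.0, 900.0, 760.0, 560.0])
      print(correct_pole_figure(chi, counts))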

  11. Statistical Learning and Language: An Individual Differences Study

    ERIC Educational Resources Information Center

    Misyak, Jennifer B.; Christiansen, Morten H.

    2012-01-01

    Although statistical learning and language have been assumed to be intertwined, this theoretical presupposition has rarely been tested empirically. The present study investigates the relationship between statistical learning and language using a within-subject design embedded in an individual-differences framework. Participants were administered…

  12. 48 Capabilities of Highly Educated People

    ERIC Educational Resources Information Center

    Greene, Richard Tabor

    2008-01-01

    Purpose: To get beyond religious, philosophic, and political definitions of educatedness by going empirical. To redo Plato, in effect, by defining "the good" empirically. Background: This research was part of the Excellence Science (orthogonal disciplines) Research Project at the University of Chicago. That project redid Plato by…

  13. Environmental ethics and wilderness management: an empirical study

    Treesearch

    William A. Valliere; Robert E. Manning

    1995-01-01

    The underlying hypothesis of this study is that environmental ethics influence public attitudes toward wilderness management. To study this hypothesis, environmental ethics were defined, categorized, and measured empirically. Additionally, attitudes toward selected wilderness management issues were measured. Associations were found between beliefs in selected...

  14. Sample size determination for disease prevalence studies with partially validated data.

    PubMed

    Qiu, Shi-Fang; Poon, Wai-Yin; Tang, Man-Lai

    2016-02-01

    Disease prevalence is an important topic in medical research, and its study is based on data that are obtained by classifying subjects according to whether a disease has been contracted. Classification can be conducted with high-cost gold-standard tests or low-cost screening tests, but the latter are subject to the misclassification of subjects. As a compromise between the two, many research studies use partially validated datasets in which all data points are classified by fallible tests, and some of the data points are validated in the sense that they are also classified by the completely accurate gold-standard test. In this article, we investigate the determination of sample sizes for disease prevalence studies with partially validated data. We use two approaches. The first is to find sample sizes that can achieve a pre-specified power of a statistical test at a chosen significance level, and the second is to find sample sizes that can control the width of a confidence interval with a pre-specified confidence level. Empirical studies have been conducted to demonstrate the performance of various testing procedures with the proposed sample sizes. The applicability of the proposed methods is illustrated by a real-data example. © The Author(s) 2012.
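
    As a hedged illustration of the second (confidence-interval-width) approach, the sketch below computes the sample size for estimating a prevalence with a Wald interval of pre-specified total width, in the simple fully validated case; the partially validated design in the article requires the authors' more elaborate variance expressions, so the numbers and function name here are illustrative only.

```python
import math

def n_for_ci_width(p_guess, width, conf=0.95):
    """Smallest n so that a Wald CI for a prevalence p has total width <= `width`
    (simple fully validated design; the partially validated case needs the
    paper's more elaborate variance formulas)."""
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[conf]
    # total width = 2 * z * sqrt(p(1-p)/n)  =>  n = (2z)^2 * p(1-p) / width^2
    return math.ceil((2 * z) ** 2 * p_guess * (1 - p_guess) / width ** 2)

print(n_for_ci_width(p_guess=0.15, width=0.06))  # e.g. prevalence ~15%, total CI width 0.06
```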

  15. An Empirical Bayes Approach to Mantel-Haenszel DIF Analysis.

    ERIC Educational Resources Information Center

    Zwick, Rebecca; Thayer, Dorothy T.; Lewis, Charles

    1999-01-01

    Developed an empirical Bayes enhancement to Mantel-Haenszel (MH) analysis of differential item functioning (DIF) in which it is assumed that the MH statistics are normally distributed and that the prior distribution of underlying DIF parameters is also normal. (Author/SLD)
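
    A rough sketch of the shrinkage idea behind such an empirical Bayes enhancement: each observed MH D-DIF statistic is treated as normally distributed around an item's true DIF parameter, a normal prior is estimated from the whole set of items, and each statistic is pulled toward the prior mean. The method-of-moments prior estimate and the function name are assumptions for illustration, not the exact Zwick, Thayer, and Lewis procedure.

```python
import numpy as np

def eb_shrunk_dif(mh_dif, se):
    """Empirical-Bayes shrinkage of Mantel-Haenszel D-DIF statistics.

    Assumes each observed statistic d_i ~ N(theta_i, se_i^2) and the item
    parameters theta_i ~ N(mu, tau^2); mu and tau^2 are estimated from the
    data (method of moments), then each d_i is shrunk toward mu."""
    mh_dif, se = np.asarray(mh_dif, float), np.asarray(se, float)
    mu = mh_dif.mean()
    # method-of-moments estimate of the prior variance tau^2
    tau2 = max(mh_dif.var(ddof=1) - np.mean(se ** 2), 0.0)
    w = tau2 / (tau2 + se ** 2)           # shrinkage weight per item
    return w * mh_dif + (1 - w) * mu      # posterior means

print(eb_shrunk_dif([-1.2, 0.3, 1.8, 0.1], [0.5, 0.4, 0.6, 0.3]))
```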

  16. Flashback resistant pre-mixer assembly

    DOEpatents

    Laster, Walter R [Oviedo, FL; Gambacorta, Domenico [Oviedo, FL

    2012-02-14

    A pre-mixer assembly associated with a fuel supply system for mixing of air and fuel upstream from a main combustion zone in a gas turbine engine. The pre-mixer assembly includes a swirler assembly disposed about a fuel injector of the fuel supply system and a pre-mixer transition member. The swirler assembly includes a forward end defining an air inlet and an opposed aft end. The pre-mixer transition member has a forward end affixed to the aft end of the swirler assembly and an opposed aft end defining an outlet of the pre-mixer assembly. The aft end of the pre-mixer transition member is spaced from a base plate such that a gap is formed between the aft end of the pre-mixer transition member and the base plate for permitting a flow of purge air therethrough to increase a velocity of the air/fuel mixture exiting the pre-mixer assembly.

  17. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations.

    PubMed

    Zhang, Han; Wheeler, William; Hyland, Paula L; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-06-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained to be significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs.
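
    To make the rank-truncated-product idea concrete, the sketch below combines a set of independent SNP p-values with a minimal ARTP: for each candidate truncation point k the statistic is built from the k smallest p-values, and the adaptive minimum over k is calibrated by a second Monte Carlo layer. It deliberately omits what makes sARTP distinctive, namely the use of summary statistics with a reference-panel correlation matrix, so all names and defaults here are illustrative.

```python
import numpy as np

def artp_pvalue(p_values, ks=(1, 2, 5, 10), n_sim=5000, seed=0):
    """Minimal adaptive rank truncated product (ARTP) for independent p-values.
    For each truncation point k, the statistic is -log of the product of the k
    smallest p-values; the adaptive minimum over k is calibrated by Monte Carlo."""
    rng = np.random.default_rng(seed)
    p = np.sort(np.asarray(p_values, float))
    m = len(p)
    ks = [k for k in ks if k <= m]

    def stats(sorted_p):
        # one statistic per candidate truncation point (larger = more significant)
        return np.array([-np.sum(np.log(sorted_p[:k])) for k in ks])

    obs = stats(p)
    null = np.sort(rng.uniform(size=(n_sim, m)), axis=1)       # null p-values
    null_stats = np.array([stats(row) for row in null])        # (n_sim, len(ks))
    p_obs_k = (null_stats >= obs).mean(axis=0)                 # per-k p-value, observed
    ranks = null_stats.argsort(axis=0).argsort(axis=0)         # per-k ranks of null stats
    p_null_k = 1.0 - ranks / n_sim                             # per-k p-values under the null
    # adaptive step: p-value of the minimum-over-k statistic
    return (p_null_k.min(axis=1) <= p_obs_k.min()).mean()

print(artp_pvalue([0.001, 0.02, 0.2, 0.5, 0.8, 0.03, 0.6]))
```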

  18. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations

    PubMed Central

    Zhang, Han; Wheeler, William; Hyland, Paula L.; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-01-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained to be significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs. PMID:27362418

  19. Egalitarianism and altruism in health: some evidence of their relationship

    PubMed Central

    2014-01-01

    Background Egalitarianism and altruism are two ways in which people may have attitudes that go beyond the narrowly defined selfish preferences. The theoretical constructs of egalitarianism and altruism are different from each other, yet there may be connections between the two. This paper explores the empirical relationship between egalitarianism and altruism, in the context of health. Methods We define altruism as individual behaviour that aims to benefit another individual in need; and egalitarianism as a characteristic of a social welfare function, or a meta-level preference. Furthermore, we specify a model that explains the propensity of an individual to be egalitarian in terms of altruism and other background characteristics. Individuals who prefer a hypothetical policy that reduces socioeconomic inequalities in health outcomes over another that does not are regarded ‘egalitarian’ in the health domain. On the other hand, ‘altruism’ in the health context is captured by whether or not the same respondents are (or have been) regular blood donors, provided they are medically able to donate. Probit models are specified to estimate the relationship between egalitarianism and altruism, thus defined. A representative sample of the Spanish population was interviewed for the purpose (n = 417 valid cases). Results Overall, 75% of respondents are found to be egalitarians, whilst 35% are found to be altruists. We find that, once controlled for background characteristics, there is a statistically significant empirical relationship between egalitarianism and altruism in the health context. On average, the probability of an altruist individual supporting egalitarianism is 10% higher than for a non-altruist person. Regarding the other control variables, those living in high per capita income regions have a lower propensity and those who are politically left wing have a higher propensity to be an egalitarian. We do not find evidence of a relationship between egalitarianism and age, socioeconomic status or religious practices. Conclusion Altruist individuals have a higher probability to be egalitarians than would be expected from their observed background characteristics. PMID:24502318

  20. Eye-gaze determination of user intent at the computer interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, J.H.; Schryver, J.C.

    1993-12-31

    Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming in or out. This methodology first collects samples of eye-gaze location while looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
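
    A minimal sketch of the minimum-spanning-tree clustering step described above, assuming 2-D gaze samples within one data frame: build the MST over pairwise distances, cut edges longer than a user-defined threshold, and label the resulting connected components. The threshold, the synthetic fixations, and the function name are illustrative.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

def mst_clusters(points, max_edge):
    """Cluster 2-D gaze samples within one data frame by building a minimum
    spanning tree and deleting edges longer than `max_edge` (a user-defined
    parameter, as in the abstract); returns a cluster label per sample."""
    d = squareform(pdist(points))                  # pairwise distances
    mst = minimum_spanning_tree(d).toarray()
    mst[mst > max_edge] = 0                        # cut long edges
    n_clusters, labels = connected_components((mst + mst.T) > 0, directed=False)
    return labels

rng = np.random.default_rng(1)
frame = np.vstack([rng.normal([100, 100], 5, (15, 2)),   # two synthetic fixations
                   rng.normal([300, 250], 5, (15, 2))])
print(mst_clusters(frame, max_edge=30))
```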

  1. Generating landslide inventory by participatory mapping: an example in Purwosari Area, Yogyakarta, Java

    NASA Astrophysics Data System (ADS)

    Samodra, G.; Chen, G.; Sartohadi, J.; Kasama, K.

    2018-04-01

    This paper proposes an approach to landslide inventory mapping that considers actual conditions in Indonesia. No satisfactory landslide database exists; what does exist is inadequate, focusing on disaster response rather than on pre-disaster preparedness and planning. The humid tropical climate also leads to rapid vegetation growth, so past landslide signatures are covered by vegetation or dismantled by erosion processes. Generating a landslide inventory using standard techniques therefore remains difficult. A catalog of disasters from local government (village level) was used as the basis for participatory landslide inventory mapping. Eyewitnesses or landslide disaster victims were asked to participate in the reconstruction of past landslides. Field investigation focusing on active participation from communities, with the use of an innovative technology, was used to verify the landslide events recorded in the disaster catalog. Statistical analysis was also used to obtain the necessary relationships between geometric measurements, including the height of the slope and the length of run-out, the area and volume of displaced materials, the probability distributions of landslide area and volume, and the mobilization rate. The result shows that run-out distance is proportional to the height of the slope. The frequency distribution, calculated using the non-cumulative distribution, empirically exhibits a power law (fractal statistics), although a rollover can also be found in the dataset. This cannot be the result of a censoring effect or incompleteness of the data, because the landslide inventory dataset can be classified as having complete or nearly complete data. The so-called participatory landslide inventory mapping method is expected to resolve the difficulties of landslide inventory mapping and can be applied to support pre-disaster planning and preparedness action to reduce landslide disaster risk in Indonesia. It may also supplement the usually incomplete data in a typical landslide inventory.

  2. Counts-in-cylinders in the Sloan Digital Sky Survey with Comparisons to N-body Simulations

    NASA Astrophysics Data System (ADS)

    Berrier, Heather D.; Barton, Elizabeth J.; Berrier, Joel C.; Bullock, James S.; Zentner, Andrew R.; Wechsler, Risa H.

    2011-01-01

    Environmental statistics provide a necessary means of comparing the properties of galaxies in different environments and a vital test of models of galaxy formation within the prevailing hierarchical cosmological model. We explore counts-in-cylinders, a common statistic defined as the number of companions of a particular galaxy found within a given projected radius and redshift interval. Galaxy distributions with the same two-point correlation functions do not necessarily have the same companion count distributions. We use this statistic to examine the environments of galaxies in the Sloan Digital Sky Survey Data Release 4 (SDSS DR4). We also make preliminary comparisons to four models for the spatial distributions of galaxies, based on N-body simulations and data from SDSS DR4, to study the utility of the counts-in-cylinders statistic. There is a very large scatter between the number of companions a galaxy has and the mass of its parent dark matter halo and the halo occupation, limiting the utility of this statistic for certain kinds of environmental studies. We also show that prevalent empirical models of galaxy clustering, which match observed two- and three-point clustering statistics well, fail to reproduce some aspects of the observed distribution of counts-in-cylinders on 1, 3, and 6 h^-1 Mpc scales. All models that we explore underpredict the fraction of galaxies with few or no companions in 3 and 6 h^-1 Mpc cylinders. Roughly 7% of galaxies in the real universe are significantly more isolated within a 6 h^-1 Mpc cylinder than the galaxies in any of the models we use. Simple phenomenological models that map galaxies to dark matter halos fail to reproduce high-order clustering statistics in low-density environments.
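
    A brute-force sketch of the counts-in-cylinders statistic as defined above: for each galaxy, count companions within a projected radius and a line-of-sight velocity window. The O(N^2) loop, coordinate conventions, and default radii are illustrative simplifications; a survey-scale analysis would use a tree structure and a proper sky-to-comoving projection.

```python
import numpy as np

def counts_in_cylinders(x_mpc, y_mpc, v_los, r_proj=3.0, dv=1000.0):
    """Companion counts: for each galaxy, the number of other galaxies within a
    projected separation `r_proj` (h^-1 Mpc, positions already projected onto
    the plane of the sky) and a line-of-sight velocity window +/- `dv` km/s."""
    pos = np.column_stack([x_mpc, y_mpc])
    v_los = np.asarray(v_los, float)
    counts = np.zeros(len(pos), dtype=int)
    for i in range(len(pos)):
        sep = np.hypot(*(pos - pos[i]).T)                  # projected separation
        in_cyl = (sep < r_proj) & (np.abs(v_los - v_los[i]) < dv)
        counts[i] = in_cyl.sum() - 1                       # exclude the galaxy itself
    return counts

rng = np.random.default_rng(2)
n = 200
print(counts_in_cylinders(rng.uniform(0, 50, n),           # synthetic positions
                          rng.uniform(0, 50, n),
                          rng.normal(0, 500, n)))          # synthetic velocities
```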

  3. Empirical Model for Predicting Rockfall Trajectory Direction

    NASA Astrophysics Data System (ADS)

    Asteriou, Pavlos; Tsiambaos, George

    2016-03-01

    A methodology for the experimental investigation of rockfall in three-dimensional space is presented in this paper, aiming to assist on-going research of the complexity of a block's response to impact during a rockfall. An extended laboratory investigation was conducted, consisting of 590 tests with cubical and spherical blocks made of an artificial material. The effects of shape, slope angle and the deviation of the post-impact trajectory are examined as a function of the pre-impact trajectory direction. Additionally, an empirical model is proposed that estimates the deviation of the post-impact trajectory as a function of the pre-impact trajectory with respect to the slope surface and the slope angle. This empirical model is validated by 192 small-scale field tests, which are also presented in this paper. Some important aspects of the three-dimensional nature of rockfall phenomena are highlighted that have been hitherto neglected. The 3D space data provided in this study are suitable for the calibration and verification of rockfall analysis software that has become increasingly popular in design practice.

  4. Generating an Empirical Probability Distribution for the Andrews-Pregibon Statistic.

    ERIC Educational Resources Information Center

    Jarrell, Michele G.

    A probability distribution was developed for the Andrews-Pregibon (AP) statistic. The statistic, developed by D. F. Andrews and D. Pregibon (1978), identifies multivariate outliers. It is a ratio of the determinant of the data matrix with an observation deleted to the determinant of the entire data matrix. Although the AP statistic has been used…

  5. New robust statistical procedures for the polytomous logistic regression models.

    PubMed

    Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro

    2018-05-17

    This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real life examples are presented to justify the requirement of suitable robust statistical procedures in place of the likelihood based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article are further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.

  6. Initial Study of Pre-Service Teachers' Comments on a Reality-Based, Urban-Student Video Streamed within an Online Course

    ERIC Educational Resources Information Center

    O'Connor, Eileen A.

    2009-01-01

    The Master of Arts in Teaching program at Empire State College, an alternative teacher certification program focused on bringing career-changing adults to high-needs schools, has an important need in its initial pre-service year. These adult students must be prepared to move into complex, high-needs schools without student teaching and often with…

  7. Preparing Future Teachers through Distance Learning: An Empirical Study on Students' Perception of Teacher Education Program Provided by AIOU Pakistan

    ERIC Educational Resources Information Center

    Nadeem, Mohammed; Ali, Akhtar; Maqbool, Saira

    2013-01-01

    The purpose of the current study was to analyse the pre service teachers training programs for the distance learners of Allama Iqbal Open University (AIOU) Islamabad, Pakistan. This kind of training is provided to the future teachers enrolled to acquire pre service training to become a teacher in a Government educational institution in Pakistan.…

  8. Probabilistic performance estimators for computational chemistry methods: The empirical cumulative distribution function of absolute errors

    NASA Astrophysics Data System (ADS)

    Pernot, Pascal; Savin, Andreas

    2018-06-01

    Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
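
    The two statistics advocated in this abstract are straightforward to compute from an empirical CDF of unsigned errors; the sketch below does so for a synthetic set of benchmark errors, with the threshold and confidence level as illustrative choices.

```python
import numpy as np

def ecdf_statistics(errors, threshold=1.0, confidence=0.95):
    """Two statistics based on the empirical CDF of unsigned errors:
      p_below - probability that a new calculation has |error| < threshold
      q_conf  - maximal error amplitude expected at the chosen confidence level."""
    abs_err = np.abs(np.asarray(errors, float))
    p_below = np.mean(abs_err < threshold)             # ECDF evaluated at the threshold
    q_conf = np.quantile(abs_err, confidence)          # inverse ECDF at the confidence level
    return p_below, q_conf

rng = np.random.default_rng(3)
model_errors = rng.normal(0.3, 0.8, 500)               # synthetic, non-zero-centered errors
print(ecdf_statistics(model_errors, threshold=1.0, confidence=0.95))
```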

  9. Selecting the most appropriate inferential statistical test for your quantitative research study.

    PubMed

    Bettany-Saltikov, Josette; Whittaker, Victoria Jane

    2014-06-01

    To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data which necessitates the selection of both descriptive and statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to a specific statistical test(s). Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.

  10. The Role of Empirical Research in Bioethics

    PubMed Central

    Kon, Alexander A.

    2010-01-01

    There has long been tension between bioethicists whose work focuses on classical philosophical inquiry and those who perform empirical studies on bioethical issues. While many have argued that empirical research merely illuminates current practices and cannot inform normative ethics, others assert that research-based work has significant implications for refining our ethical norms. In this essay, I present a novel construct for classifying empirical research in bioethics into four hierarchical categories: Lay of the Land, Ideal Versus Reality, Improving Care, and Changing Ethical Norms. Through explaining these four categories and providing examples of publications in each stratum, I define how empirical research informs normative ethics. I conclude by demonstrating how philosophical inquiry and empirical research can work cooperatively to further normative ethics. PMID:19998120

  11. The role of empirical research in bioethics.

    PubMed

    Kon, Alexander A

    2009-01-01

    There has long been tension between bioethicists whose work focuses on classical philosophical inquiry and those who perform empirical studies on bioethical issues. While many have argued that empirical research merely illuminates current practices and cannot inform normative ethics, others assert that research-based work has significant implications for refining our ethical norms. In this essay, I present a novel construct for classifying empirical research in bioethics into four hierarchical categories: Lay of the Land, Ideal Versus Reality, Improving Care, and Changing Ethical Norms. Through explaining these four categories and providing examples of publications in each stratum, I define how empirical research informs normative ethics. I conclude by demonstrating how philosophical inquiry and empirical research can work cooperatively to further normative ethics.

  12. Visualizing histopathologic deep learning classification and anomaly detection using nonlinear feature space dimensionality reduction.

    PubMed

    Faust, Kevin; Xie, Quin; Han, Dominick; Goyle, Kartikay; Volynskaya, Zoya; Djuric, Ugljesa; Diamandis, Phedias

    2018-05-16

    There is growing interest in utilizing artificial intelligence, and particularly deep learning, for computer vision in histopathology. While accumulating studies highlight expert-level performance of convolutional neural networks (CNNs) on focused classification tasks, most studies rely on probability distribution scores with empirically defined cutoff values based on post-hoc analysis. More generalizable tools that allow humans to visualize histology-based deep learning inferences and decision making are scarce. Here, we leverage t-distributed Stochastic Neighbor Embedding (t-SNE) to reduce dimensionality and depict how CNNs organize histomorphologic information. Unique to our workflow, we develop a quantitative and transparent approach to visualizing classification decisions prior to softmax compression. By discretizing the relationships between classes on the t-SNE plot, we show we can super-impose randomly sampled regions of test images and use their distribution to render statistically-driven classifications. Therefore, in addition to providing intuitive outputs for human review, this visual approach can carry out automated and objective multi-class classifications similar to more traditional and less-transparent categorical probability distribution scores. Importantly, this novel classification approach is driven by a priori statistically defined cutoffs. It therefore serves as a generalizable classification and anomaly detection tool less reliant on post-hoc tuning. Routine incorporation of this convenient approach for quantitative visualization and error reduction in histopathology aims to accelerate early adoption of CNNs into generalized real-world applications where unanticipated and previously untrained classes are often encountered.

  13. The Empirical Nature and Statistical Treatment of Missing Data

    ERIC Educational Resources Information Center

    Tannenbaum, Christyn E.

    2009-01-01

    Introduction. Missing data is a common problem in research and can produce severely misleading analyses, including biased estimates of statistical parameters, and erroneous conclusions. In its 1999 report, the APA Task Force on Statistical Inference encouraged authors to report complications such as missing data and discouraged the use of…

  14. The Development of Statistical Literacy at School

    ERIC Educational Resources Information Center

    Callingham, Rosemary; Watson, Jane M.

    2017-01-01

    Statistical literacy increasingly is considered an important outcome of schooling. There is little information, however, about appropriate expectations of students at different stages of schooling. Some progress towards this goal was made by Watson and Callingham (2005), who identified an empirical 6-level hierarchy of statistical literacy and the…

  15. Explorations in Statistics: Permutation Methods

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2012-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eighth installment of "Explorations in Statistics" explores permutation methods, empiric procedures we can use to assess an experimental result--to test a null hypothesis--when we are reluctant to trust statistical…
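
    A compact illustration of the permutation logic described in that installment, assuming a two-sample comparison of means; the data and permutation count are illustrative.

```python
import numpy as np

def permutation_test(a, b, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference in means: shuffle the
    pooled observations, recompute the statistic, and report the fraction of
    shuffles at least as extreme as the observed difference (two-sided)."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, float), np.asarray(b, float)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[:len(a)].mean() - pooled[len(a):].mean()
        count += abs(diff) >= abs(observed)
    return observed, count / n_perm

print(permutation_test([5.1, 4.8, 6.0, 5.5], [4.2, 4.0, 4.9, 4.4]))
```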

  16. Pre-Osteoarthritis

    PubMed Central

    Brittberg, Mats; Eriksson, Karl; Jurvelin, Jukka S.; Lindahl, Anders; Marlovits, Stefan; Möller, Per; Richardson, James B.; Steinwachs, Matthias; Zenobi-Wong, Marcy

    2015-01-01

    Objective An attempt to define pre-osteoarthritis (OA) versus early OA and definitive osteoarthritis. Methods A group of specialists in the field of cartilage science and treatment was formed to consider the nature of OA onset and its possible diagnosis. Results Late-stage OA, necessitating total joint replacement, is the end stage of a biological process with many earlier stages. Early-stage OA has been defined and involves structural changes identified by arthroscopy or radiography. The group argued that before “early-stage OA” there must exist a stage where cellular processes, due to the presence of risk factors, have kicked into action but have not yet resulted in structural changes. The group suggested that this stage could be called “pre-osteoarthritis” (pre-OA). Conclusions The group suggests that points of initiation for OA in the knee could be defined, for example, by traumatic episodes or surgical meniscectomy. Such events may set in motion metabolic processes that could be diagnosed by modern MRI protocols or arthroscopy, including probing techniques, before the structural changes of early OA have developed. Preventive measures should preferably be applied at this pre-OA stage in order to stop the projected OA “epidemic.” PMID:26175861

  17. Normoxia vs. Hyperoxia: Impact of Oxygen Tension Strategies on Outcomes for Patients Receiving Cardiopulmonary Bypass for Routine Cardiac Surgical Repair

    PubMed Central

    Brown, D. Mark; Holt, David W.; Edwards, Jeff T.; Burnett, Robert J.

    2006-01-01

    Abstract: Oxygen pressure field theory (OPFT) was originally described in the early 1900s by Danish physiologist, Dr. August Krogh. This revolutionary theory described microcirculation of blood gases at the capillary level using a theoretical cylindrical tissue model commonly referred to as the Krogh cylinder. In recent years, the principles and benefits of OPFT in long-term extracorporeal circulatory support (ECMO) have been realized. Cardiac clinicians have successfully mastered OPFT fundamentals and incorporated them into their clinical practice. These clinicians have experienced significantly improved survival rates as a result of OPFT strategies. The objective of this study was to determine if a hyperoxic strategy can lead to equally beneficial outcomes for short-term support as measured by total ventilator time and total length of stay in intensive care unit (ICU) in the cardiopulmonary bypass (CPB) patient at a private institution. Patients receiving traditional blood gas management while on CPB (group B, n = 17) were retrospectively compared with hyperoxic patients (group A, n = 19). Hyperoxic/OPFT management was defined as paO2 values of 300–350 mmHg and average VSAT > 75%. Traditional blood gas management was defined as paO2 values of 150–250 mmHg and average VSAT < 75%. No significant differences between treatment groups were found for patient weight, CPB/AXC times, BSA, pre/post Hgb, pre/post-platelet (PLT) counts, pre/post-creatinine levels, pre/post-BUN, UF volumes, or CPB urine output. Additionally, no significant statistical differences were found between treatment groups for total time in ICU (T-ICU) or total time on ventilator (TOV). Hyperoxic management strategies provided no conclusive evidence of outcome improvement for patients receiving CPB for routine cardiac surgical repair. Additional studies into the impact of hyperoxia in short-term extracorporeal circulatory support are needed. PMID:17089511

  18. Accuracy of Blood Pressure-to-Height Ratio to Define Elevated Blood Pressure in Children and Adolescents: The CASPIAN-IV Study.

    PubMed

    Kelishadi, Roya; Bahreynian, Maryam; Heshmat, Ramin; Motlagh, Mohammad Esmail; Djalalinia, Shirin; Naji, Fatemeh; Ardalan, Gelayol; Asayesh, Hamid; Qorbani, Mostafa

    2016-02-01

    The aim of this study was to propose a simple practical diagnostic criterion for pre-hypertension (pre-HTN) and hypertension (HTN) in the pediatric age group. This study was conducted on a nationally representative sample of 14,880 students, aged 6-18 years. HTN and pre-HTN were defined as systolic blood pressure (SBP) and/or diastolic blood pressure (DBP) at or above the 95th percentile and between the 90th and 95th percentiles for age, gender, and height, respectively. Using the area under the curve (AUC) of the receiver operating characteristic curves, we estimated the diagnostic accuracy of two indexes, SBP-to-height ratio (SBPHR) and DBP-to-height ratio (DBPHR), in defining pre-HTN and HTN. Overall, SBPHR performed relatively well in classifying subjects as HTN (AUC 0.80-0.85) and pre-HTN (AUC 0.84-0.90). Likewise, DBPHR performed relatively well in classifying subjects as HTN (AUC 0.90-0.97) and pre-HTN (AUC 0.70-0.83). The two indexes, SBPHR and DBPHR, are considered valid, simple, inexpensive, and accurate tools to diagnose pre-HTN and HTN in the pediatric age group.
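
    AUC values like those above can be computed for any candidate ratio with the Mann-Whitney formulation of the ROC area; the sketch below applies it to entirely hypothetical SBP-to-height ratios and hypertension labels, purely to illustrate the computation.

```python
import numpy as np

def auc_mann_whitney(score, label):
    """Area under the ROC curve computed as the Mann-Whitney probability that a
    randomly chosen positive case has a higher score than a randomly chosen
    negative case (ties counted as 1/2)."""
    score, label = np.asarray(score, float), np.asarray(label, bool)
    pos, neg = score[label], score[~label]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical SBP-to-height ratios and hypertension labels, for illustration only
sbphr = [0.72, 0.80, 0.91, 0.67, 0.88, 0.95, 0.70, 0.84]
htn   = [0,    0,    1,    0,    1,    1,    0,    1   ]
print(auc_mann_whitney(sbphr, htn))
```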

  19. Searching for a Common Ground--A Literature Review of Empirical Research on Scientific Inquiry Activities

    ERIC Educational Resources Information Center

    Rönnebeck, Silke; Bernholt, Sascha; Ropohl, Mathias

    2016-01-01

    Despite the importance of scientific inquiry in science education, researchers and educators disagree considerably regarding what features define this instructional approach. While a large body of literature addresses theoretical considerations, numerous empirical studies investigate scientific inquiry on quite different levels of detail and also…

  20. An Empirical Evaluation of Factor Reliability.

    ERIC Educational Resources Information Center

    Jackson, Douglas N.; Morf, Martin E.

    The psychometric reliability of a factor, defined as its generalizability across samples drawn from the same population of tests, is considered as a necessary precondition for the scientific meaningfulness of factor analytic results. A solution to the problem of generalizability is illustrated empirically on data from a set of tests designed to…

  1. Procedures for Empirical Determination of En-Route Criterion Levels.

    ERIC Educational Resources Information Center

    Moncrief, Michael H.

    En-route Criterion Levels (ECLs) are defined as decision rules for predicting pupil readiness to advance through an instructional sequence. This study investigated the validity of present ECLs in an individualized mathematics program and tested procedures for empirically determining optimal ECLs. Retest scores and subsequent progress were…

  2. University-Industry Collaboration, Knowledge Management and Enterprise Innovation Performance: An Empirical Study

    ERIC Educational Resources Information Center

    Chen, Jin; Wei, Shiyang

    2008-01-01

    This empirical study is concerned with university-industry collaboration from a knowledge management perspective. The authors introduce the concepts of "enterprise-level core elements" to define the principle status of an enterprise during university-industry collaboration, and "network embeddedness" as an indication of the…

  3. On the repeated measures designs and sample sizes for randomized controlled trials.

    PubMed

    Tango, Toshiro

    2016-04-01

    For the analysis of longitudinal or repeated measures data, generalized linear mixed-effects models provide a flexible and powerful tool to deal with heterogeneity among subject response profiles. However, the typical statistical design adopted in most randomized controlled trials is an analysis-of-covariance type analysis using a pre-defined pair of "pre-post" data, in which the pre (i.e., baseline) data are used as a covariate for adjustment together with other covariates. The major design issue is then to calculate the sample size, or the number of subjects allocated to each treatment group. In this paper, we propose a new repeated measures design and sample size calculations combined with generalized linear mixed-effects models that depend not only on the number of subjects but also on the number of repeated measures before and after randomization per subject used for the analysis. The main advantages of the proposed design combined with the generalized linear mixed-effects models are (1) it can easily handle missing data by applying the likelihood-based ignorable analyses under the missing at random assumption and (2) it may lead to a reduction in sample size, compared with the simple pre-post design. The proposed designs and the sample size calculations are illustrated with real data arising from randomized controlled trials. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. Iron therapy for pre-operative anaemia.

    PubMed

    Ng, Oliver; Keeler, Barrie D; Mishra, Amitabh; Simpson, Alastair; Neal, Keith; Brookes, Matthew J; Acheson, Austin G

    2015-12-22

    Pre-operative anaemia is common and occurs in up to 76% of patients. It is associated with increased peri-operative allogeneic blood transfusions, longer hospital lengths of stay and increased morbidity and mortality. Iron deficiency is one of the most common causes of this anaemia. Oral iron therapy has traditionally been used to treat anaemia but newer, safer parenteral iron preparations have been shown to be more effective in other conditions such as inflammatory bowel disease, chronic heart failure and post-partum haemorrhage. A limited number of studies look at iron therapy for the treatment of pre-operative anaemia. The aim of this Cochrane review is to summarise the evidence for use of iron supplementation, both enteral and parenteral, for the management of pre-operative anaemia. The objective of this review is to evaluate the effects of pre-operative iron therapy (enteral or parenteral) in reducing the need for allogeneic blood transfusions in anaemic patients undergoing surgery. We ran the search on 25 March 2015. We searched the Cochrane Injuries Group's Specialised Register, Cochrane Central Register of Controlled Trials (CENTRAL, The Cochrane Library), Ovid MEDLINE(R), Ovid MEDLINE(R) In-Process & Other Non-Indexed Citations, Ovid MEDLINE(R) Daily and Ovid OLDMEDLINE(R), EMBASE Classic and EMBASE (Ovid), CINAHL Plus (EBSCO), PubMed, clinical trials registries, conference abstracts, and we screened reference lists. We included all randomised controlled trials (RCTs) which compared pre-operative iron monotherapy to placebo, no treatment, standard of care or another form of iron therapy for anaemic adults undergoing surgery. Anaemia was defined by haemoglobin values less than 13 g/dL for males and 12 g/dL for non-pregnant females. Data were collected by two authors on the proportion of patients who receive a blood transfusion, amount of blood transfused per patient (units) and haemoglobin measured as continuous variables at pre-determined time-points: pre-treatment, pre-operatively but post-treatment, and post-operatively. Statistical analysis was performed using the Cochrane statistical software, Review Manager 2014. Outcome data were summarised in tables and a forest plot. Three prospective randomised controlled studies evaluated pre-operative iron therapy to correct anaemia (two in colorectal and one in gynaecological surgery) and included 114 patients in total. One compared oral iron versus standard care (Lidder 2007); one intravenous iron versus control (Edwards 2009); and one study compared oral versus intravenous iron (Kim 2009). Both colorectal trials reported the primary outcome (proportion of patients who received allogeneic blood transfusions) and meta-analysis showed a reduction in blood transfusions with the administration of iron therapy, but the reduction was not statistically significant (risk ratio (RR) 0.56, 95% confidence interval (CI) 0.27 to 1.18). All studies reported haemoglobin change but data for the anaemic patients were only available for two studies (Edwards 2009 and Kim 2009). Edwards 2009 showed no difference in haemoglobin at the end of treatment pre-operatively. 
The intravenous versus oral iron study showed an increase in haemoglobin with intravenous iron at the end of treatment pre-operatively (MD 1.90 g/dL, 95% CI 1.16 to 2.64; participants = 56), but the results are at high risk of bias because participants with less than 80% compliance with therapy were excluded from the analysis and compliance was lower in the oral iron group due to the side-effects of treatment (Kim 2009). None of the studies reported quality of life, short- or long-term mortality or post-operative morbidity. The use of iron therapy for pre-operative anaemia does not show a statistically significant reduction in the proportion of patients who received an allogeneic blood transfusion compared to no iron therapy. However, the 38 patients in our analysis fall far short of the 819 patients our information size calculation recommended to detect a 30% reduction in blood transfusions. Intravenous iron may be more effective than oral iron at increasing haemoglobin. However, all these conclusions are drawn from only three small randomised controlled studies. Further well designed, adequately powered randomised controlled studies are required to determine the true effectiveness of iron therapy for pre-operative anaemia.

  5. Modulation of Respiratory Frequency by Peptidergic Input to Rhythmogenic Neurons in the PreBötzinger Complex

    PubMed Central

    Gray, Paul A.; Rekling, Jens C.; Bocchiaro, Christopher M.; Feldman, Jack L.

    2010-01-01

    Neurokinin-1 receptor (NK1R) and μ-opioid receptor (μOR) agonists affected respiratory rhythm when injected directly into the preBötzinger Complex (preBötC), the hypothesized site for respiratory rhythmogenesis in mammals. These effects were mediated by actions on preBötC rhythmogenic neurons. The distribution of NK1R+ neurons anatomically defined the preBötC. Type 1 neurons in the preBötC, which have rhythmogenic properties, expressed both NK1Rs and μORs, whereas type 2 neurons expressed only NK1Rs. These findings suggest that the preBötC is a definable anatomic structure with unique physiological function and that a subpopulation of neurons expressing both NK1Rs and μORs generate respiratory rhythm and modulate respiratory frequency. PMID:10567264

  6. Importance of vesicle release stochasticity in neuro-spike communication.

    PubMed

    Ramezani, Hamideh; Akan, Ozgur B

    2017-07-01

    The aim of this paper is to propose a stochastic model for the vesicle release process, a part of neuro-spike communication. Hence, we study the biological events occurring in this process and use microphysiological simulations to observe the functionality of these events. Since the most important source of variability in vesicle release probability is the opening of voltage-dependent calcium channels (VDCCs), followed by the influx of calcium ions through these channels, we propose a stochastic model for this event, while using a deterministic model for the other sources of variability. To capture the stochasticity of calcium influx to the pre-synaptic neuron in our model, we study its statistics and find that it can be modeled by a distribution defined in terms of the Normal and Logistic distributions.

  7. Adaptive segmentation of cerebrovascular tree in time-of-flight magnetic resonance angiography.

    PubMed

    Hao, J T; Li, M L; Tang, F L

    2008-01-01

    Accurate segmentation of the human vasculature is an important prerequisite for a number of clinical procedures, such as diagnosis, image-guided neurosurgery and pre-surgical planning. In this paper, an improved statistical approach to extracting the whole cerebrovascular tree in time-of-flight magnetic resonance angiography is proposed. Firstly, in order to obtain a more accurate segmentation result, a localized observation model is proposed instead of defining the observation model over the entire dataset. Secondly, for the binary segmentation, an improved Iterated Conditional Modes (ICM) algorithm is presented to accelerate the segmentation process. The experimental results showed that the proposed algorithm obtains more satisfactory segmentation results and requires less processing time than conventional approaches.

  8. Circular RNAs Are the Predominant Transcript Isoform from Hundreds of Human Genes in Diverse Cell Types

    PubMed Central

    Wang, Peter Lincoln; Lacayo, Norman; Brown, Patrick O.

    2012-01-01

    Most human pre-mRNAs are spliced into linear molecules that retain the exon order defined by the genomic sequence. By deep sequencing of RNA from a variety of normal and malignant human cells, we found RNA transcripts from many human genes in which the exons were arranged in a non-canonical order. Statistical estimates and biochemical assays provided strong evidence that a substantial fraction of the spliced transcripts from hundreds of genes are circular RNAs. Our results suggest that a non-canonical mode of RNA splicing, resulting in a circular RNA isoform, is a general feature of the gene expression program in human cells. PMID:22319583

  9. Updated logistic regression equations for the calculation of post-fire debris-flow likelihood in the western United States

    USGS Publications Warehouse

    Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.

    2016-06-30

    Wildfire can significantly alter the hydrologic response of a watershed to the extent that even modest rainstorms can generate dangerous flash floods and debris flows. To reduce public exposure to hazard, the U.S. Geological Survey produces post-fire debris-flow hazard assessments for select fires in the western United States. We use publicly available geospatial data describing basin morphology, burn severity, soil properties, and rainfall characteristics to estimate the statistical likelihood that debris flows will occur in response to a storm of a given rainfall intensity. Using an empirical database and refined geospatial analysis methods, we defined new equations for the prediction of debris-flow likelihood using logistic regression methods. We showed that the new logistic regression model outperformed previous models used to predict debris-flow likelihood.
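
    A hedged sketch of the kind of logistic-regression likelihood model described above: predictors are combined linearly and passed through a logistic function to give a debris-flow probability for a design storm. The predictor set and coefficients below are placeholders for illustration, not the published USGS equations.

```python
import numpy as np

def debris_flow_likelihood(x, beta):
    """Logistic-regression likelihood of a post-fire debris flow for predictors x
    (e.g. burn severity, basin gradient, a soil property, rainfall intensity).
    The coefficients are placeholders, not the published USGS values."""
    x = np.asarray(x, float)
    logit = beta[0] + np.dot(x, beta[1:])
    return 1.0 / (1.0 + np.exp(-logit))

# Hypothetical basin: [proportion burned at high severity, mean gradient,
# soil erodibility factor, 15-min rainfall intensity in mm/h], illustrative betas.
print(debris_flow_likelihood([0.45, 0.30, 0.25, 24.0],
                             beta=[-3.6, 0.41, 0.67, 0.70, 0.11]))
```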

  10. The Efficiency of First-Trimester Uterine Artery Doppler, ADAM12, PAPP-A and Maternal Characteristics in the Prediction of Pre-Eclampsia

    PubMed Central

    GOETZINGER, Katherine R.; ZHONG, Yan; CAHILL, Alison G.; ODIBO, Linda; MACONES, George A.; ODIBO, Anthony O.

    2014-01-01

    Objective To estimate the efficiency of first-trimester uterine artery Doppler, A-disintegrin and metalloprotease 12 (ADAM12), pregnancy-associated plasma protein A (PAPP-A) and maternal characteristics in the prediction of pre-eclampsia. Methods This is a prospective cohort study of patients presenting for first-trimester aneuploidy screening between 11-14 weeks’ gestation. Maternal serum ADAM12 and PAPP-A levels were measured by immunoassay, and mean uterine artery Doppler pulsatility indices (PI) were calculated. Outcomes of interest included pre-eclampsia, early pre-eclampsia, defined as requiring delivery at <34 weeks’ gestation, and gestational hypertension. Logistic regression analysis was used to model the prediction of pre-eclampsia using ADAM12 multiples of the median (MoM), PAPP-A MoM, and uterine artery Doppler PI MoM, either individually or in combination. Sensitivity, specificity, and area under the receiver-operating characteristic curves (AUC) were used to compare the screening efficiency of the models using non-parametric U-statistics. Results Of 578 patients with complete outcome data, there were 54 (9.3%) cases of pre-eclampsia and 13 (2.2%) cases of early pre-eclampsia. Median ADAM12 levels were significantly lower in patients who developed pre-eclampsia compared to those who did not (0.81 vs. 1.01 MoM; p<0.04). For a fixed false-positive rate (FPR) of 10%, ADAM12, PAPP-A, and uterine artery Doppler in combination with maternal characteristics identified 50%, 48%, and 52% of patients who developed pre-eclampsia, respectively. Combining these first-trimester parameters did not improve the predictive efficiency of the models. Conclusion First-trimester ADAM12, PAPP-A, and uterine artery Doppler are not sufficiently predictive of pre-eclampsia. Combinations of these parameters do not further improve their screening efficiency. PMID:23980220

  11. Rates and drivers of progression to pre-diabetes and diabetes mellitus among HIV-infected adults on antiretroviral therapy: a global systematic review and meta-analysis protocol.

    PubMed

    Nansseu, Jobert Richie N; Bigna, Jean Joel R; Kaze, Arnaud D; Noubiap, Jean Jacques N

    2016-09-15

    With the new 'test and treat' policy of the WHO, it is obvious that the number of HIV-infected patients taking antiretroviral therapy (ART) will grow exponentially, with a consequent increase in the burden of diabetes mellitus (DM). Our aim is to summarise existing data on the incidence of pre-diabetes and DM, and associated risk factors, among HIV-infected adults. This systematic review will include cohort studies reporting the incidence of pre-diabetes and/or DM, and associated risk factors, among HIV-infected adults on ART, with these patients being free of any impaired glucose metabolism at study baseline. We will perform electronic searches in PubMed, Excerpta Medica Database (EMBASE), Web of Science and the WHO Global Health Library, supplemented with manual searches. Articles published from 1 January 2000 to 31 July 2016, in English or French, and without any geographical restriction will be eligible for inclusion. Three authors will independently screen, select studies, extract data and assess the risk of bias, with discrepancies resolved by consensus. We will assess clinical heterogeneity by examining the study design and setting, the criteria and cut-offs used to define pre-diabetes or DM, the process of calculation of incidence and the outcomes in each study. We will also assess statistical heterogeneity using the χ² test of homogeneity and quantify it using the I² statistic. A random-effects meta-analysis will be used to estimate the overall cumulative incidence of pre-diabetes/DM and risk factors. This systematic review will use data from published studies and does not require ethics approval. Its results are expected to help put in place action plans and preventive measures to curb the growing burden of DM in the HIV population on ART. Findings will be published in a peer-reviewed journal and presented at scientific conferences. CRD42016039651. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
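
    For the heterogeneity and pooling steps mentioned in the protocol, a minimal DerSimonian-Laird sketch is shown below: Cochran's Q, the I² statistic, and a random-effects pooled estimate. The input effects and variances are hypothetical; real incidence data would first be transformed (e.g. to log rates) with variances derived from study sample sizes.

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of study effects, with
    Cochran's Q and the I^2 statistic quantifying heterogeneity."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                                  # fixed-effect weights
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)         # Cochran's Q
    df = len(y) - 1
    i2 = max((q - df) / q, 0.0) * 100 if q > 0 else 0.0          # I^2 in percent
    tau2 = max((q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)), 0.0)
    w_star = 1.0 / (v + tau2)                                    # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, q, i2, tau2

# Illustrative log effect estimates and their variances from four hypothetical cohorts
print(random_effects_pool([0.10, 0.25, 0.05, 0.40], [0.02, 0.05, 0.03, 0.04]))
```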

  12. On Detecting Influential Data and Selecting Regression Variables

    DTIC Science & Technology

    1989-10-01

    subset of the data. The empirical influence function for β̂, IF_A, is defined to be IF_A = β̂_A − β̂ (2). For a given positive definite matrix M and a nonzero… interest. Cook and Weisberg (1980) tried to treat their measurement of the influence on the fitted values Xβ̂. They used the empirical influence function for… Characterizations of an empirical influence function for detecting influential cases in regression. Technometrics 22, 495-508. [3] Gray, J. B. and Ling, R. F

  13. Design issues in a randomized controlled trial of a pre-emptive versus empiric antifungal strategy for invasive aspergillosis in patients with high-risk hematologic malignancies.

    PubMed

    Morrissey, C Orla; Chen, Sharon C-A; Sorrell, Tania C; Bradstock, Kenneth F; Szer, Jeffrey; Halliday, Catriona L; Gilroy, Nicole M; Schwarer, Anthony P; Slavin, Monica A

    2011-02-01

    Invasive aspergillosis (IA) is a major cause of mortality in patients with hematological malignancies, due largely to the inability of traditional culture and biopsy methods to make an early or accurate diagnosis. Diagnostic accuracy studies suggest that Aspergillus galactomannan (GM) enzyme immunoassay (ELISA) and Aspergillus PCR-based methods may overcome these limitations, but their impact on patient outcomes should be evaluated in a diagnostic randomized controlled trial (D-RCT). This article describes the methodology of a D-RCT which compares a new pre-emptive strategy (GM-ELISA- and Aspergillus PCR-driven antifungal therapy) with the standard fever-driven empiric antifungal treatment strategy. Issues including primary end-point and patient selection, duration of screening, choice of tests for the pre-emptive strategy, antifungal prophylaxis and bias control, which were considered in the design of the trial, are discussed. We suggest that the template presented herein is considered by researchers when evaluating the utility of new diagnostic tests (ClinicalTrials.gov number, NCT00163722).

  14. An Empirical Study of the Effectiveness of Negotiation of Meaning in L2 Vocabulary Acquisition of Chinese Learners of English

    ERIC Educational Resources Information Center

    Yi, Baoshu; Sun, Zhinong

    2013-01-01

    The study aimed to investigate whether or not negotiation of meaning is effective in L2 vocabulary acquisition of Chinese learners of English in the classroom setting. In the study there were two experimental groups (pre-modified input and negotiation of meaning) and two control groups (pre-modified input). The four groups were required to do a…

  15. The Role of Lesson Analysis in Pre-Service Teacher Education: An Empirical Investigation of Teacher Learning from a Virtual Video-Based Field Experience

    ERIC Educational Resources Information Center

    Santagata, Rossella; Zannoni, Claudia; Stigler, James W.

    2007-01-01

    A video-based program on lesson analysis for pre-service mathematics teachers was implemented for two consecutive years as part of a teacher education program at the University of Lazio, Italy. Two questions were addressed: What can preservice teachers learn from the analysis of videotaped lessons? How can preservice teachers' analysis ability,…

  16. The Chinese number naming system and its impact on the arithmetic performance of pre-schoolers in Hong Kong

    NASA Astrophysics Data System (ADS)

    Ng, Sharon Sui Ngan

    2012-06-01

    Asian children, including Chinese children, perform better than their English-speaking peers in cross-national mathematics studies. This superior Asian performance is attributed to several factors including cultural beliefs, educational systems and practices, and the Chinese number naming system. Given the limited empirical evidence on pre-school mathematics learning in Chinese societies, the outcomes of Western studies are often borrowed and adopted in curriculum planning in Asian schools. The study reported in this paper investigated the performance of Hong Kong Chinese pre-school children based on Western studies involving English-speaking children, and examined the relationship between the Chinese number naming system and children's performance in number and operation concepts. Data were collected from 299 pre-school children aged between 3 and 5 years. The learning sequence of the children in mastering number and operation concepts was established using the Rasch Model. This study provides empirical evidence for the feasibility of borrowing lists of mathematics concepts from studies of English-speaking children to serve as a reference for school-based curriculum planning in a Chinese-speaking context. However, it is not enough to establish the relationship between children's performance and the Chinese number naming system. Classroom instruction and cultural beliefs in mathematics learning are also important in explaining children's performance.

  17. Empirical projection-based basis-component decomposition method

    NASA Astrophysics Data System (ADS)

    Brendel, Bernhard; Roessl, Ewald; Schlomka, Jens-Peter; Proksa, Roland

    2009-02-01

    Advances in the development of semiconductor based, photon-counting x-ray detectors stimulate research in the domain of energy-resolving pre-clinical and clinical computed tomography (CT). For counting detectors acquiring x-ray attenuation in at least three different energy windows, an extended basis component decomposition can be performed in which in addition to the conventional approach of Alvarez and Macovski a third basis component is introduced, e.g., a gadolinium based CT contrast material. After the decomposition of the measured projection data into the basis component projections, conventional filtered-backprojection reconstruction is performed to obtain the basis-component images. In recent work, this basis component decomposition was obtained by maximizing the likelihood-function of the measurements. This procedure is time consuming and often unstable for excessively noisy data or low intrinsic energy resolution of the detector. Therefore, alternative procedures are of interest. Here, we introduce a generalization of the idea of empirical dual-energy processing published by Stenner et al. to multi-energy, photon-counting CT raw data. Instead of working in the image-domain, we use prior spectral knowledge about the acquisition system (tube spectra, bin sensitivities) to parameterize the line-integrals of the basis component decomposition directly in the projection domain. We compare this empirical approach with the maximum-likelihood (ML) approach considering image noise and image bias (artifacts) and see that only moderate noise increase is to be expected for small bias in the empirical approach. Given the drastic reduction of pre-processing time, the empirical approach is considered a viable alternative to the ML approach.

  18. A Comparison of Approaches for Setting Proficiency Standards.

    ERIC Educational Resources Information Center

    Koffler, Stephen L.

    This research compared the cut-off scores estimated from an empirical procedure (Contrasting group method) to those determined from a more theoretical process (Nedelsky method). A methodological and statistical framework was also provided for analysis of the data to obtain the most appropriate standard using the empirical procedure. Data were…

  19. Agent-Based Models in Empirical Social Research

    ERIC Educational Resources Information Center

    Bruch, Elizabeth; Atwell, Jon

    2015-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first…

  20. Reproducibility in Psychological Science: When Do Psychological Phenomena Exist?

    PubMed Central

    Iso-Ahola, Seppo E.

    2017-01-01

    Scientific evidence has recently been used to assert that certain psychological phenomena do not exist. Such claims, however, cannot be made because (1) scientific method itself is seriously limited (i.e., it can never prove a negative); (2) non-existence of phenomena would require a complete absence of both logical (theoretical) and empirical support; even if empirical support is weak, logical and theoretical support can be strong; (3) statistical data are only one piece of evidence and cannot be used to reduce psychological phenomena to statistical phenomena; and (4) psychological phenomena vary across time, situations and persons. The human mind is unreproducible from one situation to another. Psychological phenomena are not particles that can decisively be tested and discovered. Therefore, a declaration that a phenomenon is not real is not only theoretically and empirically unjustified but runs counter to the propositional and provisional nature of scientific knowledge. There are only “temporary winners” and no “final truths” in scientific knowledge. Psychology is a science of subtleties in human affect, cognition and behavior. Its phenomena fluctuate with conditions and may sometimes be difficult to detect and reproduce empirically. When strictly applied, reproducibility is an overstated and even questionable concept in psychological science. Furthermore, statistical measures (e.g., effect size) are poor indicators of the theoretical importance and relevance of phenomena (cf. “deliberate practice” vs. “talent” in expert performance), not to mention whether phenomena are real or unreal. To better understand psychological phenomena, their theoretical and empirical properties should be examined via multiple parameters and criteria. Ten such parameters are suggested. PMID:28626435

  1. Determination of errors in derived magnetic field directions in geosynchronous orbit: results from a statistical approach

    NASA Astrophysics Data System (ADS)

    Chen, Yue; Cunningham, Gregory; Henderson, Michael

    2016-09-01

    This study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Second, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ~2°, than those from the three empirical models with averaged errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. This study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.
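
    The error metric at the heart of this comparison is simply the angle between two field-direction vectors. A minimal sketch (not from the paper; the function name and data are illustrative) of how such an angular error could be computed:

      # Minimal sketch: angle (degrees) between a derived and a modeled local
      # magnetic field direction, each given as a 3-D vector.
      import numpy as np

      def angular_error_deg(b_derived, b_model):
          u = np.asarray(b_derived, dtype=float)
          v = np.asarray(b_model, dtype=float)
          cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
          return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

      # Example: two nearly aligned directions differ by about 2 degrees.
      print(angular_error_deg([0.0, 0.0, 1.0], [0.0, 0.035, 0.999]))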

  2. Determination of errors in derived magnetic field directions in geosynchronous orbit: results from a statistical approach

    DOE PAGES

    Chen, Yue; Cunningham, Gregory; Henderson, Michael

    2016-09-21

    Our study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Furthermore, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ~2°, than those from the three empirical models with averaged errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. Overall, this study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.

  3. Determination of errors in derived magnetic field directions in geosynchronous orbit: results from a statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yue; Cunningham, Gregory; Henderson, Michael

    Our study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Furthermore, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ~2°, than those from the three empirical models with averaged errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. Overall, this study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.

  4. Ancient DNA Analysis Suggests Negligible Impact of the Wari Empire Expansion in Peru’s Central Coast during the Middle Horizon

    PubMed Central

    Barreto Romero, María Inés; Flores Espinoza, Isabel; Cooper, Alan; Fehren-Schmitz, Lars

    2016-01-01

    The analysis of ancient human DNA from South America allows the exploration of pre-Columbian population history through time and to directly test hypotheses about cultural and demographic evolution. The Middle Horizon (650–1100 AD) represents a major transitional period in the Central Andes, which is associated with the development and expansion of ancient Andean empires such as Wari and Tiwanaku. These empires facilitated a series of interregional interactions and socio-political changes, which likely played an important role in shaping the region’s demographic and cultural profiles. We analyzed individuals from three successive pre-Columbian cultures present at the Huaca Pucllana archaeological site in Lima, Peru: Lima (Early Intermediate Period, 500–700 AD), Wari (Middle Horizon, 800–1000 AD) and Ychsma (Late Intermediate Period, 1000–1450 AD). We sequenced 34 complete mitochondrial genomes to investigate the potential genetic impact of the Wari Empire in the Central Coast of Peru. The results indicate that genetic diversity shifted only slightly through time, ruling out a complete population discontinuity or replacement driven by the Wari imperialist hegemony, at least in the region around present-day Lima. However, we caution that the very subtle genetic contribution of Wari imperialism at the particular Huaca Pucllana archaeological site might not be representative for the entire Wari territory in the Peruvian Central Coast. PMID:27248693

  5. Ancient DNA Analysis Suggests Negligible Impact of the Wari Empire Expansion in Peru's Central Coast during the Middle Horizon.

    PubMed

    Valverde, Guido; Barreto Romero, María Inés; Flores Espinoza, Isabel; Cooper, Alan; Fehren-Schmitz, Lars; Llamas, Bastien; Haak, Wolfgang

    2016-01-01

    The analysis of ancient human DNA from South America allows the exploration of pre-Columbian population history through time and to directly test hypotheses about cultural and demographic evolution. The Middle Horizon (650-1100 AD) represents a major transitional period in the Central Andes, which is associated with the development and expansion of ancient Andean empires such as Wari and Tiwanaku. These empires facilitated a series of interregional interactions and socio-political changes, which likely played an important role in shaping the region's demographic and cultural profiles. We analyzed individuals from three successive pre-Columbian cultures present at the Huaca Pucllana archaeological site in Lima, Peru: Lima (Early Intermediate Period, 500-700 AD), Wari (Middle Horizon, 800-1000 AD) and Ychsma (Late Intermediate Period, 1000-1450 AD). We sequenced 34 complete mitochondrial genomes to investigate the potential genetic impact of the Wari Empire in the Central Coast of Peru. The results indicate that genetic diversity shifted only slightly through time, ruling out a complete population discontinuity or replacement driven by the Wari imperialist hegemony, at least in the region around present-day Lima. However, we caution that the very subtle genetic contribution of Wari imperialism at the particular Huaca Pucllana archaeological site might not be representative for the entire Wari territory in the Peruvian Central Coast.

  6. A coupled hydrological-hydraulic flood inundation model calibrated using post-event measurements and integrated uncertainty analysis in a poorly gauged Mediterranean basin

    NASA Astrophysics Data System (ADS)

    Hdeib, Rouya; Abdallah, Chadi; Moussa, Roger; Colin, Francois

    2017-04-01

    Developing flood inundation maps of defined exceedance probabilities is required to provide information on the flood hazard and the associated risk. A methodology has been developed to model flood inundation in poorly gauged basins, where reliable information on the hydrological characteristics of floods is uncertain and only partially captured by traditional rain-gauge networks. Flood inundation modelling is performed by coupling a hydrological rainfall-runoff (RR) model (HEC-HMS) with a hydraulic model (HEC-RAS). The RR model is calibrated against the January 2013 flood event in the Awali River basin, Lebanon (300 km2), whose flood peak discharge was estimated by post-event measurements. The resulting flows of the RR model are defined as boundary conditions of the hydraulic model, which is run to generate the corresponding water surface profiles and calibrated against 20 post-event surveyed cross sections after the January-2013 flood event. An uncertainty analysis is performed to assess the results of the models. Consequently, the coupled flood inundation model is simulated with design storms, and flood inundation maps of defined exceedance probabilities are generated. The peak discharges estimated by the simulated RR model were in close agreement with the results from different empirical and statistical methods. This methodology can be extended to other poorly gauged basins facing common stage-gauge failure or characterized by floods with a stage exceeding the gauge measurement level, or higher than that defined by the rating curve.

  7. Understanding Statistical Variation: A Response to Sharma

    ERIC Educational Resources Information Center

    Farmer, Jim

    2008-01-01

    In this article, the author responds to the paper "Exploring pre-service teachers' understanding of statistical variation: Implications for teaching and research" by Sashi Sharma (see EJ779107). In that paper, Sharma described a study "designed to investigate pre-service teachers' acknowledgment of variation in sampling and distribution…

  8. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons.

    PubMed

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D

    In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
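
    The Monte Carlo option (method 1 above) can be illustrated with a short, generic sketch. This is not the vxdbel implementation; the pooling-and-permutation null used here is only one simple way to simulate a distribution-free reference distribution for an arbitrary K-sample statistic.

      # Hedged sketch: Monte Carlo p-value for a generic K-sample test statistic
      # under a pooled-permutation null. Illustrative only.
      import numpy as np

      def mc_pvalue(samples, statistic, n_sim=2000, rng=None):
          """samples: list of 1-D arrays (K groups); statistic: callable on such a list."""
          rng = np.random.default_rng(rng)
          observed = statistic(samples)
          pooled = np.concatenate(samples)
          sizes = [len(s) for s in samples]
          exceed = 0
          for _ in range(n_sim):
              perm = rng.permutation(pooled)
              groups, start = [], 0
              for n in sizes:
                  groups.append(perm[start:start + n])
                  start += n
              if statistic(groups) >= observed:
                  exceed += 1
          return (exceed + 1) / (n_sim + 1)   # add-one correction keeps p > 0

      # Example: absolute difference of group means as the statistic for K = 2.
      stat = lambda gs: abs(gs[0].mean() - gs[1].mean())
      rng = np.random.default_rng(0)
      print(mc_pvalue([rng.normal(0, 1, 30), rng.normal(0.8, 1, 30)], stat, rng=1))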

  9. Nursing students' attitudes toward statistics: Effect of a biostatistics course and association with examination performance.

    PubMed

    Kiekkas, Panagiotis; Panagiotarou, Aliki; Malja, Alvaro; Tahirai, Daniela; Zykai, Rountina; Bakalis, Nick; Stefanopoulos, Nikolaos

    2015-12-01

    Although statistical knowledge and skills are necessary for promoting evidence-based practice, health sciences students have expressed anxiety about statistics courses, which may hinder their learning of statistical concepts. To evaluate the effects of a biostatistics course on nursing students' attitudes toward statistics and to explore the association between these attitudes and their performance in the course examination. One-group quasi-experimental pre-test/post-test design. Undergraduate nursing students of the fifth or higher semester of studies, who attended a biostatistics course. Participants were asked to complete the pre-test and post-test forms of The Survey of Attitudes Toward Statistics (SATS)-36 scale at the beginning and end of the course respectively. Pre-test and post-test scale scores were compared, while correlations between post-test scores and participants' examination performance were estimated. Among 156 participants, post-test scores of the overall SATS-36 scale and of the Affect, Cognitive Competence, Interest and Effort components were significantly higher than pre-test ones, indicating that the course was followed by more positive attitudes toward statistics. Among 104 students who participated in the examination, higher post-test scores of the overall SATS-36 scale and of the Affect, Difficulty, Interest and Effort components were significantly but weakly correlated with higher examination performance. Students' attitudes toward statistics can be improved through appropriate biostatistics courses, while positive attitudes contribute to higher course achievements and possibly to improved statistical skills in later professional life. Copyright © 2015 Elsevier Ltd. All rights reserved.
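
    The pre/post comparison and the attitude-performance association described above amount to a paired t-test and a Pearson correlation. A small illustrative sketch with synthetic (not study) data:

      # Illustrative sketch only: synthetic stand-ins for pre/post SATS-36 scores
      # and examination marks; numbers do not come from the study.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      pre = rng.normal(4.0, 0.8, 156)            # hypothetical pre-test scores
      post = pre + rng.normal(0.4, 0.5, 156)     # hypothetical post-test scores

      t, p = stats.ttest_rel(post, pre)          # paired pre/post comparison
      print(f"paired t = {t:.2f}, p = {p:.4f}")

      exam = 50 + 8 * post[:104] + rng.normal(0, 10, 104)   # hypothetical exam marks
      r, p_r = stats.pearsonr(post[:104], exam)
      print(f"r = {r:.2f}, p = {p_r:.4f}")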

  10. Empiric validation of a process for behavior change.

    PubMed

    Elliot, Diane L; Goldberg, Linn; MacKinnon, David P; Ranby, Krista W; Kuehl, Kerry S; Moe, Esther L

    2016-09-01

    Most behavior change trials focus on outcomes rather than deconstructing how those outcomes relate to programmatic theoretical underpinnings and intervention components. In this report, the process of change is compared for three evidence-based programs that shared theories, intervention elements and potential mediating variables. Each investigation was a randomized trial that assessed pre- and post-intervention variables using survey constructs with established reliability. Each also used mediation analyses to define relationships. The findings were combined using a pattern matching approach. Surprisingly, knowledge was a significant mediator in each program (a and b path effects [p<0.01]). Norms, perceived control abilities, and self-monitoring were confirmed in at least two studies (p<0.01 for each). Replication of findings across studies with a common design but varied populations provides a robust validation of the theory and processes of an effective intervention. Combined findings also demonstrate a means to substantiate process aspects and theoretical models to advance understanding of behavior change.

  11. Implementing Evidence-Based Practice: A Review of the Empirical Research Literature

    ERIC Educational Resources Information Center

    Gray, Mel; Joy, Elyssa; Plath, Debbie; Webb, Stephen A.

    2013-01-01

    The article reports on the findings of a review of empirical studies examining the implementation of evidence-based practice (EBP) in the human services. Eleven studies were located that defined EBP as a research-informed, clinical decision-making process and identified barriers and facilitators to EBP implementation. A thematic analysis of the…

  12. An Efficiency Assessment among Empirically Defined Labor Markets for Determining Pay for Teachers

    ERIC Educational Resources Information Center

    Tran, Henry; Young, I. Phillip

    2013-01-01

    Fundamental to updating a fixed-rate salary schedule for teachers is the reliance on a relevant labor market containing comparisons to other school districts--that is, object school districts, which can be chosen from a policy or empirical/efficiency perspective. As such, four relevant markets having roots in neoclassical economic…

  13. Test-Taking Engagement in PIAAC. OECD Education Working Papers, No. 133

    ERIC Educational Resources Information Center

    Goldhammer, Frank; Martens, Thomas; Christoph, Gabriela; Lüdtke, Oliver

    2016-01-01

    In this study, we investigated how empirical indicators of test-taking engagement can be defined, empirically validated, and used to describe group differences in the context of the Programme of International Assessment of Adult Competences (PIAAC). The approach was to distinguish between disengaged and engaged response behavior by means of…

  14. Qualitative Case Study Research as Empirical Inquiry

    ERIC Educational Resources Information Center

    Ellinger, Andrea D.; McWhorter, Rochell

    2016-01-01

    This article introduces the concept of qualitative case study research as empirical inquiry. It defines and distinguishes what a case study is, the purposes, intentions, and types of case studies. It then describes how to determine if a qualitative case study is the preferred approach for conducting research. It overviews the essential steps in…

  15. A comparison of likelihood ratio tests and Rao's score test for three separable covariance matrix structures.

    PubMed

    Filipiak, Katarzyna; Klein, Daniel; Roy, Anuradha

    2017-01-01

    The problem of testing the separability of a covariance matrix against an unstructured variance-covariance matrix is studied in the context of multivariate repeated measures data using Rao's score test (RST). The RST statistic is developed with the first component of the separable structure as a first-order autoregressive (AR(1)) correlation matrix or an unstructured (UN) covariance matrix under the assumption of multivariate normality. It is shown that the distribution of the RST statistic under the null hypothesis of any separability does not depend on the true values of the mean or the unstructured components of the separable structure. A significant advantage of the RST is that it can be performed for small samples, even smaller than the dimension of the data, where the likelihood ratio test (LRT) cannot be used, and it outperforms the standard LRT in a number of contexts. Monte Carlo simulations are then used to study the comparative behavior of the null distribution of the RST statistic, as well as that of the LRT statistic, in terms of sample size considerations, and for the estimation of the empirical percentiles. Our findings are compared with existing results where the first component of the separable structure is a compound symmetry (CS) correlation matrix. It is also shown by simulations that the empirical null distribution of the RST statistic converges faster than the empirical null distribution of the LRT statistic to the limiting χ2 distribution. The tests are implemented on a real dataset from medical studies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
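
    The comparison of an empirical null distribution with its limiting χ2 distribution can be sketched generically. The statistic below is a simple likelihood-ratio-type statistic for a known-variance mean test, not the RST for separable covariance structures, so it only illustrates the simulation idea of estimating empirical percentiles and comparing them to the asymptotic reference.

      # Hedged sketch: empirical null percentiles of a -2 log LR statistic versus
      # the limiting chi-square percentiles (toy model, arbitrary sample size).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      df = 3                                   # assumed degrees of freedom
      n, n_sim = 25, 5000
      null_stats = np.empty(n_sim)
      for i in range(n_sim):
          x = rng.normal(size=(n, df))
          # -2 log LR for H0: mean = 0 vs free mean, known unit variance
          null_stats[i] = n * np.sum(x.mean(axis=0) ** 2)

      emp_q = np.percentile(null_stats, [90, 95, 99])
      chi_q = stats.chi2.ppf([0.90, 0.95, 0.99], df)
      print("empirical percentiles:", np.round(emp_q, 2))
      print("chi-square percentiles:", np.round(chi_q, 2))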

  16. A High Resolution Genome-Wide Scan for Significant Selective Sweeps: An Application to Pooled Sequence Data in Laying Chickens

    PubMed Central

    Qanbari, Saber; Strom, Tim M.; Haberer, Georg; Weigend, Steffen; Gheyas, Almas A.; Turner, Frances; Burt, David W.; Preisinger, Rudolf; Gianola, Daniel; Simianer, Henner

    2012-01-01

    In most studies aimed at localizing footprints of past selection, outliers at tails of the empirical distribution of a given test statistic are assumed to reflect locus-specific selective forces. Significance cutoffs are subjectively determined, rather than being related to a clear set of hypotheses. Here, we define an empirical p-value for the summary statistic by means of a permutation method that uses the observed SNP structure in the real data. To illustrate the methodology, we applied our approach to a panel of 2.9 million autosomal SNPs identified from re-sequencing a pool of 15 individuals from a brown egg layer line. We scanned the genome for local reductions in heterozygosity, suggestive of selective sweeps. We also employed a modified sliding window approach that accounts for gaps in the sequence and increases scanning resolution by moving the overlapping windows by steps of one SNP only, and suggest to call this a “creeping window” strategy. The approach confirmed selective sweeps in the region of previously described candidate genes, i.e. TSHR, PRL, PRLHR, INSR, LEPR, IGF1, and NRAMP1 when used as positive controls. The genome scan revealed 82 distinct regions with strong evidence of selection (genome-wide p-value<0.001), including genes known to be associated with eggshell structure and immune system such as CALB1 and GAL cluster, respectively. A substantial proportion of signals was found in poor gene content regions including the most extreme signal on chromosome 1. The observation of multiple signals in a highly selected layer line of chicken is consistent with the hypothesis that egg production is a complex trait controlled by many genes. PMID:23209582
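
    A minimal sketch of the "creeping window" idea, under simplifying assumptions (a fixed window of 40 SNPs, a permutation null obtained by shuffling SNP order, and a focus on the single lowest-heterozygosity window); the published analysis is considerably more elaborate.

      # Hedged sketch: one-SNP-step window means of heterozygosity with an
      # empirical p-value from permuting SNP positions. Data are simulated.
      import numpy as np

      def creeping_window_scan(het, window=40, n_perm=1000, rng=None):
          """het: 1-D array of per-SNP heterozygosity along a chromosome."""
          rng = np.random.default_rng(rng)
          kernel = np.ones(window) / window
          obs = np.convolve(het, kernel, mode="valid")      # window means, step = 1 SNP
          null_min = np.empty(n_perm)
          for i in range(n_perm):
              null_min[i] = np.convolve(rng.permutation(het), kernel, mode="valid").min()
          # empirical p-value for the lowest-heterozygosity window (candidate sweep)
          p = (np.sum(null_min <= obs.min()) + 1) / (n_perm + 1)
          return obs, p

      rng = np.random.default_rng(1)
      het = rng.uniform(0.2, 0.5, 2000)
      het[800:860] *= 0.2          # simulated sweep: local loss of heterozygosity
      windows, p = creeping_window_scan(het)
      print(f"lowest window mean = {windows.min():.3f}, empirical p = {p:.4f}")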

  17. An empirical strategy to detect bacterial transcript structure from directional RNA-seq transcriptome data.

    PubMed

    Wang, Yejun; MacKenzie, Keith D; White, Aaron P

    2015-05-07

    As sequencing costs are being lowered continuously, RNA-seq has gradually been adopted as the first choice for comparative transcriptome studies with bacteria. Unlike microarrays, RNA-seq can directly detect cDNA derived from mRNA transcripts at a single nucleotide resolution. Not only does this allow researchers to determine the absolute expression level of genes, but it also conveys information about transcript structure. Few automatic software tools have yet been established to investigate large-scale RNA-seq data for bacterial transcript structure analysis. In this study, 54 directional RNA-seq libraries from Salmonella serovar Typhimurium (S. Typhimurium) 14028s were examined for potential relationships between read mapping patterns and transcript structure. We developed an empirical method, combined with statistical tests, to automatically detect key transcript features, including transcriptional start sites (TSSs), transcriptional termination sites (TTSs) and operon organization. Using our method, we obtained 2,764 TSSs and 1,467 TTSs for 1331 and 844 different genes, respectively. Identification of TSSs facilitated further discrimination of 215 putative sigma 38 regulons and 863 potential sigma 70 regulons. Combining the TSSs and TTSs with intergenic distance and co-expression information, we comprehensively annotated the operon organization in S. Typhimurium 14028s. Our results show that directional RNA-seq can be used to detect transcriptional borders at an acceptable resolution of ±10-20 nucleotides. Technical limitations of the RNA-seq procedure may prevent single nucleotide resolution. The automatic transcript border detection methods, statistical models and operon organization pipeline that we have described could be widely applied to RNA-seq studies in other bacteria. Furthermore, the TSSs, TTSs, operons, promoters and untranslated regions that we have defined for S. Typhimurium 14028s may constitute valuable resources that can be used for comparative analyses with other Salmonella serotypes.
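
    One simple way to picture the border-detection idea is a coverage-jump heuristic for candidate TSS positions. The thresholds, window length, and data below are invented for illustration and are not the authors' statistical method.

      # Hedged sketch: call a putative TSS where strand-specific read coverage
      # rises sharply from background. Parameters are arbitrary.
      import numpy as np

      def call_tss(coverage, background=5, fold=4, min_high=10):
          """coverage: per-base read depth on one strand; returns candidate positions."""
          cov = np.asarray(coverage, dtype=float)
          candidates = []
          for i in range(1, len(cov)):
              upstream = cov[max(0, i - 20):i].mean()
              downstream = cov[i:i + 20].mean()
              if (upstream <= background and downstream >= min_high
                      and downstream >= fold * max(upstream, 1.0)):
                  candidates.append(i)
          return candidates

      # Example: a transcript starting near position 200 in a 1-kb window.
      cov = np.concatenate([np.random.default_rng(0).poisson(1, 200),
                            np.random.default_rng(1).poisson(40, 800)])
      print(call_tss(cov)[:3])   # positions near 200, within the ±10-20 nt resolution noted above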

  18. Stochastic empirical loading and dilution model (SELDM) version 1.0.0

    USGS Publications Warehouse

    Granato, Gregory E.

    2013-01-01

    The Stochastic Empirical Loading and Dilution Model (SELDM) is designed to transform complex scientific data into meaningful information about the risk of adverse effects of runoff on receiving waters, the potential need for mitigation measures, and the potential effectiveness of such management measures for reducing these risks. The U.S. Geological Survey developed SELDM in cooperation with the Federal Highway Administration to help develop planning-level estimates of event mean concentrations, flows, and loads in stormwater from a site of interest and from an upstream basin. Planning-level estimates are defined as the results of analyses used to evaluate alternative management measures; planning-level estimates are recognized to include substantial uncertainties (commonly orders of magnitude). SELDM uses information about a highway site, the associated receiving-water basin, precipitation events, stormflow, water quality, and the performance of mitigation measures to produce a stochastic population of runoff-quality variables. SELDM provides input statistics for precipitation, prestorm flow, runoff coefficients, and concentrations of selected water-quality constituents from National datasets. Input statistics may be selected on the basis of the latitude, longitude, and physical characteristics of the site of interest and the upstream basin. The user also may derive and input statistics for each variable that are specific to a given site of interest or a given area. SELDM is a stochastic model because it uses Monte Carlo methods to produce the random combinations of input variable values needed to generate the stochastic population of values for each component variable. SELDM calculates the dilution of runoff in the receiving waters and the resulting downstream event mean concentrations and annual average lake concentrations. Results are ranked, and plotting positions are calculated, to indicate the level of risk of adverse effects caused by runoff concentrations, flows, and loads on receiving waters by storm and by year. Unlike deterministic hydrologic models, SELDM is not calibrated by changing values of input variables to match a historical record of values. Instead, input values for SELDM are based on site characteristics and representative statistics for each hydrologic variable. Thus, SELDM is an empirical model based on data and statistics rather than theoretical physiochemical equations. SELDM is a lumped parameter model because the highway site, the upstream basin, and the lake basin each are represented as a single homogeneous unit. Each of these source areas is represented by average basin properties, and results from SELDM are calculated as point estimates for the site of interest. Use of the lumped parameter approach facilitates rapid specification of model parameters to develop planning-level estimates with available data. The approach allows for parsimony in the required inputs to and outputs from the model and flexibility in the use of the model. For example, SELDM can be used to model runoff from various land covers or land uses by using the highway-site definition as long as representative water quality and impervious-fraction data are available.
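
    SELDM itself is a published USGS model; the sketch below is only a toy illustration of the underlying idea of Monte Carlo mass-balance dilution, with lognormal inputs standing in for the national input statistics and all parameter values invented for the example.

      # Illustrative sketch only (not SELDM code): stochastic conservative mixing
      # of highway runoff with upstream flow to get downstream event mean
      # concentrations (EMCs) and an exceedance risk.
      import numpy as np

      rng = np.random.default_rng(7)
      n_events = 10_000

      q_runoff = rng.lognormal(mean=0.0, sigma=0.8, size=n_events)    # runoff flow (assumed units)
      q_stream = rng.lognormal(mean=1.5, sigma=0.6, size=n_events)    # upstream prestorm flow
      c_runoff = rng.lognormal(mean=3.0, sigma=0.7, size=n_events)    # runoff EMC (assumed ug/L)
      c_stream = rng.lognormal(mean=1.0, sigma=0.5, size=n_events)    # upstream concentration

      # downstream event mean concentration by conservative mass-balance mixing
      c_down = (q_runoff * c_runoff + q_stream * c_stream) / (q_runoff + q_stream)

      criterion = 30.0                                                # assumed water-quality criterion
      print(f"fraction of storms exceeding criterion: {np.mean(c_down > criterion):.3f}")
      print("90th-percentile downstream EMC:", round(np.percentile(c_down, 90), 1))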

  19. Statistical mechanics of neocortical interactions. Derivation of short-term-memory capacity

    NASA Astrophysics Data System (ADS)

    Ingber, Lester

    1984-06-01

    A theory developed by the author to describe macroscopic neocortical interactions demonstrates that empirical values of chemical and electrical parameters of synaptic interactions establish several minima of the path-integral Lagrangian as a function of excitatory and inhibitory columnar firings. The number of possible minima, their time scales of hysteresis and probable reverberations, and their nearest-neighbor columnar interactions are all consistent with well-established empirical rules of human short-term memory. Thus, aspects of conscious experience are derived from neuronal firing patterns, using modern methods of nonlinear nonequilibrium statistical mechanics to develop realistic explicit synaptic interactions.

  20. Causal diagrams for empirical legal research: a methodology for identifying causation, avoiding bias and interpreting results

    PubMed Central

    VanderWeele, Tyler J.; Staudt, Nancy

    2014-01-01

    In this paper we introduce methodology—causal directed acyclic graphs—that empirical researchers can use to identify causation, avoid bias, and interpret empirical results. This methodology has become popular in a number of disciplines, including statistics, biostatistics, epidemiology and computer science, but has yet to appear in the empirical legal literature. Accordingly we outline the rules and principles underlying this new methodology and then show how it can assist empirical researchers through both hypothetical and real-world examples found in the extant literature. While causal directed acyclic graphs are certainly not a panacea for all empirical problems, we show they have potential to make the most basic and fundamental tasks, such as selecting covariate controls, relatively easy and straightforward. PMID:25685055

  1. Identifying ideal brow vector position: empirical analysis of three brow archetypes.

    PubMed

    Hamamoto, Ashley A; Liu, Tiffany W; Wong, Brian J

    2013-02-01

    Surgical browlifts counteract the effects of aging, correct ptosis, and optimize forehead aesthetics. While surgeons have control over brow shape, the metrics defining ideal brow shape are subjective. This study aims to empirically determine whether three expert brow design strategies are aesthetically equivalent by using expert focus group analysis and relating these findings to brow surgery. Comprehensive literature search identified three dominant brow design methods (Westmore, Lamas and Anastasia) that are heavily cited, referenced or internationally recognized in either medical literature or by the lay media. Using their respective guidelines, brow shape was modified for 10 synthetic female faces, yielding 30 images. A focus group of 50 professional makeup artists ranked the three images for each of the 10 faces to generate ordinal attractiveness scores. The contemporary methods employed by Anastasia and Lamas produce a brow arch more lateral than Westmore's classic method. Although the more laterally located brow arch is considered the current trend in facial aesthetics, this style was not empirically supported. No single method was consistently rated most or least attractive by the focus group, and no significant difference in attractiveness score for the different methods was observed (p = 0.2454). Although each method of brow placement has been promoted as the "best" approach, no single brow design method achieved statistical significance in optimizing attractiveness. Each can be used effectively as a guide in designing eyebrow shape during browlift procedures, making it possible to use the three methods interchangeably.

  2. Quantifying interactions between real oscillators with information theory and phase models: application to cardiorespiratory coupling.

    PubMed

    Zhu, Yenan; Hsieh, Yee-Hsee; Dhingra, Rishi R; Dick, Thomas E; Jacono, Frank J; Galán, Roberto F

    2013-02-01

    Interactions between oscillators can be investigated with standard tools of time series analysis. However, these methods are insensitive to the directionality of the coupling, i.e., the asymmetry of the interactions. An elegant alternative was proposed by Rosenblum and collaborators [M. G. Rosenblum, L. Cimponeriu, A. Bezerianos, A. Patzak, and R. Mrowka, Phys. Rev. E 65, 041909 (2002); M. G. Rosenblum and A. S. Pikovsky, Phys. Rev. E 64, 045202 (2001)] which consists in fitting the empirical phases to a generic model of two weakly coupled phase oscillators. This allows one to obtain the interaction functions defining the coupling and its directionality. A limitation of this approach is that a solution always exists in the least-squares sense, even in the absence of coupling. To preclude spurious results, we propose a three-step protocol: (1) Determine if a statistical dependency exists in the data by evaluating the mutual information of the phases; (2) if so, compute the interaction functions of the oscillators; and (3) validate the empirical oscillator model by comparing the joint probability of the phases obtained from simulating the model with that of the empirical phases. We apply this protocol to a model of two coupled Stuart-Landau oscillators and show that it reliably detects genuine coupling. We also apply this protocol to investigate cardiorespiratory coupling in anesthetized rats. We observe reciprocal coupling between respiration and heartbeat and that the influence of respiration on the heartbeat is generally much stronger than vice versa. In addition, we find that the vagus nerve mediates coupling in both directions.
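
    Step (1) of the proposed protocol, testing for a statistical dependency between the phases, can be sketched with a simple binned (plug-in) mutual-information estimate; the bin count, noise model, and data below are arbitrary and are not the authors' implementation.

      # Hedged sketch: plug-in mutual information (nats) from a joint histogram
      # of two phase time series, as a crude dependency check.
      import numpy as np

      def phase_mutual_information(phi1, phi2, bins=16):
          h2d, _, _ = np.histogram2d(phi1 % (2 * np.pi), phi2 % (2 * np.pi),
                                     bins=bins, range=[[0, 2 * np.pi]] * 2)
          pxy = h2d / h2d.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      # Example: a "respiratory" phase weakly modulating a faster "cardiac" phase.
      rng = np.random.default_rng(3)
      t = np.arange(20000) * 0.01
      phi_resp = 2 * np.pi * 1.0 * t + 0.05 * rng.standard_normal(t.size).cumsum()
      phi_card = 2 * np.pi * 5.0 * t + 0.3 * np.sin(phi_resp) + 0.05 * rng.standard_normal(t.size).cumsum()
      print(round(phase_mutual_information(phi_resp, phi_card), 3))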

  3. Modern Empirical Statistical Spectral Analysis.

    DTIC Science & Technology

    1980-05-01

    716-723. Akaike, H. (1977). On entropy maximization principle, Applications of Statistics, P.R. Krishnaiah , ed., North-Holland, Amsterdam, 27-41...by P. Krishnaiah , North Holland: Amsterdam, 283-295. Parzen, E. (1979). Forecasting and whitening filter estimation, TIMS Studies in the Management

  4. New BVI {sub C} photometry of low-mass pleiades stars: Exploring the effects of rotation on broadband colors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamai, Brittany L.; Stassun, Keivan G.; Vrba, Frederick J.

    2014-08-01

    We present new BVI{sub C} photometry for 350 Pleiades proper motion members with 9 < V ≲ 17. Importantly, our new catalog includes a large number of K- and early M-type stars, roughly doubling the number of low-mass stars with well-calibrated Johnson/Cousins photometry in this benchmark cluster. We combine our new photometry with existing photometry from the literature to define a purely empirical isochrone at Pleiades age (≈100 Myr) extending from V = 9 to 17. We use the empirical isochrone to identify 48 new probable binaries and 14 likely nonmembers. The photometrically identified single stars are compared against their expected positions in the color-magnitude diagram (CMD). At 100 Myr, the mid K and early M stars are predicted to lie above the zero-age main sequence (ZAMS) having not yet reached the ZAMS. We find in the B – V versus V CMD that mid K and early M dwarfs are instead displaced below (or blueward of) the ZAMS. Using the stars' previously reported rotation periods, we find a highly statistically significant correlation between rotation period and CMD displacement, in the sense that the more rapidly rotating stars have the largest displacements in the B – V CMD.

  5. Net Reclassification Indices for Evaluating Risk-Prediction Instruments: A Critical Review

    PubMed Central

    Kerr, Kathleen F.; Wang, Zheyu; Janes, Holly; McClelland, Robyn L.; Psaty, Bruce M.; Pepe, Margaret S.

    2014-01-01

    Net reclassification indices have recently become popular statistics for measuring the prediction increment of new biomarkers. We review the various types of net reclassification indices and their correct interpretations. We evaluate the advantages and disadvantages of quantifying the prediction increment with these indices. For pre-defined risk categories, we relate net reclassification indices to existing measures of the prediction increment. We also consider statistical methodology for constructing confidence intervals for net reclassification indices and evaluate the merits of hypothesis testing based on such indices. We recommend that investigators using net reclassification indices should report them separately for events (cases) and nonevents (controls). When there are two risk categories, the components of net reclassification indices are the same as the changes in the true-positive and false-positive rates. We advocate use of true- and false-positive rates and suggest it is more useful for investigators to retain the existing, descriptive terms. When there are three or more risk categories, we recommend against net reclassification indices because they do not adequately account for clinically important differences in shifts among risk categories. The category-free net reclassification index is a new descriptive device designed to avoid pre-defined risk categories. However, it suffers from many of the same problems as other measures such as the area under the receiver operating characteristic curve. In addition, the category-free index can mislead investigators by overstating the incremental value of a biomarker, even in independent validation data. When investigators want to test a null hypothesis of no prediction increment, the well-established tests for coefficients in the regression model are superior to the net reclassification index. If investigators want to use net reclassification indices, confidence intervals should be calculated using bootstrap methods rather than published variance formulas. The preferred single-number summary of the prediction increment is the improvement in net benefit. PMID:24240655
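
    For the two-category case mentioned above, the event and nonevent components reduce to changes in the true- and false-positive rates. A short sketch with hypothetical risks and outcomes (the function name, threshold, and data are illustrative):

      # Minimal sketch: event and nonevent net reclassification indices for a
      # single risk threshold, equivalent to changes in TPR and FPR.
      import numpy as np

      def two_category_nri(risk_old, risk_new, outcome, threshold=0.1):
          old_pos = np.asarray(risk_old) >= threshold
          new_pos = np.asarray(risk_new) >= threshold
          events = np.asarray(outcome) == 1
          # events: net proportion moved up = change in true-positive rate
          nri_events = new_pos[events].mean() - old_pos[events].mean()
          # nonevents: net proportion moved down = minus the change in false-positive rate
          nri_nonevents = old_pos[~events].mean() - new_pos[~events].mean()
          return nri_events, nri_nonevents

      rng = np.random.default_rng(11)
      y = rng.binomial(1, 0.2, 500)
      risk_old = np.clip(0.15 * y + rng.normal(0.1, 0.05, 500), 0, 1)
      risk_new = np.clip(risk_old + 0.05 * y + rng.normal(0, 0.02, 500), 0, 1)
      print(two_category_nri(risk_old, risk_new, y))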

  6. Evaluation of a suicide prevention training curriculum for substance abuse treatment providers based on Treatment Improvement Protocol Number 50 (TIP 50)

    PubMed Central

    Conner, Kenneth R.; Wood, Jane; Pisani, Anthony R.; Kemp, Janet

    2013-01-01

    Substance use disorders (SUD) confer risk for suicide yet there are no empirically supported suicide prevention training curricula tailored to SUD treatment providers. We assessed the efficacy of a 2-hour training that featured a suicide prevention training video produced by the Department of Veterans Affairs (VA). The video was based on Treatment Improvement Protocol Number 50, TIP 50, a practical manual to manage suicide risk produced by the Substance Abuse and Mental Health Services Administration (SAMHSA). The training was provided in small groups to 273 SUD treatment providers in 18 states. Results were evaluated using self-report assessments obtained at pre-test, post-test, and 2-month follow-up. Statistically significant changes (p<.001) within subjects were obtained on self-efficacy, knowledge, and frequency of suicide prevention practice behaviors. The positive results together with the brevity of the training and its ease of implementation indicate high potential for widespread adoption and the importance of further study. PMID:22417671

  7. Efficiency and cross-correlation in equity market during global financial crisis: Evidence from China

    NASA Astrophysics Data System (ADS)

    Ma, Pengcheng; Li, Daye; Li, Shuo

    2016-02-01

    Using one minute high-frequency data of the Shanghai Composite Index (SHCI) and the Shenzhen Composite Index (SZCI) (2007-2008), we employ the detrended fluctuation analysis (DFA) and the detrended cross correlation analysis (DCCA) with a rolling window approach to observe the evolution of market efficiency and cross-correlation in the pre-crisis and crisis periods. Considering the fat-tail distribution of the return time series, a statistical test based on a shuffling method is conducted to verify the null hypothesis of no long-term dependence. Our empirical research yields three main findings. First, Shanghai equity market efficiency deteriorated while Shenzhen equity market efficiency improved with the advent of the financial crisis. Second, the highly positive dependence between SHCI and SZCI varies with time scale. Third, the financial crisis saw a significant increase of dependence between SHCI and SZCI at shorter time scales but a lack of significant change at longer time scales, providing evidence of contagion and absence of interdependence during the crisis.
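
    A generic DFA sketch (not the rolling-window DFA/DCCA pipeline of the paper, and with arbitrary scales and synthetic returns) showing how the scaling exponent underlying the efficiency measure can be estimated:

      # Hedged sketch: detrended fluctuation analysis on a return series; an
      # exponent near 0.5 indicates weak long-term dependence.
      import numpy as np

      def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
          profile = np.cumsum(x - np.mean(x))          # integrated (profile) series
          flucts = []
          for s in scales:
              n_seg = len(profile) // s
              rms = []
              for i in range(n_seg):
                  seg = profile[i * s:(i + 1) * s]
                  t = np.arange(s)
                  coef = np.polyfit(t, seg, 1)          # local linear detrending
                  rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
              flucts.append(np.mean(rms))
          # slope of log F(s) versus log s is the DFA exponent alpha
          return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

      rng = np.random.default_rng(5)
      returns = rng.standard_normal(5000)              # uncorrelated returns: alpha ~ 0.5
      print(round(dfa_exponent(returns), 2))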

  8. Prevalence and associated factors of low bone mass in adults with systemic lupus erythematosus.

    PubMed

    Cramarossa, G; Urowitz, M B; Su, J; Gladman, D; Touma, Z

    2017-04-01

    Background: Systemic lupus erythematosus (SLE) patients are often treated with glucocorticoids, which place them at risk of bone loss. Objectives: The objectives of this article are to determine: (1) the prevalence of low bone mineral density (BMD) and factors associated with low BMD and (2) the prevalence of symptomatic fragility fractures in inception patients of the Toronto Lupus Cohort (TLC). Methods: Prospectively collected data from the TLC (1996-2015) of inception patients' first BMD were analyzed. For pre-menopausal women/males <50 years, BMD 'below expected range for age' was defined by Z-score ≤ -2.0 SD. For post-menopausal women/males age 50 or older, osteoporosis was defined by T-score ≤ -2.5 SD and low bone mass by T-score between -1.0 and -2.5 SD. Patients' BMDs were defined as abnormal if Z-score ≤ -2.0 or T-score < -1.0 SD, and the remainder as normal. Descriptive analysis and logistic regression were employed. Results: Of 1807 patients, 286 are inception patients with BMD results (mean age 37.9 ± 13.7 years); 88.8% are female. The overall prevalence of abnormal BMD is 31.5%. In pre-menopausal women (n = 173), the prevalence of BMD below expected range is 17.3%. In post-menopausal women (n = 81), the prevalence of osteoporosis and low BMD are 12.3% and 43.2%, respectively. Age and cumulative dose of glucocorticoids are statistically significantly associated with abnormal BMD in multivariate analysis. Of 769 inception patients from the TLC, 11.1% experienced symptomatic fragility fractures (peripheral and vertebral) over the course of their disease. Conclusion: The prevalence of low BMD is high in SLE patients, and is associated with older age and higher cumulative glucocorticoid dose.
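
    The classification rule in the Methods can be written down directly; the sketch below encodes the stated Z- and T-score thresholds, with the example inputs being hypothetical:

      # Simple sketch of the study's abnormal/normal BMD classification rule.
      def bmd_category(z_score=None, t_score=None, age=None, post_menopausal=False):
          """Return 'abnormal' or 'normal' BMD per the definitions in the abstract."""
          if post_menopausal or (age is not None and age >= 50):
              # T-score <= -2.5: osteoporosis; -2.5 < T < -1.0: low bone mass
              return "abnormal" if t_score < -1.0 else "normal"
          # pre-menopausal women / males < 50: Z-score <= -2.0 is below expected range
          return "abnormal" if z_score <= -2.0 else "normal"

      print(bmd_category(z_score=-2.3, age=38))                 # abnormal (below expected range)
      print(bmd_category(t_score=-1.8, post_menopausal=True))   # abnormal (low bone mass)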

  9. The Epistemological Perceptions of the Relationship between Physics and Mathematics and Its Effect on Problem-Solving among Pre-Service Teachers at Yarmouk University in Jordan

    ERIC Educational Resources Information Center

    Al-Omari, Wesal; Miqdadi, Ruba

    2014-01-01

    The purpose of this paper was to examine the perception pre-service teachers hold to the nature of the relationship between physics and mathematics. The study examined this relationship in reference to their performance in problem solving and strategies they used. The results of this empirical study suggested that most participants hold a naïve…

  10. Subject-Specific Correctness of Students' Conceptions and Factors of Influence: Empirical Findings from a Quantitative Study with Grade 7 Students in Germany Regarding the Formation and Location of Deserts

    ERIC Educational Resources Information Center

    Schubert, Jan Christoph; Wrenger, Katja

    2016-01-01

    Students' conceptions are a central learning condition. Until now there have only been qualitative results regarding the important geographical area of the desert, especially its location and formation. Therefore this study surveys students' conceptions (N = 585; n = 448 without pre-instruction on deserts and n = 137 with pre-instruction on…

  11. Assessment of structured physical examination skills training using a retro-pre-questionnaire.

    PubMed

    Piryani, Rano Mal; Shankar, P Ravi; Piryani, Suneel; Thapa, Trilok Pati; Karki, Balmansingh; Khakurel, Mahesh Prasad; Bhandary, Shital

    2013-01-01

    The effectiveness of physical examination skills (PES) training is very rarely assessed using the "post-then-pre" approach. In this study, a retro-pre-questionnaire was used to study the effect of structured physical examination skills training (SPEST) imparted to second-year undergraduate medical students. KIST Medical College (KISTMC), affiliated with Tribhuvan University, Nepal, admitted its first batch of MBBS students in November 2008. The university curriculum recommends the involvement of the Medicine and Surgery Departments in PES training, but the methods for teaching and assessment are not well defined. KISTMC has made training more structured and involved the Medicine, Surgery, Gynaecology and Obstetrics, Orthopaedics, ENT, Ophthalmology, Paediatrics, and Family Medicine Departments. SPEST includes the teaching/learning of basic PES for 210 minutes once a week for 28 weeks. Self-assessment is done by using a retro-pre-questionnaire at the end of the last session of training, and these data are analysed using SPSS. Out of 100 students, 98 participated in the objective structured clinical examination (OSCE); 82 completed the retro-pre-questionnaire. Forty-six skills representing various systems were selected for inclusion in the retro-pre-questionnaire from among the many skills taught in different departments. The average perceived skills score (maximum score, 46×4=184) before training was 15.9 and increased to 116.5 after training. The increase was statistically significant upon the application of a paired t-test. The students perceived that their level of skills improved after the training. The retro-pre instrument seems to be useful for assessing the learners' self-reported changes in PES after training if a large number of skills need to be assessed. However, it should be noted that although a retro-pre-questionnaire may reveal valuable information, it is not a substitute for an objective measure or gold standard.

  12. Parricide: An Empirical Analysis of 24 Years of U.S. Data

    ERIC Educational Resources Information Center

    Heide, Kathleen M.; Petee, Thomas A.

    2007-01-01

    Empirical analysis of homicides in which children have killed parents has been limited. The most comprehensive statistical analysis involving parents as victims was undertaken by Heide and used Supplementary Homicide Report (SHR) data for the 10-year period 1977 to 1986. This article provides an updated examination of characteristics of victims,…

  13. Evaluation of Nursing Documentation Completion of Stroke Patients in the Emergency Department: A Pre-Post Analysis Using Flowsheet Templates and Clinical Decision Support.

    PubMed

    Richardson, Karen J; Sengstack, Patricia; Doucette, Jeffrey N; Hammond, William E; Schertz, Matthew; Thompson, Julie; Johnson, Constance

    2016-02-01

    The primary aim of this performance improvement project was to determine whether the electronic health record implementation of stroke-specific nursing documentation flowsheet templates and clinical decision support alerts improved the nursing documentation of eligible stroke patients in seven stroke-certified emergency departments. Two system enhancements were introduced into the electronic record in an effort to improve nursing documentation: disease-specific documentation flowsheets and clinical decision support alerts. Using a pre-post design, project measures included six stroke management goals as defined by the National Institute of Neurological Disorders and Stroke and three clinical decision support measures based on entry of orders used to trigger documentation reminders for nursing: (1) the National Institutes of Health's Stroke Scale, (2) neurological checks, and (3) dysphagia screening. Data were reviewed 6 months prior (n = 2293) and 6 months following the intervention (n = 2588). Fisher exact test was used for statistical analysis. Statistical significance was found for documentation of five of the six stroke management goals, although effect sizes were small. Customizing flowsheets to meet the needs of nursing workflow showed improvement in the completion of documentation. The effects of the decision support alerts on the completeness of nursing documentation were not statistically significant (likely due to lack of order entry). For example, an order for the National Institutes of Health Stroke Scale was entered only 10.7% of the time, which meant no alert would fire for nursing in the postintervention group. Future work should focus on decision support alerts that trigger reminders for clinicians to place relevant orders for this population.
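
    The pre-post comparison reported here is a Fisher exact test on documentation-completion counts; the sketch below uses the stated sample sizes but hypothetical completion splits, so the numbers are illustrative only:

      # Illustrative sketch: Fisher exact test comparing documentation completion
      # before and after the intervention (counts are hypothetical).
      from scipy.stats import fisher_exact

      # rows: pre, post; columns: documented, not documented
      table = [[1520, 773],    # pre-intervention, n = 2293
               [1858, 730]]    # post-intervention, n = 2588
      odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
      print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4g}")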

  14. 78 FR 57822 - Lease and Interchange of Vehicles; Motor Carriers of Passengers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-20

    ... evaluation develops a threshold analysis. There are no statistical or empirical studies that directly link... divided by the value of a statistical life (VSL) of $9.1 million results in 5.8 lives prevented over ten...

  15. Risk Factors for Sexual Violence in the Military: An Analysis of Sexual Assault and Sexual Harassment Incidents and Reporting

    DTIC Science & Technology

    2017-03-01

    LIST OF TABLES: Table 1. Descriptive Statistics for Control Variables by... Statistics for Control Variables by Gender (Random Subsample with Complete Survey)... Table... empirical analysis. Chapter IV describes the summary statistics and results. Finally, Chapter V offers concluding thoughts, study limitations, and

  16. Statistical power of intervention analyses: simulation and empirical application to treated lumber prices

    Treesearch

    Jeffrey P. Prestemon

    2009-01-01

    Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...
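
    The power-simulation idea can be sketched in a few lines: simulate a series with a known intervention, apply a detection test, and count rejections. The AR(1) model, the step shock, and the naive two-sample test below are illustrative stand-ins, not the intervention models used in the article.

      # Hedged sketch: estimated power of a crude intervention test on a simulated
      # AR(1) series with a known step shock at mid-sample.
      import numpy as np
      from scipy import stats

      def simulate_power(shock=0.5, n=200, phi=0.6, n_sim=500, alpha=0.05, rng=None):
          rng = np.random.default_rng(rng)
          rejections = 0
          for _ in range(n_sim):
              e = rng.standard_normal(n)
              y = np.zeros(n)
              for t in range(1, n):
                  y[t] = phi * y[t - 1] + e[t]
              y[n // 2:] += shock                        # step intervention at mid-sample
              # crude detection: two-sample t-test on pre vs post segments
              t_stat, p = stats.ttest_ind(y[:n // 2], y[n // 2:], equal_var=False)
              rejections += (p < alpha)
          return rejections / n_sim

      print(f"estimated power: {simulate_power(rng=0):.2f}")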

  17. Differential Item Functioning Detection across Two Methods of Defining Group Comparisons: Pairwise and Composite Group Comparisons

    ERIC Educational Resources Information Center

    Sari, Halil Ibrahim; Huggins, Anne Corinne

    2015-01-01

    This study compares two methods of defining groups for the detection of differential item functioning (DIF): (a) pairwise comparisons and (b) composite group comparisons. We aim to emphasize and empirically support the notion that the choice of pairwise versus composite group definitions in DIF is a reflection of how one defines fairness in DIF…

  18. Good epidemiology, good ethics: empirical and ethical dimensions of global public health.

    PubMed

    Rentmeester, Christy A; Dasgupta, Rajib

    2012-01-01

    This paper examines the following ethically and epidemiologically relevant challenges, as yet neglected in public health ethics: how to distribute resources and health risks and benefits, how to define evidentiary criteria that justify public health interventions, and how to define terms in which programme goals, successes, and failures will be assessed and monitored. We illuminate critical intersections of empirical and ethical dimensions of public health work, drawing upon three global public health interventions (inclusion of the Hepatitis B vaccine in the Universal Immunisation Programme, Universal Salt Iodisation, and the Global Polio Eradication Initiative), and suggest strategies for addressing and responding to them.

  19. The place for itraconazole in treatment.

    PubMed

    Maertens, Johan; Boogaerts, Marc

    2005-09-01

    The incidence of systemic fungal infections has risen sharply in the last two decades, reflecting a rise in the number of patients who are predisposed to these diseases because they are immunosuppressed or immunocompromised. The growing use of intensive chemotherapy to treat cancer, highly immunosuppressive drug regimens (not only in transplant recipients), widespread prophylactic or empirical broad-spectrum antibiotics, prolonged parenteral nutrition, long-term indwelling lines, improved survival in neonatal and other intensive care units, together with the AIDS epidemic have led to an upsurge in the number of patients at risk. In addition, there have been changes in the epidemiology of systemic fungal infections, with Aspergillus spp. and Candida spp. other than Candida albicans becoming increasingly common causes. These changes have affected the selection of drugs for first-line or prophylactic use, as not all agents have the critical spectrum of activity required. The management of systemic fungal infections can be divided into four main strategies: prophylaxis, early empirical use, pre-emptive and definite therapy. Antifungal prophylaxis is given based on the patient risk factors, but in the absence of infection. Empirical antifungal therapy is given in patients at risk with signs of infection of unclear aetiology (usually persistent fever) but of possible fungal origin. Therapy is given pre-emptively in patients at risk with additional evidence for the presence of an infective agent in a way predisposing for infection (e.g. Aspergillus colonization; high Candida colonization index). Finally, definite treatment is used in patients with confirmed fungal infection. The distinction between risk-adapted prophylaxis, early empirical therapy, and pre-emptive use of antifungals often becomes unclear and clinical decision making depends largely on local epidemiology and resistance patterns, adequate definition of patient risk categories, early diagnosis and the calculation of cost-benefit ratios. This article addresses the use of itraconazole in the treatment of invasive fungal infections in the haematology patient.

  20. Becoming a College Student: An Empirical Phenomenological Analysis of First Generation College Students

    ERIC Educational Resources Information Center

    Whitehead, Patrick M.; Wright, Robert

    2017-01-01

    This article is an empirical phenomenological examination of the perceived security that first generation college students have in their identity as college students. First generation college students (FGCS) have been defined as students whose parents or guardians have not completed a 2- or 4-year postsecondary degree. Previous research (Davis,…

  1. Modified retrieval algorithm for three types of precipitation distribution using x-band synthetic aperture radar

    NASA Astrophysics Data System (ADS)

    Xie, Yanan; Zhou, Mingliang; Pan, Dengke

    2017-10-01

    The forward-scattering model is introduced to describe the response of the normalized radar cross section (NRCS) of precipitation observed with synthetic aperture radar (SAR). Since the distribution of near-surface rainfall is related to the near-surface rainfall rate and a horizontal distribution factor, a retrieval algorithm called modified regression empirical and model-oriented statistical (M-M), based on Volterra integration theory, is proposed. Compared with the model-oriented statistical and Volterra integration (MOSVI) algorithm, the key difference is that the M-M algorithm retrieves the near-surface rainfall rate with a modified regression empirical algorithm rather than a linear regression formula. Half of the empirical parameters in the weighted integration are eliminated, and a smaller average relative error is achieved when the rainfall rate is less than 100 mm/h. Therefore, the proposed algorithm can obtain high-precision rainfall information.

  2. Bootstrap data methodology for sequential hybrid model building

    NASA Technical Reports Server (NTRS)

    Volponi, Allan J. (Inventor); Brotherton, Thomas (Inventor)

    2007-01-01

    A method for modeling engine operation comprising the steps of: 1. collecting a first plurality of sensory data, 2. partitioning a flight envelope into a plurality of sub-regions, 3. assigning the first plurality of sensory data into the plurality of sub-regions, 4. generating an empirical model of at least one of the plurality of sub-regions, 5. generating a statistical summary model for at least one of the plurality of sub-regions, 6. collecting an additional plurality of sensory data, 7. partitioning the second plurality of sensory data into the plurality of sub-regions, 8. generating a plurality of pseudo-data using the empirical model, and 9. concatenating the plurality of pseudo-data and the additional plurality of sensory data to generate an updated empirical model and an updated statistical summary model for at least one of the plurality of sub-regions.
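
    The stepwise procedure above lends itself to a compact illustration. The following Python sketch is not the patented implementation; the variable names and the linear model are invented. It fits a simple empirical model to a first batch of data within one hypothetical flight-envelope sub-region, stores a statistical summary, generates pseudo-data from the stored model, and concatenates that pseudo-data with a later batch to refresh the model.

      # Illustrative sketch only (not the patented method): within one flight-envelope
      # sub-region, fit an empirical model to a first batch of sensor data, generate
      # pseudo-data from it, and concatenate with a later batch to refresh the model.
      import numpy as np

      def fit_linear(X, y):
          A = np.column_stack([X, np.ones(len(X))])          # design matrix with intercept
          coef, *_ = np.linalg.lstsq(A, y, rcond=None)
          return coef

      def predict(coef, X):
          return np.column_stack([X, np.ones(len(X))]) @ coef

      rng = np.random.default_rng(0)

      # Batch 1: hypothetical engine-sensor data (inputs X1, response y1) in one sub-region.
      X1 = rng.uniform(size=(200, 2))
      y1 = 3.0 * X1[:, 0] - 1.5 * X1[:, 1] + rng.normal(0, 0.05, 200)
      coef = fit_linear(X1, y1)
      summary = {"mean": float(y1.mean()), "sd": float(y1.std())}   # statistical summary model

      # Batch 2 arrives later: generate pseudo-data from the stored empirical model,
      # concatenate it with the new measurements, and refit.
      X2 = rng.uniform(size=(50, 2))
      y2 = 3.0 * X2[:, 0] - 1.5 * X2[:, 1] + rng.normal(0, 0.05, 50)
      pseudo_y = predict(coef, X1) + rng.normal(0, summary["sd"] * 0.1, len(X1))
      coef_updated = fit_linear(np.vstack([X1, X2]), np.concatenate([pseudo_y, y2]))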

  3. Detecting failure of climate predictions

    USGS Publications Warehouse

    Runge, Michael C.; Stroeve, Julienne C.; Barrett, Andrew P.; McDonald-Madden, Eve

    2016-01-01

    The practical consequences of climate change challenge society to formulate responses that are more suited to achieving long-term objectives, even if those responses have to be made in the face of uncertainty. Such a decision-analytic focus uses the products of climate science as probabilistic predictions about the effects of management policies. Here we present methods to detect when climate predictions are failing to capture the system dynamics. For a single model, we measure goodness of fit based on the empirical distribution function, and define failure when the distribution of observed values significantly diverges from the modelled distribution. For a set of models, the same statistic can be used to provide relative weights for the individual models, and we define failure when there is no linear weighting of the ensemble models that produces a satisfactory match to the observations. Early detection of failure of a set of predictions is important for improving model predictions and the decisions based on them. We show that these methods would have detected a range shift in northern pintail 20 years before it was actually discovered, and are increasingly giving more weight to those climate models that forecast a September ice-free Arctic by 2055.
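
    A minimal Python sketch of the single-model failure test described above, assuming a Kolmogorov-Smirnov comparison of the observed values against samples from the model's predictive distribution; the data and the significance threshold are invented for illustration.

      # Compare observations against a model's predictive distribution via the
      # empirical distribution function; flag "failure" when the divergence is
      # statistically significant.
      import numpy as np
      from scipy.stats import ks_2samp

      def prediction_failed(observed, model_samples, alpha=0.05):
          """Kolmogorov-Smirnov comparison of observations vs. modelled distribution."""
          stat, p_value = ks_2samp(observed, model_samples)
          return p_value < alpha, stat, p_value

      rng = np.random.default_rng(1)
      model_samples = rng.normal(0.0, 1.0, 5000)      # hypothetical model ensemble output
      observed = rng.normal(0.6, 1.0, 40)             # observations drifting away from the model

      failed, stat, p = prediction_failed(observed, model_samples)
      print(f"KS statistic={stat:.3f}, p={p:.4f}, failure detected: {failed}")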

  4. Using Excel in Teacher Education for Sustainability

    ERIC Educational Resources Information Center

    Aydin, Serhat

    2016-01-01

    In this study, the feasibility of using Excel software in teaching whole Basic Statistics Course and its influence on the attitudes of pre-service science teachers towards statistics were investigated. One hundred and two pre-service science teachers in their second year participated in the study. The data were collected from the prospective…

  5. Addressing the Needs of Children With Disabilities Experiencing Disaster or Terrorism.

    PubMed

    Stough, Laura M; Ducy, Elizabeth McAdams; Kang, Donghyun

    2017-04-01

    This paper reviews the empirical literature on psychosocial factors relating to children with disabilities in the context of disaster or terrorism. Research indicates adults with disabilities experience increased exposure to hazards due to existing social disparities and barriers associated with disability status. However, studies on the psychological effects of disaster/terrorism on children with pre-existing disabilities are exceedingly few and empirical evidence of the effectiveness of trauma-focused therapies for this population is limited. Secondary adversities, including social stigma and health concerns, also compromise the recovery of these children post-disaster/terrorism. Schools and teachers appear to be particularly important in the recovery of children with disabilities from disaster. Disasters, terrorism, and war all contribute to increased incidence of disability, as well as disproportionately affect children with pre-existing disabilities. Disaster preparedness interventions and societal changes are needed to decrease the disproportionate environmental and social vulnerability of children with disabilities to disaster and terrorism.

  6. Pre-Kindergarten: Research-Based Recommendations for Developing Standards and Factors Contributing to School Readiness Gaps. Information Capsule. Volume 1201

    ERIC Educational Resources Information Center

    Blazer, Christie

    2012-01-01

    States across the country are developing pre-kindergarten standards that articulate expectations for preschoolers' learning and development and define the manner in which services will be provided. There are two different types of standards: student outcome standards and program standards. Student outcome standards define the knowledge and skills…

  7. Preentry communications study. Outer planets atmospheric entry probe

    NASA Technical Reports Server (NTRS)

    Hinrichs, C. A.

    1976-01-01

    A pre-entry communications study is presented for a relay link between a Jupiter entry probe and a spacecraft in hyperbolic orbit. Two generic communications links of interest are described: a pre-entry link to a spun spacecraft antenna, and a pre-entry link to a despun spacecraft antenna. The propagation environment of Jupiter is defined. Although this is one of the least well known features of Jupiter, enough information exists to reasonably establish bounds on the performance of a communications link. Within these bounds, optimal carrier frequencies are defined. The next step is to identify optimal relative geometries between the probe and the spacecraft. Optimal trajectories are established for both spun and despun spacecraft antennas. Given the optimal carrier frequencies, and the optimal trajectories, the data carrying capacities of the pre-entry links are defined. The impact of incorporating pre-entry communications into a basic post entry probe is then assessed. This assessment covers the disciplines of thermal control, power source, mass properties and design layout. A conceptual design is developed of an electronically despun antenna for use on a Pioneer class of spacecraft.

  8. Using Naturalistic Driving Performance Data to Develop an Empirically Defined Model of Distracted Driving

    DOT National Transportation Integrated Search

    2016-08-05

    Driver distraction is defined as a diversion of attention away from the primary driving activity toward non-driving related tasks (Lee et al., 2008). Multiple resource theory (MRT) describes this diversion as a process of competition for attentional ...

  9. Bilingual Education and Accountability: A Perceptual View.

    ERIC Educational Resources Information Center

    Hernandez-Domingues, Jose L.; Gertenbach, Donald

    This paper discusses (1) The Current Definition of Bilingual Education, (2) The Origin of Accountability, (3) The Empirical and Rational View of Education, (4) Man Defines Himself or Is Defined, and (5) Who Is Accountable? A list of notes is included in the study. (SK)

  10. Accretion-induced luminosity spreads in young clusters: evidence from stellar rotation

    NASA Astrophysics Data System (ADS)

    Littlefair, S. P.; Naylor, Tim; Mayne, N. J.; Saunders, Eric; Jeffries, R. D.

    2011-05-01

    We present an analysis of the rotation of young stars in the associations Cepheus OB3b, NGC 2264, 2362 and the Orion Nebula Cluster (ONC). We discover a correlation between rotation rate and position in a colour-magnitude diagram (CMD) such that stars which lie above an empirically determined median pre-main sequence rotate more rapidly than stars which lie below this sequence. The same correlation is seen, with a high degree of statistical significance, in each association studied here. If position within the CMD is interpreted as being due to genuine age spreads within a cluster, then the stars above the median pre-main sequence would be the youngest stars. This would in turn imply that the most rapidly rotating stars in an association are the youngest, and hence those with the largest moments of inertia and highest likelihood of ongoing accretion. Such a result does not fit naturally into the existing picture of angular momentum evolution in young stars, where the stars are braked effectively by their accretion discs until the disc disperses. Instead, we argue that, for a given association of young stars, position within the CMD is not primarily a function of age, but of accretion history. We show that this hypothesis could explain the correlation we observe between rotation rate and position within the CMD.

  11. Developing the Pieta House Suicide Intervention Model: a quasi-experimental, repeated measures design.

    PubMed

    Surgenor, Paul Wg; Freeman, Joan; O'Connor, Cindy

    2015-01-01

    While most crisis intervention models adhere to a generalised theoretical framework, the lack of clarity around how these should be enacted has resulted in a proliferation of models, most of which have little to no empirical support. The primary aim of this research was to propose a suicide intervention model that would resolve the client's suicidal crisis by decreasing their suicidal ideation and improve their outlook through enhancing a range of protective factors. The secondary aim was to assess the impact of this model on negative and positive outlook. A quasi-experimental, pre-test post-test repeated measures design was employed. A questionnaire assessing self-esteem, depression, and positive and negative suicidal ideation was administered to the same participants pre- and post- therapy facilitating paired responses. Multiple analysis of variance and paired-samples t-tests were conducted to establish whether therapy using the PH-SIM had a significant effect on the clients' negative and positive outlook. Analyses revealed a statistically significant effect of therapy for depression, negative suicidal ideation, self-esteem, and positive suicidal ideation. Negative outlook was significantly lower after therapy and positive outlook significantly higher. The decreased negative outlook and increased positive outlook following therapy provide some support for the proposed model in fulfilling its role, though additional research is required to establish the precise role of the intervention model in achieving this.

  12. Estimating mortality, morbidity and disability due to malaria among Africa's non-pregnant population.

    PubMed Central

    Snow, R. W.; Craig, M.; Deichmann, U.; Marsh, K.

    1999-01-01

    The contribution of malaria to morbidity and mortality among people in Africa has been a subject of academic interest, political advocacy, and speculation. National statistics for much of sub-Saharan Africa have proved to be an unreliable source of disease-specific morbidity and mortality data. Credible estimates of disease-specific burdens are required for setting global and national priorities for health in order to rationalize the use of limited resources and lobby for financial support. We have taken an empirical approach to defining the limits of Plasmodium falciparum transmission across the continent and interpolated the distributions of projected populations in 1995. By combining a review of the literature on malaria in Africa and models of acquired functional immunity, we have estimated the age-structured rates of the fatal, morbid and disabling sequelae following exposure to malaria infection under different epidemiological conditions. PMID:10516785

  13. Fluctuation behaviors of financial return volatility duration

    NASA Astrophysics Data System (ADS)

    Niu, Hongli; Wang, Jun; Lu, Yunfan

    2016-04-01

    Understanding the return volatility of financial markets is crucial because it helps to quantify investment risk, optimize portfolios, and provide a key input to option pricing models. The characteristics of isolated high-volatility events above a certain threshold in price fluctuations, and the distributions of return intervals between these events, have aroused great interest in financial research. In the present work, we introduce a new concept of daily return volatility duration, defined as the shortest passage time until the future volatility intensity rises above or falls below the current volatility intensity (without predefining a threshold). The statistical properties of the daily return volatility durations for seven representative stock indices from world financial markets are investigated. Useful and interesting empirical results on the probability distributions, memory effects and multifractal properties of these volatility duration series are obtained. These results also show that the proposed analysis of stock volatility series is meaningful and beneficial.
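
    The passage-time construction described above can be illustrated with a short Python sketch. The exact definition used in the paper may differ; here the volatility intensity is taken to be the absolute daily return, and only upward passages are counted.

      # Hedged sketch of the "volatility duration" idea: for each day, take the shortest
      # passage time until the volatility intensity (here, the absolute daily return)
      # first rises above the current day's value.
      import numpy as np

      def passage_durations(intensity):
          durations = []
          n = len(intensity)
          for t in range(n - 1):
              ahead = np.nonzero(intensity[t + 1:] > intensity[t])[0]
              if ahead.size:                       # first future day exceeding today's intensity
                  durations.append(int(ahead[0]) + 1)
          return np.array(durations)

      rng = np.random.default_rng(2)
      returns = rng.standard_t(df=4, size=2000) * 0.01     # heavy-tailed synthetic daily returns
      durations = passage_durations(np.abs(returns))
      print("mean duration:", durations.mean(), "max:", durations.max())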

  14. Information driving force and its application in agent-based modeling

    NASA Astrophysics Data System (ADS)

    Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei

    2018-04-01

    Exploring the scientific impact of online big data has attracted much attention from researchers in different fields in recent years. Complex financial systems are typical open systems profoundly influenced by external information. Based on large-scale data from the public media and stock markets, we first define an information driving force and analyze how it affects the complex financial system. The information driving force is observed to be asymmetric in the bull and bear market states. As an application, we then propose an agent-based model driven by the information driving force. In particular, all the key parameters are determined from the empirical analysis rather than from statistical fitting of the simulation results. With our model, both the stationary properties and non-stationary dynamic behaviors are simulated. Considering the mean-field effect of the external information, we also propose a few-body model to simulate the financial market in the laboratory.

  15. DQE analysis for CCD imaging arrays

    NASA Astrophysics Data System (ADS)

    Shaw, Rodney

    1997-05-01

    By consideration of the statistical interaction between exposure quanta and the mechanisms of image detection, the signal-to-noise limitations of a variety of image acquisition technologies are now well understood. However, in spite of the growing fields of application for CCD imaging arrays and the obvious advantages of their multi-level mode of quantum detection, only limited and largely empirical approaches have been made to quantify these advantages on an absolute basis. Here an extension is made of a previous model for noise-free sequential photon-counting to the more general case involving both count-noise and arbitrary separation functions between count levels. This allows a basic model to be developed for the DQE associated with devices which approximate to the CCD mode of operation, and conclusions to be made concerning the roles of the separation-function and count-noise in defining the departure from the ideal photon counter.

  16. Free will beliefs predict attitudes toward unethical behavior and criminal punishment

    PubMed Central

    Martin, Nathan D.; Rigoni, Davide; Vohs, Kathleen D.

    2017-01-01

    Do free will beliefs influence moral judgments? Answers to this question from theoretical and empirical perspectives are controversial. This study attempted to replicate past research and offer theoretical insights by analyzing World Values Survey data from residents of 46 countries (n = 65,111 persons). Corroborating experimental findings, free will beliefs predicted intolerance of unethical behaviors and support for severe criminal punishment. Further, the link between free will beliefs and intolerance of unethical behavior was moderated by variations in countries’ institutional integrity, defined as the degree to which countries had accountable, corruption-free public sectors. Free will beliefs predicted intolerance of unethical behaviors for residents of countries with high and moderate institutional integrity, but this correlation was not seen for countries with low institutional integrity. Free will beliefs predicted support for criminal punishment regardless of countries’ institutional integrity. Results were robust across different operationalizations of institutional integrity and with or without statistical control variables. PMID:28652361

  17. Implementation of jump-diffusion algorithms for understanding FLIR scenes

    NASA Astrophysics Data System (ADS)

    Lanterman, Aaron D.; Miller, Michael I.; Snyder, Donald L.

    1995-07-01

    Our pattern theoretic approach to the automated understanding of forward-looking infrared (FLIR) images brings the traditionally separate endeavors of detection, tracking, and recognition together into a unified jump-diffusion process. New objects are detected and object types are recognized through discrete jump moves. Between jumps, the location and orientation of objects are estimated via continuous diffusions. A hypothesized scene, simulated from the emissive characteristics of the hypothesized scene elements, is compared with the collected data by a likelihood function based on sensor statistics. This likelihood is combined with a prior distribution defined over the set of possible scenes to form a posterior distribution. The jump-diffusion process empirically generates the posterior distribution. Both the diffusion and jump operations involve the simulation of a scene produced by a hypothesized configuration. Scene simulation is most effectively accomplished by pipelined rendering engines such as Silicon Graphics hardware. We demonstrate the execution of our algorithm on a Silicon Graphics Onyx/RealityEngine.

  18. Environmental Factors Contributing to Wrongdoing in Medicine: A Criterion-Based Review of Studies and Cases

    PubMed Central

    DuBois, James M.; Carroll, Kelly; Gibb, Tyler; Kraus, Elena; Rubbelke, Timothy; Vasher, Meghan; Anderson, Emily E.

    2012-01-01

    In this paper we describe our approach to understanding wrongdoing in medical research and practice, which involves the statistical analysis of coded data from a large set of published cases. We focus on understanding the environmental factors that predict the kind and the severity of wrongdoing in medicine. Through review of empirical and theoretical literature, consultation with experts, the application of criminological theory, and ongoing analysis of our first 60 cases, we hypothesize that 10 contextual features of the medical environment (including financial rewards, oversight failures, and patients belonging to vulnerable groups) may contribute to professional wrongdoing. We define each variable, examine data supporting our hypothesis, and present a brief case synopsis from our study that illustrates the potential influence of the variable. Finally, we discuss limitations of the resulting framework and directions for future research. PMID:23226933

  19. Studying Si/SiGe disordered alloys within effective mass theory

    NASA Astrophysics Data System (ADS)

    Gamble, John; Montaño, Inès; Carroll, Malcolm S.; Muller, Richard P.

    Si/SiGe is an attractive material system for electrostatically-defined quantum dot qubits due to its high-quality crystalline quantum well interface. Modeling the properties of single-electron quantum dots in this system is complicated by the presence of alloy disorder, which typically requires atomistic techniques in order to treat properly. Here, we use the NEMO-3D empirical tight binding code to calibrate a multi-valley effective mass theory (MVEMT) to properly handle alloy disorder. The resulting MVEMT simulations give good insight into the essential physics of alloy disorder, while being extremely computationally efficient and well-suited to determining statistical properties. Sandia is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the US Department of Energy's National Nuclear Security Administration under Contract No. DE-AC04-94AL85000.

  20. The Impact of the Temporal Distribution of Communicating Civilizations on Their Detectability.

    PubMed

    Balbi, Amedeo

    2018-01-01

    We used a statistical model to investigate the detectability (defined by the requirement that causal contact has been initiated with us) of communicating civilizations within a volume of the Universe surrounding our location. If the civilizations are located in our galaxy, the detectability requirement imposes a strict constraint on their epoch of appearance and their communicating life span. This, in turn, implies that our ability to gather empirical evidence of the fraction of civilizations within range of detection strongly depends on the specific features of their temporal distribution. Our approach illuminates aspects of the problem that can escape the standard treatment based on the Drake equation. Therefore, it might provide the appropriate framework for future studies dealing with the evolutionary aspects of the search for extraterrestrial intelligence (SETI). Key Words: Astrobiology-Extraterrestrial life-SETI-Complex life-Life detection-Intelligence. Astrobiology 18, 54-58.
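
    A toy Monte Carlo in the spirit of the statistical model described above is sketched below in Python (it is not the paper's model). Civilizations are drawn with random appearance epochs, communicating lifespans and distances, and counted as detectable only if a signal emitted during their communicating phase has had time to reach us.

      # Toy Monte Carlo: all distributions and parameter values are invented for illustration.
      import numpy as np

      rng = np.random.default_rng(6)
      n = 100_000
      distance_ly = rng.uniform(0, 50_000, n)               # distance within the Galaxy (light-years)
      birth = rng.uniform(-1e6, 0, n)                        # appearance epoch (years before present)
      lifespan = rng.exponential(10_000, n)                  # communicating lifespan (years)

      # A signal emitted at time s arrives at s + distance. We count a civilization as
      # detectable now (time 0) if some signal emitted during [birth, birth + lifespan]
      # arrives at or before the present.
      detectable = (birth + distance_ly <= 0) & (birth + lifespan + distance_ly >= 0)
      print("detectable fraction:", detectable.mean())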

  1. The validity and clinical utility of purging disorder.

    PubMed

    Keel, Pamela K; Striegel-Moore, Ruth H

    2009-12-01

    To review evidence of the validity and clinical utility of Purging Disorder and examine options for the Diagnostic and Statistical Manual of Mental Disorders fifth edition (DSM-V). Articles were identified by computerized and manual searches and reviewed to address five questions about Purging Disorder: Is there "ample" literature? Is the syndrome clearly defined? Can it be measured and diagnosed reliably? Can it be differentiated from other eating disorders? Is there evidence of syndrome validity? Although empirical classification and concurrent validity studies provide emerging support for the distinctiveness of Purging Disorder, questions remain about definition, diagnostic reliability in clinical settings, and clinical utility (i.e., prognostic validity). We discuss strengths and weaknesses associated with various options for the status of Purging Disorder in the DSM-V ranging from making no changes from DSM-IV to designating Purging Disorder a diagnosis on equal footing with Anorexia Nervosa and Bulimia Nervosa.

  2. Free will beliefs predict attitudes toward unethical behavior and criminal punishment.

    PubMed

    Martin, Nathan D; Rigoni, Davide; Vohs, Kathleen D

    2017-07-11

    Do free will beliefs influence moral judgments? Answers to this question from theoretical and empirical perspectives are controversial. This study attempted to replicate past research and offer theoretical insights by analyzing World Values Survey data from residents of 46 countries ( n = 65,111 persons). Corroborating experimental findings, free will beliefs predicted intolerance of unethical behaviors and support for severe criminal punishment. Further, the link between free will beliefs and intolerance of unethical behavior was moderated by variations in countries' institutional integrity, defined as the degree to which countries had accountable, corruption-free public sectors. Free will beliefs predicted intolerance of unethical behaviors for residents of countries with high and moderate institutional integrity, but this correlation was not seen for countries with low institutional integrity. Free will beliefs predicted support for criminal punishment regardless of countries' institutional integrity. Results were robust across different operationalizations of institutional integrity and with or without statistical control variables.

  3. Trend extraction using empirical mode decomposition and statistical empirical mode decomposition: Case study: Kuala Lumpur stock market

    NASA Astrophysics Data System (ADS)

    Jaber, Abobaker M.

    2014-12-01

    Two nonparametric methods for prediction and modeling of financial time series signals are proposed. The proposed techniques are designed to handle non-stationary and non-linear behavior and to extract meaningful signals for reliable prediction. Using the Fourier transform (FT), the methods select the significant decomposed signals to be employed for prediction. The techniques are developed by coupling the Holt-Winters method with Empirical Mode Decomposition (EMD) and with Statistical Empirical Mode Decomposition (SEMD), which extends the scope of EMD by smoothing. To show the performance of the proposed techniques, we analyze the daily closing prices of the Kuala Lumpur stock market index.
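
    A rough Python sketch of the EMD-plus-Holt-Winters coupling is shown below. It assumes the third-party PyEMD package and statsmodels; the package choice, the selection of the slowest components as the meaningful signal, and the synthetic price series are all assumptions, not the authors' procedure.

      # Sketch under the stated assumptions; not the authors' implementation.
      import numpy as np
      from PyEMD import EMD                                   # pip install EMD-signal (assumed package)
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      rng = np.random.default_rng(7)
      t = np.arange(500)
      price = 100 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 60) + rng.normal(0, 0.5, t.size)

      imfs = EMD().emd(price)                                  # intrinsic mode functions; last rows are the slowest
      slow = imfs[-1] + imfs[-2]                               # keep the slowest components as the "meaningful" signal

      model = ExponentialSmoothing(slow, trend="add")          # Holt-Winters (additive trend) on the extracted signal
      forecast = model.fit().forecast(20)
      print(forecast[:5])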

  4. Long-term evaluation of three satellite ocean color algorithms for identifying harmful algal blooms (Karenia brevis) along the west coast of Florida: A matchup assessment

    PubMed Central

    Carvalho, Gustavo A.; Minnett, Peter J.; Banzon, Viva F.; Baringer, Warner; Heil, Cynthia A.

    2011-01-01

    We present a simple algorithm to identify Karenia brevis blooms in the Gulf of Mexico along the west coast of Florida in satellite imagery. It is based on an empirical analysis of collocated matchups of satellite and in situ measurements. The results of this Empirical Approach are compared to those of a Bio-optical Technique – taken from the published literature – and the Operational Method currently implemented by the NOAA Harmful Algal Bloom Forecasting System for K. brevis blooms. These three algorithms are evaluated using a multi-year MODIS data set (from July, 2002 to October, 2006) and a long-term in situ database. Matchup pairs, consisting of remotely-sensed ocean color parameters and near-coincident field measurements of K. brevis concentration, are used to assess the accuracy of the algorithms. Fair evaluation of the algorithms was only possible in the central west Florida shelf (i.e. between 25.75°N and 28.25°N) during the boreal Summer and Fall months (i.e. July to December) due to the availability of valid cloud-free matchups. Even though the predictive values of the three algorithms are similar, the statistical measure of success in red tide identification (defined as cell counts in excess of 1.5 × 10⁴ cells L⁻¹) varied considerably (sensitivity—Empirical: 86%; Bio-optical: 77%; Operational: 26%), as did their effectiveness in identifying non-bloom cases (specificity—Empirical: 53%; Bio-optical: 65%; Operational: 84%). As the Operational Method had an elevated frequency of false-negative cases (i.e. presented low accuracy in detecting known red tides), and because of the considerable overlap between the optical characteristics of the red tide and non-bloom population, only the other two algorithms underwent a procedure for further inspecting possible detection improvements. Both optimized versions of the Empirical and Bio-optical algorithms performed similarly, being equally specific and sensitive (~70% for both) and showing low levels of uncertainties (i.e. few cases of false-negatives and false-positives: ~30%)—improved positive predictive values (~60%) were also observed along with good negative predictive values (~80%). PMID:22180667
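
    The sensitivity, specificity and predictive values quoted above follow from a simple matchup contingency table. The Python sketch below shows the computation against the 1.5 × 10⁴ cells L⁻¹ bloom criterion; the cell counts and algorithm flags are synthetic stand-ins, not the study's data.

      # Classify each satellite flag against the in situ red-tide criterion and compute
      # sensitivity, specificity and predictive values from the resulting contingency table.
      import numpy as np

      def matchup_skill(flagged, cell_counts, threshold=1.5e4):
          truth = cell_counts > threshold
          tp = np.sum(flagged & truth)
          tn = np.sum(~flagged & ~truth)
          fp = np.sum(flagged & ~truth)
          fn = np.sum(~flagged & truth)
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "ppv": tp / (tp + fp),
              "npv": tn / (tn + fn),
          }

      rng = np.random.default_rng(3)
      cells = rng.lognormal(mean=9.0, sigma=1.5, size=300)        # hypothetical cells/L
      flags = (cells > 1.5e4) ^ (rng.random(300) < 0.2)           # imperfect algorithm flags
      print(matchup_skill(flags, cells))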

  5. FonaDyn - A system for real-time analysis of the electroglottogram, over the voice range

    NASA Astrophysics Data System (ADS)

    Ternström, Sten; Johansson, Dennis; Selamtzis, Andreas

    2018-01-01

    From soft to loud and low to high, the mechanisms of human voice have many degrees of freedom, making it difficult to assess phonation from the acoustic signal alone. FonaDyn is a research tool that combines acoustics with electroglottography (EGG). It characterizes and visualizes in real time the dynamics of EGG waveforms, using statistical clustering of the cycle-synchronous EGG Fourier components, and their sample entropy. The prevalence and stability of different EGG waveshapes are mapped as colored regions into a so-called voice range profile, without needing pre-defined thresholds or categories. With appropriately 'trained' clusters, FonaDyn can classify and map voice regimes. This is of potential scientific, clinical and pedagogical interest.

  6. Does journal club membership improve research evidence uptake in different allied health disciplines: a pre-post study.

    PubMed

    Lizarondo, Lucylynn M; Grimmer-Somers, Karen; Kumar, Saravana; Crockett, Alan

    2012-10-29

    Although allied health is considered to be one 'unit' of healthcare providers, it comprises a range of disciplines which have different training and ways of thinking, and different tasks and methods of patient care. Very few empirical studies on evidence-based practice (EBP) have directly compared allied health professionals. The objective of this study was to examine the impact of a structured model of journal club (JC), known as iCAHE (International Centre for Allied Health Evidence) JC, on the EBP knowledge, skills and behaviour of the different allied health disciplines. A pilot, pre-post study design using maximum variation sampling was undertaken. Recruitment was conducted in groups and practitioners such as physiotherapists, occupational therapists, speech pathologists, social workers, psychologists, nutritionists/dieticians and podiatrists were invited to participate. All participating groups received the iCAHE JC for six months. Quantitative data using the Adapted Fresno Test (McCluskey & Bishop) and Evidence-based Practice Questionnaire (Upton & Upton) were collected prior to the implementation of the JC, with follow-up measurements six months later. Mean percentage change and confidence intervals were calculated to compare baseline and post JC scores for all outcome measures. The results of this study demonstrate variability in EBP outcomes across disciplines after receiving the iCAHE JC. Only physiotherapists showed statistically significant improvements in all outcomes; speech pathologists and occupational therapists demonstrated a statistically significant increase in knowledge but not for attitude and evidence uptake; social workers and dieticians/nutritionists showed statistically significant positive changes in their knowledge, and evidence uptake but not for attitude. There is evidence to suggest that a JC such as the iCAHE model is an effective method for improving the EBP knowledge and skills of allied health practitioners. It may be used as a single intervention to facilitate evidence uptake in some allied health disciplines but may need to be integrated with other strategies to influence practice behaviour in other practitioners. An in-depth analysis of other factors (e.g. individual, contextual, organisational), or the relative contribution of these variables is required to better understand the determinants of evidence uptake in allied health.

  7. Does journal club membership improve research evidence uptake in different allied health disciplines: a pre-post study

    PubMed Central

    2012-01-01

    Background Although allied health is considered to be one 'unit' of healthcare providers, it comprises a range of disciplines which have different training and ways of thinking, and different tasks and methods of patient care. Very few empirical studies on evidence-based practice (EBP) have directly compared allied health professionals. The objective of this study was to examine the impact of a structured model of journal club (JC), known as iCAHE (International Centre for Allied Health Evidence) JC, on the EBP knowledge, skills and behaviour of the different allied health disciplines. Methods A pilot, pre-post study design using maximum variation sampling was undertaken. Recruitment was conducted in groups and practitioners such as physiotherapists, occupational therapists, speech pathologists, social workers, psychologists, nutritionists/dieticians and podiatrists were invited to participate. All participating groups received the iCAHE JC for six months. Quantitative data using the Adapted Fresno Test (McCluskey & Bishop) and Evidence-based Practice Questionnaire (Upton & Upton) were collected prior to the implementation of the JC, with follow-up measurements six months later. Mean percentage change and confidence intervals were calculated to compare baseline and post JC scores for all outcome measures. Results The results of this study demonstrate variability in EBP outcomes across disciplines after receiving the iCAHE JC. Only physiotherapists showed statistically significant improvements in all outcomes; speech pathologists and occupational therapists demonstrated a statistically significant increase in knowledge but not for attitude and evidence uptake; social workers and dieticians/nutritionists showed statistically significant positive changes in their knowledge, and evidence uptake but not for attitude. Conclusions There is evidence to suggest that a JC such as the iCAHE model is an effective method for improving the EBP knowledge and skills of allied health practitioners. It may be used as a single intervention to facilitate evidence uptake in some allied health disciplines but may need to be integrated with other strategies to influence practice behaviour in other practitioners. An in-depth analysis of other factors (e.g. individual, contextual, organisational), or the relative contribution of these variables is required to better understand the determinants of evidence uptake in allied health. PMID:23106851

  8. Comparing replacement rates under private and federal retirement systems.

    PubMed

    Martin, Patricia P

    One measure of the adequacy of retirement income is the replacement rate - the percentage of pre-retirement salary that is available to a worker in retirement. This article compares salary replacement rates for private-sector employees of medium and large private establishments with those for federal employees under the Civil Service Retirement System and the Federal Employees Retirement System. Because there is no standard benefit formula to represent the variety of formulas available in the private sector, a composite defined benefit formula was developed using the characteristics of plans summarized in the Bureau of Labor Statistics Medium and Large Employer Plan Survey. The resulting "typical" private-sector defined benefit plan, with an accompanying defined contribution plan, was then compared with the two federal systems. The Civil Service Retirement System (CSRS) is a stand-alone defined benefit plan whose participants are not covered by Social Security. Until passage of the 1983 Amendments to the Social Security Act, it was the only retirement plan for most federal civilian employees. Provisions of the 1983 Amendments were designed to restore long-term financial stability to the Social Security trust funds. One provision created the Federal Employees Retirement System (FERS), which covers federal employees hired after 1983. FERS employees contribute to and are covered by Social Security. FERS, which is a defined benefit plan, also includes a basic benefit and a 401(k)-type plan known as the Thrift Savings Plan (TSP). To compare how retirees would fare under the three different retirement systems, benefits of employees retiring at age 65 with 35 years of service were calculated using hypothetical workers with steady earnings. Workers were classified according to a percentage of the average wage in the economy: low earners (45 percent), average earners (100 percent), high earners (160 percent), and maximum earners (earnings at the taxable maximum amount). Overall, this analysis found that: Excluding Social Security benefits and TSP and defined contribution annuities, CSRS retirees have a higher pre-retirement salary replacement rate than either FERS or private-sector retirees. Private-sector retirees, however, have a higher replacement rate than their FERS counterparts. Including Social Security benefits but not TSP and defined contribution plan annuities, CSRS retirees who are maximum earners have a higher pre-retirement salary replacement rate (despite receiving no Social Security benefits) than FERS retirees with the same earnings. Private-sector retirees in all earnings categories have a higher replacement rate than federal retirees with the same earnings. Including Social Security and TSP and defined contribution plan annuities, private-sector retirees in all earnings categories have a higher replacement rate than federal retirees, but their rate is close to that of FERS retirees. The rate is higher for FERS retirees than for CSRS retirees in all earnings categories. This analysis shows that replacement rates could exceed 100 percent for FERS employees who contribute 6 percent of earnings to the TSP over a full working career. Private-sector replacement rates were quite similar for those with both a defined benefit and a defined contribution pension plan.
Social Security replacement rates make up the highest proportion of benefits for the private sector's lowest income quartile group. The replacement rates for 401(k) plans and the TSP account for a higher proportion of benefits than does Social Security for all other income groups, assuming the absence of a defined benefit plan.
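
    The replacement-rate concept used throughout this comparison is simply retirement income expressed as a percentage of pre-retirement salary, summed over benefit sources. The short Python sketch below illustrates the arithmetic; the salary and benefit figures are invented and do not reproduce the CSRS, FERS or private-plan formulas.

      # Illustrative arithmetic only; figures and sources are hypothetical.
      def replacement_rate(final_salary, annual_benefits):
          """annual_benefits: dict of income source -> annual amount in retirement."""
          return 100.0 * sum(annual_benefits.values()) / final_salary

      final_salary = 52_000.0
      benefits = {
          "defined_benefit_pension": 18_000.0,   # hypothetical plan annuity
          "social_security": 15_500.0,
          "tsp_or_401k_annuity": 9_000.0,
      }
      print(f"replacement rate: {replacement_rate(final_salary, benefits):.1f}%")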

  9. Cognitive Flexibility through Metastable Neural Dynamics Is Disrupted by Damage to the Structural Connectome.

    PubMed

    Hellyer, Peter J; Scott, Gregory; Shanahan, Murray; Sharp, David J; Leech, Robert

    2015-06-17

    Current theory proposes that healthy neural dynamics operate in a metastable regime, where brain regions interact to simultaneously maximize integration and segregation. Metastability may confer important behavioral properties, such as cognitive flexibility. It is increasingly recognized that neural dynamics are constrained by the underlying structural connections between brain regions. An important challenge is, therefore, to relate structural connectivity, neural dynamics, and behavior. Traumatic brain injury (TBI) is a pre-eminent structural disconnection disorder whereby traumatic axonal injury damages large-scale connectivity, producing characteristic cognitive impairments, including slowed information processing speed and reduced cognitive flexibility, that may be a result of disrupted metastable dynamics. Therefore, TBI provides an experimental and theoretical model to examine how metastable dynamics relate to structural connectivity and cognition. Here, we use complementary empirical and computational approaches to investigate how metastability arises from the healthy structural connectome and relates to cognitive performance. We found reduced metastability in large-scale neural dynamics after TBI, measured with resting-state functional MRI. This reduction in metastability was associated with damage to the connectome, measured using diffusion MRI. Furthermore, decreased metastability was associated with reduced cognitive flexibility and information processing. A computational model, defined by empirically derived connectivity data, demonstrates how behaviorally relevant changes in neural dynamics result from structural disconnection. Our findings suggest how metastable dynamics are important for normal brain function and contingent on the structure of the human connectome. Copyright © 2015 the authors 0270-6474/15/359050-14$15.00/0.

  10. The effect of topically applied tissue expanders on radial forearm skin pliability: a prospective self-controlled study

    PubMed Central

    2014-01-01

    Background The use of pre-operatively applied topical tissue expansion tapes has previously demonstrated increased rates of primary closure of radial forearm free flap donor sites. This is associated with a reduced cost of care as well as improved cosmetic appearance of the donor site. Unfortunately, little is known about the biomechanical changes these tapes cause in the forearm skin. This study tested the hypothesis that the use of topically applied tissue expansion tapes would result in an increase in forearm skin pliability in patients undergoing radial forearm free flap surgery. Methods Twenty-four patients scheduled for head and neck surgery requiring a radial forearm free flap were enrolled in this prospective self-controlled observational study. DynaClose tissue expansion tapes (Canica Design Inc., Almonte, Canada) were applied across the forearm one week pre-operatively. Immediately prior to surgery, the skin pliability of the dorsal and volar forearm sites was measured with the Cutometer MPA 580 (Courage-Khazaka Electronic GmbH, Cologne, Germany) on both the treatment and contralateral (control) arms. Paired t-tests were used to compare treatment to control at both sites, with p < 0.025 defined as statistically significant. Results There was a statistically significant increase in pliability by a mean of 0.05 mm (SD = 0.09 mm) between treatment and control arms on the dorsal site (95% CI [0.01, 0.08], p = 0.018). This corresponded to an 8% increase in pliability. In contrast, the volar site did not show a statistically significant difference between treatment and control (mean difference = 0.04 mm, SD = 0.20 mm, 95% CI [−0.04, 0.12], p = 0.30). Conclusions This result provides evidence that the pre-operative application of topical tissue expansion tapes produces measurable changes in skin biomechanical properties. The location of this change on the dorsal forearm is consistent with the method of tape application. While this increase in skin pliability may account for the improved rate of primary donor site closure reported using this technique, the results did not reach our definition of clinical significance. PMID:24739510
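
    The dorsal-site result can be re-derived approximately from the reported summary statistics (n = 24, mean paired difference 0.05 mm, SD 0.09 mm). The Python sketch below performs that re-computation; small discrepancies from the published p value and confidence interval are expected because the abstract's figures come from the raw, unrounded data.

      # Re-computation from summary statistics, not from the study's raw measurements.
      import numpy as np
      from scipy import stats

      n, mean_diff, sd_diff = 24, 0.05, 0.09
      se = sd_diff / np.sqrt(n)
      t_stat = mean_diff / se
      p_two_sided = 2 * stats.t.sf(abs(t_stat), df=n - 1)
      t_crit = stats.t.ppf(0.975, df=n - 1)
      ci = (mean_diff - t_crit * se, mean_diff + t_crit * se)
      print(f"t = {t_stat:.2f}, p = {p_two_sided:.3f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
      # Gives t of about 2.7, p of about 0.01 and a 95% CI of roughly (0.01, 0.09),
      # consistent in magnitude with the interval (0.01, 0.08) reported in the abstract.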

  11. Pre-processing by data augmentation for improved ellipse fitting.

    PubMed

    Kumar, Pankaj; Belchamber, Erika R; Miklavcic, Stanley J

    2018-01-01

    Ellipse fitting is a highly researched and mature topic. Surprisingly, however, no existing method has thus far considered the data point eccentricity in its ellipse fitting procedure. Here, we introduce the concept of eccentricity of a data point, in analogy with the idea of ellipse eccentricity. We then show empirically that, irrespective of ellipse fitting method used, the root mean square error (RMSE) of a fit increases with the eccentricity of the data point set. The main contribution of the paper is based on the hypothesis that if the data point set were pre-processed to strategically add additional data points in regions of high eccentricity, then the quality of a fit could be improved. Conditional validity of this hypothesis is demonstrated mathematically using a model scenario. Based on this confirmation we propose an algorithm that pre-processes the data so that data points with high eccentricity are replicated. The improvement of ellipse fitting is then demonstrated empirically in real-world application of 3D reconstruction of a plant root system for phenotypic analysis. The degree of improvement for different underlying ellipse fitting methods as a function of data noise level is also analysed. We show that almost every method tested, irrespective of whether it minimizes algebraic error or geometric error, shows improvement in the fit following data augmentation using the proposed pre-processing algorithm.
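
    A loose Python sketch of the augmentation idea is given below. The paper's definition of data-point eccentricity is not reproduced here; a placeholder score (distance from the data centroid) stands in for it, and high-scoring points are simply replicated before an ordinary algebraic least-squares ellipse fit.

      # Sketch only: the eccentricity score below is a stand-in, not the paper's definition.
      import numpy as np

      def fit_conic(x, y):
          """Least-squares conic ax^2 + bxy + cy^2 + dx + ey = 1 through the points."""
          A = np.column_stack([x**2, x * y, y**2, x, y])
          coef, *_ = np.linalg.lstsq(A, np.ones_like(x), rcond=None)
          return coef

      def augment(x, y, score_quantile=0.75, copies=2):
          """Replicate points whose placeholder score is in the upper quartile."""
          pts = np.column_stack([x, y])
          score = np.linalg.norm(pts - pts.mean(axis=0), axis=1)      # stand-in score only
          hot = score >= np.quantile(score, score_quantile)
          extra = np.repeat(pts[hot], copies, axis=0)
          out = np.vstack([pts, extra])
          return out[:, 0], out[:, 1]

      # Noisy samples from a partial ellipse (x/4)^2 + (y/2)^2 = 1
      rng = np.random.default_rng(4)
      t = rng.uniform(0, 1.5 * np.pi, 80)
      x = 4 * np.cos(t) + rng.normal(0, 0.05, t.size)
      y = 2 * np.sin(t) + rng.normal(0, 0.05, t.size)

      print("plain:    ", np.round(fit_conic(x, y), 3))
      print("augmented:", np.round(fit_conic(*augment(x, y)), 3))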

  12. Empirically Based Phenotypic Profiles of Children with Pervasive Developmental Disorders: Interpretation in the Light of the DSM-5

    ERIC Educational Resources Information Center

    Greaves-Lord, Kirstin; Eussen, Mart L. J. M.; Verhulst, Frank C.; Minderaa, Ruud B.; Mandy, William; Hudziak, James J.; Steenhuis, Mark Peter; de Nijs, Pieter F.; Hartman, Catharina A.

    2013-01-01

    This study aimed to contribute to the Diagnostic and Statistical Manual (DSM) debates on the conceptualization of autism by investigating (1) whether empirically based distinct phenotypic profiles could be distinguished within a sample of mainly cognitively able children with pervasive developmental disorder (PDD), and (2) how profiles related to…

  13. The Myth of Social Class and Criminality: An Empirical Assessment of the Empirical Evidence.

    ERIC Educational Resources Information Center

    Tittle, Charles R.; And Others

    1978-01-01

    Thirty-five studies examining the relationship between social class and crime/delinquency are reduced to comparable statistics using as units of analysis instances where the relationship was studied for specific categories of age, sex, race, place of residence, data type, or offense. Findings from 363 instances are summarized and patterns are…

  14. A New Sample Size Formula for Regression.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The focus of this research was to determine the efficacy of a new method of selecting sample sizes for multiple linear regression. A Monte Carlo simulation was used to study both empirical predictive power rates and empirical statistical power rates of the new method and seven other methods: those of C. N. Park and A. L. Dudycha (1974); J. Cohen…

  15. Assimilation approach to measuring organizational change from pre- to post-intervention

    PubMed Central

    Moore, Scott C; Osatuke, Katerine; Howe, Steven R

    2014-01-01

    AIM: To present a conceptual and measurement strategy that allows intervention progress to be evaluated objectively and sensitively, based on data about participants’ perceptions of presenting problems. METHODS: We used as an example an organization development intervention at a United States Veterans Affairs medical center. Within a year, the intervention addressed the hospital’s initially serious problems and multiple stakeholders (employees, management, union representatives) reported satisfaction with progress made. Traditional quantitative outcome measures, however, failed to capture the strong positive impact consistently reported by several types of stakeholders in qualitative interviews. To address the paradox, full interview data describing the medical center pre- and post- intervention were examined by applying a validated theoretical framework from another discipline: Psychotherapy research. The Assimilation model is a clinical-developmental theory that describes empirically grounded change levels in problematic experiences, e.g., problems reported by participants. The model, its measure (the Assimilation of Problematic Experiences Scale, APES), and its rating procedure have been previously applied across various populations and problem types, mainly in clinical but also in non-clinical settings. We applied the APES to the transcribed qualitative data of intervention participants’ interviews, using a method closely replicating prior assimilation research (the process whereby trained clinicians familiar with the Assimilation model work with full, transcribed interview data to assign the APES ratings). The APES ratings summarized levels of progress, defined as participants’ assimilation level of problematic experiences, and were compared from pre- to post-intervention. RESULTS: The results were consistent with participants’ own reported perceptions of the intervention impact. An increase in APES levels from pre- to post-intervention suggested improvement, missed in the previous quantitative measures (the Maslach Burnout Inventory and the Work Environment Scale). The progress specifically consisted of participants’ moving from the APES stages where the problematic experience was avoided, to the APES stages where awareness and attention to the problems were steadily sustained, although the problems were not yet fully processed or resolved. These results explain why the conventional outcome measures failed to reflect the intervention progress; they narrowly defined progress as resolution of the presenting problems and alleviation of symptomatic distress. In the Assimilation model, this definition only applies to a sub-segment of the change continuum, specifically the latest APES stages. The model defines progress as change in psychological processes used in response to the problem, i.e., a growing ability to deal with problematic issues non-defensively, manifested differently depending on APES stages. At early stages, progress is an increased ability to face the problem rather than turning away. At later APES stages, progress involves naming, understanding and successfully addressing the problem. The assimilation approach provides a broader developmental context compared to exclusively symptom-, problem-, or behavior-focused approaches that typically inform outcome measurement in interpersonally based interventions. 
In our data, this made the difference between reflecting (APES) vs missing (Maslach Burnout Inventory, Work Environment Scale) the pre-post change that was strongly perceived by the intervention recipients. CONCLUSION: The results illustrated a working solution to the challenge of objectively evaluating progress in subjectively experienced problems. This approach informs measuring change in psychologically based interventions. PMID:24660141

  16. Study of components and statistical reaction mechanism in simulation of nuclear process for optimized production of {sup 64}Cu and {sup 67}Ga medical radioisotopes using TALYS, EMPIRE and LISE++ nuclear reaction and evaporation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nasrabadi, M. N., E-mail: mnnasrabadi@ast.ui.ac.ir; Sepiani, M.

    2015-03-30

    Production of medical radioisotopes is one of the most important tasks in the field of nuclear technology. These radioactive isotopes are mainly produced through a variety of nuclear processes. In this research, excitation functions and nuclear reaction mechanisms are studied to simulate the production of these radioisotopes in the TALYS, EMPIRE and LISE++ reaction codes; then, parameters and different models of nuclear level density, one of the most important components in statistical reaction models, are adjusted for optimum production of the desired radioactive yields.

  17. Study of components and statistical reaction mechanism in simulation of nuclear process for optimized production of 64Cu and 67Ga medical radioisotopes using TALYS, EMPIRE and LISE++ nuclear reaction and evaporation codes

    NASA Astrophysics Data System (ADS)

    Nasrabadi, M. N.; Sepiani, M.

    2015-03-01

    Production of medical radioisotopes is one of the most important tasks in the field of nuclear technology. These radioactive isotopes are mainly produced through a variety of nuclear processes. In this research, excitation functions and nuclear reaction mechanisms are studied to simulate the production of these radioisotopes in the TALYS, EMPIRE & LISE++ reaction codes; then, parameters and different models of nuclear level density, one of the most important components in statistical reaction models, are adjusted for optimum production of the desired radioactive yields.

  18. Assessment of Current Jet Noise Prediction Capabilities

    NASA Technical Reports Server (NTRS)

    Hunter, Craid A.; Bridges, James E.; Khavaran, Abbas

    2008-01-01

    An assessment was made of the capability of jet noise prediction codes over a broad range of jet flows, with the objective of quantifying current capabilities and identifying areas requiring future research investment. Three separate codes in NASA's possession, representative of two classes of jet noise prediction codes, were evaluated: one empirical and two statistical. The empirical code is the Stone Jet Noise Module (ST2JET) contained within the ANOPP aircraft noise prediction code. It is well documented, and represents the state of the art in semi-empirical acoustic prediction codes where virtual sources are attributed to various aspects of noise generation in each jet. These sources, in combination, predict the spectral directivity of a jet plume. A total of 258 jet noise cases were examined on the ST2JET code, each run requiring only fractions of a second to complete. Two statistical jet noise prediction codes were also evaluated: JeNo v1 and Jet3D. Fewer cases were run for the statistical prediction methods because they require substantially more resources, typically a Reynolds-Averaged Navier-Stokes solution of the jet, volume integration of the source statistical models over the entire plume, and a numerical solution of the governing propagation equation within the jet. In the evaluation process, substantial justification of experimental datasets used in the evaluations was made. In the end, none of the current codes can predict jet noise within experimental uncertainty. The empirical code came within 2 dB on a 1/3 octave spectral basis for a wide range of flows. The statistical code Jet3D was within experimental uncertainty at broadside angles for hot supersonic jets, but errors in peak frequency and amplitude put it out of experimental uncertainty at cooler, lower speed conditions. Jet3D did not predict changes in directivity in the downstream angles. The statistical code JeNo v1 was within experimental uncertainty predicting noise from cold subsonic jets at all angles, but did not predict changes with heating of the jet and did not account for directivity changes at supersonic conditions. Shortcomings addressed here give direction for future work relevant to the statistical-based prediction methods. A full report will be released as a chapter in a NASA publication assessing the state of the art in aircraft noise prediction.

  19. Pre-Statistical Process Control: Making Numbers Count! JobLink Winning at Work Instructor's Manual, Module 3.

    ERIC Educational Resources Information Center

    Coast Community Coll. District, Costa Mesa, CA.

    This instructor's manual for workplace trainers contains the materials required to conduct a course in pre-statistical process control. The course consists of six lessons for workers and two lessons for supervisors that discuss the following: concepts taught in the six lessons; workers' progress in the individual lessons; and strategies for…

  20. Exploring Pre-Service Teachers' Understanding of Statistical Variation: Implications for Teaching and Research

    ERIC Educational Resources Information Center

    Sharma, Sashi

    2007-01-01

    Concerns about the importance of variation in statistics education and a lack of research in this topic led to a preliminary study which explored pre-service teachers' ideas in this area. The teachers completed a written questionnaire about variation in sampling and distribution contexts. Responses were categorised in relation to a framework that…

  1. Pre-defined and optional staging for the deployment of enterprise systems: a case study and a framework

    NASA Astrophysics Data System (ADS)

    Lichtenstein, Yossi; Cucuy, Shy; Fink, Lior

    2017-03-01

    The effective deployment of enterprise systems has been a major challenge for many organisations. Customising the new system, changing business processes, and integrating multiple information sources are all difficult tasks. As such, they are typically done in carefully planned stages in a process known as phased implementation. Using ideas from Option Theory, this article critiques aspects of phased implementation. One customer relationship management (CRM) project and its phased implementation are described in detail and ten other enterprise system deployments are summarised as a basis for the observation that almost all deployment stages are pre-defined operational steps rather than decision points. However, Option Theory suggests that optional stages, to be used only when risk materialises, should be integral parts of project plans. Although such optional stages are often more valuable than pre-defined stages, the evidence presented in this article shows that they are only rarely utilised. Therefore, a simple framework is presented; it first identifies risks related to the deployment of enterprise systems, then identifies optional stages that can mitigate these risks, and finally compares the costs and benefits of both pre-defined and optional stages.

  2. Discovering new events beyond the catalogue—application of empirical matched field processing to Salton Sea geothermal field seismicity

    DOE PAGES

    Wang, Jingbo; Templeton, Dennise C.; Harris, David B.

    2015-07-30

    Using empirical matched field processing (MFP), we compare 4 yr of continuous seismic data to a set of 195 master templates from within an active geothermal field and identify over 140 per cent more events than were identified using traditional detection and location techniques alone. In managed underground reservoirs, a substantial fraction of seismic events can be excluded from the official catalogue due to an inability to clearly identify seismic-phase onsets. Empirical MFP can improve the effectiveness of current seismic detection and location methodologies by using conventionally located events with higher signal-to-noise ratios as master events to define wavefield templates that could then be used to map normally discarded indistinct seismicity. Since MFP does not require picking, it can be carried out automatically and rapidly once suitable templates are defined. In this application, we extend MFP by constructing local-distance empirical master templates using Southern California Earthquake Data Center archived waveform data of events originating within the Salton Sea Geothermal Field. We compare the empirical templates to continuous seismic data collected between 1 January 2008 and 31 December 2011. The empirical MFP method successfully identifies 6249 additional events, while the original catalogue reported 4352 events. The majority of these new events are lower-magnitude events with magnitudes between M0.2–M0.8. Here, the increased spatial-temporal resolution of the microseismicity map within the geothermal field illustrates how empirical MFP, when combined with conventional methods, can significantly improve seismic network detection capabilities, which can aid in long-term sustainability and monitoring of managed underground reservoirs.
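
    At its core, the detection step described above is a form of template matching: waveforms (or, in full MFP, array-wide wavefield structure) from well-located master events are slid over the continuous record and a match statistic is thresholded. The sketch below illustrates only the single-channel version of that idea with a normalized cross-correlation detector; the function name, threshold, and toy data are illustrative assumptions, not values from the study.

```python
import numpy as np

def normalized_xcorr_detect(continuous, template, threshold=0.7):
    """Slide a master-event template over continuous data and return sample
    indices where the normalized cross-correlation exceeds a threshold.
    Single-channel simplification; full MFP would combine phase/amplitude
    structure across an array of stations."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    detections = []
    for i in range(len(continuous) - n + 1):
        w = continuous[i:i + n]
        ws = w.std()
        if ws == 0:
            continue
        cc = np.sum(t * (w - w.mean()) / ws)   # Pearson correlation in [-1, 1]
        if cc >= threshold:
            detections.append((i, cc))
    return detections

# toy usage: bury a scaled copy of the template in noise
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 8 * np.pi, 200)) * np.hanning(200)
data = rng.normal(0, 0.3, 5000)
data[1200:1400] += 0.8 * template
print(normalized_xcorr_detect(data, template, threshold=0.6)[:3])
```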

  3. Pre-crash scenario typology for crash avoidance research

    DOT National Transportation Integrated Search

    2007-04-01

    This report defines a new pre-crash scenario typology for crash avoidance research based on the 2004 General Estimates System (GES) crash database, which consists of pre-crash scenarios depicting vehicle movements and dynamics as well as the critical...

  4. A REVIEW OF STATISTICAL METHODS FOR THE METEOROLOGICAL ADJUSTMENT OF TROPOSPHERIC OZONE

    EPA Science Inventory

    A variety of statistical methods for meteorological adjustment of ozone have been proposed in the literature over the last decade for purposes of forecasting, estimating ozone time trends, or investigating underlying mechanisms from an empirical perspective. The methods can be...

  5. Control Theory and Statistical Generalizations.

    ERIC Educational Resources Information Center

    Powers, William T.

    1990-01-01

    Contrasts modeling methods in control theory to the methods of statistical generalizations in empirical studies of human or animal behavior. Presents a computer simulation that predicts behavior based on variables (effort and rewards) determined by the invariable (desired reward). Argues that control theory methods better reflect relationships to…
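
    A minimal simulation in the spirit of the control-theory account can make the contrast concrete: the agent's observable behavior (effort) varies with environmental conditions, while the controlled quantity (the desired reward) stays fixed. The sketch below is a generic negative-feedback loop, not the simulation described in the article; all names and constants are illustrative.

```python
# Minimal control-theory sketch of behavior: the agent varies its effort
# (the observed "behavior") so that the reward it experiences tracks a fixed
# desired reward, even as the environment's payoff rate changes.
def simulate(desired_reward=10.0, gain=0.5, steps=50):
    effort, history = 0.0, []
    for t in range(steps):
        payoff_rate = 2.0 if t < 25 else 1.0   # environment halves the payoff midway
        reward = payoff_rate * effort           # environment converts effort to reward
        error = desired_reward - reward         # discrepancy sensed by the agent
        effort += gain * error                  # behavior changes to cancel the error
        history.append((t, round(effort, 2), round(reward, 2)))
    return history

for t, effort, reward in simulate()[::10]:
    print(t, effort, reward)                    # effort doubles when payoff halves
```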

  6. Contrast Analysis: A Tutorial

    ERIC Educational Resources Information Center

    Haans, Antal

    2018-01-01

    Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient manner in many statistical software packages. This…
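
    For readers unfamiliar with the method, a contrast test weights the group means by coefficients that sum to zero and divides the resulting estimate by a standard error computed from the pooled within-group variance. The sketch below shows one common way to compute this; it is a generic illustration with synthetic data, not code from the tutorial.

```python
import numpy as np
from scipy import stats

def contrast_test(groups, weights):
    """Test a single contrast among group means (weights must sum to 0).
    Returns the contrast estimate, t statistic, and two-sided p value."""
    assert abs(sum(weights)) < 1e-9
    means = np.array([np.mean(g) for g in groups])
    ns = np.array([len(g) for g in groups])
    # pooled within-group (error) variance, as in a one-way ANOVA
    ss_within = sum(np.sum((np.asarray(g) - m) ** 2) for g, m in zip(groups, means))
    df_error = int(ns.sum()) - len(groups)
    mse = ss_within / df_error
    est = float(np.dot(weights, means))
    se = np.sqrt(mse * np.sum(np.array(weights) ** 2 / ns))
    t = est / se
    p = 2 * stats.t.sf(abs(t), df_error)
    return est, t, p

# e.g. does a treatment group differ from the average of two control groups?
rng = np.random.default_rng(1)
treat = rng.normal(5.5, 1, 20)
ctrl1 = rng.normal(5.0, 1, 20)
ctrl2 = rng.normal(5.0, 1, 20)
print(contrast_test([treat, ctrl1, ctrl2], [1, -0.5, -0.5]))
```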

  7. A Critical Review of the Canadian Empirical Literature: Documenting Generation 1.5's K-16 Trajectories

    ERIC Educational Resources Information Center

    Garnett, Bruce

    2012-01-01

    Little empirical research has ever systematically documented the academic trajectories of Generation 1.5 in Canadian schools. Indeed, this label has not even been used to define the population of interest in the studies reviewed here. Nonetheless, some earlier work, along with more current studies made possible by recent availability of data, has…

  8. EGG: hatching a mock Universe from empirical prescriptions⋆

    NASA Astrophysics Data System (ADS)

    Schreiber, C.; Elbaz, D.; Pannella, M.; Merlin, E.; Castellano, M.; Fontana, A.; Bourne, N.; Boutsia, K.; Cullen, F.; Dunlop, J.; Ferguson, H. C.; Michałowski, M. J.; Okumura, K.; Santini, P.; Shu, X. W.; Wang, T.; White, C.

    2017-06-01

    This paper introduces EGG, the Empirical Galaxy Generator, a tool designed within the ASTRODEEP collaboration to generate mock galaxy catalogs for deep fields with realistic fluxes and simple morphologies. The simulation procedure is based exclusively on empirical prescriptions - rather than first principles - to provide the most accurate match with current observations at 0

  9. Measured, modeled, and causal conceptions of fitness

    PubMed Central

    Abrams, Marshall

    2012-01-01

    This paper proposes partial answers to the following questions: in what senses can fitness differences plausibly be considered causes of evolution? What relationships are there between fitness concepts used in empirical research, modeling, and abstract theoretical proposals? How does the relevance of different fitness concepts depend on research questions and methodological constraints? The paper develops a novel taxonomy of fitness concepts, beginning with type fitness (a property of a genotype or phenotype), token fitness (a property of a particular individual), and purely mathematical fitness. Type fitness includes statistical type fitness, which can be measured from population data, and parametric type fitness, which is an underlying property estimated by statistical type fitnesses. Token fitness includes measurable token fitness, which can be measured on an individual, and tendential token fitness, which is assumed to be an underlying property of the individual in its environmental circumstances. Some of the paper's conclusions can be outlined as follows: claims that fitness differences do not cause evolution are reasonable when fitness is treated as statistical type fitness, measurable token fitness, or purely mathematical fitness. Some of the ways in which statistical methods are used in population genetics suggest that what natural selection involves are differences in parametric type fitnesses. Further, it's reasonable to think that differences in parametric type fitness can cause evolution. Tendential token fitnesses, however, are not themselves sufficient for natural selection. Though parametric type fitnesses are typically not directly measurable, they can be modeled with purely mathematical fitnesses and estimated by statistical type fitnesses, which in turn are defined in terms of measurable token fitnesses. The paper clarifies the ways in which fitnesses depend on pragmatic choices made by researchers. PMID:23112804

  10. Organic food consumption during pregnancy and its association with health-related characteristics: the KOALA Birth Cohort Study.

    PubMed

    Simões-Wüst, Ana Paula; Moltó-Puigmartí, Carolina; Jansen, Eugene Hjm; van Dongen, Martien Cjm; Dagnelie, Pieter C; Thijs, Carel

    2017-08-01

    To investigate the associations of organic food consumption with maternal pre-pregnancy BMI, hypertension and diabetes in pregnancy, and several blood biomarkers of pregnant women. Prospective cohort study. Pregnant women were recruited at midwives' practices and through channels related to consumption of food from organic origin. Pregnant women who filled in the FFQ and donated a blood sample (n = 1339). Participant groups were defined based on the share of consumed organic products; to discriminate between effects of food origin and food patterns, healthy diet indicators were considered in some statistical models. Consumption of organic food was associated with a more favourable pre-pregnancy BMI and lower prevalence of gestational diabetes. Compared with participants consuming no organic food (reference group), a marker of dairy products intake (pentadecanoic acid) and trans-fatty acids from natural origin (vaccenic and rumenic acids) were higher among participants consuming organic food (organic groups), whereas elaidic acid, a marker of the intake of trans-fatty acids found in industrially hydrogenated fats, was lower. Plasma levels of homocysteine and 25-hydroxyvitamin D were lower in the organic groups than in the reference group. Differences in pentadecanoic acid, vaccenic acid and vitamin D retained statistical significance when correcting for indicators of the healthy diet pattern associated with the consumption of organic food. Consumption of organic food during pregnancy is associated with several health-related characteristics and blood biomarkers. Part of the observed associations is explained by food patterns accompanying the consumption of organic food.

  11. Estimation and Modelling of Land Surface Temperature Using Landsat 7 ETM+ Images and Fuzzy System Techniques

    NASA Astrophysics Data System (ADS)

    Bisht, K.; Dodamani, S. S.

    2016-12-01

    Modelling of Land Surface Temperature is essential for short-term and long-term environmental studies and for management of the Earth's resources. The objective of this research is to estimate and model Land Surface Temperature (LST). For this purpose, Landsat 7 ETM+ images from the period 2007 to 2012 were used for retrieving LST and processed in MATLAB using Mamdani fuzzy inference systems (MFIS), with pre-monsoon and post-monsoon LST included in the fuzzy model. Mangalore City in Karnataka state, India, was taken as the study area. The fuzzy model inputs are the retrieved pre-monsoon and post-monsoon temperatures, and LST was chosen as the output. In order to develop a fuzzy model for LST, seven fuzzy subsets, nineteen rules and one output are considered for the estimation of weekly mean air temperature. The subsets are very low (VL), low (L), medium low (ML), medium (M), medium high (MH), high (H) and very high (VH). Estimated LST was also obtained with the TVX (Surface Temperature Vegetation Index) and the empirical method. The study showed that the fuzzy model M4/7-19-1 (model 4, 7 fuzzy sets, 19 rules and 1 output) developed over Mangalore City provided more accurate outcomes than the other models (M1, M2, M3, M5). The results were evaluated with standard statistical measures. The best correlation coefficient (R) and root mean squared error (RMSE) between estimated and measured values were found to be 0.966 and 1.607 K for pre-monsoon LST and 0.963 and 1.623 K for post-monsoon LST, respectively.
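
    The Mamdani procedure referred to above fuzzifies the crisp inputs with membership functions, fires IF-THEN rules with a min (AND) operator, aggregates the clipped output sets with max, and defuzzifies by centroid. The toy sketch below shows that pipeline with only two rules and illustrative membership functions; it is not the study's M4/7-19-1 model.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mamdani_lst(pre_k, post_k):
    """Toy two-rule Mamdani system: fuzzify the pre- and post-monsoon
    temperatures (K), apply min (AND) rules, aggregate with max, and
    defuzzify by centroid over a discretized LST universe."""
    out = np.linspace(290, 320, 301)            # output universe (K)
    # rule 1: pre LOW  and post LOW  -> LST LOW
    w1 = min(tri(pre_k, 290, 295, 302), tri(post_k, 290, 295, 302))
    # rule 2: pre HIGH and post HIGH -> LST HIGH
    w2 = min(tri(pre_k, 300, 310, 320), tri(post_k, 300, 310, 320))
    agg = np.maximum(np.minimum(w1, tri(out, 290, 296, 304)),
                     np.minimum(w2, tri(out, 302, 312, 320)))
    if agg.sum() == 0:
        return None
    return float(np.sum(out * agg) / np.sum(agg))   # centroid defuzzification

print(round(mamdani_lst(305.0, 308.0), 1))
```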

  12. A weighted generalized score statistic for comparison of predictive values of diagnostic tests.

    PubMed

    Kosinski, Andrzej S

    2013-03-15

    Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose their re-formulations that are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with a new re-formulation we presented, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic that incorporates empirical covariance matrix with newly proposed weights. This statistic is simple to compute, always reduces to the score statistic in the independent samples situation, and preserves type I error better than the other statistics as demonstrated by simulations. Thus, we believe that the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas of the Wald statistics may be useful for easy computation of confidence intervals for difference of predictive values. The introduced concepts have potential to lead to development of the WGS test statistic in a general GEE setting. Copyright © 2012 John Wiley & Sons, Ltd.
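
    As a concrete, simplified illustration of the quantity being compared (not the WGS statistic itself), the sketch below computes the two positive predictive values from a paired design and assesses their difference with a patient-level bootstrap, which respects the pairing; all data and names are synthetic.

```python
import numpy as np

def ppv_difference_bootstrap(disease, test1, test2, n_boot=2000, seed=0):
    """Paired-design comparison of two positive predictive values by a
    patient-level bootstrap (a simple resampling illustration, not the
    weighted generalized score statistic). Arrays are 0/1 indicators."""
    rng = np.random.default_rng(seed)
    d, t1, t2 = map(np.asarray, (disease, test1, test2))
    ppv = lambda d, t: d[t == 1].mean()          # P(disease | test positive)
    observed = ppv(d, t1) - ppv(d, t2)
    diffs, n = [], len(d)
    while len(diffs) < n_boot:
        idx = rng.integers(0, n, n)              # resample patients, keeping pairing
        db, t1b, t2b = d[idx], t1[idx], t2[idx]
        if t1b.sum() == 0 or t2b.sum() == 0:
            continue
        diffs.append(ppv(db, t1b) - ppv(db, t2b))
    ci = np.percentile(np.array(diffs), [2.5, 97.5])
    return observed, ci

# toy paired data: 200 patients, each receiving both diagnostic tests
rng = np.random.default_rng(42)
disease = rng.integers(0, 2, 200)
test1 = np.where(disease == 1, rng.random(200) < 0.85, rng.random(200) < 0.15).astype(int)
test2 = np.where(disease == 1, rng.random(200) < 0.75, rng.random(200) < 0.20).astype(int)
print(ppv_difference_bootstrap(disease, test1, test2))
```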

  13. A weighted generalized score statistic for comparison of predictive values of diagnostic tests

    PubMed Central

    Kosinski, Andrzej S.

    2013-01-01

    Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose their re-formulations which are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with a new re-formulation we present, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic which incorporates empirical covariance matrix with newly proposed weights. This statistic is simple to compute, it always reduces to the score statistic in the independent samples situation, and it preserves type I error better than the other statistics as demonstrated by simulations. Thus, we believe the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas of the Wald statistics may be useful for easy computation of confidence intervals for difference of predictive values. The introduced concepts have potential to lead to development of the weighted generalized score test statistic in a general GEE setting. PMID:22912343

  14. Mere exposure alters category learning of novel objects.

    PubMed

    Folstein, Jonathan R; Gauthier, Isabel; Palmeri, Thomas J

    2010-01-01

    We investigated how mere exposure to complex objects with correlated or uncorrelated object features affects later category learning of new objects not seen during exposure. Correlations among pre-exposed object dimensions influenced later category learning. Unlike other published studies, the collection of pre-exposed objects provided no information regarding the categories to be learned, ruling out unsupervised or incidental category learning during pre-exposure. Instead, results are interpreted with respect to statistical learning mechanisms, providing one of the first demonstrations of how statistical learning can influence visual object learning.

  15. Mere Exposure Alters Category Learning of Novel Objects

    PubMed Central

    Folstein, Jonathan R.; Gauthier, Isabel; Palmeri, Thomas J.

    2010-01-01

    We investigated how mere exposure to complex objects with correlated or uncorrelated object features affects later category learning of new objects not seen during exposure. Correlations among pre-exposed object dimensions influenced later category learning. Unlike other published studies, the collection of pre-exposed objects provided no information regarding the categories to be learned, ruling out unsupervised or incidental category learning during pre-exposure. Instead, results are interpreted with respect to statistical learning mechanisms, providing one of the first demonstrations of how statistical learning can influence visual object learning. PMID:21833209

  16. Defining effective community support for long-term psychiatric patients according to behavioural principles.

    PubMed

    Evans, I M; Moltzen, N L

    2000-08-01

    The purpose of this article is to define the characteristics of effective support in community mental health settings for patients with serious and persistent mental illness. A broad literature providing empirical evidence on competent caregiver behaviours and styles is selectively reviewed. Relevant findings from family caregiver research and studies of social environments that enhance skill development in people with intellectual disabilities are incorporated, within a cognitive-behavioural framework. Six important domains are identified which represent positive caregiver styles: acceptance, creating a positive atmosphere, expectations of change, responsiveness, normalisation and educativeness. The characteristics hypothesised to be critical for caregivers and support workers are defined in a general way that can allow for individualisation according to the goals of the programs and the cultural priorities of staff and patients. Further empirical validation of these characteristics would enable community mental health services to provide more specialised clinical treatments.

  17. Regional Earthquake Likelihood Models: A realm on shaky grounds?

    NASA Astrophysics Data System (ADS)

    Kossobokov, V.

    2005-12-01

    Seismology is juvenile, and its appropriate statistical tools to date may have a "medieval flavor" for those who hurry to apply the fuzzy language of a highly developed probability theory. To become "quantitatively probabilistic", earthquake forecasts/predictions must be defined with scientific accuracy. Following the most popular objectivists' viewpoint on probability, we cannot claim "probabilities" adequate without a long series of "yes/no" forecast/prediction outcomes. Without the "antiquated binary language" of "yes/no" certainty we cannot judge an outcome ("success/failure") and, therefore, cannot objectively quantify a forecast/prediction method's performance. Likelihood scoring is one of the delicate tools of statistics, which could be worthless or even misleading when inappropriate probability models are used. This is a basic loophole for a misuse of likelihood, as well as other statistical methods, in practice. The flaw could be avoided by an accurate verification of generic probability models against the empirical data. This is not an easy task within the Regional Earthquake Likelihood Models (RELM) methodology, which neither defines the forecast precision nor allows a means to judge the ultimate success or failure in specific cases. Hopefully, the RELM group realizes the problem and its members do their best to close the hole with an adequate, data-supported choice. Regretfully, this is not the case with the erroneous choice of Gerstenberger et al., who started the public web site with forecasts of expected ground shaking for 'tomorrow' (Nature 435, 19 May 2005). Gerstenberger et al. have inverted the critical evidence of their study, i.e., the 15 years of recent seismic record accumulated in just one figure, which suggests rejecting with confidence above 97% "the generic California clustering model" used in automatic calculations. As a result, since the date of publication in Nature, the United States Geological Survey website delivers to the public, emergency planners and the media a forecast product that is based on wrong assumptions violating the best-documented earthquake statistics in California, whose accuracy was not investigated, and whose forecasts were not tested in a rigorous way.

  18. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    PubMed

    Shriver, K A

    1986-01-01

    Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that a reasonable stability of economic depreciation rates of decline may exist over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.

  19. Removal of antibiotics in a parallel-plate thin-film-photocatalytic reactor: Process modeling and evolution of transformation by-products and toxicity.

    PubMed

    Özkal, Can Burak; Frontistis, Zacharias; Antonopoulou, Maria; Konstantinou, Ioannis; Mantzavinos, Dionissios; Meriç, Süreyya

    2017-10-01

    Photocatalytic degradation of sulfamethoxazole (SMX) antibiotic has been studied under recycling batch and homogeneous flow conditions in a thin-film coated immobilized system, namely a parallel-plate (PPL) reactor. The experiments were designed and statistically evaluated with a factorial design (FD) approach with the intent of providing a mathematical model that takes into account the parameters influencing process performance. Initial antibiotic concentration, UV energy level, irradiated surface area, water matrix (ultrapure and secondary treated wastewater) and time were defined as model parameters. A full 2^5 experimental design consisted of 32 randomized experiments. PPL reactor test experiments were carried out in order to set boundary levels for hydraulic, volumetric and defined process parameters. TTIP-based thin films with polyethylene glycol + TiO2 additives were fabricated according to the pre-described methodology. Antibiotic degradation was monitored by High Performance Liquid Chromatography analysis, while the degradation products were identified by LC-TOF-MS analysis. Acute toxicity of untreated and treated SMX solutions was tested by the standard Daphnia magna method. Based on the obtained mathematical model, the response of the immobilized PC system is described with a polynomial equation. The statistically significant positive effects are initial SMX concentration, process time and the combined effect of both, while the combined effect of water matrix and irradiated surface area has an adverse effect on the rate of antibiotic degradation by photocatalytic oxidation. Process efficiency and the validity of the acquired mathematical model were also verified for the levofloxacin and cefaclor antibiotics. Immobilized PC degradation in the PPL reactor configuration was found capable of reducing effluent toxicity by simultaneous degradation of the SMX parent compound and TBPs. Copyright © 2017. Published by Elsevier B.V.
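
    A full 2^5 factorial design of the kind described simply enumerates every combination of the five factors at two coded levels, giving 32 runs that are then executed in random order. A minimal sketch follows, with factor names taken from the abstract and coded levels that are purely illustrative.

```python
from itertools import product
import random

# The five factors named in the abstract, each at two coded levels (-1 / +1).
factors = ["initial_SMX_conc", "UV_energy", "irradiated_area", "water_matrix", "time"]

# Full 2^5 factorial: every combination of levels, 2**5 = 32 runs.
design = list(product([-1, 1], repeat=len(factors)))
assert len(design) == 32

random.seed(7)
random.shuffle(design)                 # randomized run order, as in the study
for run, levels in enumerate(design[:4], start=1):
    print(run, dict(zip(factors, levels)))
```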

  20. EMPIRE: Nuclear Reaction Model Code System for Data Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman, M.; Capote, R.; Carlson, B.V.

    EMPIRE is a modular system of nuclear reaction codes, comprising various nuclear models, and designed for calculations over a broad range of energies and incident particles. A projectile can be a neutron, proton, any ion (including heavy-ions) or a photon. The energy range extends from the beginning of the unresolved resonance region for neutron-induced reactions (≈ keV) and goes up to several hundred MeV for heavy-ion induced reactions. The code accounts for the major nuclear reaction mechanisms, including direct, pre-equilibrium and compound nucleus ones. Direct reactions are described by a generalized optical model (ECIS03) or by the simplified coupled-channels approach (CCFUS). The pre-equilibrium mechanism can be treated by a deformation dependent multi-step direct (ORION + TRISTAN) model, by a NVWY multi-step compound one or by either a pre-equilibrium exciton model with cluster emission (PCROSS) or by another with full angular momentum coupling (DEGAS). Finally, the compound nucleus decay is described by the full featured Hauser-Feshbach model with γ-cascade and width-fluctuations. Advanced treatment of the fission channel takes into account transmission through a multiple-humped fission barrier with absorption in the wells. The fission probability is derived in the WKB approximation within the optical model of fission. Several options for nuclear level densities include the EMPIRE-specific approach, which accounts for the effects of the dynamic deformation of a fast rotating nucleus, the classical Gilbert-Cameron approach and pre-calculated tables obtained with a microscopic model based on HFB single-particle level schemes with collective enhancement. A comprehensive library of input parameters covers nuclear masses, optical model parameters, ground state deformations, discrete levels and decay schemes, level densities, fission barriers, moments of inertia and γ-ray strength functions. The results can be converted into ENDF-6 formatted files using the accompanying code EMPEND and completed with neutron resonances extracted from the existing evaluations. The package contains the full EXFOR (CSISRS) library of experimental reaction data that are automatically retrieved during the calculations. Publication quality graphs can be obtained using the powerful and flexible plotting package ZVView. The graphic user interface, written in Tcl/Tk, provides for easy operation of the system. This paper describes the capabilities of the code, outlines physical models and indicates parameter libraries used by EMPIRE to predict reaction cross sections and spectra, mainly for nucleon-induced reactions. Selected applications of EMPIRE are discussed, the most important being an extensive use of the code in evaluations of neutron reactions for the new US library ENDF/B-VII.0. Future extensions of the system are outlined, including a neutron resonance module as well as capabilities of generating covariances, using both KALMAN and Monte-Carlo methods, that are still being advanced and refined.

  1. Impact of Private Health Insurance on Lengths of Hospitalization and Healthcare Expenditure in India: Evidences from a Quasi-Experiment Study.

    PubMed

    Vellakkal, Sukumar

    2013-01-01

    Health insurers retrospectively administer package rates for various inpatient procedures as a provider payment mechanism to empanelled hospitals in the Indian healthcare market. This study analyzed the impact of private health insurance on healthcare utilization, in terms of both length of hospitalization and per-day hospitalization expenditure, in the Indian healthcare market, where package rates are retrospectively defined as the healthcare provider payment mechanism. The claim records of 94443 insured individuals and the hospitalisation data of 32665 uninsured individuals were used. By applying stepwise and propensity score matching methods, the sample of uninsured individuals was matched with the insured and the 'average treatment effect on the treated' (ATT) was estimated. Overall, the strategies of hospitals, the insured and insurers for maximizing their utility competed with each other. However, two aligning co-operative strategies between insurers and hospitals were significant, with the dominant role played by hospitals. Hospitals maximize their utility by providing high-cost healthcare on par with pre-defined package rates, but align with the interest of insurers by reducing the number (length) of hospitalisation days. The empirical results show that private health insurance coverage leads to i) a reduction in length of hospitalization, and ii) an increase in per-day hospital (health) expenditure. It is necessary to regulate and develop a competent healthcare market in the country, with proper monitoring mechanisms on healthcare utilization and benchmarks for the pricing and provision of healthcare services.
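
    The matching step described above can be sketched generically: estimate each individual's propensity score with a logistic model, match each insured (treated) individual to the nearest uninsured individual on that score, and average the outcome differences to obtain the ATT. The code below is an illustrative simplification with synthetic data, not the study's stepwise procedure or variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def att_by_ps_matching(X, treated, outcome):
    """1:1 nearest-neighbour matching on the estimated propensity score and
    the resulting average treatment effect on the treated (ATT).
    A generic sketch; covariates and outcomes are illustrative."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.where(treated == 1)[0]
    c_idx = np.where(treated == 0)[0]
    matched = [c_idx[np.argmin(np.abs(ps[c_idx] - ps[i]))] for i in t_idx]
    return float(np.mean(outcome[t_idx] - outcome[matched]))

# toy data: insured (treated) vs uninsured, with selection on observed covariates
rng = np.random.default_rng(3)
n = 2000
X = rng.normal(size=(n, 3))                                   # e.g. age, income, severity
p_treat = 1 / (1 + np.exp(-(X[:, 1] + 0.5 * X[:, 2])))
treated = (rng.random(n) < p_treat).astype(int)
outcome = 3 + X[:, 2] - 0.8 * treated + rng.normal(0, 1, n)   # e.g. length of stay
print(round(att_by_ps_matching(X, treated, outcome), 2))      # should be near -0.8
```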

  2. Addressing the mischaracterization of extreme rainfall in regional climate model simulations - A synoptic pattern based bias correction approach

    NASA Astrophysics Data System (ADS)

    Li, Jingwan; Sharma, Ashish; Evans, Jason; Johnson, Fiona

    2018-01-01

    Addressing systematic biases in regional climate model simulations of extreme rainfall is a necessary first step before assessing changes in future rainfall extremes. Commonly used bias correction methods are designed to match statistics of the overall simulated rainfall with observations. This assumes that change in the mix of different types of extreme rainfall events (i.e. convective and non-convective) in a warmer climate is of little relevance in the estimation of overall change, an assumption that is not supported by empirical or physical evidence. This study proposes an alternative approach to account for the potential change of alternate rainfall types, characterized here by synoptic weather patterns (SPs) using self-organizing maps classification. The objective of this study is to evaluate the added influence of SPs on the bias correction, which is achieved by comparing the corrected distribution of future extreme rainfall with that using conventional quantile mapping. A comprehensive synthetic experiment is first defined to investigate the conditions under which the additional information of SPs makes a significant difference to the bias correction. Using over 600,000 synthetic cases, statistically significant differences are found to be present in 46% cases. This is followed by a case study over the Sydney region using a high-resolution run of the Weather Research and Forecasting (WRF) regional climate model, which indicates a small change in the proportions of the SPs and a statistically significant change in the extreme rainfall over the region, although the differences between the changes obtained from the two bias correction methods are not statistically significant.
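
    Conventional quantile mapping, the benchmark the proposed approach is compared against, replaces each model value with the observed value at the same quantile of the historical distributions. A minimal sketch is given below; the synoptic-pattern-conditioned variant would apply the same mapping separately within each SP class. The data and parameters are synthetic.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: each future model value is assigned its
    quantile in the historical model distribution and replaced by the observed
    value at that quantile."""
    q = np.linspace(0, 1, 101)
    mq = np.quantile(model_hist, q)        # model climatology quantiles
    oq = np.quantile(obs_hist, q)          # observed climatology quantiles
    ranks = np.interp(model_future, mq, q) # quantile of each future value
    return np.interp(ranks, q, oq)         # corresponding observed value

# toy example: model rainfall biased low relative to observations
rng = np.random.default_rng(5)
obs = rng.gamma(2.0, 5.0, 3000)
mod_hist = rng.gamma(2.0, 3.5, 3000)
mod_fut = rng.gamma(2.2, 3.5, 1000)
print(round(mod_fut.mean(), 1), round(quantile_map(mod_hist, obs, mod_fut).mean(), 1))
```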

  3. Marx and Dahrendorf on Income Inequality, Class Consciousness and Class Conflict: An Empirical Test.

    ERIC Educational Resources Information Center

    Robinson, Robert V.; Kelley, Jonathan

    The issue addressed by this paper is the lack of empirical research on the class theories of Karl Marx and Ralf Dahrendorf. In order to bridge this gap, data are analyzed on the theoretical and statistical implications of Marx's theory (which focuses on ownership of the means of production) and Dahrendorf's theory (which focuses on authority in…

  4. Forest canopy effects on snow accumulation and ablation: an integrative review of empirical results

    Treesearch

    Andres Varhola; Nicholas C. Coops; Markus Weiler; R. Dan Moore

    2010-01-01

    The past century has seen significant research comparing snow accumulation and ablation in forested and open sites. In this review we compile and standardize the results of previous empirical studies to generate statistical relations between changes in forest cover and the associated changes in snow accumulation and ablation rate. The analysis drew upon 33 articles...

  5. Complex dynamics and empirical evidence (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Delli Gatti, Domenico; Gaffeo, Edoardo; Giulioni, Gianfranco; Gallegati, Mauro; Kirman, Alan; Palestrini, Antonio; Russo, Alberto

    2005-05-01

    Standard macroeconomics, based on a reductionist approach centered on the representative agent, is badly equipped to explain the empirical evidence where heterogeneity and industrial dynamics are the rule. In this paper we show that a simple agent-based model of heterogeneous financially fragile agents is able to replicate a large number of scaling type stylized facts with a remarkable degree of statistical precision.

  6. Covariations in ecological scaling laws fostered by community dynamics.

    PubMed

    Zaoli, Silvia; Giometto, Andrea; Maritan, Amos; Rinaldo, Andrea

    2017-10-03

    Scaling laws in ecology, intended both as functional relationships among ecologically relevant quantities and the probability distributions that characterize their occurrence, have long attracted the interest of empiricists and theoreticians. Empirical evidence exists of power laws associated with the number of species inhabiting an ecosystem, their abundances, and traits. Although their functional form appears to be ubiquitous, empirical scaling exponents vary with ecosystem type and resource supply rate. The idea that ecological scaling laws are linked has been entertained before, but the full extent of macroecological pattern covariations, the role of the constraints imposed by finite resource supply, and a comprehensive empirical verification are still unexplored. Here, we propose a theoretical scaling framework that predicts the linkages of several macroecological patterns related to species' abundances and body sizes. We show that such a framework is consistent with the stationary-state statistics of a broad class of resource-limited community dynamics models, regardless of parameterization and model assumptions. We verify predicted theoretical covariations by contrasting empirical data and provide testable hypotheses for yet unexplored patterns. We thus place the observed variability of ecological scaling exponents into a coherent statistical framework where patterns in ecology embed constrained fluctuations.

  7. Towards Validation of an Adaptive Flight Control Simulation Using Statistical Emulation

    NASA Technical Reports Server (NTRS)

    He, Yuning; Lee, Herbert K. H.; Davies, Misty D.

    2012-01-01

    Traditional validation of flight control systems is based primarily upon empirical testing. Empirical testing is sufficient for simple systems in which a) the behavior is approximately linear and b) humans are in-the-loop and responsible for off-nominal flight regimes. A different possible concept of operation is to use adaptive flight control systems with online learning neural networks (OLNNs) in combination with a human pilot for off-nominal flight behavior (such as when a plane has been damaged). Validating these systems is difficult because the controller is changing during the flight in a nonlinear way, and because the pilot and the control system have the potential to co-adapt in adverse ways; traditional empirical methods are unlikely to provide any guarantees in this case. Additionally, the time it takes to find unsafe regions within the flight envelope using empirical testing means that the time between adaptive controller design iterations is large. This paper describes a new concept for validating adaptive control systems using methods based on Bayesian statistics. This validation framework allows the analyst to build nonlinear models with modal behavior, and to have an uncertainty estimate for the difference between the behaviors of the model and the system under test.
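
    Statistical emulation of this kind typically fits a Gaussian-process surrogate to a limited number of expensive closed-loop simulation runs and then queries the surrogate, with uncertainty, across the flight envelope. The sketch below shows that pattern with a stand-in simulator; it is a generic illustration under assumed parameters, not the validation framework's actual models.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Stand-in for an expensive closed-loop simulation: returns some safety margin
# as a function of two flight-envelope parameters. Purely illustrative.
def simulate(x):
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1]) + 0.05 * np.random.default_rng(0).normal(size=len(x))

rng = np.random.default_rng(4)
X_train = rng.uniform(0, 1, size=(30, 2))        # a handful of simulation runs
y_train = simulate(X_train)

# Gaussian-process emulator of the simulation response surface
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.2),
                              normalize_y=True).fit(X_train, y_train)

X_query = rng.uniform(0, 1, size=(5, 2))         # new envelope points to assess
mean, std = gp.predict(X_query, return_std=True)
for m, s in zip(mean, std):
    print(f"predicted margin {m:+.3f} +/- {s:.3f}")
```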

  8. Empirical analysis of storm-time energetic electron enhancements

    NASA Astrophysics Data System (ADS)

    O'Brien, Thomas Paul, III

    This Ph.D. thesis documents a program for studying the appearance of energetic electrons in the Earth's outer radiation belts that is associated with many geomagnetic storms. The dynamic evolution of the electron radiation belts is an outstanding empirical problem in both theoretical space physics and its applied sibling, space weather. The project emphasizes the development of empirical tools and their use in testing several theoretical models of the energization of the electron belts. First, I develop the Statistical Asynchronous Regression technique to provide proxy electron fluxes throughout the parts of the radiation belts explored by geosynchronous and GPS spacecraft. Next, I show that a theoretical adiabatic model can relate the local time asymmetry of the proxy geosynchronous fluxes to the asymmetry of the geomagnetic field. Then, I perform a superposed epoch analysis on the proxy fluxes at local noon to identify magnetospheric and interplanetary precursors of relativistic electron enhancements. Finally, I use statistical and neural network phase space analyses to determine the hourly evolution of flux at a virtual stationary monitor. The dynamic equation quantitatively identifies the importance of different drivers of the electron belts. This project provides empirical constraints on theoretical models of electron acceleration.
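
    Superposed epoch analysis, one of the empirical tools mentioned, stacks the flux time series around a list of key times (for example storm onsets) and averages across events at each relative hour. A compact sketch with synthetic hourly data follows; all values are illustrative.

```python
import numpy as np

def superposed_epoch(series, epochs, before=24, after=72):
    """Stack a time series around a list of key times (e.g. storm onsets)
    and average across events, hour by hour relative to the epoch."""
    windows = []
    for e in epochs:
        if e - before >= 0 and e + after < len(series):
            windows.append(series[e - before:e + after])
    return np.arange(-before, after), np.nanmean(windows, axis=0)

# toy hourly "electron flux" with step-like enhancements after each storm
rng = np.random.default_rng(6)
flux = rng.lognormal(0, 0.3, 24 * 365)
onsets = rng.choice(np.arange(100, len(flux) - 100), 20, replace=False)
for e in onsets:
    flux[e:e + 48] *= 3.0                       # enhancement lasting two days
lags, mean_flux = superposed_epoch(flux, onsets)
print(mean_flux[:3].round(2), mean_flux[30:33].round(2))   # quiet vs enhanced hours
```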

  9. Pre-Service Teachers' Understanding of Measures of Centre: When the Meaning Gets Lost?

    ERIC Educational Resources Information Center

    Reaburn, Robyn

    2013-01-01

    Measures of centre (the mean, median and mode) are fundamental to the discipline of statistics. Yet previous research shows that students may not have a thorough conceptual understanding of these measures, even though these statistics are easy to calculate. This study describes the findings of a study of pre-service teachers' ideas of measure of…

  10. Pre-Service Mathematics Teachers' Use of Probability Models in Making Informal Inferences about a Chance Game

    ERIC Educational Resources Information Center

    Kazak, Sibel; Pratt, Dave

    2017-01-01

    This study considers probability models as tools for both making informal statistical inferences and building stronger conceptual connections between data and chance topics in teaching statistics. In this paper, we aim to explore pre-service mathematics teachers' use of probability models for a chance game, where the sum of two dice matters in…

  11. Signal Statistics and Maximum Likelihood Sequence Estimation in Intensity Modulated Fiber Optic Links Containing a Single Optical Pre-amplifier.

    PubMed

    Alić, Nikola; Papen, George; Saperstein, Robert; Milstein, Laurence; Fainman, Yeshaiahu

    2005-06-13

    Exact signal statistics for fiber-optic links containing a single optical pre-amplifier are calculated and applied to sequence estimation for electronic dispersion compensation. The performance is evaluated and compared with results based on the approximate chi-square statistics. We show that detection in existing systems based on exact statistics can be improved relative to using a chi-square distribution for realistic filter shapes. In contrast, for high-spectral efficiency systems the difference between the two approaches diminishes, and performance tends to be less dependent on the exact shape of the filter used.

  12. Intrusion Detection: Generics and State-of-the-Art (la Detection de l’intrusion: Modeles generiques et etat de l’art)

    DTIC Science & Technology

    2002-01-01

    by the user for a number of possible pre-defined intrusions. One of these pre-defined intrusions is the command “get /etc/passwd”. If this command is...Application-level firewalls: which check communication at the application level. An example is the string get /etc/passwd in the FTP protocol

  13. Quantifying interactions between real oscillators with information theory and phase models: Application to cardiorespiratory coupling

    NASA Astrophysics Data System (ADS)

    Zhu, Yenan; Hsieh, Yee-Hsee; Dhingra, Rishi R.; Dick, Thomas E.; Jacono, Frank J.; Galán, Roberto F.

    2013-02-01

    Interactions between oscillators can be investigated with standard tools of time series analysis. However, these methods are insensitive to the directionality of the coupling, i.e., the asymmetry of the interactions. An elegant alternative was proposed by Rosenblum and collaborators [M. G. Rosenblum, L. Cimponeriu, A. Bezerianos, A. Patzak, and R. Mrowka, Phys. Rev. E 65, 041909 (2002); M. G. Rosenblum and A. S. Pikovsky, Phys. Rev. E 64, 045202 (2001)] which consists in fitting the empirical phases to a generic model of two weakly coupled phase oscillators. This allows one to obtain the interaction functions defining the coupling and its directionality. A limitation of this approach is that a solution always exists in the least-squares sense, even in the absence of coupling. To preclude spurious results, we propose a three-step protocol: (1) Determine if a statistical dependency exists in the data by evaluating the mutual information of the phases; (2) if so, compute the interaction functions of the oscillators; and (3) validate the empirical oscillator model by comparing the joint probability of the phases obtained from simulating the model with that of the empirical phases. We apply this protocol to a model of two coupled Stuart-Landau oscillators and show that it reliably detects genuine coupling. We also apply this protocol to investigate cardiorespiratory coupling in anesthetized rats. We observe reciprocal coupling between respiration and heartbeat and that the influence of respiration on the heartbeat is generally much stronger than vice versa. In addition, we find that the vagus nerve mediates coupling in both directions.
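
    Step (1) of the protocol can be illustrated with a histogram-based estimate of the mutual information between the two wrapped phase series; a value well above that of independent (or surrogate) phases indicates a dependency worth modelling in steps (2) and (3). The sketch below is a generic illustration with synthetic phases, not the authors' implementation.

```python
import numpy as np

def phase_mutual_information(phi1, phi2, bins=16):
    """Estimate the mutual information (in bits) between two phase time
    series from the joint histogram of their wrapped values. Significance
    would be judged against phase-shuffled or surrogate data."""
    h, _, _ = np.histogram2d(phi1 % (2 * np.pi), phi2 % (2 * np.pi), bins=bins)
    p = h / h.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# toy example: a 1:1 phase-locked pair vs an independent pair
rng = np.random.default_rng(8)
phi1 = np.cumsum(rng.normal(0.10, 0.02, 50000))         # noisy phase of oscillator 1
phi_locked = phi1 + 0.7 + rng.normal(0, 0.2, 50000)     # locked to oscillator 1
phi_indep = np.cumsum(rng.normal(0.13, 0.02, 50000))    # independent oscillator
print(round(phase_mutual_information(phi1, phi_locked), 3),
      round(phase_mutual_information(phi1, phi_indep), 3))
```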

  14. Evaluation of median nerve T2 signal changes in patients with surgically treated carpal tunnel syndrome.

    PubMed

    Samanci, Yavuz; Karagöz, Yeşim; Yaman, Mehmet; Atçı, İbrahim Burak; Emre, Ufuk; Kılıçkesmez, Nuri Özgür; Çelik, Suat Erol

    2016-11-01

    To determine the accuracy of median nerve T2 evaluation and its relation with Boston Questionnaire (BQ) and nerve conduction studies (NCSs) in pre-operative and post-operative carpal tunnel syndrome (CTS) patients in comparison with healthy volunteers. Twenty-three CTS patients and 24 healthy volunteers underwent NCSs, median nerve T2 evaluation and self-administered BQ. Pre-operative and 1st year post-operative median nerve T2 values and cross-sectional areas (CSAs) were compared both within pre-operative and post-operative CTS groups, and with healthy volunteers. The relationship between MRI findings and BQ and NCSs was analyzed. The ROC curve analysis was used for determining the accuracy. The comparison of pre-operative and post-operative T2 values and CSAs revealed statistically significant improvements in the post-operative patient group (p<0.001 for all parameters). There were positive correlations between T2 values at all levels and BQ values, and positive and negative correlations were also found regarding T2 values and NCS findings in CTS patients. The receiver operating characteristic curve analysis for defined cut-off levels of median nerve T2 values in hands with severe CTS yielded excellent accuracy at all levels. However, this accuracy could not be demonstrated in hands with mild CTS. This study is the first to analyze T2 values in both pre-operative and post-operative CTS patients. The presence of increased T2 values in CTS patients compared to controls and excellent accuracy in hands with severe CTS indicates T2 signal changes related to CTS pathophysiology and possible utilization of T2 signal evaluation in hands with severe CTS. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Effects of eight weeks of functional training in the functional autonomy of elderly women: a pilot study.

    PubMed

    DE Matos, Dihogo G; Mazini Filho, Mauro L; Moreira, Osvaldo C; DE Oliveira, Cláudia E; DE Oliveira Venturini, Gabriela R; DA Silva-Grigoletto, Marzo E; Aidar, Felipe J

    2017-03-01

    This study aimed to evaluate the effects of eight weeks of functional training on the functional autonomy of the elderly. The study included 52 elderly women, 65.42±10.31 years, 65.29±11.30 kg body mass, 1.58±0.07 m height, 26.30±4.52 body mass index, 86.48±10.96 cm waist circumference. These elderly women received a specific functional training protocol and their functional autonomy was assessed at three specific times (0, 10 and 20 sessions). The evaluation consisted of a set of five tests defined by the Latin-American Development Group for the Elderly (GDLAM) to determine the functional autonomy of the elderly: walk 10 meters (C10m); stand up from a chair and walk straightaway (SUCWA); dress and undress a T-shirt (DUT); stand up from a sitting position (SUSP); stand up from a lying position (SULP). In each test, the time taken to complete the task was measured. There were statistically significant differences in all functional autonomy tests after 20 training sessions: C10m (pre: 8.10±1.27; post: 7.55±1.10); SUCWA (pre: 40.98±2.77; post: 38.44±2.57); DUT (pre: 13.25±0.88; post: 11.85±0.82); SUSP (pre: 10.74±0.52; post: 8.98±0.56) and SULP (pre: 3.86±0.37; post: 2.82±0.37). It was determined that 20 functional training sessions were enough to improve the functional autonomy of elderly women. However, we believe that higher volume and intensity of training could be interesting alternatives for even stronger results in future interventions.

  16. A Framework for Empirical Discovery.

    DTIC Science & Technology

    1986-09-24

    history of science reveal distinct classes of defined terms. Some systems have focused on one subset of these classes, while other programs have...the operators in detail, presenting examples of each from the history of science. 2.1 Defining Numeric Terms The most obvious operator for defining...laws; they can also simplify the process of discovering such laws. Let us consider some examples from the history of science in which the definition of

  17. Can Propensity Score Analysis Approximate Randomized Experiments Using Pretest and Demographic Information in Pre-K Intervention Research?

    PubMed

    Dong, Nianbo; Lipsey, Mark W

    2017-01-01

    It is unclear whether propensity score analysis (PSA) based on pretest and demographic covariates will meet the ignorability assumption for replicating the results of randomized experiments. This study applies within-study comparisons to assess whether pre-Kindergarten (pre-K) treatment effects on achievement outcomes estimated using PSA based on a pretest and demographic covariates can approximate those found in a randomized experiment. Data: Four studies with samples of pre-K children each provided data on two math achievement outcome measures with baseline pretests and child demographic variables that included race, gender, age, language spoken at home, and mother's highest education. Research Design and Data Analysis: A randomized study of a pre-K math curriculum provided benchmark estimates of effects on achievement measures. Comparison samples from other pre-K studies were then substituted for the original randomized control and the effects were reestimated using PSA. The correspondence was evaluated using multiple criteria. The effect estimates using PSA were in the same direction as the benchmark estimates, had similar but not identical statistical significance, and did not differ from the benchmarks at statistically significant levels. However, the magnitude of the effect sizes differed and displayed both absolute and relative bias larger than required to show statistical equivalence with formal tests, but those results were not definitive because of the limited statistical power. We conclude that treatment effect estimates based on a single pretest and demographic covariates in PSA correspond to those from a randomized experiment on the most general criteria for equivalence.

  18. Treatment outcomes in patients with third-generation cephalosporin-resistant Enterobacter bacteremia.

    PubMed

    O'Neal, Catherine S; O'Neal, Hollis R; Daniels, Titus L; Talbot, Thomas R

    2012-10-01

    Infections with resistant Enterobacter spp. are increasingly described, yet data on outcomes associated with these infections are limited. A retrospective cohort study was conducted to investigate outcomes of hospitalized patients with third-generation cephalosporin-resistant (CR) Enterobacter bacteremia. Cephalosporin resistance was detected using cefotaxime and cefpodoxime. Patients with Enterobacter spp. bacteremia from January 2006 through February 2008 defined the population. We defined cases as those with CR isolates; controls were patients with bacteremia due to non-CR isolates. Treatment failure was defined as persistence of the presenting signs of infection 72 h after initial culture collection. Of the 95 Enterobacter cases identified, 31 (33%) were CR. CR cases were significantly associated with treatment failure (odds ratio (OR) 2.81, 95% confidence interval (CI) 1.14-6.94). This association was not seen after adjustment for age, simplified acute physiology score (SAPS II), and inappropriate empiric antibiotic therapy. Inappropriate empiric therapy (adjusted OR 3.86, 95% CI 1.32-11.31) and SAPS II score (adjusted OR 1.09, 95% CI 1.02-1.16) were significantly associated with treatment failure in the multivariate analysis. Third-generation cephalosporin-resistant Enterobacter bacteremia is associated with treatment failure due to receipt of inappropriate empiric antibiotic therapy and severity of illness.

  19. An empirical, hierarchical typology of tree species assemblages for assessing forest dynamics under global change scenarios

    Treesearch

    Jennifer K. Costanza; John W. Coulston; David N. Wear

    2017-01-01

    The composition of tree species occurring in a forest is important and can be affected by global change drivers such as climate change. To inform assessment and projection of global change impacts at broad extents, we used hierarchical cluster analysis and over 120,000 recent forest inventory plots to empirically define forest tree assemblages across the U.S., and...

  20. Values Education in Ottoman Empire in the Second Constitutional Period: A Sample Lesson

    ERIC Educational Resources Information Center

    Oruc, Sahin; Ilhan, Genc Osman

    2015-01-01

    Values education holds a significant place in an education environment and many studies are carried out about this new subject area. The aim of this study is to define how the subject of "values education" is handled in a sample lesson designed in the period of Constitution II in the Ottoman Empire. In this study, the lesson plan in the…

  1. A REVIEW OF STATISTICAL METHODS FOR THE METEOROLOGICAL ADJUSTMENT OF TROPOSPHERIC OZONE. (R825173)

    EPA Science Inventory

    Abstract

    A variety of statistical methods for meteorological adjustment of ozone have been proposed in the literature over the last decade for purposes of forecasting, estimating ozone time trends, or investigating underlying mechanisms from an empirical perspective. T...

  2. The Surprisingly Modest Relationship between SES and Educational Achievement

    ERIC Educational Resources Information Center

    Harwell, Michael; Maeda, Yukiko; Bishop, Kyoungwon; Xie, Aolin

    2017-01-01

    Measures of socioeconomic status (SES) are routinely used in analyses of achievement data to increase statistical power, statistically control for the effects of SES, and enhance causality arguments under the premise that the SES-achievement relationship is moderate to strong. Empirical evidence characterizing the strength of the SES-achievement…

  3. Assessing the Impact of Group Projects on Examination Performance in Social Statistics

    ERIC Educational Resources Information Center

    Delucchi, Michael

    2007-01-01

    College teachers in the sciences and professional studies have endorsed collaborative learning group strategies for teaching undergraduate statistics courses, but few researchers provide empirical evidence that students' quantitative skills actually increase as a result of the collaborative experience. Assessment of the efficacy of collaborative…

  4. A Geospatial Statistical Analysis of the Density of Lottery Outlets within Ethnically Concentrated Neighborhoods

    ERIC Educational Resources Information Center

    Wiggins, Lyna; Nower, Lia; Mayers, Raymond Sanchez; Peterson, N. Andrew

    2010-01-01

    This study examines the density of lottery outlets within ethnically concentrated neighborhoods in Middlesex County, New Jersey, using geospatial statistical analyses. No prior studies have empirically examined the relationship between lottery outlet density and population demographics. Results indicate that lottery outlets were not randomly…

  5. Statistical Treatment of Looking-Time Data

    ERIC Educational Resources Information Center

    Csibra, Gergely; Hernik, Mikolaj; Mascaro, Olivier; Tatone, Denis; Lengyel, Máté

    2016-01-01

    Looking times (LTs) are frequently measured in empirical research on infant cognition. We analyzed the statistical distribution of LTs across participants to develop recommendations for their treatment in infancy research. Our analyses focused on a common within-subject experimental design, in which longer looking to novel or unexpected stimuli is…

  6. Statistical Measures of Integrity in Online Testing: Empirical Study

    ERIC Educational Resources Information Center

    Wielicki, Tom

    2016-01-01

    This paper reports on longitudinal study regarding integrity of testing in an online format as used by e-learning platforms. Specifically, this study explains whether online testing, which implies an open book format is compromising integrity of assessment by encouraging cheating among students. Statistical experiment designed for this study…

  7. Representing Micro-Macro Linkages by Actor-Based Dynamic Network Models

    PubMed Central

    Snijders, Tom A.B.; Steglich, Christian E.G.

    2014-01-01

    Stochastic actor-based models for network dynamics have the primary aim of statistical inference about processes of network change, but may be regarded as a kind of agent-based model. Similar to many other agent-based models, they are based on local rules for actor behavior. Different from many other agent-based models, by including elements of generalized linear statistical models they aim to be realistic, detailed representations of network dynamics in empirical data sets. Statistical parallels to micro-macro considerations can be found in the estimation of parameters determining local actor behavior from empirical data, and in the assessment of goodness of fit from the correspondence with network-level descriptives. This article studies several network-level consequences of dynamic actor-based models applied to represent cross-sectional network data. Two examples illustrate how network-level characteristics can be obtained as emergent features implied by micro-specifications of actor-based models. PMID:25960578

  8. Expert training with standardized operative technique helps establish a successful penile prosthetics program for urologic resident education.

    PubMed

    King, Ashley B; Klausner, Adam P; Johnson, Corey M; Moore, Blake W; Wilson, Steven K; Grob, B Mayer

    2011-10-01

    The challenge of resident education in urologic surgery programs is to overcome disparity imparted by diverse patient populations, limited training times, and inequalities in the availability of expert surgical educators. Specifically, in the area of prosthetic urology, only a small proportion of programs have full-time faculty available to train residents in this discipline. To examine whether a new model using yearly training sessions from a recognized expert can establish a successful penile prosthetics program and result in better outcomes, higher case volumes, and willingness to perform more complex surgeries. A recognized expert conducted one to two operative training sessions yearly to teach standardized technique for penile prosthetics to residents. Each session consisted of three to four operative cases performed under the direct supervision of the expert. Retrospective data were collected from all penile prosthetic operations before (February, 2000 to June, 2004: N = 44) and after (July, 2004 to October, 2007: N = 79) implementation of these sessions. Outcomes reviewed included patient age, race, medical comorbidities, operative time, estimated blood loss, type of prosthesis, operative approach, drain usage, length of stay, and complications including revision/explantation rates. Statistical analysis was performed using Student's t-tests, Fisher's tests, and survival curves using the Kaplan-Meier technique (P value ≤ 0.05 to define statistical significance). Patient characteristics were not significantly different pre- vs. post-training. Operative time and estimated blood loss significantly decreased. Inflatable implants increased from 19/44 (43.2%, pre-training) to 69/79 (87.3%, post-training) (P < 0.01). Operations per year increased from 9.96 (pre-training) to 24 (post-training) (P < 0.01). Revision/explantation occurred in 11/44 patients (25%, pre-training) vs. 7/79 (8.9%, post-training) (P < 0.05). These data demonstrate that yearly sessions with a recognized expert can improve surgical outcomes, type, and volume of implants and can reduce explantation/revision rates. This represents an excellent model for improved training of urologic residents in penile prosthetics surgery. © 2011 International Society for Sexual Medicine.

  9. Assumption Trade-Offs When Choosing Identification Strategies for Pre-Post Treatment Effect Estimation: An Illustration of a Community-Based Intervention in Madagascar.

    PubMed

    Weber, Ann M; van der Laan, Mark J; Petersen, Maya L

    2015-03-01

    Failure (or success) in finding a statistically significant effect of a large-scale intervention may be due to choices made in the evaluation. To highlight the potential limitations and pitfalls of some common identification strategies used for estimating causal effects of community-level interventions, we apply a roadmap for causal inference to a pre-post evaluation of a national nutrition program in Madagascar. Selection into the program was non-random and strongly associated with the pre-treatment (lagged) outcome. Using structural causal models (SCM), directed acyclic graphs (DAGs) and simulated data, we illustrate that an estimand with the outcome defined as the post-treatment outcome controls for confounding by the lagged outcome but not by possible unmeasured confounders. Two separate differencing estimands (of the pre- and post-treatment outcome) have the potential to adjust for a certain type of unmeasured confounding, but introduce bias if the additional identification assumptions they rely on are not met. In order to illustrate the practical impact of choice between three common identification strategies and their corresponding estimands, we used observational data from the community nutrition program in Madagascar to estimate each of these three estimands. Specifically, we estimated the average treatment effect of the program on the community mean nutritional status of children 5 years and under and found that the estimate based on the post-treatment estimand was about a quarter of the magnitude of either of the differencing estimands (0.066 SD vs. 0.26-0.27 SD increase in mean weight-for-age z-score). Choice of estimand clearly has important implications for the interpretation of the success of the program to improve nutritional status of young children. A careful appraisal of the assumptions underlying the causal model is imperative before committing to a statistical model and progressing to estimation. However, knowledge about the data-generating process must be sufficient in order to choose the identification strategy that gets us closest to the truth.
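
    The following toy simulation, with an entirely assumed data-generating process, illustrates why the choice between a post-treatment-outcome estimand (adjusting for the lagged outcome) and a simple differencing estimand matters when selection depends on the lagged outcome: the two estimates can diverge markedly, and neither is guaranteed unbiased unless its identification assumptions hold.

```python
import numpy as np

rng = np.random.default_rng(1)
n, true_effect = 5000, 0.10

u = rng.normal(size=n)                                   # unmeasured community-level confounder
y_pre = 0.8 * u + rng.normal(scale=0.5, size=n)          # pre-treatment (lagged) outcome
p_treat = 1 / (1 + np.exp(-(1.0 * y_pre + 0.5 * u)))     # non-random selection into the program
a = rng.binomial(1, p_treat)
y_post = 0.5 * y_pre + 0.8 * u + true_effect * a + rng.normal(scale=0.5, size=n)

# Estimand 1: post-treatment outcome, adjusting for the lagged outcome (linear regression).
X = np.column_stack([np.ones(n), a, y_pre])
post_adjusted = np.linalg.lstsq(X, y_post, rcond=None)[0][1]

# Estimand 2: simple differencing (change-score) contrast between treated and untreated.
diff = y_post - y_pre
differencing = diff[a == 1].mean() - diff[a == 0].mean()

print(f"true={true_effect}, post-adjusted={post_adjusted:.3f}, differencing={differencing:.3f}")
```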

  10. Economic impact of switching from metoprolol to nebivolol for hypertension treatment: a retrospective database analysis.

    PubMed

    Chen, Stephanie; Tourkodimitris, Stavros; Lukic, Tatjana

    2014-10-01

    To estimate the real-world economic impact of switching hypertensive patients from metoprolol, a commonly prescribed, generic, non-vasodilatory β1-blocker, to nebivolol, a branded vasodilatory β1-blocker. A retrospective analysis with a pre-post study design was conducted using the MarketScan database (2007-2011). Hypertensive patients continuously treated with metoprolol for ≥6 months (pre-period) and then switched to nebivolol for ≥6 months (post-period) were identified. The index date for switching was defined as the first nebivolol dispensing date. Data were collected for the two 6-month periods pre- and post-switching. Monthly healthcare resource utilization and healthcare costs pre- and post-switching were calculated and compared using the Wilcoxon test and the paired t-test. Medical costs from different years were inflated to 2011 dollars. In total, 2259 patients (mean age: 60 years; male: 52%; cardiovascular [CV] disease: 37%) met the selection criteria. Switching to nebivolol was associated with statistically significant reductions in the number of all-cause hospitalizations (-33%; p < 0.01), CV-related hospitalizations (-60%; p < 0.01), and outpatient visits (-7%; p < 0.01). Monthly inpatient costs were reduced by $111 (p < 0.01), while monthly drug costs increased by $52 (p < 0.01). No statistically significant differences were found in overall costs or in the costs of outpatient or ER visits. Sensitivity analyses, conducted using various lengths of medication exposure, controlling for spill-over effects, or excluding patients with compelling indications for metoprolol, all found some level of reduction in resource utilization and no significant difference in overall healthcare costs. This real-world study suggests that switching from metoprolol to nebivolol is associated with an increase in medication costs and significant reductions in hospitalizations and outpatient visits, resulting in an overall neutral effect on healthcare costs. These results should be interpreted with caution due to the lack of a comparator group, limited control of confounding caused by the design, and limitations inherent in insurance claims data.

  11. Source-Modeling Auditory Processes of EEG Data Using EEGLAB and Brainstorm.

    PubMed

    Stropahl, Maren; Bauer, Anna-Katharina R; Debener, Stefan; Bleichner, Martin G

    2018-01-01

    Electroencephalography (EEG) source localization approaches are often used to disentangle the spatial patterns mixed up in scalp EEG recordings. However, approaches differ substantially between experiments, may be strongly parameter-dependent, and results are not necessarily meaningful. In this paper we provide a pipeline for EEG source estimation, from raw EEG data pre-processing using EEGLAB functions up to source-level analysis as implemented in Brainstorm. The pipeline is tested using a data set of 10 individuals performing an auditory attention task. The analysis approach estimates sources of 64-channel EEG data without the prerequisite of individual anatomies or individually digitized sensor positions. First, we show advanced EEG pre-processing using EEGLAB, which includes artifact attenuation using independent component analysis (ICA). ICA is a linear decomposition technique that aims to reveal the underlying statistical sources of mixed signals and is furthermore a powerful tool to attenuate stereotypical artifacts (e.g., eye movements or heartbeat). Data submitted to ICA are pre-processed to facilitate good-quality decompositions. Aiming toward an objective approach to component identification, the semi-automatic CORRMAP algorithm is applied for the identification of components representing prominent and stereotypic artifacts. Second, we present a step-wise approach to estimate active sources of auditory cortex event-related processing at the single-subject level. The presented approach assumes that no individual anatomy is available and therefore the default anatomy ICBM152, as implemented in Brainstorm, is used for all individuals. Individual noise modeling in this dataset is based on the pre-stimulus baseline period. For EEG source modeling we use the OpenMEEG algorithm as the underlying forward model based on the symmetric Boundary Element Method (BEM). We then apply the method of dynamical statistical parametric mapping (dSPM) to obtain physiologically plausible EEG source estimates. Finally, we show how to perform group-level analysis in the time domain on anatomically defined regions of interest (auditory scout). The proposed pipeline needs to be tailored to the specific datasets and paradigms. However, the straightforward combination of EEGLAB and Brainstorm analysis tools may be of interest to others performing EEG source localization.
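
    The pipeline described here is built on EEGLAB and Brainstorm (MATLAB); purely as a loose analogue, the sketch below outlines the same sequence of steps (ICA-based artifact attenuation, baseline noise covariance, forward model, dSPM inverse) in MNE-Python. File names, event codes, and the excluded component indices are placeholders, and the template-anatomy forward model is assumed to be precomputed.

```python
import mne
from mne.preprocessing import ICA
from mne.minimum_norm import make_inverse_operator, apply_inverse

# Placeholder file name; any 64-channel EEG recording in FIF format would do.
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)
raw.filter(l_freq=1.0, h_freq=40.0)

# ICA-based attenuation of stereotypical artifacts (eye movements, heartbeat).
ica = ICA(n_components=20, random_state=97)
ica.fit(raw)
ica.exclude = [0, 1]          # indices of artifact components, identified beforehand (placeholder)
ica.apply(raw)

# Epoch around auditory events and average (event id 1 is a placeholder).
events = mne.find_events(raw)
epochs = mne.Epochs(raw, events, event_id=1, tmin=-0.2, tmax=0.5, baseline=(None, 0))
evoked = epochs.average()

# Noise covariance from the pre-stimulus baseline, as in the described pipeline.
noise_cov = mne.compute_covariance(epochs, tmax=0.0)

# Forward model from a template anatomy (BEM), assumed to be precomputed and saved.
fwd = mne.read_forward_solution("subject01-fwd.fif")
inv = make_inverse_operator(evoked.info, fwd, noise_cov)
stc = apply_inverse(evoked, inv, method="dSPM")   # dSPM source estimates
```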

  12. An integrated educational model for continuing nurse education.

    PubMed

    Duff, Beverley; Gardner, Glenn; Osborne, Sonya

    2014-01-01

    This paper reports on the development and evaluation of an integrated clinical learning model to inform ongoing education for surgical nurses. The research aim was to evaluate the effectiveness of implementing a Respiratory Skills Update (ReSKU) education program, in the context of organisational utility, in improving surgical nurses' practice in the area of respiratory assessment. Continuous development and integration of technological innovations and research in the healthcare environment mandate the need for continuing education for nurses. Despite an increased worldwide emphasis on this, there is scant empirical evidence of program effectiveness. A quasi-experimental pre-test, post-test, non-equivalent control group design evaluated the impact of the ReSKU program on surgical nurses' clinical practice. The 2008 study was conducted in a 400-bed regional referral public hospital and was consistent with contemporary educational approaches using multi-modal, interactive teaching strategies. The study demonstrated statistically significant differences between groups regarding reported use of respiratory skills, three months after ReSKU program attendance. Between-group data analysis indicated that the intervention group's reported beliefs and attitudes pertaining to subscale descriptors showed statistically significant differences in three of the six subscales. The construct of critical thinking in the clinical context, combined with clinical reasoning and purposeful reflection, was a powerful educational strategy to enhance competency and capability in clinicians. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.

  13. pvsR: An Open Source Interface to Big Data on the American Political Sphere.

    PubMed

    Matter, Ulrich; Stutzer, Alois

    2015-01-01

    Digital data from the political sphere is abundant, omnipresent, and more and more directly accessible through the Internet. Project Vote Smart (PVS) is a prominent example of this big public data and covers various aspects of U.S. politics in astonishing detail. Despite the vast potential of PVS' data for political science, economics, and sociology, it is hardly used in empirical research. The systematic compilation of semi-structured data can be complicated and time consuming as the data format is not designed for conventional scientific research. This paper presents a new tool that makes the data easily accessible to a broad scientific community. We provide the software called pvsR as an add-on to the R programming environment for statistical computing. This open source interface (OSI) serves as a direct link between a statistical analysis and the large PVS database. The free and open code is expected to substantially reduce the cost of research with PVS' new big public data in a vast variety of possible applications. We discuss its advantages vis-à-vis traditional methods of data generation as well as already existing interfaces. The validity of the library is documented based on an illustration involving female representation in local politics. In addition, pvsR facilitates the replication of research with PVS data at low costs, including the pre-processing of data. Similar OSIs are recommended for other big public databases.

  14. False discovery rates in spectral identification.

    PubMed

    Jeong, Kyowon; Kim, Sangtae; Bandeira, Nuno

    2012-01-01

    Automated database search engines are one of the fundamental tools of high-throughput proteomics, enabling daily identifications of hundreds of thousands of peptides and proteins from tandem mass (MS/MS) spectrometry data. Nevertheless, this automation also makes it humanly impossible to manually validate the vast lists of resulting identifications from such high-throughput searches. This challenge is usually addressed by using a Target-Decoy Approach (TDA) to impose an empirical False Discovery Rate (FDR) at a pre-determined threshold x%, with the expectation that at most x% of the returned identifications are false positives. But despite the fundamental importance of FDR estimates in ensuring the utility of large lists of identifications, there is surprisingly little consensus on exactly how TDA should be applied to minimize the chances of biased FDR estimates. In fact, since less rigorous TDA/FDR estimates tend to result in more identifications (at higher 'true' FDR), there is often little incentive to enforce strict TDA/FDR procedures in studies where the major metric of success is the size of the list of identifications and there are no follow-up studies imposing hard cost constraints on the number of reported false positives. Here we address the problem of the accuracy of TDA estimates of empirical FDR. Using MS/MS spectra from samples where we were able to define a factual estimator of the 'true' FDR, we evaluate several popular variants of the TDA procedure in a variety of database search contexts. We show that the fraction of false identifications can sometimes be over 10× higher than reported and may be unavoidably high for certain types of searches. We further report that the two-pass search strategy seems the most promising database search strategy. While unavoidably constrained by the particulars of any specific evaluation dataset, our observations support a series of recommendations towards maximizing the number of resulting identifications while controlling database searches with robust and reproducible TDA estimation of empirical FDR.
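
    As a minimal sketch of the basic target-decoy computation discussed here (the paper evaluates several TDA variants, which this does not reproduce), the function below estimates FDR as the decoy-to-target ratio above each score threshold and converts it to q-values.

```python
import numpy as np

def tda_qvalues(scores, is_decoy):
    """Estimate q-values with a basic target-decoy approach: at each score threshold,
    FDR is approximated by (#decoy hits) / (#target hits) at or above that threshold."""
    order = np.argsort(scores)[::-1]                 # best (highest) scores first
    decoy = np.asarray(is_decoy, dtype=bool)[order]
    n_decoy = np.cumsum(decoy)
    n_target = np.cumsum(~decoy)
    fdr = n_decoy / np.maximum(n_target, 1)
    qvals = np.minimum.accumulate(fdr[::-1])[::-1]   # q-value: min estimated FDR at this rank or beyond
    out = np.empty_like(qvals)
    out[order] = qvals
    return out

# Usage: accept target identifications with q <= 0.01 for a nominal 1% FDR.
# scores, is_decoy = ...   (per-spectrum search scores and decoy flags from the search engine)
```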

  15. Construct Definition Using Cognitively Based Evidence: A Framework for Practice

    ERIC Educational Resources Information Center

    Ketterlin-Geller, Leanne R.; Yovanoff, Paul; Jung, EunJu; Liu, Kimy; Geller, Josh

    2013-01-01

    In this article, we highlight the need for a precisely defined construct in score-based validation and discuss the contribution of cognitive theories to accurately and comprehensively defining the construct. We propose a framework for integrating cognitively based theoretical and empirical evidence to specify and evaluate the construct. We apply…

  16. An Analysis of the Women's Movement as a Social Movement.

    ERIC Educational Resources Information Center

    Budenstein, Mary Jane

    The paper analyzes the development of the women's movement, indicating how this particular movement empirically documents the theoretical suppositions of a sociologically defined social movement. A social movement is defined as "a group venture extended beyond a local community or a single event and involving a systematic effort to inaugurate…

  17. Legislative Provisions Underlying Trade Unions' Right to Define Their Organizational Structure

    ERIC Educational Resources Information Center

    Korobchenko, Victoria V.; Penov, Yury V.; Safonov, Valery A.

    2016-01-01

    The article contains a comparative analysis of constitutional and other legislative provisions that ensure a trade union's right to define its own administrative structure in European states. The aim of the study is to reveal the management's problems of European trade unions, declarative and empirical mass-character legislative provisions, which…

  18. Assessing the Value of Rural California High School Career Technical Education

    ERIC Educational Resources Information Center

    Morehead, Coleen Louise

    2015-01-01

    While empirical studies on rural education have defined many of the socioeconomic factors associated with rural students nationally, there is a lack of definitive and comprehensive research defining the benefit or value of career technical education for rural California high school students. Consequently, this lack of research may in turn…

  19. Eye fixations indicate men's preference for female breasts or buttocks.

    PubMed

    Dagnino, Bruno; Navajas, Joaquin; Sigman, Mariano

    2012-08-01

    Evolutionary psychologists have been interested in male preferences for particular female traits that are thought to signal health and reproductive potential. While the majority of studies have focused on what makes specific body traits attractive (such as the waist-to-hip ratio, the body mass index, and breast shape and size), there is little empirical research that has examined individual differences in male preferences for specific traits (e.g., favoring breasts over buttocks). The current study begins to fill this empirical gap. In the first experiment (Study 1), 184 male participants were asked to report their preference between breasts and buttocks on a continuous scale. We found that (1) the distribution of preference was bimodal, indicating that Argentinean males tended to define themselves as favoring breasts or buttocks but rarely considered that these traits contributed equally to their choice, and (2) the distribution was biased towards buttocks. In a second experiment (Study 2), 19 male participants were asked to rate pictures of female breasts and buttocks. This study was necessary to generate three categories of pictures with statistically different ratings (high, medium, and low). In a third experiment (Study 3), we recorded eye movements of 25 male participants while they chose the more attractive of two women, seeing only their breasts and buttocks. We found that the first and last fixations were systematically directed towards the self-reported preferred trait.

  20. Defining dignity in terminally ill cancer patients: a factor-analytic approach.

    PubMed

    Hack, Thomas F; Chochinov, Harvey Max; Hassard, Thomas; Kristjanson, Linda J; McClement, Susan; Harlos, Mike

    2004-10-01

    The construct of 'dignity' is frequently raised in discussions about quality end of life care for terminal cancer patients, and is invoked by parties on both sides of the euthanasia debate. Lacking in this general debate has been an empirical explication of 'dignity' from the viewpoint of cancer patients themselves. The purpose of the present study was to use factor-analytic and regression methods to analyze dignity data gathered from 213 cancer patients having less than 6 months to live. Patients rated their sense of dignity, and completed measures of symptom distress and psychological well-being. The results showed that although the majority of patients had an intact sense of dignity, there were 99 (46%) patients who reported at least some, or occasional loss of dignity, and 16 (7.5%) patients who indicated that loss of dignity was a significant problem. The exploratory factor analysis yielded six primary factors: (1) Pain; (2) Intimate Dependency; (3) Hopelessness/Depression; (4) Informal Support Network; (5) Formal Support Network; and (6) Quality of Life. Subsequent regression analyses of modifiable factors produced a final two-factor (Hopelessness/Depression and Intimate Dependency) model of statistical significance. These results provide empirical support for the dignity model, and suggest that the provision of end of life care should include methods for treating depression, fostering hope, and facilitating functional independence. Copyright 2004 John Wiley & Sons, Ltd.

  1. Distribution of the two-sample t-test statistic following blinded sample size re-estimation.

    PubMed

    Lu, Kaifeng

    2016-05-01

    We consider the blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for the evaluation of the probability of rejecting the null hypothesis at a given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margins for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for a given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
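
    A schematic Monte Carlo sketch of the blinded re-estimation idea is given below: the pooled one-sample variance is computed at an internal pilot while ignoring group labels, the per-arm sample size is recomputed, and the standard two-sample t-test is applied at the end. Design values (pilot size, planned effect, sigma) are assumed for illustration; this is not the paper's exact algorithm or its exact-distribution result.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def simulate_trial(delta, sigma_true, n_pilot=40, alpha=0.05, power=0.90, delta_plan=1.0):
    """One trial with blinded sample size re-estimation at an internal pilot study."""
    z = stats.norm.ppf(1 - alpha / 2) + stats.norm.ppf(power)
    # Internal pilot: pooled (blinded) one-sample variance, ignoring treatment labels.
    pilot = np.concatenate([rng.normal(0.0, sigma_true, n_pilot // 2),
                            rng.normal(delta, sigma_true, n_pilot // 2)])
    s2_blinded = pilot.var(ddof=1)
    n_per_arm = max(n_pilot // 2, int(np.ceil(2 * s2_blinded * (z / delta_plan) ** 2)))
    # Final analysis: standard two-sample t-test on the full per-arm samples.
    x = np.concatenate([pilot[:n_pilot // 2], rng.normal(0.0, sigma_true, n_per_arm - n_pilot // 2)])
    y = np.concatenate([pilot[n_pilot // 2:], rng.normal(delta, sigma_true, n_per_arm - n_pilot // 2)])
    return stats.ttest_ind(x, y).pvalue < alpha

# Empirical type I error under the null (delta = 0) with an assumed sigma of 1.2:
print(np.mean([simulate_trial(0.0, 1.2) for _ in range(2000)]))
```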

  2. The Small World of Psychopathology

    PubMed Central

    Borsboom, Denny; Cramer, Angélique O. J.; Schmittmann, Verena D.; Epskamp, Sacha; Waldorp, Lourens J.

    2011-01-01

    Background Mental disorders are highly comorbid: people having one disorder are likely to have another as well. We explain empirical comorbidity patterns based on a network model of psychiatric symptoms, derived from an analysis of symptom overlap in the Diagnostic and Statistical Manual of Mental Disorders-IV (DSM-IV). Principal Findings We show that a) half of the symptoms in the DSM-IV network are connected, b) the architecture of these connections conforms to a small world structure, featuring a high degree of clustering but a short average path length, and c) distances between disorders in this structure predict empirical comorbidity rates. Network simulations of Major Depressive Episode and Generalized Anxiety Disorder show that the model faithfully reproduces empirical population statistics for these disorders. Conclusions In the network model, mental disorders are inherently complex. This explains the limited successes of genetic, neuroscientific, and etiological approaches to unravel their causes. We outline a psychosystems approach to investigate the structure and dynamics of mental disorders. PMID:22114671

  3. Improved population estimates through the use of auxiliary information

    USGS Publications Warehouse

    Johnson, D.H.; Ralph, C.J.; Scott, J.M.

    1981-01-01

    When estimating the size of a population of birds, the investigator may have, in addition to an estimator based on a statistical sample, information on one of several auxiliary variables, such as: (1) estimates of the population made on previous occasions, (2) measures of habitat variables associated with the size of the population, and (3) estimates of the population sizes of other species that correlate with the species of interest. Although many studies have described the relationships between each of these kinds of data and the population size to be estimated, very little work has been done to improve the estimator by incorporating such auxiliary information. A statistical methodology termed 'empirical Bayes' seems to be appropriate to these situations. The potential that empirical Bayes methodology has for improved estimation of the population size of the Mallard (Anas platyrhynchos) is explored. In the example considered, three empirical Bayes estimators were found to reduce the error by one-fourth to one-half of that of the usual estimator.
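
    As a hedged toy sketch of the empirical Bayes idea described here, the snippet below shrinks direct survey estimates toward predictions from auxiliary information, with the between-area variance estimated from the data; all numbers are illustrative and not from the Mallard analysis.

```python
import numpy as np

# Direct population estimates for several areas, their sampling variances, and an
# auxiliary prediction (e.g., from habitat variables or prior-year counts); all illustrative.
y = np.array([120.0, 95.0, 140.0, 60.0, 180.0])     # direct estimates
v = np.array([400.0, 350.0, 500.0, 300.0, 600.0])    # sampling variances
x = np.array([110.0, 100.0, 150.0, 70.0, 160.0])     # auxiliary predictions

# Empirical Bayes: estimate the between-area variance of true sizes around the auxiliary
# prediction, then shrink each direct estimate toward that prediction.
tau2 = max(np.mean((y - x) ** 2 - v), 0.0)            # method-of-moments estimate
w = tau2 / (tau2 + v)                                 # shrinkage weights in [0, 1]
eb = w * y + (1 - w) * x
print("empirical Bayes estimates:", np.round(eb, 1))
```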

  4. Do we need methodological theory to do qualitative research?

    PubMed

    Avis, Mark

    2003-09-01

    Positivism is frequently used to stand for the epistemological assumption that empirical science based on principles of verificationism, objectivity, and reproducibility is the foundation of all genuine knowledge. Qualitative researchers sometimes feel obliged to provide methodological alternatives to positivism that recognize their different ethical, ontological, and epistemological commitments and have provided three theories: phenomenology, grounded theory, and ethnography. The author argues that positivism was a doomed attempt to define empirical foundations for knowledge through a rigorous separation of theory and evidence; offers a pragmatic, coherent view of knowledge; and suggests that rigorous, rational empirical investigation does not need methodological theory. Therefore, qualitative methodological theory is unnecessary and counterproductive because it hinders critical reflection on the relation between methodological theory and empirical evidence.

  5. The Motivated Strategies for Learning Questionnaire: score validity among medicine residents.

    PubMed

    Cook, David A; Thompson, Warren G; Thomas, Kris G

    2011-12-01

    The Motivated Strategies for Learning Questionnaire (MSLQ) purports to measure motivation using the expectancy-value model. Although it is widely used in other fields, this instrument has received little study in health professions education. The purpose of this study was to evaluate the validity of MSLQ scores. We conducted a validity study evaluating the relationships of MSLQ scores to other variables and their internal structure (reliability and factor analysis). Participants included 210 internal medicine and family medicine residents participating in a web-based course on ambulatory medicine at an academic medical centre. Measurements included pre-course MSLQ scores, pre- and post-module motivation surveys, post-module knowledge test and post-module Instructional Materials Motivation Survey (IMMS) scores. Internal consistency was universally high for all MSLQ items together (Cronbach's α = 0.93) and for each domain (α ≥ 0.67). Total MSLQ scores showed statistically significant positive associations with post-test knowledge scores. For example, a 1-point rise in total MSLQ score was associated with a 4.4% increase in post-test scores (β = 4.4; p < 0.0001). Total MSLQ scores showed moderately strong, statistically significant associations with several other measures of effort, motivation and satisfaction. Scores on MSLQ domains demonstrated associations that generally aligned with our hypotheses. Self-efficacy and control of learning belief scores demonstrated the strongest domain-specific relationships with knowledge scores (β = 2.9 for both). Confirmatory factor analysis showed a borderline model fit. Follow-up exploratory factor analysis revealed the scores of five factors (self-efficacy, intrinsic interest, test anxiety, extrinsic goals, attribution) demonstrated psychometric and predictive properties similar to those of the original scales. Scores on the MSLQ are reliable and predict meaningful outcomes. However, the factor structure suggests a simplified model might better fit the empiric data. Future research might consider how assessing and responding to motivation could enhance learning. © Blackwell Publishing Ltd 2011.
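
    For readers unfamiliar with the internal-consistency statistic reported above, the short function below computes Cronbach's alpha for a respondent-by-item matrix; the example data are random placeholders, so the printed value is not meaningful.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (respondents x items) array of scale responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(3)
responses = rng.integers(1, 8, size=(210, 5))   # placeholder: 210 respondents, five 7-point items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```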

  6. Satellite remote sensing of harmful algal blooms: A new multi-algorithm method for detecting the Florida Red Tide (Karenia brevis).

    PubMed

    Carvalho, Gustavo A; Minnett, Peter J; Fleming, Lora E; Banzon, Viva F; Baringer, Warner

    2010-06-01

    In a continuing effort to develop suitable methods for the surveillance of Harmful Algal Blooms (HABs) of Karenia brevis using satellite radiometers, a new multi-algorithm method was developed to explore whether improvements in the remote sensing detection of the Florida Red Tide were possible. A Hybrid Scheme was introduced that sequentially applies the optimized versions of two pre-existing satellite-based algorithms: an Empirical Approach (using water-leaving radiance as a function of chlorophyll concentration) and a Bio-optical Technique (using particulate backscatter along with chlorophyll concentration). The long-term evaluation of the new multi-algorithm method was performed using a multi-year MODIS dataset (2002 to 2006; during the boreal Summer-Fall periods - July to December) along the Central West Florida Shelf between 25.75°N and 28.25°N. Algorithm validation was done with in situ measurements of the abundances of K. brevis; cell counts ≥1.5×10⁴ cells l⁻¹ defined a detectable HAB. Encouraging statistical results were derived when either or both algorithms correctly flagged known samples. The majority of the valid match-ups were correctly identified (~80% of both HABs and non-blooming conditions) and few false negatives or false positives were produced (~20% of each). Additionally, most of the HAB-positive identifications in the satellite data were indeed HAB samples (positive predictive value: ~70%) and those classified as HAB-negative were almost all non-bloom cases (negative predictive value: ~86%). These results demonstrate an excellent detection capability, on average ~10% more accurate than the individual algorithms used separately. Thus, the new Hybrid Scheme could become a powerful tool for environmental monitoring of K. brevis blooms, with valuable consequences including more rapid and efficient use of ships to make in situ measurements of HABs.
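
    The validation metrics quoted above (correct identification of roughly 80% of HAB and non-bloom match-ups, PPV about 70%, NPV about 86%) follow from a standard two-by-two match-up table; the sketch below computes them from illustrative counts, since the abstract reports only percentages.

```python
def detection_metrics(tp, fp, fn, tn):
    """Standard match-up validation metrics for a binary HAB detection flag."""
    sensitivity = tp / (tp + fn)   # fraction of true HAB samples correctly flagged
    specificity = tn / (tn + fp)   # fraction of non-bloom samples correctly passed
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Illustrative counts only (not the study's match-up table):
print(detection_metrics(tp=40, fp=17, fn=10, tn=103))
```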

  7. Satellite remote sensing of harmful algal blooms: A new multi-algorithm method for detecting the Florida Red Tide (Karenia brevis)

    PubMed Central

    Carvalho, Gustavo A.; Minnett, Peter J.; Fleming, Lora E.; Banzon, Viva F.; Baringer, Warner

    2010-01-01

    In a continuing effort to develop suitable methods for the surveillance of Harmful Algal Blooms (HABs) of Karenia brevis using satellite radiometers, a new multi-algorithm method was developed to explore whether improvements in the remote sensing detection of the Florida Red Tide were possible. A Hybrid Scheme was introduced that sequentially applies the optimized versions of two pre-existing satellite-based algorithms: an Empirical Approach (using water-leaving radiance as a function of chlorophyll concentration) and a Bio-optical Technique (using particulate backscatter along with chlorophyll concentration). The long-term evaluation of the new multi-algorithm method was performed using a multi-year MODIS dataset (2002 to 2006; during the boreal Summer-Fall periods – July to December) along the Central West Florida Shelf between 25.75°N and 28.25°N. Algorithm validation was done with in situ measurements of the abundances of K. brevis; cell counts ≥1.5×10⁴ cells l⁻¹ defined a detectable HAB. Encouraging statistical results were derived when either or both algorithms correctly flagged known samples. The majority of the valid match-ups were correctly identified (~80% of both HABs and non-blooming conditions) and few false negatives or false positives were produced (~20% of each). Additionally, most of the HAB-positive identifications in the satellite data were indeed HAB samples (positive predictive value: ~70%) and those classified as HAB-negative were almost all non-bloom cases (negative predictive value: ~86%). These results demonstrate an excellent detection capability, on average ~10% more accurate than the individual algorithms used separately. Thus, the new Hybrid Scheme could become a powerful tool for environmental monitoring of K. brevis blooms, with valuable consequences including more rapid and efficient use of ships to make in situ measurements of HABs. PMID:21037979

  8. Evaluation of Cross-Protocol Stability of a Fully Automated Brain Multi-Atlas Parcellation Tool.

    PubMed

    Liang, Zifei; He, Xiaohai; Ceritoglu, Can; Tang, Xiaoying; Li, Yue; Kutten, Kwame S; Oishi, Kenichi; Miller, Michael I; Mori, Susumu; Faria, Andreia V

    2015-01-01

    Brain parcellation tools based on multiple-atlas algorithms have recently emerged as a promising method with which to accurately define brain structures. When dealing with data from various sources, it is crucial that these tools are robust for many different imaging protocols. In this study, we tested the robustness of a multiple-atlas, likelihood fusion algorithm using Alzheimer's Disease Neuroimaging Initiative (ADNI) data with six different protocols, comprising three manufacturers and two magnetic field strengths. The entire brain was parceled into five different levels of granularity. In each level, which defines a set of brain structures, ranging from eight to 286 regions, we evaluated the variability of brain volumes related to the protocol, age, and diagnosis (healthy or Alzheimer's disease). Our results indicated that, with proper pre-processing steps, the impact of different protocols is minor compared to biological effects, such as age and pathology. A precise knowledge of the sources of data variation enables sufficient statistical power and ensures the reliability of an anatomical analysis when using this automated brain parcellation tool on datasets from various imaging protocols, such as clinical databases.

  9. Empirical Tryout of a New Statistic for Detecting Temporally Inconsistent Responders.

    PubMed

    Kerry, Matthew J

    2018-01-01

    Statistical screening of self-report data is often advised to support the quality of analyzed responses, for example by reducing insufficient effort responding (IER). One recently introduced index, based on Mahalanobis's D for detecting outliers in cross-sectional designs, replaces centered scores with difference scores between repeated-measure items and is termed person temporal consistency (D²ptc). Although the adapted D²ptc index demonstrated usefulness in simulated datasets, it has not been applied to empirical data. The current study addresses D²ptc's low uptake by critically appraising its performance across three empirical applications. Independent samples were selected to represent a range of scenarios commonly encountered by organizational researchers. First, in Sample 1, a repeated measure of future time perspective (FTP) in experienced working adults (age > 40 years; n = 620) indicated that temporal inconsistency was significantly related to respondent age and item reverse-scoring. Second, in repeated measures of team-efficacy aggregations, D²ptc successfully detected team-level inconsistency across repeated performance cycles. Third, the usefulness of D²ptc was examined in an experimental study of subjective life expectancy, which indicated significantly more stable responding in experimental conditions compared to controls. The empirical findings support D²ptc's flexible and useful application to distinct study designs. Discussion centers on current limitations and further extensions that may be of value to psychologists screening self-report data to strengthen response quality and the meaningfulness of inferences from repeated-measures self-reports. Taken together, the findings support the usefulness of the newly devised statistic for detecting IER and other extreme response patterns.
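
    A minimal sketch of the adapted index as described, not the author's code: Mahalanobis D² is computed on item-wise difference scores between two measurement occasions, and respondents with extreme values are flagged against a conventional chi-square cutoff (the alpha level is an assumption).

```python
import numpy as np
from scipy.stats import chi2

def person_temporal_consistency(time1, time2, alpha=0.001):
    """Mahalanobis D^2 on item-wise difference scores (time2 - time1), one value per respondent.
    Large values flag temporally inconsistent (or otherwise aberrant) responders."""
    d = np.asarray(time2, float) - np.asarray(time1, float)   # (n_persons x n_items)
    d_centered = d - d.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(d, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", d_centered, cov_inv, d_centered)
    cutoff = chi2.ppf(1 - alpha, df=d.shape[1])
    return d2, d2 > cutoff
```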

  10. Flood Change Assessment and Attribution in Austrian alpine Basins

    NASA Astrophysics Data System (ADS)

    Claps, Pierluigi; Allamano, Paola; Como, Anastasia; Viglione, Alberto

    2016-04-01

    The present paper aims to investigate the sensitivity of flood peaks to global warming in Austrian alpine basins. A group of 97 Austrian watersheds, with areas ranging from 14 to 6000 km² and average elevations ranging from 1000 to 2900 m a.s.l., has been considered. Annual maximum floods are available for the basins from 1890 to 2007 with two densities of observation. In a first period, until 1950, an average of 42 contemporaneous flood-peak records is available; from 1951 to 2007 the density of observation increases to an average of 85 contemporaneous peaks. This information is very important with reference to the statistical tools used for the empirical assessment of change over time, namely linear quantile regressions. Application of this tool to the data set unveils trends in extreme events, confirmed by statistical testing, for the 0.75 and 0.95 empirical quantiles. All analyses are made with specific discharges (discharge divided by catchment area). As in a previous approach, multiple quantile regressions have also been applied, confirming the presence of trends even when the possible interference of morphoclimatic parameters (i.e., mean elevation and catchment area) with the specific discharge is accounted for. Application of the geomorphoclimatic model by Allamano et al. (2009) allows assessment of the extent to which the empirically observed increases in air temperature and annual rainfall can justify the attribution of the change detected by the empirical statistical tools. A comparison with data from Swiss alpine basins treated in a previous paper is finally undertaken.
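
    As an illustration of the statistical tool named above, the sketch below fits linear quantile regressions of specific annual-maximum discharge on year for the 0.75 and 0.95 quantiles using statsmodels; the synthetic data and the weak upward trend are placeholders, not the Austrian records.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
years = np.arange(1890, 2008)
df = pd.DataFrame({
    "year": years,
    # Placeholder specific discharges (annual maxima divided by catchment area) with a weak trend.
    "q_spec": 0.5 + 0.001 * (years - 1890) + rng.gumbel(0.0, 0.2, size=years.size),
})

for tau in (0.75, 0.95):
    res = smf.quantreg("q_spec ~ year", df).fit(q=tau)
    print(f"tau={tau}: slope={res.params['year']:.4f} per year, p={res.pvalues['year']:.3f}")
```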

  11. Anxiety Sensitivity and Pre-Cessation Smoking Processes: Testing the Independent and Combined Mediating Effects of Negative Affect–Reduction Expectancies and Motives

    PubMed Central

    Farris, Samantha G.; Leventhal, Adam M.; Schmidt, Norman B.; Zvolensky, Michael J.

    2015-01-01

    Objective: Anxiety sensitivity appears to be relevant in understanding the nature of emotional symptoms and disorders associated with smoking. Negative-reinforcement smoking expectancies and motives are implicated as core regulatory processes that may explain, in part, the anxiety sensitivity–smoking interrelations; however, these pathways have received little empirical attention. Method: Participants (N = 471) were adult treatment-seeking daily smokers assessed for a smoking-cessation trial who provided baseline data; 157 participants provided within-treatment (pre-cessation) data. Anxiety sensitivity was examined as a cross-sectional predictor of several baseline smoking processes (nicotine dependence, perceived barriers to cessation, severity of prior withdrawal-related quit problems) and pre-cessation processes including nicotine withdrawal and smoking urges (assessed during 3 weeks before the quit day). Baseline negative-reinforcement smoking expectancies and motives were tested as simultaneous mediators via parallel multiple mediator models. Results: Higher levels of anxiety sensitivity were related to higher levels of nicotine dependence, greater perceived barriers to smoking cessation, more severe withdrawal-related problems during prior quit attempts, and greater average withdrawal before the quit day; effects were indirectly explained by the combination of both mediators. Higher levels of anxiety sensitivity were not directly related to pre-cessation smoking urges but were indirectly related through the independent and combined effects of the mediators. Conclusions: These empirical findings bolster theoretical models of anxiety sensitivity and smoking and identify targets for nicotine dependence etiology research and cessation interventions. PMID:25785807

  12. Regional Morphology Analysis Package (RMAP): Empirical Orthogonal Function Analysis, Background and Examples

    DTIC Science & Technology

    2007-10-01

    [Only citation fragments were extracted for this record: "1984. Complex principal component analysis: Theory and examples. Journal of Climate and Applied Meteorology 23: 1660-1673"; "Hotelling, H. 1933 ..."; "... Sediments 99. ASCE: 2,566-2,581"; "Von Storch, H., and A. Navarra. 1995. Analysis of climate variability: Applications of statistical techniques. Berlin"; report identifier ERDC TN-SWWRP-07-9, October 2007, "Regional Morphology Analysis Package (RMAP): Empirical Orthogonal Function Analysis, Background and Examples."]
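
    Since the record's title concerns empirical orthogonal function (EOF) analysis, a generic sketch of EOF decomposition via singular value decomposition of the anomaly matrix is given below; it does not reproduce the RMAP package itself, and the demo field is a random placeholder.

```python
import numpy as np

def eof_analysis(field):
    """field: (time x space) array, e.g., shoreline position along a profile over repeated surveys.
    Returns EOF spatial patterns, temporal amplitudes, and explained-variance fractions."""
    anomalies = field - field.mean(axis=0)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    return vt, u * s, explained       # spatial patterns, temporal expansion coefficients, variance

rng = np.random.default_rng(5)
demo = rng.normal(size=(40, 200))     # placeholder: 40 surveys x 200 alongshore points
eofs, amplitudes, variance = eof_analysis(demo)
print("variance explained by the first three EOFs:", np.round(variance[:3], 3))
```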

  13. Threats to the Internal Validity of Experimental and Quasi-Experimental Research in Healthcare.

    PubMed

    Flannelly, Kevin J; Flannelly, Laura T; Jankowski, Katherine R B

    2018-01-01

    The article defines, describes, and discusses the seven threats to the internal validity of experiments discussed by Donald T. Campbell in his classic 1957 article: history, maturation, testing, instrument decay, statistical regression, selection, and mortality. These concepts are said to be threats to the internal validity of experiments because they pose alternate explanations for the apparent causal relationship between the independent variable and dependent variable of an experiment if they are not adequately controlled. A series of simple diagrams illustrate three pre-experimental designs and three true experimental designs discussed by Campbell in 1957 and several quasi-experimental designs described in his book written with Julian C. Stanley in 1966. The current article explains why each design controls for or fails to control for these seven threats to internal validity.

  14. Decreasing Wait Times and Increasing Patient Satisfaction: A Lean Six Sigma Approach.

    PubMed

    Godley, Mary; Jenkins, Jeanne B

    2018-06-08

    Patient satisfaction scores in the vascular interventional radiology department were low, especially related to wait times in registration and for tests/treatments, with low scores for intentions to recommend. The purpose of our quality improvement project was to decrease wait times and improve patient satisfaction using Lean Six Sigma's define, measure, analyze, improve, and control (DMAIC) framework with a pre-/post-intervention design. There was a statistically significant decrease in wait times (P < .0019) and an increase in patient satisfaction scores in 3 areas: registration wait times (from the 17th to the 99th percentile), test/treatment (from the 19th to the 60th percentile), and likelihood to recommend (from the 6th to the 97th percentile). Lean Six Sigma was an effective framework for use in decreasing wait times and improving patient satisfaction.

  15. Sex-specific 99th percentiles derived from the AACC Universal Sample Bank for the Roche Gen 5 cTnT assay: Comorbidities and statistical methods influence derivation of reference limits.

    PubMed

    Gunsolus, Ian L; Jaffe, Allan S; Sexter, Anne; Schulz, Karen; Ler, Ranka; Lindgren, Brittany; Saenger, Amy K; Love, Sara A; Apple, Fred S

    2017-12-01

    Our purpose was to determine a) overall and sex-specific 99th percentile upper reference limits (URLs) and b) the influences of statistical methods and comorbidities on the URLs. Heparin plasma specimens from 838 normal subjects (423 men, 415 women) were obtained from the AACC Universal Sample Bank. The cobas e602 measured cTnT (Roche Gen 5 assay); limit of detection (LoD), 3 ng/L. Hemoglobin A1c (URL 6.5%), NT-proBNP (URL 125 ng/L) and eGFR (60 mL/min/1.73 m²) were measured, along with identification of statin use, to better define normality. 99th percentile URLs were determined by the non-parametric (NP), Harrell-Davis Estimator (HDE) and Robust (R) methods. 355 men and 339 women remained after exclusions. Overall, <50% of subjects had measurable concentrations ≥ LoD: 45.6% with no exclusions, 43.5% after exclusions; men: 68.1% with no exclusions, 65.1% post exclusion; women: 22.7% with no exclusions, 20.9% post exclusion. The statistical method used influenced URLs as follows: pre-/post-exclusion overall, NP 16/16 ng/L, HDE 17/17 ng/L, R not available; men NP 18/16 ng/L, HDE 21/19 ng/L, R 16/11 ng/L; women NP 13/10 ng/L, HDE 14/14 ng/L, R not available. We demonstrated that a) the Gen 5 cTnT assay does not meet the IFCC guideline for high-sensitivity assays, b) surrogate biomarkers significantly lower the URLs, and c) the statistical methods used impact URLs. Our data suggest lower sex-specific cTnT 99th percentiles than reported in the FDA-approved package insert. We emphasize the importance of detailing the criteria used to include and exclude subjects for defining a healthy population and the statistical method used to calculate 99th percentiles and identify outliers. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
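
    Two of the percentile methods compared above can be reproduced in a few lines; the sketch below computes a non-parametric 99th percentile and a Harrell-Davis estimate for a placeholder sample, leaving aside the exclusion criteria and outlier handling that the study emphasizes.

```python
import numpy as np
from scipy.stats.mstats import hdquantiles

def urls_99(ctnt_values):
    """99th-percentile upper reference limits by two of the methods mentioned in the abstract."""
    x = np.asarray(ctnt_values, dtype=float)
    np_url = np.percentile(x, 99)                     # non-parametric percentile
    hd_url = float(hdquantiles(x, prob=[0.99])[0])    # Harrell-Davis estimator
    return np_url, hd_url

# Placeholder values roughly on the cTnT scale (ng/L); not the study's data.
rng = np.random.default_rng(6)
demo = rng.gamma(shape=2.0, scale=3.0, size=355)
print(urls_99(demo))
```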

  16. Matched Filter Stochastic Background Characterization for Hyperspectral Target Detection

    DTIC Science & Technology

    2005-09-30

    [Only table-of-contents and figure-list fragments were extracted for this record: Section 4.2.x entries on pre-clustering ("... and Pre-Clustering MVN Test", "Pre-Clustering Detection Results", "Pre-Clustering Target Influence", "Statistical Distance Exclusion and Low Contrast ..."), and figure captions including "Figure 2.7 ROC Curve Comparison of RX, K-Means, and Bayesian Pre-Clustering Applied to Anomaly Detection [Ashton, 1998]" and "Figure 2.8 ROC ...".]

  17. Estimation of the amount of asbestos-cement roofing in Poland.

    PubMed

    Wilk, Ewa; Krówczyńska, Małgorzata; Pabjanek, Piotr; Mędrzycki, Piotr

    2017-05-01

    Asbestos's unique set of physical and chemical properties has led to many industrial applications worldwide; one of them was roof covering. Asbestos is harmful to human health, and its use has therefore been legally forbidden. Since there are no adequate data on the amount of asbestos-cement roofing in Poland, the objective of this study was to estimate its quantity on the basis of a physical inventory using aerial imagery and the application of selected statistical features. Data pre-processing and analysis were executed in the R Statistical Environment v. 3.1.0. The best random forest models were computed; the model explaining 72.9% of the variance was subsequently used to prepare a prediction map of the amount of asbestos-cement roofing in Poland. Variables defining the number of farms, the number and age of buildings, and regional differences were crucial for the analysis. The total amount of asbestos roofing in Poland was estimated at 738,068,000 m² (approximately 8.2 million tonnes). This estimate is crucial for the landfill development programme, the distribution of financial resources, and the application of monitoring policies.
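
    As a hedged sketch of the modeling approach described (random forests predicting roofing area from inventory covariates, summarized by variance explained), the snippet below uses scikit-learn on a synthetic commune-level table; all column names and values are hypothetical, not the study's variables.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 500
df = pd.DataFrame({
    "n_farms": rng.poisson(300, n),                 # hypothetical covariates
    "n_buildings": rng.poisson(1500, n),
    "mean_building_age": rng.normal(45, 10, n),
    "region_code": rng.integers(1, 17, n),
})
df["roof_area_m2"] = (60 * df["n_farms"] + 5 * df["n_buildings"]
                      + 200 * df["mean_building_age"] + rng.normal(0, 5000, n))

X, y = df.drop(columns="roof_area_m2"), df["roof_area_m2"]
rf = RandomForestRegressor(n_estimators=500, random_state=0)
r2 = cross_val_score(rf, X, y, cv=5, scoring="r2").mean()   # variance explained (cf. 72.9% in the study)
print(f"cross-validated R^2 on the synthetic data: {r2:.2f}")
```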

  18. Science and Mathematics Teaching Efficacy Beliefs of Pre-School Teachers

    ERIC Educational Resources Information Center

    Aydogdu, Bülent; Peker, Murat

    2016-01-01

    The aim of this research was to examine science and mathematics teaching efficacy beliefs of pre-school teachers in terms of some variables. The sample of the study was comprised of 191 pre-school teachers working in a city in Aegean Region of Turkey. Since it attempted to define self-efficacy beliefs of pre-school teachers toward science and…

  19. The Sense of Incompleteness as a Motivator of Obsessive-Compulsive Symptoms: An Empirical Analysis of Concepts and Correlates

    PubMed Central

    Taylor, Steven; McKay, Dean; Crowe, Katherine B.; Abramowitz, Jonathan S.; Conelea, Christine A.; Calamari, John E.; Sica, Claudio

    2014-01-01

    Contemporary models of obsessive-compulsive disorder emphasize the importance of harm avoidance (HA) and related dysfunctional beliefs as motivators of obsessive-compulsive (OC) symptoms. Recently, there has been a resurgence of interest in Janet’s (1908) concept of incompleteness (INC) as another potentially important motivator. Contemporary investigators define INC as the sense that one’s actions, intentions, or experiences have not been properly achieved. Janet defined INC more broadly to include alexithymia, depersonalization, derealization, and impaired psychological mindedness. We conducted two studies to address four issues: (a) the clinical correlates of INC; (b) whether INC and HA are distinguishable constructs; (c) whether INC predicts OC symptoms after controlling for HA; and (d) the relative merits of broad versus narrow conceptualizations of INC. Study 1 was a meta-analysis of the clinical correlates of narrowly defined INC (16 studies, N=5,940). INC was correlated with all types of OC symptoms, and was more strongly correlated with OC symptoms than with general distress. Study 2 (N=534 nonclinical participants) showed that: (a) INC and HA were strongly correlated but factor analytically distinguishable; (b) INC statistically predicted all types of OC symptoms even after controlling for HA; and (c) narrow INC was most strongly correlated with OC symptoms whereas broad INC was most strongly correlated with general distress. Although the findings are limited by being correlational in nature, they support the hypothesis that INC, especially in its narrow form, is a motivator of OC symptoms. PMID:24491200

  20. Empirical Studies of Patterning

    ERIC Educational Resources Information Center

    Pasnak, Robert

    2017-01-01

    Young children have been taught simple sequences of alternating shapes and colors, referred to as "patterning", for the past half century in the hope that their understanding of pre-algebra and their mathematics achievement would be improved. The evidence that such patterning instruction actually improves children's academic achievement…

  1. Statistical analysis plan for the Pneumatic CompREssion for PreVENting Venous Thromboembolism (PREVENT) trial: a study protocol for a randomized controlled trial.

    PubMed

    Arabi, Yaseen; Al-Hameed, Fahad; Burns, Karen E A; Mehta, Sangeeta; Alsolamy, Sami; Almaani, Mohammed; Mandourah, Yasser; Almekhlafi, Ghaleb A; Al Bshabshe, Ali; Finfer, Simon; Alshahrani, Mohammed; Khalid, Imran; Mehta, Yatin; Gaur, Atul; Hawa, Hassan; Buscher, Hergen; Arshad, Zia; Lababidi, Hani; Al Aithan, Abdulsalam; Jose, Jesna; Abdukahil, Sheryl Ann I; Afesh, Lara Y; Dbsawy, Maamoun; Al-Dawood, Abdulaziz

    2018-03-15

    The Pneumatic CompREssion for Preventing VENous Thromboembolism (PREVENT) trial evaluates the effect of adjunctive intermittent pneumatic compression (IPC) with pharmacologic thromboprophylaxis compared to pharmacologic thromboprophylaxis alone on venous thromboembolism (VTE) in critically ill adults. In this multicenter randomized trial, critically ill patients receiving pharmacologic thromboprophylaxis will be randomized to an IPC or a no IPC (control) group. The primary outcome is "incident" proximal lower-extremity deep vein thrombosis (DVT) within 28 days after randomization. Radiologists interpreting the lower-extremity ultrasonography will be blinded to intervention allocation, whereas the patients and treating team will be unblinded. The trial has 80% power to detect a 3% absolute risk reduction in the rate of proximal DVT from 7% to 4%. Consistent with international guidelines, we have developed a detailed plan to guide the analysis of the PREVENT trial. This plan specifies the statistical methods for the evaluation of primary and secondary outcomes, and defines covariates for adjusted analyses a priori. Application of this statistical analysis plan to the PREVENT trial will facilitate unbiased analyses of clinical data. ClinicalTrials.gov , ID: NCT02040103 . Registered on 3 November 2013; Current controlled trials, ID: ISRCTN44653506 . Registered on 30 October 2013.
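
    The quoted power statement (80% power for a 7% vs. 4% proximal DVT rate) corresponds to a standard two-proportion sample-size calculation; the sketch below uses the usual normal approximation with an assumed two-sided alpha of 0.05, and is not taken from the trial's statistical analysis plan.

```python
from scipy.stats import norm

def n_per_arm_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for comparing two independent proportions."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    p_bar = (p1 + p2) / 2
    num = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
           + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return num / (p1 - p2) ** 2

print(round(n_per_arm_two_proportions(0.07, 0.04)))   # approximate per-arm n for 7% vs. 4% DVT rates
```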

  2. Empirical correlations between the arrhenius' parameters of impurities' diffusion coefficients in CdTe crystals

    DOE PAGES

    Shcherbak, L.; Kopach, O.; Fochuk, P.; ...

    2015-01-21

    Understanding of self- and dopant diffusion in semiconductor devices is essential for assuring the formation of well-defined doped regions. In this paper, we compare the Arrhenius parameters (D = D0·exp(-ΔEa/kT)) of point-defect diffusion coefficients reported in the literature to date for impurities from groups I-VII in CdTe crystals and films. We found a linear dependence between the pre-exponential factor, D0, and the activation energy, ΔEa, of the different diffusing species. This was evident for self-diffusivity and for the diffusivity of the isovalent impurity Hg, as well as for the dominant group-IIIA and group-IVA impurities and chlorine, but not for the fast-diffusing elements (e.g., Cu and Ag), the chalcogens O, S, and Se, the halogens I and Br, or the transition impurities Mn, Co, and Fe. Reasons for the lack of correspondence of these data to the compensation dependence are discussed.
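
    The linear dependence reported between the pre-exponential factor and the activation energy (a compensation, or Meyer-Neldel-type, relation) can be fitted as in the sketch below; the (ΔEa, D0) pairs are placeholders, not values from the paper.

```python
import numpy as np

# Placeholder (activation energy [eV], pre-exponential factor [cm^2/s]) pairs; illustrative only.
ea = np.array([0.8, 1.1, 1.4, 1.7, 2.0])
d0 = np.array([3e-4, 5e-3, 8e-2, 1.2, 20.0])

# Compensation relation: ln D0 = a + b * Ea, fitted by ordinary least squares.
b, a = np.polyfit(ea, np.log(d0), 1)
print(f"ln D0 ~ {a:.2f} + {b:.2f} * Ea   (the slope b relates to an isokinetic temperature)")
```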

  3. A review on acidifying treatments for vegetable canned food.

    PubMed

    Derossi, A; Fiore, A G; De Pilli, T; Severini, C

    2011-12-01

    As is well known, pasteurization treatments are not sufficient to destroy heat-resistant spore-forming microorganisms, which are instead prevented from germinating and growing by reducing the pH. The acidification process has therefore become one of the most important pre-treatments for the canning industry. It is commonly applied before pasteurization with the purpose of inhibiting spore germination and reducing the heat resistance of microorganisms, thereby allowing the time or temperature of the heat treatment to be reduced. Several techniques are available to reduce the pH of vegetables, but their application is not easy to plan. Often, industries define operating conditions only on the basis of empirical experience, thus increasing the risk of microbial growth or imparting an unpleasant sour taste. With the aim of highlighting the correct planning and management of acidification treatments to reach safety without degrading the quality of canned fruit and vegetables, the topics reviewed and discussed are the effects of low pH on the heat resistance of the most important microorganisms, acidification techniques and significant process variables, the effect of low pH on sensorial properties, and future trends.

  4. A test of prospect theory.

    PubMed

    Feeny, David; Eng, Ken

    2005-01-01

    Prospect theory (PT) hypothesizes that people judge states relative to a reference point, usually assumed to be their current health. States better than the reference point are valued on a concave portion of the utility function; worse states are valued on a convex portion. Using prospectively collected utility scores, the objective is to empirically test the implications of PT. Osteoarthritis (OA) patients undergoing total hip arthroplasty periodically provided standard gamble scores for three hypothetical states describing mild, moderate, and severe OA, as well as for their subjectively defined current state (SDCS). Our hypothesis was that most patients improved between the pre- and post-surgery assessments. According to PT, scores for hypothetical states previously > SDCS but now < SDCS should be lower at the post-surgery assessment. Fourteen patients met the criteria for testing the hypothesis. Predictions were confirmed for 0 patients; there was no change or mixed results for 6 patients (42.9 percent); and scores moved in the direction opposite to that predicted by PT for 8 patients (57.1 percent). In general, the direction and magnitude of the changes in hypothetical-state scores do not conform to the predictions of PT.

  5. Significant Pre-Accession Factors Predicting Success or Failure During a Marine Corps Officer’s Initial Service Obligation

    DTIC Science & Technology

    2015-12-01

    [Only table-of-contents fragments were extracted for this record: "WAIVERS" (p. 49), "APPENDIX C. DESCRIPTIVE STATISTICS", and table titles including "Summary Statistics of Dependent Variables" (p. 23), "Table 6. Summary Statistics of Academics Variables" (p. 24), "Table 7. Summary Statistics of Application Variables" (p. 25), and "Table 8 ...".]

  6. An empirical test of Maslow's theory of need hierarchy using hologeistic comparison by statistical sampling.

    PubMed

    Davis-Sharts, J

    1986-10-01

    Maslow's hierarchy of basic human needs provides a major theoretical framework in nursing science. The purpose of this study was to empirically test Maslow's need theory, specifically at the levels of physiological and security needs, using a hologeistic comparative method. Thirty cultures taken from the 60 cultural units in the Health Relations Area Files (HRAF) Probability Sample were found to have data available for examining hypotheses about thermoregulatory (physiological) and protective (security) behaviors practiced prior to sleep onset. The findings demonstrate there is initial worldwide empirical evidence to support Maslow's need hierarchy.

  7. The Elegance of Disordered Granular Packings: A Validation of Edwards' Hypothesis

    NASA Technical Reports Server (NTRS)

    Metzger, Philip T.; Donahue, Carly M.

    2004-01-01

    We have found a way to analyze Edwards' density of states for static granular packings in the special case of round, rigid, frictionless grains assuming constant coordination number. It obtains the most entropic density of single-grain states, which predicts several observables including the distribution of contact forces. We compare these results against empirical data obtained in dynamic simulations of granular packings. The agreement between theory and the empirical data is quite good, helping validate the use of statistical mechanics methods in granular physics. The differences between theory and empirics are mainly due to the variable coordination number, and when the empirical data are sorted by that number we obtain several insights that suggest an underlying elegance in the density of states.

  8. Colonial modernity and networks in the Japanese empire: the role of Gotō Shinpei.

    PubMed

    Low, Morris

    2010-01-01

    This paper examines how Gotō Shinpei (1857-1929) sought to develop imperial networks emanating out of Tokyo in the fields of public health, railways, and communications. These areas helped define colonial modernity in the Japanese empire. In public health, Gotō's friendship with the bacteriologist Kitasato Shibasaburō led to the establishment of an Institute of Infectious Diseases in Tokyo. Key scientists from the institute took up positions in colonial medical colleges, creating a public health network that serviced the empire. Much of the empire itself was linked by a network of railways. Gotō was the first president of the South Manchuria Railway company (SMR). Communication technologies, especially radio, helped to bring the empire closer. By 1925, the Tokyo Broadcasting Station had begun its public radio broadcasts. Broadcasting soon came under the umbrella of the new organization, the Nippon Hōsō Kyōkai (NHK). Gotō was NHK's first president. The empire would soon be linked by radio, and it was by radio that Emperor Hirohito announced to the nation in 1945 that the empire had been lost.

  9. Uncertain impacts on economic growth when stabilizing global temperatures at 1.5°C or 2°C warming

    PubMed Central

    Pretis, Felix; Schwarz, Moritz; Tang, Kevin; Haustein, Karsten; Allen, Myles R.

    2018-01-01

    Empirical evidence suggests that variations in climate affect economic growth across countries over time. However, little is known about the relative impacts of climate change on economic outcomes when global mean surface temperature (GMST) is stabilized at 1.5°C or 2°C warming relative to pre-industrial levels. Here we use a new set of climate simulations under 1.5°C and 2°C warming from the ‘Half a degree Additional warming, Prognosis and Projected Impacts' (HAPPI) project to assess changes in economic growth using empirical estimates of climate impacts in a global panel dataset. Panel estimation results that are robust to outliers and breaks suggest that within-year variability of monthly temperatures and precipitation has little effect on economic growth beyond global nonlinear temperature effects. While expected temperature changes under a GMST increase of 1.5°C lead to proportionally higher warming in the Northern Hemisphere, the projected impact on economic growth is larger in the Tropics and Southern Hemisphere. Accounting for econometric estimation and climate uncertainty, the projected impacts on economic growth of 1.5°C warming are close to indistinguishable from current climate conditions, while 2°C warming suggests statistically lower economic growth for a large set of countries (median projected annual growth up to 2% lower). Level projections of gross domestic product (GDP) per capita exhibit high uncertainties, with median projected global average GDP per capita approximately 5% lower at the end of the century under 2°C warming relative to 1.5°C. The correlation between climate-induced reductions in per capita GDP growth and national income levels is significant at the p < 0.001 level, with lower-income countries experiencing greater losses, which may increase economic inequality between countries and is relevant to discussions of loss and damage under the United Nations Framework Convention on Climate Change. This article is part of the theme issue ‘The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels'. PMID:29610370
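
    The panel approach described above — growth regressed on nonlinear global temperature (and precipitation) terms with country and year effects — can be sketched in a few lines. The sketch below is illustrative only, not the authors' specification; the data frame `df` and its columns (country, year, temp, precip, growth) are assumed names.

    # Illustrative sketch (not the authors' code): a quadratic temperature-growth
    # response in a country-year panel with country and year fixed effects.
    import pandas as pd
    import statsmodels.formula.api as smf

    def fit_growth_response(df: pd.DataFrame):
        """Panel OLS with country and year fixed effects and quadratic
        temperature and precipitation terms; errors clustered by country."""
        model = smf.ols(
            "growth ~ temp + I(temp**2) + precip + I(precip**2)"
            " + C(country) + C(year)",
            data=df,
        ).fit(cov_type="cluster", cov_kwds={"groups": df["country"]})
        return model

    # The growth-maximizing temperature implied by the quadratic fit is
    # T* = -b_temp / (2 * b_temp2), evaluated from the estimated coefficients.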

  10. Prediction of mandibular rotation: an empirical test of clinician performance.

    PubMed

    Baumrind, S; Korn, E L; West, E E

    1984-11-01

    An experiment was conducted in an attempt to determine empirically how effective a number of expert clinicians were at differentiating "backward rotators" from "forward rotators" on the basis of head-film information which might reasonably have been available to them prior to instituting treatment for the correction of Class II malocclusion. As a result of a previously reported ongoing study, pre- and posttreatment head films were available for 188 patients treated in the mixed dentition for the correction of Class II malocclusion and for 50 untreated Class II subjects. These subjects were divided into 14 groups (average size of group, 17; range, 6 to 23) solely on the basis of type of treatment and the clinician from whose clinic the records had originated. From within each group, we selected the two or three subjects who had exhibited the most extreme backward rotation and the two or three subjects who had exhibited the most extreme forward rotation of the mandible during the interval between films. The sole criterion for classification was magnitude of change in the mandibular plane angle of Downs between the pre- and posttreatment films of each patient. The resulting sample contained 32 backward-rotator subjects and 32 forward-rotator subjects. Five expert judges (mean clinical experience, 28 years) were asked to identify the backward-rotator subjects by examination of the pretreatment films. The findings may be summarized as follows: (1) No judge performed significantly better than chance. (2) There was strong evidence that the judges used a shared, though relatively ineffective, set of rules in making their discriminations between forward and backward rotators. (3) Statistical analysis of the predictive power of a set of standard cephalometric measurements which had previously been made for this set of subjects indicated that the numerical data also failed to identify potential backward rotators at a rate significantly better than chance. We infer from these findings that the ability of clinicians to identify backward rotators on the basis of information available at the outset of treatment is poor. Hence, we believe that it is unlikely that such predictions play any consequential operational role in the planning of successful orthodontic therapy at the present state of the art.
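
    As a rough illustration of the kind of chance-level comparison reported above, the sketch below applies a one-sided binomial test to a single judge's classifications of the 64 films (32 backward and 32 forward rotators). The count of correct calls is hypothetical and not taken from the study.

    # Illustrative sketch (not the original analysis): does one judge's accuracy
    # on 64 pretreatment films exceed the 50% expected by chance?
    from scipy.stats import binomtest

    n_films = 64       # 32 backward-rotator + 32 forward-rotator subjects
    n_correct = 36     # hypothetical number classified correctly by one judge

    result = binomtest(n_correct, n_films, p=0.5, alternative="greater")
    print(f"accuracy = {n_correct / n_films:.2f}, one-sided p = {result.pvalue:.3f}")
    # A non-significant p-value corresponds to the paper's finding that no judge
    # performed significantly better than chance.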

  11. Uncertain impacts on economic growth when stabilizing global temperatures at 1.5°C or 2°C warming.

    PubMed

    Pretis, Felix; Schwarz, Moritz; Tang, Kevin; Haustein, Karsten; Allen, Myles R

    2018-05-13

    Empirical evidence suggests that variations in climate affect economic growth across countries over time. However, little is known about the relative impacts of climate change on economic outcomes when global mean surface temperature (GMST) is stabilized at 1.5°C or 2°C warming relative to pre-industrial levels. Here we use a new set of climate simulations under 1.5°C and 2°C warming from the 'Half a degree Additional warming, Prognosis and Projected Impacts' (HAPPI) project to assess changes in economic growth using empirical estimates of climate impacts in a global panel dataset. Panel estimation results that are robust to outliers and breaks suggest that within-year variability of monthly temperatures and precipitation has little effect on economic growth beyond global nonlinear temperature effects. While expected temperature changes under a GMST increase of 1.5°C lead to proportionally higher warming in the Northern Hemisphere, the projected impact on economic growth is larger in the Tropics and Southern Hemisphere. Accounting for econometric estimation and climate uncertainty, the projected impacts on economic growth of 1.5°C warming are close to indistinguishable from current climate conditions, while 2°C warming suggests statistically lower economic growth for a large set of countries (median projected annual growth up to 2% lower). Level projections of gross domestic product (GDP) per capita exhibit high uncertainties, with median projected global average GDP per capita approximately 5% lower at the end of the century under 2°C warming relative to 1.5°C. The correlation between climate-induced reductions in per capita GDP growth and national income levels is significant at the p < 0.001 level, with lower-income countries experiencing greater losses, which may increase economic inequality between countries and is relevant to discussions of loss and damage under the United Nations Framework Convention on Climate Change. This article is part of the theme issue 'The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels'. © 2018 The Authors.

  12. Uncertain impacts on economic growth when stabilizing global temperatures at 1.5°C or 2°C warming

    NASA Astrophysics Data System (ADS)

    Pretis, Felix; Schwarz, Moritz; Tang, Kevin; Haustein, Karsten; Allen, Myles R.

    2018-05-01

    Empirical evidence suggests that variations in climate affect economic growth across countries over time. However, little is known about the relative impacts of climate change on economic outcomes when global mean surface temperature (GMST) is stabilized at 1.5°C or 2°C warming relative to pre-industrial levels. Here we use a new set of climate simulations under 1.5°C and 2°C warming from the 'Half a degree Additional warming, Prognosis and Projected Impacts' (HAPPI) project to assess changes in economic growth using empirical estimates of climate impacts in a global panel dataset. Panel estimation results that are robust to outliers and breaks suggest that within-year variability of monthly temperatures and precipitation has little effect on economic growth beyond global nonlinear temperature effects. While expected temperature changes under a GMST increase of 1.5°C lead to proportionally higher warming in the Northern Hemisphere, the projected impact on economic growth is larger in the Tropics and Southern Hemisphere. Accounting for econometric estimation and climate uncertainty, the projected impacts on economic growth of 1.5°C warming are close to indistinguishable from current climate conditions, while 2°C warming suggests statistically lower economic growth for a large set of countries (median projected annual growth up to 2% lower). Level projections of gross domestic product (GDP) per capita exhibit high uncertainties, with median projected global average GDP per capita approximately 5% lower at the end of the century under 2°C warming relative to 1.5°C. The correlation between climate-induced reductions in per capita GDP growth and national income levels is significant at the p < 0.001 level, with lower-income countries experiencing greater losses, which may increase economic inequality between countries and is relevant to discussions of loss and damage under the United Nations Framework Convention on Climate Change. This article is part of the theme issue 'The Paris Agreement: understanding the physical and social challenges for a warming world of 1.5°C above pre-industrial levels'.

  13. The Characteristics of Turbulent Flows on Forested Floodplains

    NASA Astrophysics Data System (ADS)

    Darby, S. E.; Richardson, K.; Sear, D. A.

    2008-12-01

    Forested floodplain environments represent the undisturbed land cover of most river systems, but they are under threat from human activities. An understanding of forest floodplain processes therefore has relevance to ecosystem conservation and restoration, and to the interpretation of prehistoric river and floodplain evolution. However, relatively little research has been undertaken within forested floodplain environments, a particular limitation being an absence of empirical data regarding the hydraulic characteristics of overbank flows, which inhibits the development of flow, sediment, and solute transport models. Forest floodplain flows are strongly modified by floodplain topography and the presence of vegetation and organic debris on the woodland floor. In such settings, flow blockage and diversion are common, and there is the possibility of intense turbulence generation, both by wakes and by shear. To address this gap we have undertaken a study based on a floodplain reach located in the Highland Water Research Catchment (southern England), a UK national reference site for lowland floodplain forest streams. Given the difficulties of acquiring spatially distributed hydraulic data sets during floods, our methodological approach has been to replicate overbank flow observed at the study site within a laboratory flume. This is necessary to acquire flow velocity data at sufficiently high spatial resolution to evaluate the underlying flow mechanics, and has been achieved using (i) a large (21 m) flume to achieve 1:1 hydraulic scaling and (ii) a novel method of precisely replicating the floodplain topography within the flume. Specifically, accurate replication of a representative floodplain patch was achieved by creating a 1:1 scale Physical Terrain Model (PTM) from high-density polyurethane using a computer-controlled milling process based on Digital Terrain Model (DTM) data, the latter acquired via a terrestrial laser scanning (TLS) survey. The PTM was deployed within the flume immediately downstream of an 8 m long hydraulically smooth 'run-in' section, with a steady discharge replicating an overbank flow observed in the field, thus achieving 1:1 hydraulic scaling. Above the PTM, 3D flow-velocity time series were acquired at each node on a dense (5-10 cm horizontal spatial resolution) sampling grid using Acoustic Doppler Velocimeters (ADVs). The data were analysed by visualising the 3D structure of flow velocity and derivative statistics (turbulence intensity, turbulent kinetic energy, Reynolds stresses, etc.), combined with quadrant analysis to identify the spatial variation of each quadrant's contribution to the turbulence intensity. These analyses have been used to delineate flow regions dominated by different structures and to construct an empirical model that will be helpful in defining relevant modelling strategies in future research.
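
    To make the derivative statistics mentioned above concrete, the sketch below computes turbulent kinetic energy, a Reynolds shear stress, and quadrant contributions from a single ADV velocity time series. It is a generic illustration under assumed variable names, not the authors' processing code.

    # Illustrative sketch: basic turbulence statistics and quadrant analysis for
    # one ADV record, with u, v, w the streamwise, lateral, and vertical
    # velocity components (m/s) as NumPy arrays.
    import numpy as np

    def turbulence_stats(u, v, w):
        """Return turbulent kinetic energy, Reynolds shear stress -rho*<u'w'>,
        and the fraction of |u'w'| contributed by each quadrant."""
        rho = 1000.0                                           # water density, kg/m^3
        up, vp, wp = u - u.mean(), v - v.mean(), w - w.mean()  # fluctuations
        tke = 0.5 * (np.mean(up**2) + np.mean(vp**2) + np.mean(wp**2))
        reynolds_stress = -rho * np.mean(up * wp)
        # Quadrants: Q1 outward interaction, Q2 ejection, Q3 inward interaction,
        # Q4 sweep, defined by the signs of u' and w'.
        quads = {
            "Q1": (up > 0) & (wp > 0),
            "Q2": (up < 0) & (wp > 0),
            "Q3": (up < 0) & (wp < 0),
            "Q4": (up > 0) & (wp < 0),
        }
        total = np.sum(np.abs(up * wp))
        fractions = {q: np.sum(np.abs(up[m] * wp[m])) / total for q, m in quads.items()}
        return tke, reynolds_stress, fractions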

  14. On Allometry Relations

    NASA Astrophysics Data System (ADS)

    West, Damien; West, Bruce J.

    2012-07-01

    There are a substantial number of empirical relations that began with the identification of a pattern in data; were shown to have a terse power-law description; were interpreted using existing theory; reached the level of "law" and were given a name; only to fade away subsequently when it proved impossible to connect the "law" with a larger body of theory and/or data. Various forms of allometry relations (ARs) have followed this path. The ARs in biology are nearly two hundred years old, and those in ecology, geophysics, physiology, and other areas of investigation are not that much younger. In general, if X is a measure of the size of a complex host network and Y is a property of a complex subnetwork embedded within the host network, a theoretical AR exists between the two when Y = aX^b. We emphasize that the reductionistic models of AR interpret X and Y as dynamic variables, albeit the ARs themselves are explicitly time-independent, even though in some cases the parameter values change over time. On the other hand, the phenomenological models of AR are based on the statistical analysis of data and interpret X and Y as averages to yield the empirical AR: ⟨Y⟩ = a⟨X⟩^b. Modern explanations of AR begin with the application of fractal geometry and fractal statistics to scaling phenomena. The detailed application of fractal geometry to the explanation of theoretical ARs in living networks is slightly more than a decade old and, although well received, it has not been universally accepted. An alternate perspective is given by the empirical AR that is derived using linear regression analysis of fluctuating data sets. We emphasize that the theoretical and empirical ARs are not the same and review theories "explaining" AR from both the reductionist and statistical fractal perspectives. The probability calculus is used to systematically incorporate both views into a single modeling strategy. We conclude that the empirical AR is entailed by the scaling behavior of the probability density, which is derived using the probability calculus.
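
    The regression-based empirical AR described above amounts to fitting log Y = log a + b log X by least squares. The sketch below is a generic illustration of that fit on synthetic data, not the authors' analysis.

    # Illustrative sketch: estimating the empirical allometry relation
    # <Y> = a <X>^b by ordinary least squares on log-transformed data.
    import numpy as np

    def fit_allometry(x, y):
        """Return (a, b) for the power law y = a * x**b via log-log regression."""
        logx, logy = np.log(x), np.log(y)
        b, log_a = np.polyfit(logx, logy, 1)   # slope = exponent b, intercept = log a
        return np.exp(log_a), b

    # Synthetic example following y ~ 2 * x^0.75 with multiplicative noise:
    rng = np.random.default_rng(0)
    x = rng.uniform(1, 100, 200)
    y = 2.0 * x**0.75 * np.exp(rng.normal(0, 0.1, 200))
    a, b = fit_allometry(x, y)
    print(f"a = {a:.2f}, b = {b:.2f}")          # expected to be close to a = 2, b = 0.75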

  15. Deriving Criteria-supporting Benchmark Values from Empirical Response Relationships: Comparison of Statistical Techniques and Effect of Log-transforming the Nutrient Variable

    EPA Science Inventory

    In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However there is little guidance for choosing among techniques, and the extent to which log-transfor...
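
    Because the abstract above is truncated, the sketch below is only a generic illustration of one such technique: inverting a simple linear stressor-response regression to obtain the nutrient value at a chosen response threshold, with and without log-transforming the nutrient variable. The inputs are assumed examples, not EPA data or methods.

    # Illustrative sketch (hypothetical inputs): a candidate nutrient benchmark
    # from a linear stressor-response fit, optionally log-transforming the stressor.
    import numpy as np

    def benchmark_from_regression(nutrient, response, response_threshold, log_x=False):
        """Nutrient concentration at which the fitted line crosses the threshold."""
        x = np.log(nutrient) if log_x else np.asarray(nutrient)
        slope, intercept = np.polyfit(x, response, 1)
        x_at_threshold = (response_threshold - intercept) / slope
        return np.exp(x_at_threshold) if log_x else x_at_threshold

    # Comparing log_x=False and log_x=True on the same data shows how the choice
    # of transformation can shift the extracted critical value.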

  16. A Matched Field Processing Framework for Coherent Detection Over Local and Regional Networks (Postprint)

    DTIC Science & Technology

    2011-12-30

    the term " superresolution "). The single-phase matched field statistic for a given template was also demonstrated to be a viable detection statistic... Superresolution with seismic arrays using empirical matched field processing, Geophys. J. Int. 182: 1455–1477. Kim, K.-H. and Park, Y. (2010): The 20

  17. Hybrid Tasks: Promoting Statistical Thinking and Critical Thinking through the Same Mathematical Activities

    ERIC Educational Resources Information Center

    Aizikovitsh-Udi, Einav; Clarke, David; Kuntze, Sebastian

    2014-01-01

    Even though statistical thinking and critical thinking appear to have strong links from a theoretical point of view, empirical research into the intersections and potential interrelatedness of these aspects of competence is scarce. Our research suggests that thinking skills in both areas may be interdependent. Given this interconnection, it should…

  18. An Empirical Consideration of a Balanced Amalgamation of Learning Strategies in Graduate Introductory Statistics Classes

    ERIC Educational Resources Information Center

    Vaughn, Brandon K.

    2009-01-01

    This study considers the effectiveness of a "balanced amalgamated" approach to teaching graduate level introductory statistics. Although some research stresses replacing traditional lectures with more active learning methods, the approach of this study is to combine effective lecturing with active learning and team projects. The results of this…

  19. The Empirical Review of Meta-Analysis Published in Korea

    ERIC Educational Resources Information Center

    Park, Sunyoung; Hong, Sehee

    2016-01-01

    Meta-analysis is a statistical method that is increasingly utilized to combine and compare the results of previous primary studies. However, because of the lack of comprehensive guidelines for how to use meta-analysis, many meta-analysis studies have failed to consider important aspects, such as statistical programs, power analysis, publication…

  20. AN EMPIRICAL BAYES APPROACH TO COMBINING ESTIMATES OF THE VALUE OF A STATISTICAL LIFE FOR ENVIRONMENTAL POLICY ANALYSIS

    EPA Science Inventory

    This analysis updates EPA's standard VSL estimate by using a more comprehensive collection of VSL studies that include studies published between 1992 and 2000, as well as applying a more appropriate statistical method. We provide a pooled effect VSL estimate by applying the empi...
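
    Since the abstract above is truncated, the sketch below only illustrates one common way of pooling study-level estimates with precision weights and a method-of-moments between-study variance, in the spirit of an empirical Bayes combination; it is not EPA's analysis, and the inputs are hypothetical.

    # Illustrative sketch: random-effects pooling of study-level VSL estimates
    # (DerSimonian-Laird style method-of-moments between-study variance).
    import numpy as np

    def pooled_estimate(estimates, variances):
        """Return the precision-weighted pooled mean and between-study variance."""
        estimates, variances = np.asarray(estimates, float), np.asarray(variances, float)
        w = 1.0 / variances                              # fixed-effect weights
        mu_fe = np.sum(w * estimates) / np.sum(w)
        q = np.sum(w * (estimates - mu_fe) ** 2)         # heterogeneity statistic
        k = len(estimates)
        tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_re = 1.0 / (variances + tau2)                  # random-effects weights
        mu_re = np.sum(w_re * estimates) / np.sum(w_re)
        return mu_re, tau2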
