Sample records for powernext futures statistics

  1. Statistical field theory of futures commodity prices

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Yu, Miao

    2018-02-01

    The statistical theory of commodity prices was formulated by Baaquie (2013). Further empirical studies of single (Baaquie et al., 2015) and multiple commodity prices (Baaquie et al., 2016) have provided strong evidence in support of the primary assumptions of the statistical formulation. In this paper, the model for spot prices (Baaquie, 2013) is extended to futures commodity prices using a statistical field theory. The futures prices are modeled as a two-dimensional statistical field, and a nonlinear Lagrangian is postulated. Empirical studies provide clear evidence in support of the model, with many nontrivial features of the model finding unexpected support from market data.

  2. Zu Problemen statistischer Methoden in der Sprachwissenschaft (Problems of Statistical Methods in Linguistics)

    ERIC Educational Resources Information Center

    Zorn, Klaus

    1973-01-01

    Discussion of the statistical apparatus employed in L. Doncheva-Mareva's article on the widespread usage of present and future tense forms with future meaning in German letters, Deutsch als Fremdsprache, n1 1971. (RS)

  3. Projecting future precipitation and temperature at sites with diverse climate through multiple statistical downscaling schemes

    NASA Astrophysics Data System (ADS)

    Vallam, P.; Qin, X. S.

    2017-10-01

    Anthropogenically driven climate change would affect the global ecosystem and has become a worldwide concern. Numerous studies have been undertaken to determine the future trends of meteorological variables at different scales. Despite these studies, there remains significant uncertainty in the prediction of future climates. To examine the uncertainty arising from using different schemes to downscale meteorological variables for future horizons, projections from different statistical downscaling schemes were examined. These schemes included the statistical downscaling model (SDSM), the change-factor method incorporated with LARS-WG, and the bias-corrected disaggregation (BCD) method. Global circulation models (GCMs) based on CMIP3 (HadCM3) and CMIP5 (CanESM2) were utilized to project changes in the future climate. Five study sites (i.e., Alice Springs, Edmonton, Frankfurt, Miami, and Singapore) with diverse climatic conditions were chosen to examine the spatial variability of applying the various statistical downscaling schemes. The study results indicated that regions experiencing heavy precipitation intensities were the most likely to show divergence between the predictions of the various statistical downscaling methods. The variance computed in projecting weather extremes also indicated the uncertainty introduced by the selection of downscaling tools and climate models. This study could help improve understanding of the features of different downscaling approaches and the overall downscaling uncertainty.
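    For readers unfamiliar with the change-factor (delta-change) approach named above, a minimal sketch with synthetic numbers (not the LARS-WG implementation) is:

```python
import numpy as np

def delta_change(observed, gcm_baseline, gcm_future, multiplicative=True):
    """Perturb an observed series by the GCM change signal (change-factor /
    delta-change approach): multiplicative scaling is the usual choice for
    precipitation, an additive shift for temperature."""
    if multiplicative:
        return observed * (gcm_future.mean() / gcm_baseline.mean())
    return observed + (gcm_future.mean() - gcm_baseline.mean())

# Synthetic daily precipitation (mm); the "GCM" is 20% wetter in the future
obs = np.array([0.0, 5.2, 1.1, 0.0, 12.3])
gcm_base = np.array([0.5, 4.0, 2.0, 1.0, 8.0])
gcm_fut = gcm_base * 1.2

scaled = delta_change(obs, gcm_base, gcm_fut)   # observed series scaled by 1.2
```

    The observed day-to-day variability is preserved; only the mean signal from the climate model is transferred, which is both the strength and the main limitation of the method.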

  4. An outlook for cargo aircraft of the future. [assessment of the future of air cargo by analyzing statistics and trends]

    NASA Technical Reports Server (NTRS)

    Nicks, O. W.; Whitehead, A. H., Jr.; Alford, W. J., Jr.

    1975-01-01

    An assessment is provided of the future of air cargo by analyzing air cargo statistics and trends, by noting air cargo system problems and inefficiencies, by analyzing characteristics of air-eligible commodities, and by showing the promise of new technology for future cargo aircraft with significant improvements in costs and efficiency. NASA's proposed program is reviewed which would sponsor the research needed to provide for development of advanced designs by 1985.

  5. Nonstationarity RC Workshop Report: Nonstationary Weather Patterns and Extreme Events Informing Design and Planning for Long-Lived Infrastructure

    DTIC Science & Technology

    2017-11-01

    magnitude, intensity, and seasonality of climate. For infrastructure projects, relevant design life often exceeds 30 years—a period of time of ... uncertainty about future statistical properties of climate at time and spatial scales required for planning and design purposes. Information ... about future statistical properties of climate at time and spatial scales required for planning and design, and for assessing future operational ...

  6. The Math Problem: Advertising Students' Attitudes toward Statistics

    ERIC Educational Resources Information Center

    Fullerton, Jami A.; Kendrick, Alice

    2013-01-01

    This study used the Students' Attitudes toward Statistics Scale (STATS) to measure attitude toward statistics among a national sample of advertising students. A factor analysis revealed four underlying factors make up the attitude toward statistics construct--"Interest & Future Applicability," "Confidence," "Statistical Tools," and "Initiative."…

  7. Nonparametric Bayesian predictive distributions for future order statistics

    Treesearch

    Richard A. Johnson; James W. Evans; David W. Green

    1999-01-01

    We derive the predictive distribution for a specified order statistic, determined from a future random sample, under a Dirichlet process prior. Two variants of the approach are treated and some limiting cases studied. A practical application to monitoring the strength of lumber is discussed including choices of prior expectation and comparisons made to a Bayesian...
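    The paper derives this predictive distribution in closed form; as an illustrative companion, a Monte Carlo sketch of the same quantity via the Polya-urn posterior-predictive scheme of a Dirichlet process (all data values, the base measure, and the concentration parameter below are hypothetical, not the paper's) might look like:

```python
import numpy as np

rng = np.random.default_rng(42)

def future_order_stat(data, m, r, alpha=1.0,
                      base=lambda rng: rng.normal(30.0, 5.0), n_sim=2000):
    """Monte Carlo predictive distribution of the r-th order statistic
    (1-indexed) of a future sample of size m under a Dirichlet process
    prior, using the Polya-urn posterior-predictive scheme: each new draw
    comes from the base measure with probability alpha/(alpha + n), else
    it repeats a previously seen value."""
    sims = np.empty(n_sim)
    for s in range(n_sim):
        pool = list(data)                 # observed + previously drawn values
        new = []
        for _ in range(m):
            if rng.random() < alpha / (alpha + len(pool)):
                v = base(rng)             # fresh draw from the base measure
            else:
                v = pool[rng.integers(len(pool))]   # repeat an earlier value
            pool.append(v)
            new.append(v)
        sims[s] = np.sort(new)[r - 1]
    return sims

# Toy "lumber strength" observations (hypothetical units)
data = np.array([28.0, 31.5, 25.2, 30.1, 27.8, 33.0, 29.4, 26.9])
sims = future_order_stat(data, m=5, r=1)  # minimum of a future sample of 5
```

    The empirical distribution of `sims` approximates the predictive distribution of the specified order statistic; the choice of base measure plays the role of the prior expectation discussed in the abstract.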

  8. Public Policy and Planning for Nurse Education and Practice.

    ERIC Educational Resources Information Center

    Feldbaum, Eleanor G.; Levitt, Morris J.

    This report focuses on nursing educational and practice issues that government officials may have to address in the near future. The report provides statistical information on nurses, compares statistics for white and black nurses, and recommends policies for the future. Data was gathered for the report during a three-year study of 5,175…

  9. Statistical Forecasting of Current and Future Circum-Arctic Ground Temperatures and Active Layer Thickness

    NASA Astrophysics Data System (ADS)

    Aalto, J.; Karjalainen, O.; Hjort, J.; Luoto, M.

    2018-05-01

    Mean annual ground temperature (MAGT) and active layer thickness (ALT) are key to understanding the evolution of the ground thermal state across the Arctic under climate change. Here a statistical modeling approach is presented to forecast current and future circum-Arctic MAGT and ALT in relation to climatic and local environmental factors, at spatial scales unreachable with contemporary transient modeling. After deploying an ensemble of multiple statistical techniques, distance-blocked cross validation between observations and predictions suggested excellent and reasonable transferability of the MAGT and ALT models, respectively. The MAGT forecasts indicated currently suitable conditions for permafrost to prevail over an area of 15.1 ± 2.8 × 106 km2. This extent is likely to dramatically contract in the future, as the results showed consistent, but region-specific, changes in ground thermal regime due to climate change. The forecasts provide new opportunities to assess future Arctic changes in ground thermal state and biogeochemical feedback.

  10. Looking Back over Their Shoulders: A Qualitative Analysis of Portuguese Teachers' Attitudes towards Statistics

    ERIC Educational Resources Information Center

    Martins, Jose Alexandre; Nascimento, Maria Manuel; Estrada, Assumpta

    2012-01-01

    Teachers' attitudes towards statistics can have a significant effect on their own statistical training, their teaching of statistics, and the future attitudes of their students. The influence of attitudes in teaching statistics in different contexts was previously studied in the work of Estrada et al. (2004, 2010a, 2010b) and Martins et al.…

  11. 78 FR 14153 - Advisory Council on Transportation Statistics; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-04

    ... Transportation Statistics; Meeting AGENCY: Research and Innovative Technology Administration (RITA), DOT. ACTION: ... Statistics (ACTS). The meeting was scheduled for Monday, March 4, 2013 from 8:30 a.m. to 4:00 p.m. E.S.T. in ..., Washington, DC. The Bureau of Transportation Statistics (BTS) will reschedule the meeting for a future date ...

  12. From Data to Information: New Directions for the National Center for Education Statistics. Synthesis Report.

    ERIC Educational Resources Information Center

    Hoachlander, Gary; And Others

    In the fall of 1995 the National Center for Education Statistics (NCES) held a conference to stimulate dialogue about future developments in the fields of education, statistical methodology, and technology and the implications of these developments for the nation's education statistics program. This paper summarizes and synthesizes the results of…

  13. Future time perspective and positive health practices in young adults: an extension.

    PubMed

    Mahon, N E; Yarcheski, T J; Yarcheski, A

    1997-06-01

    A sample of 69 young adults attending a public university responded to the Future Time Perspective Inventory, two subscales of the Time Experience Scales (Fast and Slow Tempo), and the Personal Lifestyle Questionnaire in classroom settings. A statistically significant correlation (.52) was found between scores for future time perspective and the ratings for the practice of positive health behaviors in young adults. This correlation was larger than those previously found for middle and late adolescents. Scores on subscales of individual health practices and future time perspective indicated statistically significant correlations for five (.25 to .56) of the six subscales. Scores on neither Fast nor Slow Tempo were related to ratings of positive health practices or ratings on subscales measuring positive health practices.

  14. An Analysis of Attitudes toward Statistics: Gender Differences among Advertising Majors.

    ERIC Educational Resources Information Center

    Fullerton, Jami A.; Umphrey, Don

    This study measures advertising students' attitudes toward statistics. Subjects, 275 undergraduate advertising students from two southwestern United States universities, completed a questionnaire used to gauge students' attitudes toward statistics by measuring 6 underlying factors: (1) students' interest and future applicability; (2) relationship…

  15. Teaching Research Methods and Statistics in eLearning Environments: Pedagogy, Practical Examples, and Possible Futures

    PubMed Central

    Rock, Adam J.; Coventry, William L.; Morgan, Methuen I.; Loi, Natasha M.

    2016-01-01

    Generally, academic psychologists are mindful of the fact that, for many students, the study of research methods and statistics is anxiety provoking (Gal et al., 1997). Given the ubiquitous and distributed nature of eLearning systems (Nof et al., 2015), teachers of research methods and statistics need to cultivate an understanding of how to effectively use eLearning tools to inspire psychology students to learn. Consequently, the aim of the present paper is to discuss critically how using eLearning systems might engage psychology students in research methods and statistics. First, we critically appraise definitions of eLearning. Second, we examine numerous important pedagogical principles associated with effectively teaching research methods and statistics using eLearning systems. Subsequently, we provide practical examples of our own eLearning-based class activities designed to engage psychology students to learn statistical concepts such as Factor Analysis and Discriminant Function Analysis. Finally, we discuss general trends in eLearning and possible futures that are pertinent to teachers of research methods and statistics in psychology. PMID:27014147

  17. Comparison of future and base precipitation anomalies by SimCLIM statistical projection through ensemble approach in Pakistan

    NASA Astrophysics Data System (ADS)

    Amin, Asad; Nasim, Wajid; Mubeen, Muhammad; Kazmi, Dildar Hussain; Lin, Zhaohui; Wahid, Abdul; Sultana, Syeda Refat; Gibbs, Jim; Fahad, Shah

    2017-09-01

    Unpredictable precipitation trends, largely influenced by climate change, have prolonged droughts and floods in South Asia. Statistical analysis of monthly, seasonal, and annual precipitation trends was carried out at different temporal (1996-2015 and 2041-2060) and spatial scales (39 meteorological stations) in Pakistan. A statistical downscaling model (SimCLIM) was used for future precipitation projection (2041-2060), applying an ensemble approach combined with representative concentration pathways (RCPs) at the medium level. The magnitude and slope of trends were derived by applying the Mann-Kendall and Sen's slope statistical approaches. Geostatistical tools were used to generate precipitation trend maps. The comparison of base and projected precipitation is represented by maps and graphical visualization, which facilitates trend detection. Results show that the precipitation trend was increasing at more than 70% of weather stations for February, March, April, August, and September in the base years. In the projected years, the precipitation trend decreased from February to April but increased from July to October. The strongest decreasing trend in the base years was reported in January, and it decreased further in the projected years. The greatest variation in precipitation trends between projected and base years was reported from February to April. Variations in the projected precipitation trend for Punjab and Baluchistan were most pronounced in March and April. Seasonal analysis shows large variation in winter, with an increasing trend at more than 30% of weather stations, rising to 40% for projected precipitation. High risk was reported in the base-year pre-monsoon season, where 90% of weather stations showed an increasing trend, but in the projected years this share decreased to 33%. Finally, the annual precipitation trend increased at more than 90% of meteorological stations in the base period (1996-2015) but decreases for the projected period (2041-2060) at up to 76% of stations. These results reveal that the overall precipitation trend is decreasing in future years, which may prolong drought at 14% of the weather stations under study.
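    The Mann-Kendall statistic and Sen's slope used in this record can be sketched in a few lines (a minimal version: tie corrections and the normal-approximation significance test are omitted):

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs of all pairwise differences.
    Positive S indicates an increasing trend, negative a decreasing one."""
    n = len(x)
    s = 0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return int(s)

def sens_slope(x):
    """Sen's slope: median of all pairwise slopes (x[j] - x[i]) / (j - i)."""
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i)
              for i in range(n - 1) for j in range(i + 1, n)]
    return float(np.median(slopes))

# A strictly increasing toy series: all n*(n-1)/2 = 10 pair signs are +1
x = np.array([1.0, 3.0, 5.0, 7.0, 9.0])
s, slope = mann_kendall_s(x), sens_slope(x)   # S = 10, slope = 2.0
```

    In practice `scipy.stats.kendalltau` and `scipy.stats.theilslopes` provide the same quantities with tie handling and confidence intervals.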

  18. Statistical bias correction method applied on CMIP5 datasets over the Indian region during the summer monsoon season for climate change applications

    NASA Astrophysics Data System (ADS)

    Prasanna, V.

    2018-01-01

    This study makes use of temperature and precipitation from CMIP5 climate model output for climate change application studies over the Indian region during the summer monsoon season (JJAS). Bias correction of temperature and precipitation from CMIP5 GCM simulations with respect to observations is discussed in detail. Non-linear statistical bias correction is a suitable method for climate change data because it is simple and does not add artificial uncertainties to the impact assessment of climate change scenarios for application studies (agricultural production changes) in the future. The simple statistical bias correction uses observational constraints on the GCM baseline, and the projected results are scaled with respect to the changing magnitude in future scenarios, which varies from one model to another. Two bias correction techniques are shown here: (1) a simple bias correction using a percentile-based quantile-mapping algorithm and (2) a simple but improved method, a cumulative distribution function (CDF; Weibull distribution function)-based quantile-mapping algorithm. This study shows that the percentile-based quantile-mapping method gives results similar to the CDF (Weibull)-based quantile-mapping method, and the two methods are comparable. The bias correction is applied to temperature and precipitation for the present climate and for future projections, which are then used in a simple statistical model to understand future changes in crop production over the Indian region during the summer monsoon season. In total, 12 CMIP5 models are used for the Historical (1901-2005), RCP4.5 (2005-2100), and RCP8.5 (2005-2100) scenarios. The climate index from each CMIP5 model and the observed agricultural yield index over the Indian region are used in a regression model to project changes in agricultural yield under the RCP4.5 and RCP8.5 scenarios. The results revealed a better convergence of model projections in the bias-corrected data than in the uncorrected data. The study can be extended to localized regional domains to understand future changes in agricultural productivity with an agro-economic or a simple statistical model. The statistical model indicated that total food grain yield will increase over the Indian region in the future: by approximately 50 kg/ha under the RCP4.5 scenario from 2001 until the end of 2100, and by approximately 90 kg/ha under the RCP8.5 scenario over the same period. Many studies use bias correction techniques, but this study applies the technique to future climate scenario data from CMIP5 models and then applies it to crop statistics to estimate future crop yield changes over the Indian region.
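    A minimal sketch of the percentile-based quantile-mapping variant described above, on synthetic data (the CDF-based variant would fit a Weibull distribution to each sample instead of using empirical quantiles):

```python
import numpy as np

def quantile_map(model_base, obs, series):
    """Percentile-based quantile mapping: each value in `series` is replaced
    by the observed quantile at that value's percentile rank within the
    model baseline climatology."""
    ranks = np.searchsorted(np.sort(model_base), series) / len(model_base)
    return np.quantile(obs, np.clip(ranks, 0.0, 1.0))

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, 1000)      # "observed" baseline (e.g., daily precip)
model = 0.7 * obs + 1.0              # model baseline with a systematic bias

# Mapping the model baseline onto itself recovers the observed statistics;
# the same mapping is then applied to the model's future series.
corrected = quantile_map(model, obs, model)
```

    Applied to a future model series, the mapping removes the baseline bias while letting the model's projected change pass through, which is the behavior the abstract relies on.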

  19. The Future of Statistical Software. Proceedings of a Forum--Panel on Guidelines for Statistical Software (Washington, D.C., February 22, 1991).

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    The Panel on Guidelines for Statistical Software was organized in 1990 to document, assess, and prioritize problem areas regarding quality and reliability of statistical software; present prototype guidelines in high priority areas; and make recommendations for further research and discussion. This document provides the following papers presented…

  20. Statistical Downscaling and Bias Correction of Climate Model Outputs for Climate Change Impact Assessment in the U.S. Northeast

    NASA Technical Reports Server (NTRS)

    Ahmed, Kazi Farzan; Wang, Guiling; Silander, John; Wilson, Adam M.; Allen, Jenica M.; Horton, Radley; Anyah, Richard

    2013-01-01

    Statistical downscaling can be used to efficiently downscale a large number of General Circulation Model (GCM) outputs to a fine temporal and spatial scale. To facilitate regional impact assessments, this study statistically downscales (to 1/8-degree spatial resolution) and corrects the bias of daily maximum and minimum temperature and daily precipitation data from six GCMs and four Regional Climate Models (RCMs) for the northeast United States (US) using the Statistical Downscaling and Bias Correction (SDBC) approach. Based on these downscaled data from multiple models, five extreme indices were analyzed for the future climate to quantify future changes in climate extremes. For a subset of models and indices, results based on raw and bias-corrected model outputs for the present-day climate were compared with observations, demonstrating that bias correction is important not only for GCM outputs but also for RCM outputs. For the future climate, bias correction led to a higher level of agreement among the models in predicting the magnitude and capturing the spatial pattern of the extreme climate indices. We found that incorporating dynamical downscaling as an intermediate step does not lead to considerable differences in the statistical downscaling results for the study domain.
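    As an illustration of the kind of extreme index such studies analyze, a simple threshold-exceedance count can be computed from a downscaled daily series (synthetic data below; this is a generic index, not necessarily one of the five used in the paper):

```python
import numpy as np

def warm_extreme_index(daily_tmax, baseline_tmax, pct=95):
    """Count of days exceeding the baseline `pct`-th percentile threshold,
    a common way to quantify changes in warm extremes."""
    thresh = np.percentile(baseline_tmax, pct)
    return int((daily_tmax > thresh).sum())

rng = np.random.default_rng(1)
base = rng.normal(25.0, 5.0, 3650)     # 10 years of baseline daily Tmax (degC)
future = base + 2.0                    # uniform 2 degC warming, for illustration

n_base = warm_extreme_index(base, base)       # ~5% of 3650 days by construction
n_future = warm_extreme_index(future, base)   # substantially more exceedances
```

    Because the threshold is fixed from the baseline, even a modest mean shift multiplies the exceedance count, which is why bias correction of the underlying series matters so much for extreme indices.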

  1. Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

    NASA Astrophysics Data System (ADS)

    James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2017-06-01

    One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process's past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
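    The claim that replacing a variable by its minimal sufficient statistic preserves mutual information is easy to verify numerically for a small discrete joint distribution (the pmf below is illustrative, not from the paper):

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in bits for a joint pmf given as a 2-D array."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (rows)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (cols)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

# Joint pmf: rows are x0..x3, columns y0..y1.  Rows 0 and 1 induce the
# same conditional p(y|x), so the minimal sufficient statistic of X about
# Y merges them into a single effective state.
pxy = np.array([[0.10, 0.10],
                [0.15, 0.15],
                [0.25, 0.05],
                [0.05, 0.15]])
merged = np.vstack([pxy[0] + pxy[1], pxy[2], pxy[3]])
# mutual_information(pxy) == mutual_information(merged): the coarser
# 3-state variable carries exactly the same information about Y.
```
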

  2. Generation of future potential scenarios in an Alpine Catchment by applying bias-correction techniques, delta-change approaches and stochastic Weather Generators at different spatial scale. Analysis of their influence on basic and drought statistics.

    NASA Astrophysics Data System (ADS)

    Collados-Lara, Antonio-Juan; Pulido-Velazquez, David; Pardo-Iguzquiza, Eulogio

    2017-04-01

    Assessing the impacts of potential future climate change scenarios on precipitation and temperature is essential for designing adaptive strategies in water resources systems. The objective of this work is to analyze the possibilities of different statistical downscaling methods for generating future potential scenarios in an Alpine catchment from historical data and the available climate model simulations performed within the CORDEX EU project. The initial information used to define these downscaling approaches is the historical climate data (taken from the Spain02 project for the period 1971-2000 with a spatial resolution of 12.5 km) and the future series provided by climate models for the horizon period 2071-2100. We used information from nine climate model simulations (five different regional climate models (RCMs) nested in four different global climate models (GCMs)) from the European CORDEX project. In our application we focused on the Representative Concentration Pathway (RCP) 8.5 emissions scenario, the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC). For each RCM we generated future climate series for the period 2071-2100 by applying two different approaches, bias correction and delta change, and five different transformation techniques (first-moment correction; first- and second-moment correction; regression functions; quantile mapping using a distribution-derived transformation; and quantile mapping using empirical quantiles) for both of them. Ensembles of the obtained series are proposed to obtain more representative potential future climate scenarios for studying potential impacts. In this work we propose a non-equally-weighted combination of the future series, giving more weight to those models (in the delta-change approaches) or combinations of models and techniques that better approximate the basic and drought statistics of the historical data. A multi-objective analysis using basic statistics (mean, standard deviation, and skewness coefficient) and drought statistics (duration, magnitude, and intensity) has been performed to identify which models best reproduce the historical series. The drought statistics were obtained from the Standardized Precipitation Index (SPI) series using the theory of runs. This analysis allows us to identify the best RCM and, for the bias-correction method, the best combination of model and correction technique. We also analyzed the possibilities of using different stochastic weather generators to approximate the basic and drought statistics of the historical series. These analyses were performed for our case study in both a lumped and a distributed way in order to assess their sensitivity to spatial scale. The statistics of the future temperature series obtained with the different ensemble options are quite homogeneous, but the precipitation shows a higher sensitivity to the adopted method and spatial scale. The global increments in the mean temperature values are 31.79%, 31.79%, 31.03%, and 31.74% for the distributed bias-correction, distributed delta-change, lumped bias-correction, and lumped delta-change ensembles, respectively, and for precipitation they are -25.48%, -28.49%, -26.42%, and -27.35%, respectively. Acknowledgments: This research work has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds. We would also like to thank the Spain02 and CORDEX projects for the data provided for this study, and the authors of the R package qmap.
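    The drought statistics via the theory of runs mentioned above reduce to scanning the SPI series for runs below a threshold. A minimal sketch (illustrative series; note that magnitude conventions vary across studies, here it is the summed absolute SPI over the run):

```python
def drought_events(spi, threshold=-1.0):
    """Theory of runs: a drought event is a maximal run of consecutive SPI
    values below `threshold`.  Duration = run length, magnitude = summed
    absolute SPI over the run, intensity = magnitude / duration."""
    events, run = [], []
    for v in list(spi) + [threshold + 1.0]:   # sentinel flushes a final run
        if v < threshold:
            run.append(v)
        elif run:
            dur, mag = len(run), -sum(run)
            events.append((dur, mag, mag / dur))
            run = []
    return events

spi = [0.5, -1.2, -1.8, 0.3, -1.1, 1.0]       # illustrative monthly SPI series
events = drought_events(spi)                  # two droughts: lengths 2 and 1
```

    Comparing these per-event statistics between historical and downscaled series is what drives the multi-objective model selection described in the abstract.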

  3. Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johannesson, G

    2010-03-17

    Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are interested not only in 'best guesses' of expected climate change, but rather in probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4 °C in region R if global CO2 emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer, and it is answering these kinds of questions that is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) uncertainty about future emissions of greenhouse gases (GHG); (2) given a future GHG emission scenario, what is its impact on the global climate?; and (3) given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflects, to some degree, our uncertainty in being able to simulate future climate given a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS).
    Dynamical downscaling involves configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides boundary conditions for the RCM). On the other hand, statistical downscaling aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors. The resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but has been extensively studied; the reader is referred to Wilby et al. (1998), Murphy (1999), Wood et al. (2004), Benestad et al. (2007), Fowler et al. (2007), and references therein. The scope of this effort is to study a methodology, a statistical framework, to propagate and account for GCM uncertainty in regional statistical downscaling assessment. In particular, we explore how to leverage an ensemble of GCM projections to quantify the impact of GCM uncertainty in such an assessment. There are three main components to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while analysis tools were developed in the statistical programming language R; see Figure 1.
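    The probabilistic question posed above has a simplest possible reading in which each ensemble member is one equally likely draw; a sketch with hypothetical numbers (real assessments weight models and add statistical modeling on top of the raw ensemble):

```python
import numpy as np

# Hypothetical ensemble of projected regional summer warming (deg C) from
# ten GCMs run under one fixed GHG emission scenario (illustrative numbers)
delta_t = np.array([2.8, 3.4, 4.1, 3.9, 4.6, 3.1, 4.3, 3.7, 5.0, 4.2])

# Treating each GCM as one equally likely draw, the probability that the
# average summer temperature increases by at least 4 deg C is simply the
# ensemble exceedance fraction
p_ge_4 = float((delta_t >= 4.0).mean())
```

    This equal-weight estimate captures only the spread across GCMs, not emission-scenario or downscaling uncertainty, which is exactly the gap the statistical framework in this record aims to address.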

  4. Attitudes to statistics in primary health care physicians, Qassim province.

    PubMed

    Jahan, Saulat; Al-Saigul, Abdullah Mohammed; Suliman, Amel Abdalrhim

    2016-07-01

    Aim: To investigate primary health care (PHC) physicians' attitudes to statistics, their self-reported knowledge level, and their perceived training needs in statistics. In spite of a realization of the importance of statistics, inadequacies in physicians' knowledge and skills have been found, underscoring the need for in-service training. Understanding physicians' attitudes to statistics is vital in planning statistics training. The study was based on the theory of planned behavior. A cross-sectional survey of all PHC physicians was conducted in Qassim province from August to October 2014. Attitudes to statistics were determined by a self-administered questionnaire. The attitudes were assessed on four subscales: general perceptions; perceptions of knowledge and training; perceptions of statistics and evidence-based medicine; and perceptions of future learning. Findings: Of 416 eligible participants, 338 (81.25%) responded to the survey. On a scale of 1-10, the majority (73.6%) of the participants self-assessed their level of statistics knowledge as five or below. The attitude scores could range from a minimum of 20 to a maximum of 100, with higher scores indicating a positive attitude. The participants showed a positive attitude, with a mean score of 71.14 (±7.73). Of the four subscales, 'perceptions of statistics and evidence-based medicine' scored the highest, followed by 'perceptions of future learning'. PHC physicians have a positive attitude to statistics. However, they realize their gaps in knowledge of statistics and are keen to fill them. Statistics training resulting in improved statistics knowledge is expected to lead to clinical care utilizing evidence-based medicine, and thus to improved health care services.

  5. Predicting future protection of respirator users: Statistical approaches and practical implications.

    PubMed

    Hu, Chengcheng; Harber, Philip; Su, Jing

    2016-01-01

The purpose of this article is to describe a statistical approach for predicting a respirator user's fit factor in the future based upon results from initial tests. A statistical prediction model was developed based upon the joint distribution of multiple fit factor measurements over time, obtained from linear mixed effect models. The model accounts for within-subject correlation as well as short-term (within one day) and longer-term variability. As an example of applying this approach, model parameters were estimated from a research study in which volunteers were trained by three different modalities to use one of two types of respirators. They underwent two quantitative fit tests at the initial session and two on the same day approximately six months later. The fitted models demonstrated correlation and gave the estimated distribution of future fit test results conditional on past results for an individual worker. This approach can be applied to establish a criterion value for passing an initial fit test that provides a reasonable likelihood that a worker will be adequately protected in the future, and to optimize repeat fit test intervals individually for each user for cost-effective testing.
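A simplified version of the conditional-prediction idea can be sketched under a single-variance-component normal model. All parameter values below are hypothetical, and the short-term/long-term variance split of the full mixed-effects model is collapsed into one within-worker term:

```python
import math

# Hypothetical parameters (illustrative only, not from the study):
# log10 fit factors modeled as jointly normal with a shared subject effect.
mu = 2.2             # population mean of log10 fit factor
var_between = 0.10   # between-worker variance (stable component)
var_within = 0.05    # within-worker (test-to-test) variance

rho = var_between / (var_between + var_within)   # intraclass correlation

def predict_future(log_ff_initial):
    """Conditional mean and variance of a future log10 fit factor
    given one initial test result, under the bivariate normal model."""
    cond_mean = mu + rho * (log_ff_initial - mu)
    cond_var = (var_between + var_within) * (1.0 - rho ** 2)
    return cond_mean, cond_var

def prob_future_pass(log_ff_initial, log_threshold=2.0):
    """P(future fit factor >= threshold | initial result)."""
    m, v = predict_future(log_ff_initial)
    z = (log_threshold - m) / math.sqrt(v)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# A higher initial result implies a higher probability of future protection:
print(prob_future_pass(2.5), prob_future_pass(2.0))
```

Inverting `prob_future_pass` for a target probability is the kind of calculation that would yield a criterion value for the initial test.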

  6. Uncertainties of statistical downscaling from predictor selection: Equifinality and transferability

    NASA Astrophysics Data System (ADS)

    Fu, Guobin; Charles, Stephen P.; Chiew, Francis H. S.; Ekström, Marie; Potter, Nick J.

    2018-05-01

The nonhomogeneous hidden Markov model (NHMM) statistical downscaling model, 38 catchments in southeast Australia, and 19 general circulation models (GCMs) were used in this study to demonstrate statistical downscaling uncertainties caused by equifinality and transferability. That is, there can be multiple sets of predictors that give similar daily rainfall simulation results for both calibration and validation periods, yet project different magnitudes (or even directions) of rainfall change in the future. Results indicated that two sets of predictors (Set 1: sea level pressure north-south gradient, u-wind at 700 hPa, v-wind at 700 hPa, and specific humidity at 700 hPa; Set 2: sea level pressure north-south gradient, u-wind at 700 hPa, v-wind at 700 hPa, and dewpoint temperature depression at 850 hPa) as inputs to the NHMM produced satisfactory simulations of seasonal rainfall in comparison with observations. For example, during the model calibration period, the relative errors across the 38 catchments ranged from 0.48 to 1.76% with a mean of 1.09% for predictor Set 1, and from 0.22 to 2.24% with a mean of 1.16% for predictor Set 2. However, the NHMM projections of future rainfall based on the 19 GCMs differed in sign between the two sets of predictors: Set 1 projects an increase of future rainfall, with magnitudes depending on future time periods and emission scenarios, whereas Set 2 projects a decline. Such divergent projections present a significant challenge for applications of statistical downscaling as well as climate change impact studies, and could potentially imply caveats in many existing studies in the literature.

  7. Preparing Teachers of Statistics: A Graduate Course for Future Teachers

    ERIC Educational Resources Information Center

    Garfield, Joan; Everson, Michelle

    2009-01-01

    This paper describes a unique graduate-level course that prepares teachers of introductory statistics at the college and high school levels. The course was developed as part of a graduate degree program in statistics education. Although originally taught in a face-to-face setting, the class has been converted to an online course to be accessible…

  8. Kansas woodlands.

    Treesearch

    Clarence D. Chase; John K. Strickler

    1968-01-01

    The report presents statistics on area, volume, growth, mortality, and timber use. Projections of expected timber volumes 30 years in the future are also presented. These data are discussed with regard to possible future development and use of the state's woodlands.

  9. Assimilating the Future for Better Forecasts and Earlier Warnings

    NASA Astrophysics Data System (ADS)

    Du, H.; Wheatcroft, E.; Smith, L. A.

    2016-12-01

Multi-model ensembles have become popular tools to account for some of the uncertainty due to model inadequacy in weather and climate simulation-based predictions. Current multi-model forecasts focus on combining single-model ensemble forecasts by means of statistical post-processing. Assuming each model is developed independently or with different primary target variables, each is likely to have different dynamical strengths and weaknesses. With statistical post-processing, such information is carried only by the simulations within a single model's ensemble: no advantage is taken of it to influence simulations under the other models. A novel methodology, named Multi-model Cross Pollination in Time, is proposed as a multi-model ensemble scheme with the aim of operationally integrating each individual model's dynamical information about the future. The proposed approach generates model states in time by applying data assimilation scheme(s) to yield truly "multi-model trajectories". It is demonstrated to outperform traditional statistical post-processing in the 40-dimensional Lorenz96 flow. Data assimilation approaches were originally designed to improve state estimation from the past up to the current time. The aim of this talk is to introduce a framework that uses data assimilation to improve model forecasts at future times (not to argue for any one particular data assimilation scheme). An illustration of applying data assimilation "in the future" to provide early warning of high-impact events is also presented.

  10. Validating Future Force Performance Measures (Army Class): Concluding Analyses

    DTIC Science & Technology

    2016-06-01

32 Table 3.10. Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores...55 Table 4.7. Descriptive Statistics for Analysis Criteria...Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness

  11. PLANT DIVERSITY

    EPA Science Inventory

    Habitat change statistics and species-area curves were used to estimate the effects of alternative future scenarios for agriculture on plant diversity in Iowa farmlands. Study areas were two watersheds in central Iowa of about 50 and 90 square kilometers, respectively. Future s...

  12. WILDLIFE HABITAT

    EPA Science Inventory

    Habitat change statistics were used to estimate the effects of alternative future scenarios for agriculture on non-fish vertebrate diversity in Iowa farmlands. Study areas were two watersheds in central Iowa of about 50 and 90 square kilometers, respectively. Future scenarios w...

  13. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model, inferences can be made about current reliability and, if debugging continues, future reliability. The model also shows the difficulty inherent in statistical verification of very highly reliable software, such as that used by digital avionics in commercial aircraft.
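The model described can be illustrated with a small simulation; the fault counts and per-fault detection rates below are hypothetical:

```python
import random

random.seed(42)

# Minimal sketch (assumed parameters, not from the paper): N latent faults,
# fault i detected at an independent exponential time with its own rate.
N = 50
rates = [0.05] * 30 + [0.01] * 20        # hypothetical: 30 "easy", 20 "hard" faults
detect = [random.expovariate(r) for r in rates]

# Observed failure times are the order statistics of the detection times.
failure_times = sorted(detect)

def residual_rate(t):
    """Program failure rate after debugging up to time t:
    the sum of rates of faults not yet detected."""
    return sum(r for r, d in zip(rates, detect) if d > t)

# Reliability improves as debugging continues:
print(residual_rate(0.0), residual_rate(failure_times[N // 2]))
```

Note that the hard-to-detect faults dominate the residual rate late in debugging, which is one way to see why verifying ultra-high reliability by testing alone is so difficult.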

  14. When Statistical Literacy Really Matters: Understanding Published Information about the HIV/AIDS Epidemic in South Africa

    ERIC Educational Resources Information Center

    Hobden, Sally

    2014-01-01

    Information on the HIV/AIDS epidemic in Southern Africa is often interpreted through a veil of secrecy and shame and, I argue, with flawed understanding of basic statistics. This research determined the levels of statistical literacy evident in 316 future Mathematical Literacy teachers' explanations of the median in the context of HIV/AIDS…

  15. Atmospheric statistics for aerospace vehicle operations

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Batts, G. W.

    1993-01-01

    Statistical analysis of atmospheric variables was performed for the Shuttle Transportation System (STS) design trade studies and the establishment of launch commit criteria. Atmospheric constraint statistics have been developed for the NASP test flight, the Advanced Launch System, and the National Launch System. The concepts and analysis techniques discussed in the paper are applicable to the design and operations of any future aerospace vehicle.

  16. Spatial Statistical Network Models for Stream and River Temperature in the Chesapeake Bay Watershed, USA

    EPA Science Inventory

    Regional temperature models are needed for characterizing and mapping stream thermal regimes, establishing reference conditions, predicting future impacts and identifying critical thermal refugia. Spatial statistical models have been developed to improve regression modeling techn...

  17. Golden Legacy, Boundless Future: Essays on the United States Air Force and the Rise of Aerospace Power

    DTIC Science & Technology

    2000-01-01

tactical support, and, to a lesser extent, bombardment. The American Army had to digest quickly the crucial lesson already absorbed by the...2. United States Air Force Statistical Digest, 1947, Director of Statistical Services, Comptroller, HQ USAF, Washington, D.C., 1948, 15-16, 72, 132...Statistical Digest, Director of Statistical Services, Deputy Chief of Staff, Comptroller, HQ USAF, Washington, D.C., Nov 1952, 162-164. 6. Sarah A

  18. The Future of Statistics as a Discipline.

    DTIC Science & Technology

    1981-09-01

University Department of Statistics, Tallahassee, Florida 32306 ... delivered at the ... Annual Meeting of the ... from the real world. While the academicians too often fail to enrich their instruction and research with real-life problems, practitioners do not... of the American Statistical Association, 75, 575-582. Box, George E. P. (1979), "Some Problems of Statistics and Everyday Life," Journal of the American

  19. Statistics for NAEG: past efforts, new results, and future plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, R.O.; Simpson, J.C.; Kinnison, R.R.

    A brief review of Nevada Applied Ecology Group (NAEG) objectives is followed by a summary of past statistical analyses conducted by Pacific Northwest Laboratory for the NAEG. Estimates of spatial pattern of radionuclides and other statistical analyses at NS's 201, 219 and 221 are reviewed as background for new analyses presented in this paper. Suggested NAEG activities and statistical analyses needed for the projected termination date of NAEG studies in March 1986 are given.

  20. Examining the Stationarity Assumption for Statistically Downscaled Climate Projections of Precipitation

    NASA Astrophysics Data System (ADS)

    Wootten, A.; Dixon, K. W.; Lanzante, J. R.; Mcpherson, R. A.

    2017-12-01

Empirical statistical downscaling (ESD) approaches attempt to refine global climate model (GCM) information via statistical relationships between observations and GCM simulations. The aim of such downscaling efforts is to create added-value climate projections by adding finer spatial detail and reducing biases. The results of statistical downscaling exercises are often used in impact assessments under the assumption that past performance is an indicator of future results. Given prior research describing the danger of this assumption with regard to temperature, this study expands the perfect model experimental design from previous case studies to test the stationarity assumption with respect to precipitation. Assuming stationarity implies that the performance of an ESD method is similar between the future projections and the historical training period. Case study results from four quantile-mapping-based ESD methods demonstrate violations of the stationarity assumption for both the central tendency and the extremes of precipitation. These violations vary geographically and seasonally. For the four ESD methods tested, the greatest challenges for downscaling daily total precipitation projections occur in regions with limited precipitation and, for extremes of precipitation, along Southeast coastal regions. We conclude with a discussion of future expansion of the perfect model experimental design and the implications for improving ESD methods and providing guidance on the use of ESD techniques for impact assessments and decision support.
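The quantile-mapping family of ESD methods mentioned above can be sketched with an empirical mapping on synthetic data (the distributions and parameters are illustrative only, not those of any tested method):

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" precipitation and a biased "model" simulation for a
# training period (both synthetic gamma samples).
obs_train = rng.gamma(shape=2.0, scale=3.0, size=2000)
mod_train = rng.gamma(shape=2.0, scale=4.0, size=2000)   # wet-biased model

# Build the empirical quantile mapping from matched quantiles.
quantiles = np.linspace(0.0, 1.0, 101)
mod_q = np.quantile(mod_train, quantiles)
obs_q = np.quantile(obs_train, quantiles)

def quantile_map(x):
    """Map model values onto the observed distribution via matched quantiles."""
    return np.interp(x, mod_q, obs_q)

# Apply the training-period mapping to "future" model output; the
# stationarity assumption is that this same mapping remains valid.
mod_future = rng.gamma(shape=2.0, scale=4.4, size=2000)
corrected = quantile_map(mod_future)
print(quantile_map(mod_train).mean(), obs_train.mean())
```

The perfect-model experiments described above essentially test whether the mapping fitted on `mod_train`/`obs_train` still removes the bias when the underlying distributions shift in the future.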

  1. Climate Change Assessment of Precipitation in Tandula Reservoir System

    NASA Astrophysics Data System (ADS)

    Jaiswal, Rahul Kumar; Tiwari, H. L.; Lohani, A. K.

    2018-02-01

Precipitation is the principal input to the hydrological cycle and, under widely accepted climate change, affects the availability of water at the spatial and temporal scales of a basin. The present study applies statistical downscaling, using the Statistical DownScaling Model (SDSM), to rainfall at five rain gauge stations (Ambagarh, Bhanpura, Balod, Chamra and Gondli) in the Tandula, Kharkhara and Gondli reservoir catchments of Chhattisgarh state, India, to project future rainfall in three periods under the SRES A1B and A2 climate forcing scenarios. Twenty-six climatic variables obtained from the National Centers for Environmental Prediction were used and statistically tested to select the best-fit predictors. A conditional-process-based statistical correlation approach was used to develop multiple linear relations in calibration over 1981-1995, which were then tested against independent data for 1996-2003 in validation. The developed relations were further used to predict future rainfall scenarios for three periods, 2020-2035 (FP-1), 2046-2064 (FP-2) and 2081-2100 (FP-3), and compared with monthly rainfall during the base period (1981-2003) for each station and for all three reservoir catchments. The analysis indicates that most of the rain gauge stations and all three reservoir catchments may receive significantly less rainfall in the future. Thiessen-polygon-based annual and seasonal rainfall for the catchments confirmed a reduction of seasonal rainfall of 5.1-14.1% in the Tandula reservoir, 11-19.2% in the Kharkhara reservoir and 15.1-23.8% in the Gondli reservoir. The Gondli reservoir may be the most affected in terms of water availability in the future prediction periods.

  2. A novel approach for examining future US domestic water demand

    EPA Science Inventory

    Costs of repairing and expanding aging infrastructure and competing demands for water from other sectors such as industry and agriculture are stretching policy makers’ abilities to meet essential domestic drinking water needs for future generations. Using Bayesian statistic...

  3. Illinois Kids Count: A Snap Shot of Our Future. County by County Profiles of Child Well-Being '92.

    ERIC Educational Resources Information Center

    Voices for Illinois Children, Chicago.

    This booklet presents statistics concerning the well-being of Illinois' 3.3 million children between 1980 and 1990. Statistics are compared county by county for each of the state's 102 counties, and statewide statistics are compared with those of the entire nation. A statewide analysis focuses on spending per pupil on education, the percentage of…

  4. Statistical downscaling and future scenario generation of temperatures for Pakistan Region

    NASA Astrophysics Data System (ADS)

    Kazmi, Dildar Hussain; Li, Jianping; Rasul, Ghulam; Tong, Jiang; Ali, Gohar; Cheema, Sohail Babar; Liu, Luliu; Gemmer, Marco; Fischer, Thomas

    2015-04-01

Finer-spatial-scale climate change information than that presently provided by global or regional climate models is required for impact studies. This is especially true for regions like South Asia, with complex topography, coastal or island locations, and areas of highly heterogeneous land cover. To deal with this situation, an inexpensive method, statistical downscaling, has been adopted. The Statistical DownScaling Model (SDSM) was employed to downscale daily minimum and maximum temperature data from 44 national stations for a base period (1961-1990), and future scenarios were then generated up to 2099. Observed data as well as predictors (a product of the National Oceanic and Atmospheric Administration) were calibrated and tested individually and in combination through linear regression. Future scenarios were generated based on HadCM3 daily data for the A2 and B2 storylines. The downscaled data were tested and showed a relatively strong relationship with observations in comparison to ECHAM5 data. Generally, the southern half of the country is considered vulnerable in terms of increasing temperatures, but the results of this study project that, in the future, the northern belt in particular faces a possible threat of an increasing tendency in air temperature. In particular, the northern areas (hosting the third largest ice reserves after the polar regions), an important feeding source for the Indus River, are projected to be vulnerable to increasing temperatures. Consequently, not only the hydro-agricultural sector but also the environmental conditions in the area may be at risk in the future.

  5. Planning for community resilience to future United States domestic water demand

    EPA Science Inventory

    Costs of repairing and expanding aging infrastructure and competing demands for water from other sectors such as industry and agriculture are stretching water managers’ abilities to meet essential domestic drinking water needs for future generations. Using Bayesian statistical mo...

  6. 78 FR 63458 - Privacy Act of 1974; System of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-24

    ..., access to conduct research involving DoDEA students, staff, parents or data. Additionally will establish researcher accountability, enable future contact with researchers, and support preparation of statistical and... students, staff, parents or data. To establish researcher accountability, enable future contact with...

  7. Designing an Error Resolution Checklist for a Shared Manned-Unmanned Environment

    DTIC Science & Technology

    2010-06-01

performance during the Olympics. Thank you to Birsen Donmez, who took an active role in my statistics instruction. I appreciate your time and patience...in teaching me the finer details of "varsity statistics". Also, thank you for being so responsive through e-mail, even though you are now located in...105! 6.3.! Experiment recommendations and future work................................................ 105! Appendix A: Descriptive Statistics

  8. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics-that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
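The distinction drawn above between simple frequency statistics and context-based statistics can be illustrated with a small first-order Markov simulation (the transition matrix is hypothetical, not the one used in the study):

```python
import random

random.seed(7)

# Hypothetical context-based structure: the next symbol depends on the
# current one, while all three symbols are equally frequent overall.
symbols = "ABC"
trans = {"A": [0.1, 0.8, 0.1],   # after A, B is most likely
         "B": [0.1, 0.1, 0.8],   # after B, C is most likely
         "C": [0.8, 0.1, 0.1]}   # after C, A is most likely

seq = ["A"]
for _ in range(5000):
    seq.append(random.choices(symbols, weights=trans[seq[-1]])[0])

# A "maximizing" learner that has extracted the context-based structure
# always predicts the most probable successor of the current symbol.
best_next = {s: symbols[max(range(3), key=lambda i: trans[s][i])] for s in symbols}
hits_context = sum(best_next[a] == b for a, b in zip(seq, seq[1:]))

# A frequency-only learner predicts the globally most common symbol.
overall_mode = max(symbols, key=seq.count)
hits_freq = sum(b == overall_mode for b in seq[1:])

print(hits_context / 5000, hits_freq / 5000)
```

Because the symbols are equally frequent overall, only a learner tracking the contingent (context-based) statistics gains predictive accuracy, mirroring the advantage of maximizing over matching described above.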

  9. Collecting and Using Networked Statistics: Current Status, Future Goals

    ERIC Educational Resources Information Center

    Hiott, Judith

    2004-01-01

    For more than five years the Houston Public Library has collected statistics for measuring networked collections and services based on emerging guidelines. While the guidelines have provided authority and stability to the process, the clarification process continues. The development of information discovery software, such as federated search tools…

  10. Focus in High School Mathematics: Statistics and Probability

    ERIC Educational Resources Information Center

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  11. An evaluation of the Goddard Space Flight Center Library

    NASA Technical Reports Server (NTRS)

    Herner, S.; Lancaster, F. W.; Wright, N.; Ockerman, L.; Shearer, B.; Greenspan, S.; Mccartney, J.; Vellucci, M.

    1979-01-01

The character and degree of coincidence between the current and future missions, programs, and projects of the Goddard Space Flight Center and the current and future collection, services, and facilities of its library were determined from structured interviews and discussions with various classes of facility personnel. In addition to the tabulation and interpretation of the data from the structured interview survey, five types of statistical analyses were performed to corroborate (or contradict) the survey results and to produce useful information not readily attainable through survey material. Conclusions reached regarding compatibility between needs and holdings, services and buildings, library hours of operation, methods of early detection and anticipation of changing holdings requirements, and the impact of near-future programs are presented, along with a list of statistics needing collection, organization, and interpretation on a continuing or longitudinal basis.

  12. A Statistical Weather-Driven Streamflow Model: Enabling future flow predictions in data-scarce headwater streams

    NASA Astrophysics Data System (ADS)

    Rosner, A.; Letcher, B. H.; Vogel, R. M.

    2014-12-01

Predicting streamflow in headwaters and over broad spatial scales poses unique challenges due to limited data availability. Flow gages on headwater streams are less common than on larger rivers, and gages with record lengths of ten years or more are even scarcer. Thus, there is a great need for estimating streamflows in ungaged or sparsely gaged headwaters. Further, there is often insufficient basin information to develop rainfall-runoff models that could be used to predict future flows under various climate scenarios. Headwaters in the northeastern U.S. are of particular concern to aquatic biologists, as these streams serve as essential habitat for native coldwater fish. In order to understand fish response to past or future environmental drivers, estimates of seasonal streamflow are needed. While flow data are limited, there is a wealth of data on historic weather conditions: observed data have been modeled to interpolate a spatially continuous historic weather dataset (Maurer et al., 2002). We present a statistical model developed by pairing streamflow observations with precipitation and temperature information for the same and preceding time steps, and demonstrate its use to predict flow metrics at the seasonal time step. While not a physical model, this statistical model represents the weather drivers. Since the model can predict flows not directly tied to reference gages, we can generate flow estimates for historic as well as potential future conditions.

  13. RIPARIAN SHADE CONTROLS ON STREAM TEMPERATURE NOW AND IN THE FUTURE ACROSS TRIBUTARIES OF THE COLUMBIA RIVER, USA

    EPA Science Inventory

    Future climates may warm stream temperatures altering aquatic communities and threatening socioeconomically-important species. These impacts will vary across large spatial extents and require special evaluation tools. Statistical stream network models (SSNs) account for spatial a...

  14. A Study on Predictive Analytics Application to Ship Machinery Maintenance

    DTIC Science & Technology

    2013-09-01

Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be

  15. Developing a Campaign Plan to Target Centers of Gravity Within Economic Systems

    DTIC Science & Technology

    1995-05-01

    Conclusion 67 CHAPTER 7: CURRENT AND FUTURE CONCERNS 69 Decision Making and Planning 69 Conclusion 72 CHAPTER 8: CONCLUSION 73 APPENDIX A: STATISTICS 80...Terminology and Statistical Tests 80 Country Analysis 84 APPENDIX B 154 BIBLIOGRAPHY 157 VITAE 162 IV LIST OF FIGURES Figure 1. Air Campaign...This project furthers the original statistical effort and adds to this a campaign planning approach (including both systems and operational level

  16. Statistical Methods in Integrative Genomics

    PubMed Central

    Richardson, Sylvia; Tseng, George C.; Sun, Wei

    2016-01-01

    Statistical methods in integrative genomics aim to answer important biology questions by jointly analyzing multiple types of genomic data (vertical integration) or aggregating the same type of data across multiple studies (horizontal integration). In this article, we introduce different types of genomic data and data resources, and then review statistical methods of integrative genomics, with emphasis on the motivation and rationale of these methods. We conclude with some summary points and future research directions. PMID:27482531

  17. Evaluation of Cepstrum Algorithm with Impact Seeded Fault Data of Helicopter Oil Cooler Fan Bearings and Machine Fault Simulator Data

    DTIC Science & Technology

    2013-02-01

of a bearing must be put into practice. There are many potential methods, the most traditional being the use of statistical time-domain features...accelerate degradation to test multiple bearings to gain statistical relevance and extrapolate results to scale for field conditions. Temperature...as time statistics, frequency estimation to improve the fault frequency detection. For future investigations, one can further explore the

  18. Powering the Future of Science and Exploration

    NASA Technical Reports Server (NTRS)

    Miley, Steven C.

    2009-01-01

    This viewgraph presentation reviews NASA's future of science and space exploration. The topics include: 1) NASA's strategic goals; 2) NASA around the Country; 3) Marshall's History; 4) Marshall's Missions; 5) Marshall Statistics: From Exploration to Opportunity; 6) Propulsion and Transportation Systems; 7) Life Support systems; 8) Earth Science; 9) Space Science; 10) NASA Innovation Creates New Jobs, Markets, and Technologies; 11) NASA Inspires Future Generations of Explorers; and 12) Why Explore?

  19. Simulating the Interactions Among Land Use, Transportation, and Economy to Inform Light Rail Transit Decisions

    EPA Science Inventory

    In most transportation studies, computer models that forecast travel behavior statistics for a future year use static projections of the spatial distribution of future population and employment growth as inputs. As a result, they are unable to account for the temporally dynamic a...

  20. Simulating the Interactions Among Land Use, Transportation, and Economy to Inform Light Rail Transit Decisions (proceedings)

    EPA Science Inventory

    In most transportation studies, computer models that forecast travel behavior statistics for a future year use static projections of the spatial distribution of future population and employment growth as inputs. As a result, they are unable to account for the temporally dynamic a...

  1. Future of the Introductory Psychology Textbook: A Survey of College Publishers.

    ERIC Educational Resources Information Center

    Buskit, William; Cush, David T.

    1997-01-01

    Examines aspects of the introductory psychology textbook market through a publishing house survey. Aspects covered are the current and future number of introductory texts, fewer textbook publishers, custom publishing, changing content, and computer technologies. Discusses the results of the publishers' responses and provides statistical tables of…

  2. A statistical view of FMRFamide neuropeptide diversity.

    PubMed

    Espinoza, E; Carrigan, M; Thomas, S G; Shaw, G; Edison, A S

    2000-01-01

FMRFamide-like peptide (FLP) amino acid sequences have been collected and statistically analyzed. FLP amino acid composition as a function of position in the peptide is graphically presented for several major phyla. Results for total amino acid composition and frequencies of pairs of FLP amino acids have been computed and compared with corresponding values from the entire GenBank protein sequence database. The data for pairwise distributions of amino acids should help in future structure-function studies of FLPs. To aid in future peptide discovery, a computer program and search protocol were developed to identify FLPs from the GenBank protein database without the use of keywords.

  3. A study of correlations between crude oil spot and futures markets: A rolling sample test

    NASA Astrophysics Data System (ADS)

    Liu, Li; Wan, Jieqiu

    2011-10-01

In this article, we investigate the asymmetries of exceedance correlations and cross-correlations between West Texas Intermediate (WTI) spot and futures markets. First, employing the test statistic proposed by Hong et al. [Asymmetries in stock returns: statistical tests and economic evaluation, Review of Financial Studies 20 (2007) 1547-1581], we find that the exceedance correlations were overall symmetric. However, results from rolling windows show that some occasional events could induce significant asymmetries in the exceedance correlations. Second, employing the test statistic proposed by Podobnik et al. [Quantifying cross-correlations using local and global detrending approaches, European Physical Journal B 71 (2009) 243-250], we find that the cross-correlations were significant even for large lag orders. Using the detrended cross-correlation analysis proposed by Podobnik and Stanley [Detrended cross-correlation analysis: a new method for analyzing two nonstationary time series, Physical Review Letters 100 (2008) 084102], we find that the cross-correlations were weakly persistent and were stronger between the spot price and futures contracts with larger maturity. Our results from the rolling sample test also show the apparent effects of exogenous events. Additionally, we offer some relevant discussion of the obtained evidence.
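A simplified version of the detrended cross-correlation idea (non-overlapping windows and linear detrending, rather than the original overlapping-window formulation) can be sketched as follows, on synthetic series sharing a common component:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two synthetic return-like series with a shared component (true corr ~0.8).
common = rng.normal(size=4000)
x = common + 0.5 * rng.normal(size=4000)
y = common + 0.5 * rng.normal(size=4000)

def dcca_coeff(a, b, scale):
    """Detrended cross-correlation coefficient at one window size
    (simplified: non-overlapping windows, linear detrending)."""
    A, B = np.cumsum(a - a.mean()), np.cumsum(b - b.mean())   # integrated profiles
    t = np.arange(scale)
    f2a = f2b = f2ab = 0.0
    for k in range(len(A) // scale):
        pa, pb = A[k*scale:(k+1)*scale], B[k*scale:(k+1)*scale]
        # Remove the local linear trend from each profile segment.
        ra = pa - np.polyval(np.polyfit(t, pa, 1), t)
        rb = pb - np.polyval(np.polyfit(t, pb, 1), t)
        f2a += (ra**2).mean()
        f2b += (rb**2).mean()
        f2ab += (ra*rb).mean()
    return f2ab / np.sqrt(f2a * f2b)

r_related = dcca_coeff(x, y, 50)
z = rng.normal(size=4000)          # an unrelated series for contrast
r_unrelated = dcca_coeff(x, z, 50)
print(r_related, r_unrelated)
```

Computing the coefficient over a range of window sizes (and over rolling samples, as in the study) is what reveals how persistent the cross-correlation is and how it responds to exogenous events.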

  4. Common and Unique Factors Associated with DSM-IV-TR Internalizing Disorders in Children

    ERIC Educational Resources Information Center

    Higa-McMillan, Charmaine K.; Smith, Rita L.; Chorpita, Bruce F.; Hayashi, Kentaro

    2008-01-01

    With the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association. "Diagnostic and statistical manual of mental disorders DSM-IV Fourth Edition-Text Revision". Author, Washington, DC. 2000) ahead, decisions will be made about the future of taxonomic conceptualizations. This study examined the…

  5. A Critical Understanding and Transformation of an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Magalhães, Marcos Nascimento; Magalhães, Maria Cecilia Camargo

    2014-01-01

    In this paper, we report on the impact of four activities and two interviews on the organization of an introductory statistics course attended by future mathematics teachers at the University of Sao Paulo, Brazil. The activities were designed to enhance students' learning and collaborative knowledge construction, based on Vygotsky's…

  6. Bayesian statistics in medicine: a 25 year review.

    PubMed

    Ashby, Deborah

    2006-11-15

    This review examines the state of Bayesian thinking as Statistics in Medicine was launched in 1982, reflecting particularly on its applicability and uses in medical research. It then looks at each subsequent five-year epoch, with a focus on papers appearing in Statistics in Medicine, putting these in the context of major developments in Bayesian thinking and computation with reference to important books, landmark meetings and seminal papers. It charts the growth of Bayesian statistics as it is applied to medicine and makes predictions for the future. From sparse beginnings, where Bayesian statistics was barely mentioned, Bayesian statistics has now permeated all the major areas of medical statistics, including clinical trials, epidemiology, meta-analyses and evidence synthesis, spatial modelling, longitudinal modelling, survival modelling, molecular genetics and decision-making in respect of new technologies.

  7. Wind speed statistics for Goldstone, California, anemometer sites

    NASA Technical Reports Server (NTRS)

    Berg, M.; Levy, R.; Mcginness, H.; Strain, D.

    1981-01-01

    An exploratory wind survey at an antenna complex was summarized statistically for application to future windmill designs. Data were collected at six locations from a total of 10 anemometers. Statistics include means, standard deviations, cubes, pattern factors, correlation coefficients, and exponents for power law profile of wind speed. Curves presented include: mean monthly wind speeds, moving averages, and diurnal variation patterns. It is concluded that three of the locations have sufficiently strong winds to justify consideration for windmill sites.
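    The summary statistics named above (means, standard deviations, cubes, pattern factors) can be computed directly. The sketch below uses a synthetic Weibull wind-speed record as a stand-in for anemometer data; the distribution parameters and sample size are illustrative assumptions, not the Goldstone measurements.

```python
import numpy as np

# Hypothetical hourly wind speeds (m/s) at one anemometer site;
# the Weibull distribution is a common wind-speed model.
rng = np.random.default_rng(42)
v = rng.weibull(2.0, size=8760) * 7.0

mean = v.mean()
std = v.std()
mean_cube = np.mean(v ** 3)
# Pattern factor: mean of cubed speeds over cube of mean speed.
# It matters for windmill siting because wind power scales with v^3,
# so gusty sites deliver more energy than the mean speed alone implies.
pattern_factor = mean_cube / mean ** 3
```

For any variable wind record the pattern factor exceeds 1, which is why mean speed alone understates the available power.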

  8. Statistical downscaling of general-circulation-model- simulated average monthly air temperature to the beginning of flowering of the dandelion (Taraxacum officinale) in Slovenia

    NASA Astrophysics Data System (ADS)

    Bergant, Klemen; Kajfež-Bogataj, Lučka; Črepinšek, Zalika

    2002-02-01

    Phenological observations are a valuable source of information for investigating the relationship between climate variation and plant development. Potential climate change in the future will shift the occurrence of phenological phases. Information about future climate conditions is needed in order to estimate this shift. General circulation models (GCM) provide the best information about future climate change. They are able to simulate reliably the most important mean features on a large scale, but they fail on a regional scale because of their low spatial resolution. A common approach to bridging the scale gap is statistical downscaling, which was used to relate the beginning of flowering of Taraxacum officinale in Slovenia with the monthly mean near-surface air temperature for January, February and March in Central Europe. Statistical models were developed and tested with NCAR/NCEP Reanalysis predictor data and EARS predictand data for the period 1960-1999. Prior to developing statistical models, empirical orthogonal function (EOF) analysis was employed on the predictor data. Multiple linear regression was used to relate the beginning of flowering to the expansion coefficients of the first three EOFs for the January, February and March air temperatures, and a strong correlation was found between them. The developed statistical models were applied to the results of two GCMs (HadCM3 and ECHAM4/OPYC3) to estimate the potential shifts in the beginning of flowering for the periods 1990-2019 and 2020-2049 in comparison with the period 1960-1989. The HadCM3 model predicts, on average, a 4-day earlier occurrence of flowering and ECHAM4/OPYC3 a 5-day earlier occurrence in the period 1990-2019. The analogous results for the period 2020-2049 are a 10- and 11-day earlier occurrence.

  9. eSACP - a new Nordic initiative towards developing statistical climate services

    NASA Astrophysics Data System (ADS)

    Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine

    2015-04-01

    The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields and between practitioners in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies, which properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publicly available and include functionality to utilize the extensive and dynamically growing repositories of data, using state-of-the-art statistical techniques to quantify the uncertainty and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making its case more clearly to policy makers and the general public on the consequences of our changing climate.
The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of focus areas in the project and show some examples of the expected analysis tools.

  10. THE APPLICATION OF A STATISTICAL DOWNSCALING PROCESS TO DERIVE 21st CENTURY RIVER FLOW PREDICTIONS USING A GLOBAL CLIMATE SIMULATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werth, D.; Chen, K. F.

    2013-08-22

    The ability of water managers to maintain adequate supplies in coming decades depends, in part, on future weather conditions, as climate change has the potential to alter river flows from their current values, possibly rendering them unable to meet demand. Reliable climate projections are therefore critical to predicting the future water supply for the United States. These projections cannot be provided solely by global climate models (GCMs), however, as their resolution is too coarse to resolve the small-scale climate changes that can affect hydrology, and hence water supply, at regional to local scales. A process is needed to 'downscale' the GCM results to the smaller scales and feed this into a surface hydrology model to help determine the ability of rivers to provide adequate flow to meet future needs. We apply a statistical downscaling to GCM projections of precipitation and temperature through the use of a scaling method. This technique involves the correction of the cumulative distribution functions (CDFs) of the GCM-derived temperature and precipitation results for the 20th century, and the application of the same correction to 21st century GCM projections. This is done for three meteorological stations located within the Coosa River basin in northern Georgia, and is used to calculate future river flow statistics for the upper Coosa River. Results are compared to the historical Coosa River flow upstream from Georgia Power Company's Hammond coal-fired power plant and to flows calculated with the original, unscaled GCM results to determine the impact of potential changes in meteorology on future flows.
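    The CDF-correction step described here is commonly implemented as empirical quantile mapping. The sketch below assumes that form and uses synthetic series; it is not the authors' code. Each future GCM value is ranked within the historical GCM distribution, and the observed value at the same quantile is read off.

```python
import numpy as np

def quantile_map(obs_hist, gcm_hist, gcm_future):
    """Empirical CDF correction (quantile mapping): the bias between
    observed and modelled historical CDFs is applied to the projection."""
    # Rank each future value within the historical GCM distribution...
    q = np.searchsorted(np.sort(gcm_hist), gcm_future) / len(gcm_hist)
    q = np.clip(q, 0.0, 1.0)
    # ...and read off the observed value at the same quantile.
    return np.quantile(obs_hist, q)
```

If the GCM runs uniformly 2 units too high, a future series that the GCM projects 1 unit above its own history maps to observations shifted by 1, with the 2-unit bias removed (values beyond the historical range are pinned to the observed extremes, a known limitation of the empirical form).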

  11. Assessment of a stochastic downscaling methodology in generating an ensemble of hourly future climate time series

    NASA Astrophysics Data System (ADS)

    Fatichi, S.; Ivanov, V. Y.; Caporali, E.

    2013-04-01

    This study extends a stochastic downscaling methodology to generation of an ensemble of hourly time series of meteorological variables that express possible future climate conditions at a point-scale. The stochastic downscaling uses general circulation model (GCM) realizations and an hourly weather generator, the Advanced WEather GENerator (AWE-GEN). Marginal distributions of factors of change are computed for several climate statistics using a Bayesian methodology that can weight GCM realizations based on the model relative performance with respect to a historical climate and a degree of disagreement in projecting future conditions. A Monte Carlo technique is used to sample the factors of change from their respective marginal distributions. As a comparison with traditional approaches, factors of change are also estimated by averaging GCM realizations. With either approach, the derived factors of change are applied to the climate statistics inferred from historical observations to re-evaluate parameters of the weather generator. The re-parameterized generator yields hourly time series of meteorological variables that can be considered to be representative of future climate conditions. In this study, the time series are generated in an ensemble mode to fully reflect the uncertainty of GCM projections, climate stochasticity, as well as uncertainties of the downscaling procedure. Applications of the methodology in reproducing future climate conditions for the periods of 2000-2009, 2046-2065 and 2081-2100, using the period of 1962-1992 as the historical baseline are discussed for the location of Firenze (Italy). The inferences of the methodology for the period of 2000-2009 are tested against observations to assess reliability of the stochastic downscaling procedure in reproducing statistics of meteorological variables at different time scales.
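    The contrast drawn above, between averaging factors of change across GCM realizations and sampling them to preserve the ensemble spread, can be sketched as follows. The statistic, the factors, and the weights are invented illustrative numbers, not values from the study.

```python
import numpy as np

# Hypothetical climate statistic inferred from observations (e.g. mm/day).
hist_stat = 2.4

# Factors of change (future / historical) from four hypothetical GCM
# realizations, with illustrative Bayesian-style skill weights.
factors = np.array([1.10, 0.95, 1.25, 1.05])
weights = np.array([0.4, 0.1, 0.2, 0.3])
weights = weights / weights.sum()

# Traditional approach: one averaged factor, one future statistic...
avg_factor = np.average(factors, weights=weights)
future_stat = hist_stat * avg_factor

# ...versus Monte Carlo sampling of the factors, which yields an
# ensemble of future statistics reflecting GCM disagreement.
rng = np.random.default_rng(1)
samples = rng.choice(factors, size=1000, p=weights)
future_ensemble = hist_stat * samples
```

The ensemble values span the full range implied by the GCMs, whereas the averaged factor collapses that uncertainty into a single number.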

  12. Assessing the Effectiveness of Statistical Classification Techniques in Predicting Future Employment of Participants in the Temporary Assistance for Needy Families Program

    ERIC Educational Resources Information Center

    Montoya, Isaac D.

    2008-01-01

    Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…

  13. Identifying future research needs in landscape genetics: Where to from here?

    Treesearch

    Niko Balkenhol; Felix Gugerli; Sam A. Cushman; Lisette P. Waits; Aurelie Coulon; J. W. Arntzen; Rolf Holderegger; Helene H. Wagner

    2009-01-01

    Landscape genetics is an emerging interdisciplinary field that combines methods and concepts from population genetics, landscape ecology, and spatial statistics. The interest in landscape genetics is steadily increasing, and the field is evolving rapidly. We here outline four major challenges for future landscape genetic research that were identified during an...

  14. Validation of non-stationary precipitation series for site-specific impact assessment: Comparison of two statistical downscaling techniques

    USDA-ARS?s Scientific Manuscript database

    The generation of realistic future precipitation scenarios is crucial for assessing their impacts on a range of environmental and socio-economic impact sectors. A scale mismatch exists, however, between the coarse spatial resolution at which global climate models (GCMs) output future climate scenari...

  15. New methods in hydrologic modeling and decision support for culvert flood risk under climate change

    NASA Astrophysics Data System (ADS)

    Rosner, A.; Letcher, B. H.; Vogel, R. M.; Rees, P. S.

    2015-12-01

    Assessing culvert flood vulnerability under climate change poses an unusual combination of challenges. We seek a robust method of planning for an uncertain future, and therefore must consider a wide range of plausible future conditions. Culverts in our case study area, northwestern Massachusetts, USA, are predominantly found in small, ungaged basins. The need to predict flows both at numerous sites and under numerous plausible climate conditions requires a statistical model with low data and computational requirements. We present a statistical streamflow model that is driven by precipitation and temperature, allowing us to predict flows without reliance on reference gages of observed flows. The hydrological analysis is used to determine each culvert's risk of failure under current conditions. We also explore the hydrological response to a range of plausible future climate conditions. These results are used to determine the tolerance of each culvert to future increases in precipitation. In a decision support context, current flood risk as well as tolerance to potential climate changes are used to provide a robust assessment and prioritization for culvert replacements.

  16. Identifying climate analogues for precipitation extremes for Denmark based on RCM simulations from the ENSEMBLES database.

    PubMed

    Arnbjerg-Nielsen, K; Funder, S G; Madsen, H

    2015-01-01

    Climate analogues, also denoted Space-For-Time, may be used to identify regions where the present climatic conditions resemble conditions of a past or future state of another location or region, based on robust climate variable statistics in combination with projections of how these statistics change over time. The study focuses on assessing climate analogues for Denmark based on a current-climate observational data set (E-OBS) as well as the ENSEMBLES database of future climates, with the aim of projecting future precipitation extremes. The local present precipitation extremes are assessed by means of intensity-duration-frequency curves for urban drainage design at the relevant locations: France, the Netherlands, Belgium, Germany, the United Kingdom, and Denmark. Based on this approach, projected increases of extreme precipitation by 2100 of 9 and 21% are expected for 2 and 10 year return periods, respectively. The results should be interpreted with caution, as the best region to represent future conditions for Denmark is the coastal area of Northern France, for which only little information is available with respect to present precipitation extremes.

  17. Responses of calcification of massive and encrusting corals to past, present, and near-future ocean carbon dioxide concentrations.

    PubMed

    Iguchi, Akira; Kumagai, Naoki H; Nakamura, Takashi; Suzuki, Atsushi; Sakai, Kazuhiko; Nojiri, Yukihiro

    2014-12-15

    In this study, we report the acidification impact mimicking the pre-industrial, the present, and near-future oceans on calcification of two coral species (Porites australiensis, Isopora palifera) by using a precise pCO2 control system that can produce acidified seawater at stable pCO2 values with low variation. In the analyses, we performed Bayesian modeling approaches incorporating the variations of pCO2 and compared the results of our modeling approach with those of a classical statistical one. The results showed the highest calcification rates at the pre-industrial pCO2 level and gradual decreases of calcification toward the near-future ocean acidification level, which suggests that ongoing and near-future ocean acidification would negatively impact coral calcification. In addition, the variations in carbonate chemistry parameters may affect which model (Bayesian or classical) is inferred as best for describing calcification responses, even under stable pCO2 values with low variation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Learning predictive statistics from temporal sequences: Dynamics and strategies

    PubMed Central

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E.; Kourtzi, Zoe

    2017-01-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics—that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments. PMID:28973111
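    The context-based sequences and the two decision strategies described above can be illustrated with a first-order Markov process. The transition matrix below is an invented example, not the study's design; it shows why "maximizing" (always predicting the most probable successor) beats "matching" (sampling predictions with the true probabilities).

```python
import numpy as np

# Symbol probability contingent on the preceding symbol (rows: A, B, C).
T = np.array([[0.1, 0.8, 0.1],   # after A, B is most likely
              [0.7, 0.1, 0.2],   # after B, A is most likely
              [0.3, 0.3, 0.4]])  # after C, C is most likely

def generate(n, rng):
    """Generate a length-n symbol sequence from the Markov process."""
    seq = [0]
    for _ in range(n - 1):
        seq.append(int(rng.choice(3, p=T[seq[-1]])))
    return seq

def maximise(prev):
    """Maximizing strategy: predict the most probable successor."""
    return int(np.argmax(T[prev]))

def match(prev, rng):
    """Matching strategy: sample a prediction with the true probabilities."""
    return int(rng.choice(3, p=T[prev]))
```

Over a long sequence the maximizing strategy's expected accuracy is the mean of the row maxima, which always exceeds the matching strategy's sum of squared probabilities.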

  19. Intensity changes in future extreme precipitation: A statistical event-based approach.

    NASA Astrophysics Data System (ADS)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    Short-lived precipitation extremes are often responsible for hazards in urban and rural environments with economic and environmental consequences. The precipitation intensity is expected to increase about 7% per degree of warming, according to the Clausius-Clapeyron (CC) relation. However, the observations often show a much stronger increase in the sub-daily values. In particular, the behavior of the hourly summer precipitation from radar observations with the dew point temperature (the Pi-Td relation) for the Netherlands suggests that for moderate to warm days the intensification of the precipitation can be even higher than 21% per degree of warming, that is 3 times higher than the expected CC relation. The rate of change depends on the initial precipitation intensity, as low percentiles increase with a rate below CC, the medium percentiles with 2CC and the moderate-high and high percentiles with 3CC. This non-linear statistical Pi-Td relation is suggested to be used as a delta-transformation to project how a historic extreme precipitation event would intensify under future, warmer conditions. Here, the Pi-Td relation is applied over a selected historic extreme precipitation event to 'up-scale' its intensity to warmer conditions. Additionally, the selected historic event is simulated in the high-resolution, convective-permitting weather model Harmonie. The initial and boundary conditions are alternated to represent future conditions. The comparison between the statistical and the numerical method of projecting the historic event to future conditions showed comparable intensity changes, which depending on the initial percentile intensity, range from below CC to a 3CC rate of change per degree of warming. The model tends to overestimate the future intensities for the low- and the very high percentiles and the clouds are somewhat displaced, due to small wind and convection changes. 
The total spatial cloud coverage in the model remains unchanged, as in the statistical method. The advantage of the suggested Pi-Td method of projecting future precipitation events from historic events is that it is simple to use and less expensive in time, computation, and resources than a numerical model. The outcome can be used directly for hydrological and climatological studies and for impact analyses such as flood risk assessments.
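    The delta-transformation idea can be sketched as a percentile-dependent scaling of a historic intensity. The rates follow the CC/2CC/3CC structure described above (CC being about 7% per degree), but the percentile thresholds and the sub-CC rate for low percentiles are illustrative assumptions, not the fitted Pi-Td relation.

```python
def scaled_intensity(intensity_mm_h, percentile, delta_t_kelvin):
    """Scale a historic precipitation intensity to warmer conditions
    using a percentile-dependent rate of change (hypothetical bins)."""
    if percentile < 50:
        rate = 0.05        # low percentiles: below the CC rate
    elif percentile < 90:
        rate = 0.14        # medium percentiles: roughly 2CC
    else:
        rate = 0.21        # moderate-high and high percentiles: roughly 3CC
    return intensity_mm_h * (1.0 + rate) ** delta_t_kelvin
```

For one degree of warming, a 95th-percentile intensity of 10 mm/h scales to 12.1 mm/h under the 3CC rate, while a low-percentile intensity of 10 mm/h scales only to 10.5 mm/h.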

  20. Statistical dependency in visual scanning

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R.; Stark, Lawrence

    1986-01-01

    A method to identify statistical dependencies in the positions of eye fixations is developed and applied to eye movement data from subjects who viewed dynamic displays of air traffic and judged future relative position of aircraft. Analysis of approximately 23,000 fixations on points of interest on the display identified statistical dependencies in scanning that were independent of the physical placement of the points of interest. Identification of these dependencies is inconsistent with random-sampling-based theories used to model visual search and information seeking.

  1. The Deployment Life Study: Longitudinal Analysis of Military Families Across the Deployment Cycle

    DTIC Science & Technology

    2016-01-01

    psychological and physical aggression than they reported prior to the deployment. 1 H. Fischer, A Guide to U.S. Military Casualty Statistics ...analyses include a large number of statistical tests and thus the results presented in this report should be viewed in terms of patterns, rather... Military Children and Families," The Future of Children, Vol. 23, No. 2, 2013, pp. 13-39. Fischer, H., A Guide to U.S. Military Casualty Statistics

  2. Assessing the Temporal Relationship between Race and Ecstasy Use among High School Seniors.

    ERIC Educational Resources Information Center

    Yacoubian, George S., Jr.

    2002-01-01

    Analyzes data from 10,088 high school seniors surveyed through the Monitoring the Future study between 1996 and 1999. Chi-square statistics are used to explore the temporal relationship between race and the use of ecstasy during this time frame. Statistically significant relationships between race and ecstasy use are discerned. (Contains 46…
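    A chi-square test of independence of the kind used here can be computed by hand from a contingency table. The counts below are invented for illustration; the study's actual data are not reproduced.

```python
import numpy as np

# Hypothetical 2x2 table: two groups (rows) vs. use / non-use (columns).
observed = np.array([[90, 910],
                     [40, 960]])

# Expected counts under independence, from the table margins...
row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row * col / observed.sum()

# ...then the usual chi-square statistic: sum of (O - E)^2 / E.
chi2 = ((observed - expected) ** 2 / expected).sum()

# A 2x2 table has 1 degree of freedom; the 5% critical value is 3.841.
significant = chi2 > 3.841
```

For these invented counts the statistic is about 20.6, well past the 5% critical value, so the association would be judged statistically significant.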

  3. Legal and Ethical Issues in the Use of Video in Education Research. Working Paper Series.

    ERIC Educational Resources Information Center

    Arafeh, Sousan; McLaughlin, Mary

    The National Center for Education Statistics (NCES), through the Education Statistics Services Institute, supported the research in this report to help frame future discussions about the use of video research techniques in educational settings. This paper addresses the context of technological, legal, and ethical change facing researchers who use…

  4. New forecasting methodology indicates more disease and earlier mortality ahead for today's younger Americans.

    PubMed

    Reither, Eric N; Olshansky, S Jay; Yang, Yang

    2011-08-01

    Traditional methods of projecting population health statistics, such as estimating future death rates, can give inaccurate results and lead to inferior or even poor policy decisions. A new "three-dimensional" method of forecasting vital health statistics is more accurate because it takes into account the delayed effects of the health risks being accumulated by today's younger generations. Applying this forecasting technique to the US obesity epidemic suggests that future death rates and health care expenditures could be far worse than currently anticipated. We suggest that public policy makers adopt this more robust forecasting tool and redouble efforts to develop and implement effective obesity-related prevention programs and interventions.

  5. Current and future health care professionals attitudes toward and knowledge of statistics: How confidence influences learning.

    PubMed

    Baghi, Heibatollah; Kornides, Melanie L

    2013-01-01

    Health care professionals require some understanding of statistics to successfully implement evidence based practice. Developing competency in statistical reasoning is necessary for students training in health care administration, research, and clinical care. Recently, interest in health care professionals' attitudes toward statistics has increased substantially due to evidence that these attitudes can hinder professionals from developing an understanding of statistical concepts. In this study, we analyzed pre- and post-instruction attitudes towards and knowledge of statistics obtained from health science graduate students, including nurses and nurse practitioners, enrolled in an introductory graduate course in statistics (n = 165). Results show that the students already held generally positive attitudes toward statistics at the beginning of the course. However, these attitudes-along with the students' statistical proficiency-improved after 10 weeks of instruction. The results have implications for curriculum design and delivery methods as well as for health professionals' effective use of statistics in critically evaluating and utilizing research in their practices.

  6. Current and future health care professionals attitudes toward and knowledge of statistics: How confidence influences learning

    PubMed Central

    Baghi, Heibatollah; Kornides, Melanie L.

    2014-01-01

    Background Health care professionals require some understanding of statistics to successfully implement evidence based practice. Developing competency in statistical reasoning is necessary for students training in health care administration, research, and clinical care. Recently, interest in health care professionals' attitudes toward statistics has increased substantially due to evidence that these attitudes can hinder professionals from developing an understanding of statistical concepts. Methods In this study, we analyzed pre- and post-instruction attitudes towards and knowledge of statistics obtained from health science graduate students, including nurses and nurse practitioners, enrolled in an introductory graduate course in statistics (n = 165). Results and Conclusions Results show that the students already held generally positive attitudes toward statistics at the beginning of the course. However, these attitudes—along with the students’ statistical proficiency—improved after 10 weeks of instruction. The results have implications for curriculum design and delivery methods as well as for health professionals’ effective use of statistics in critically evaluating and utilizing research in their practices. PMID:25419256

  7. Statistical modelling predicts almost complete loss of major periglacial processes in Northern Europe by 2100.

    PubMed

    Aalto, Juha; Harrison, Stephan; Luoto, Miska

    2017-09-11

    The periglacial realm is a major part of the cryosphere, covering a quarter of Earth's land surface. Cryogenic land surface processes (LSPs) control landscape development, ecosystem functioning and climate through biogeochemical feedbacks, but their response to contemporary climate change is unclear. Here, by statistically modelling the current and future distributions of four major LSPs unique to periglacial regions at fine scale, we show fundamental changes in the periglacial climate realm are inevitable with future climate change. Even with the most optimistic CO2 emissions scenario (Representative Concentration Pathway (RCP) 2.6) we predict a 72% reduction in the current periglacial climate realm by 2050 in our climatically sensitive northern Europe study area. These impacts are projected to be especially severe in high-latitude continental interiors. We further predict that by the end of the twenty-first century active periglacial LSPs will exist only at high elevations. These results forecast a future tipping point in the operation of cold-region LSPs, and predict fundamental landscape-level modifications in ground conditions and related atmospheric feedbacks. Cryogenic land surface processes characterise the periglacial realm and control landscape development and ecosystem functioning. Here, via statistical modelling, the authors predict a 72% reduction of the periglacial realm in Northern Europe by 2050, and almost complete disappearance by 2100.

  8. On the Training of Radio and Communications Engineers in the Decades of the Immediate Future.

    ERIC Educational Resources Information Center

    Klyatskin, I.G.

    A list of 11 statements relating to the change in training programs for radio and communications engineers is presented in this article, in preparation for future developments in the field. Semiconductors, decimeter and centimeter radio frequency ranges, and a statistical approach to communications systems are analyzed as the three important…

  9. The Future of Small- and Medium-Sized Communities in the Prairie Region.

    ERIC Educational Resources Information Center

    Wellar, Barry S., Ed.

    Four papers are featured. The first is a statistical overview and analysis of past, present and future happenings to small communities in the Region; it focuses on two indicators: (1) population growth or declining community class size and, (2) the changing distribution of commercial outlets by community class size. The other three papers report…

  10. The Association between Preservice Elementary Teacher Animal Attitude and Likelihood of Animal Incorporation in Future Science Curriculum

    ERIC Educational Resources Information Center

    Wagler, Ron

    2010-01-01

    The purpose of this study was to assess the association between United States K-4 preservice teacher's attitudes toward specific animals and the likelihood that the preservice elementary teachers would incorporate these specific animals in their future science curriculum. A strong statistically significant association was found between the…

  11. Projecting a Stand Table Through Time

    Treesearch

    Quang V. Cao; V. Clark Baldwin

    1999-01-01

    Stand tables provide number of trees per acre for each diameter class. This paper presents a general technique to predict a future stand table, based on the current stand table and future stand summary statistics such as trees and basal area per acre, and average diameter. The stand projection technique involves (a) predicting surviving trees for each class, and (b)...
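    The projection idea sketched in the abstract, surviving trees per diameter class plus movement into higher classes, can be illustrated as follows. The survival and movement fractions and the stand values are invented for illustration; they are not the paper's fitted model.

```python
import numpy as np

# Hypothetical current stand table: trees/acre by dbh class midpoint (inches).
dbh_class = np.array([4, 6, 8, 10])
trees = np.array([120.0, 80.0, 40.0, 10.0])

survival = 0.95   # fraction of trees surviving the projection period
move_up = 0.30    # fraction of survivors growing into the next class

# (a) predict surviving trees for each class...
surviving = trees * survival
moved = surviving * move_up

# (b) ...then move a fraction of survivors up one diameter class.
future = surviving - moved
future[1:] += moved[:-1]
future[-1] += moved[-1]   # top class retains its own movers (no class above)
```

Total trees in the projected table equal the surviving total, so the projection conserves stems while shifting the diameter distribution upward.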

  12. Observed and Projected Precipitation Changes over the Nine US Climate Regions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chylek, Petr; Dubey, Manvendra; Hengartner, Nicholas

    Here, we analyze the past (1900–2015) temperature and precipitation changes in nine separate US climate regions. We find that the temperature increased in a statistically significant manner (95% confidence level, equivalent to an alpha level of 0.05) in all of these regions. However, the variability in the observed precipitation was much more complex. In the eastern US (east of the Rocky Mountains), the precipitation increased in all five climate regions and the increase was statistically significant in three of them. In contrast, in the western US, the precipitation increased in two regions and decreased in two, with no statistical significance in any region. The CMIP5 climate models (an ensemble mean) were not able to properly capture either the large precipitation differences between the eastern and the western US, or the changes of precipitation between 1900 and 2015 in the eastern US. The statistical regression model explains the differences between the eastern and western US precipitation as the result of different significant predictors. The anthropogenic greenhouse gases and aerosol (GHGA) are the major forcing of the precipitation in the eastern part of the US, while the Pacific Decadal Oscillation (PDO) has the major influence on precipitation in the western part of the US. This analysis suggests that the precipitation over the eastern US increased at an approximate rate of 6.7%/K, in agreement with the Clausius-Clapeyron equation, while the precipitation of the western US was approximately constant, independent of the temperature. Future precipitation over the western part of the US will depend on the behavior of the PDO, and how it may be affected by future warming. Low hydrological sensitivity (percent increase of precipitation per one K of warming) projected by the CMIP5 models for the eastern US suggests either an underestimate of future precipitation or an overestimate of future warming.
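
The ~6.7%/K hydrological sensitivity quoted above (close to the ~7%/K Clausius-Clapeyron scaling of saturation vapour pressure) can be illustrated with a back-of-envelope calculation; the baseline precipitation value is invented:

```python
# Illustration of a constant fractional precipitation increase per
# kelvin of warming, the "hydrological sensitivity" discussed above.
# The 1000 mm/yr baseline is hypothetical.

def scaled_precip(p0, warming_k, sensitivity=0.067):
    """Precipitation after `warming_k` kelvin of warming, assuming a
    constant fractional increase per kelvin."""
    return p0 * (1.0 + sensitivity) ** warming_k

baseline = 1000.0  # mm/yr, hypothetical eastern-US annual total
print(round(scaled_precip(baseline, 1.0), 1))  # one kelvin of warming
```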

  13. Observed and Projected Precipitation Changes over the Nine US Climate Regions

    DOE PAGES

    Chylek, Petr; Dubey, Manvendra; Hengartner, Nicholas; ...

    2017-10-25

    Here, we analyze the past (1900–2015) temperature and precipitation changes in nine separate US climate regions. We find that the temperature increased in a statistically significant manner (95% confidence level, equivalent to an alpha level of 0.05) in all of these regions. However, the variability in the observed precipitation was much more complex. In the eastern US (east of the Rocky Mountains), the precipitation increased in all five climate regions and the increase was statistically significant in three of them. In contrast, in the western US, the precipitation increased in two regions and decreased in two, with no statistical significance in any region. The CMIP5 climate models (an ensemble mean) were not able to properly capture either the large precipitation differences between the eastern and the western US, or the changes of precipitation between 1900 and 2015 in the eastern US. The statistical regression model explains the differences between the eastern and western US precipitation as the result of different significant predictors. The anthropogenic greenhouse gases and aerosol (GHGA) are the major forcing of the precipitation in the eastern part of the US, while the Pacific Decadal Oscillation (PDO) has the major influence on precipitation in the western part of the US. This analysis suggests that the precipitation over the eastern US increased at an approximate rate of 6.7%/K, in agreement with the Clausius-Clapeyron equation, while the precipitation of the western US was approximately constant, independent of the temperature. Future precipitation over the western part of the US will depend on the behavior of the PDO, and how it may be affected by future warming. Low hydrological sensitivity (percent increase of precipitation per one K of warming) projected by the CMIP5 models for the eastern US suggests either an underestimate of future precipitation or an overestimate of future warming.

  14. Teen Births: A County-By-County Factbook. For Children for Ohio's Future.

    ERIC Educational Resources Information Center

    Hill, Susan

    This Factbook provides state- and county-level statistical information on teen births in Ohio and discusses statewide trends from 1992 to 1996. The statistical portrait is based on 12 indicators: (1) number of infants born to teens; (2) teen birth rate; (3) repeat teen birth rate; (4) percentage of teen births to unmarried teens; (5) percentage of…

  15. Reconstructing Macroeconomics Based on Statistical Physics

    NASA Astrophysics Data System (ADS)

    Aoki, Masanao; Yoshikawa, Hiroshi

    We believe that the time has come to integrate the new approach based on statistical physics, or econophysics, into macroeconomics. Toward this goal, there must be more dialogue between physicists and economists. In this paper, we argue that there is no reason why the methods of statistical physics, so successful in many fields of natural science, cannot be usefully applied to macroeconomics, which is meant to analyze the macroeconomy comprising a large number of economic agents. It is, in fact, weird to regard the macroeconomy as a homothetic enlargement of the representative micro agent. We trust the bright future of the new approach to macroeconomics based on statistical physics.

  16. Statistical downscaling of GCM simulations to streamflow using relevance vector machine

    NASA Astrophysics Data System (ADS)

    Ghosh, Subimal; Mujumdar, P. P.

    2008-01-01

    General circulation models (GCMs), the climate models often used in assessing the impact of climate change, operate on a coarse scale, and thus the simulation results obtained from GCMs are not directly useful for hydrology at the comparatively smaller river-basin scale. The article presents a methodology of statistical downscaling based on sparse Bayesian learning and the Relevance Vector Machine (RVM) to model streamflow at river basin scale for the monsoon period (June, July, August, September) using GCM-simulated climatic variables. NCEP/NCAR reanalysis data have been used for training the model to establish a statistical relationship between streamflow and climatic variables. The relationship thus obtained is used to project the future streamflow from GCM simulations. The statistical methodology involves principal component analysis, fuzzy clustering, and RVM. Different kernel functions are used for comparison purposes. The model is applied to the Mahanadi river basin in India. The results obtained using RVM are compared with those of the state-of-the-art Support Vector Machine (SVM) to present the advantages of RVMs over SVMs. A decreasing trend is observed for the monsoon streamflow of the Mahanadi due to high surface warming in the future, with the CCSR/NIES GCM and B2 scenario.
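
The dimensionality-reduction step mentioned above can be sketched in miniature: the leading principal component of two correlated predictors, computed from their 2x2 covariance matrix. The data are invented, and the full methodology of the paper adds fuzzy clustering and an RVM regression on top of this step:

```python
# Sketch of principal component analysis for two climate predictors:
# eigen-decomposition of the 2x2 covariance matrix [[cxx, cxy],
# [cxy, cyy]], keeping the leading component. Data are hypothetical.

import math

x = [1.0, 2.0, 3.0, 4.0]
y = [1.1, 1.9, 3.2, 3.8]   # strongly correlated second predictor

n = len(x)
mx, my = sum(x) / n, sum(y) / n
cxx = sum((a - mx) ** 2 for a in x) / n
cyy = sum((b - my) ** 2 for b in y) / n
cxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n

# Leading eigenvalue and (normalized) eigenvector of the covariance matrix:
lam = 0.5 * (cxx + cyy + math.sqrt((cxx - cyy) ** 2 + 4 * cxy ** 2))
vec = (cxy, lam - cxx)
norm = math.hypot(*vec)
pc1 = (vec[0] / norm, vec[1] / norm)

explained = lam / (cxx + cyy)   # fraction of total variance captured
print(round(explained, 3))      # most variance on the first component
```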

  17. Seasonal Drought Prediction: Advances, Challenges, and Future Prospects

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Singh, Vijay P.; Xia, Youlong

    2018-03-01

    Drought prediction is of critical importance to early warning for drought managements. This review provides a synthesis of drought prediction based on statistical, dynamical, and hybrid methods. Statistical drought prediction is achieved by modeling the relationship between drought indices of interest and a suite of potential predictors, including large-scale climate indices, local climate variables, and land initial conditions. Dynamical meteorological drought prediction relies on seasonal climate forecast from general circulation models (GCMs), which can be employed to drive hydrological models for agricultural and hydrological drought prediction with the predictability determined by both climate forcings and initial conditions. Challenges still exist in drought prediction at long lead time and under a changing environment resulting from natural and anthropogenic factors. Future research prospects to improve drought prediction include, but are not limited to, high-quality data assimilation, improved model development with key processes related to drought occurrence, optimal ensemble forecast to select or weight ensembles, and hybrid drought prediction to merge statistical and dynamical forecasts.
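
The "hybrid" idea at the end of this abstract — merging statistical and dynamical forecasts — can be sketched as a skill-weighted average; the weights, skills, and forecast values below are invented:

```python
# Minimal sketch of a hybrid drought forecast: merge a statistical and
# a dynamical forecast of the same drought index with weights
# proportional to each method's historical skill. All numbers are
# hypothetical.

def hybrid_forecast(stat_fc, dyn_fc, stat_skill, dyn_skill):
    """Skill-weighted average of two forecasts of the same drought index."""
    total = stat_skill + dyn_skill
    return (stat_skill * stat_fc + dyn_skill * dyn_fc) / total

# Forecasts of a standardized drought index (e.g. SPI) for one season:
merged = hybrid_forecast(stat_fc=-1.2, dyn_fc=-0.6, stat_skill=0.3, dyn_skill=0.6)
print(round(merged, 2))  # leans toward the more skilful dynamical forecast
```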

  18. A digital spatial predictive model of land-use change using economic and environmental inputs and a statistical tree classification approach: Thailand, 1970s--1990s

    NASA Astrophysics Data System (ADS)

    Felkner, John Sames

    The scale and extent of global land use change is massive, and has potentially powerful effects on the global climate and global atmospheric composition (Turner & Meyer, 1994). Because of this tremendous change and impact, there is an urgent need for quantitative, empirical models of land use change, especially predictive models with an ability to capture the trajectories of change (Agarwal, Green, Grove, Evans, & Schweik, 2000; Lambin et al., 1999). For this research, a spatial statistical predictive model of land use change was created and run in two provinces of Thailand. The model utilized an extensive spatial database and used a classification tree approach for explanatory model creation and future land use prediction (Breiman, Friedman, Olshen, & Stone, 1984). Eight input variables were used, and the trees were run on a dependent variable of land use change measured from 1979 to 1989 using classified satellite imagery. The derived tree models were used to create probability-of-change surfaces, and these were then used to create predicted land cover maps for 1999. These predicted 1999 maps were compared with actual 1999 land cover derived from 1999 Landsat 7 imagery. The primary research hypothesis was that an explanatory model using both economic and environmental input variables would better predict future land use change than would a model using only economic variables or a model using only environmental variables. Thus, the eight input variables included four economic and four environmental variables. The results indicated a very slight superiority of the full models in predicting future agricultural change and future deforestation, but a slight superiority of the economic models in predicting future built change. However, the margins of superiority were too small to be statistically significant. The resulting tree structures were used, however, to derive a series of principles or "rules" governing land use change in both provinces.
The model was able to predict future land use, given a series of assumptions, with an overall accuracy of 90 percent. The model can be used in other developing or developed country locations for future land use prediction, determination of future threatened areas, or to derive "rules" or principles driving land use change.
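
Classification-tree "rules" of the kind this study derives can be turned into a probability-of-change surface. The splits and probabilities below are invented for illustration; the study fit CART trees to eight economic and environmental inputs:

```python
# Toy version of a fitted classification tree applied cell-by-cell to
# produce a probability-of-change surface. The thresholds and
# probabilities are hypothetical, not the study's fitted values.

def change_probability(cell):
    """Two invented splits: accessibility (economic) then slope (environmental)."""
    if cell["dist_to_road_km"] < 2.0:           # accessible cells convert more
        return 0.7 if cell["slope_deg"] < 10 else 0.4
    return 0.1                                   # remote cells rarely convert

grid = [
    {"dist_to_road_km": 0.5, "slope_deg": 3},
    {"dist_to_road_km": 1.5, "slope_deg": 20},
    {"dist_to_road_km": 8.0, "slope_deg": 5},
]
probs = [change_probability(c) for c in grid]
print(probs)  # [0.7, 0.4, 0.1]
```

Thresholding such a surface (or ranking cells by probability) yields a predicted future land cover map of the kind compared against the 1999 imagery.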

  19. Estimating sunspot number

    NASA Technical Reports Server (NTRS)

    Wilson, R. M.; Reichmann, E. J.; Teuber, D. L.

    1984-01-01

    An empirical method is developed to predict certain parameters of future solar activity cycles. Sunspot cycle statistics are examined, and curve fitting and linear regression analysis techniques are utilized.

  20. Hurricane track forecast cones from fluctuations

    PubMed Central

    Meuel, T.; Prado, G.; Seychelles, F.; Bessafi, M.; Kellay, H.

    2012-01-01

    Trajectories of tropical cyclones may show large deviations from predicted tracks leading to uncertainty as to their landfall location for example. Prediction schemes usually render this uncertainty by showing track forecast cones representing the most probable region for the location of a cyclone during a period of time. By using the statistical properties of these deviations, we propose a simple method to predict possible corridors for the future trajectory of a cyclone. Examples of this scheme are implemented for hurricane Ike and hurricane Jimena. The corridors include the future trajectory up to at least 50 h before landfall. The cones proposed here shed new light on known track forecast cones as they link them directly to the statistics of these deviations. PMID:22701776
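
Building a corridor from deviation statistics can be sketched as follows: the corridor half-width at each lead time is a multiple of the spread of historical cross-track deviations at that lead time. The deviation samples are invented:

```python
# Sketch of a forecast corridor from fluctuation statistics: half-width
# at each lead time = k times the standard deviation of historical
# cross-track deviations at that lead time. Samples are hypothetical.

import math

def half_width(deviations_km, k=2.0):
    n = len(deviations_km)
    mean = sum(deviations_km) / n
    var = sum((d - mean) ** 2 for d in deviations_km) / n
    return k * math.sqrt(var)

# Hypothetical cross-track deviations (km) observed at 24 h and 48 h lead:
dev_24h = [-30.0, -10.0, 10.0, 30.0]
dev_48h = [-60.0, -20.0, 20.0, 60.0]
widths = [half_width(dev_24h), half_width(dev_48h)]
print([round(w, 1) for w in widths])  # the corridor widens with lead time
```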

  1. The MSFC Solar Activity Future Estimation (MSAFE) Model

    NASA Technical Reports Server (NTRS)

    Suggs, Ron

    2017-01-01

    The Natural Environments Branch of the Engineering Directorate at Marshall Space Flight Center (MSFC) provides solar cycle forecasts for NASA space flight programs and the aerospace community. These forecasts provide future statistical estimates of sunspot number, solar radio 10.7 cm flux (F10.7), and the geomagnetic planetary index, Ap, for input to various space environment models. For example, many thermosphere density computer models used in spacecraft operations, orbital lifetime analysis, and the planning of future spacecraft missions require the F10.7 and Ap as inputs. The solar forecast is updated each month by executing MSAFE using historical data and the latest month's observed solar indices to provide estimates for the balance of the current solar cycle. The forecasted solar indices represent 13-month smoothed values, consisting of a best estimate stated as a 50th percentile value along with approximate +/- 2 sigma values stated as 95th and 5th percentile values. This presentation will give an overview of the MSAFE model and the forecast for the current solar cycle.
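
The 13-month smoothing mentioned above is conventionally a centred 13-point running mean with half weight on the two end months (the standard convention for smoothed sunspot number); a minimal sketch with invented monthly values:

```python
# 13-month smoothed value centred on month i, using the conventional
# half-weighted end months. Input series is hypothetical.

def smooth13(monthly, i):
    """Centred 13-month smoothing; requires months i-6 .. i+6."""
    window = monthly[i - 6:i + 7]
    weights = [0.5] + [1.0] * 11 + [0.5]
    return sum(w * v for w, v in zip(weights, window)) / 12.0

monthly = [100.0 + m for m in range(13)]  # invented monthly sunspot numbers
print(smooth13(monthly, 6))
```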

  2. The effects of budget, delegation, and other variables on the future of school nursing.

    PubMed

    Tetuan, Theresa M; Akagi, Cynthia G

    2004-12-01

    The purpose of this exploratory research study was to survey Kansas school nurses to determine the impact of budget, delegation, and other variables on the future of school nursing. Issues of education and certification status, educational budget, delegation, school nurse-to-student ratio, number of school buildings assigned, Metropolitan Statistical Area, and years of school nursing experience were also investigated. The Budget Impact School Nurse Questionnaire online survey was used to gather data. Findings revealed that school nurses were well prepared academically, but that many school nurses lacked certification. The use of UAPs and the future of school nursing were significantly affected by budget constraints, delegation, number of buildings assigned, legislative contact, and Metropolitan Statistical Area (urban location). Education in delegation and years of experience as a school nurse significantly affected opportunities for health education. The findings depicted budget, school nurse staffing, delegation, and geographic areas as the main variables that have an impact on school nursing.

  3. The Future of Working Women in the United States.

    ERIC Educational Resources Information Center

    Wolfe, Mary Ann

    In light of changing statistics about women in the labor force since 1960, the author discusses possible trends related to working women in the future. In 1962 the labor force participation rate of all U. S. women was 36% and of mothers, 34%. By 1975 these rates increased to 43% and 47% respectively. Unfortunately, women still seem to be taking…

  4. Projections of the Population of the United States, by Age, Sex, and Race: 1983 to 2080.

    ERIC Educational Resources Information Center

    Spencer, Gregory

    1984-01-01

    Based on assumptions about fertility, mortality, and net immigration trends, statistical tables depict the future U.S. population by age, sex, and race. Figures are based on the July 1, 1982, population estimates and race definitions and are projected using the cohort-component method with alternative assumptions for future fertility, mortality,…
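
The cohort-component method named above can be sketched in miniature: age each cohort forward with survival rates, add births to the youngest group, and apply net migration. All rates and counts below are invented:

```python
# Minimal cohort-component projection step. Three age groups stand in
# for the full age-sex-race detail; every number is hypothetical.

def project_one_step(pop, survival, births, net_migration):
    """pop[i] = persons in age group i; groups shift up one step."""
    aged = [0.0] * len(pop)
    aged[0] = births                      # new cohort enters youngest group
    for i in range(len(pop) - 1):
        aged[i + 1] = pop[i] * survival[i]
    aged[-1] += pop[-1] * survival[-1]    # open-ended group keeps its survivors
    return [p + m for p, m in zip(aged, net_migration)]

pop = [1000.0, 900.0, 500.0]              # young / working-age / older
survival = [0.99, 0.98, 0.80]
projected = project_one_step(pop, survival, births=950.0,
                             net_migration=[10.0, 20.0, 0.0])
print(projected)
```

Repeating the step with alternative fertility, mortality, and migration assumptions yields the alternative projection series the report describes.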

  5. Revealing Future Research Capacity from an Analysis of a National Database of Discipline-Coded Australian PhD Thesis Records

    ERIC Educational Resources Information Center

    Pittayachawan, Siddhi; Macauley, Peter; Evans, Terry

    2016-01-01

    This article reports how statistical analyses of PhD thesis records can reveal future research capacities for disciplines beyond their primary fields. The previous research showed that most theses contributed to and/or used methodologies from more than one discipline. In Australia, there was a concern for declining mathematical teaching and…

  6. Atlas of current and potential future distributions of common trees of the eastern United States

    Treesearch

    Louis R. Iverson; Anantha M. Prasad; Betsy J. Hale; Elaine Kennedy Sutherland

    1999-01-01

    This atlas documents the current and possible future distribution of 80 common tree species in the Eastern United States and gives detailed information on environmental characteristics defining these distributions. Also included are outlines of life history characteristics and summary statistics for these species. Much of the data are derived from Forest Inventory and...

  7. Towards estimates of future rainfall erosivity in Europe based on REDES and WorldClim datasets

    NASA Astrophysics Data System (ADS)

    Panagos, Panos; Ballabio, Cristiano; Meusburger, Katrin; Spinoni, Jonathan; Alewell, Christine; Borrelli, Pasquale

    2017-05-01

    Policy requests to develop trends in soil erosion change can be addressed by developing modelling scenarios of the two most dynamic factors in soil erosion, i.e. rainfall erosivity and land cover change. The recently developed Rainfall Erosivity Database at European Scale (REDES) and a statistical approach used to spatially interpolate rainfall erosivity data have the potential to become useful tools for predicting future rainfall erosivity based on climate scenarios. The use of a thorough statistical modelling approach (Gaussian Process Regression), with the selection of the most appropriate covariates (monthly precipitation, temperature datasets and bioclimatic layers), allowed the prediction of rainfall erosivity under climate change scenarios. The mean rainfall erosivity for the European Union and Switzerland is projected to reach 857 MJ mm ha-1 h-1 yr-1 by 2050, a relative increase of 18% compared to baseline data (2010). The changes are heterogeneous across the European continent, depending on the future projections for the most erosive months (hot period: April-September). The results report a pan-European projection of future rainfall erosivity taking into account the uncertainties of the climatic models.
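
Gaussian Process Regression, the interpolation approach named above, can be shown at its smallest useful scale: one invented covariate, two training points, and the 2x2 kernel algebra written out by hand (the study regresses the R-factor on several climatic covariates):

```python
# Toy 1-D Gaussian Process Regression with an RBF kernel and two
# training points, so the kernel matrix inverse can be written
# explicitly. All data values are hypothetical.

import math

def rbf(a, b, length=1.0):
    return math.exp(-((a - b) ** 2) / (2 * length ** 2))

x_train = [0.0, 2.0]
y_train = [100.0, 300.0]   # invented erosivity values
noise = 1e-6

# 2x2 kernel matrix K = [[a, b], [b, a]] and its inverse:
b = rbf(x_train[0], x_train[1])
a = 1.0 + noise
det = a * a - b * b
inv = [[a / det, -b / det], [-b / det, a / det]]

def predict(x):
    ks = [rbf(x, t) for t in x_train]
    alpha = [sum(inv[i][j] * y_train[j] for j in range(2)) for i in range(2)]
    return sum(k * w for k, w in zip(ks, alpha))

print(round(predict(1.0), 1))  # falls between the two training values
```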

  8. Towards estimates of future rainfall erosivity in Europe based on REDES and WorldClim datasets.

    PubMed

    Panagos, Panos; Ballabio, Cristiano; Meusburger, Katrin; Spinoni, Jonathan; Alewell, Christine; Borrelli, Pasquale

    2017-05-01

    Policy requests to develop trends in soil erosion change can be addressed by developing modelling scenarios of the two most dynamic factors in soil erosion, i.e. rainfall erosivity and land cover change. The recently developed Rainfall Erosivity Database at European Scale (REDES) and a statistical approach used to spatially interpolate rainfall erosivity data have the potential to become useful tools for predicting future rainfall erosivity based on climate scenarios. The use of a thorough statistical modelling approach (Gaussian Process Regression), with the selection of the most appropriate covariates (monthly precipitation, temperature datasets and bioclimatic layers), allowed the prediction of rainfall erosivity under climate change scenarios. The mean rainfall erosivity for the European Union and Switzerland is projected to reach 857 MJ mm ha-1 h-1 yr-1 by 2050, a relative increase of 18% compared to baseline data (2010). The changes are heterogeneous across the European continent, depending on the future projections for the most erosive months (hot period: April-September). The results report a pan-European projection of future rainfall erosivity taking into account the uncertainties of the climatic models.

  9. Developing statistical wildlife habitat relationships for assessing cumulative effects of fuels treatments: Final Report for Joint Fire Science Program Project

    Treesearch

    Samuel A. Cushman; Kevin S. McKelvey

    2006-01-01

    The primary weakness in our current ability to evaluate future landscapes in terms of wildlife lies in the lack of quantitative models linking wildlife to forest stand conditions, including fuels treatments. This project focuses on 1) developing statistical wildlife habitat relationships models (WHR) utilizing Forest Inventory and Analysis (FIA) and National Vegetation...

  10. Statistical Education in the 21st Century: A Review of Challenges, Teaching Innovations and Strategies for Reform

    ERIC Educational Resources Information Center

    Tishkovskaya, Svetlana; Lancaster, Gillian A.

    2012-01-01

    Over the past few decades there has been a large amount of research dedicated to the teaching of statistics. The impact of this research has started to change course content and structure, in both introductory and advanced courses for statisticians and those from other disciplines. In the light of these changes future directions in the teaching…

  11. Statistical Analysis of the Exchange Rate of Bitcoin.

    PubMed

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate.

  12. Use Trends Indicated by Statistically Calibrated Recreational Sites in the National Forest System

    Treesearch

    Gary L. Tyre

    1971-01-01

    Trends in statistically sampled use of developed sites in the National Forest system indicate an average annual increase of 6.0 percent in the period 1966-69. The high variability of the measure precludes its use for projecting expected future use, but it can be important in gauging the credibility of annual use changes at both sampled and unsampled locations.

  13. Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs

    USGS Publications Warehouse

    Irvine, Kathryn M.; Rodhouse, Thomas J.

    2014-01-01

    As of 2013, Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data and Greater Yellowstone Network has three years of vegetation data; monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data aimed at exploring correlations with climate and weather data is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between the ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not yet been resolved. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in the future. When data collected by different networks are combined, the survey design describing the merged dataset is likely a complex survey design: the result of combining datasets from different sampling designs, characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010, Chapter 7, for a general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study on the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend using available packages within the R statistical computing environment.
We find that, as written, using lmer or lm for trend detection in a continuous response and clm and clmm for visually estimated cover classes with “raw” GRTS design weights specified for the weight argument leads to substantially different results and/or computational instability. However, when only fixed effects are of interest, the survey package (svyglm and svyolr) may be suitable for a model-assisted analysis for trend. We provide possible directions for future research into combined analysis for ordinal and continuous vital sign indicators.
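
Why design weights matter can be shown in a few lines, independent of the R packages discussed above: a Horvitz-Thompson style weighted mean (weight = 1/inclusion probability) versus the naive unweighted mean, on an invented sample in which high-cover plots were oversampled:

```python
# Design-weighted vs. naive estimation on a hypothetical unequal-
# probability sample: high-cover plots were five times as likely to be
# selected, so the unweighted mean is biased high.

values = [10.0, 12.0, 80.0, 90.0]       # % vegetation cover on sampled plots
incl_prob = [0.01, 0.01, 0.05, 0.05]    # unequal inclusion probabilities

weights = [1.0 / p for p in incl_prob]  # Horvitz-Thompson style weights
weighted_mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
naive_mean = sum(values) / len(values)

print(round(weighted_mean, 1), round(naive_mean, 1))
```

The weighted estimate down-weights the oversampled high-cover plots, which is exactly the correction a complex-survey analysis builds into the model.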

  14. Utilization of an Enhanced Canonical Correlation Analysis (ECCA) to Predict Daily Precipitation and Temperature in a Semi-Arid Environment

    NASA Astrophysics Data System (ADS)

    Lopez, S. R.; Hogue, T. S.

    2011-12-01

    Global climate models (GCMs) are primarily used to generate historical and future large-scale circulation patterns at a coarse resolution (typically on the order of 50,000 km2) and fail to capture climate variability at the ground level due to localized surface influences (i.e., topography, marine layer, land cover, etc.). Their inability to accurately resolve these processes has led to the development of numerous 'downscaling' techniques. The goal of this study is to enhance statistical downscaling of daily precipitation and temperature for regions with heterogeneous land cover and topography. Our analysis was divided into two periods, historical (1961-2000) and contemporary (1980-2000), and tested using sixteen predictand combinations from four GCMs (GFDL CM2.0, GFDL CM2.1, CNRM-CM3, and MRI-CGCM2 3.2a). The Southern California area was separated into five county regions: Santa Barbara, Ventura, Los Angeles, Orange, and San Diego. Principal component analysis (PCA) was performed on ground-based observations in order to (1) reduce the number of redundant gauges and minimize dimensionality and (2) cluster gauges that behave statistically similarly for post-analysis. Post-PCA analysis included extensive testing of predictor-predictand relationships using an enhanced canonical correlation analysis (ECCA). The ECCA includes obtaining the optimal predictand sets for all models within each spatial domain (county) as governed by daily and monthly overall statistics. Results show all models maintain mean annual and monthly behavior within each county and daily statistics are improved. The level of improvement depends strongly on the vegetation extent within each county and the land-to-ocean ratio within the GCM spatial grid. The utilization of the entire historical period also leads to better statistical representation of observed daily precipitation.
The validated ECCA technique is being applied to future climate scenarios distributed by the IPCC in order to provide forcing data for regional hydrologic models and assess future water resources in the Southern California region.

  15. Throughput Benefit Assessment for Tactical Runway Configuration Management (TRCM)

    NASA Technical Reports Server (NTRS)

    Phojanamongkolkij, Nipa; Oseguera-Lohr, Rosa M.; Lohr, Gary W.; Fenbert, James W.

    2014-01-01

    The System-Oriented Runway Management (SORM) concept is a collection of needed capabilities focused on more efficient use of runways while considering all of the factors that affect runway use. Tactical Runway Configuration Management (TRCM), one of the SORM capabilities, provides runway configuration and runway usage recommendations, monitoring the active runway configuration for suitability given existing factors, based on a 90 minute planning horizon. This study evaluates the throughput benefits using a representative sample of today's traffic volumes at three airports: Memphis International Airport (MEM), Dallas-Fort Worth International Airport (DFW), and John F. Kennedy International Airport (JFK). Based on this initial assessment, there are statistically significant throughput benefits for both arrivals and departures at MEM, with an average of 4% for arrivals and 6% for departures. For DFW, there is a statistically significant benefit for arrivals, with an average of 3%. Although an average 1% benefit is observed for departures, it is not statistically significant. For JFK, there is a 12% benefit for arrivals, but a 2% penalty for departures. The results obtained are for current traffic volumes and should show greater benefit for increased future demand. This paper also proposes some potential TRCM algorithm improvements for future research. A continued research plan is being developed to implement these improvements and to re-assess the throughput benefit for today's and future projected traffic volumes.

  16. Linear regression analysis of Hospital Episode Statistics predicts a large increase in demand for elective hand surgery in England.

    PubMed

    Bebbington, Emily; Furniss, Dominic

    2015-02-01

    We integrated two factors, demographic population shifts and changes in prevalence of disease, to predict future trends in demand for hand surgery in England, to facilitate workforce planning. We analysed Hospital Episode Statistics data for Dupuytren's disease, carpal tunnel syndrome, cubital tunnel syndrome, and trigger finger from 1998 to 2011. Using linear regression, we estimated trends in both diagnosis and surgery until 2030. We integrated this regression with age-specific population data from the Office for National Statistics in order to estimate how this will contribute to a change in workload over time. There has been a significant increase in both absolute numbers of diagnoses and surgery for all four conditions. Combined with future population data, we calculate that the total operative burden for these four conditions will increase from 87,582 in 2011 to 170,166 (95% confidence interval 144,517-195,353) in 2030. The prevalence of these diseases in the ageing population, and increasing prevalence of predisposing factors such as obesity and diabetes, may account for the predicted increase in workload. The most cost-effective treatments must be sought, which requires high-quality clinical trials. Our methodology can be applied to other sub-specialties to help anticipate the need for future service provision. Copyright © 2014 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
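
The projection step — fitting a linear trend to annual counts and extrapolating to 2030 — can be sketched with invented numbers (the study additionally folds in ONS age-specific population projections):

```python
# Ordinary least-squares trend fitted to hypothetical annual operation
# counts, then extrapolated to 2030. Counts are invented, not Hospital
# Episode Statistics values.

def fit_line(years, counts):
    n = len(years)
    mx, my = sum(years) / n, sum(counts) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(years, counts)) / \
            sum((x - mx) ** 2 for x in years)
    return slope, my - slope * mx

years = [1998, 2002, 2006, 2010]
ops = [50000.0, 62000.0, 74000.0, 86000.0]  # hypothetical annual operations
slope, intercept = fit_line(years, ops)
print(round(slope * 2030 + intercept))      # extrapolated 2030 workload
```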

  17. A Statistical Bias Correction Tool for Generating Climate Change Scenarios in Indonesia based on CMIP5 Datasets

    NASA Astrophysics Data System (ADS)

    Faqih, A.

    2017-03-01

    Providing information regarding future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to their coarse resolution, the data have to be downscaled and bias corrected in order to obtain scenario data with better spatial resolution that match the characteristics of the observed data. Generating these downscaled data is often difficult for scientists who do not have a specific background, experience, and skill in dealing with the complex data of GCM outputs. In this regard, it is necessary to develop a tool that can simplify the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data for their climate change-related studies. In this paper, we introduce a tool called “Statistical Bias Correction for Climate Scenarios (SiBiaS)”. The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and to perform statistical bias correction of those outputs relative to reference observational data. It is prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
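
One common form of statistical bias correction (the abstract does not specify which method the tool implements) is empirical quantile mapping: each GCM value is replaced by the observed value at the same empirical quantile. A minimal sketch with invented series:

```python
# Empirical quantile mapping on tiny invented samples: a GCM value is
# mapped to the observation holding the same rank in the baseline
# period. Real tools interpolate between quantiles; this sketch uses
# plain ranks.

def quantile_map(value, gcm_sorted, obs_sorted):
    """Map a GCM value to the observation with the same rank."""
    rank = sum(1 for g in gcm_sorted if g <= value) - 1
    rank = max(0, min(rank, len(obs_sorted) - 1))
    return obs_sorted[rank]

gcm_baseline = sorted([2.0, 4.0, 6.0, 8.0])   # mm/day, biased model climate
obs_baseline = sorted([1.0, 3.0, 7.0, 12.0])  # mm/day, station observations

corrected = [quantile_map(v, gcm_baseline, obs_baseline) for v in [4.0, 8.0]]
print(corrected)  # [3.0, 12.0]
```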

  18. Adverse Childhood Experiences (ACEs) Study

    MedlinePlus

    Childhood experiences, both positive and negative, have a tremendous impact on future violence victimization and perpetration, and on lifelong health and opportunity.

  19. Comparison of climate envelope models developed using expert-selected variables versus statistical selection

    USGS Publications Warehouse

    Brandt, Laura A.; Benscoter, Allison; Harvey, Rebecca G.; Speroterra, Carolina; Bucklin, David N.; Romañach, Stephanie; Watling, James I.; Mazzotti, Frank J.

    2017-01-01

    Climate envelope models are widely used to describe the potential future distribution of species under different climate change scenarios. It is broadly recognized that there are both strengths and limitations to using climate envelope models and that outcomes are sensitive to initial assumptions, inputs, and modeling methods. Selection of predictor variables, a central step in modeling, is one of the areas where different techniques can yield varying results. Selection of climate variables to use as predictors is often done using statistical approaches that develop correlations between occurrences and climate data. These approaches have received criticism in that they rely on the statistical properties of the data rather than directly incorporating biological information about species responses to temperature and precipitation. We evaluated and compared models and prediction maps for 15 threatened or endangered species in Florida based on two variable selection techniques: expert opinion and a statistical method. We compared model performance between these two approaches for contemporary predictions, and the spatial correlation, spatial overlap, and area predicted for contemporary and future climate predictions. In general, experts identified more variables as being important than the statistical method did, and there was low overlap in the variable sets (<40%) between the two methods. Despite these differences in variable sets (expert versus statistical), models had high performance metrics (>0.9 for area under the curve (AUC) and >0.7 for true skill statistic (TSS)). Spatial overlap, which compares the spatial configuration between maps constructed using the different variable selection techniques, was only moderate overall (about 60%), with a great deal of variability across species. The difference in spatial overlap was even greater under future climate projections, indicating additional divergence of model outputs from different variable selection techniques.
Our work is in agreement with other studies which have found that for broad-scale species distribution modeling, using statistical methods of variable selection is a useful first step, especially when there is a need to model a large number of species or expert knowledge of the species is limited. Expert input can then be used to refine models that seem unrealistic or for species that experts believe are particularly sensitive to change. It also emphasizes the importance of using multiple models to reduce uncertainty and improve map outputs for conservation planning. Where outputs overlap or show the same direction of change there is greater certainty in the predictions. Areas of disagreement can be used for learning by asking why the models do not agree, and may highlight areas where additional on-the-ground data collection could improve the models.

  20. Response surfaces of vulnerability to climate change: The Colorado River Basin, the High Plains, and California

    Treesearch

    Romano Foti; Jorge A. Ramirez; Thomas C. Brown

    2014-01-01

    We quantify the vulnerability of water supply to shortage for the Colorado River Basin and basins of the High Plains and California and assess the sensitivity of their water supply system to future changes in the statistical variability of supply and demand. We do so for current conditions and future socio-economic scenarios within a probabilistic framework that...

  1. Statistical Analysis of the Exchange Rate of Bitcoin

    PubMed Central

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate. PMID:26222702
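The paper fits fifteen parametric distributions, including the winning generalized hyperbolic; the smaller sketch below shows the mechanics of that comparison (maximum-likelihood fit, then AIC ranking) with three scipy distributions on synthetic heavy-tailed data standing in for the real Bitcoin/USD log-returns.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic heavy-tailed "log-returns", a stand-in for the Bitcoin/USD series
returns = stats.t.rvs(df=3, scale=0.02, size=2000, random_state=rng)

# Fit candidate distributions by maximum likelihood and compare by AIC
candidates = {"normal": stats.norm, "student_t": stats.t, "laplace": stats.laplace}
aic = {}
for name, dist in candidates.items():
    params = dist.fit(returns)                      # MLE fit
    loglik = np.sum(dist.logpdf(returns, *params))  # log-likelihood at the MLE
    aic[name] = 2 * len(params) - 2 * loglik        # penalize parameter count

best = min(aic, key=aic.get)  # the heavy-tailed fit should win here
```

The same loop extended to the full candidate set (and applied to actual exchange-rate log-returns) reproduces the kind of ranking the paper reports, with fat-tailed families dominating the Gaussian.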

  2. Observations, theoretical ideas and modeling of turbulent flows: Past, present and future

    NASA Technical Reports Server (NTRS)

    Chapman, G. T.; Tobak, M.

    1985-01-01

    Turbulence was analyzed in a historical context featuring the interactions between observations, theoretical ideas, and modeling within three successive movements. These are identified as predominantly statistical, structural and deterministic. The statistical movement is criticized for its failure to deal with the structural elements observed in turbulent flows. The structural movement is criticized for its failure to embody observed structural elements within a formal theory. The deterministic movement is described as having the potential of overcoming these deficiencies by allowing structural elements to exhibit chaotic behavior that is nevertheless embodied within a theory. Four major ideas of this movement are described: bifurcation theory, strange attractors, fractals, and the renormalization group. A framework for the future study of turbulent flows is proposed, based on the premises of the deterministic movement.

  3. Inevitable end-of-21st-century trends toward earlier surface runoff timing in California's Sierra Nevada Mountains

    NASA Astrophysics Data System (ADS)

    Schwartz, M. A.; Hall, A. D.; Sun, F.; Walton, D.; Berg, N.

    2015-12-01

    Hybrid dynamical-statistical downscaling is used to produce surface runoff timing projections for California's Sierra Nevada, a high-elevation mountain range with significant seasonal snow cover. First, future climate change projections (RCP8.5 forcing scenario, 2081-2100 period) from five CMIP5 global climate models (GCMs) are dynamically downscaled. These projections reveal that future warming leads to a shift toward earlier snowmelt and surface runoff timing throughout the Sierra Nevada region. Relationships between warming and surface runoff timing from the dynamical simulations are used to build a simple statistical model that mimics the dynamical model's projected surface runoff timing changes given GCM input or other statistically-downscaled input. This statistical model can be used to produce surface runoff timing projections for other GCMs, periods, and forcing scenarios to quantify ensemble-mean changes, uncertainty due to intermodel variability, and consequences stemming from the choice of forcing scenario. For all CMIP5 GCMs and forcing scenarios, significant trends toward earlier surface runoff timing occur at elevations below 2500 m. Thus, we conclude that trends toward earlier surface runoff timing by the end of the 21st century are inevitable. The changes to surface runoff timing diagnosed in this study have implications for many dimensions of climate change, including impacts on surface hydrology, water resources, and ecosystems.

  4. Towards bridging the gap between climate change projections and maize producers in South Africa

    NASA Astrophysics Data System (ADS)

    Landman, Willem A.; Engelbrecht, Francois; Hewitson, Bruce; Malherbe, Johan; van der Merwe, Jacobus

    2018-05-01

    Multi-decadal regional projections of future climate change are introduced into a linear statistical model in order to produce an ensemble of austral mid-summer maximum temperature simulations for southern Africa. The statistical model uses atmospheric thickness fields from a high-resolution (0.5° × 0.5°) reanalysis-forced simulation as predictors in order to develop a linear recalibration model which represents the relationship between atmospheric thickness fields and gridded maximum temperatures across the region. The regional climate model, the conformal-cubic atmospheric model (CCAM), projects maximum temperature increases over southern Africa on the order of 4 °C, or even higher, under low mitigation towards the end of the century. The statistical recalibration model is able to replicate these increasing temperatures, and the atmospheric thickness-maximum temperature relationship is shown to be stable under future climate conditions. Since dry land crop yields are not explicitly simulated by climate models but are sensitive to maximum temperature extremes, the effect of projected maximum temperature change on dry land crops of the Witbank maize production district of South Africa, assuming other factors remain unchanged, is then assessed by employing a statistical approach similar to the one used for maximum temperature projections.

  5. Meta-analysis and The Cochrane Collaboration: 20 years of the Cochrane Statistical Methods Group

    PubMed Central

    2013-01-01

    The Statistical Methods Group has played a pivotal role in The Cochrane Collaboration over the past 20 years. The Statistical Methods Group has determined the direction of statistical methods used within Cochrane reviews, developed guidance for these methods, provided training, and continued to discuss and consider new and controversial issues in meta-analysis. The contribution of Statistical Methods Group members to the meta-analysis literature has been extensive and has helped to shape the wider meta-analysis landscape. In this paper, marking the 20th anniversary of The Cochrane Collaboration, we reflect on the history of the Statistical Methods Group, beginning in 1993 with the identification of aspects of statistical synthesis for which consensus was lacking about the best approach. We highlight some landmark methodological developments that Statistical Methods Group members have contributed to in the field of meta-analysis. We discuss how the Group implements and disseminates statistical methods within The Cochrane Collaboration. Finally, we consider the importance of robust statistical methodology for Cochrane systematic reviews, note research gaps, and reflect on the challenges that the Statistical Methods Group faces in its future direction. PMID:24280020

  6. A Bayesian nonparametric method for prediction in EST analysis

    PubMed Central

    Lijoi, Antonio; Mena, Ramsés H; Prünster, Igor

    2007-01-01

    Background Expressed sequence tags (ESTs) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size and also to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for: a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; b) the number of new unique genes to be observed in a future sample; c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries, previously studied with frequentist methods, are analyzed in detail. Conclusion The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample. PMID:17868445
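The paper's Bayesian nonparametric machinery is richer than the abstract can convey; the simplest prior in that family, the one-parameter Dirichlet process, already yields closed forms for prediction quantities (b) and (c). The sketch below is that textbook special case (an assumption for illustration), not the authors' model.

```python
import math

def expected_new_genes(alpha, n, m):
    """Expected number of previously unseen genes among m additional EST
    reads, after n reads have been observed, under a Dirichlet-process
    prior with concentration alpha (exact finite sum)."""
    return sum(alpha / (alpha + i) for i in range(n, n + m))

def discovery_rate(alpha, n):
    """Probability that the (n+1)-th read reveals a new gene."""
    return alpha / (alpha + n)

# The finite sum is well approximated by alpha * log((alpha+n+m)/(alpha+n))
approx = 100 * math.log((100 + 3000) / (100 + 1000))
exact = expected_new_genes(100, 1000, 2000)
```

The logarithmic growth in m is what makes the decision rule concrete: once the marginal discovery rate drops below the cost per read, further sequencing of the library stops paying off.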

  7. Learning Predictive Statistics: Strategies and Brain Mechanisms.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-08-30

    When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. 
Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to changes in the environment's statistics. We provide evidence for an alternate route for learning complex temporal statistics: extracting the most probable outcome in a given context is implemented by interactions between executive and motor corticostriatal mechanisms compared with visual corticostriatal circuits (including hippocampal cortex) that support learning of the exact temporal statistics. Copyright © 2017 Wang et al.
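The maximizing-versus-matching distinction described above has a simple expected-accuracy consequence that a toy simulation makes concrete. This is an illustrative two-outcome stream with an assumed bias, not the study's structured sequence task.

```python
import random

random.seed(1)
P_A = 0.8  # assumed probability that outcome "A" follows the current context

def accuracy(strategy, trials=100_000):
    """Fraction of correct next-event predictions for a biased stream."""
    correct = 0
    for _ in range(trials):
        outcome = "A" if random.random() < P_A else "B"
        if strategy == "maximize":
            guess = "A"  # always choose the most probable outcome
        else:  # "match": reproduce the outcome statistics in one's guesses
            guess = "A" if random.random() < P_A else "B"
        correct += guess == outcome
    return correct / trials

acc_max = accuracy("maximize")   # expectation: 0.80
acc_match = accuracy("match")    # expectation: 0.8**2 + 0.2**2 = 0.68
```

Maximizing converges on the base rate p while matching converges on p² + (1−p)², so matching is strictly worse whenever the stream is biased; that individuals nevertheless adopt both strategies is what makes the corticostriatal dissociation interesting.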

  8. Evidence-based pathology in its second decade: toward probabilistic cognitive computing.

    PubMed

    Marchevsky, Alberto M; Walts, Ann E; Wick, Mark R

    2017-03-01

    Evidence-based pathology advocates using a combination of best available data ("evidence") from the literature and personal experience for the diagnosis, estimation of prognosis, and assessment of other variables that impact individual patient care. Evidence-based pathology relies on systematic reviews of the literature, evaluation of the quality of evidence as categorized by evidence levels and statistical tools such as meta-analyses, estimates of probabilities and odds, and others. However, it is well known that previously "statistically significant" information usually does not accurately forecast the future for individual patients. There is great interest in "cognitive computing" in which "data mining" is combined with "predictive analytics" designed to forecast future events and estimate the strength of those predictions. This study demonstrates the use of IBM Watson Analytics software to evaluate and predict the prognosis of 101 patients with typical and atypical pulmonary carcinoid tumors in which Ki-67 indices have been determined. The results obtained with this system are compared with those previously reported using "routine" statistical software and the help of a professional statistician. IBM Watson Analytics interactively provides statistical results that are comparable to those obtained with routine statistical tools but much more rapidly, with considerably less effort and with interactive graphics that are intuitively easy to apply. It also enables analysis of natural language variables and yields detailed survival predictions for patient subgroups selected by the user. Potential applications of this tool and basic concepts of cognitive computing are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Expansion of the Lyme Disease Vector Ixodes Scapularis in Canada Inferred from CMIP5 Climate Projections

    PubMed Central

    McPherson, Michelle; García-García, Almudena; Cuesta-Valero, Francisco José; Hansen-Ketchum, Patti; MacDougall, Donna; Ogden, Nicholas Hume

    2017-01-01

    Background: A number of studies have assessed possible climate change impacts on the Lyme disease vector, Ixodes scapularis. However, most have used surface air temperature from only one climate model simulation and/or one emission scenario, representing only one possible climate future. Objectives: We quantified effects of different Representative Concentration Pathway (RCP) and climate model outputs on the projected future changes in the basic reproduction number (R0) of I. scapularis to explore uncertainties in future R0 estimates. Methods: We used surface air temperature generated by a complete set of General Circulation Models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) to hindcast historical (1971–2000), and to forecast future effects of climate change on the R0 of I. scapularis for the periods 2011–2040 and 2041–2070. Results: Increases in the multimodel mean R0 values estimated for both future periods, relative to 1971–2000, were statistically significant under all RCP scenarios for all of Nova Scotia, areas of New Brunswick and Quebec, Ontario south of 47°N, and Manitoba south of 52°N. When comparing RCP scenarios, only the estimated R0 mean values between RCP6.0 and RCP8.5 showed statistically significant differences for any future time period. Conclusion: Our results highlight the potential for climate change to have an effect on future Lyme disease risk in Canada even if the Paris Agreement’s goal to keep global warming below 2°C is achieved, although mitigation reducing emissions from RCP8.5 levels to those of RCP6.0 or less would be expected to slow tick invasion after the 2030s. https://doi.org/10.1289/EHP57 PMID:28599266

  10. 17 CFR 38.156 - Automated trade surveillance system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... potential trade practice violations. The automated system must load and process daily orders and trades no... anomalies; compute, retain, and compare trading statistics; compute trade gains, losses, and futures...

  11. 17 CFR 38.156 - Automated trade surveillance system.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... potential trade practice violations. The automated system must load and process daily orders and trades no... anomalies; compute, retain, and compare trading statistics; compute trade gains, losses, and futures...

  12. Analysis and modelling of surface Urban Heat Island in 20 Canadian cities under climate and land-cover change.

    PubMed

    Gaur, Abhishek; Eichenbaum, Markus Kalev; Simonovic, Slobodan P

    2018-01-15

    Surface Urban Heat Island (SUHI) is an urban climate phenomenon that is expected to respond to future climate and land-use land-cover change. It is important to further our understanding of the physical mechanisms that govern the SUHI phenomenon to enhance our ability to model future SUHI characteristics under changing geophysical conditions. In this study, the SUHI phenomenon is quantified and modelled at 20 cities distributed across Canada. By analyzing MODerate Resolution Imaging Spectroradiometer (MODIS) sensed surface temperature at the cities over 2002-2012, it is found that 16 out of 20 selected cities have experienced a positive SUHI phenomenon while 4 cities located in the prairies region and high elevation locations have experienced a negative SUHI phenomenon in the past. A statistically significant relationship between observed SUHI magnitude and city elevation is also recorded over the observational period. A Physical Scaling downscaling model is then validated and used to downscale future surface temperature projections from 3 GCMs and 2 extreme Representative Concentration Pathways in the urban and rural areas of the cities. Future changes in SUHI magnitudes between historical (2006-2015) and future timelines: 2030s (2026-2035), 2050s (2046-2055), and 2090s (2091-2100) are estimated. Analysis of future projected changes indicates that 15 (13) out of 20 cities can be expected to experience increases in SUHI magnitudes in the future under RCP 2.6 (RCP 8.5). A statistically significant relationship between projected future SUHI change and current size of the cities is also obtained. The study highlights the role of city properties (i.e., size, elevation, and surrounding land-cover) in shaping their current and future SUHI characteristics. The results from this analysis will help decision-makers to manage Canadian cities more efficiently under rapidly changing geophysical and demographical conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Building a database for statistical characterization of ELMs on DIII-D

    NASA Astrophysics Data System (ADS)

    Fritch, B. J.; Marinoni, A.; Bortolon, A.

    2017-10-01

    Edge localized modes (ELMs) are bursty instabilities which occur in the edge region of H-mode plasmas and have the potential to damage in-vessel components of future fusion machines by exposing the divertor region to large energy and particle fluxes during each ELM event. While most ELM studies focus on average quantities (e.g. energy loss per ELM), this work investigates the statistical distributions of ELM characteristics, as a function of plasma parameters. A semi-automatic algorithm is being used to create a database documenting trigger times of the tens of thousands of ELMs for DIII-D discharges in scenarios relevant to ITER, thus allowing statistically significant analysis. Probability distributions of inter-ELM periods and energy losses will be determined and related to relevant plasma parameters such as density, stored energy, and current in order to constrain models and improve estimates of the expected inter-ELM periods and sizes, both of which must be controlled in future reactors. Work supported in part by US DoE under the Science Undergraduate Laboratory Internships (SULI) program, DE-FC02-04ER54698 and DE-FG02- 94ER54235.

  14. Statistical Process Control for KSC Processing

    NASA Technical Reports Server (NTRS)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as was an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation as needs arise.
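The core of an SPC course of the kind described is the Shewhart control chart; a minimal sketch follows. The subgroup values are made up for illustration, and estimating sigma directly from subgroup means is a simplification (standard practice derives it from subgroup ranges or standard deviations).

```python
import statistics

def xbar_limits(subgroup_means):
    """Center line and 3-sigma control limits for a Shewhart X-bar chart,
    estimated from historical subgroup means (simplified sigma estimate)."""
    center = statistics.mean(subgroup_means)
    s = statistics.stdev(subgroup_means)
    return center - 3 * s, center, center + 3 * s

# Hypothetical historical subgroup means from a stable process
lo, center, hi = xbar_limits([10.1, 9.9, 10.0, 10.2, 9.8])

# Flag later subgroups that fall outside the control limits
flagged = [x for x in [10.0, 10.9, 9.95] if not (lo <= x <= hi)]
```

Points inside the limits are treated as common-cause variation; a point like 10.9 outside them signals special-cause variation worth investigating, which is the trend-analysis use case the Safety and Mission Assurance group had.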

  15. Education Vital Signs: Population.

    ERIC Educational Resources Information Center

    Zakariya, Sally Banks

    1985-01-01

    Population changes and demographics shape the future of public schools. Includes statistics on ethnic makeup of student population, the projected baby boomlet, children of working mothers, households without children, and the aging population. (MD)

  16. Surveying Future Surveys

    NASA Astrophysics Data System (ADS)

    Carlstrom, John E.

    2016-06-01

    The now standard model of cosmology has been tested and refined by the analysis of increasingly sensitive, large astronomical surveys, especially with statistically significant millimeter-wave surveys of the cosmic microwave background and optical surveys of the distribution of galaxies. This talk will offer a glimpse of the future, which promises an acceleration of this trend with cosmological information coming from new surveys across the electromagnetic spectrum as well as particles and even gravitational waves.

  17. Updated Intensity - Duration - Frequency Curves Under Different Future Climate Scenarios

    NASA Astrophysics Data System (ADS)

    Ragno, E.; AghaKouchak, A.

    2016-12-01

    Current infrastructure design procedures rely on the use of Intensity - Duration - Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of the existing and future infrastructures. Here we employ historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on climatic model projections. This presentation summarizes the projected changes in statistics of extremes. We show that, based on CMIP5 simulations, extreme precipitation events in some urban areas can be 20% more severe in the future, even when projected annual mean precipitation is expected to remain similar to the ground-based climatology.
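A stationary IDF analysis of the kind the abstract contrasts against rests on fitting a generalized extreme value (GEV) distribution to annual maxima and reading off return levels. The sketch below uses synthetic data and scipy's maximum-likelihood fit; the paper itself uses a Bayesian inference approach and real station/CMIP5 series, so this is only the stationary baseline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic annual-maximum hourly precipitation (mm), a stand-in for an
# 80-year station record (parameters here are assumptions for illustration)
annual_max = stats.genextreme.rvs(c=-0.1, loc=30, scale=8, size=80,
                                  random_state=rng)

# Stationary GEV fit; the T-year return level is the quantile exceeded
# with probability 1/T in any given year.
shape, loc, scale = stats.genextreme.fit(annual_max)
rl_100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
```

The non-stationary extension makes the location (or scale) parameter a function of time or a covariate such as temperature, so the 100-year return level itself drifts upward; that drift is what separates the updated IDF curves from their stationary counterparts.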

  18. Bureau of Labor Statistics Employment Projections: Detailed Analysis of Selected Occupations and Industries. Report to the Honorable Berkley Bedell, United States House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    To compile its projections of future employment levels, the Bureau of Labor Statistics (BLS) combines the following five interlinked models in a six-step process: a labor force model, an econometric model of the U.S. economy, an industry activity model, an industry labor demand model, and an occupational labor demand model. The BLS was asked to…

  19. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Before big data can be used correctly in official statistics, many questions need to be answered and problems solved: the quality of data, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have agreed on a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  20. A Sorting Statistic with Application in Neurological Magnetic Resonance Imaging of Autism.

    PubMed

    Levman, Jacob; Takahashi, Emi; Forgeron, Cynthia; MacDonald, Patrick; Stewart, Natalie; Lim, Ashley; Martel, Anne

    2018-01-01

    Effect size refers to the assessment of the extent of differences between two groups of samples on a single measurement. Assessing effect size in medical research is typically accomplished with Cohen's d statistic. Cohen's d statistic assumes that average values are good estimators of the position of a distribution of numbers and also assumes Gaussian (or bell-shaped) underlying data distributions. In this paper, we present an alternative evaluative statistic that can quantify differences between two data distributions in a manner that is similar to traditional effect size calculations; however, the proposed approach avoids making assumptions regarding the shape of the underlying data distribution. The proposed sorting statistic is compared with Cohen's d statistic and is demonstrated to be capable of identifying feature measurements of potential interest for which Cohen's d statistic implies the measurement would be of little use. This proposed sorting statistic has been evaluated on a large clinical autism dataset from Boston Children's Hospital, Harvard Medical School, demonstrating that it can potentially play a constructive role in future healthcare technologies.
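The abstract does not define the sorting statistic itself, so the sketch below implements only the baseline it is compared against: Cohen's d with a pooled standard deviation, whose mean-and-Gaussian assumptions are exactly what the proposed statistic avoids.

```python
import math

def cohens_d(x, y):
    """Cohen's d with pooled standard deviation. It compares group means,
    so it presumes the mean locates each distribution well and that both
    groups are roughly Gaussian."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)  # unbiased variances
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    pooled = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled

d = cohens_d([5.0, 6.0, 7.0], [3.0, 4.0, 5.0])  # = 2.0 for these toy groups
```

On skewed or multimodal feature distributions, d can sit near zero even when the two groups differ in shape, which is the failure mode motivating a distribution-free alternative.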

  1. A Sorting Statistic with Application in Neurological Magnetic Resonance Imaging of Autism

    PubMed Central

    Takahashi, Emi; Lim, Ashley; Martel, Anne

    2018-01-01

    Effect size refers to the assessment of the extent of differences between two groups of samples on a single measurement. Assessing effect size in medical research is typically accomplished with Cohen's d statistic. Cohen's d statistic assumes that average values are good estimators of the position of a distribution of numbers and also assumes Gaussian (or bell-shaped) underlying data distributions. In this paper, we present an alternative evaluative statistic that can quantify differences between two data distributions in a manner that is similar to traditional effect size calculations; however, the proposed approach avoids making assumptions regarding the shape of the underlying data distribution. The proposed sorting statistic is compared with Cohen's d statistic and is demonstrated to be capable of identifying feature measurements of potential interest for which Cohen's d statistic implies the measurement would be of little use. This proposed sorting statistic has been evaluated on a large clinical autism dataset from Boston Children's Hospital, Harvard Medical School, demonstrating that it can potentially play a constructive role in future healthcare technologies. PMID:29796236

  2. Future Extreme Heat Scenarios to Enable the Assessment of Climate Impacts on Public Health over the Coterminous U.S

    NASA Astrophysics Data System (ADS)

    Quattrochi, D. A.; Crosson, W. L.; Al-Hamdan, M. Z.; Estes, M. G., Jr.

    2013-12-01

    In the United States, extreme heat is the most deadly weather-related hazard. In the face of a warming climate and urbanization, which contributes to local-scale urban heat islands, it is very likely that extreme heat events (EHEs) will become more common and more severe in the U.S. This research seeks to provide historical and future measures of climate-driven extreme heat events to enable assessments of the impacts of heat on public health over the coterminous U.S. We use atmospheric temperature and humidity information from meteorological reanalysis and from Global Climate Models (GCMs) to provide data on past and future heat events. The focus of research is on providing assessments of the magnitude, frequency and geographic distribution of extreme heat in the U.S. to facilitate public health studies. In our approach, long-term climate change is captured with GCM outputs, and the temporal and spatial characteristics of short-term extremes are represented by the reanalysis data. Two future time horizons for 2040 and 2090 are compared to the recent past period of 1981-2000. We characterize regional-scale temperature and humidity conditions using GCM outputs for two climate change scenarios (A2 and A1B) defined in the Special Report on Emissions Scenarios (SRES). For each future period, 20 years of multi-model GCM outputs are analyzed to develop a 'heat stress climatology' based on statistics of extreme heat indicators. Differences between the two future periods and the past period are used to define temperature and humidity changes on a monthly time scale and regional spatial scale. These changes are combined with the historical meteorological data, which is hourly and at a spatial scale (12 km) much finer than that of GCMs, to create future climate realizations.
From these realizations, we compute the daily heat stress measures and related spatially-specific climatological fields, such as the mean annual number of days above certain thresholds of maximum and minimum air temperatures, heat indices, and a new heat stress variable developed as part of this research that gives an integrated measure of heat stress (and relief) over the course of a day. Comparisons are made between projected (2040 and 2090) and past (1990) heat stress statistics. Outputs are aggregated to the county level, which is a popular scale of analysis for public health interests. County-level statistics are made available to public health researchers by the Centers for Disease Control and Prevention (CDC) via the Wide-ranging Online Data for Epidemiologic Research (WONDER) system. This addition of heat stress measures to CDC WONDER allows decision and policy makers to assess the impact of alternative approaches to optimize the public health response to EHEs. Through CDC WONDER, users are able to spatially and temporally query public health and heat-related data sets and create county-level maps and statistical charts of such data across the coterminous U.S.
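
    One of the simplest indicators in such a heat stress climatology, the mean annual number of days above a maximum-temperature threshold, can be sketched as follows (the threshold and the two-year record are illustrative, not values from CDC WONDER):

```python
def days_above_threshold(daily_tmax_by_year, threshold_c):
    """Mean annual number of days whose maximum temperature exceeds
    `threshold_c`, given a mapping of year -> list of daily Tmax (deg C).
    """
    counts = [sum(1 for t in tmax if t > threshold_c)
              for tmax in daily_tmax_by_year.values()]
    return sum(counts) / len(counts)

# Illustrative two-year record of daily maximum temperatures (deg C):
record = {2040: [34.0, 36.5, 31.2, 38.9], 2041: [29.8, 37.1, 34.4, 33.0]}
mean_days = days_above_threshold(record, 35.0)  # -> 1.5
```

    In the study this kind of count would be computed per county from the hourly 12 km realizations before aggregation.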

  3. Future Extreme Heat Scenarios to Enable the Assessment of Climate Impacts on Public Health over the Coterminous U.S.

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Crosson, William L.; Al-Hamdan, Mohammad Z.; Estes, Maurice G., Jr.

    2013-01-01

    In the United States, extreme heat is the most deadly weather-related hazard. In the face of a warming climate and urbanization, which contributes to local-scale urban heat islands, it is very likely that extreme heat events (EHEs) will become more common and more severe in the U.S. This research seeks to provide historical and future measures of climate-driven extreme heat events to enable assessments of the impacts of heat on public health over the coterminous U.S. We use atmospheric temperature and humidity information from meteorological reanalysis and from Global Climate Models (GCMs) to provide data on past and future heat events. The focus of research is on providing assessments of the magnitude, frequency and geographic distribution of extreme heat in the U.S. to facilitate public health studies. In our approach, long-term climate change is captured with GCM outputs, and the temporal and spatial characteristics of short-term extremes are represented by the reanalysis data. Two future time horizons for 2040 and 2090 are compared to the recent past period of 1981-2000. We characterize regional-scale temperature and humidity conditions using GCM outputs for two climate change scenarios (A2 and A1B) defined in the Special Report on Emissions Scenarios (SRES). For each future period, 20 years of multi-model GCM outputs are analyzed to develop a 'heat stress climatology' based on statistics of extreme heat indicators. Differences between the two future periods and the past period are used to define temperature and humidity changes on a monthly time scale and regional spatial scale. These changes are combined with the historical meteorological data, which is hourly and at a spatial scale (12 km), to create future climate realizations.
From these realizations, we compute the daily heat stress measures and related spatially-specific climatological fields, such as the mean annual number of days above certain thresholds of maximum and minimum air temperatures, heat indices, and a new heat stress variable developed as part of this research that gives an integrated measure of heat stress (and relief) over the course of a day. Comparisons are made between projected (2040 and 2090) and past (1990) heat stress statistics. Outputs are aggregated to the county level, which is a popular scale of analysis for public health interests. County-level statistics are made available to public health researchers by the Centers for Disease Control and Prevention (CDC) via the Wide-ranging Online Data for Epidemiologic Research (WONDER) system. This addition of heat stress measures to CDC WONDER allows decision and policy makers to assess the impact of alternative approaches to optimize the public health response to EHEs. Through CDC WONDER, users are able to spatially and temporally query public health and heat-related data sets and create county-level maps and statistical charts of such data across the coterminous U.S.

  4. Cancer Statistics

    MedlinePlus

    ... 1,790 died of the disease. Estimated national expenditures for cancer care in the United States in 2017 were $147.3 billion. In future years, costs are likely to increase as the population ages and cancer prevalence increases. ...

  5. Effects of future climate conditions on terrestrial export from coastal southern California

    NASA Astrophysics Data System (ADS)

    Feng, D.; Zhao, Y.; Raoufi, R.; Beighley, E.; Melack, J. M.

    2015-12-01

    The Santa Barbara Coastal - Long Term Ecological Research Project (SBC-LTER) is focused on investigating the relative importance of land and ocean processes in structuring giant kelp forest ecosystems. Understanding how current and future climate conditions influence terrestrial export is a central theme for the project. Here we combine the Hillslope River Routing (HRR) model and daily precipitation and temperature downscaled using statistical downscaling based on Localized Constructed Analogs (LOCA) to estimate recent streamflow dynamics (2000 to 2014) and future conditions (2015 to 2100). The HRR model covers the SBC-LTER watersheds from just west of the Ventura River to Point Conception; a land area of roughly 800 km2 with 179 watersheds ranging from 0.1 to 123 km2. The downscaled climate conditions have a spatial resolution of 6 km by 6 km. Here, we use the Penman-Monteith method with the Food and Agriculture Organization of the United Nations (FAO) limited climate data approximations and land surface conditions (albedo, leaf area index, land cover) measured from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) on the Terra and Aqua satellites to estimate potential evapotranspiration (PET). The HRR model is calibrated for the period 2000 to 2014 using USGS and LTER streamflow. An automated calibration technique is used. For future climate scenarios, we use mean 8-day land cover conditions. Future streamflow, ET and soil moisture statistics are presented and based on downscaled P and T from ten climate model projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5).
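
    The FAO limited-climate-data approach the record refers to is commonly implemented as the FAO-56 Penman-Monteith reference evapotranspiration. A sketch under the usual simplifications (sea-level psychrometric constant, zero soil heat flux); all forcing values below are illustrative, not taken from the study:

```python
from math import exp

def fao56_reference_et(t_mean, rn, u2, rh_frac, g=0.0, gamma=0.0665):
    """FAO-56 reference evapotranspiration (mm/day).

    t_mean  : mean daily air temperature (deg C)
    rn      : net radiation (MJ m-2 day-1)
    u2      : wind speed at 2 m (m s-1)
    rh_frac : relative humidity as a fraction (0-1)
    gamma   : psychrometric constant (kPa/degC), ~0.0665 at sea level
    """
    es = 0.6108 * exp(17.27 * t_mean / (t_mean + 237.3))  # saturation vapour pressure (kPa)
    ea = rh_frac * es                                      # actual vapour pressure (kPa)
    delta = 4098.0 * es / (t_mean + 237.3) ** 2            # slope of the saturation curve
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# A mild, breezy day: roughly 5 mm/day of reference ET
et0 = fao56_reference_et(t_mean=20.0, rn=15.0, u2=2.0, rh_frac=0.60)
```

    The study would feed downscaled LOCA temperature and MODIS-derived surface conditions into this kind of formulation rather than the fixed constants assumed here.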

6. A review of CDC's Web-based Injury Statistics Query and Reporting System (WISQARS™): Planning for the future of injury surveillance

    PubMed Central

    Ballesteros, Michael F.; Webb, Kevin; McClure, Roderick J.

    2017-01-01

    Introduction: The Centers for Disease Control and Prevention (CDC) developed the Web-based Injury Statistics Query and Reporting System (WISQARS™) to meet the data needs of injury practitioners. In 2015, CDC completed a Portfolio Review of this system to inform its future development. Methods: Evaluation questions addressed utilization, technology and innovation, data sources, and tools and training. Data were collected through environmental scans, a review of peer-reviewed and grey literature, a web search, and stakeholder interviews. Results: Review findings led to specific recommendations for each evaluation question. Response: CDC reviewed each recommendation and initiated several enhancements that will improve the ability of injury prevention practitioners to leverage these data, better make sense of query results, and incorporate findings and key messages into prevention practices. PMID:28454867

  7. Machine Learning Approaches for Clinical Psychology and Psychiatry.

    PubMed

    Dwyer, Dominic B; Falkai, Peter; Koutsouleris, Nikolaos

    2018-05-07

    Machine learning approaches for clinical psychology and psychiatry explicitly focus on learning statistical functions from multidimensional data sets to make generalizable predictions about individuals. The goal of this review is to provide an accessible understanding of why this approach is important for future practice given its potential to augment decisions associated with the diagnosis, prognosis, and treatment of people suffering from mental illness using clinical and biological data. To this end, the limitations of current statistical paradigms in mental health research are critiqued, and an introduction is provided to critical machine learning methods used in clinical studies. A selective literature review is then presented aiming to reinforce the usefulness of machine learning methods and provide evidence of their potential. In the context of promising initial results, the current limitations of machine learning approaches are addressed, and considerations for future clinical translation are outlined.

  8. Statistical modeling of daily and subdaily stream temperatures: Application to the Methow River Basin, Washington

    NASA Astrophysics Data System (ADS)

    Caldwell, R. J.; Gangopadhyay, S.; Bountry, J.; Lai, Y.; Elsner, M. M.

    2013-07-01

    Management of water temperatures in the Columbia River Basin (Washington) is critical because water projects have substantially altered the habitat of Endangered Species Act listed species, such as salmon, throughout the basin. This is most important in tributaries to the Columbia, such as the Methow River, where the spawning and rearing life stages of these cold water fishes occur. Climate change projections generally predict increasing air temperatures across the western United States, with less confidence regarding shifts in precipitation. As air temperatures rise, we anticipate a corresponding increase in water temperatures, which may alter the timing and availability of habitat for fish reproduction and growth. To assess the impact of future climate change in the Methow River, we couple historical climate and future climate projections with a statistical modeling framework to predict daily mean stream temperatures. A K-nearest neighbor algorithm is also employed to: (i) adjust the climate projections for biases compared to the observed record and (ii) provide a reference for performing spatiotemporal disaggregation in future hydraulic modeling of stream habitat. The statistical models indicate the primary drivers of stream temperature are maximum and minimum air temperature and stream flow, and show reasonable predictive skill. When compared to the historical reference time period of 1916-2006, we conclude that increases in stream temperature are expected to occur at each subsequent time horizon representative of the years 2020, 2040, and 2080, with an increase of 0.8 ± 1.9°C by the year 2080.
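
    The regression-style relationship the record describes can be illustrated in a reduced, one-predictor form. The study's models use maximum and minimum air temperature plus stream flow as drivers; the single predictor and all numbers below are synthetic:

```python
def ols_fit(x, y):
    """Ordinary least squares for y = a + b*x (one predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Synthetic pairs of (daily max air temp, daily mean stream temp), deg C:
air = [10.0, 15.0, 20.0, 25.0, 30.0]
stream = [6.0, 8.5, 11.0, 13.5, 16.0]
a, b = ols_fit(air, stream)          # exact line here: a = 1.0, b = 0.5
projected = a + b * (30.0 + 2.7)     # stream temp if peak air temp warms 2.7 degC
```

    A fitted slope like b translates a projected air-temperature increase directly into a stream-temperature increase, which is the logic behind the 0.8 °C projection in the abstract.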

  9. Characteristics of real futures trading networks

    NASA Astrophysics Data System (ADS)

    Wang, Junjie; Zhou, Shuigeng; Guan, Jihong

    2011-01-01

    Futures trading is the core of futures business, and it is considered as one of the typical complex systems. To investigate the complexity of futures trading, we employ the analytical method of complex networks. First, we use real trading records from the Shanghai Futures Exchange to construct futures trading networks, in which nodes are trading participants, and two nodes have a common edge if the two corresponding investors appear simultaneously in at least one trading record as a purchaser and a seller, respectively. Then, we conduct a comprehensive statistical analysis on the constructed futures trading networks. Empirical results show that the futures trading networks exhibit features such as scale-free behavior with interesting odd-even-degree divergence in low-degree regions, small-world effect, hierarchical organization, power-law betweenness distribution, disassortative mixing, and shrinkage of both the average path length and the diameter as network size increases. To the best of our knowledge, this is the first work that uses real data to study futures trading networks, and we argue that the research results can shed light on the nature of real futures business.
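
    The network construction step described above, with trading participants as nodes and an edge for each buyer-seller co-occurrence, can be sketched as follows (the participant IDs and trade records are invented):

```python
from collections import defaultdict

def build_trading_network(trades):
    """Build an undirected trading network from (buyer, seller) records:
    nodes are participants; an edge joins any pair that appears together
    in at least one trade."""
    adj = defaultdict(set)
    for buyer, seller in trades:
        adj[buyer].add(seller)
        adj[seller].add(buyer)
    return adj

# Invented trade records; repeated pairs collapse to a single edge:
trades = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("D", "A")]
net = build_trading_network(trades)
degrees = {node: len(nbrs) for node, nbrs in net.items()}  # A has degree 3
```

    Statistics such as the degree distribution and average path length studied in the paper would then be computed on this adjacency structure.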

  10. Using statistical models to explore ensemble uncertainty in climate impact studies: the example of air pollution in Europe

    NASA Astrophysics Data System (ADS)

    Lemaire, Vincent E. P.; Colette, Augustin; Menut, Laurent

    2016-03-01

    Because of its sensitivity to unfavorable weather patterns, air pollution is sensitive to climate change so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, the computing cost of such methods requires optimizing ensemble exploration techniques. By using a training data set from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for eight regions in Europe and developed statistical models that could be used to predict air pollutant concentrations. The evolution of the key climate variables driving either particulate or gaseous pollution allows selecting the members of the EuroCordex ensemble of regional climate projections that should be used in priority for future air quality projections (CanESM2/RCA4, CNRM-CM5-LR/RCA4, CSIRO-Mk3-6-0/RCA4 and MPI-ESM-LR/CCLM, following the EuroCordex terminology). After having tested the validity of the statistical model in predictive mode, we can provide ranges of uncertainty attributed to the spread of the regional climate projection ensemble by the end of the century (2071-2100) for the RCP8.5 scenario. In the three regions where the statistical model of the impact of climate change on PM2.5 offers satisfactory performance, we find a climate benefit (a decrease of PM2.5 concentrations under future climate) of -1.08 (±0.21), -1.03 (±0.32), -0.83 (±0.14) µg m-3 for Eastern Europe, Mid-Europe and Northern Italy, respectively. In the British-Irish Isles, Scandinavia, France, the Iberian Peninsula and the Mediterranean, the statistical model is not considered skillful enough to draw any conclusion for PM2.5.
In Eastern Europe, France, the Iberian Peninsula, Mid-Europe and Northern Italy, the statistical model of the impact of climate change on ozone was considered satisfactory and it confirms the climate penalty bearing upon ozone of 10.51 (±3.06), 11.70 (±3.63), 11.53 (±1.55), 9.86 (±4.41), 4.82 (±1.79) µg m-3, respectively. In the British-Irish Isles, Scandinavia and the Mediterranean, the skill of the statistical model was not considered robust enough to draw any conclusion for ozone pollution.

  11. Future of the geoscience profession

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carleton, A.T.

    1995-05-01

    I want to discuss the future of the energy industry and the geoscience profession. That's you and me. Is there a future for us? Will there be a need for petroleum? What will we use for energy in the future? Over the past several years, those of us in the energy business have witnessed remarkable changes in our industry and our profession. We must be able to change with the conditions if we are to survive them. To do so, some idea of what the future holds is essential. I will discuss what that future may be and will cover these topics: world population and energy demand, exploration and production outlook, environmental considerations, geoscience demographics, education, technology, and government. Much of the statistical data and some of the projections I will discuss have been taken from the report of AAPG's 21st Century Committee, of which I was a member.

  12. Examination of Solar Cycle Statistical Model and New Prediction of Solar Cycle 23

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.

    2000-01-01

    Sunspot numbers in the current solar cycle 23 were estimated by using a statistical model with the accumulating cycle sunspot data, based on the odd-even behavior of historical sunspot cycles 1 through 22. Since cycle 23 has progressed and the solar minimum occurrence has been accurately defined, the statistical model is validated by comparing the previous prediction with the newly measured sunspot numbers; an improved short-range sunspot projection is made accordingly. The current cycle is expected to have a moderate level of activity. Errors of this model are shown to be self-correcting as cycle observations become available.

  13. The Influence of Statistical versus Exemplar Appeals on Indian Adults' Health Intentions: An Investigation of Direct Effects and Intervening Persuasion Processes.

    PubMed

    McKinley, Christopher J; Limbu, Yam; Jayachandran, C N

    2017-04-01

    In two separate investigations, we examined the persuasive effectiveness of statistical versus exemplar appeals on Indian adults' smoking cessation and mammography screening intentions. To more comprehensively address persuasion processes, we explored whether message response and perceived message effectiveness functioned as antecedents to persuasive effects. Results showed that statistical appeals led to higher levels of health intentions than exemplar appeals. In addition, findings from both studies indicated that statistical appeals stimulated more attention and were perceived as more effective than anecdotal accounts. Among male smokers, statistical appeals also generated greater cognitive processing than exemplar appeals. Subsequent mediation analyses revealed that message response and perceived message effectiveness fully carried the influence of appeal format on health intentions. Given these findings, future public health initiatives conducted among similar populations should design messages that include substantive factual information while ensuring that this content is perceived as credible and valuable.

  14. Statistics, Computation, and Modeling in Cosmology

    NASA Astrophysics Data System (ADS)

    Jewell, Jeff; Guiness, Joe; SAMSI 2016 Working Group in Cosmology

    2017-01-01

    Current and future ground- and space-based missions are designed to not only detect, but map out with increasing precision, details of the universe in its infancy to the present-day. As a result we are faced with the challenge of analyzing and interpreting observations from a wide variety of instruments to form a coherent view of the universe. Finding solutions to a broad range of challenging inference problems in cosmology is one of the goals of the “Statistics, Computation, and Modeling in Cosmology” working groups, formed as part of the year-long program on ‘Statistical, Mathematical, and Computational Methods for Astronomy’, hosted by the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation funded institute. Two application areas have emerged for focused development in the cosmology working group involving advanced algorithmic implementations of exact Bayesian inference for the Cosmic Microwave Background, and statistical modeling of galaxy formation. The former includes study and development of advanced Markov Chain Monte Carlo algorithms designed to confront challenging inference problems including inference for spatial Gaussian random fields in the presence of sources of galactic emission (an example of a source separation problem). Extending these methods to future redshift survey data probing the nonlinear regime of large scale structure formation is also included in the working group activities. In addition, the working group is also focused on the study of ‘Galacticus’, a galaxy formation model applied to dark matter-only cosmological N-body simulations operating on time-dependent halo merger trees. The working group is interested in calibrating the Galacticus model to match statistics of galaxy survey observations; specifically stellar mass functions, luminosity functions, and color-color diagrams.
The group will use subsampling approaches and fractional factorial designs to statistically and computationally efficiently explore the Galacticus parameter space. The group will also use the Galacticus simulations to study the relationship between the topological and physical structure of the halo merger trees and the properties of the resulting galaxies.

  15. A Predictive Statistical Model of Navy Career Enlisted Retention Behavior Utilizing Economic Variables.

    DTIC Science & Technology

    1980-12-01

    career retention rates, and to predict future career retention rates in the Navy. The statistical model utilizes economic variables as predictors...The model developed has a high correlation with Navy career retention rates. The problem of Navy career retention has not been adequately studied...findings indicate Navy policymakers must be cognizant of the relationships of economic factors to Navy career retention rates.

  16. Australian Disaster Research Directory (Including Some Contributions from New Zealand). Provisional--1983.

    DTIC Science & Technology

    1983-06-01

    storm surge, cyclone, fire) * social and physical effects of nuclear attack * volcanic hazards * statistics of abnormal sea levels * management of high...strengths and weaknesses of these responses * impact of environmental change on present and future disaster strategies * SOME QUESTIONNAIRE STATISTICS

  17. Planetary Defense Legacy for a Certain Future

    DTIC Science & Technology

    1998-04-01

    hyperbole. Although I could accept prior impacts as historical fact, having seen Meteor Crater in Arizona and accepted the evidence presented by Luis Alvarez...context of impersonal numbers or statistics, the lives of individuals lose meaning. A threat that puts 100 people at risk is likely to be seen as quite...automobiles even though air travel is statistically safer. Some sociologists have estimated that a risk of death of 1 in 1 million is the public's

  18. Statistical wave climate projections for coastal impact assessments

    NASA Astrophysics Data System (ADS)

    Camus, P.; Losada, I. J.; Izaguirre, C.; Espejo, A.; Menéndez, M.; Pérez, J.

    2017-09-01

    Global multimodel wave climate projections are obtained at 1.0° × 1.0° scale from 30 Coupled Model Intercomparison Project Phase 5 (CMIP5) global circulation model (GCM) realizations. A semi-supervised weather-typing approach based on a characterization of the ocean wave generation areas and the historical wave information from the recent GOW2 database are used to train the statistical model. This framework is also applied to obtain high resolution projections of coastal wave climate and coastal impacts such as port operability and coastal flooding. Regional projections are estimated using the collection of weather types at spacing of 1.0°. This assumption is feasible because the predictor is defined based on the wave generation area and the classification is guided by the local wave climate. The assessment of future changes in coastal impacts is based on direct downscaling of indicators defined by empirical formulations (total water level for coastal flooding and number of hours per year with overtopping for port operability). Global multimodel projections of the significant wave height and peak period are consistent with changes obtained in previous studies. Statistical confidence in expected changes is obtained due to the large number of GCMs used to construct the ensemble. The proposed methodology proves flexible for projecting wave climate at different spatial scales. Regional changes of additional variables such as wave direction or other statistics can be estimated from the future empirical distribution with extreme values restricted to high percentiles (i.e., 95th, 99th percentiles). The statistical framework can also be applied to evaluate regional coastal impacts integrating changes in storminess and sea level rise.

  19. Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.

    PubMed

    Yalch, Matthew M

    2016-03-01

    Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more.
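
    One concrete way Bayesian methods address the small-sample problem the article highlights is the conjugate normal update, where an informative prior stabilizes the estimate of a mean. A sketch with invented symptom scores; this illustrates the general approach, not the article's applied example:

```python
def normal_posterior(prior_mean, prior_sd, data, data_sd):
    """Conjugate update for the mean of a normal likelihood with known
    observation SD `data_sd` and a normal prior on the mean.
    Returns (mean, sd) of the posterior."""
    n = len(data)
    prior_prec = 1.0 / prior_sd ** 2          # prior precision
    data_prec = n / data_sd ** 2              # precision contributed by the data
    post_prec = prior_prec + data_prec
    post_mean = (prior_prec * prior_mean + data_prec * (sum(data) / n)) / post_prec
    return post_mean, post_prec ** -0.5

# Three invented symptom scores with an informative prior centered at 50:
m, s = normal_posterior(prior_mean=50.0, prior_sd=10.0,
                        data=[62.0, 58.0, 66.0], data_sd=12.0)
```

    The posterior mean lands between the prior (50) and the sample mean (62), pulled toward the data in proportion to how much evidence the small sample carries; with n this small, a frequentist point estimate would rest on the sample mean alone.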

  20. Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City

    NASA Astrophysics Data System (ADS)

    Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo

    2014-05-01

    Socio-economic changes as well as climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in particular flood risk. The level of future uncertainty that researchers face when dealing with problems in a future perspective with focus on climate change is known as Deep Uncertainty (also known as Knightian uncertainty): nobody has experienced those changes before, our knowledge is limited to the extent that we have no notion of probabilities, and therefore consolidated risk management approaches have limited potential. Deep uncertainty refers to circumstances in which analysts and experts do not know, or parties to decision making cannot agree on: i) the appropriate models describing the interaction among system variables, ii) probability distributions to represent uncertainty about key parameters in the models, and iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them with not a single, optimal solution to the problem at hand, such as crisp estimates for the costs of damages of the natural hazards considered, but instead ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to substitute optimality as a decision criterion with robustness. Under conditions of deep uncertainty, decision-makers do not have statistical and mathematical bases to identify optimal solutions; instead they should prefer to implement "robust" decisions that perform relatively well over all conceivable outcomes of all future unknown scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics that usually can be derived from observed historical data, and therefore we turn to non-statistical measures such as scenario analysis.
We construct several plausible scenarios, each being a full description of what may happen in the future, based on a meaningful synthesis of parameter values with control of their correlations to maintain internal consistency. This paper aims at incorporating a set of data mining and sampling tools to assess uncertainty of model outputs under future climatic and socio-economic changes for Dhaka City, and at providing a decision support system for robust flood management and mitigation policies. After constructing an uncertainty matrix to identify the main sources of uncertainty for Dhaka City, we derive several hazard and vulnerability maps based on future climatic and socio-economic scenarios. The vulnerability of each flood management alternative under different sets of scenarios is determined, and finally the robustness of each plausible solution considered is defined based on the above assessment.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Richard Hess; Jacob J. Jacobson; Richard Nelson

    This report updates the current status and future potential of U.S. biomass resources (residues, energy crops, and woody resources) for domestic and export markets. It includes energy and fuel production and consumption statistics, driving policies, targets, and government investment in bioenergy industry development.

  2. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future

    PubMed Central

    Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.

    2017-01-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968

  3. [Dental practitioners in Israel: past, present and future].

    PubMed

    Mann, J; Vered, Y; Zini, A

    2010-04-01

    Since 1980, various studies dealing with dental manpower issues have been published in Israel, utilizing methods such as the manpower-to-population ratio. The dental literature indicated that dentistry in Israel has an oversupply of dentists and that its manpower-to-population ratio, 1:770, is one of the highest in the world. All studies were based on information provided by the Ministry of Health, which showed that Israel has over 9,500 dentists. The Israel Central Bureau of Statistics figures showed a much smaller number: 5,700 active dentists. Strict examination of the data behind this enormous gap between the two sources of information revealed that the Bureau of Statistics figures are reliable; hence, the real manpower-to-population ratio in Israel in 2008 was 1:1271. Prediction of manpower is extremely important, and baseline information is crucial for future evaluations.

  4. The FORE-SCE model: a practical approach for projecting land cover change using scenario-based modeling

    USGS Publications Warehouse

    Sohl, Terry L.; Sayler, Kristi L.; Drummond, Mark A.; Loveland, Thomas R.

    2007-01-01

    A wide variety of ecological applications require spatially explicit, historic, current, and projected land use and land cover data. The U.S. Land Cover Trends project is analyzing contemporary (1973–2000) land-cover change in the conterminous United States. The newly developed FORE-SCE model used Land Cover Trends data and theoretical, statistical, and deterministic modeling techniques to project future land cover change through 2020 for multiple plausible scenarios. Projected proportions of future land use were initially developed, and then sited on the lands with the highest potential for supporting that land use and land cover using a statistically based stochastic allocation procedure. Three scenarios of 2020 land cover were mapped for the western Great Plains in the US. The model provided realistic, high-resolution, scenario-based land-cover products suitable for multiple applications, including studies of climate and weather variability, carbon dynamics, and regional hydrology.

  5. Historical and Future Projected Hydrologic Extremes over the Midwest and Great Lakes Region

    NASA Astrophysics Data System (ADS)

    Byun, K.; Hamlet, A. F.; Chiu, C. M.

    2016-12-01

    There is an increasing body of evidence from observed data that climate variability combined with regional climate change has had a significant impact on hydrologic cycles, including both seasonal patterns of runoff and altered hydrologic extremes (e.g. floods and extreme stormwater events). To better understand changing patterns of extreme high flows in the Midwest and Great Lakes region, we analyzed long-term historical observations of peak streamflow at different gaging stations. We also conducted hydrologic model experiments using the Variable Infiltration Capacity (VIC) model at 1/16 degree resolution in order to explore the sensitivity of annual peak streamflow, both historically and under temperature and precipitation changes for several future periods. For future projections, the Hybrid Delta statistical downscaling approach applied to the Coupled Model Intercomparison Project, Phase 5 (CMIP5) Global Climate Model (GCM) scenarios was used to produce driving data for the VIC hydrologic model. Preliminary results for several test basins in the Midwest support the hypothesis that there are consistent and statistically significant changes in the mean annual flood before and after about 1975. Future projections using hydrologic model simulations support the hypothesis of higher peak flows due to warming and increasing precipitation projected for the 21st century. We will extend this preliminary analysis using observed data and simulations from 40 river basins in the Midwest to further test these hypotheses.

  6. The Projection of Space Radiation Environments with a Solar Cycle Statistical Model

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; Cucinotta, Francis A.; Wilson, John W.

    2006-01-01

    A solar cycle statistical model has been developed to project sunspot numbers, which represent the variations in the space radiation environment. The resultant projection of sunspot numbers in the near future was coupled to space-related quantities of interest in radiation protection, such as the galactic cosmic radiation (GCR) deceleration potential (f) and the mean occurrence frequency of solar particle events (SPEs). Future GCR fluxes have been derived from a predictive model, in which the GCR temporal dependence represented by f was derived from GCR flux and ground-based Climax neutron monitor rate measurements over the last four decades. Results showed that the point dose equivalent inside a typical spacecraft in interplanetary radiation fields was influenced by solar modulation by up to a factor of three. One important characteristic of sporadic SPEs is their mean frequency of occurrence, which is dependent on solar activity. Projections of the future mean frequency of SPE occurrence were estimated from a power law function of sunspot number. Furthermore, the cumulative probabilities of SPEs during short-period missions were defined with the continuous database of SPE proton fluences. The analytic representation of SPE energy spectra was constructed with the Weibull distribution for different event sizes. The representative exposure level at each event size was estimated to guide protection systems for astronauts during future space exploration missions.
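    The Weibull-type spectral form mentioned above can be sketched as follows. This is a minimal illustration of the functional shape only; the parameter values (f0, e0, k) are hypothetical and would in practice be fitted per event size.

```python
import math

def spe_fluence_above(e_mev, f0, e0, k):
    """Integral proton fluence above energy e_mev using a Weibull-type
    spectral form, F(>E) = F0 * exp(-(E/E0)^k). The parameters f0
    (protons/cm^2), e0 (MeV), and k depend on event size and are
    hypothetical here."""
    return f0 * math.exp(-((e_mev / e0) ** k))

# Fluence falls off as the energy threshold rises, as expected for an SPE
# spectrum; comparing fluences at fixed thresholds across event sizes is
# how representative exposure levels could be ranked.
f30 = spe_fluence_above(30.0, f0=1e9, e0=25.0, k=0.6)
f100 = spe_fluence_above(100.0, f0=1e9, e0=25.0, k=0.6)
```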

  7. How Should Public Administration Education Curriculum Within Indiana Higher Education Institutions Evolve to Reflect the Complex Homeland Security Issues Faced by Future Public Sector Employees?

    DTIC Science & Technology

    2012-03-01

    security education. According to statistical data collected and published by Indiana University for graduation years 2005–2006 through 2009–2010, between... education efforts to the already existing programs of public administration within colleges and universities in Indiana. By using survey data collected... right direction, but it is limiting in scope. According to the National Center for Education Statistics, 23,493 bachelor's degrees were awarded to

  8. Youth Attitudes and Military Service: Findings from Two Decades of Monitoring the Future National Samples of American Youth

    DTIC Science & Technology

    2000-06-01

    data have focused on drug use and related factors, the study content is much broader (as the title implies) and includes vocational and educational... over time. The present reporting, in contrast, focuses primarily on trend data and certain subgroup differences, using simple statistics (percentages... report are included on all forms of the 8th and 10th grade surveys. Tests for the statistical significance of mean differences between data collected in

  9. The 1985 Army Experience Survey. Data Sourcebook and User’s Manual

    DTIC Science & Technology

    1986-01-01

    on the survey data file produced for the 1985 AES. The survey data are available in Operating System (OS) as well as Statistical Analysis System... version of the survey data files was produced using the Statistical Analysis System (SAS). The survey data were also produced in Operating System (OS... impacts upon future enlistments. The OS data file was designed to make the survey data accessible on any IBM-compatible computer system.

  10. Process optimization using combinatorial design principles: parallel synthesis and design of experiment methods.

    PubMed

    Gooding, Owen W

    2004-06-01

    The use of parallel synthesis techniques with statistical design of experiment (DoE) methods is a powerful combination for the optimization of chemical processes. Advances in parallel synthesis equipment and easy to use software for statistical DoE have fueled a growing acceptance of these techniques in the pharmaceutical industry. As drug candidate structures become more complex at the same time that development timelines are compressed, these enabling technologies promise to become more important in the future.

  11. The Use of Statistical Downscaling to Project Regional Climate Changes as they Relate to Future Energy Production

    NASA Astrophysics Data System (ADS)

    Werth, D. W.; O'Steen, L.; Chen, K.; Altinakar, M. S.; Garrett, A.; Aleman, S.; Ramalingam, V.

    2010-12-01

    Global climate change has the potential for profound impacts on society, and poses significant challenges to government and industry in the areas of energy security and sustainability. Given that the ability to exploit energy resources often depends on the climate, the possibility of climate change means we cannot simply assume that the untapped potential of today will still exist in the future. Predictions of future climate are generally based on global climate models (GCMs) which, due to computational limitations, are run at spatial resolutions of hundreds of kilometers. While the results from these models can predict climatic trends averaged over large spatial and temporal scales, their ability to describe the effects of atmospheric phenomena that affect weather on regional to local scales is inadequate. We propose the use of several optimized statistical downscaling techniques that can infer climate change at the local scale from coarse resolution GCM predictions, and apply the results to assess future sustainability for two sources of energy production dependent on adequate water resources: nuclear power (through the dissipation of waste heat from cooling towers, ponds, etc.) and hydroelectric power. All methods will be trained with 20th century data, and applied to data from the years 2040-2049 to get the local-scale changes. Models of cooling tower operation and hydropower potential will then use the downscaled data to predict the possible changes in energy production, and the implications of climate change on plant siting, design, and contribution to the future energy grid can then be examined.

  12. Artificial neural networks in gynaecological diseases: current and potential future applications.

    PubMed

    Siristatidis, Charalampos S; Chrelias, Charalampos; Pouliakis, Abraham; Katsimanis, Evangelos; Kassanos, Dimitrios

    2010-10-01

    Current (and probably future) practice of medicine is mostly associated with prediction and accurate diagnosis. Especially in clinical practice, there is an increasing interest in constructing and using valid models of diagnosis and prediction. Artificial neural networks (ANNs) are mathematical systems being used as a prospective tool for reliable, flexible and quick assessment. They demonstrate high power in evaluating multifactorial data, assimilating information from multiple sources and detecting subtle and complex patterns. Their capability and difference from other statistical techniques lies in performing nonlinear statistical modelling. They represent a new alternative to logistic regression, which is the most commonly used method for developing predictive models for outcomes resulting from partitioning in medicine. In combination with the other non-algorithmic artificial intelligence techniques, they provide useful software engineering tools for the development of systems in quantitative medicine. Our paper first presents a brief introduction to ANNs, then, using what we consider the best available evidence through paradigms, we evaluate the ability of these networks to serve as first-line detection and prediction techniques in some of the most crucial fields in gynaecology. Finally, through the analysis of their current application, we explore their dynamics for future use.

  13. Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)

    NASA Astrophysics Data System (ADS)

    Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.

    2013-12-01

    We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.

  14. Estimating current and future streamflow characteristics at ungaged sites, central and eastern Montana, with application to evaluating effects of climate change on fish populations

    USGS Publications Warehouse

    Sando, Roy; Chase, Katherine J.

    2017-03-23

    A common statistical procedure for estimating streamflow statistics at ungaged locations is to develop a relational model between streamflow and drainage basin characteristics at gaged locations using least squares regression analysis; however, least squares regression methods are parametric and make constraining assumptions about the data distribution. The random forest regression method provides an alternative nonparametric method for estimating streamflow characteristics at ungaged sites and requires that the data meet fewer statistical conditions than least squares regression methods. Random forest regression analysis was used to develop predictive models for 89 streamflow characteristics using Precipitation-Runoff Modeling System simulated streamflow data and drainage basin characteristics at 179 sites in central and eastern Montana. The predictive models were developed from streamflow data simulated for current (baseline, water years 1982–99) conditions and three future periods (water years 2021–38, 2046–63, and 2071–88) under three different climate-change scenarios. These predictive models were then used to predict streamflow characteristics for baseline conditions and three future periods at 1,707 fish sampling sites in central and eastern Montana. The average root mean square error for all predictive models was about 50 percent. When streamflow predictions at 23 fish sampling sites were compared to nearby locations with simulated data, the mean relative percent difference was about 43 percent. When predictions were compared to streamflow data recorded at 21 U.S. Geological Survey streamflow-gaging stations outside of the calibration basins, the average mean absolute percent error was about 73 percent.
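    The three error measures quoted in this record (root mean square error, relative percent difference, and mean absolute percent error) can be sketched with standard-library Python. The function names are illustrative, not taken from the study.

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated values."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def mean_abs_percent_error(obs, sim):
    """Mean absolute percent error, relative to the observed values."""
    return 100.0 * sum(abs((s - o) / o) for o, s in zip(obs, sim)) / len(obs)

def relative_percent_diff(a, b):
    """Relative percent difference between two estimates of the same quantity."""
    return 100.0 * abs(a - b) / ((abs(a) + abs(b)) / 2.0)
```

    Applying these to a streamflow characteristic predicted at paired sites, or at gaging stations outside the calibration basins, mirrors the comparisons reported above.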

  15. Impact of climate change on precipitation and temperature under the RCP 8.5 and A1B scenarios in an Alpine catchment (Alto Genil Basin, southeast Spain). A comparison of statistical downscaling methods

    NASA Astrophysics Data System (ADS)

    Pulido-Velazquez, David; Juan Collados-Lara, Antonio; Pardo-Iguzquiza, Eulogio; Jimeno-Saez, Patricia; Fernandez-Chacon, Francisca

    2016-04-01

    In order to design adaptive strategies to global change, we need to assess the future impact of climate change on water resources, which depends on the precipitation and temperature series in the systems. The objective of this work is to generate future climate series for the "Alto Genil" Basin (southeast Spain) for the period 2071-2100 by perturbing the historical series using different statistical methods. For this purpose we use information from regional climate model (RCM) simulations available from two European projects: CORDEX (2013), with a spatial resolution of 12.5 km, and ENSEMBLES (2009), with a spatial resolution of 25 km. The historical climate series for the period 1971-2000 have been obtained from the Spain02 project (2012), which has the same spatial resolution as the CORDEX project (both use the EURO-CORDEX grid). Two emission scenarios have been considered: the Representative Concentration Pathway (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and the A1B emission scenario of the Fourth Assessment Report (AR4). We use the RCM simulations to create an ensemble of predictions, weighting their information according to their ability to reproduce the main statistics of the historical climatology. A multi-objective analysis has been performed to identify which models are better in terms of goodness of fit to the cited statistics of the historical series. The ensembles for the CORDEX and ENSEMBLES projects have finally been created with nine and four models, respectively. These ensemble series have been used to assess the anomalies in mean and standard deviation (differences between the control and future RCM series). A "delta-change" method (Pulido-Velazquez et al., 2011) has been applied to define future series by modifying the historical climate series in accordance with the cited anomalies in mean and standard deviation. 
A comparison between the results for scenarios A1B and RCP 8.5 has been performed. The reductions obtained in mean rainfall with respect to the historical series are 24.2% and 24.4% respectively, and the increments in temperature are 46.3% and 31.2% respectively. A sensitivity analysis of the results to the statistical downscaling technique employed has been performed. The following techniques have been explored: the perturbation or "delta-change" method; the regression method (a regression function relating the RCM output to the historical information is used to generate future climate series for the fixed period); quantile mapping (which seeks a transformation function relating the observed and modeled variables so that the transformed variable's statistical distribution matches that of the observations); and stochastic weather generators (SWGs), which can be single-site or multi-site (the latter accounting for the spatial correlation of the climate series). A comparative analysis of these techniques has been performed, identifying the advantages and disadvantages of each. Acknowledgments: This research has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds. We would also like to thank the Spain02, ENSEMBLES and CORDEX projects for the data provided for this study.
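    One common formulation of the delta-change perturbation described above, adjusting both the mean and the standard deviation of the historical series, can be sketched as follows. Assumptions: an additive mean shift and multiplicative scaling of departures from the mean; the cited paper's exact scheme may differ.

```python
import statistics

def delta_change(hist, rcm_control, rcm_future):
    """Perturb a historical climate series by the anomalies between RCM
    control and future runs: shift the mean by the control-to-future mean
    difference, and rescale departures from the historical mean by the
    ratio of future to control standard deviations."""
    mh = statistics.mean(hist)
    d_mean = statistics.mean(rcm_future) - statistics.mean(rcm_control)
    ratio = statistics.pstdev(rcm_future) / statistics.pstdev(rcm_control)
    return [mh + d_mean + (x - mh) * ratio for x in hist]
```

    The perturbed series keeps the day-to-day sequencing of the observed record while its mean and variability follow the RCM anomalies.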

  16. Simulation of an ensemble of future climate time series with an hourly weather generator

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.; Kim, J.

    2010-12-01

    There is evidence that climate change is occurring in many regions of the world. Climate change predictions at the local scale and at fine temporal resolution are thus needed for hydrological, ecological, geomorphological, and agricultural applications that can provide thematic insights into the corresponding impacts. Numerous downscaling techniques have been proposed to bridge the gap between the spatial scales adopted in General Circulation Models (GCM) and regional analyses. Nevertheless, the time and spatial resolutions obtained as well as the type of meteorological variables may not be sufficient for detailed studies of climate change effects at the local scales. In this context, this study presents a stochastic downscaling technique that makes use of an hourly weather generator to simulate time series of predicted future climate. Using a Bayesian approach, the downscaling procedure derives distributions of factors of change for several climate statistics from a multi-model ensemble of GCMs. Factors of change are sampled from their distributions using a Monte Carlo technique to entirely account for the probabilistic information obtained with the Bayesian multi-model ensemble. Factors of change are subsequently applied to the statistics derived from observations to re-evaluate the parameters of the weather generator. The weather generator can reproduce a wide set of climate variables and statistics over a range of temporal scales, from extremes, to the low-frequency inter-annual variability. The final result of such a procedure is the generation of an ensemble of hourly time series of meteorological variables that can be considered as representative of future climate, as inferred from GCMs. The generated ensemble of scenarios also accounts for the uncertainty derived from multiple GCMs used in downscaling. 
Applications of the procedure in reproducing present and future climates are presented for different locations world-wide: Tucson (AZ), Detroit (MI), and Firenze (Italy). The stochastic downscaling is carried out with eight GCMs from the CMIP3 multi-model dataset (IPCC 4AR, A1B scenario).

  17. Assessment of Surface Air Temperature over China Using Multi-criterion Model Ensemble Framework

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhu, Q.; Su, L.; He, X.; Zhang, X.

    2017-12-01

    General Circulation Models (GCMs) are designed to simulate the present climate and project future trends. It has been noticed that the performances of GCMs do not always agree with each other over different regions. Model ensemble techniques have been developed to post-process the GCMs' outputs and improve their prediction reliability. To evaluate the performances of GCMs, root-mean-square error, correlation coefficient, and uncertainty are commonly used statistical measures. However, the simultaneous achievement of satisfactory values of all these statistics cannot be guaranteed by many model ensemble techniques. Meanwhile, uncertainties and future scenarios are critical for Water-Energy management and operation. In this study, a new multi-model ensemble framework was proposed. It uses a state-of-the-art evolutionary multi-objective optimization algorithm, termed Multi-Objective Complex Evolution Global Optimization with Principle Component Analysis and Crowding Distance (MOSPD), to derive optimal GCM ensembles and demonstrate the trade-offs among various solutions. Such trade-off information was further analyzed with a robust Pareto front with respect to different statistical measures. A case study was conducted to optimize the surface air temperature (SAT) ensemble solutions over seven geographical regions of China for the historical period (1900-2005) and future projection (2006-2100). The results showed that the ensemble solutions derived with the MOSPD algorithm are superior to the simple model average and any single model output during the historical simulation period. 
For the future prediction, the proposed ensemble framework identified that the largest SAT change would occur in South Central China under the RCP 2.6 scenario, North Eastern China under the RCP 4.5 scenario, and North Western China under the RCP 8.5 scenario, while the smallest SAT change would occur in Inner Mongolia under the RCP 2.6 scenario, South Central China under the RCP 4.5 scenario, and South Central China under the RCP 8.5 scenario.

  18. Comparing multiple statistical methods for inverse prediction in nuclear forensics applications

    DOE PAGES

    Lewis, John R.; Zhang, Adah; Anderson-Cook, Christine Michaela

    2017-10-29

    Forensic science seeks to predict source characteristics using measured observables. Statistically, this objective can be thought of as an inverse problem where interest is in the unknown source characteristics or factors (X) of some underlying causal model producing the observables or responses (Y = g(X) + error). This paper reviews several statistical methods for use in inverse problems and demonstrates that comparing results from multiple methods can be used to assess predictive capability. Motivation for assessing inverse predictions comes from the desired application to historical and future experiments involving nuclear material production for forensics research, in which inverse predictions, along with an assessment of predictive capability, are desired.
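    For the simplest case of a single factor and a linear causal model, the inverse-prediction setup above can be sketched as follows. This illustrates the general idea only, not the paper's methods; the function names are hypothetical.

```python
def fit_linear(xs, ys):
    """Least-squares fit of the forward model y = a + b*x from
    calibration data (known factors xs, measured responses ys)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def inverse_predict(y_new, a, b):
    """Invert the fitted forward model to estimate the unknown source
    characteristic x from a newly measured response y."""
    return (y_new - a) / b
```

    Comparing the inverse predictions of several such fitted models against held-out calibration runs is one way to gauge predictive capability.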

  20. Statistical Considerations of Food Allergy Prevention Studies.

    PubMed

    Bahnson, Henry T; du Toit, George; Lack, Gideon

    Clinical studies to prevent the development of food allergy have recently helped reshape public policy recommendations on the early introduction of allergenic foods. These trials are also prompting new research, and it is therefore important to address the unique design and analysis challenges of prevention trials. We highlight statistical concepts and give recommendations that clinical researchers may wish to adopt when designing future study protocols and analysis plans for prevention studies. Topics include selecting a study sample, addressing internal and external validity, improving statistical power, choosing alpha and beta, analysis innovations to address dilution effects, and analysis methods to deal with poor compliance, dropout, and missing data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
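    As one concrete example of the power considerations mentioned above (choosing alpha and beta), the classical normal-approximation sample size for comparing two proportions can be sketched as follows. This is the textbook formula, not a calculation from the cited paper.

```python
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.8):
    """Sample size per arm to detect a difference between two proportions
    with a two-sided test at significance level alpha and the given power,
    using the standard normal-approximation formula."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # quantile for alpha/2
    z_b = NormalDist().inv_cdf(power)           # quantile for 1 - beta
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)
```

    For instance, detecting a drop in allergy incidence from 20% to 10% at alpha = 0.05 with 80% power requires roughly 200 participants per arm, before any allowance for dropout or non-compliance.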

  1. Modelling climate impact on floods under future emission scenarios using an ensemble of climate model projections

    NASA Astrophysics Data System (ADS)

    Wetterhall, F.; Cloke, H. L.; He, Y.; Freer, J.; Pappenberger, F.

    2012-04-01

    Evidence provided by modelled assessments of climate change impact on flooding is fundamental to water resource and flood risk decision making. Impact models usually rely on climate projections from Global and Regional Climate Models, and there is no doubt that these provide a useful assessment of future climate change. However, cascading ensembles of climate projections into impact models is not straightforward because of the coarse resolution of Global and Regional Climate Models (GCM/RCM) and their deficiencies in modelling high-intensity precipitation events. Thus decisions must be made on how to appropriately pre-process the meteorological variables from GCM/RCMs, such as the selection of downscaling methods and the application of Model Output Statistics (MOS). In this paper a grand ensemble of projections from several GCM/RCMs is used to drive a hydrological model and analyse the resulting future flood projections for the Upper Severn, UK. The impact and implications of applying MOS techniques to precipitation, as well as hydrological model parameter uncertainty, are taken into account. The resultant grand ensemble of future river discharge projections from the RCM/GCM-hydrological model chain is evaluated against a response surface technique combined with a perturbed physics experiment, creating a probabilistic ensemble of climate model outputs. The ensemble distribution of results shows that the future risk of flooding in the Upper Severn increases compared to present conditions; however, the study highlights that the uncertainties are large and that strong assumptions were made in using Model Output Statistics to produce the estimates of future discharge. The importance of analysing on a seasonal basis rather than just annually is highlighted. The inability of the RCMs (and GCMs) to produce realistic precipitation patterns, even under present conditions, is a major caveat of local climate impact studies on flooding, and this should be a focus for future development.

  2. Future Newspaper Managers Learn Basics at Oregon.

    ERIC Educational Resources Information Center

    Halverson, Roy

    1978-01-01

    Describes an experimental program that prepares students for careers in newspaper management with a sequence of courses in journalism, accounting, marketing, management, finance, and statistics, ending with an internship in the business office of a daily or weekly newspaper. (RL)

  3. Introduction to Potato

    USDA-ARS?s Scientific Manuscript database

    This is an introductory chapter on potatoes which gives a brief history of the potato, potato morphology, taxonomy, production statistics, nutritional content, and future prospects for potato research and production. It will appear in a new book entitled Genetics, Genomics, and Breeding of Potato ...

  4. The need for conducting forensic analysis of decommissioned bridges.

    DOT National Transportation Integrated Search

    2014-01-01

    A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting models based upon ...

  5. Method for Assessing Impacts of Global Sea Level Rise on Navigation Gate Operations

    NASA Astrophysics Data System (ADS)

    Obrien, P. S.; White, K. D.; Friedman, D.

    2015-12-01

    Coastal navigation infrastructure may be highly vulnerable to changing climate, including increasing sea levels and altered frequency and intensity of coastal storms. Future gate operations impacted by global sea level rise will pose unique challenges, especially for structures 50 years and older. Our approach is to estimate future changes in gate operational frequency based on a bootstrapping method to forecast future water levels. A case study will be presented to determine future changes in frequency of operations over the next 100 years. A statistical model in the R programming language was developed to apply future sea level rise projections using the three sea level rise scenarios prescribed by USACE Engineer Regulation ER 1100-2-8162. Information derived from the case study will help forecast changes in operational costs caused by increased gate operations and inform timing of decisions on adaptation measures.
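    The bootstrapping idea described above can be sketched in a few lines. This is illustrative only: the study's R model and the ER 1100-2-8162 scenario curves are not reproduced, and the function name and parameters are hypothetical.

```python
import random

def exceedance_probability(levels, slr_offset, threshold,
                           n_draws=5000, seed=42):
    """Resample the historical water-level record (bootstrap), add a sea
    level rise offset for the planning horizon, and estimate how often a
    gate's operating threshold would be exceeded."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_draws)
               if rng.choice(levels) + slr_offset > threshold)
    return hits / n_draws
```

    Repeating the estimate for low, intermediate, and high sea level rise offsets gives the scenario spread of future operation frequencies.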

  6. Applications of physical methods in high-frequency futures markets

    NASA Astrophysics Data System (ADS)

    Bartolozzi, M.; Mellen, C.; Chan, F.; Oliver, D.; Di Matteo, T.; Aste, T.

    2007-12-01

In the present work we demonstrate the application of different physical methods to high-frequency or tick-by-tick financial time series data. In particular, we calculate the Hurst exponent and inverse statistics for the price time series taken from a range of futures indices. Additionally, we show that in a limit order book the relaxation times of an imbalanced book state with more demand or supply can be described by stretched exponential laws analogous to those seen in many physical systems.
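Rescaled-range (R/S) analysis is one standard way to estimate the Hurst exponent mentioned above; the abstract does not say which estimator the authors used, so the sketch below is only illustrative, applied to simulated white-noise returns rather than real futures data.

```python
import numpy as np

def hurst_rs(series, min_window=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    the slope of log(R/S) against log(window size)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    sizes, rs_means = [], []
    size = min_window
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            window = x[start:start + size]
            dev = np.cumsum(window - window.mean())  # cumulative deviations
            r = dev.max() - dev.min()                # range
            s = window.std()                         # standard deviation
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_means.append(np.mean(rs))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

# Uncorrelated Gaussian "returns" should give an exponent near 0.5;
# persistent series give H > 0.5, anti-persistent ones H < 0.5.
rng = np.random.default_rng(0)
print(round(hurst_rs(rng.standard_normal(4096)), 2))
```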

  7. Linking Excessive Heat with Daily Heat-Related Mortality over the Coterminous United States

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Crosson, William L.; Al-Hamdan, Mohammad Z.; Estes, Maurice G., Jr.

    2014-01-01

In the United States, extreme heat is the most deadly weather-related hazard. In the face of a warming climate and urbanization, which contributes to local-scale urban heat islands, it is very likely that extreme heat events (EHEs) will become more common and more severe in the U.S. This research seeks to provide historical and future measures of climate-driven extreme heat events to enable assessments of the impacts of heat on public health over the coterminous U.S. We use atmospheric temperature and humidity information from meteorological reanalysis and from Global Climate Models (GCMs) to provide data on past and future heat events. The focus of the research is on providing assessments of the magnitude, frequency and geographic distribution of extreme heat in the U.S. to facilitate public health studies. In our approach, long-term climate change is captured with GCM outputs, and the temporal and spatial characteristics of short-term extremes are represented by the reanalysis data. Two future time horizons for 2040 and 2090 are compared to the recent past period of 1981-2000. We characterize regional-scale temperature and humidity conditions using GCM outputs for two climate change scenarios (A2 and A1B) defined in the Special Report on Emissions Scenarios (SRES). For each future period, 20 years of multi-model GCM outputs are analyzed to develop a 'heat stress climatology' based on statistics of extreme heat indicators. Differences between the two future periods and the past period are used to define temperature and humidity changes on a monthly time scale and regional spatial scale. These changes are combined with the historical meteorological data, which are hourly and at a spatial scale (12 km) much finer than that of the GCMs, to create future climate realizations. 
From these realizations, we compute the daily heat stress measures and related spatially-specific climatological fields, such as the mean annual number of days above certain thresholds of maximum and minimum air temperatures, heat indices, and a new heat stress variable developed as part of this research that gives an integrated measure of heat stress (and relief) over the course of a day. Comparisons are made between projected (2040 and 2090) and past (1990) heat stress statistics. Outputs are aggregated to the county level, which is a popular scale of analysis for public health interests. County-level statistics are made available to public health researchers by the Centers for Disease Control and Prevention (CDC) via the Wide-ranging Online Data for Epidemiologic Research (WONDER) system. This addition of heat stress measures to CDC WONDER allows decision and policy makers to assess the impact of alternative approaches to optimize the public health response to EHEs. Through CDC WONDER, users are able to spatially and temporally query public health and heat-related data sets and create county-level maps and statistical charts of such data across the coterminous U.S.

  8. Statistical modelling for recurrent events: an application to sports injuries

    PubMed Central

    Ullah, Shahid; Gabbett, Tim J; Finch, Caroline F

    2014-01-01

    Background Injuries are often recurrent, with subsequent injuries influenced by previous occurrences and hence correlation between events needs to be taken into account when analysing such data. Objective This paper compares five different survival models (Cox proportional hazards (CoxPH) model and the following generalisations to recurrent event data: Andersen-Gill (A-G), frailty, Wei-Lin-Weissfeld total time (WLW-TT) marginal, Prentice-Williams-Peterson gap time (PWP-GT) conditional models) for the analysis of recurrent injury data. Methods Empirical evaluation and comparison of different models were performed using model selection criteria and goodness-of-fit statistics. Simulation studies assessed the size and power of each model fit. Results The modelling approach is demonstrated through direct application to Australian National Rugby League recurrent injury data collected over the 2008 playing season. Of the 35 players analysed, 14 (40%) players had more than 1 injury and 47 contact injuries were sustained over 29 matches. The CoxPH model provided the poorest fit to the recurrent sports injury data. The fit was improved with the A-G and frailty models, compared to WLW-TT and PWP-GT models. Conclusions Despite little difference in model fit between the A-G and frailty models, in the interest of fewer statistical assumptions it is recommended that, where relevant, future studies involving modelling of recurrent sports injury data use the frailty model in preference to the CoxPH model or its other generalisations. The paper provides a rationale for future statistical modelling approaches for recurrent sports injury. PMID:22872683
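The practical difference between the total-time (WLW-TT) and gap-time (PWP-GT) formulations compared above comes down to how recurrent-event records are restructured before model fitting. A minimal illustration with invented injury times for a single player (the match numbers below are placeholders, not the study's data):

```python
# Invented injury history for a single player: injuries in matches 5,
# 12, and 20, with follow-up ending at match 29.
injury_matches = [5, 12, 20]
followup_end = 29

def pwp_gap_time(events, end):
    """PWP-GT layout: the clock restarts after each event, so each row
    records the gap since the previous injury (or study entry)."""
    rows, prev = [], 0
    for k, t in enumerate(events, start=1):
        rows.append({"event_no": k, "gap": t - prev, "injured": 1})
        prev = t
    if prev < end:  # censored interval after the last injury
        rows.append({"event_no": len(events) + 1, "gap": end - prev, "injured": 0})
    return rows

def wlw_total_time(events, end):
    """WLW-TT layout: every stratum is timed from study entry, so each
    row records total elapsed time to the k-th injury (or censoring)."""
    rows = [{"event_no": k, "time": t, "injured": 1}
            for k, t in enumerate(events, start=1)]
    rows.append({"event_no": len(events) + 1, "time": end, "injured": 0})
    return rows

print(pwp_gap_time(injury_matches, followup_end))
print(wlw_total_time(injury_matches, followup_end))
```

Either layout can then be passed to a stratified Cox-type fitter; the conditional (gap-time) rows sum to the total follow-up, while the marginal (total-time) rows all share the same origin.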

  9. Assessment of hi-resolution multi-ensemble statistical downscaling regional climate scenarios over Japan

    NASA Astrophysics Data System (ADS)

    Dairaku, K.

    2017-12-01

The Asia-Pacific regions are increasingly threatened by large-scale natural disasters, and there are growing concerns that the losses and damages from such disasters will be further exacerbated by climate change and socio-economic change. Climate information and services for risk assessment are therefore of great concern. Fundamental regional climate information is indispensable for understanding the changing climate and for making decisions on when and how to act. To meet the needs of stakeholders such as national and local governments, spatio-temporally comprehensive and consistent information is necessary and useful for decision making. Multi-model ensemble regional climate scenarios with 1 km horizontal grid spacing over Japan are developed using 37 CMIP5 GCMs (RCP8.5) and statistical downscaling (Bias Corrected Spatial Disaggregation, BCSD) to investigate the uncertainty of projected change associated with structural differences among the GCMs for the historical climate (1950-2005) and near-future climate (2026-2050) periods. The statistically downscaled regional climate scenarios perform well for annual and seasonal averages of precipitation and temperature. However, the scenarios systematically underestimate extreme events, such as hot days over 35 °C and annual maximum daily precipitation, because of the interpolation processes in the BCSD method. The models project different near-future responses because of their structural differences, although most of the 37 CMIP5 models show a qualitatively consistent increase in average and extreme temperature and precipitation. The added value of statistical and dynamical downscaling methods is also investigated for locally forced nonlinear phenomena and extreme events.
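The bias-correction half of BCSD is essentially empirical quantile mapping: each future model value is mapped through the observed distribution at the quantile it occupies in the model's historical run. A minimal sketch with synthetic data (the distributions below are illustrative, not Japanese climate data):

```python
import numpy as np

def quantile_map(model_hist, obs, model_future):
    """Empirical quantile mapping: replace each future model value with
    the observed value at the quantile it occupies in the model's own
    historical distribution."""
    q = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
    return np.quantile(obs, np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 3.0, 5000)       # "observed" daily values
model_hist = obs * 0.7 + 1.0          # biased model over the same period
model_future = model_hist + 0.5       # future run with an added shift
corrected = quantile_map(model_hist, obs, model_future)
# The correction inherits the observed distribution's shape while
# preserving the model's projected shift toward higher quantiles.
print(round(float(corrected.mean()), 2))
```

The full BCSD method additionally disaggregates the corrected coarse fields onto the fine grid, which is the interpolation step the abstract identifies as the source of the underestimated extremes.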

  10. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries.

    PubMed

    Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien

    2018-01-01

    In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. 
This method can be used not only to estimate the total cost of delivering established health programs but also to reduce uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
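The bootstrap route to the standard errors and confidence intervals discussed above can be sketched in a few lines; the per-facility costs and program size below are invented for illustration, and a real analysis would also account for the survey's sampling design.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented per-facility delivery costs observed at a 40-facility
# subsample of a 400-facility program.
sample_costs = rng.lognormal(mean=9.0, sigma=0.6, size=40)
N_FACILITIES = 400

def total_cost(costs, n_total=N_FACILITIES):
    """Expansion estimator: mean facility cost scaled to program size."""
    return n_total * np.mean(costs)

# Nonparametric bootstrap: resample facilities with replacement and
# re-apply the estimator to quantify its statistical uncertainty.
boot = np.array([
    total_cost(rng.choice(sample_costs, size=len(sample_costs), replace=True))
    for _ in range(2000)
])
estimate = total_cost(sample_costs)
se = boot.std()
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(f"total {estimate:,.0f}, SE {se:,.0f}, 95% CI ({ci_low:,.0f}, {ci_high:,.0f})")
```

Calibration would then shrink this interval by re-weighting the sample so that known program totals (e.g., doses administered) are reproduced exactly.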

  11. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries

    PubMed Central

    Resch, Stephen

    2018-01-01

    Objectives: In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. Methods: We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. Results: A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. 
This method can be used not only to estimate the total cost of delivering established health programs but also to reduce uncertainty when the interest lies in assessing the incremental effect of an intervention. Conclusion: Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty. PMID:29636964

  12. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing the repeatable processes of the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on the applicable statistical and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
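A stripped-down example of the kind of Monte Carlo simulation described: propagating uncertain driver scores through a weighted index model to establish a baseline distribution for a predicted satisfaction score. The weights and distributions below are placeholders, not the actual ACSI structure.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy index model: the customer satisfaction index is a weighted sum of
# three uncertain driver scores (weights and distributions invented).
weights = np.array([0.5, 0.3, 0.2])
drivers = np.column_stack([
    rng.normal(80, 5, 50_000),    # perceived quality
    rng.normal(75, 7, 50_000),    # perceived value
    rng.normal(70, 10, 50_000),   # customer expectations
])
index = drivers @ weights  # one simulated index score per draw

# The simulated distribution gives a baseline mean and a lower bound
# that a dashboard could track against future observed scores.
print(round(float(index.mean()), 1), round(float(np.percentile(index, 5)), 1))
```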

  13. Hunting Solomonoff's Swans: Exploring the Boundary Between Physics and Statistics in Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Nearing, G. S.

    2014-12-01

Statistical models consistently out-perform conceptual models in the short term; however, to account for a nonstationary future (or an unobserved past), scientists prefer to base predictions on unchanging and commutable properties of the universe - i.e., physics. The problem with physically-based hydrology models is, of course, that they aren't really based on physics - they are based on statistical approximations of physical interactions, and we almost uniformly lack an understanding of the entropy associated with these approximations. Thermodynamics is successful precisely because entropy statistics are computable for homogeneous (well-mixed) systems, and ergodic arguments explain the success of Newton's laws to describe systems that are fundamentally quantum in nature. Unfortunately, similar arguments do not hold for systems like watersheds that are heterogeneous at a wide range of scales. Ray Solomonoff formalized the situation in 1968 by showing that given infinite evidence, simultaneously minimizing model complexity and entropy in predictions always leads to the best possible model. The open question in hydrology is what happens when we don't have infinite evidence - for example, when the future will not look like the past, or when one watershed does not behave like another. How do we isolate stationary and commutable components of watershed behavior? I propose that one possible answer to this dilemma lies in a formal combination of physics and statistics. In this talk I outline my recent analogue of Solomonoff's idea (Solomonoff's theorem was digital), which allows us to quantify the complexity/entropy tradeoff in a way that is intuitive to physical scientists. I show how to formally combine "physical" and statistical methods for model development in a way that allows us to derive the theoretically best possible model given any physics approximation(s) and available observations. 
Finally, I apply an analogue of Solomonoff's theorem to evaluate the tradeoff between model complexity and prediction power.

  14. A study of statistics anxiety levels of graduate dental hygiene students.

    PubMed

    Welch, Paul S; Jacks, Mary E; Smiley, Lynn A; Walden, Carolyn E; Clark, William D; Nguyen, Carol A

    2015-02-01

In light of increased emphasis on evidence-based practice in the profession of dental hygiene, it is important that today's dental hygienist comprehend statistical measures to fully understand research articles, and thereby apply scientific evidence to practice. Therefore, the purpose of this study was to investigate statistics anxiety among graduate dental hygiene students in the U.S. A web-based self-report, anonymous survey was emailed to directors of 17 MSDH programs in the U.S. with a request to distribute to graduate students. The survey collected data on statistics anxiety, sociodemographic characteristics and evidence-based practice. Statistics anxiety was assessed using the Statistical Anxiety Rating Scale. Study significance level was α=0.05. Only 8 of the 17 invited programs participated in the study. Statistical Anxiety Rating Scale data revealed graduate dental hygiene students experience low to moderate levels of statistics anxiety. Specifically, the level of anxiety on the Interpretation Anxiety factor indicated this population could struggle with making sense of scientific research. A decisive majority (92%) of students indicated statistics is essential for evidence-based practice and should be a required course for all dental hygienists. This study served to identify statistics anxiety in a previously unexplored population. The findings should be useful in both theory building and in practical applications. Furthermore, the results can be used to direct future research. Copyright © 2015 The American Dental Hygienists’ Association.

  15. The living Drake equation of the Tau Zero Foundation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2011-03-01

    The living Drake equation is our statistical generalization of the Drake equation such that it can take into account any number of factors. This new result opens up the possibility to enrich the equation by inserting more new factors as long as the scientific learning increases. The adjective "Living" refers just to this continuous enrichment of the Drake equation and is the goal of a new research project that the Tau Zero Foundation has entrusted to this author as the discoverer of the statistical Drake equation described hereafter. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be arbitrarily distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov form of the CLT, or the Lindeberg form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the lognormal distribution. Then, the mean value, standard deviation, mode, median and all the moments of this lognormal N can be derived from the means and standard deviations of the seven input random variables. In fact, the seven factors in the ordinary Drake equation now become seven independent positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. 
In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) distance between any two neighbouring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. This distance then becomes a new random variable. We derive the relevant probability density function, apparently previously unknown (dubbed "Maccone distribution" by Paul Davies). Data Enrichment Principle. It should be noticed that any positive number of random variables in the statistical Drake equation is compatible with the CLT. So, our generalization allows for many more factors to be added in the future as more refined scientific knowledge about each factor becomes known to scientists. This capability to make room for more future factors in the statistical Drake equation we call the "Data Enrichment Principle", and we regard it as the key to more profound, future results in Astrobiology and SETI.
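The core claim above, that a product of several independent positive random variables is approximately lognormal, is easy to check by simulation: if N is lognormal, log N is normal, so the skewness of log N should be near zero. The factor means below are arbitrary placeholders, not astrophysical estimates.

```python
import numpy as np

rng = np.random.default_rng(2)

# Seven independent uniform factors around arbitrary placeholder means
# (these are NOT astrophysical estimates, just positive numbers).
means = [3.5e11, 0.5, 2.0, 0.33, 0.01, 0.01, 1e-7]
factors = [rng.uniform(0.5 * m, 1.5 * m, size=100_000) for m in means]
N = np.prod(factors, axis=0)  # the Drake product as a random variable

# If N is (approximately) lognormal, log N is (approximately) normal,
# so the sample skewness of log N should be close to zero.
logN = np.log(N)
skew = np.mean((logN - logN.mean()) ** 3) / logN.std() ** 3
print(round(float(skew), 3))
```

With only seven factors the CLT is not exact, so a small residual skewness remains; it shrinks as more factors are added, which is one way to see the Data Enrichment Principle at work.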

  16. Statistical Analysis of Large-Scale Structure of Universe

    NASA Astrophysics Data System (ADS)

    Tugay, A. V.

While galaxy cluster catalogs were compiled many decades ago, other structural elements of the cosmic web have been detected with confidence only in the most recent works. For example, extragalactic filaments have been described by the velocity field and the SDSS galaxy distribution in recent years. The large-scale structure of the Universe could also be mapped in the future using ATHENA observations in X-rays and SKA in the radio band. Until detailed observations become available for most of the volume of the Universe, some integral statistical parameters can be used for its description. Methods such as the galaxy correlation function, the power spectrum, statistical moments, and peak statistics are commonly used for this purpose. The parameters of the power spectrum and other statistics are important for constraining models of dark matter, dark energy, inflation, and brane cosmology. In the present work we describe the growth of large-scale density fluctuations in the one- and three-dimensional cases with Fourier harmonics of hydrodynamical parameters. As a result, we obtain a power-law relation for the matter power spectrum.
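The power-spectrum machinery referred to above can be sketched in one dimension: the spectrum is the squared modulus of each Fourier harmonic of the density field. The field below is synthetic white noise, used only to verify the normalization via Parseval's theorem, not a cosmological simulation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic 1-D density fluctuation field on a periodic grid.
n = 1024
delta = rng.standard_normal(n)

# Power spectrum: squared modulus of each Fourier harmonic.
fourier = np.fft.fft(delta)
power = np.abs(fourier) ** 2 / n
k = np.fft.fftfreq(n)  # wavenumber of each mode (grid units)

# Parseval's theorem: total spectral power equals the mean squared
# field value, which sanity-checks the normalization convention.
print(np.isclose(power.sum() / n, np.mean(delta ** 2)))
```

For a power-law spectrum one would bin `power` over shells of |k| and fit a slope in log-log space, exactly as with the matter power spectrum in three dimensions.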

  17. Predicted range expansion of Chinese tallow tree (Triadica sebifera) in forestlands of the southern United States

    Treesearch

    Hsiao-Hsuan Wang; William Grant; Todd Swannack; Jianbang Gan; William Rogers; Tomasz Koralewski; James Miller; John W. Taylor Jr.

    2011-01-01

    We present an integrated approach for predicting future range expansion of an invasive species (Chinese tallow tree) that incorporates statistical forecasting and analytical techniques within a spatially explicit, agent-based, simulation framework.

  18. Multilevel Model Prediction

    ERIC Educational Resources Information Center

    Frees, Edward W.; Kim, Jee-Seon

    2006-01-01

    Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…

  19. Illinois' forest resource.

    Treesearch

    Gerhard K. Raile; Earl C. Leatherberry

    1988-01-01

    The third inventory of forest resources in Illinois shows a 1.2% increase in timberland and a 40.5% gain in growing stock volume between 1962 and 1985. Text and statistics are presented on area, volume, growth, mortality, removals, utilization, biomass, and future timber supply.

  20. Assessing Statistical Competencies in Clinical and Translational Science Education: One Size Does Not Fit All

    PubMed Central

    Lindsell, Christopher J.; Welty, Leah J.; Mazumdar, Madhu; Thurston, Sally W.; Rahbar, Mohammad H.; Carter, Rickey E.; Pollock, Bradley H.; Cucchiara, Andrew J.; Kopras, Elizabeth J.; Jovanovic, Borko D.; Enders, Felicity T.

    2014-01-01

    Abstract Introduction Statistics is an essential training component for a career in clinical and translational science (CTS). Given the increasing complexity of statistics, learners may have difficulty selecting appropriate courses. Our question was: what depth of statistical knowledge do different CTS learners require? Methods For three types of CTS learners (principal investigator, co‐investigator, informed reader of the literature), each with different backgrounds in research (no previous research experience, reader of the research literature, previous research experience), 18 experts in biostatistics, epidemiology, and research design proposed levels for 21 statistical competencies. Results Statistical competencies were categorized as fundamental, intermediate, or specialized. CTS learners who intend to become independent principal investigators require more specialized training, while those intending to become informed consumers of the medical literature require more fundamental education. For most competencies, less training was proposed for those with more research background. Discussion When selecting statistical coursework, the learner's research background and career goal should guide the decision. Some statistical competencies are considered to be more important than others. Baseline knowledge assessments may help learners identify appropriate coursework. Conclusion Rather than one size fits all, tailoring education to baseline knowledge, learner background, and future goals increases learning potential while minimizing classroom time. PMID:25212569

  1. Evaluation of adding item-response theory analysis for evaluation of the European Board of Ophthalmology Diploma examination.

    PubMed

    Mathysen, Danny G P; Aclimandos, Wagih; Roelant, Ella; Wouters, Kristien; Creuzot-Garcher, Catherine; Ringens, Peter J; Hawlina, Marko; Tassignon, Marie-José

    2013-11-01

To investigate whether the introduction of item-response theory (IRT) analysis, in parallel to the 'traditional' statistical analysis methods available for performance evaluation of multiple T/F items as used in the European Board of Ophthalmology Diploma (EBOD) examination, has proved beneficial, and secondly, to study whether the overall assessment performance of the current written part of EBOD is sufficiently high (KR-20 ≥ 0.90) for it to be kept as the examination format in future EBOD editions. 'Traditional' analysis methods for individual MCQ item performance comprise P-statistics, Rit-statistics and item discrimination, while overall reliability is evaluated through KR-20 for multiple T/F items. The additional set of statistical analysis methods for the evaluation of EBOD comprises mainly IRT analysis. These analysis techniques are used to monitor whether the introduction of negative marking for incorrect answers (since EBOD 2010) has had a positive influence on the statistical performance of EBOD as a whole and of its individual test items in particular. Item-response theory analysis demonstrated that item performance parameters should not be evaluated individually, but should be related to one another. Before the introduction of negative marking, the overall EBOD reliability (KR-20) was good, though with room for improvement (EBOD 2008: 0.81; EBOD 2009: 0.78). After the introduction of negative marking, the overall reliability of EBOD improved significantly (EBOD 2010: 0.92; EBOD 2011: 0.91; EBOD 2012: 0.91). Although many statistical performance parameters are available to evaluate individual items, our study demonstrates that the overall reliability assessment remains the crucial parameter for comparison across examinations. While individual item performance analysis is worthwhile as a secondary analysis, drawing final conclusions from it alone is more difficult. Performance parameters need to be related to one another, as shown by IRT analysis. 
Therefore, IRT analysis has proved beneficial for the statistical analysis of EBOD. Introduction of negative marking has led to a significant increase in the reliability (KR-20 > 0.90), indicating that the current examination format can be kept for future EBOD examinations. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
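For reference, the KR-20 statistic discussed above is straightforward to compute from a matrix of dichotomously scored responses; the simulated data below are illustrative, not EBOD results.

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson formula 20 for dichotomously scored items.
    `responses` is an (examinees x items) matrix of 0/1 scores."""
    X = np.asarray(responses, dtype=float)
    k = X.shape[1]
    p = X.mean(axis=0)                     # proportion correct per item
    q = 1.0 - p
    total_var = X.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1.0 - np.sum(p * q) / total_var)

# Simulated exam: 200 examinees, 40 items sharing a common ability
# signal, so the items are positively correlated and KR-20 is high.
rng = np.random.default_rng(3)
ability = rng.normal(size=200)
items = (ability[:, None] + rng.normal(size=(200, 40)) > 0).astype(int)
print(round(float(kr20(items)), 2))
```

Items that share no common signal drive the inter-item covariances, and hence KR-20, toward zero, which is why the statistic serves as an overall reliability check.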

  2. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. 
We derive the relevant probability density function, apparently previously unknown and dubbed the "Maccone distribution" by Paul Davies. DATA ENRICHMENT PRINCIPLE. It should be noticed that ANY positive number of random variables in the Statistical Drake Equation is compatible with the CLT. So, our generalization allows many more factors to be added in the future as more refined scientific knowledge about each factor becomes available. This capability to make room for more future factors in the statistical Drake equation we call the "Data Enrichment Principle," and we regard it as the key to more profound future results in the fields of Astrobiology and SETI. Finally, a practical example is given of how our statistical Drake equation works numerically. We work out in detail the case where each of the seven random variables is uniformly distributed around its own mean value with a given standard deviation. For instance, the number of stars in the Galaxy is assumed to be uniformly distributed around (say) 350 billion with a standard deviation of (say) 1 billion. Then, the resulting lognormal distribution of N is computed numerically by means of a MathCad file written by the author. This shows that the mean value of the lognormal random variable N is of the same order as the classical N given by the ordinary Drake equation, as one might expect from a good statistical generalization.
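The core mechanism of the abstract above, that a product of independent factors has an approximately lognormal distribution because its logarithm is a CLT-governed sum, can be sketched in a few lines. The factor means and spreads below are illustrative stand-ins, not the paper's inputs:

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)
n_draws = 200_000

# Seven hypothetical Drake factors, each uniformly distributed around an
# assumed mean with a +/-50% spread (illustrative values only).
means = np.array([3.5e11, 0.5, 2.0, 0.33, 0.1, 0.1, 1e-7])
factors = rng.uniform(0.5 * means, 1.5 * means, size=(n_draws, 7))

# N is a product of independent random variables, so log N is a sum of
# independent random variables and, by the CLT, approximately normal:
# N itself is therefore approximately lognormal.
N = factors.prod(axis=1)
log_N = np.log(N)

print(skew(log_N))  # close to 0: log N is nearly symmetric (normal-like)
print(skew(N))      # strongly positive: N is heavy-tailed (lognormal-like)
```

The same simulation extends to any number of factors, which is the numerical face of the Data Enrichment Principle described above.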

  3. UNITY: Confronting Supernova Cosmology's Statistical and Systematic Uncertainties in a Unified Bayesian Framework

    NASA Astrophysics Data System (ADS)

    Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The

    2015-11-01

    While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.

  4. Statistical Prediction of Sea Ice Concentration over Arctic

    NASA Astrophysics Data System (ADS)

    Kim, Jongho; Jeong, Jee-Hoon; Kim, Baek-Min

    2017-04-01

In this study, a statistical method that predicts sea ice concentration (SIC) over the Arctic is developed. We first calculate the Season-reliant Empirical Orthogonal Functions (S-EOFs) of monthly Arctic SIC from Nimbus-7 SMMR and DMSP SSM/I-SSMIS Passive Microwave Data, which contain the seasonal cycles (12 months long) of dominant SIC anomaly patterns. Then, the current SIC state index is determined by projecting the observed SIC anomalies for the latest 12 months onto the S-EOFs. Assuming the current SIC anomalies follow the spatio-temporal evolution in the S-EOFs, we project the future (up to 12 months) SIC anomalies by multiplying the state index by the corresponding S-EOF and summing over modes. The predictive skill is assessed by hindcast experiments initialized at all months for 1980-2010. Compared with NCEP CFS v2, the statistical model shows higher skill in predicting sea ice concentration and extent.
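The projection-and-reconstruction step described above can be sketched with a hypothetical orthonormal EOF basis; all arrays below are synthetic stand-ins for the S-EOFs and SIC anomaly fields, not satellite data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_space, n_modes = 50, 3

# Hypothetical orthonormal S-EOF basis: each column stands in for one
# dominant seasonal SIC anomaly pattern (synthetic, for illustration).
E, _ = np.linalg.qr(rng.standard_normal((n_space, n_modes)))

true_index = np.array([2.0, -1.0, 0.5])
observed = E @ true_index            # a current SIC anomaly field

# Project the observed anomalies onto the S-EOFs to get the state index...
state_index = E.T @ observed
# ...then reconstruct the field as the index-weighted sum of the EOFs;
# a forecast reuses the same weights with the EOF patterns advanced in time.
reconstructed = E @ state_index
```

Because the basis is orthonormal, the projection recovers the index exactly when the anomalies lie in the span of the EOFs; in practice the residual outside that span is discarded.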

  5. Landing Site Dispersion Analysis and Statistical Assessment for the Mars Phoenix Lander

    NASA Technical Reports Server (NTRS)

    Bonfiglio, Eugene P.; Adams, Douglas; Craig, Lynn; Spencer, David A.; Strauss, William; Seelos, Frank P.; Seelos, Kimberly D.; Arvidson, Ray; Heet, Tabatha

    2008-01-01

The Mars Phoenix Lander launched on August 4, 2007 and successfully landed on Mars 10 months later, on May 25, 2008. Landing ellipse predictions and hazard maps were key in selecting safe surface targets for Phoenix. Hazard maps were based on terrain slopes, geomorphology maps, and automated rock counts from MRO's High Resolution Imaging Science Experiment (HiRISE) images. The expected landing dispersion that led to the selection of Phoenix's surface target is discussed, as well as the actual landing dispersion predictions determined during operations in the weeks, days, and hours before landing. A statistical assessment of these dispersions is performed, comparing the actual landing-safety probabilities to criteria levied by the project. Also discussed are applications of this statistical analysis that were used by the Phoenix project, including verifying the effectiveness of a pre-planned maneuver menu and calculating the probability of future maneuvers.

  6. Statistics in the pharmacy literature.

    PubMed

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-square (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified, to facilitate the appropriate appraisal and subsequent use of the information available in research articles.

  7. External validation of the Probability of repeated admission (Pra) risk prediction tool in older community-dwelling people attending general practice: a prospective cohort study.

    PubMed

    Wallace, Emma; McDowell, Ronald; Bennett, Kathleen; Fahey, Tom; Smith, Susan M

    2016-11-14

Emergency admission is associated with the potential for adverse events in older people, and risk prediction models are available to identify those at highest risk of admission. The aim of this study was to externally validate and compare the performance of the Probability of repeated admission (Pra) risk model and a modified version (incorporating a multimorbidity measure) in predicting emergency admission in older community-dwelling people. 15 general practices (GPs) in the Republic of Ireland. n=862, ≥70 years, community-dwelling people prospectively followed up for 2 years (2010-2012). Pra risk model (original and modified) calculated for the baseline year, where ≥0.5 denoted high risk of future emergency admission (patient questionnaire, GP medical record review). Emergency admission over 1 year (GP medical record review). Descriptive statistics, model discrimination (c-statistic) and calibration (Hosmer-Lemeshow statistic). Of 862 patients, a total of 154 (18%) had ≥1 emergency admission(s) in the follow-up year. 63 patients (7%) were classified as high risk by the original Pra, and of these 26 (41%) were admitted. The modified Pra classified 391 (45%) patients as high risk, and 103 (26%) were subsequently admitted. Both models demonstrated only poor discrimination (original Pra: c-statistic 0.65 (95% CI 0.61 to 0.70); modified Pra: c-statistic 0.67 (95% CI 0.62 to 0.72)). When categorised by risk category, specificity was highest for the original Pra at a cut-point of ≥0.5 denoting high risk (95%), and for the modified Pra at a cut-point of ≥0.7 (95%). Both models overestimated the number of admissions across all risk strata. While the original Pra model demonstrated poor discrimination, model specificity was high and only a small number of patients were identified as high risk. Future validation studies should examine higher cut-points denoting high risk for the modified Pra, which has practical advantages in terms of application in general practice.
The original Pra tool may have a role in identifying higher-risk community-dwelling older people for inclusion in future trials aiming to reduce emergency admissions.
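The c-statistic reported above is the probability that a randomly chosen admitted patient received a higher predicted risk than a randomly chosen non-admitted patient. A minimal sketch with made-up risks and outcomes (not the study's data):

```python
import numpy as np

# Hypothetical predicted admission risks and observed outcomes for eight
# made-up patients (1 = emergency admission during follow-up).
risk = np.array([0.8, 0.6, 0.55, 0.4, 0.3, 0.2, 0.15, 0.1])
admitted = np.array([1, 1, 0, 1, 0, 0, 1, 0])

pos = risk[admitted == 1]
neg = risk[admitted == 0]

# Count admitted/non-admitted pairs ranked correctly; ties score 0.5.
correct = (pos[:, None] > neg[None, :]).sum()
ties = (pos[:, None] == neg[None, :]).sum()
c_stat = (correct + 0.5 * ties) / (len(pos) * len(neg))
print(c_stat)  # 0.75 for this toy data
```

A value of 0.5 means no discrimination and 1.0 perfect discrimination, which is why the 0.65-0.67 figures above are described as poor.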

  8. Predictive data modeling of human type II diabetes related statistics

    NASA Astrophysics Data System (ADS)

    Jaenisch, Kristina L.; Jaenisch, Holger M.; Handley, James W.; Albritton, Nathaniel G.

    2009-04-01

During the course of routine Type II diabetes treatment of one of the authors, it was decided to derive predictive analytical Data Models of the daily sampled vital statistics, namely weight, blood pressure, and blood sugar, to determine whether the covariance among the observed variables could yield a descriptive equation-based model or, better still, a predictive analytical model that could forecast the expected future trend of the variables and possibly reduce the number of finger sticks required to monitor blood sugar levels. The personal history and analysis, with the resulting models, are presented.

  9. Modeling Cell Size Regulation: From Single-Cell-Level Statistics to Molecular Mechanisms and Population-Level Effects.

    PubMed

    Ho, Po-Yi; Lin, Jie; Amir, Ariel

    2018-05-20

    Most microorganisms regulate their cell size. In this article, we review some of the mathematical formulations of the problem of cell size regulation. We focus on coarse-grained stochastic models and the statistics that they generate. We review the biologically relevant insights obtained from these models. We then describe cell cycle regulation and its molecular implementations, protein number regulation, and population growth, all in relation to size regulation. Finally, we discuss several future directions for developing understanding beyond phenomenological models of cell size regulation.

  10. Bayesian inference based on dual generalized order statistics from the exponentiated Weibull model

    NASA Astrophysics Data System (ADS)

    Al Sobhi, Mashail M.

    2015-02-01

Bayesian estimates for the two parameters and the reliability function of the exponentiated Weibull model are obtained based on dual generalized order statistics (DGOS). Also, Bayesian prediction bounds for future DGOS from the exponentiated Weibull model are obtained. Both symmetric and asymmetric loss functions are considered for the Bayesian computations. Markov chain Monte Carlo (MCMC) methods are used for computing the Bayes estimates and prediction bounds. The results have been specialized to the lower record values. Comparisons are made between Bayesian and maximum likelihood estimators via Monte Carlo simulation.

  11. Hydrological responses to dynamically and statistically downscaled climate model output

    USGS Publications Warehouse

    Wilby, R.L.; Hay, L.E.; Gutowski, W.J.; Arritt, R.W.; Takle, E.S.; Pan, Z.; Leavesley, G.H.; Clark, M.P.

    2000-01-01

Daily rainfall and surface temperature series were simulated for the Animas River basin, Colorado using dynamically and statistically downscaled output from the National Center for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) re-analysis. A distributed hydrological model was then applied to the downscaled data. Relative to raw NCEP output, downscaled climate variables provided more realistic simulations of basin-scale hydrology. However, the results highlight the sensitivity of modeled processes to the choice of downscaling technique, and point to the need for caution when interpreting future hydrological scenarios.
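One common building block of statistical downscaling of the kind compared above is empirical quantile mapping, which replaces each model value with the observed value at the same quantile. The sketch below uses synthetic gamma-distributed "rainfall" and illustrates the general technique, not the specific methods of the study:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic "observed" station rainfall and biased raw "model" output
# (gamma shapes/scales are arbitrary illustrative choices).
obs = rng.gamma(2.0, 3.0, 5000)
model = rng.gamma(2.0, 4.0, 5000)      # wet-biased relative to obs

def quantile_map(x, model_ref, obs_ref):
    """Replace each model value by the observed value at the same quantile."""
    q = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    return np.quantile(obs_ref, np.clip(q, 0.0, 1.0))

corrected = quantile_map(model, model, obs)
# After mapping, the corrected series follows the observed climatology,
# while preserving the rank ordering of the model's wet and dry days.
```

The same mapping, calibrated on a historical period, is then applied to future model output, which is one source of the technique sensitivity the abstract warns about.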

  12. Understanding spatial organizations of chromosomes via statistical analysis of Hi-C data

    PubMed Central

    Hu, Ming; Deng, Ke; Qin, Zhaohui; Liu, Jun S.

    2015-01-01

Understanding how chromosomes fold provides insights into transcription regulation and, hence, the functional state of the cell. Using next-generation sequencing technology, the recently developed Hi-C approach enables a global view of spatial chromatin organization in the nucleus, which substantially expands our knowledge about genome organization and function. However, due to multiple layers of bias, noise and uncertainty buried in the protocol of Hi-C experiments, analyzing and interpreting Hi-C data poses great challenges and requires novel statistical methods. This article provides an overview of recent Hi-C studies and their impacts on biomedical research, describes major challenges in the statistical analysis of Hi-C data, and discusses some perspectives for future research. PMID:26124977

  13. The Developing Infant Creates a Curriculum for Statistical Learning.

    PubMed

    Smith, Linda B; Jayaraman, Swapnaa; Clerkin, Elizabeth; Yu, Chen

    2018-04-01

    New efforts are using head cameras and eye-trackers worn by infants to capture everyday visual environments from the point of view of the infant learner. From this vantage point, the training sets for statistical learning develop as the sensorimotor abilities of the infant develop, yielding a series of ordered datasets for visual learning that differ in content and structure between timepoints but are highly selective at each timepoint. These changing environments may constitute a developmentally ordered curriculum that optimizes learning across many domains. Future advances in computational models will be necessary to connect the developmentally changing content and statistics of infant experience to the internal machinery that does the learning. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. THE DAMAGING EFFECTS OF ALCOHOL: CHRONIC AND PATTERN ALCOHOL USE EXPLAIN WHY SEXUAL ASSAULT FIGURES HAVE NOT SIGNIFICANTLY DROPPED IN THE UNITED STATES MILITARY

    DTIC Science & Technology

    2015-10-01

Recommendations ... The Significance of Statistics ... further analysis and documentation in metrics in future surveys. This statistic, alone, in a public military report is enough to warrant an inquiry into ... It is unknown how many of the total reported sexual assaults involved alcohol use. Other statistical reports indicate 32% of males in the ...

  15. Statistical Characterization and Classification of Edge-Localized Plasma Instabilities

    NASA Astrophysics Data System (ADS)

    Webster, A. J.; Dendy, R. O.

    2013-04-01

    The statistics of edge-localized plasma instabilities (ELMs) in toroidal magnetically confined fusion plasmas are considered. From first principles, standard experimentally motivated assumptions are shown to determine a specific probability distribution for the waiting times between ELMs: the Weibull distribution. This is confirmed empirically by a statistically rigorous comparison with a large data set from the Joint European Torus. The successful characterization of ELM waiting times enables future work to progress in various ways. Here we present a quantitative classification of ELM types, complementary to phenomenological approaches. It also informs us about the nature of ELM processes, such as whether they are random or deterministic. The methods are extremely general and can be applied to numerous other quasiperiodic intermittent phenomena.
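The Weibull characterization of waiting times described above can be sketched as a distribution fit on synthetic data; the shape and scale values below are illustrative choices, not JET measurements. A fitted shape parameter above 1 indicates an increasing hazard rate, i.e. a degree of determinism in the ELM cycle:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic ELM waiting times drawn from an assumed Weibull law
# (shape 2.5, scale 20 ms are illustrative stand-ins).
shape_true, scale_true = 2.5, 20.0
waits = stats.weibull_min.rvs(shape_true, scale=scale_true,
                              size=5000, random_state=rng)

# Fit a two-parameter Weibull; the location is pinned at zero, as is
# appropriate for waiting times measured from the previous event.
shape_hat, loc_hat, scale_hat = stats.weibull_min.fit(waits, floc=0)

# shape_hat > 1 means an increasing hazard: the next ELM becomes more
# likely the longer the wait, a quasi-deterministic signature; shape_hat
# close to 1 would indicate a memoryless (purely random) process.
```

In practice one would also apply a goodness-of-fit test (e.g. Kolmogorov-Smirnov) before using the fitted parameters to classify ELM types.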

  16. Statistical analysis of RHIC beam position monitors performance

    NASA Astrophysics Data System (ADS)

    Calaga, R.; Tomás, R.

    2004-04-01

A detailed statistical analysis of beam position monitor (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
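The SVD side of the identification described above can be sketched on synthetic turn-by-turn data; the amplitudes and the choice of faulty channel below are illustrative. A BPM whose reading is large-amplitude uncorrelated noise dominates one spatial singular vector essentially on its own:

```python
import numpy as np

rng = np.random.default_rng(3)
n_turns, n_bpms = 2000, 20

# Synthetic turn-by-turn data: a common betatron-like oscillation seen by
# every BPM (with arbitrary gains), plus small independent noise.
turns = np.arange(n_turns)
signal = np.sin(2 * np.pi * 0.22 * turns)
data = np.outer(signal, rng.uniform(0.5, 1.5, n_bpms))
data += 0.01 * rng.standard_normal((n_turns, n_bpms))

# BPM 7 malfunctions: its reading is large-amplitude uncorrelated noise.
data[:, 7] = 5.0 * rng.standard_normal(n_turns)

# Each right-singular vector gives the BPM weights of one spatial mode.
# Here the faulty channel carries more power than the coherent motion,
# so it dominates the leading mode by itself and is easy to flag.
U, s, Vt = np.linalg.svd(data - data.mean(axis=0), full_matrices=False)
suspect = int(np.argmax(np.abs(Vt[0])))
print(suspect)  # 7
```

Real analyses rank all modes and flag any BPM that monopolizes a singular vector, rather than inspecting only the leading one.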

  17. Forest area and timber resources of the San Joaquin area, California.

    Treesearch

    Charles L. Bolsinger

    1978-01-01

    This report presents statistics on forest area and timber volume and a description of the recent and future timber situations in Alpine, Amador, Calaveras, Fresno, Kern, Kings, Madera, Mariposa, Merced, Mono, San Joaquin, Stanislaus, Tulare, and Tuolumne Counties, California.

  18. The World Wide Web Virtual Library of Museums.

    ERIC Educational Resources Information Center

    Bowen, Jonathan P.

    1995-01-01

Provides an introduction to and overview of the World Wide Web Virtual Library of Museums, an interactive directory of online museums, including organization of the hyperlinks, visitor statistics, possible future directions, and information on some of the sites linked to the library. (JKP)

  19. Innovative United Kingdom Approaches To Measuring Service Quality.

    ERIC Educational Resources Information Center

    Winkworth, Ian

    2001-01-01

    Reports on approaches to measuring the service quality of academic libraries in the United Kingdom. Discusses the role of government and the national background of quality measurement; measurement frameworks; better use of statistics; benchmarking; measuring user satisfaction; and possible future development. (Author/LRW)

  20. Future Directions for NCI’s Surveillance Research Program

    Cancer.gov

Since the early 1970s, NCI’s SEER program has been an invaluable resource for statistics on cancer in the United States. For the past several years, SEER researchers have been working toward a much broader and more comprehensive goal for providing cancer statistics.

  1. Statistical Accounting for Uncertainty in Modeling Transport in Environmental Systems

    EPA Science Inventory

    Models frequently are used to predict the future extent of ground-water contamination, given estimates of their input parameters and forcing functions. Although models have a well established scientific basis for understanding the interactions between complex phenomena and for g...

  2. Kansas forest inventory, 1981.

    Treesearch

    John S. Jr. Spencer; John K. Strickler; William J. Moyer

    1984-01-01

    The third inventory of the timber resource of Kansas shows a 1.4% increase in commercial forest area and a 42% gain in growing-stock volume between 1965 and 1980. Text and statistics are presented on area, volume, growth, mortality, removals, utilization, biomass, and future timber supply.

  3. Nebraska's second forest inventory.

    Treesearch

    Gerhard K. Raile

    1986-01-01

    The second inventory of the timber resource of Nebraska shows a 25% decline in commercial forest area and a 23% gain in growing-stock volume between 1955 and 1983. Text and statistics are presented on area, volume, growth, mortality, removals, utilization, biomass, and future timber supply.

  4. Climate Change Impacts on the Upper Indus Hydrology: Sources, Shifts and Extremes

    PubMed Central

    Immerzeel, W. W.; Kraaijenbrink, P. D. A.; Shrestha, A. B.; Bierkens, M. F. P.

    2016-01-01

    The Indus basin heavily depends on its upstream mountainous part for the downstream supply of water while downstream demands are high. Since downstream demands will likely continue to increase, accurate hydrological projections for the future supply are important. We use an ensemble of statistically downscaled CMIP5 General Circulation Model outputs for RCP4.5 and RCP8.5 to force a cryospheric-hydrological model and generate transient hydrological projections for the entire 21st century for the upper Indus basin. Three methodological advances are introduced: (i) A new precipitation dataset that corrects for the underestimation of high-altitude precipitation is used. (ii) The model is calibrated using data on river runoff, snow cover and geodetic glacier mass balance. (iii) An advanced statistical downscaling technique is used that accounts for changes in precipitation extremes. The analysis of the results focuses on changes in sources of runoff, seasonality and hydrological extremes. We conclude that the future of the upper Indus basin’s water availability is highly uncertain in the long run, mainly due to the large spread in the future precipitation projections. Despite large uncertainties in the future climate and long-term water availability, basin-wide patterns and trends of seasonal shifts in water availability are consistent across climate change scenarios. Most prominent is the attenuation of the annual hydrograph and shift from summer peak flow towards the other seasons for most ensemble members. In addition there are distinct spatial patterns in the response that relate to monsoon influence and the importance of meltwater. Analysis of future hydrological extremes reveals that increases in intensity and frequency of extreme discharges are very likely for most of the upper Indus basin and most ensemble members. PMID:27828994

  5. Climate Change Impacts on the Upper Indus Hydrology: Sources, Shifts and Extremes.

    PubMed

    Lutz, A F; Immerzeel, W W; Kraaijenbrink, P D A; Shrestha, A B; Bierkens, M F P

    2016-01-01

    The Indus basin heavily depends on its upstream mountainous part for the downstream supply of water while downstream demands are high. Since downstream demands will likely continue to increase, accurate hydrological projections for the future supply are important. We use an ensemble of statistically downscaled CMIP5 General Circulation Model outputs for RCP4.5 and RCP8.5 to force a cryospheric-hydrological model and generate transient hydrological projections for the entire 21st century for the upper Indus basin. Three methodological advances are introduced: (i) A new precipitation dataset that corrects for the underestimation of high-altitude precipitation is used. (ii) The model is calibrated using data on river runoff, snow cover and geodetic glacier mass balance. (iii) An advanced statistical downscaling technique is used that accounts for changes in precipitation extremes. The analysis of the results focuses on changes in sources of runoff, seasonality and hydrological extremes. We conclude that the future of the upper Indus basin's water availability is highly uncertain in the long run, mainly due to the large spread in the future precipitation projections. Despite large uncertainties in the future climate and long-term water availability, basin-wide patterns and trends of seasonal shifts in water availability are consistent across climate change scenarios. Most prominent is the attenuation of the annual hydrograph and shift from summer peak flow towards the other seasons for most ensemble members. In addition there are distinct spatial patterns in the response that relate to monsoon influence and the importance of meltwater. Analysis of future hydrological extremes reveals that increases in intensity and frequency of extreme discharges are very likely for most of the upper Indus basin and most ensemble members.

  6. Eye-gaze determination of user intent at the computer interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, J.H.; Schryver, J.C.

    1993-12-31

Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming in or out. This methodology first collects samples of eye-gaze location while looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations to zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique and will empirically test users in controlled experiments.
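The MST-then-cluster step outlined above can be sketched with SciPy; the gaze coordinates and the edge-cut threshold below are hypothetical, user-defined parameters as in the abstract:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

rng = np.random.default_rng(5)

# Two hypothetical gaze fixation clusters within one data frame
# (pixel coordinates; cluster centers and spreads are made up).
a = rng.normal([100.0, 100.0], 5.0, size=(15, 2))
b = rng.normal([400.0, 300.0], 5.0, size=(15, 2))
pts = np.vstack([a, b])

# Connect all spatial samples into a minimum spanning tree, then cut
# edges longer than a threshold (assumed 50 px); the surviving connected
# components are the gaze clusters.
mst = minimum_spanning_tree(squareform(pdist(pts))).toarray()
mst[mst > 50.0] = 0.0
n_clusters, labels = connected_components(mst != 0.0, directed=False)
print(n_clusters)  # 2
```

Per-cluster statistics (size, centroid position, pupil size) would then be computed from `labels` and fed to the discriminant analysis.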

  7. On the Statistical Errors of RADAR Location Sensor Networks with Built-In Wi-Fi Gaussian Linear Fingerprints

    PubMed Central

    Zhou, Mu; Xu, Yu Bin; Ma, Lin; Tian, Shuo

    2012-01-01

The expected errors of RADAR sensor networks with linear probabilistic location fingerprints inside buildings with varying Wi-Fi Gaussian strength are discussed. As far as we know, the statistical errors of equal and unequal-weighted RADAR networks have been suggested as a better way to evaluate the behavior of different system parameters and the deployment of reference points (RPs). However, up to now there is still not enough related work on the relations between the statistical errors, the system parameters, and the number and interval of the RPs, let alone calculation of the corresponding analytical expressions. Therefore, in response to this problem, under a simple linear distribution model, we examine the mathematical relations between the linear expected errors, the number of neighbors, the number and interval of RPs, the parameters of the logarithmic attenuation model, and variations of radio signal strength (RSS) at the test point (TP), with the aim of constructing more practical and reliable RADAR location sensor networks (RLSNs) and guaranteeing the accuracy requirements of location-based services in future ubiquitous context-awareness environments. Moreover, numerical results and some real experimental evaluations of the error theories addressed in this paper are presented for our future extended analysis. PMID:22737027

  8. On the statistical errors of RADAR location sensor networks with built-in Wi-Fi Gaussian linear fingerprints.

    PubMed

    Zhou, Mu; Xu, Yu Bin; Ma, Lin; Tian, Shuo

    2012-01-01

The expected errors of RADAR sensor networks with linear probabilistic location fingerprints inside buildings with varying Wi-Fi Gaussian strength are discussed. As far as we know, the statistical errors of equal and unequal-weighted RADAR networks have been suggested as a better way to evaluate the behavior of different system parameters and the deployment of reference points (RPs). However, up to now there is still not enough related work on the relations between the statistical errors, the system parameters, and the number and interval of the RPs, let alone calculation of the corresponding analytical expressions. Therefore, in response to this problem, under a simple linear distribution model, we examine the mathematical relations between the linear expected errors, the number of neighbors, the number and interval of RPs, the parameters of the logarithmic attenuation model, and variations of radio signal strength (RSS) at the test point (TP), with the aim of constructing more practical and reliable RADAR location sensor networks (RLSNs) and guaranteeing the accuracy requirements of location-based services in future ubiquitous context-awareness environments. Moreover, numerical results and some real experimental evaluations of the error theories addressed in this paper are presented for our future extended analysis.

  9. Human Responses to Climate Variability: The Case of South Africa

    NASA Astrophysics Data System (ADS)

    Oppenheimer, M.; Licker, R.; Mastrorillo, M.; Bohra-Mishra, P.; Estes, L. D.; Cai, R.

    2014-12-01

Climate variability has been associated with a range of societal and individual outcomes including migration, violent conflict, changes in labor productivity, and health impacts. Some of these may be direct responses to changes in mean temperature or precipitation or extreme events, such as displacement of human populations by tropical cyclones. Others may be mediated by a variety of biological, social, or ecological factors, such as migration in response to long-term changes in crop yields. Research is beginning to elucidate and distinguish the many channels through which climate variability may influence human behavior (ranging from the individual to the collective, societal level) in order to better understand how to improve resilience in the face of current variability as well as future climate change. Using a variety of data sets from South Africa, we show how climate variability has influenced internal (within-country) migration in recent history. We focus on South Africa as it is a country with high levels of internal migration and dramatic temperature and precipitation changes projected for the 21st century. High poverty rates and significant levels of rain-fed, smallholder agriculture leave large portions of South Africa's population base vulnerable to future climate change. In this study, we utilize two complementary statistical models - one micro-level model, driven by individual and household level survey data, and one macro-level model, driven by national census statistics. In both models, we consider the effect of climate on migration both directly (with gridded climate reanalysis data) and indirectly (with agricultural production statistics). With our historical analyses of climate variability, we gain insights into how the migration decisions of South Africans may be influenced by future climate change. We also offer perspective on the utility of micro and macro level approaches in the study of climate change and human migration.

  10. Smart climate ensemble exploring approaches: the example of climate impacts on air pollution in Europe.

    NASA Astrophysics Data System (ADS)

    Lemaire, Vincent; Colette, Augustin; Menut, Laurent

    2016-04-01

    Because of its sensitivity to weather patterns, climate change will have an impact on air pollution, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. At present, however, such impact assessments lack multi-model ensemble approaches to address uncertainties, because of the substantial computing cost. Therefore, as a preliminary step towards exploring large climate ensembles with air quality models, we developed an ensemble exploration technique to point out the climate models that should be investigated in priority. Using a training dataset from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed statistical models that could be used to estimate future air pollutant concentrations. Applying this statistical model to the whole EuroCordex ensemble of climate projections, we find a climate penalty for six subregions out of eight (Eastern Europe, France, Iberian Peninsula, Mid Europe and Northern Italy). On the contrary, a climate benefit for PM2.5 was identified for three regions (Eastern Europe, Mid Europe and Northern Italy). The uncertainty of this statistical model, however, limits the confidence that can be attributed to the associated quantitative projections. The technique nevertheless allows selecting a subset of relevant regional climate model members that should be used in priority for future deterministic projections, in order to propose an adequate coverage of uncertainties. We are thereby proposing a smart ensemble exploration strategy that can also be used for impact studies beyond air quality.
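    The idea described above, training a cheap statistical proxy on one deterministic chemistry-transport projection and then applying it across a whole climate ensemble to rank members, can be illustrated in a few lines. The data, the single meteorological driver, and the ensemble members below are synthetic, hypothetical stand-ins, not the authors' model:

```python
import random

random.seed(42)

# Hypothetical training data: summer mean temperature (degC) vs. an ozone
# concentration (ug/m3). In the paper's approach this training set would
# come from a deterministic chemistry-transport projection; here it is
# synthetic, with an assumed linear sensitivity.
temps = [random.uniform(15, 30) for _ in range(200)]
ozone = [40 + 1.8 * t + random.gauss(0, 3) for t in temps]

def fit_linear(x, y):
    """Ordinary least-squares fit of y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

a, b = fit_linear(temps, ozone)

# Apply the statistical proxy to hypothetical ensemble members: each gives
# a projected change in summer mean temperature, hence a projected ozone
# change (a positive value is a "climate penalty").
ensemble_dT = {"model_A": 1.2, "model_B": 3.1, "model_C": 2.0}
penalty = {name: b * dT for name, dT in ensemble_dT.items()}

# Rank members by projected penalty to decide which deserve a full
# (expensive) chemistry-transport run.
ranked = sorted(penalty, key=penalty.get, reverse=True)
```

    A real emulator would use several meteorological drivers per region, and its residual error is exactly what limits the confidence that can be placed in the quantitative projections, as the abstract notes.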

  11. Knowledge of women in family planning and future desire to use contraception: a cross sectional survey in Urban Cameroon.

    PubMed

    Ajong, Atem Bethel; Njotang, Philip Nana; Kenfack, Bruno; Yakum, Martin Ndinakie; Mbu, Enow Robinson

    2016-07-18

    Modern contraceptive use would increase, and maternal mortality decrease, if women had good knowledge of family planning and its methods. This survey was designed to evaluate knowledge of family planning and to determine the future desire to use contraception among women in urban Cameroon. We conducted a cross-sectional community-based survey from March 2015 to April 2015 targeting women of childbearing age in the Biyem-Assi Health District. Participants were included using multistep cluster sampling, and the data were collected face to face by well-trained surveyors using a pretested and validated questionnaire. The data were then analysed using the statistical software Epi-Info version 3.5.4. Proportions and their 95 % confidence intervals were calculated, and in a multiple logistic regression model with the threshold of significance set at a p value ≤0.05, the odds ratio was used as the measure of association between selected covariates and future desire to use contraception. Among the 712 women included in the survey, the mean age was 27.5 ± 6.5 years. A large proportion (95.6 %) identified contraception as being used to prevent unwanted pregnancy, and this showed an increasing trend with increasing level of education. Also, 77.5 % thought that contraception should be used by all sexually active women. The most cited contraceptive methods were condoms, 689 (96.8 %); oral pills, 507 (71.2 %); and implants, 390 (54.8 %). The main sources of information were health personnel (47.7 %) and school (23.6 %). It was estimated that 31.0 [25.5-37.0] % of current contraceptive non-users had no desire to adopt a contraceptive method in the future.
With level of education, age, and marital status controlled for, having had more than 3 unplanned pregnancies (OR 0.66 [0.45-0.97], p = 0.035) and past adoption of more than 2 modern contraceptive methods (OR 0.45 [0.21-0.97], p = 0.041) were statistically significantly associated with a decreased desire to adopt contraception in the future. Level of knowledge showed an association with future desire to use contraception, though not a statistically significant one (OR 0.80 [0.47-1.37], p = 0.061). The knowledge of women of childbearing age in the Biyem-Assi Health District was relatively high but still unsatisfactory. The proportion of contraceptive non-users who have no desire to adopt any contraceptive method in the future is still unacceptably high. Policy makers should improve their strategies while empowering health personnel and working in collaboration with the education ministries.
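    The multiple logistic regression behind odds ratios like these can be reduced to a small sketch. The data below are simulated (one binary covariate with a true odds ratio set near the paper's 0.66); they are not the survey data, and a real analysis would adjust for education, age, and marital status and report confidence intervals:

```python
import math
import random

random.seed(1)

# Simulated version of the survey's logistic model: outcome y is "desires
# future contraceptive use" (1/0); covariate x is "more than 3 unplanned
# pregnancies" (1/0), with an assumed true odds ratio near 0.66.
def simulate(n=5000, beta0=0.8, beta1=math.log(0.66)):
    data = []
    for _ in range(n):
        x = 1 if random.random() < 0.3 else 0
        p = 1 / (1 + math.exp(-(beta0 + beta1 * x)))
        data.append((x, 1 if random.random() < p else 0))
    return data

def fit_logistic(data, lr=0.5, epochs=300):
    """Logistic regression fit by batch gradient ascent on the
    log-likelihood (a minimal stand-in for a stats package)."""
    b0 = b1 = 0.0
    n = len(data)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in data:
            p = 1 / (1 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

b0, b1 = fit_logistic(simulate())
odds_ratio = math.exp(b1)   # interpreted exactly like the paper's OR column
```

    The exponentiated coefficient recovers the odds ratio: values below 1 indicate a decreased odds of desiring future contraceptive use among women with the risk factor.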

  12. The Science-Policy Link: Stakeholder Reactions to the Uncertainties of Future Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Plag, H.; Bye, B.

    2011-12-01

    Policy makers and stakeholders in the coastal zone are equally challenged by the risk of an anticipated rise of coastal Local Sea Level (LSL) as a consequence of future global warming. Many low-lying and often densely populated coastal areas are at risk of increased inundation. More than 40% of the global population is living in or near the coastal zone and this fraction is steadily increasing. A rise in LSL will increase the vulnerability of coastal infrastructure and population dramatically, with potentially devastating consequences for the global economy, society, and environment. Policy makers are faced with a trade-off between imposing today the often very high costs of coastal protection and adaptation upon national economies and leaving the costs of potential major disasters to future generations. They are in need of actionable information that provides guidance for the development of coastal zones resilient to future sea level changes. Part of this actionable information comes from risk and vulnerability assessments, which require information on future LSL changes as input. In most cases, a deterministic approach has been applied, based on predictions of the plausible range of future LSL trajectories. However, there is little consensus in the scientific community on how these trajectories should be determined, and what the boundaries of the plausible range are. Over the last few years, many publications in Science, Nature and other peer-reviewed scientific journals have revealed a broad range of possible futures and significant epistemic uncertainties and gaps concerning LSL changes. Based on the somewhat diffuse science input, policy and decision makers have made rather different choices for mitigation and adaptation in cases such as Venice, The Netherlands, New York City, and the San Francisco Bay area.
Replacing the deterministic, prediction-based approach with a statistical one that fully accounts for the uncertainties and epistemic gaps would provide a different kind of science input to policy makers and stakeholders. Like in many other insurance problems (for example, earthquakes), where deterministic predictions are not possible and decisions have to be made on the basis of statistics and probabilities, the statistical approach to coastal resilience would require stakeholders to make decisions on the basis of probabilities instead of predictions. The science input for informed decisions on adaptation would consist of general probabilities of decadal to century scale sea level changes derived from paleo records, including the probabilities for large and rapid rises. Similar to other problems where the appearance of a hazard is associated with a high risk (like a fire in a house), this approach would also require a monitoring and warning system (a "smoke detector") capable of detecting any onset of a rapid sea level rise.

  13. [The Hessian care monitor. Transparency on regional labor markets].

    PubMed

    Lauxen, O; Bieräugel, R

    2013-08-01

    The Hessian Care Monitor is a Web-based monitoring system of the regional care labor market. It contains information on the current labor market and on future developments. Official statistics are analyzed, primary data are collected, and forecasts are calculated. Since 2008, the demand for nurses in Hesse has been higher than the supply. In 2010, there was a lack of more than 4,400 nurses. Moreover, in 2025, around 5,500 additional nurses will be needed to meet the increasing demand arising from demographic changes. However, there are three different regional patterns: regions with high current shortages but little additional demand in the future; regions with low current shortages but large future needs; and regions with high current shortages and large future demand. Appropriate strategies for handling labor shortages have to be selected according to the different regional patterns.

  14. The future of the New Zealand plastic surgery workforce.

    PubMed

    Adams, Brandon M; Klaassen, Michael F; Tan, Swee T

    2013-04-05

    The New Zealand (NZ) plastic and reconstructive surgery (PRS) workforce provides reconstructive plastic surgery (RPS) public services from six centres. There has been little analysis of whether the workforce is adequate to meet the needs of the NZ population, currently or in the future. This study analysed the current workforce, its distribution and future requirements. PRS manpower data, workforce activities, population statistics, and population modelling were analysed to determine current needs and predict future needs for the PRS workforce. The NZ PRS workforce is compared with international benchmarks. Regional variation of the workforce was analysed with respect to the population's access to PRS services. The future supply of specialist plastic surgeons is also analysed. NZ has fewer plastic surgeons per capita than comparable countries. The current NZ PRS workforce is maldistributed, and areas of current and emerging future need are identified. This maldistribution will worsen with future population growth and distribution. Up to 60% of the NZ population will be at risk of inadequate access to PRS services by 2027. Development of PRS services must be coordinated to ensure that equitable and sustainable services are available throughout NZ. Strategies for ensuring a satisfactory future workforce are discussed.

  15. Future Orientation among Students Exposed to School Bullying and Cyberbullying Victimization

    PubMed Central

    Låftman, Sara B.; Alm, Susanne; Sandahl, Julia; Modin, Bitte

    2018-01-01

    Future orientation can be defined as an individual’s thoughts, beliefs, plans, and hopes for the future. Earlier research has shown adolescents’ future orientation to predict outcomes later in life, which makes it relevant to analyze differences in future orientation among youth. The aim of the present study was to analyze if bullying victimization was associated with an increased likelihood of reporting a pessimistic future orientation among school youth. To be able to distinguish between victims and bully-victims (i.e., students who are both bullies and victims), we also took perpetration into account. The data were derived from the Stockholm School Survey performed in 2016 among ninth grade students (ages 15–16 years) (n = 5144). Future orientation and involvement in school bullying and in cyberbullying were based on self-reports. The statistical method used was binary logistic regression. The results demonstrated that victims and bully-victims of school bullying and of cyberbullying were more likely to report a pessimistic future orientation compared with students not involved in bullying. These associations were shown also when involvement in school bullying and cyberbullying were mutually adjusted. The findings underline the importance of anti-bullying measures that target both school bullying and cyberbullying. PMID:29584631

  16. Future Orientation among Students Exposed to School Bullying and Cyberbullying Victimization.

    PubMed

    Låftman, Sara B; Alm, Susanne; Sandahl, Julia; Modin, Bitte

    2018-03-27

    Future orientation can be defined as an individual's thoughts, beliefs, plans, and hopes for the future. Earlier research has shown adolescents' future orientation to predict outcomes later in life, which makes it relevant to analyze differences in future orientation among youth. The aim of the present study was to analyze if bullying victimization was associated with an increased likelihood of reporting a pessimistic future orientation among school youth. To be able to distinguish between victims and bully-victims (i.e., students who are both bullies and victims), we also took perpetration into account. The data were derived from the Stockholm School Survey performed in 2016 among ninth grade students (ages 15-16 years) (n = 5144). Future orientation and involvement in school bullying and in cyberbullying were based on self-reports. The statistical method used was binary logistic regression. The results demonstrated that victims and bully-victims of school bullying and of cyberbullying were more likely to report a pessimistic future orientation compared with students not involved in bullying. These associations were shown also when involvement in school bullying and cyberbullying were mutually adjusted. The findings underline the importance of anti-bullying measures that target both school bullying and cyberbullying.

  17. The importance of vegetation change in the prediction of future tropical cyclone flood statistics

    NASA Astrophysics Data System (ADS)

    Irish, J. L.; Resio, D.; Bilskie, M. V.; Hagen, S. C.; Weiss, R.

    2015-12-01

    Global sea level rise is a near certainty over the next century (e.g., Stocker et al. 2013 [IPCC] and references therein). With sea level rise, coastal topography and land cover (hereafter "landscape") is expected to change and tropical cyclone flood hazard is expected to accelerate (e.g., Irish et al. 2010 [Ocean Eng], Woodruff et al. 2013 [Nature], Bilskie et al. 2014 [Geophys Res Lett], Ferreira et al. 2014 [Coast Eng], Passeri et al. 2015 [Nat Hazards]). Yet, the relative importance of sea-level rise induced landscape change on future tropical cyclone flood hazard assessment is not known. In this paper, idealized scenarios are used to evaluate the relative impact of one class of landscape change on future tropical cyclone extreme-value statistics in back-barrier regions: sea level rise induced vegetation migration and loss. The joint probability method with optimal sampling (JPM-OS) (Resio et al. 2009 [Nat Hazards]) with idealized surge response functions (e.g., Irish et al. 2009 [Nat Hazards]) is used to quantify the present-day and future flood hazard under various sea level rise scenarios. Results are evaluated in terms of their impact on the flood statistics (a) when projected flood elevations are included directly in the JPM analysis (Figure 1) and (b) when represented as additional uncertainty within the JPM integral (Resio et al. 2013 [Nat Hazards]), i.e., as random error. Findings are expected to aid in determining the level of effort required to reasonably account for future landscape change in hazard assessments, namely in determining when such processes are sufficiently captured by added uncertainty and when sea level rise induced vegetation changes must be considered dynamically, via detailed modeling initiatives. Acknowledgements: This material is based upon work supported by the National Science Foundation under Grant No. CMMI-1206271 and by the National Sea Grant College Program of the U.S. 
Department of Commerce's National Oceanic and Atmospheric Administration under Grant No. NA10OAR4170099. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of these organizations. The STOKES ARCC at the University of Central Florida provided computational resources for storm surge simulations.
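    The joint probability method underlying these flood statistics can be reduced to a small sketch: sum the storm-parameter probabilities over the region of parameter space where the surge response exceeds a given flood elevation. The response function, parameter nodes, probabilities, and storm rate below are hypothetical stand-ins (in JPM-OS the nodes would be chosen by optimal sampling):

```python
# Idealized surge response function: peak surge (m) as a hypothetical smooth
# function of central pressure deficit dp (mb) and storm size R (km).
def surge(dp, R):
    return 0.03 * dp * (1 + 0.002 * R)

# Discretized storm-parameter probabilities, stand-ins for a JPM storm
# climatology.
dps = [(30, 0.5), (60, 0.35), (90, 0.15)]   # (pressure deficit, probability)
Rs = [(30, 0.6), (60, 0.4)]                 # (storm size, probability)
rate = 0.05                                 # assumed storms per year at site

def annual_exceedance(z, slr=0.0):
    """Annual rate of the flood elevation exceeding z; an optional sea
    level rise slr is added to every surge response, the simplest way to
    include a future-landscape scenario in the integral."""
    p = 0.0
    for dp, p_dp in dps:
        for R, p_R in Rs:
            if surge(dp, R) + slr > z:
                p += p_dp * p_R
    return rate * p

present = annual_exceedance(2.0)
future = annual_exceedance(2.0, slr=0.5)    # assumed 0.5 m sea level rise
```

    The paper's alternative of folding landscape change into the JPM integral as added uncertainty would instead widen the response distribution rather than shift it, which is cheaper but less physical.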

  18. A combination of routine blood analytes predicts fitness decrement in elderly endurance athletes.

    PubMed

    Haslacher, Helmuth; Ratzinger, Franz; Perkmann, Thomas; Batmyagmar, Delgerdalai; Nistler, Sonja; Scherzer, Thomas M; Ponocny-Seliger, Elisabeth; Pilger, Alexander; Gerner, Marlene; Scheichenberger, Vanessa; Kundi, Michael; Endler, Georg; Wagner, Oswald F; Winker, Robert

    2017-01-01

    Endurance sports are enjoying greater popularity, particularly among new target groups such as the elderly. Predictors of future physical capacities providing a basis for training adaptations are in high demand. We therefore aimed to estimate the future physical performance of elderly marathoners (runners/bicyclists) using a set of easily accessible standard laboratory parameters. To this end, 47 elderly marathon athletes underwent physical examinations including bicycle ergometry and a blood draw at baseline and after a three-year follow-up period. In order to compile a statistical model containing baseline laboratory results allowing prediction of follow-up ergometry performance, the cohort was subgrouped into a model training (n = 25) and a test sample (n = 22). The model containing significant predictors in univariate analysis (alanine aminotransferase, urea, folic acid, myeloperoxidase and total cholesterol) presented with high statistical significance and excellent goodness of fit (R2 = 0.789, ROC-AUC = 0.951±0.050) in the model training sample and was validated in the test sample (ROC-AUC = 0.786±0.098). Our results suggest that standard laboratory parameters could be particularly useful for predicting future physical capacity in elderly marathoners. It hence merits further research whether these conclusions can be translated to other disciplines or age groups.
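    The validation statistic used here, ROC-AUC, has a simple rank interpretation: the probability that the model scores a randomly chosen positive case above a randomly chosen negative one. A minimal sketch with made-up labels and scores (not the study's data):

```python
def roc_auc(labels, scores):
    """ROC-AUC as the fraction of (positive, negative) pairs that the
    score orders correctly, counting ties as half; this is the statistic
    used to judge discrimination in the training and test samples."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical held-out sample mirroring the study's design: 1 = fitness
# decrement observed; scores = model predictions from baseline blood values.
labels = [1, 1, 1, 0, 0, 0, 0, 1]
scores = [0.9, 0.8, 0.5, 0.3, 0.2, 0.6, 0.1, 0.85]
auc = roc_auc(labels, scores)   # 15 of 16 pairs ordered correctly: 0.9375
```

    Validating on a held-out test sample, as the study does, guards against the optimism of the training-sample AUC.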

  19. A combination of routine blood analytes predicts fitness decrement in elderly endurance athletes

    PubMed Central

    Ratzinger, Franz; Perkmann, Thomas; Batmyagmar, Delgerdalai; Nistler, Sonja; Scherzer, Thomas M.; Ponocny-Seliger, Elisabeth; Pilger, Alexander; Gerner, Marlene; Scheichenberger, Vanessa; Kundi, Michael; Endler, Georg; Wagner, Oswald F.; Winker, Robert

    2017-01-01

    Endurance sports are enjoying greater popularity, particularly among new target groups such as the elderly. Predictors of future physical capacities providing a basis for training adaptations are in high demand. We therefore aimed to estimate the future physical performance of elderly marathoners (runners/bicyclists) using a set of easily accessible standard laboratory parameters. To this end, 47 elderly marathon athletes underwent physical examinations including bicycle ergometry and a blood draw at baseline and after a three-year follow-up period. In order to compile a statistical model containing baseline laboratory results allowing prediction of follow-up ergometry performance, the cohort was subgrouped into a model training (n = 25) and a test sample (n = 22). The model containing significant predictors in univariate analysis (alanine aminotransferase, urea, folic acid, myeloperoxidase and total cholesterol) presented with high statistical significance and excellent goodness of fit (R2 = 0.789, ROC-AUC = 0.951±0.050) in the model training sample and was validated in the test sample (ROC-AUC = 0.786±0.098). Our results suggest that standard laboratory parameters could be particularly useful for predicting future physical capacity in elderly marathoners. It hence merits further research whether these conclusions can be translated to other disciplines or age groups. PMID:28475643

  20. Evaluating the uncertainty of predicting future climate time series at the hourly time scale

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.

    2011-12-01

    A stochastic downscaling methodology is developed to generate hourly, point-scale time series for several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions, thereby permitting the generation of time series in an ensemble fashion that reflects both the uncertainty of future climate projections and the uncertainty of the downscaling procedure. Applications of the methodology, with probabilistic expressions of confidence in reproducing future climates for the periods 2000-2009, 2046-2065 and 2081-2100 relative to the 1962-1992 baseline, are discussed for the location of Firenze (Italy). The climate predictions for the period 2000-2009 are tested against observations, permitting assessment of the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.
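    The factor-of-change sampling step can be sketched as follows. The GCM factors, weights, and baseline statistic are hypothetical stand-ins for the Bayesian weighting described above; each sampled factor would then parameterize one run of the hourly weather generator:

```python
import random

random.seed(0)

# Hypothetical factors of change (future/baseline ratio of a precipitation
# statistic) from four GCMs, with weights standing in for the Bayesian
# model weighting of the paper.
factors = [0.85, 0.95, 1.05, 0.90]
weights = [0.4, 0.3, 0.2, 0.1]

historic_mean_mm = 60.0   # assumed observed baseline statistic

def sample_factor():
    """Draw one factor of change from the weighted mixture, with a small
    assumed spread around each GCM's central estimate."""
    f = random.choices(factors, weights=weights)[0]
    return random.gauss(f, 0.03)

# Monte Carlo ensemble of perturbed climate statistics; each draw would
# parameterize one weather-generator realization (AWE-GEN in the paper).
ensemble = [historic_mean_mm * sample_factor() for _ in range(10000)]

mean_proj = sum(ensemble) / len(ensemble)
spread = (sum((e - mean_proj) ** 2 for e in ensemble)
          / len(ensemble)) ** 0.5
```

    The spread of the resulting ensemble is what carries the combined GCM and downscaling uncertainty into the generated hourly series.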

  1. It's the Heat AND the Humidity -- Assessment of Extreme Heat Scenarios to Enable the Assessment of Climate Impacts on Public Health

    NASA Technical Reports Server (NTRS)

    Crosson, William L; Al-Hamdan, Mohammad Z.; Economou, Sigrid, A.; Estes, Maurice G.; Estes, Sue M.; Puckett, Mark; Quattrochi, Dale A

    2013-01-01

    In the United States, extreme heat is the most deadly weather-related hazard. In the face of a warming climate and urbanization, which contributes to local-scale urban heat islands, it is very likely that extreme heat events (EHEs) will become more common and more severe in the U.S. In a NASA-funded project supporting the National Climate Assessment, we are providing historical and future measures of extreme heat to enable assessments of the impacts of heat on public health over the coterminous U.S. We use atmospheric temperature and humidity information from meteorological reanalysis and from Global Climate Models (GCMs) to provide data on past and future heat events. The project's emphasis is on providing assessments of the magnitude, frequency and geographic distribution of extreme heat in the U.S. to facilitate public health studies. In our approach, long-term climate change is captured with GCM output, and the temporal and spatial characteristics of short-term extremes are represented by the reanalysis data. Two future time horizons, 2040 and 2090, are the focus of future assessments; these are compared to the recent past period of 1981-2000. We are characterizing regional-scale temperature and humidity conditions using GCM output for two climate change scenarios (A2 and A1B) defined in the Special Report on Emissions Scenarios (SRES). For each future period, 20 years of multi-model GCM output have been analyzed to develop a heat stress climatology based on statistics of extreme heat indicators. Differences between the two future and past periods have been used to define temperature and humidity changes on a monthly time scale and regional spatial scale. These changes, combined with hourly historical meteorological data at a spatial scale (12 km) much finer than that of GCMs, enable us to create future climate realizations, from which we compute the daily heat stress measures and related spatially-specific climatological fields.
These include the mean annual number of days above certain thresholds of maximum and minimum air temperatures, heat indices and a new heat stress variable that gives an integrated measure of heat stress (and relief) over the course of a day. Comparisons are made between projected (2040 and 2090) and past (1990) heat stress statistics. All output is being provided at the 12 km spatial scale and will also be aggregated to the county level, which is a popular scale of analysis for public health researchers. County-level statistics will be made available by our collaborators at the Centers for Disease Control and Prevention (CDC) via the Wide-ranging Online Data for Epidemiologic Research (WONDER) system. CDC WONDER makes the information resources of the CDC available to public health professionals and the general public. This addition of heat stress measures to CDC WONDER will allow decision and policy makers to assess the impact of alternative approaches to optimize the public health response to EHEs. It will also allow public health researchers and policy makers to better include such heat stress measures in the context of national health data available in the CDC WONDER system. The users will be able to spatially and temporally query public health and heat-related data sets and create county-level maps and statistical charts of such data across the coterminous U.S.
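    One of the indicators mentioned, the mean annual number of days above a temperature threshold, is straightforward to compute once daily maxima are available. A sketch with synthetic data and an assumed +2 °C GCM delta (the real workflow applies monthly, regional deltas to 12 km hourly data):

```python
import random

random.seed(7)

# Synthetic 20-year record of daily maximum temperature (degC) for one
# grid cell; in the project these would come from hourly reanalysis data
# adjusted by GCM-derived monthly changes.
def synthetic_year():
    return [28 + 8 * random.random() for _ in range(365)]

years = [synthetic_year() for _ in range(20)]

def mean_days_above(years, threshold):
    """Mean annual count of days with Tmax above a threshold, one of the
    extreme-heat indicators aggregated to county level in the study."""
    counts = [sum(1 for t in year if t > threshold) for year in years]
    return sum(counts) / len(counts)

baseline = mean_days_above(years, 32.5)

# Apply an assumed +2 degC change (the GCM delta for a future period) and
# recompute; the difference is the projected change in extreme-heat days.
future_years = [[t + 2.0 for t in year] for year in years]
future = mean_days_above(future_years, 32.5)
```

    The same counting pattern extends to heat indices and minimum-temperature thresholds, which is how the county-level statistics for CDC WONDER would be assembled.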

  2. Validity and usability of a safe driving behaviors measure for older adults : strategy for congestion mitigation.

    DOT National Transportation Integrated Search

    2012-01-01

    Statistics project that crash/injury/fatality rates of older drivers will increase with the future growth of : this population. Accurate and precise measurement of older driver behaviors becomes imperative to : curtail these crash trends and resultin...

  3. Social Factors Impacting Recruitment and Retention of the Civilian Acquisition Workforce

    DTIC Science & Technology

    2002-12-01

    their own futures through entrepreneurism . Some marketing experts have caught on. In 1995, Prudential Insurance replaced its previous slogan, "Get a...counterpart… Millennial teens are extending Gen-X improvements. The latest-- 1999--statistics show record lows: suicide, lowest level since 1959

  4. 78 FR 41045 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... an applicant's personality traits and tendencies, but the results are not part of the application... personality traits (or combination of factors) constitute statistically significant indicators of success as... to assess whether certain traits and/or behaviors are indicators of future success in the JAG Corps...

  5. Newer classification and regression tree techniques: Bagging and Random Forests for ecological prediction

    Treesearch

    Anantha M. Prasad; Louis R. Iverson; Andy Liaw

    2006-01-01

    We evaluated four statistical models - Regression Tree Analysis (RTA), Bagging Trees (BT), Random Forests (RF), and Multivariate Adaptive Regression Splines (MARS) - for predictive vegetation mapping under current and future climate scenarios according to the Canadian Climate Centre global circulation model.
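    Bagging, the simplest of the newer techniques evaluated, can be sketched in a few lines: fit a weak learner (here a one-split regression stump) to bootstrap resamples of the data and average the predictions. The species-abundance data below are synthetic, purely for illustration:

```python
import random

random.seed(5)

# Toy data: x = mean annual temperature (degC), y = an abundance index
# with an assumed step response plus noise.
xs = [random.uniform(5, 20) for _ in range(120)]
data = [(x, (1.0 if x > 12 else 0.2) + random.gauss(0, 0.1)) for x in xs]

def fit_stump(sample):
    """Best single-split regression tree (depth 1) by squared error."""
    best = None
    for cut in sorted({x for x, _ in sample}):
        left = [y for x, y in sample if x <= cut]
        right = [y for x, y in sample if x > cut]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, cut, ml, mr)
    _, cut, ml, mr = best
    return lambda x: ml if x <= cut else mr

def bagged_predict(x, n_trees=25):
    """Bagging: average stumps fit to bootstrap resamples of the data."""
    preds = []
    for _ in range(n_trees):
        boot = [random.choice(data) for _ in data]
        preds.append(fit_stump(boot)(x))
    return sum(preds) / len(preds)
```

    Random Forests add one further ingredient on top of this: at each split, only a random subset of predictor variables is considered, which decorrelates the trees.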

  6. Aquatic invasive species early detection in the Great Lakes: Lessons concerning strategy

    EPA Science Inventory

    Great Lakes coastal systems are vulnerable to introduction of a wide variety of non-indigenous species (NIS), and the desire to effectively respond to future invaders is prompting efforts towards establishing a broad early-detection network. Such a network requires statistically...

  7. 77 FR 8804 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-15

    ..., is frequently used to monitor the business cycle. This survey provides an essential component of the... planning and analysis to business firms, trade associations, research and consulting agencies, and academia... project future movements in manufacturing activity. These statistics are valuable for analysts of business...

  8. The Optical Gravitational Lensing Experiment

    NASA Technical Reports Server (NTRS)

    Udalski, A.; Szymanski, M.; Kaluzny, J.; Kubiak, M.; Mateo, Mario

    1992-01-01

    The technical features are described of the Optical Gravitational Lensing Experiment, which aims to detect a statistically significant number of microlensing events toward the Galactic bulge. Clusters of galaxies observed during the 1992 season are listed and discussed and the reduction methods are described. Future plans are addressed.

  9. 50 CFR 600.315 - National Standard 2-Scientific Information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...., abundance, environmental, catch statistics, market and trade trends) provide time-series information on... comment should be solicited at appropriate times during the review of scientific information... information or the promise of future data collection or analysis. In some cases, due to time constraints...

  10. Architectural Barriers to the Physically Disabled.

    ERIC Educational Resources Information Center

    Kirkland, Sue-Anne

    Presented is evidence on the increasing need to plan for the accommodation of the physically handicapped in the design and construction of present and future public buildings and transportation facilities in Canada. Terms such as "architectural barriers" and "disability" are defined. Statistics on disability incidence in Canada…

  11. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future.

    PubMed

    Barnes, Stephen; Benton, H Paul; Casazza, Krista; Cooper, Sara J; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K; Renfrow, Matthew B; Tiwari, Hemant K

    2016-08-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. Copyright © 2016 John Wiley & Sons, Ltd.

  12. Estimating Traffic Accidents in Turkey Using Differential Evolution Algorithm

    NASA Astrophysics Data System (ADS)

    Akgüngör, Ali Payıdar; Korkmaz, Ersin

    2017-06-01

    Estimating traffic accidents plays a vital role in applying road safety procedures. This study proposes Differential Evolution Algorithm (DEA) models to estimate the number of accidents in Turkey. In the model development, population (P) and the number of vehicles (N) are selected as model parameters. Three model forms, linear, exponential and semi-quadratic, are developed using DEA with data covering the years 2000 to 2014. The developed models are statistically compared to select the best-fit model. The results of the DE models show that the linear form is suitable for estimating the number of accidents. The statistics of this form are better than those of the other forms in terms of the performance criteria used, the Mean Absolute Percentage Error (MAPE) and the Root Mean Square Error (RMSE). To investigate the performance of the linear DE model for future estimation, a ten-year period from 2015 to 2024 is considered. The results obtained from these future estimates reveal the suitability of the DE method for road safety applications.
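    A classic DE/rand/1/bin optimizer of the kind used here is short to implement. The sketch below fits the linear form by minimizing RMSE; the (population, vehicles, accidents) data and the coefficients are synthetic and illustrative, not the Turkish series:

```python
import random

random.seed(3)

# Synthetic (population, vehicles) -> accidents data standing in for the
# 2000-2014 series; the underlying linear relation here is assumed.
points = [(random.uniform(60, 80), random.uniform(10, 25))
          for _ in range(15)]
data = [(p, v, 2.0 * p + 1.5 * v + 10 + random.gauss(0, 5))
        for p, v in points]

def rmse(params):
    """Root Mean Square Error of the linear form a*P + b*N + c."""
    a, b, c = params
    errs = [(a * p + b * v + c - y) ** 2 for p, v, y in data]
    return (sum(errs) / len(errs)) ** 0.5

def differential_evolution(cost, bounds, pop_size=20, F=0.8, CR=0.9,
                           gens=300):
    """Classic DE/rand/1/bin: scaled difference-vector mutation, binomial
    crossover, greedy selection (no bound handling, for brevity)."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            r1, r2, r3 = random.sample(
                [j for j in range(pop_size) if j != i], 3)
            mutant = [pop[r1][d] + F * (pop[r2][d] - pop[r3][d])
                      for d in range(dim)]
            jrand = random.randrange(dim)
            trial = [mutant[d] if (random.random() < CR or d == jrand)
                     else pop[i][d] for d in range(dim)]
            if cost(trial) <= cost(pop[i]):
                pop[i] = trial
    return min(pop, key=cost)

best = differential_evolution(rmse, [(-5, 5), (-5, 5), (-50, 50)])
```

    Swapping `rmse` for a MAPE cost, or the linear form for the exponential or semi-quadratic forms, needs no change to the optimizer itself, which is the appeal of DE for this kind of model comparison.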

  13. Special-"T" Training: Extended Follow-up Results from a Residency-Wide Professionalism Workshop on Transgender Health.

    PubMed

    Kidd, Jeremy D; Bockting, Walter; Cabaniss, Deborah L; Blumenshine, Philip

    2016-10-01

    Transgender people face unique challenges when accessing health care, including stigma and discrimination. Most residency programs devote little time to this marginalized population. The authors developed a 90-min workshop to enhance residents' ability to empathize with and professionally treat transgender patients. Attendees completed pre-, post-, and 90-day follow-up surveys to assess perceived empathy, knowledge, comfort, interview skill, and motivation for future learning. Twenty-two residents (64.7%) completed pre- and post-workshop surveys; 90.9% of these completed the 90-day follow-up. Compared to baseline, there were statistically significant post-workshop increases in perceived empathy, knowledge, comfort, and motivation for future learning. However, on 90-day follow-up, there were no statistically significant differences from baseline across any of the five domains. This workshop produced significant short-term increases in resident professionalism toward transgender patients, but the extended follow-up results highlight the limitations of one-time interventions and call for recurrent programming to yield durable improvements.

  14. Holo-analysis.

    PubMed

    Rosen, G D

    2006-06-01

    Meta-analysis is a vague descriptor used to encompass very diverse methods of data collection and analysis, ranging from simple averages to more complex statistical methods. Holo-analysis is a fully comprehensive statistical analysis of all available data and all available variables in a specified topic, with results expressed in a holistic factual empirical model. The objectives and applications of holo-analysis include software production for prediction of responses with confidence limits, translation of research conditions to praxis (field) circumstances, exposure of key missing variables, discovery of theoretically unpredictable variables and interactions, and planning future research. Holo-analyses are cited as examples of the effects on broiler feed intake and live weight gain of exogenous phytases, which account for 70% of variation in responses in terms of 20 highly significant chronological, dietary, environmental, genetic, managemental, and nutrient variables. Even better future accountancy of variation will be facilitated if and when authors of papers routinely provide key data for currently neglected variables, such as temperatures, complete feed formulations, and mortalities.

  15. Mathematics preparation for medical school: do all premedical students need calculus?

    PubMed

    Nusbaum, Neil J

    2006-01-01

    The premedical student confronts a disparate set of required and recommended courses from the various medical schools to which the student might apply. Students may feel compelled to take courses such as calculus even though most medical schools do not require it and even though it may not be related to either undergraduate academic plans or the core academic needs of the typical future physician. Basic mathematical skills--algebra, statistics, and overall numeracy--are each more important for most future physicians than is the traditional calculus course.

  16. Origin and Future of Plasmonic Optical Tweezers

    PubMed Central

    Huang, Jer-Shing; Yang, Ya-Tang

    2015-01-01

    Plasmonic optical tweezers can overcome the diffraction limits of conventional optical tweezers and enable the trapping of nanoscale objects. Extension of the trapping and manipulation of nanoscale objects with nanometer position precision opens up unprecedented opportunities for applications in the fields of biology, chemistry and statistical and atomic physics. Potential applications include direct molecular manipulation, lab-on-a-chip applications for viruses and vesicles and the study of nanoscale transport. This paper reviews the recent research progress and development bottlenecks and provides an overview of possible future directions in this field. PMID:28347051

  17. Origin and Future of Plasmonic Optical Tweezers.

    PubMed

    Huang, Jer-Shing; Yang, Ya-Tang

    2015-06-12

    Plasmonic optical tweezers can overcome the diffraction limits of conventional optical tweezers and enable the trapping of nanoscale objects. Extension of the trapping and manipulation of nanoscale objects with nanometer position precision opens up unprecedented opportunities for applications in the fields of biology, chemistry and statistical and atomic physics. Potential applications include direct molecular manipulation, lab-on-a-chip applications for viruses and vesicles and the study of nanoscale transport. This paper reviews the recent research progress and development bottlenecks and provides an overview of possible future directions in this field.

  18. Addressing the mischaracterization of extreme rainfall in regional climate model simulations - A synoptic pattern based bias correction approach

    NASA Astrophysics Data System (ADS)

    Li, Jingwan; Sharma, Ashish; Evans, Jason; Johnson, Fiona

    2018-01-01

    Addressing systematic biases in regional climate model simulations of extreme rainfall is a necessary first step before assessing changes in future rainfall extremes. Commonly used bias correction methods are designed to match statistics of the overall simulated rainfall with observations. This assumes that any change in the mix of different types of extreme rainfall events (i.e. convective and non-convective) in a warmer climate is of little relevance in the estimation of overall change, an assumption that is not supported by empirical or physical evidence. This study proposes an alternative approach to account for the potential change of alternate rainfall types, characterized here by synoptic weather patterns (SPs) classified using self-organizing maps. The objective of this study is to evaluate the added influence of SPs on the bias correction, which is achieved by comparing the corrected distribution of future extreme rainfall with that obtained using conventional quantile mapping. A comprehensive synthetic experiment is first defined to investigate the conditions under which the additional information of SPs makes a significant difference to the bias correction. Using over 600,000 synthetic cases, statistically significant differences are found in 46% of cases. This is followed by a case study over the Sydney region using a high-resolution run of the Weather Research and Forecasting (WRF) regional climate model, which indicates a small change in the proportions of the SPs and a statistically significant change in the extreme rainfall over the region, although the differences between the changes obtained from the two bias correction methods are not statistically significant.
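
    The conventional benchmark used here, quantile mapping, replaces each simulated value with the observed value at the same empirical quantile. A minimal sketch of that idea (the rainfall samples are hypothetical; the synoptic-pattern variant would fit one such mapping per weather pattern):

```python
import bisect

def quantile_map(value, model_sorted, obs_sorted):
    # Empirical quantile of `value` within the model's own climatology
    q = min(bisect.bisect_right(model_sorted, value) / len(model_sorted), 1.0)
    # Read the corresponding quantile off the observed distribution
    idx = min(int(q * len(obs_sorted)), len(obs_sorted) - 1)
    return obs_sorted[idx]

# Hypothetical daily rainfall samples (mm); the model is biased low in the tail
obs = sorted([0.0, 0.0, 1.0, 2.0, 4.0, 6.0, 9.0, 14.0, 22.0, 38.0])
model = sorted([0.0, 0.0, 0.5, 1.5, 3.0, 4.5, 7.0, 10.0, 15.0, 24.0])

corrected_extreme = quantile_map(24.0, model, obs)  # model maximum maps to observed maximum
```

    A production implementation would interpolate between quantiles and handle values beyond the calibration range; this sketch only shows the rank-matching idea at the heart of the method.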

  19. Current and future pluvial flood hazard analysis for the city of Antwerp

    NASA Astrophysics Data System (ADS)

    Willems, Patrick; Tabari, Hossein; De Niel, Jan; Van Uytven, Els; Lambrechts, Griet; Wellens, Geert

    2016-04-01

    For the city of Antwerp in Belgium, higher rainfall extremes were observed in comparison with surrounding areas. The differences were found statistically significant for some areas and may be the result of the heat island effect in combination with higher concentrations of aerosols. A network of 19 rain gauges with varying record lengths (the longest since the 1960s) and continuous radar data for 10 years were combined to map the spatial variability of rainfall extremes over the city, together with its uncertainty, at durations from 15 minutes to 1 day. The improved spatial rainfall information was used as input to the sewer system model of the city to analyze the frequency of urban pluvial floods. Comparison with historical flood observations from various sources (fire brigade and media) confirmed that the improved spatial rainfall information also improved the sewer impact results, in terms of both the magnitude and frequency of sewer floods. Next to these improved urban flood impact results for recent and current climatological conditions, the new insights on the local rainfall microclimate were also helpful to enhance future projections of rainfall extremes and pluvial floods in the city. This was done by improved statistical downscaling of all available CMIP5 global climate model runs (160 runs) for the 4 RCP scenarios, as well as the available EURO-CORDEX regional climate model runs. Two types of statistical downscaling methods were applied for that purpose (a weather typing based method and a quantile perturbation approach), making use of the microclimate results and their dependency on specific weather types.
Changes in extreme rainfall intensities were analyzed and mapped as a function of the RCP scenario, together with the uncertainty, decomposed into the uncertainties related to the climate models, the climate model initialization or limited length of the 30-year time series (natural climate variability), and the statistical downscaling (albeit limited to two types of methods). These were finally transferred into future pluvial flash flood hazard maps for the city, together with their uncertainties, and serve as a basis for spatial planning and adaptation.

  20. Nepal’s Strategic Future: Following India, or China, or Middle Road

    DTIC Science & Technology

    2010-12-10

    Statements The policies of the Nepalese, Indian, and Chinese governments, official speeches and statements, and statistics account for a large portion......vividly portrays Nepal as a country of extraordinary contrasts, which has been constantly buffeted throughout history by China and India. He further

  1. Iowa's forest resources, 1974.

    Treesearch

    John S. Jr. Spencer; Pamela J. Jakes

    1980-01-01

    The second inventory of Iowa's forest resources shows big declines in commercial forest area and in growing-stock and sawtimber volumes between 1954 and 1974. Presented are text and statistics on forest area and timber volume, growth, mortality, ownership, stocking, future timber supply, timber use, forest management opportunities, and nontimber resources.

  2. Climate model biases and statistical downscaling for application in hydrologic model

    USDA-ARS?s Scientific Manuscript database

    Climate change impact studies use global climate model (GCM) simulations to define future temperature and precipitation. The best available bias-corrected GCM output was obtained from Coupled Model Intercomparison Project phase 5 (CMIP5). CMIP5 data (temperature and precipitation) are available in d...

  3. 78 FR 38694 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-27

    ... involved in federally funded research. The information obtained by these surveys will be used to assist... also provide statistical and demographic basis for the design of follow-on surveys. Future surveys will..., Associated Form and OMB Number: Customer Satisfaction Surveys--Generic Clearance; OMB Control Number 0704...

  4. Advising African American and Latino Students

    ERIC Educational Resources Information Center

    Roscoe, Jason L.

    2015-01-01

    The volume of minority students entering colleges and universities will increase significantly over the next thirty-five years. Many of these students are statistically under-prepared both academically and socially for the higher education environment. To meet the needs of current and future minority students, particularly those from African…

  5. The fourth Minnesota forest inventory: timber volumes and projections of timber supply.

    Treesearch

    John S. Jr. Spencer

    1982-01-01

    The fourth inventory of Minnesota's forest resources shows a 21% increase in growing-stock volume between 1962 and 1977, from 9.4 to 11.5 billion cubic feet. Presented are text and statistics on timber volume, growth, mortality, removals, and future timber supply.

  6. Depressive personality disorder: theoretical issues, clinical findings, and future research questions.

    PubMed

    Huprich, S K

    1998-08-01

    This article reviews the theoretical construct of depressive personality disorder and its related research. The history of depressive personality disorder is reviewed. It is concluded that differing theories converge on similar descriptions and mechanisms of development for the depressive personality disorder. Substantial empirical work supports the diagnostic distinctiveness of depressive personality disorder in clinical populations. Past and current assessment devices for assessing depressive personality disorder are also described along with their psychometric properties and clinical value. Suggestions are made for future research on the etiology and validity of the depressive personality disorder construct in order to facilitate deciding whether or not to include depressive personality disorder in future editions of the Diagnostic and Statistical Manual of Mental Disorders.

  7. Probabilistic objective functions for sensor management

    NASA Astrophysics Data System (ADS)

    Mahler, Ronald P. S.; Zajic, Tim R.

    2004-08-01

    This paper continues the investigation of a foundational and yet potentially practical basis for control-theoretic sensor management, using a comprehensive, intuitive, system-level Bayesian paradigm based on finite-set statistics (FISST). In this paper we report our most recent progress, focusing on multistep look-ahead -- i.e., allocation of sensor resources throughout an entire future time-window. We determine future sensor states in the time-window using a "probabilistically natural" sensor management objective function, the posterior expected number of targets (PENT). This objective function is constructed using a new "maxi-PIMS" optimization strategy that hedges against unknowable future observation-collections. PENT is used in conjunction with approximate multitarget filters: the probability hypothesis density (PHD) filter or the multi-hypothesis correlator (MHC) filter.

  8. The future population and the future labour force.

    PubMed

    Young, C

    1994-01-01

    "The combination of two recent publications by the Australian Bureau of Statistics (ABS) provides a useful insight into feasible future trends in the population, the labour force and dependency ratios. In addition, earlier ABS census data and its regular publications from the Labour Force Surveys clarify the historical trends in the relative number of dependants and nondependants. These various sources of data are brought together in this paper.... Official population projections...highlight the fact that the combination of annual zero net migration and 10 per cent below replacement fertility would not produce an immediate decline in Australia's population.... The conventional labour-force dependency ratio suggests that the dependency situation in Australia in 2041 will be no worse than it was in the early 1980s." excerpt

  9. Communicating the risks of fetal alcohol spectrum disorder: effects of message framing and exemplification.

    PubMed

    Yu, Nan; Ahern, Lee A; Connolly-Ahern, Colleen; Shen, Fuyuan

    2010-12-01

    Health messages can be either informative or descriptive, and can emphasize either potential losses or gains. This study, guided by message framing theory and exemplification theory, specifically investigated the combined effects of messages with loss-gain frames mixed with statistics or exemplar appeals. The findings revealed a series of main effects and interactions for loss-gain frames and statistics-exemplar appeals on fetal alcohol spectrum disorder (FASD) prevention intention, intention to know more, perceived severity, perceived fear, perceived external efficacy, and perceived internal efficacy. The gain-statistics appeal showed an advantage in promoting perceived efficacy toward FASD, while the loss-exemplar appeal revealed an advantage in increasing prevention intention, perceived severity, and perceived fear toward FASD. Limitations and implications for future research are discussed.

  10. Meta- and statistical analysis of single-case intervention research data: quantitative gifts and a wish list.

    PubMed

    Kratochwill, Thomas R; Levin, Joel R

    2014-04-01

    In this commentary, we add to the spirit of the articles appearing in the special series devoted to meta- and statistical analysis of single-case intervention-design data. Following a brief discussion of historical factors leading to our initial involvement in statistical analysis of such data, we discuss: (a) the value added by including statistical-analysis recommendations in the What Works Clearinghouse Standards for single-case intervention designs; (b) the importance of visual analysis in single-case intervention research, along with the distinctive role that could be played by single-case effect-size measures; and (c) the elevated internal validity and statistical-conclusion validity afforded by the incorporation of various forms of randomization into basic single-case design structures. For the future, we envision more widespread application of quantitative analyses, as critical adjuncts to visual analysis, in both primary single-case intervention research studies and literature reviews in the behavioral, educational, and health sciences. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  11. Effects of climate change on daily minimum and maximum temperatures and cloudiness in the Shikoku region: a statistical downscaling model approach

    NASA Astrophysics Data System (ADS)

    Tatsumi, Kenichi; Oizumi, Tsutao; Yamashiki, Yosuke

    2015-04-01

    In this study, we present a detailed analysis of the effect of changes in cloudiness (CLD) between a future period (2071-2099) and the base period (1961-1990) on daily minimum temperature (TMIN) and maximum temperature (TMAX) in the same period for the Shikoku region, Japan. This analysis was performed using climate data generated with the Statistical DownScaling Model (SDSM). We calibrated the SDSM using the National Center for Environmental Prediction (NCEP) reanalysis dataset as the SDSM input and daily time series of temperature and CLD from 10 surface data points (SDP) in Shikoku. Subsequently, we validated the SDSM outputs (TMIN, TMAX, and CLD), driven by the NCEP reanalysis dataset and by general circulation model (GCM) data, against the SDP. The GCM data used in the validation procedure were those from the Hadley Centre Coupled Model, version 3 (HadCM3) for the Special Report on Emission Scenarios (SRES) A2 and B2 scenarios and from the third generation Coupled Global Climate Model (CGCM3) for the SRES A2 and A1B scenarios. Finally, the validated SDSM was run to study the effect of future changes in CLD on TMIN and TMAX. Our analysis showed that (1) the negative linear fit between changes in TMAX and those in CLD was statistically significant in winter, while the relationship between the two changes was not evident in summer; (2) the dependency of future changes in TMAX and TMIN on future changes in CLD was more evident in winter than in other seasons with the present SDSM; (3) the diurnal temperature range (DTR) decreased in the southern part of Shikoku in summer in all the SDSM projections, while DTR increased in the northern part of Shikoku in the same season; and (4) the dependencies of changes in DTR on changes in CLD were unclear in summer and winter.
Results of the SDSM simulations performed for climate change scenarios such as those from this study contribute to local-scale agricultural and hydrological simulations and development of agricultural and hydrological models.

  12. Forging a link between mentoring and collaboration: a new training model for implementation science.

    PubMed

    Luke, Douglas A; Baumann, Ana A; Carothers, Bobbi J; Landsverk, John; Proctor, Enola K

    2016-10-13

    Training investigators for the rapidly developing field of implementation science requires both mentoring and scientific collaboration. Using social network descriptive analyses, visualization, and modeling, this paper presents results of an evaluation of the mentoring and collaborations fostered over time through the National Institute of Mental Health (NIMH)-supported Implementation Research Institute (IRI). Data comprised IRI participants' self-reported collaborations and mentoring relationships, measured in three annual surveys from 2012 to 2014. Network descriptive statistics, visualizations, and statistical network modeling were used to examine patterns of mentoring and collaboration among IRI participants and to model the relationship between mentoring and subsequent collaboration. Findings suggest that IRI is successful in forming mentoring relationships among its participants, and that these mentoring relationships are related to future scientific collaborations. Exponential random graph network models demonstrated that mentoring received in 2012 was positively and significantly related to the likelihood of having a scientific collaboration 2 years later in 2014 (p = 0.001). More specifically, mentoring was significantly related to future collaborations focusing on new research (p = 0.009), grant submissions (p = 0.003), and publications (p = 0.017). Predictions based on the network model suggest that for every additional mentoring relationship established in 2012, the likelihood of a scientific collaboration 2 years later is increased by almost 7%. These results support the importance of mentoring in implementation science specifically and team science more generally. Mentoring relationships were established quickly and early by the IRI core faculty.
IRI fellows reported increasing scientific collaboration of all types over time, including starting new research, submitting new grants, presenting research results, and publishing peer-reviewed papers. Statistical network models demonstrated that mentoring was strongly and significantly related to subsequent scientific collaboration, which supported a core design principle of the IRI. Future work should establish the link between mentoring and scientific productivity. These results may be of interest to team science, as they suggest the importance of mentoring for future team collaborations, as well as illustrate the utility of network analysis for studying team characteristics and activities.
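
    An exponential random graph model accounts for dependence among network ties and is beyond a short sketch, but the direction of the mentoring-to-collaboration association can be illustrated with a much simpler dyad-level odds ratio on hypothetical data (not the study's):

```python
# Hypothetical dyads: (mentoring tie in 2012, collaboration tie in 2014)
dyads = [
    (1, 1), (1, 1), (1, 0), (1, 1),                   # mentored pairs
    (0, 0), (0, 1), (0, 0), (0, 0), (0, 0), (0, 1),   # other pairs
]

def collab_rate(mentored):
    # Share of dyads in the group that collaborated two years later
    group = [c for m, c in dyads if m == mentored]
    return sum(group) / len(group)

def odds(p):
    return p / (1 - p)

# An odds ratio above 1 means mentored pairs collaborate disproportionately often
odds_ratio = odds(collab_rate(1)) / odds(collab_rate(0))
```

    Unlike this independent-dyad calculation, an ERGM also conditions on structural effects (reciprocity, shared partners), which is why the study's 7% figure comes from model-based prediction rather than a raw cross-tabulation.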

  13. Neural precursors of future liking and affective reciprocity.

    PubMed

    Zerubavel, Noam; Hoffman, Mark Anthony; Reich, Adam; Ochsner, Kevin N; Bearman, Peter

    2018-04-24

    Why do certain group members end up liking each other more than others? How does affective reciprocity arise in human groups? The prediction of interpersonal sentiment has been a long-standing pursuit in the social sciences. We combined fMRI and longitudinal social network data to test whether newly acquainted group members' reward-related neural responses to images of one another's faces predict their future interpersonal sentiment, even many months later. Specifically, we analyze associations between relationship-specific valuation activity and relationship-specific future liking. We found that one's own future (T2) liking of a particular group member is predicted jointly by actor's initial (T1) neural valuation of partner and by that partner's initial (T1) neural valuation of actor. These actor and partner effects exhibited equivalent predictive strength and were robust when statistically controlling for each other, both individuals' initial liking, and other potential drivers of liking. Behavioral findings indicated that liking was initially unreciprocated at T1 yet became strongly reciprocated by T2. The emergence of affective reciprocity was partly explained by the reciprocal pathways linking dyad members' T1 neural data both to their own and to each other's T2 liking outcomes. These findings elucidate interpersonal brain mechanisms that define how we ultimately end up liking particular interaction partners, how group members' initially idiosyncratic sentiments become reciprocated, and more broadly, how dyads evolve. This study advances a flexible framework for researching the neural foundations of interpersonal sentiments and social relations that-conceptually, methodologically, and statistically-emphasizes group members' neural interdependence. Copyright © 2018 the Author(s). Published by PNAS.

  14. Past and future trends of hydroclimatic intensity over the Indian monsoon region

    NASA Astrophysics Data System (ADS)

    Mohan, T. S.; Rajeevan, M.

    2017-01-01

    The hydroclimatic intensity index (HY-INT) is a single index that quantitatively combines measures of precipitation intensity and dry spell length, thus providing an integrated response of the hydrological cycle to global warming. The HY-INT index is the product of the precipitation intensity (PINT, intensity during wet days) and dry spell length (DSL). Using the observed gridded rainfall data sets for the 1951-2010 period, the changes in HY-INT, PINT, and DSL over the Indian monsoon region have been examined, in addition to changes in maximum consecutive dry days (MCD). We have also considered 10 Coupled Model Intercomparison Project Phase 5 (CMIP5) climate models for examining the changes in these indices under the present-day and future climate change scenarios. For climate change projections, the Representative Concentration Pathway (RCP) 4.5 scenario was considered. The analysis of observational data during the period 1951-2010 suggested an increase in DSL and MCD over most of central India. Further, a statistically significant (95% level) increase in HY-INT is also noted during 1951-2010, caused mainly by a significant increase in precipitation intensity. The CMIP5 model projections of future climate also suggest a statistically significant increase in HY-INT over the Indian region. Of the 10 models considered, seven suggest a consistent increase in HY-INT during the period 2010-2100 under the RCP4.5 scenario. However, the projected increase in HY-INT is mainly due to an increase in precipitation intensity, while dry spell length (DSL) shows little change in the future climate.
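
    Since HY-INT is defined as the product of wet-day precipitation intensity and mean dry spell length, it can be computed directly from a daily series. A minimal sketch (the series is hypothetical, and the full index additionally normalizes both factors by their reference-period means, which is omitted here):

```python
def hydroclimatic_intensity(precip, wet_threshold=1.0):
    # PINT: mean precipitation over wet days (>= threshold, in mm)
    wet = [p for p in precip if p >= wet_threshold]
    pint = sum(wet) / len(wet)

    # DSL: mean length of runs of consecutive dry days
    spells, run = [], 0
    for p in precip:
        if p < wet_threshold:
            run += 1
        elif run:
            spells.append(run)
            run = 0
    if run:
        spells.append(run)
    dsl = sum(spells) / len(spells)

    return pint * dsl, pint, dsl

# Hypothetical daily rainfall (mm)
series = [0, 0, 12, 0, 0, 0, 8, 25, 0, 5]
hy_int, pint, dsl = hydroclimatic_intensity(series)
```

    Because the index multiplies the two factors, it rises whenever rain concentrates into fewer, heavier events separated by longer dry spells, which is exactly the signal the abstract reports.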

  15. A Statistical Modeling Framework for Projecting Future Ambient Ozone and its Health Impact due to Climate Change

    PubMed Central

    Chang, Howard H.; Hao, Hua; Sarnat, Stefanie Ebelt

    2014-01-01

    The adverse health effects of ambient ozone are well established. Given the high sensitivity of ambient ozone concentrations to meteorological conditions, the impacts of future climate change on ozone concentrations and the associated health effects are of concern. We describe a statistical modeling framework for projecting future ozone levels and their health impacts under a changing climate. This is motivated by the continual effort to evaluate projection uncertainties to inform public health risk assessment. The proposed approach was applied to the 20-county Atlanta metropolitan area using regional climate model (RCM) simulations from the North American Regional Climate Change Assessment Program. Future ozone levels and ozone-related excesses in asthma emergency department (ED) visits were examined for the period 2041–2070. The computationally efficient approach allowed us to consider 8 sets of climate model outputs based on different combinations of 4 RCMs and 4 general circulation models. Compared to the historical period of 1999–2004, we found consistent projections across climate models of an average 11.5% increase in ozone levels (range: 4.8% to 16.2%) and an average 8.3% increase (range: −7% to 24%) in the number of ozone exceedance days. Assuming no change in the at-risk population, this corresponds to excess ozone-related ED visits ranging from 267 to 466 visits per year. Health impact projection uncertainty was driven predominantly by uncertainty in the health effect association and climate model variability. Calibrating climate simulations with historical observations reduced differences in projections across climate models. PMID:24764746
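
    Excess-visit figures of this kind come from applying a concentration-response function to projected ozone changes. A hedged sketch using a standard log-linear health impact form (all numbers hypothetical, not the study's estimates):

```python
import math

def excess_visits(baseline_visits, beta, delta_ozone_ppb):
    # Log-linear health impact function: excess counts attributable to a
    # concentration change, given a coefficient `beta` (per ppb)
    return baseline_visits * (math.exp(beta * delta_ozone_ppb) - 1.0)

# Hypothetical inputs: beta from an assumed relative risk of 1.01 per
# 10 ppb ozone, 10,000 baseline asthma ED visits/year, +6 ppb projected
beta = math.log(1.01) / 10.0
projected_excess = excess_visits(10_000, beta, 6.0)
```

    Propagating uncertainty means repeating this calculation over the distribution of `beta` (the health effect association) and over the ensemble of projected ozone changes, the two sources the abstract identifies as dominant.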

  16. Parameter Estimation and Model Validation of Nonlinear Dynamical Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, Henry; Gill, Philip

    In the performance period of this work under a DOE contract, the co-PIs, Philip Gill and Henry Abarbanel, developed new methods of statistical data assimilation for problems of DOE interest, including geophysical and biological problems. This included numerical optimization algorithms for variational principles and new parallel-processing Monte Carlo routines for performing the path integrals of statistical data assimilation. These results are summarized in the monograph “Predicting the Future: Completing Models of Observed Complex Systems” by Henry Abarbanel, published by Springer-Verlag in June 2013. Additional results and details have appeared in the peer-reviewed literature.

  17. 32 CFR 310.22 - Non-consensual conditions of disclosure.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) Present and past position titles. (3) Present and past grades. (4) Present and past annual salary rates...) Grade or position. (3) Date of grade. (4) Gross salary. (5) Present and past assignments. (6) Future... Agencies”). (e) Disclosures for statistical research or reporting. (1) Records may be disclosed for...

  18. 32 CFR 310.22 - Non-consensual conditions of disclosure.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) Present and past position titles. (3) Present and past grades. (4) Present and past annual salary rates...) Grade or position. (3) Date of grade. (4) Gross salary. (5) Present and past assignments. (6) Future... Agencies”). (e) Disclosures for statistical research or reporting. (1) Records may be disclosed for...

  19. 32 CFR 310.22 - Non-consensual conditions of disclosure.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Present and past position titles. (3) Present and past grades. (4) Present and past annual salary rates...) Grade or position. (3) Date of grade. (4) Gross salary. (5) Present and past assignments. (6) Future... Agencies”). (e) Disclosures for statistical research or reporting. (1) Records may be disclosed for...

  20. The International Provision and Supply of Publications.

    ERIC Educational Resources Information Center

    Line, Maurice B.; And Others

    As part of a Universal Availability of Publications (UAP) program, this report describes the current situation in international interlending and possible future models. From a review of literature and statistics previously collected, and a 1979 study of 15 international supply centers, it is concluded that international loan demand is increasing…

  1. The Evolution of Organization Analysis in ASQ, 1959-1979.

    ERIC Educational Resources Information Center

    Daft, Richard L.

    1980-01-01

    During the period 1959-1979, a sharp trend toward low-variety statistical languages has taken place, which may represent an organizational mapping phase in which simple, quantifiable relationships have been formally defined and measured. A broader scope of research languages will be needed in the future. (Author/IRT)

  2. Mapping Remote and Multidisciplinary Learning Barriers: Lessons from "Challenge-Based Innovation" at CERN

    ERIC Educational Resources Information Center

    Jensen, Matilde Bisballe; Utriainen, Tuuli Maria; Steinert, Martin

    2018-01-01

    This paper presents the difficulties experienced by students participating in Challenge-Based Innovation, the multidisciplinary, remote-collaboration engineering design course at CERN, with the aim of identifying learning barriers and improving future learning experiences. We statistically analyse the rated differences between distinct design…

  3. How to Manage an Extensive Laserdisk Installation: The Texas A&M Experience.

    ERIC Educational Resources Information Center

    Tucker, Sandra L.; And Others

    1988-01-01

    The second of two articles on the acquisition and implementation of a large laserdisk service at Texas A&M University covers equipment and supplies, future plans, service, staffing, training of staff and patrons, and statistics. A floor plan, user instruction sheet, and news release are included. (MES)

  4. Timber in Missouri, 1972.

    Treesearch

    John S. Jr. Spencer; Burton L. Essex

    1976-01-01

    The third inventory of Missouri's timber resource shows a small gain in growing-stock volume and a somewhat larger gain in sawtimber volume since 1959. Area of commercial forest declined sharply between surveys. Presented are text and statistics on forest area and timber volume, growth, mortality, ownership, stocking, future timber supply, and forest management...

  5. VOCATIONAL EDUCATION INFORMATION SYSTEM. FINAL REPORT.

    ERIC Educational Resources Information Center

    ZWICKEL, I.; AND OTHERS

    State- and federal-level design specifications were developed for a system capable of collecting and reducing nationwide statistical data on vocational education. These specifications were expected to provide the basis for the adoption by all states of an information reporting system that would meet both present and future federal reporting…

  6. Deindustrialization and the Shift to Services.

    ERIC Educational Resources Information Center

    Kutscher, Ronald E.; Personick, Valerie A.

    1986-01-01

    Bureau of Labor Statistics data show the industrial sector as a whole in healthy shape, but a few manufacturing industries in deep trouble. These industries include tobacco manufacturers, iron and steel foundries, leather products, and steel manufacturers. The article also examines shifts in employment and output, job quality, and the outlook for the future. (CT)

  7. Statistical control in hydrologic forecasting.

    Treesearch

    H.G. Wilm

    1950-01-01

    With rapidly growing development and uses of water, a correspondingly great demand has developed for advance estimates of the volumes or rates of flow which are supplied by streams. Therefore much attention is being devoted to hydrologic forecasting, and numerous methods have been tested in efforts to make increasingly reliable estimates of future supplies.

  8. Adaptation of irrigation infrastructure on irrigation demands under future drought in the USA

    USDA-ARS?s Scientific Manuscript database

    More severe droughts in the United States will bring great challenges to irrigation water supply. Here, the authors assessed the potential adaptive effects of irrigation infrastructure under present and more extensive droughts. Based on data over 1985–2005, this study established a statistical model...

  9. Libraries and Computing Centers: Issues of Mutual Concern.

    ERIC Educational Resources Information Center

    Metz, Paul; Potter, William G.

    1989-01-01

    The first of two articles discusses the advantages of online subject searching, the recall and precision tradeoff, and possible future developments in electronic searching. The second reviews the experiences of academic libraries that offer online searching of bibliographic, full text, and statistical databases in addition to online catalogs. (CLB)

  10. Developing Sensitivity to Subword Combinatorial Orthographic Regularity (SCORe): A Two-Process Framework

    ERIC Educational Resources Information Center

    Mano, Quintino R.

    2016-01-01

    Accumulating evidence suggests that literacy acquisition involves developing sensitivity to the statistical regularities of the textual environment. To organize accumulating evidence and help guide future inquiry, this article integrates data from disparate fields of study and formalizes a new two-process framework for developing sensitivity to…

  11. Emotional Relationships between Mothers and Infants: Knowns, Unknowns, and Unknown Unknowns

    PubMed Central

    Bornstein, Marc H.; Suwalsky, Joan T. D.; Breakstone, Dana A.

    2012-01-01

    An overview of the literature pertaining to the construct of emotional availability is presented, illustrated by a sampling of relevant studies. Methodological, statistical, and conceptual problems in the existing corpus of research are discussed, and suggestions for improving future investigations of this important construct are offered. PMID:22292998

  12. A MOOC on Approaches to Machine Translation

    ERIC Educational Resources Information Center

    Costa-jussà, Mart R.; Formiga, Lluís; Torrillas, Oriol; Petit, Jordi; Fonollosa, José A. R.

    2015-01-01

    This paper describes the design, development, and analysis of a MOOC entitled "Approaches to Machine Translation: Rule-based, statistical and hybrid", and provides lessons learned and conclusions to be taken into account in the future. The course was developed within the Canvas platform, used by recognized European universities. It…

  13. An automated library financial management system

    NASA Technical Reports Server (NTRS)

    Dueker, S.; Gustafson, L.

    1977-01-01

    A computerized library acquisition system developed for control of informational materials acquired at NASA Ames Research Center is described. The system monitors the acquisition of both library and individual researchers' orders and supplies detailed financial, statistical, and bibliographical information. Applicability for other libraries and the future availability of the program are discussed.

  14. EVALUATION AND INTERPRETATION OF NEURODEVELOPMENTAL ENDPOINTS FOR HUMAN HEALTH RISK ASSESSMENT -- POSITIVE CONTROL STUDIES, NORMAL VARIABILITY AND STATISTICAL ISSUES.

    EPA Science Inventory

    ILSI Research Foundation/Risk Science Institute convened an expert working group to assess the lessons learned from the implementation of the EPA Developmental Neurotoxicity (DNT) Guideline and provide guidance for future use. The group prepared manuscripts in five areas: public ...

  15. Technology Acceptance Predictors among Student Teachers and Experienced Classroom Teachers

    ERIC Educational Resources Information Center

    Smarkola, Claudia

    2007-01-01

    This study investigated 160 student teachers' and 158 experienced teachers' self-reported computer usage and their future intentions to use computer applications for school assignments. The Technology Acceptance Model (TAM) was used as the framework to determine computer usage and intentions. Statistically significant results showed that after…

  16. Progress towards design elements for a Great Lakes-wide aquatic invasive species early detection network

    EPA Science Inventory

    Great Lakes coastal systems are vulnerable to introduction of a wide variety of non-indigenous species (NIS), and the desire to effectively respond to future invaders is prompting efforts towards establishing a broad early-detection network. Such a network requires statistically...

  17. Development of the Research Competencies Scale

    ERIC Educational Resources Information Center

    Swank, Jacqueline M.; Lambie, Glenn W.

    2016-01-01

    The authors present the development of the Research Competencies Scale (RCS). The purpose of this article is threefold: (a) present a rationale for the RCS, (b) review statistical analysis procedures used in developing the RCS, and (c) offer implications for counselor education, the enhancement of scholar-researchers, and future research.

  18. Facts about Public Universities: Looking to the Future.

    ERIC Educational Resources Information Center

    National Association of State Universities and Land Grant Colleges, Washington, DC.

    This publication describes key characteristics and statistics of the nation's public universities as they look to the coming century, including their role, meeting student needs, maintaining access, obtaining government support, and serving the public through outreach and an expanded concept of public service. A section on meeting student needs…

  19. Implementation and Impact of the Check & Connect Mentoring Program

    ERIC Educational Resources Information Center

    Heppen, Jessica; O'Cummings, Mindee; Poland, Lindsay; Zeiser, Krissy; Mills, Nicholas

    2015-01-01

    High school graduation rates remain unacceptably low in the U.S., especially among disadvantaged youth (Chapman, Laird, Ifill, & KewalRamani, 2011; Stillwell, 2010), with troubling implications for future earnings and employment status (Bureau of Labor Statistics, 2012). Check & Connect (C&C) is an individualized program that pairs…

  20. The Digital Workforce: Update, August 2000 [and] The Digital Work Force: State Data & Rankings, September 2000.

    ERIC Educational Resources Information Center

    Sargent, John

    The Office of Technology Policy analyzed Bureau of Labor Statistics' growth projections for the core occupational classifications of IT (information technology) workers to assess future demand in the United States. Classifications studied were computer engineers, systems analysts, computer programmers, database administrators, computer support…

  1. Work, Disability, and the Future: Promoting Employment for People with Disabilities.

    ERIC Educational Resources Information Center

    Roessler, Richard T.

    1987-01-01

    Statistical data on unemployment emphasize problems experienced by people with disabilities in seeking work. Advocates changes in public policies, institutional practices, rehabilitation practices, and employer benefits to ensure people with disabilities a share in the prosperity anticipated in view of brighter economic prospects. (Author/KS)

  2. Multi-criterion model ensemble of CMIP5 surface air temperature over China

    NASA Astrophysics Data System (ADS)

    Yang, Tiantian; Tao, Yumeng; Li, Jingjing; Zhu, Qian; Su, Lu; He, Xiaojia; Zhang, Xiaoming

    2018-05-01

    Global circulation models (GCMs) are useful tools for simulating climate change, projecting future temperature changes, and thereby supporting the preparation of national climate adaptation plans. However, different GCMs are not always in agreement with each other over various regions, because their configurations, module characteristics, and dynamic forcings vary from one model to another. Model ensemble techniques are extensively used to post-process GCM outputs and improve their representation of variability. Root-mean-square error (RMSE), the correlation coefficient (CC, or R), and uncertainty are commonly used statistics for evaluating GCM performance. However, many model ensemble techniques cannot guarantee satisfactory values of all these statistics simultaneously. In this paper, we propose a multi-model ensemble framework that uses a state-of-the-art evolutionary multi-objective optimization algorithm (termed MOSPD) to evaluate different characteristics of ensemble candidates and to provide comprehensive trade-off information for different model ensemble solutions. A case study optimizing surface air temperature (SAT) ensemble solutions over different geographical regions of China is carried out. The data cover the period from 1900 to 2100, and the projections of SAT are analyzed with regard to three statistical indices (RMSE, CC, and uncertainty). Among the derived ensemble solutions, the trade-off information is further analyzed with a robust Pareto front with respect to the different statistics. Comparison over the historical period (1900-2005) shows that the optimized solutions are superior to those obtained with a simple model average, as well as to any single GCM output. The improvements in the statistics vary across the climatic regions of China. 
Future projection (2006-2100) with the proposed ensemble method identifies that the largest (smallest) temperature changes will happen in the South Central China (the Inner Mongolia), the North Eastern China (the South Central China), and the North Western China (the South Central China), under RCP 2.6, RCP 4.5, and RCP 8.5 scenarios, respectively.
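    The two evaluation criteria named in this abstract, RMSE and the correlation coefficient, are straightforward to compute for any candidate ensemble. The sketch below (plain NumPy on synthetic data; the weighting scheme and data are hypothetical, and this is not the authors' MOSPD optimizer) shows how one candidate set of ensemble weights would be scored on both objectives.

```python
import numpy as np

def evaluate_ensemble(weights, members, obs):
    """Score a candidate set of ensemble weights against observations.

    Returns the two criteria from the abstract: RMSE (lower is better)
    and the Pearson correlation coefficient (higher is better).
    """
    blended = np.average(members, axis=0, weights=weights)
    rmse = float(np.sqrt(np.mean((blended - obs) ** 2)))
    cc = float(np.corrcoef(blended, obs)[0, 1])
    return rmse, cc

# Toy data: three "GCM" series, each the observation plus independent noise.
rng = np.random.default_rng(0)
obs = np.sin(np.linspace(0, 6, 200))
members = np.stack([obs + 0.3 * rng.standard_normal(200) for _ in range(3)])

equal = evaluate_ensemble([1, 1, 1], members, obs)
```

Averaging the three members cancels part of their independent errors, so the equal-weight blend scores a lower RMSE than any single member; a multi-objective optimizer would search the weight space for the full trade-off front instead of a single blend.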

  3. Deep Space Ka-band Link Management and the MRO Demonstration: Long-term Weather Statistics Versus Forecasting

    NASA Technical Reports Server (NTRS)

    Davarian, Faramaz; Shambayati, Shervin; Slobin, Stephen

    2004-01-01

    During the last 40 years, deep space radio communication systems have experienced a move toward shorter wavelengths. In the 1960s a transition from L- to S-band occurred, which was followed by a transition from S- to X-band in the 1970s. Both transitions provided deep space links with wider bandwidths and improved radio metrics capability. Now, in the 2000s, a new change is taking place: a move to the Ka-band region of the radio frequency spectrum. Ka-band will soon replace X-band as the frequency of choice for deep space communications, providing ample spectrum for the high data rate requirements of future missions. The low-noise receivers of deep space networks have a great need for link management techniques that can mitigate weather effects. In this paper, three approaches for managing Ka-band Earth-space links are investigated. The first approach uses aggregate annual statistics, the second uses monthly statistics, and the third is based on short-term forecasting of the local weather. An example of weather forecasting for Ka-band link performance prediction is presented. Furthermore, spacecraft commanding schemes suitable for Ka-band link management are investigated. These schemes will be demonstrated using NASA's Mars Reconnaissance Orbiter (MRO) spacecraft in the 2007 to 2008 time period, and the demonstration findings will be reported in a future publication.

  4. Nonparametric predictive inference for combining diagnostic tests with parametric copula

    NASA Astrophysics Data System (ADS)

    Muhammad, Noryanti; Coolen, F. P. A.; Coolen-Maturi, T.

    2017-09-01

    Measuring the accuracy of diagnostic tests is crucial in many application areas, including medicine and health care. The Receiver Operating Characteristic (ROC) curve is a popular statistical tool for describing the performance of diagnostic tests, and the area under the ROC curve (AUC) is often used as a measure of the overall performance of a test. In this paper, we are interested in developing strategies for combining test results in order to increase diagnostic accuracy. We introduce nonparametric predictive inference (NPI) for combining two diagnostic test results while accounting for their dependence structure using a parametric copula. NPI is a frequentist statistical framework for inference on a future observation based on past data observations; it uses lower and upper probabilities to quantify uncertainty and is based on only a few modelling assumptions. A copula is a well-known statistical concept for modelling dependence of random variables: it is a joint distribution function whose marginals are all uniformly distributed, and it can be used to model the dependence separately from the marginal distributions. In this research, we estimate the copula density using a parametric method, the maximum likelihood estimator (MLE). We investigate the performance of the proposed method on data sets from the literature and discuss the results to show how our method performs for different families of copulas. Finally, we briefly outline related challenges and opportunities for future research.
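    The AUC that this abstract uses as its performance measure has a simple empirical form: the probability that a randomly chosen diseased score exceeds a randomly chosen healthy one (the Mann-Whitney statistic rescaled). The sketch below computes it for two hypothetical tests and for a naive per-subject sum of their scores; this illustrates only the standard AUC, not the paper's NPI lower/upper probabilities or copula estimation.

```python
import numpy as np

def auc(healthy, diseased):
    """Empirical AUC: probability that a diseased score exceeds a
    healthy one, with ties counted one half (Mann-Whitney form)."""
    h = np.asarray(healthy, float)[:, None]
    d = np.asarray(diseased, float)[None, :]
    wins = (d > h).sum() + 0.5 * (d == h).sum()
    return float(wins / (h.size * d.size))

# Hypothetical scores from two diagnostic tests on the same subjects;
# a naive combination is the per-subject sum of the two scores.
healthy_t1, diseased_t1 = [1.0, 2.0, 3.0], [2.5, 1.5, 4.0]
healthy_t2, diseased_t2 = [1.0, 2.0, 3.0], [1.5, 3.5, 2.5]
combined_h = [a + b for a, b in zip(healthy_t1, healthy_t2)]
combined_d = [a + b for a, b in zip(diseased_t1, diseased_t2)]

auc_t1 = auc(healthy_t1, diseased_t1)
auc_t2 = auc(healthy_t2, diseased_t2)
auc_combined = auc(combined_h, combined_d)
```

In this toy example each single test reaches AUC 6/9, while their combination reaches 6.5/9, the kind of gain from combining tests that the paper pursues with a principled dependence model.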

  5. Implementation of Statistical Process Control: Evaluating the Mechanical Performance of a Candidate Silicone Elastomer Docking Seal

    NASA Technical Reports Server (NTRS)

    Oravec, Heather Ann; Daniels, Christopher C.

    2014-01-01

    The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but is also to perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chance of this and future space seals to satisfy or exceed design specifications.
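    A common statistical-process-control tool for one-at-a-time measurements such as these compression loads is the individuals (X) control chart with 3-sigma limits estimated from the average moving range. The sketch below uses hypothetical force readings, not the project's data, and the generic textbook procedure rather than NASA's specific analysis; an out-of-limit point is the kind of signal that prompted the manufacturing investigation described above.

```python
def control_limits(samples):
    """Individuals (X) chart limits from the average moving range, using
    the standard d2 = 1.128 constant for subgroups of size two."""
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    center = sum(samples) / len(samples)
    half_width = 3.0 * mr_bar / 1.128
    return center - half_width, center, center + half_width

def out_of_control(samples):
    """Indices of measurements falling outside the 3-sigma limits."""
    lcl, _, ucl = control_limits(samples)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Hypothetical compression-force readings; the sixth article is unusual.
forces = [100.1, 99.8, 100.3, 100.0, 99.9, 107.0, 100.2, 100.1]
flagged = out_of_control(forces)
```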

  6. Financing physical therapy doctoral education: methods used by entry-level students and the financial impact after graduation.

    PubMed

    Thompson, Kris; Coon, Jill; Handford, Leandrea

    2011-01-01

    With the move to the doctor of physical therapy (DPT) degree and increasing tuition costs, there is concern about financing entry-level education. The purposes of this study were to identify how students finance their DPT education and to describe the financial impact after graduation. A written survey was used to collect data on financing DPT education, student debt, and the financial impact on graduates. There were 92 subjects who had graduated from one program. Frequencies, as well as nonparametric statistics using cross-tabulations and chi-squared tests, were calculated. The response rate was 55%. Of the respondents, 86% had student loans, 66% worked during school, 57% received some family assistance, and 21% had some scholarship support. The amount of monthly loan repayment was not statistically related to the ability to save for a house, the ability to obtain a loan for a house or car, or the decision to have children. Saving for the future (p = 0.016) and lifestyle choices (p = 0.035) were related to the amount of monthly loan repayment. Major sources of funding were student loans, employment income, and/or family assistance. Respondents' ability to save for the future and lifestyle choices were negatively impacted when loan debt increased. Physical therapist education programs should consider offering debt planning and counseling.
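    The chi-squared tests on cross-tabulations mentioned in this abstract have a compact form for a 2x2 table. The sketch below uses entirely hypothetical counts (the study's raw tabulations are not given here) to show how the Pearson statistic is computed from observed and expected cell counts.

```python
def chi_squared_2x2(table):
    """Pearson chi-squared statistic for a 2x2 cross-tabulation."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = [a + b, c + d]
    cols = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = rows[i] * cols[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical cross-tabulation: low vs. high monthly loan repayment
# against whether the graduate reports being able to save.
table = [[30, 20], [10, 32]]
stat = chi_squared_2x2(table)
```

A statistic above the 3.84 critical value (one degree of freedom, alpha = 0.05) would indicate a significant association between the two variables.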

  7. On the insufficiency of arbitrarily precise covariance matrices: non-Gaussian weak-lensing likelihoods

    NASA Astrophysics Data System (ADS)

    Sellentin, Elena; Heavens, Alan F.

    2018-01-01

    We investigate whether a Gaussian likelihood, as routinely assumed in the analysis of cosmological data, is supported by simulated survey data. We define test statistics, based on a novel method that first destroys Gaussian correlations in a data set, and then measures the non-Gaussian correlations that remain. This procedure flags pairs of data points that depend on each other in a non-Gaussian fashion, and thereby identifies where the assumption of a Gaussian likelihood breaks down. Using this diagnosis, we find that non-Gaussian correlations in the CFHTLenS cosmic shear correlation functions are significant. With a simple exclusion of the most contaminated data points, the posterior for σ8 is shifted without broadening, but we find no significant reduction in the tension with σ8 derived from Planck cosmic microwave background data. However, we also show that the one-point distributions of the correlation statistics are noticeably skewed, such that sound weak-lensing data sets are intrinsically likely to lead to a systematically low lensing amplitude being inferred. The detected non-Gaussianities get larger with increasing angular scale such that for future wide-angle surveys such as Euclid or LSST, with their very small statistical errors, the large-scale modes are expected to be increasingly affected. The shifts in posteriors may then not be negligible and we recommend that these diagnostic tests be run as part of future analyses.

  8. Bayesian inference for the spatio-temporal invasion of alien species.

    PubMed

    Cook, Alex; Marion, Glenn; Butler, Adam; Gibson, Gavin

    2007-08-01

    In this paper we develop a Bayesian approach to parameter estimation in a stochastic spatio-temporal model of the spread of invasive species across a landscape. To date, statistical techniques, such as logistic and autologistic regression, have outstripped stochastic spatio-temporal models in their ability to handle large numbers of covariates. Here we seek to address this problem by making use of a range of covariates describing the bio-geographical features of the landscape. Relative to regression techniques, stochastic spatio-temporal models are more transparent in their representation of biological processes. They also explicitly model temporal change, and therefore do not require the assumption that the species' distribution (or other spatial pattern) has already reached equilibrium as is often the case with standard statistical approaches. In order to illustrate the use of such techniques we apply them to the analysis of data detailing the spread of an invasive plant, Heracleum mantegazzianum, across Britain in the 20th Century using geo-referenced covariate information describing local temperature, elevation and habitat type. The use of Markov chain Monte Carlo sampling within a Bayesian framework facilitates statistical assessments of differences in the suitability of different habitat classes for H. mantegazzianum, and enables predictions of future spread to account for parametric uncertainty and system variability. Our results show that ignoring such covariate information may lead to biased estimates of key processes and implausible predictions of future distributions.
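    The Markov chain Monte Carlo sampling within a Bayesian framework mentioned in this abstract usually rests on some variant of the Metropolis algorithm. The sketch below is a generic one-parameter random-walk Metropolis sampler on a toy Gaussian posterior, not the paper's spatio-temporal invasion model; the target distribution and tuning values are illustrative assumptions.

```python
import math
import random

def metropolis(log_post, x0, steps=5000, scale=0.5, seed=1):
    """Random-walk Metropolis sampler: the basic Markov chain Monte Carlo
    engine behind Bayesian fits of this kind (here for one parameter)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        proposal = x + rng.gauss(0.0, scale)
        lp_prop = log_post(proposal)
        # Accept with probability min(1, posterior ratio).
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = proposal, lp_prop
        samples.append(x)
    return samples

# Toy target: log-posterior of a single parameter ~ N(2, 0.5^2).
log_post = lambda x: -0.5 * ((x - 2.0) / 0.5) ** 2
draws = metropolis(log_post, x0=0.0)
kept = draws[1000:]  # discard burn-in
posterior_mean = sum(kept) / len(kept)
```

The retained draws approximate the posterior, so their spread directly expresses the parametric uncertainty that the paper propagates into its predictions of future spread.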

  9. Are weather models better than gridded observations for precipitation in the mountains? (Invited)

    NASA Astrophysics Data System (ADS)

    Gutmann, E. D.; Rasmussen, R.; Liu, C.; Ikeda, K.; Clark, M. P.; Brekke, L. D.; Arnold, J.; Raff, D. A.

    2013-12-01

    Mountain snowpack is a critical storage component in the water cycle, and it provides drinking water for tens of millions of people in the Western US alone. This water store is susceptible to climate change both because warming temperatures are likely to lead to earlier melt and a temporal shift of the hydrograph, and because changing atmospheric conditions are likely to change the precipitation patterns that produce the snowpack. Current measurements of snowfall in complex terrain are limited in number due in part to the logistics of installing equipment in such terrain. We show that this limitation leads to statistical artifacts in gridded observations of current climate, including errors in precipitation season totals of a factor of two or more, increases in wet day fraction, and decreases in storm intensity. In contrast, a high-resolution numerical weather model (WRF) is able to reproduce observed precipitation patterns, leading to confidence in its predictions for areas without measurements, a confidence that new observations support. Running WRF for a future climate scenario shows substantial changes in the spatial patterns of precipitation in the mountains related to the physics of hydrometeor production and detrainment that are not captured by statistical downscaling products. The stationarity assumption in statistical downscaling products is likely to lead to important errors in our estimation of future precipitation in complex terrain.

  10. A Comparison of Three Different Methods of Fixation in the Management of Thoracolumbar Fractures.

    PubMed

    Panteliadis, Pavlos; Musbahi, Omar; Muthian, Senthil; Goyal, Shivam; Montgomery, Alexander Sheriff; Ranganathan, Arun

    2017-01-01

    Management of thoracolumbar fractures remains controversial in the literature. The primary aims of this study were to assess different levels of fixation with respect to radiological outcomes in terms of fracture reduction and future loss of correction. This is a single-center, retrospective study. Fifty-five patients presenting with thoracolumbar fractures between January 2012 and December 2015 were analyzed. The levels of fixation were divided into 3 groups: 1 vertebra above and 1 below the fracture (1/1), 2 above and 2 below (2/2), and 2 above and 1 below (2/1). The most common mechanism of injury was a fall from height, and the most commonly affected vertebra was L1. Burst fractures had the highest incidence. The 2/2 fixation achieved the best reduction of the fracture, but without statistical significance. The correction was maintained better by the 2/2 fixation, but with no statistical difference compared to the other fixations. Insertion of screws at the fracture level did not improve outcomes. The data of this study identified a trend towards better radiological outcomes for fracture reduction and maintenance of the correction in the 2/2 fixations. However, these results are not statistically significant. Future multicenter prospective clinical trials are needed in order to agree on the ideal management and method of fixation for thoracolumbar fractures.

  11. Event coincidence analysis for quantifying statistical interrelationships between event time series. On the role of flood events as triggers of epidemic outbreaks

    NASA Astrophysics Data System (ADS)

    Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.

    2016-05-01

    Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis allows one to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
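    The central quantity of event coincidence analysis is a coincidence rate: the fraction of events in one series that are preceded, within a tolerance window, by an event in the other. The sketch below is a minimal precursor-rate computation on hypothetical event times; the significance testing against Poisson null models described in the abstract is omitted.

```python
def coincidence_rate(a_times, b_times, delta=2, lag=0):
    """Precursor coincidence rate: the fraction of events in series B that
    are preceded, within lag to lag + delta time steps, by an event in A."""
    hits = sum(
        1 for tb in b_times
        if any(0 <= tb - lag - ta <= delta for ta in a_times)
    )
    return hits / len(b_times)

# Hypothetical event times (e.g. months with floods / outbreak onsets).
floods = [3, 10, 24, 31]
outbreaks = [5, 12, 40]
rate = coincidence_rate(floods, outbreaks, delta=2)
```

Here two of the three hypothetical outbreaks follow a flood within the window, giving a rate of 2/3; in practice this value would be compared against its distribution under a null model of independent event series.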

  12. Statistical mechanics of economics I

    NASA Astrophysics Data System (ADS)

    Kusmartsev, F. V.

    2011-02-01

    We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical inference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the US economy between 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters differ from year to year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.
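    The Bose-Einstein form that this abstract fits to income data is the familiar occupation-number function. The sketch below evaluates it for two toy parameter settings (the values are illustrative assumptions, not the paper's fitted parameters) to show how a lower "temperature" concentrates the distribution at low incomes.

```python
import math

def bose_einstein(e, mu, T):
    """Occupation-number form referenced in the abstract:
    n(e) = 1 / (exp((e - mu) / T) - 1), defined for e > mu."""
    return 1.0 / (math.exp((e - mu) / T) - 1.0)

# Toy comparison: a lower "temperature" T concentrates the distribution
# at low incomes, i.e. a heavier share of the population at the bottom.
incomes = [float(i) for i in range(1, 51)]
hot = [bose_einstein(e, mu=0.0, T=20.0) for e in incomes]
cold = [bose_einstein(e, mu=0.0, T=5.0) for e in incomes]
```

Both curves decrease monotonically with income, but the low-T curve decays far faster in the tail, which is the qualitative signature used when fitting year-by-year income distributions.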

  13. Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle

    NASA Technical Reports Server (NTRS)

    Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu

    2013-01-01

    This report describes statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions for the experimental data generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties that are characterized are the ultimate strength, modulus, and Poisson's ratio, by using a commercially available statistical package. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.

  14. Performance of cancer cluster Q-statistics for case-control residential histories

    PubMed Central

    Sloan, Chantel D.; Jacquez, Geoffrey M.; Gallagher, Carolyn M.; Ward, Mary H.; Raaschou-Nielsen, Ole; Nordsborg, Rikke Baastrup; Meliker, Jaymie R.

    2012-01-01

    Few investigations of health event clustering have evaluated residential mobility, though causative exposures for chronic diseases such as cancer often occur long before diagnosis. Recently developed Q-statistics incorporate human mobility into disease cluster investigations by quantifying space- and time-dependent nearest neighbor relationships. Using residential histories from two cancer case-control studies, we created simulated clusters to examine Q-statistic performance. Results suggest the intersection of cases with significant clustering over their life course, Qi, with cases who are constituents of significant local clusters at given times, Qit, yielded the best performance, which improved with increasing cluster size. Upon comparison, a larger proportion of true positives were detected with Kulldorff's spatial scan method if the time of clustering was provided. We recommend using Q-statistics to identify when and where clustering may have occurred, followed by the scan method to localize the candidate clusters. Future work should investigate the generalizability of these findings. PMID:23149326

  15. Assessment of Current Jet Noise Prediction Capabilities

    NASA Technical Reports Server (NTRS)

    Hunter, Craig A.; Bridges, James E.; Khavaran, Abbas

    2008-01-01

    An assessment was made of the capability of jet noise prediction codes over a broad range of jet flows, with the objective of quantifying current capabilities and identifying areas requiring future research investment. Three separate codes in NASA's possession, representative of two classes of jet noise prediction codes, were evaluated: one empirical and two statistical. The empirical code is the Stone Jet Noise Module (ST2JET) contained within the ANOPP aircraft noise prediction code. It is well documented and represents the state of the art in semi-empirical acoustic prediction codes, in which virtual sources are attributed to various aspects of noise generation in each jet; these sources, in combination, predict the spectral directivity of a jet plume. A total of 258 jet noise cases were examined with the ST2JET code, each run requiring only fractions of a second to complete. Two statistical jet noise prediction codes were also evaluated, JeNo v1 and Jet3D. Fewer cases were run for the statistical prediction methods because they require substantially more resources: typically a Reynolds-averaged Navier-Stokes solution of the jet, volume integration of the source statistical models over the entire plume, and a numerical solution of the governing propagation equation within the jet. In the evaluation process, the experimental datasets used in the evaluations were substantially justified. In the end, none of the current codes can predict jet noise within experimental uncertainty. The empirical code came within 2 dB on a 1/3-octave spectral basis for a wide range of flows. The statistical code Jet3D was within experimental uncertainty at broadside angles for hot supersonic jets, but errors in peak frequency and amplitude put it outside experimental uncertainty at cooler, lower speed conditions. Jet3D did not predict changes in directivity at the downstream angles. 
The statistical code JeNo,v1 was within experimental uncertainty predicting noise from cold subsonic jets at all angles, but did not predict changes with heating of the jet and did not account for directivity changes at supersonic conditions. Shortcomings addressed here give direction for future work relevant to the statistical-based prediction methods. A full report will be released as a chapter in a NASA publication assessing the state of the art in aircraft noise prediction.

  16. Low-flow characteristics of streams under natural and diversion conditions, Waipiʻo Valley, Island of Hawaiʻi, Hawaiʻi

    USGS Publications Warehouse

    Fontaine, Richard A.

    2012-01-01

    Over the past 100 years, natural streamflow in Waipiʻo Valley has been reduced by the transfer of water out of the valley by Upper and Lower Hāmākua Ditches. The physical condition and diversion practices along the two ditch systems have varied widely over the years, and as a result, so have their effects on natural streamflow in Waipiʻo Valley. Recent renovation and improvements to the Lower Hāmākua Ditch system, along with proposals for its future operation and water-diversion strategies, have unknown implications. The purpose of this report is to quantify the availability of streamflow and to determine the effects of current and proposed diversion strategies on the low-flow hydrology of Waipiʻo Valley. In this report, the low-flow hydrology of Waipiʻo Valley is described in terms of flow-duration statistics. Flow-duration statistics were computed for three locations in the Waipiʻo Valley study area where long-term surface-water gaging stations have been operated. Using a variety of streamflow record-extension techniques, flow-duration statistics were estimated at an additional 13 locations where only a few historical data are available or where discharge measurements were made as part of this study. Flow-duration statistics were computed to reflect natural conditions, current (2000-2005) diversion conditions, and proposed future diversion conditions at the 16 locations. At the downstream limit of the study area, on Wailoa Stream at an altitude of 190 feet, a baseline for evaluating the availability of streamflow is provided by computed flow-duration statistics that are representative of natural, no-diversion conditions. At the Wailoa gaging station, 95- and 50-percentile discharges under natural conditions were determined to be 86 and 112 cubic feet per second, respectively. Under 1965-1969 diversion conditions, natural 95- and 50-percentile discharges were reduced by 52 and 53 percent, to 41 and 53 cubic feet per second, respectively. 
Under current (2000-2005) diversion conditions, natural 95- and 50-percentile discharges were reduced by 21 and 24 percent, to 68 and 85 cubic feet per second, respectively. Under proposed future diversion conditions, natural 95- and 50-percentile discharges would be reduced by 33 and 24 percent, to 58 and 85 cubic feet per second, respectively. Compared to discharges that reflect current (2000-2005) diversion conditions, proposed future diversion conditions would reduce 95-percentile discharges, which are representative of moderate drought levels in the stream, by 15 percent. No change would be expected in 50-percentile discharges, which are representative of normal conditions. The effects of current (2000-2005) and proposed future diversion conditions on the natural flow of streams in the Waipiʻo Valley study area differ, depending on the location. Under current (2000-2005) diversion conditions, reductions in natural 95- or 50-percentile discharges of greater than 30 percent were found in Kawainui Stream downstream from Upper Hāmākua Ditch to an altitude of about 1,435 feet and in the reach of Waimā Stream between Upper and Lower Hāmākua Ditches. Under proposed future diversion conditions, reductions in natural 95- or 50-percentile discharges of greater than 30 percent were found in Kawainui Stream downstream from Upper Hāmākua Ditch to an altitude of about 1,435 feet, in the reach of Waimā Stream between Upper and Lower Hāmākua Ditches, and along most stream reaches downstream from Lower Hāmākua Ditch, except for Waimā Stream.
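    The flow-duration statistics used throughout this record can be illustrated with a short sketch. This is a generic exceedance-probability calculation on a synthetic daily record, not the USGS record-extension methodology; all numbers are invented:

```python
import numpy as np

def flow_duration(discharge, percentiles=(95, 50)):
    """Exceedance-probability discharges: the P-percentile flow is the
    discharge equaled or exceeded P percent of the time (so Q95 <= Q50)."""
    q = np.sort(np.asarray(discharge, dtype=float))[::-1]  # descending flows
    n = len(q)
    # Weibull plotting position: exceedance probability of the i-th largest flow
    exceed = np.arange(1, n + 1) / (n + 1)
    return {p: float(np.interp(p / 100.0, exceed, q)) for p in percentiles}

# Synthetic 10-year daily record in cubic feet per second, illustrative only
rng = np.random.default_rng(0)
flows = rng.lognormal(mean=4.6, sigma=0.3, size=3650)
stats = flow_duration(flows)
# stats[95] is the moderate-drought flow, stats[50] the median flow
```

    The 95-percentile discharge is necessarily lower than the 50-percentile discharge, matching the drought-versus-normal interpretation used in the report.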

  17. Assessing the Impact of Climate Change on Stream Temperatures in the Methow River Basin, Washington

    NASA Astrophysics Data System (ADS)

    Gangopadhyay, S.; Caldwell, R. J.; Lai, Y.; Bountry, J.

    2011-12-01

    The Methow River in Washington offers prime spawning habitat for salmon and other cold-water fishes. During the summer months, low streamflows on the Methow result in cutoff side channels that limit the habitat available to these fishes. Future climate scenarios of increasing air temperature and decreasing precipitation suggest the potential for increasing loss of habitat and fish mortality as stream temperatures rise in response to lower flows and additional heating. To assess the impacts of climate change on stream temperature in the Methow River, the US Bureau of Reclamation is developing an hourly time-step, two-dimensional hydraulic model of the confluence of the Methow and Chewuch Rivers above Winthrop. The model will be coupled with a physical stream temperature model to generate spatial representations of stream conditions conducive to fish habitat. In this study, we develop a statistical framework for generating stream temperature time series from global climate model (GCM) and hydrologic model outputs. Regional observations of stream temperature and hydrometeorological conditions are used to develop statistical models of daily mean stream temperature for the Methow River at Winthrop, WA. Temperature and precipitation projections from 10 GCMs are coupled with streamflow generated using the University of Washington Variable Infiltration Capacity (VIC) model. The projections serve as input to the statistical models to generate time series of mean daily stream temperature. Since the output from the GCM, VIC, and statistical models offers only daily data, a k-nearest neighbor (k-nn) resampling technique is employed to select appropriate proportion vectors for disaggregating the Winthrop daily flow and temperature to an upstream location on each of the rivers above the confluence. Hourly proportion vectors are then used to disaggregate the daily flow and temperature to hourly values for use in the hydraulic model. 
Historical meteorological variables are also selected using the k-nn method. We present the statistical modeling framework, based on Generalized Linear Models (GLMs), along with diagnostics and measures of skill. We also compare stream temperature projections for the future years 2020, 2040, and 2080 and discuss the potential implications for fish habitat in the Methow River. Future integration of the hourly climate scenarios in the hydraulic model will make it possible to assess the spatial extent of habitat impacts and allow the USBR to evaluate the effectiveness of various river restoration projects in maintaining or improving habitat in a changing climate.
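    The k-nn disaggregation step described above can be sketched as follows. This is a minimal, generic version (nearest neighbors by daily mean, Lall-Sharma kernel weights, proportion-vector resampling); the archive and all names are illustrative, not Reclamation's implementation:

```python
import numpy as np

def knn_disaggregate(daily_value, hist_daily, hist_hourly, k=5, rng=None):
    """Disaggregate one daily value into 24 hourly values by resampling a
    proportion vector from the k most similar historical days."""
    if rng is None:
        rng = np.random.default_rng()
    dist = np.abs(hist_daily - daily_value)
    nn = np.argsort(dist)[:k]            # indices of the k nearest days
    w = 1.0 / np.arange(1, k + 1)
    w /= w.sum()                         # kernel weights favoring closer days
    pick = rng.choice(nn, p=w)
    prop = hist_hourly[pick] / hist_hourly[pick].sum()  # hourly proportions
    return daily_value * 24 * prop       # hourly series preserving daily mean

# Illustrative archive: 100 historical days of hourly stream temperatures (deg C)
rng = np.random.default_rng(1)
hist_hourly = 10 + 5 * np.abs(np.sin(np.linspace(0, np.pi, 24))) \
    + rng.normal(0, 0.5, (100, 24))
hist_daily = hist_hourly.mean(axis=1)
```

    By construction, the resampled hourly series averages back to the daily value being disaggregated, which is the property the hydraulic model input requires.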

  18. Current and Future Constraints on Higgs Couplings in the Nonlinear Effective Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Blas, Jorge; Eberhardt, Otto; Krause, Claudius

    We perform a Bayesian statistical analysis of the constraints on the nonlinear effective theory given by the Higgs electroweak chiral Lagrangian. We obtain bounds on the effective coefficients entering Higgs observables at leading order, using all available Higgs-boson signal strengths from LHC runs 1 and 2. After studying the prior dependence of the solutions, we discuss the results in the context of natural-sized Wilson coefficients. We further study the expected sensitivities to the different Wilson coefficients at various possible future colliders. Finally, we interpret our results in terms of some minimal composite Higgs models.

  19. The promise of air cargo: System aspects and vehicle design

    NASA Technical Reports Server (NTRS)

    Whitehead, A. H., Jr.

    1976-01-01

    The current operation of the air cargo system is reviewed. An assessment of the future of air cargo is provided by: (1) analyzing statistics and trends, (2) noting system problems and inefficiencies, (3) analyzing characteristics of 'air eligible' commodities, and (4) showing the promise of new technology for future cargo aircraft with significant improvements in costs and efficiency. The following topics are discussed: (1) air cargo demand forecasts; (2) economics of air cargo transport; (3) the integrated air cargo system; (4) evolution of airfreighter design; and (5) the span-distributed-load concept.

  20. Life course approach in social epidemiology: an overview, application and future implications.

    PubMed

    Cable, Noriko

    2014-01-01

    The application of the life course approach to social epidemiology has helped epidemiologists theoretically examine social gradients in population health. Longitudinal data with rich contextual information collected repeatedly and advanced statistical approaches have made this challenging task easier. This review paper provides an overview of the life course approach in epidemiology, its research application, and future challenges. In summary, a systematic approach to methods, including theoretically guided measurement of socioeconomic position, would assist researchers in gathering evidence for reducing social gradients in health, and collaboration across individual disciplines will make this task achievable.

  1. State of mental healthcare systems in Eastern Europe: do we really understand what is going on?

    PubMed Central

    Winkler, Petr

    2016-01-01

    The article examines the current state of mental healthcare systems in countries of Eastern Europe and derives implications for future research and service development. Analysis of available statistics from the World Health Organization’s Mental Health Atlas suggests the need for better-quality data collection. Nonetheless, there appear to be insufficient resources allocated to mental health, lack of involvement of service users in policy-making and, to a large extent, systems continue to rely on mental hospitals. Based on the data presented, a set of directions for future reforms was drafted. PMID:29093919

  2. Teenage pregnancy and mental health beyond the postpartum period: a systematic review.

    PubMed

    Xavier, Chloé; Benoit, Anita; Brown, Hilary K

    2018-06-01

    Teenage mothers are at increased risk for adverse social outcomes and short-term health problems, but long-term impacts on mental health are poorly understood. The aims of our systematic review were to determine the association between teenage pregnancy and mental health beyond the postpartum period, critically appraise the literature's quality and guide future research. We systematically searched MEDLINE, Embase, PsycINFO, CINAHL, Scopus and Web of Science from inception to June 2017 for peer-reviewed articles written in English or French. Data were collected using a modified Cochrane Data Extraction Form. Study quality was assessed using the Effective Public Health Practice Project critical appraisal tool. Heterogeneity of studies permitted only a qualitative synthesis. Nine quantitative studies comprising the results from analyses of 11 cohorts met our criteria and were rated as strong (n=5), moderate (n=2) or weak (n=2). Three cohorts found a statistically significant association between teenage pregnancy and poor long-term mental health after adjustment, three found a statistically significant association before but not after adjustment and five did not find a statistically significant association. Studies observed varying degrees of attenuation after considering social context. Studies with statistically significant findings tended to comprise earlier cohorts, with outcomes measured at older ages. The association between teenage pregnancy and mental health beyond the postpartum period remains unclear. Future studies should employ age-period-cohort frameworks to disentangle effects of normative patterns and stress accumulation. Social factors are important in determining long-term mental health of teenage mothers and should be prioritised in prevention and intervention strategies.

  3. Possible future changes in South East Australian frost frequency: an inter-comparison of statistical downscaling approaches

    NASA Astrophysics Data System (ADS)

    Crimp, Steven; Jin, Huidong; Kokic, Philip; Bakar, Shuvo; Nicholls, Neville

    2018-04-01

    Anthropogenic climate change has already been shown to affect the frequency, intensity, spatial extent, duration and seasonality of extreme climate events. Understanding these changes is an important step in determining exposure, vulnerability and focus for adaptation. In an attempt to support adaptation decision-making, we have examined statistical modelling techniques to improve the representation of global climate model (GCM) derived projections of minimum temperature extremes (frosts) in Australia. We examine the spatial changes in minimum temperature extreme metrics (e.g. monthly and seasonal frost frequency), for a region exhibiting the strongest station trends in Australia, and compare these changes with minimum temperature extreme metrics derived from 10 GCMs from the Coupled Model Inter-comparison Project Phase 5 (CMIP5) datasets and via statistical downscaling. We compare the observed trends with those derived from the "raw" GCM minimum temperature data, and examine whether quantile matching (QM) or spatio-temporal modelling with quantile matching (spTimerQM) can be used to improve the correlation between observed and simulated extreme minimum temperatures. We demonstrate that the spTimerQM modelling approach achieves a correlation with observed daily minimum temperatures of 0.22 for the period August to November, an almost fourfold improvement over either the "raw" GCM or QM results. The spTimerQM modelling approach also improves correlations with observed monthly frost frequency statistics to 0.84, as opposed to 0.37 and 0.81 for the "raw" GCM and QM results respectively. We apply the spatio-temporal model to examine future extreme minimum temperature projections for the period 2016 to 2048. The spTimerQM modelling results suggest the persistence of current levels of frost risk out to 2030, with evidence of continuing decadal variation.
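    The quantile matching (QM) step can be sketched as a simple empirical quantile mapping between historical GCM output and station observations; the paper's spTimerQM adds a spatio-temporal model on top of this. All data below are synthetic:

```python
import numpy as np

def quantile_match(gcm_values, gcm_hist, obs_hist):
    """Empirical quantile matching: replace each raw GCM value with the
    observed value at the same quantile of the historical GCM distribution."""
    gcm_hist = np.sort(np.asarray(gcm_hist, float))
    obs_hist = np.sort(np.asarray(obs_hist, float))
    n, m = len(gcm_hist), len(obs_hist)
    # Quantile of each value within the historical GCM distribution
    q = np.interp(gcm_values, gcm_hist, (np.arange(1, n + 1) - 0.5) / n)
    # Map those quantiles onto the observed distribution
    return np.interp(q, (np.arange(1, m + 1) - 0.5) / m, obs_hist)

# Illustrative case: the GCM runs 2 degrees too warm with too little variance
rng = np.random.default_rng(0)
obs = rng.normal(2.0, 3.0, 5000)   # observed daily minimum temperature (deg C)
gcm = rng.normal(4.0, 2.0, 5000)   # biased GCM output for the same period
corrected = quantile_match(gcm, gcm, obs)
```

    After matching, the corrected series reproduces the observed mean and spread, which is exactly what QM guarantees for the calibration period (but not, by itself, the spatial or temporal structure that spTimerQM targets).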

  4. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    khajehei, sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from the hydrologic model as well as from the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate a recently developed multivariate post-processing method for historical simulations, and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is addressed for the historical period 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistical Downscaling (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16-degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution of observations and simulations in the historical period. Results show that applying the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.
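    The copula-based transfer function can be illustrated with a simplified Gaussian-copula sketch: fit the dependence between observed and simulated flows on normal scores, then map a simulation to its conditional expectation in observation space. The paper's Bayesian framework is richer than this, and all data here are synthetic:

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_postprocess(sim_new, sim_hist, obs_hist):
    """Bias-correct simulations via a bivariate Gaussian copula fitted to
    (obs, sim) pairs from a calibration period (a simplified stand-in for
    the Bayesian copula framework described above)."""
    n = len(sim_hist)
    # Empirical CDF transforms to uniforms, then to standard normal scores
    z_obs = norm.ppf(rankdata(obs_hist) / (n + 1))
    z_sim = norm.ppf(rankdata(sim_hist) / (n + 1))
    rho = np.corrcoef(z_obs, z_sim)[0, 1]          # copula dependence
    # New simulation -> uniform via historical sim distribution -> normal score
    u_new = np.interp(sim_new, np.sort(sim_hist), np.arange(1, n + 1) / (n + 1))
    z_new = norm.ppf(np.clip(u_new, 1e-6, 1 - 1e-6))
    # Conditional mean in normal-score space, mapped back to observed units
    return np.quantile(obs_hist, norm.cdf(rho * z_new))

# Illustrative calibration period: the simulation is correlated but biased high
rng = np.random.default_rng(3)
obs_cal = rng.normal(0.0, 1.0, 2000)
sim_cal = 0.8 * obs_cal + rng.normal(0.0, 0.5, 2000) + 3.0
post = gaussian_copula_postprocess(sim_cal, sim_cal, obs_cal)
```

    The corrected values lose the +3 bias because the transfer function maps simulations onto the observed marginal through the fitted joint distribution.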

  5. An optimal scheme for top quark mass measurement near the tt̄ threshold at future e⁺e⁻ colliders

    NASA Astrophysics Data System (ADS)

    Chen, Wei-Guo; Wan, Xia; Wang, You-Kai

    2018-05-01

    A top quark mass measurement scheme near the tt̄ production threshold at future e⁺e⁻ colliders, e.g. the Circular Electron Positron Collider (CEPC), is simulated. A χ² fitting method is adopted to determine the number of energy points to be taken and their locations. Our results show that the optimal energy point is located near the largest slope of the cross section vs. beam energy plot, and the most efficient scheme is to concentrate all luminosity on this single energy point in the case of one-parameter top mass fitting. This suggests that the so-called data-driven method could be the best choice for future real experimental measurements. Conveniently, the top mass statistical uncertainty can also be calculated directly from the error matrix, even without any sampling and fitting; the agreement of these two optimization methods has been checked. Our conclusion is that by taking 50 fb⁻¹ of total effective integrated luminosity data, the statistical uncertainty of the top potential-subtracted mass can be suppressed to about 7 MeV, and the total uncertainty is about 30 MeV. This precision will help to determine the stability of the electroweak vacuum at the Planck scale. Supported by the National Science Foundation of China (11405102) and the Fundamental Research Funds for the Central Universities of China (GK201603027, GK201803019).
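    The error-matrix shortcut mentioned above (statistical uncertainty without any sampling and fitting) can be sketched for a toy cross-section model. The logistic turn-on and every number below are illustrative, not the QCD threshold prediction:

```python
import numpy as np

def toy_xsec(m_t, e_beam):
    """Toy tt-bar threshold cross section in pb: a smooth turn-on in
    sqrt(s) - 2*m_t. Purely illustrative shape."""
    return 0.6 / (1.0 + np.exp(-(2 * e_beam - 2 * m_t) / 0.4))

def stat_uncertainty(m_t, e_beam, lumi):
    """Statistical uncertainty on m_t from a single energy point via error
    propagation (the 1-parameter error matrix): dm = sqrt(N) / |dN/dm|."""
    n_exp = lumi * toy_xsec(m_t, e_beam)           # expected event count
    h = 1e-4                                       # numerical derivative step
    dn_dm = lumi * (toy_xsec(m_t + h, e_beam)
                    - toy_xsec(m_t - h, e_beam)) / (2 * h)
    return np.sqrt(n_exp) / abs(dn_dm)

m_t = 172.5  # GeV, illustrative nominal mass
# Scan beam energies around threshold: the optimum sits near the steepest
# part of the cross-section curve, as the abstract states.
energies = np.linspace(m_t - 1.0, m_t + 1.0, 81)
errs = [stat_uncertainty(m_t, e, lumi=50e3) for e in energies]  # 50 fb^-1 in pb^-1
best_e = energies[int(np.argmin(errs))]
```

    With Poisson statistics the optimum lands slightly below the curve's midpoint (low-count points carry less noise), which is consistent with "near the largest slope" rather than exactly at it.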

  6. Learning Temporal Statistics for Sensory Predictions in Aging.

    PubMed

    Luft, Caroline Di Bernardi; Baker, Rosalind; Goldstone, Aimee; Zhang, Yang; Kourtzi, Zoe

    2016-03-01

    Predicting future events based on previous knowledge about the environment is critical for successful everyday interactions. Here, we ask which brain regions support our ability to predict the future based on implicit knowledge about the past in young and older age. Combining behavioral and fMRI measurements, we test whether training on structured temporal sequences improves the ability to predict upcoming sensory events; we then compare brain regions involved in learning predictive structures between young and older adults. Our behavioral results demonstrate that exposure to temporal sequences without feedback facilitates the ability of young and older adults to predict the orientation of an upcoming stimulus. Our fMRI results provide evidence for the involvement of corticostriatal regions in learning predictive structures in both young and older learners. In particular, we showed learning-dependent fMRI responses for structured sequences in frontoparietal regions and the striatum (putamen) for young adults. However, for older adults, learning-dependent activations were observed mainly in subcortical (putamen, thalamus) regions but were weaker in frontoparietal regions. Significant correlations of learning-dependent behavioral and fMRI changes in these regions suggest a strong link between brain activations and behavioral improvement rather than general overactivation. Thus, our findings suggest that predicting future events based on knowledge of temporal statistics engages brain regions involved in implicit learning in both young and older adults.

  7. Quantification of rare earth elements using laser-induced breakdown spectroscopy

    DOE PAGES

    Martin, Madhavi; Martin, Rodger C.; Allman, Steve; ...

    2015-10-21

    In this paper, the optical emission as a function of concentration of laser-ablated yttrium (Y) and six rare earth elements, europium (Eu), gadolinium (Gd), lanthanum (La), praseodymium (Pr), neodymium (Nd), and samarium (Sm), was evaluated using the laser-induced breakdown spectroscopy (LIBS) technique. A statistical methodology using multivariate analysis was used to obtain the sampling errors, coefficient of regression, calibration, and cross-validation of measurements as they relate to the LIBS analysis of graphite-matrix pellets doped with these elements at several concentrations. Each element (in oxide form) was mixed into the graphite matrix in percentages ranging from 1% to 50% by weight, and LIBS spectra were obtained for each composition as well as for pure oxide samples. A single pellet was also prepared with all the elements in equal oxide masses to determine whether the elemental peaks can be identified in a mixed pellet. This dataset is relevant for future studies of fission product content and distribution in irradiated nuclear fuels. These results demonstrate that the LIBS technique is inherently well suited for the future challenge of in situ analysis of nuclear materials. Finally, these studies also show that LIBS spectral analysis using statistical methodology can provide quantitative results, and they suggest an approach to the far more challenging future task of multielement analysis of ~20 primary elements in high-burnup nuclear reactor fuel.
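    The calibration and cross-validation element of such an analysis can be sketched generically. The pellet data below are synthetic and the least-squares model is a minimal stand-in for the paper's multivariate analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic calibration set: 8 pellets, 3 emission-line intensities each,
# concentrations 1-50 wt% (an invented stand-in for the LIBS spectra)
conc = np.array([1, 2, 5, 10, 20, 30, 40, 50], dtype=float)
X = np.outer(conc, [0.9, 0.5, 0.2]) + rng.normal(0, 0.8, (8, 3))

def loo_cv_rmse(X, y):
    """Leave-one-out cross-validation RMSE of a least-squares calibration:
    refit the model with each pellet held out and predict its concentration."""
    preds = []
    for i in range(len(y)):
        keep = np.arange(len(y)) != i
        A = np.c_[X[keep], np.ones(int(keep.sum()))]   # intensities + intercept
        coef, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
        preds.append(np.r_[X[i], 1.0] @ coef)
    return float(np.sqrt(np.mean((np.array(preds) - y) ** 2)))

rmse = loo_cv_rmse(X, conc)
```

    A small cross-validated RMSE relative to the 1-50 wt% range indicates the calibration generalizes beyond the pellets used to fit it, which is the point of the cross-validation reported in the paper.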

  8. [Strengthen the cancer surveillance to promote cancer prevention and control in China].

    PubMed

    He, J

    2018-01-23

    Cancer is a major chronic disease threatening people's health in China. We reviewed the latest advances in cancer surveillance, prevention and control in China, which may provide important clues for future cancer control. We used data from the National Central Cancer Registry to describe and analyze the latest cancer statistics in China. We summarized updated information on cancer control policies, networks, and programs in the country, and provide suggestions on future strategies for cancer prevention and control. The overall cancer burden in China has been increasing during the past decades. In 2014, there were about 3 804 000 new cancer cases and 2 296 000 cancer deaths in China. The age-standardized cancer incidence and mortality rates were 190.63/100 000 and 106.98/100 000, respectively. China has formed a comprehensive network for cancer prevention and control, and nationwide population-based cancer surveillance has been built up. The population coverage of cancer surveillance has been expanded, and data quality has improved. As the population ages and unhealthy lifestyles persist, cancer will remain a substantial burden in China. Based on the comprehensive rationale of cancer control and prevention, the National Cancer Center of China will perform its duty for future precise cancer control and prevention, guided by cancer surveillance statistics.

  9. Quantifying the transmission potential of pandemic influenza

    NASA Astrophysics Data System (ADS)

    Chowell, Gerardo; Nishiura, Hiroshi

    2008-03-01

    This article reviews quantitative methods to estimate the basic reproduction number of pandemic influenza, a key threshold quantity that helps determine the intensity of interventions required to control the disease. Although it is difficult to assess the transmission potential of a probable future pandemic, historical epidemiologic data are readily available from previous pandemics, and as a reference for future pandemic planning, mathematical and statistical analyses of historical data are crucial. In particular, because many historical records tend to document only the temporal distribution of cases or deaths (i.e., the epidemic curve), our review focuses on methods to maximize the utility of time-evolution data and to clarify the detailed mechanisms of the spread of influenza. First, we highlight structured epidemic models and their parameter estimation methods, which can quantify detailed disease dynamics including those we cannot observe directly. Duration-structured epidemic systems are subsequently presented, offering a firm understanding of the definitions of the basic and effective reproduction numbers. When the initial growth phase of an epidemic is investigated, the distribution of the generation time is the key statistical information needed to appropriately estimate the transmission potential from the intrinsic growth rate. Applications of stochastic processes are also highlighted to estimate the transmission potential from similar data. Critically important characteristics of influenza data are subsequently summarized, followed by our conclusions suggesting potential future methodological improvements.
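    The growth-rate route to the reproduction number mentioned above can be sketched with the standard relation R0 = 1/M(-r), where M is the moment-generating function of the generation-time distribution, here assumed gamma-distributed. Parameter values below are illustrative, not estimates from the article:

```python
import numpy as np

def r0_from_growth_rate(r, gen_mean, gen_sd):
    """Basic reproduction number from the intrinsic (exponential) growth
    rate r and a gamma-distributed generation time, via R0 = 1 / M(-r).
    For a gamma(shape, scale) distribution, M(-r) = (1 + r*scale)^(-shape)."""
    shape = (gen_mean / gen_sd) ** 2
    scale = gen_sd ** 2 / gen_mean
    return (1.0 + r * scale) ** shape

# Illustrative pandemic-flu-like values: ~3-day mean generation time,
# cases doubling roughly every 3 days, so r = ln(2)/3 per day
r = np.log(2) / 3.0
r0 = r0_from_growth_rate(r, gen_mean=3.0, gen_sd=1.5)
```

    With these placeholder inputs R0 falls in the range typically quoted for pandemic influenza (roughly 1.5-2), illustrating why the generation-time distribution is the key statistical ingredient: for fixed r, a wider distribution lowers the implied R0.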

  10. Depressive symptoms and long-term income: The Young Finns Study.

    PubMed

    Hakulinen, Christian; Elovainio, Marko; Pulkki-Råback, Laura; Böckerman, Petri; Viinikainen, Jutta; Pehkonen, Jaakko; Raitakari, Olli T; Keltikangas-Järvinen, Liisa; Hintsanen, Mirka

    2016-11-01

    Higher depressive symptoms have been associated with lower future income. However, studies examining this issue have had limited follow-up times and have used self-reported measures of income, and possible confounders or mediators have not been accounted for. A total of 971 women and 738 men were selected from the ongoing prospective Young Finns Study (YFS) that began in 1980. Depressive symptoms were measured in 1992, when participants were 15 to 30 years old. Information on annual income and earnings from 1993 to 2010 was obtained from the Finnish Longitudinal Employer-Employee Data (FLEED) of Statistics Finland and linked to the YFS. Higher depressive symptoms were associated with lower future income and earnings. For men, the associations were robust to controlling for childhood parental socioeconomic status, history of unemployment, and adulthood health behavior, but attenuated by about 35% when three major temperament traits were taken into account. For women, a similar pattern was found; however, in the models adjusted for temperament traits the associations did not remain statistically significant. The association between depressive symptoms and earnings was three times stronger for men than for women. Previous depressive episodes could have influenced some participants' economic and educational choices. Higher depressive symptoms in adolescence and early adulthood lead to significant future losses of total income and earnings, and this association is particularly strong for men.

  11. Clinical psychopharmacology of borderline personality disorder: an update on the available evidence in light of the Diagnostic and Statistical Manual of Mental Disorders - 5.

    PubMed

    Ripoll, Luis H

    2012-01-01

    Clinical considerations for evidence-based treatment of borderline personality disorder (BPD) are outlined in the context of the best available evidence, discussed with reference to BPD traits currently identified in the upcoming Diagnostic and Statistical Manual of Mental Disorders - 5 (DSM-5) revision. The DSM-5 will highlight refractory affective, interpersonal, and identity symptoms in BPD as potential treatment targets. In addition to providing a framework for clinical decision-making, future research strategies will also focus on neurotransmitter systems of greater relevance to understanding overall personality functioning. Although only a few randomized controlled trials of psychopharmacological treatments for BPD have been published recently, several meta-analyses and systematic reviews converge on the consensus effectiveness of lamotrigine, topiramate, valproate, aripiprazole, olanzapine, and omega-3 fatty acid supplementation. Stronger evidence exists for treating disinhibition and antagonism than negative affectivity, particularly interpersonal facets of such traits. In addition, basic research suggests a future role for modifying glutamatergic, opioid, and oxytocinergic neurotransmitter systems to treat BPD. Clinicians should utilize omega-3, anticonvulsants, and atypical antipsychotic agents in treating specific DSM-5 BPD traits, notably disinhibition, antagonism, and some aspects of negative affectivity. Future research will focus on normalizing opioid and oxytocin dysregulation, as an adjunct to evidence-based psychotherapy, in an effort to improve interpersonal functioning.

  12. Exoplanet Biosignatures: Future Directions

    PubMed Central

    Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y.; Lenardic, Adrian; Reinhard, Christopher T.; Moore, William; Schwieterman, Edward W.; Shkolnik, Evgenya L.; Smith, Harrison B.

    2018-01-01

    We introduce a Bayesian method for guiding future directions for detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples for how the Bayesian formalism could guide future search strategies, including determining observations to prioritize or deciding between targeted searches or larger lower resolution surveys to generate ensemble statistics and address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets—Biosignatures—Life detection—Bayesian analysis. Astrobiology 18, 779–824. PMID:29938538
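    In its simplest form, the Bayesian formalism the abstract describes reduces to Bayes' rule for a single detection. All probabilities below are illustrative placeholders for the likelihoods and priors the authors propose to constrain:

```python
def posterior_life(prior, p_signal_given_life, p_signal_given_no_life):
    """Bayes' rule for the probability of life given a detected biosignature.
    Inputs are assumed/illustrative: constraining them (abiotic false-positive
    rates, priors on the emergence of life) is the program described above."""
    num = p_signal_given_life * prior
    return num / (num + p_signal_given_no_life * (1.0 - prior))

# Even a fairly clean detection moves a very small prior only modestly,
# which is why constraining the prior and the false-positive likelihood matters
p = posterior_life(prior=1e-3, p_signal_given_life=0.9,
                   p_signal_given_no_life=0.05)
```

    The same arithmetic shows why survey design matters: ensemble statistics from many targets can tighten the prior itself, while a targeted search only updates one planet's posterior.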

  13. Reserve growth of oil and gas fields—Investigations and applications

    USGS Publications Warehouse

    Cook, Troy A.

    2013-01-01

    The reserve growth of fields has been a topic of ongoing discussion for over half a century and will continue to be studied well into the future, owing to the expected size of the volumetric contribution of reserve growth to the future supply of oil and natural gas. Understanding past methods of estimating future volumes, and the data-assembly methods they used, can lead to a better understanding of their applicability. Past methods are statistical in nature and rest on (1) a possibly high level of dependency on a limited number of fields, (2) an assumed age-based correlation with effective reserve growth, and (3) an assumption that reserve growth is long-lived and more common than not; these limitations may be addressed by employing a more geologically based approach.

  14. Exoplanet Biosignatures: Future Directions.

    PubMed

    Walker, Sara I; Bains, William; Cronin, Leroy; DasSarma, Shiladitya; Danielache, Sebastian; Domagal-Goldman, Shawn; Kacar, Betul; Kiang, Nancy Y; Lenardic, Adrian; Reinhard, Christopher T; Moore, William; Schwieterman, Edward W; Shkolnik, Evgenya L; Smith, Harrison B

    2018-06-01

    We introduce a Bayesian method for guiding future directions for detection of life on exoplanets. We describe empirical and theoretical work necessary to place constraints on the relevant likelihoods, including those emerging from better understanding stellar environment, planetary climate and geophysics, geochemical cycling, the universalities of physics and chemistry, the contingencies of evolutionary history, the properties of life as an emergent complex system, and the mechanisms driving the emergence of life. We provide examples for how the Bayesian formalism could guide future search strategies, including determining observations to prioritize or deciding between targeted searches or larger lower resolution surveys to generate ensemble statistics and address how a Bayesian methodology could constrain the prior probability of life with or without a positive detection. Key Words: Exoplanets-Biosignatures-Life detection-Bayesian analysis. Astrobiology 18, 779-824.

  15. Statistical Analysis of a Round-Robin Measurement Survey of Two Candidate Materials for a Seebeck Coefficient Standard Reference Material

    PubMed Central

    Lu, Z. Q. J.; Lowhorn, N. D.; Wong-Ng, W.; Zhang, W.; Thomas, E. L.; Otani, M.; Green, M. L.; Tran, T. N.; Caylor, C.; Dilley, N. R.; Downey, A.; Edwards, B.; Elsner, N.; Ghamaty, S.; Hogan, T.; Jie, Q.; Li, Q.; Martin, J.; Nolas, G.; Obara, H.; Sharp, J.; Venkatasubramanian, R.; Willigan, R.; Yang, J.; Tritt, T.

    2009-01-01

    In an effort to develop a Standard Reference Material (SRM™) for Seebeck coefficient, we have conducted a round-robin measurement survey of two candidate materials—undoped Bi2Te3 and Constantan (55 % Cu and 45 % Ni alloy). Measurements were performed in two rounds by twelve laboratories involved in active thermoelectric research using a number of different commercial and custom-built measurement systems and techniques. In this paper we report the detailed statistical analyses on the interlaboratory measurement results and the statistical methodology for analysis of irregularly sampled measurement curves in the interlaboratory study setting. Based on these results, we have selected Bi2Te3 as the prototype standard material. Once available, this SRM will be useful for future interlaboratory data comparison and instrument calibrations. PMID:27504212

  16. Short-term international migration trends in England and Wales from 2004 to 2009.

    PubMed

    Whitworth, Simon; Loukas, Konstantinos; McGregor, Ian

    2011-01-01

    Short-term migration estimates for England and Wales are the latest addition to the Office for National Statistics (ONS) migration statistics. This article discusses definitions of short-term migration and the methodology that is used to produce the estimates. Some of the estimates and the changes in the estimates over time are then discussed. The article includes previously unpublished short-term migration statistics and therefore helps to give a more complete picture of the size and characteristics of short-term international migration for England and Wales than has previously been possible. ONS have identified a clear user requirement for short-term migration estimates at local authority (LA) level. Consequently, attention is also paid to the progress that has been made and future work that is planned to distribute England and Wales short-term migration estimates to LA level.

  17. A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks

    PubMed Central

    Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng

    2009-01-01

    Statistical models for reverse engineering gene regulatory networks are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review existing models covering the different aspects of gene regulation and discuss the pros and cons of each model. In addition, network inference algorithms are surveyed under the graphical modeling framework by the categories of point solutions and probabilistic solutions, and the connections and differences among the algorithms are discussed. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885

  18. Novel statistical tools for management of public databases facilitate community-wide replicability and control of false discovery.

    PubMed

    Rosset, Saharon; Aharoni, Ehud; Neuvirth, Hani

    2014-07-01

    Issues of publication bias, lack of replicability, and false discovery have long plagued the genetics community. Proper utilization of public and shared data resources presents an opportunity to ameliorate these problems. We present an approach to public database management that we term Quality Preserving Database (QPD). It enables perpetual use of the database for testing statistical hypotheses while controlling false discovery and avoiding publication bias on the one hand, and maintaining testing power on the other hand. We demonstrate it on a use case of a replication server for GWAS findings, underlining its practical utility. We argue that a shift to using QPD in managing current and future biological databases will significantly enhance the community's ability to make efficient and statistically sound use of the available data resources. © 2014 WILEY PERIODICALS, INC.

  19. Attitudes towards statistics of graduate entry medical students: the role of prior learning experiences

    PubMed Central

    2014-01-01

    Background While statistics is increasingly taught as part of the medical curriculum, it can be an unpopular subject, and feedback from students indicates that some find it more difficult than other subjects. Understanding attitudes towards statistics on entry to graduate entry medical programmes is particularly important, given that many students may have been exposed to quantitative courses in their previous degree and hence bring preconceptions of their ability and interest to their medical education programme. The aim of this study therefore is to explore, for the first time, attitudes towards statistics of graduate entry medical students from a variety of backgrounds, focusing on the role of prior learning experiences. Methods 121 first-year graduate entry medical students completed the Survey of Attitudes toward Statistics instrument together with information on demographics and prior learning experiences. Results Students tended to appreciate the relevance of statistics in their professional life and were prepared to put effort into learning it. They had neutral to positive attitudes about their interest in statistics and their intellectual knowledge and skills when applied to it. Their feelings towards statistics were slightly less positive (e.g., feelings of insecurity, stress, fear, and frustration), and they tended to view statistics as difficult. Even though 85% of students had taken a quantitative course in the past, only 24% described it as likely that they would take any course in statistics if the choice was theirs. How well students felt they had performed in mathematics in the past was a strong predictor of many of the components of attitudes. Conclusion The teaching of statistics to medical students should start by addressing the association between students' past experiences in mathematics and their attitudes towards statistics, and by encouraging students to recognise the difference between the two disciplines. Addressing these issues may reduce students' anxiety and perception of difficulty at the start of their learning experience and encourage students to engage with statistics in their future careers. PMID:24708762

  20. Attitudes towards statistics of graduate entry medical students: the role of prior learning experiences.

    PubMed

    Hannigan, Ailish; Hegarty, Avril C; McGrath, Deirdre

    2014-04-04

    While statistics is increasingly taught as part of the medical curriculum, it can be an unpopular subject, and feedback from students indicates that some find it more difficult than other subjects. Understanding attitudes towards statistics on entry to graduate entry medical programmes is particularly important, given that many students may have been exposed to quantitative courses in their previous degree and hence bring preconceptions of their ability and interest to their medical education programme. The aim of this study therefore is to explore, for the first time, attitudes towards statistics of graduate entry medical students from a variety of backgrounds, focusing on the role of prior learning experiences. 121 first-year graduate entry medical students completed the Survey of Attitudes toward Statistics instrument together with information on demographics and prior learning experiences. Students tended to appreciate the relevance of statistics in their professional life and were prepared to put effort into learning it. They had neutral to positive attitudes about their interest in statistics and their intellectual knowledge and skills when applied to it. Their feelings towards statistics were slightly less positive (e.g., feelings of insecurity, stress, fear, and frustration), and they tended to view statistics as difficult. Even though 85% of students had taken a quantitative course in the past, only 24% described it as likely that they would take any course in statistics if the choice was theirs. How well students felt they had performed in mathematics in the past was a strong predictor of many of the components of attitudes. The teaching of statistics to medical students should start by addressing the association between students' past experiences in mathematics and their attitudes towards statistics, and by encouraging students to recognise the difference between the two disciplines. Addressing these issues may reduce students' anxiety and perception of difficulty at the start of their learning experience and encourage students to engage with statistics in their future careers.

  1. A systematic review of Bayesian articles in psychology: The last 25 years.

    PubMed

    van de Schoot, Rens; Winter, Sonja D; Ryan, Oisín; Zondervan-Zwijnenburg, Mariëlle; Depaoli, Sarah

    2017-06-01

    Although the statistical tools most often used by researchers in the field of psychology over the last 25 years are based on frequentist statistics, it is often claimed that the alternative Bayesian approach to statistics is gaining in popularity. In the current article, we investigated this claim by performing the very first systematic review of Bayesian psychological articles published between 1990 and 2015 (n = 1,579). We aim to provide a thorough presentation of the role Bayesian statistics plays in psychology. This historical assessment allows us to identify trends and see how Bayesian methods have been integrated into psychological research in the context of different statistical frameworks (e.g., hypothesis testing, cognitive models, IRT, SEM, etc.). We also describe take-home messages and provide "big-picture" recommendations to the field as Bayesian statistics becomes more popular. Our review indicated that Bayesian statistics is used in a variety of contexts across subfields of psychology and related disciplines. There are many different reasons why one might choose to use Bayes (e.g., the use of priors, estimating otherwise intractable models, modeling uncertainty, etc.). We found in this review that the use of Bayes has increased and broadened in the sense that this methodology can be used in a flexible manner to tackle many different forms of questions. We hope this presentation opens the door for a larger discussion regarding the current state of Bayesian statistics, as well as future trends. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Analysis and meta-analysis of single-case designs: an introduction.

    PubMed

    Shadish, William R

    2014-04-01

    The last 10 years have seen great progress in the analysis and meta-analysis of single-case designs (SCDs). This special issue includes five articles that provide an overview of current work on that topic, including standardized mean difference statistics, multilevel models, Bayesian statistics, and generalized additive models. Each article analyzes a common example across articles and presents syntax or macros for how to do them. These articles are followed by commentaries from single-case design researchers and journal editors. This introduction briefly describes each article and then discusses several issues that must be addressed before we can know what analyses will eventually be best to use in SCD research. These issues include modeling trend, modeling error covariances, computing standardized effect size estimates, assessing statistical power, incorporating more accurate models of outcome distributions, exploring whether Bayesian statistics can improve estimation given the small samples common in SCDs, and the need for annotated syntax and graphical user interfaces that make complex statistics accessible to SCD researchers. The article then discusses reasons why SCD researchers are likely to incorporate statistical analyses into their research more often in the future, including changing expectations and contingencies regarding SCD research from outside SCD communities, changes and diversity within SCD communities, corrections of erroneous beliefs about the relationship between SCD research and statistics, and demonstrations of how statistics can help SCD researchers better meet their goals. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  3. Views of medical students: what, when and how do they want statistics taught?

    PubMed

    Fielding, S; Poobalan, A; Prescott, G J; Marais, D; Aucott, L

    2015-11-01

    A key skill for a practising clinician is being able to do research, understand the statistical analyses, and interpret results in the medical literature. Basic statistics has become essential within medical education, but when, what, and in which format it should be taught is uncertain. To inform curriculum design and development, we undertook a quantitative survey of fifth-year medical students and followed it up with a series of focus groups to obtain their opinions on what statistics teaching they want, when, and how. A total of 145 students undertook the survey, and five focus groups were held with between 3 and 9 participants each. Previous statistical training varied; students recognised that their knowledge was inadequate and were keen to see additional training implemented. Students were aware of the importance of statistics to their future careers but apprehensive about learning it. Face-to-face teaching supported by online resources was popular. Focus groups indicated the need for statistical training early in the degree and highlighted students' lack of confidence and inconsistencies in support. The study found that students see the importance of statistics training in the medical curriculum but that timing and mode of delivery are key. The findings have informed the design of a new course to be implemented in the third undergraduate year. Teaching will be based around published studies, aiming to equip students with the basics required, with additional resources available through a virtual learning environment. © The Author(s) 2015.

  4. Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output

    NASA Astrophysics Data System (ADS)

    Milroy, D.; Hammerling, D.; Baker, A. H.

    2017-12-01

    Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical difference caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprised of millions of lines of code which is developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify CESM components that define or compute the variables found in the first step. Finally, we employ the application Kernel GENerator (KGEN) created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.
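    The variable-discovery step described above can be sketched, under assumptions, as stability selection with L1-penalized logistic regression, a close relative of Randomized Logistic Regression: variables selected in a large fraction of bootstrap resamples are flagged as likely drivers of statistical distinguishability. Everything below (the toy data, parameter values, and helper names) is illustrative, not CESM-ECT code.

```python
import numpy as np

# Illustrative stability-selection sketch: fit an L1-penalized logistic
# regression on bootstrap resamples and count how often each variable
# receives a nonzero coefficient. Parameter values are arbitrary choices.

rng = np.random.default_rng(0)

def fit_l1_logistic(X, y, lam, lr=0.1, steps=500):
    """Proximal gradient (ISTA) for L1-penalized logistic regression."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(steps):
        grad = X.T @ (1.0 / (1.0 + np.exp(-(X @ w))) - y) / n
        w = w - lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # soft threshold
    return w

def selection_frequencies(X, y, lam=0.05, n_boot=50):
    """Fraction of bootstrap resamples in which each variable is selected."""
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        counts += np.abs(fit_l1_logistic(X[idx], y[idx], lam)) > 1e-6
    return counts / n_boot

# Toy data: only the first 2 of 6 "variables" actually drive the labels,
# standing in for the CESM variables that separate pass from fail runs.
X = rng.normal(size=(200, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
freq = selection_frequencies(X, y)
print(freq)  # the two informative variables dominate the selection counts
```

    Variables with high selection frequency would then be traced back, as in the procedure above, to the CESM components that define or compute them.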

  5. Development of the Ethical and Legal Issues in Counseling Self-Efficacy Scale

    ERIC Educational Resources Information Center

    Mullen, Patrick R.; Lambie, Glenn W.; Conley, Abigail H.

    2014-01-01

    The authors present the development of the Ethical and Legal Issues in Counseling Self-Efficacy Scale (ELICSES). The purpose of this article is threefold: (a) present a rationale for the ELICSES, (b) review statistical analysis procedures used to develop the ELICSES, and (c) offer implications for future research and counselor education.

  6. Information-Theoretic Properties of Auditory Sequences Dynamically Influence Expectation and Memory

    ERIC Educational Resources Information Center

    Agres, Kat; Abdallah, Samer; Pearce, Marcus

    2018-01-01

    A basic function of cognition is to detect regularities in sensory input to facilitate the prediction and recognition of future events. It has been proposed that these implicit expectations arise from an internal predictive coding model, based on knowledge acquired through processes such as statistical learning, but it is unclear how different…

  7. Illinois Kids Count 2001: Envisioning the Future.

    ERIC Educational Resources Information Center

    Baker, Brenda; Familia, Yahaira; Gifford, Amy; Knowlton, Gretchen; Matakis, Brian; Olson, Melissa; Owens, Tracy; Zasadny, Julie

    This Kids Count report examines statewide trends in the well-being of Illinois' children. The statistical portrait is based on indicators in the areas of family, education and child care, arts and recreation, safety, health, and economic security. The indicators are: (1) percent of children living in poverty; (2) number of children enrolled in…

  8. Occupational Decision-Related Processes for Amotivated Adolescents: Confirmation of a Model

    ERIC Educational Resources Information Center

    Jung, Jae Yup; McCormick, John

    2011-01-01

    This study developed and (statistically) confirmed a new model of the occupational decision-related processes of adolescents, in terms of the extent to which they may be amotivated about choosing a future occupation. A theoretical framework guided the study. A questionnaire that had previously been administered to an Australian adolescent sample…

  9. 76 FR 9696 - Equipment Price Forecasting in Energy Conservation Standards Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-22

    ... for particular efficiency design options, an empirical experience curve fit to the available data may be used to forecast future costs of such design option technologies. If a statistical evaluation indicates a low level of confidence in estimates of the design option cost trend, this method should not be...

  10. A second look at North Dakota's timber lands, 1980.

    Treesearch

    Pamela J. Jakes; W. Brad Smith

    1982-01-01

    The second inventory of North Dakota forest resources shows a decline in commercial forest area between 1954 and 1980. Presented are text and statistics on forest area and timber volume, growth, mortality, ownership, stocking, future timber supply, timber use, forest management opportunities, and nontimber forest resources. A forest type map is included.

  11. Office of Indian Affairs Progress Report, June 1977, State of New Mexico.

    ERIC Educational Resources Information Center

    New Mexico State Commission on Indian Affairs, Santa Fe.

    Included in this report on the New Mexico Office of Indian Affairs are brief sections dealing with: the Agency's background; past projects and future goals; the Research and Statistics Center; projects under consideration; and the Agency's Tribal Liaison Officer, Tribal Relations Specialist, and Planner. Cited as major Agency accomplishments…

  12. A Profile of Public School Biology Teachers in the USA.

    ERIC Educational Resources Information Center

    Lindauer, Ivo E.; Queitzsch, Mary L.

    1996-01-01

    Uses data from the National Center for Educational Statistics' Schools and Staffing Survey (SASS) to present a profile of biology teachers. Discusses background of biology teachers, preparation in the physical and life sciences, who does the preparation, and expected future trends. Compares data with results reported for chemistry, earth science,…

  13. A Report on Girls in San Francisco: Benchmarks for the Future.

    ERIC Educational Resources Information Center

    Lehman, Ann; Sacco, Carol

    This study collected information on girls in San Francisco, California in the areas of demographics, economics, education, health, safety and violence, and criminal justice. Data came from local, state, and national sources (e.g., the U.S. Census Bureau; the California Bureau of Justice and the Criminal Statistics Center; the California Department…

  14. Indiana's timber.

    Treesearch

    John S. Spencer, Jr.

    1969-01-01

    The second (1967) survey of Indiana's 4 million forested acres shows 3.5 billion cubic feet of growing stock on 3.9 million acres of commercial forest land. Presented are statistics on timber area, volume, growth, mortality, and use. Projections of timber growth, removals, and inventory are made to 1992, and possible future changes in the forest are discussed....

  15. Railroad safety program, task 2

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Aspects of railroad safety and the preparation of a National Inspection Plan (NIP) for rail safety improvement are examined. Methodology for the allocation of inspection resources, preparation of a NIP instruction manual, and recommendations for future NIP, are described. A statistical analysis of regional rail accidents is presented with causes and suggested preventive measures included.

  16. America's Changing Work Force. Statistics in Brief.

    ERIC Educational Resources Information Center

    American Association of Retired Persons, Washington, DC.

    This booklet provides information about the changing work force. It offers a profile of workers aged 45 and older, as well as likely changes in the work force of the future. Tables and graphs illustrate the following: profile of Americans aged 50 and older, by employment status; employment status of the civilian noninstitutional population by age…

  17. Preparing for Opportunities and Pitfalls...of the Future. Manpower Report No. 88-4.

    ERIC Educational Resources Information Center

    Purdue Univ., Lafayette, IN. Office of Manpower Studies.

    This report presents and discusses tabular statistical data covering trends in population distribution, education, social and economic well-being, and employment in the United States and the Great Lakes region. Section I, "Population Distribution and Change," contains a map of U.S. population growth projections for 1980-2000 and three…

  18. How Fast Is Fast Enough?

    ERIC Educational Resources Information Center

    Henke, Karen Greenwood

    2007-01-01

    Just how much bandwidth does the average student in the United States have access to today, and how much will he or she need in the future? That depends, according to district CTOs, state technology directors, industry experts, and classroom teachers. The National Center for Education Statistics reports that 97 percent of U.S. public schools with…

  19. International Graduate Student Mobility in the US: What More Can We Be Doing?

    ERIC Educational Resources Information Center

    Roberts, Darbi L.

    2012-01-01

    This article examines the current growth statistics of international graduate student populations in the United States in order to present trends in international student mobility. Although many scholars suggest the United States is facing a decrease in future international student demand, recent studies seem to challenge this theory. This article…

  20. The Effectiveness of the Counterinsurgency Operations During the Macedonian Conflict in 2001

    DTIC Science & Technology

    2010-12-10

    comes from the statistical analysis of demographic data that predicts that in the near future the high Albanian birth rate (Albanians have highest......among the first members of the NATO Partnership for Peace program, which was a first step to joining NATO. Until the emergence of the insurgency in

  1. Forces Influencing Rural Community Growth.

    ERIC Educational Resources Information Center

    Rainey, Kenneth D.

    The paper briefly focuses on two questions: Can the recent growth trend be expected to continue into the future? and What does this imply as far as public policy and programs are concerned? Statistics on growth in the seventies suggest three possibilities: a change in the functions of metropolitan and nonmetropolitan areas; the decline of the city…

  2. A Study of Graduate Student Ethics in Leadership Preparation Programs

    ERIC Educational Resources Information Center

    Roberts, Kerry; Sampson, Pauline

    2011-01-01

    The ethics and character of a superintendent are important, and much research has been done in this area. This study attempted to take the instrument developed by Fenstermaker (1994) and determine whether future superintendents have the same ethical response as current superintendents. The small number (n=20) limited statistical comparisons. Three…

  3. Behavioral and Social Science Research: A National Resource. Part II.

    ERIC Educational Resources Information Center

    Adams, Robert McC., Ed.; And Others

    Areas of behavioral and social science research that have achieved significant breakthroughs in knowledge or application or that show future promise of achieving such breakthroughs are discussed in 12 papers. For example, the paper on formal demography shows how mathematical or statistical techniques can be used to explain and predict change in…

  4. Enrollment Simulation and Planning. Strategies & Solutions Series, No. 3.

    ERIC Educational Resources Information Center

    McIntyre, Chuck

    Enrollment simulation and planning (ESP) is centered on the use of statistical models to describe how and why college enrollments fluctuate. College planners may use this approach with confidence to simulate any number of plausible future scenarios. Planners can then set a variety of possible college actions against these scenarios, and examine…

  5. Complete Statistical Survey Results of 1982 Texas Competency Validation Project.

    ERIC Educational Resources Information Center

    Rogers, Sandra K.; Dahlberg, Maurine F.

    This report documents a project to develop current statewide validated competencies for auto mechanics, diesel mechanics, welding, office occupations, and printing. Section 1 describes the four steps used in the current competency validation project and provides a standardized process for conducting future studies at the local or statewide level.…

  6. The Impact of Mentoring on Post-Secondary Success of Adult Education Graduates

    ERIC Educational Resources Information Center

    Kaltenbaugh, Jane Kitzer

    2017-01-01

    Earning a bachelor's degree or a post-secondary certificate can have a significant impact on a person's quality of life and financial future; however, statistics show that many high school graduates who continue to post-secondary studies do not complete their degree or certification. Adult education students (non-traditional students) are…

  7. A Strong Future for Public Library Use and Employment

    ERIC Educational Resources Information Center

    Griffiths, Jose-Marie; King, Donald W.

    2011-01-01

    The latest and most comprehensive assessment of public librarians' education and career paths to date, this important volume reports on a large-scale research project performed by authors Jose-Marie Griffiths and Donald W. King. Presented in collaboration with the Office for Research and Statistics (ORS), the book includes an examination of trends…

  8. Rethinking Nationalism: Seeking Answers for Future Black Voices

    ERIC Educational Resources Information Center

    Shockley, KMT G.

    2004-01-01

    Many organizations that conduct statistical analyses of demographic information indicate that the United States is rapidly becoming a more diverse place (e.g. U.S. Census, 2000). Many of the major newspapers such as the Washington Post on March 7, 2003, seemed eager to print stories about Hispanics "overtaking" Blacks as the nation's…

  9. Which Industries Are Sensitive to Business Cycles?

    ERIC Educational Resources Information Center

    Berman, Jay; Pfleeger, Janet

    1997-01-01

    An analysis of the 1994-2005 Bureau of Labor Statistics employment projections can be used to identify industries that are projected to move differently with business cycles in the future than they have in the past, as well as the industries and occupations that are most prone to business cycle swings. (Author)

  10. International housing construction developments - implications for hardwood utilization

    Treesearch

    Delton Alderman

    2011-01-01

    This paper describes the current state of international housing markets, provides general and statistical information on regional housing markets, and posits implications for the future. The emphasis is on regions that use appreciable quantities of wood in housing construction, principally North America, Europe, and Japan. In the past 15 years, housing markets...

  11. Writing Assessment in the Department of Mathematics at Rochester Institute of Technology.

    ERIC Educational Resources Information Center

    Birken, Marcia

    The goal of writing assessment in the Department of Mathematics at Rochester Institute of Technology (RIT) is to assure that students can communicate about mathematics or statistics in a manner appropriate to their future careers. A five-member writing committee, composed of mathematics faculty, assesses students at three different times during…

  12. Direct Observation of Teacher and Student Behavior in School Settings: Trends, Issues and Future Directions

    ERIC Educational Resources Information Center

    Lewis, Timothy J.; Scott, Terrance M.; Wehby, Joseph H.; Wills, Howard P.

    2014-01-01

    Across the modern history of the field of special education and emotional/behavioral disorders (EBD), direct observation of student and educator behavior has been an essential component of the diagnostic process, student progress monitoring, and establishing functional and statistical relationships within research. This article provides an…

  13. Voces (Voices): A Profile of Today's Latino College Students

    ERIC Educational Resources Information Center

    Santiago, Deborah A.

    2007-01-01

    Latinos are the youngest and fastest growing ethnic group in the United States. It is imperative that institutional leaders and decision makers have a better understanding of Latino students today in order to shape the policies and practices to serve college students in the future. Currently, disparate statistics about Latino students in higher…

  14. A Voice from the Trenches: A Reaction to Ivey and Ivey (1998).

    ERIC Educational Resources Information Center

    Hinkle, J. Scott

    1999-01-01

    Offers reaction to Ivey and Ivey's article regarding the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders. Discusses the medical model versus the developmental model in relation to counselor education and training, diagnostic bias, and the future identity of professional counselors. Concludes that defining a theoretical…

  15. Development and Factor Structure of the Helping Professional Wellness Discrepancy Scale

    ERIC Educational Resources Information Center

    Blount, Ashley J.; Lambie, Glenn W.

    2018-01-01

    The authors present the development of the Helping Professional Wellness Discrepancy Scale (HPWDS). The purpose of this article is threefold: (a) present a rationale for the HPWDS; (b) review statistical analyses procedures used to develop the HPWDS; and (c) offer implications for counselors, other helping professionals, and future research.

  16. America's Children and Their Families: Key Facts.

    ERIC Educational Resources Information Center

    Simons, Janet M.; Eng, Mary

    For advocates, parents, educators, researchers, and speechmakers, this book brings together key facts and statistical data about the American family, now and in the near future. The first section provides an overview of population and demographic trends extending through the first decade of the 21st century. This overview is followed by sections…

  17. The distribution of density in supersonic turbulence

    NASA Astrophysics Data System (ADS)

    Squire, Jonathan; Hopkins, Philip F.

    2017-11-01

    We propose a model for the statistics of the mass density in supersonic turbulence, which plays a crucial role in star formation and the physics of the interstellar medium (ISM). The model is derived by considering the density to be arranged as a collection of strong shocks of width ~M^{-2}, where M is the turbulent Mach number. With two physically motivated parameters, the model predicts all density statistics for M>1 turbulence: the density probability distribution and its intermittency (deviation from lognormality), the density variance-Mach number relation, power spectra and structure functions. For the proposed model parameters, reasonable agreement is seen between model predictions and numerical simulations, albeit within the large uncertainties associated with current simulation results. More generally, the model could provide a useful framework for more detailed analysis of future simulations and observational data. Due to the simple physical motivations for the model in terms of shocks, it is straightforward to generalize to more complex physical processes, which will be helpful in future more detailed applications to the ISM. We see good qualitative agreement between such extensions and recent simulations of non-isothermal turbulence.
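
    The abstract measures intermittency as a deviation from lognormality. As a point of reference, a minimal sketch of the standard lognormal closure for isothermal turbulence follows: the PDF of s = ln(ρ/ρ0) is Gaussian with variance σ_s² = ln(1 + b²M²) and mean −σ_s²/2 (so mass is conserved). The forcing parameter b = 0.4 and Mach number used here are illustrative assumptions, not values from the paper.

```python
import math

def lognormal_s_pdf(s, mach, b=0.4):
    """Gaussian PDF of s = ln(rho/rho0) under the lognormal closure.

    sigma_s^2 = ln(1 + b^2 M^2); mean = -sigma_s^2 / 2 so that <rho> = rho0.
    The forcing parameter b is an assumed illustrative value.
    """
    var = math.log(1.0 + b * b * mach * mach)
    mean = -0.5 * var
    return math.exp(-(s - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

# Check normalization with a simple Riemann sum over s in [-10, 10]:
ss = [i * 0.01 - 10 for i in range(2001)]
area = sum(lognormal_s_pdf(s, mach=5.0) for s in ss) * 0.01
```

    Measured deviations of simulated PDFs from this baseline are what the shock-based model above is designed to capture.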

  18. Multi objective climate change impact assessment using multi downscaled climate scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2016-04-01

    Outputs of Global Climate Models (GCMs) are often downscaled to obtain climatic parameters at regional scales. In the present study, we analyze changes in precipitation and temperature for the future scenario period 2070-2099 with respect to the historical period 1970-2000, using a set of statistically downscaled GCM projections for the Columbia River Basin (CRB). The analysis uses two different statistically downscaled climate products, namely the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. Spatial, temporal and frequency-based parameters are analyzed for the future period at a 1/16th degree scale over the entire CRB. Results indicate varied spatial change patterns across the basin, especially in its western part. At temporal scales, winter precipitation shows higher variability than summer precipitation, and vice versa for temperature. Frequency analysis provides insight into possible explanations for the changes in precipitation.

  19. Multivariate Statistical Inference of Lightning Occurrence, and Using Lightning Observations

    NASA Technical Reports Server (NTRS)

    Boccippio, Dennis

    2004-01-01

    Two classes of multivariate statistical inference using TRMM Lightning Imaging Sensor, Precipitation Radar, and Microwave Imager observations are studied, using nonlinear classification neural networks as inferential tools. The very large and globally representative data sample provided by TRMM allows both training and validation (without overfitting) of neural networks with many degrees of freedom. In the first study, the flashing / non-flashing condition of storm complexes is diagnosed using radar, passive microwave and/or environmental observations as neural network inputs. The diagnostic skill of these simple lightning/no-lightning classifiers can be quite high over land (above 80% Probability of Detection; below 20% False Alarm Rate). In the second study, passive microwave and lightning observations are used to diagnose radar reflectivity vertical structure. Because a priori diagnosis of hydrometeor vertical structure is highly important for improved rainfall retrieval from either orbital radars (e.g., the future Global Precipitation Mission "mothership") or radiometers (e.g., operational SSM/I and future Global Precipitation Mission passive microwave constellation platforms), we explore the incremental benefit to such diagnosis provided by lightning observations.

  20. Performance index for virtual reality phacoemulsification surgery

    NASA Astrophysics Data System (ADS)

    Söderberg, Per; Laurell, Carl-Gustaf; Simawi, Wamidh; Skarman, Eva; Nordqvist, Per; Nordh, Leif

    2007-02-01

    We have developed a virtual reality (VR) simulator for phacoemulsification (phaco) surgery. The current work aimed at developing a performance index that characterizes the performance of an individual trainee. We recorded measurements of 28 response variables during three iterated surgical sessions in 9 subjects naive to cataract surgery and 6 experienced cataract surgeons, separately for the sculpting phase and the evacuation phase of phacoemulsification surgery. We further defined a specific performance index for a specific measurement variable and a total performance index for a specific trainee. The distribution function for the total performance index was relatively even for both the sculpting and the evacuation phases, indicating that parametric statistics can be used in the future to compare average total performance indices across groups. The current total performance index for an individual weights all included measurement variables equally. It is possible that future development of the system will indicate that a better characterization of a trainee can be obtained if the various measurement variables are given specific weights. The currently developed total performance index for a trainee is statistically an independent observation of that particular trainee.

  1. Gender differences in perception of workplace sexual harassment among future professionals

    PubMed Central

    Banerjee, Amitav; Sharma, Bhavana

    2011-01-01

    Background: Indian society is in a stage of rapid social transition. As more women enter the workforce, stresses between the genders are to be expected in the patriarchal society to which most of our population belongs. Earlier studies in Western societies have revealed gender differences in perception of what constitutes sexual harassment. Aim: To elicit gender differences, if any, in the perception of workplace sexual harassment among future professionals. Settings and Design: A cross-sectional study among the students of professional colleges. Materials and Methods: A total of 200 students of both sexes were randomly selected from four professional colleges. Data collection was done on a structured questionnaire by interview. Statistical Analysis: Internal consistency of the questionnaire was tested by Cronbach's α coefficient. Associations between gender and perceptions were explored with Chi-square and Odds Ratio with 95% confidence interval, where applicable. Results: The differences in perception of what constitutes sexual harassment among the genders were statistically significant on many measures (P<0.01). Conclusions: Men and women differ in their awareness of what constitutes sexual harassment, with men showing less awareness. PMID:22969176
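
    The chi-square and odds-ratio machinery used in the abstract can be sketched for a single 2×2 (gender × perception) table. The counts below are invented for illustration; they are not the study's data. The odds-ratio CI uses the standard Woolf (log) method.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-scale) 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: rows = women/men, cols = "considers X harassment" yes/no
chi2 = chi2_2x2(70, 30, 45, 55)
or_, lo, hi = odds_ratio_ci(70, 30, 45, 55)
```

    A chi-square value above 3.84 (1 df) corresponds to P<0.05; an odds-ratio CI excluding 1 tells the same story on the effect-size scale.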

  2. A Fundamental Study on Spectrum Center Estimation of Solar Spectral Irradiation by the Statistical Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Iijima, Aya; Suzuki, Kazumi; Wakao, Shinji; Kawasaki, Norihiro; Usami, Akira

    Against a background of environmental problems and energy issues, PV systems are expected to be introduced rapidly and connected to power grids on a large scale in the future. This raises concern about how PV generation will affect supply-demand balancing in electric power systems, so techniques for accurately estimating PV power generation are becoming increasingly important. PV power generation depends on solar irradiance, module temperature, and solar spectral irradiance, i.e., the distribution of light intensity over wavelength. Because the spectral sensitivity of a solar cell depends on the cell type, spectral irradiance matters for an exact grasp of PV power generation. Obtaining solar spectral irradiance data is not easy, however, because the instruments that measure it are expensive. With this background, we propose a new method based on statistical pattern recognition for estimating the spectrum center, a representative index of solar spectral irradiance. Some numerical examples obtained by the proposed method are also presented.

  3. Field warming experiments shed light on the wheat yield response to temperature in China

    PubMed Central

    Zhao, Chuang; Piao, Shilong; Huang, Yao; Wang, Xuhui; Ciais, Philippe; Huang, Mengtian; Zeng, Zhenzhong; Peng, Shushi

    2016-01-01

    Wheat growth is sensitive to temperature, but the effect of future warming on yield is uncertain. Here, focusing on China, we compiled 46 observations of the sensitivity of wheat yield to temperature change (SY,T, yield change per °C) from field warming experiments and 102 SY,T estimates from local process-based and statistical models. The average SY,T values from field warming experiments, local process-based models and statistical models are −0.7±7.8(±s.d.)% per °C, −5.7±6.5% per °C and 0.4±4.4% per °C, respectively. Moreover, SY,T differs across regions: warming experiments indicate positive SY,T values in regions where growing-season mean temperature is low and water supply is not limiting, and negative values elsewhere. Gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project appear to capture the spatial pattern of SY,T deduced from warming observations. These results from local manipulative experiments could be used to improve crop models in the future. PMID:27853151

  4. Not my future? Core values and the neural representation of future events.

    PubMed

    Brosch, Tobias; Stussi, Yoann; Desrichard, Olivier; Sander, David

    2018-06-01

    Individuals with pronounced self-transcendence values have been shown to put greater weight on the long-term consequences of their actions when making decisions. Using functional magnetic resonance imaging, we investigated the neural mechanisms underlying the evaluation of events occurring several decades in the future as well as the role of core values in these processes. Thirty-six participants viewed a series of events, consisting of potential consequences of climate change, which could occur in the near future (around 2030), and thus would be experienced by the participants themselves, or in the far future (around 2080). We observed increased activation in anterior VMPFC (BA11), a region involved in encoding the personal significance of future events, when participants were envisioning far future events, demonstrating for the first time that the role of the VMPFC in future projection extends to the time scale of decades. Importantly, this activation increase was observed only in participants with pronounced self-transcendence values measured by self-report questionnaire, as shown by a statistically significant interaction of temporal distance and value structure. These findings suggest that future projection mechanisms are modulated by self-transcendence values to allow for a more extensive simulation of far future events. Consistent with this, these participants reported similar concern ratings for near and far future events, whereas participants with pronounced self-enhancement values were more concerned about near future events. Our findings provide a neural substrate for the tendency of individuals with pronounced self-transcendence values to consider the long-term consequences of their actions.

  5. Will Outer Tropical Cyclone Size Change due to Anthropogenic Warming?

    NASA Astrophysics Data System (ADS)

    Schenkel, B. A.; Lin, N.; Chavas, D. R.; Vecchi, G. A.; Knutson, T. R.; Oppenheimer, M.

    2017-12-01

    Prior research has shown significant interbasin and intrabasin variability in outer tropical cyclone (TC) size. Moreover, outer TC size has even been shown to vary substantially over the lifetime of the majority of TCs. However, the factors responsible for both setting initial outer TC size and determining its evolution throughout the TC lifetime remain uncertain. Given these gaps in our physical understanding, there remains uncertainty in how outer TC size will change, if at all, due to anthropogenic warming. The present study seeks to quantify whether outer TC size will change significantly in response to anthropogenic warming using data from a high-resolution global climate model and a regional hurricane model. Similar to prior work, the outer TC size metric used in this study is the radius at which the azimuthal-mean surface azimuthal wind equals 8 m/s. The initial results from the high-resolution global climate model data suggest that the distribution of outer TC size shifts significantly towards larger values in each global TC basin during future climates, as revealed by 1) a statistically significant increase of the median outer TC size by 5-10% (p<0.05) according to a 1,000-sample bootstrap resampling approach with replacement and 2) statistically significant differences between distributions of outer TC size from current and future climate simulations as shown using two-sample Kolmogorov-Smirnov testing (p<<0.01). Additional analysis of the high-resolution global climate model data reveals that outer TC size does not uniformly increase within each basin in future climates, but rather shows substantial locational dependence. Future work will incorporate the regional mesoscale hurricane model data to help focus on identifying the source of the spatial variability in outer TC size increases within each basin during future climates and, more importantly, why outer TC size changes in response to anthropogenic warming.
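
    The two tests named in the abstract (a 1,000-sample bootstrap on the median and a two-sample Kolmogorov-Smirnov comparison of distributions) can be sketched in a few lines. The synthetic "outer size" samples below, and the assumed ~8% shift, are illustrative only, not the study's model output.

```python
import random

def bootstrap_median_diff(current, future, n_boot=1000, seed=0):
    """1,000-sample bootstrap (resampling with replacement) percentile CI
    for the difference of medians between two samples."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        cur = sorted(rng.choices(current, k=len(current)))
        fut = sorted(rng.choices(future, k=len(future)))
        diffs.append(fut[len(fut) // 2] - cur[len(cur) // 2])
    diffs.sort()
    return diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)]

def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap between
    the two empirical CDFs."""
    xs, ys = sorted(x), sorted(y)

    def cdf(sample, v):
        return sum(1 for s in sample if s <= v) / len(sample)

    return max(abs(cdf(xs, v) - cdf(ys, v)) for v in xs + ys)

# Synthetic outer-size samples (km) with an assumed ~8% future shift:
rng = random.Random(1)
current = [rng.gauss(250, 40) for _ in range(300)]
future = [v * 1.08 for v in current]
lo, hi = bootstrap_median_diff(current, future)
d = ks_statistic(current, future)
```

    A bootstrap CI for the median difference that excludes zero, together with a KS statistic exceeding its critical value, is the pattern of evidence the abstract describes.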

  6. Ocean waves from tropical cyclones in the Gulf of Mexico and the effect of climate change

    NASA Astrophysics Data System (ADS)

    Appendini, C. M.; Pedrozo-Acuña, A.; Meza-Padilla, R.; Torres-Freyermuth, A.; Cerezo-Mota, R.; López-González, J.

    2016-12-01

    Generating projections of wave climate associated with tropical cyclones is challenging due to the short historical record of events, their low occurrence, and the poor wind-field resolution of General Circulation Models. Synthetic tropical cyclones provide an alternative to overcome such limitations, enabling robust statistics under present and future climates. We use synthetic events to characterize present and future wave climate associated with tropical cyclones in the Gulf of Mexico. The NCEP/NCAR atmospheric reanalysis and the Coupled Model Intercomparison Project Phase 5 models NOAA/GFDL CM3 and UK Met Office HADGEM2-ES were used to derive present and future wave climate under RCPs 4.5 and 8.5. The results suggest an increase in wave activity for the future climate, particularly for the GFDL model, which shows less bias in the present climate, although wave energy is expected to decrease in some areas. The practical implications of determining the future wave climate are exemplified by means of the 100-year design wave, where the use of the present climate alone may result in under- or over-design of structures, since the lifespan of a structure extends into the future wave climate period.

  7. Climate change impact on soil erosion in the Mandakini River Basin, North India

    NASA Astrophysics Data System (ADS)

    Khare, Deepak; Mondal, Arun; Kundu, Sananda; Mishra, Prabhash Kumar

    2017-09-01

    Correct estimation of soil loss at the catchment level helps land and water resources planners identify priority areas for soil conservation measures. Soil erosion is one of the major hazards affected by climate change: increasing rainfall intensity increases erosion, alongside other factors such as land-use change. This has motivated growing efforts to model future rainfall and project future soil erosion. In the present study, future rainfall has been generated by downscaling GCM (Global Circulation Model) data for the Mandakini river basin, a hilly catchment in the state of Uttarakhand, India, to assess the future impact on soil erosion within the basin. The USLE is an erosion prediction model designed to predict the long-term average annual soil loss from specific field slopes under specified land use and management systems (i.e., crops, rangeland, and recreational areas), here applied using remote sensing and GIS technologies. Projected soil erosion shows an increasing trend, driven by the increasing rainfall generated by the statistical downscaling method.
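
    The USLE itself is a simple product of factors, A = R·K·LS·C·P. The sketch below shows the equation's structure; all factor values are assumed for illustration and are not calibrated to the Mandakini basin.

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.

    A  : average annual soil loss
    R  : rainfall erosivity factor
    K  : soil erodibility factor
    LS : slope length-steepness factor
    C  : cover-management factor
    P  : support-practice factor
    """
    return R * K * LS * C * P

# Illustrative (assumed) factor values for a hilly sub-catchment:
baseline = usle_soil_loss(R=550, K=0.30, LS=4.2, C=0.25, P=0.8)
# A downscaled-rainfall scenario mainly raises the erosivity factor R:
future = usle_soil_loss(R=650, K=0.30, LS=4.2, C=0.25, P=0.8)
```

    Because R enters multiplicatively, any projected increase in rainfall erosivity translates directly into a proportional increase in predicted soil loss, which is the mechanism behind the abstract's increasing trend.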

  8. The Hurst exponent in energy futures prices

    NASA Astrophysics Data System (ADS)

    Serletis, Apostolos; Rosenberg, Aryeh Adam

    2007-07-01

    This paper extends the work in Elder and Serletis [Long memory in energy futures prices, Rev. Financial Econ., forthcoming, 2007] and Serletis et al. [Detrended fluctuation analysis of the US stock market, Int. J. Bifurcation Chaos, forthcoming, 2007] by re-examining the empirical evidence for random walk type behavior in energy futures prices. In doing so, it uses daily data on energy futures traded on the New York Mercantile Exchange, over the period from July 2, 1990 to November 1, 2006, and a statistical physics approach, the 'detrending moving average' technique, which provides a reliable framework for testing the information efficiency of financial markets, as shown by Alessio et al. [Second-order moving average and scaling of stochastic time series, Eur. Phys. J. B 27 (2002) 197-200] and Carbone et al. [Time-dependent hurst exponent in financial time series. Physica A 344 (2004) 267-271; Analysis of clusters formed by the moving average of a long-range correlated time series. Phys. Rev. E 69 (2004) 026105]. The results show that energy futures returns display long memory and that the particular form of long memory is anti-persistence.
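
    The detrending-moving-average idea can be sketched as follows: compute the RMS deviation of the series from an n-point moving average for several window sizes n, then estimate the Hurst exponent H as the log-log slope of that RMS against n. This is a simplified trailing-window sketch, not the authors' exact implementation; the random-walk benchmark (which should give H near 0.5) is synthetic, and H below 0.5 would indicate the anti-persistence the paper reports.

```python
import math
import random

def dma_sigma(y, n):
    """RMS deviation of series y from its n-point trailing moving average."""
    csum = [0.0]
    for v in y:
        csum.append(csum[-1] + v)
    devs = []
    for i in range(n - 1, len(y)):
        ma = (csum[i + 1] - csum[i + 1 - n]) / n
        devs.append((y[i] - ma) ** 2)
    return math.sqrt(sum(devs) / len(devs))

def dma_hurst(y, windows=(16, 32, 64, 128, 256)):
    """Slope of log sigma_DMA(n) versus log n estimates the Hurst exponent H."""
    xs = [math.log(n) for n in windows]
    ys = [math.log(dma_sigma(y, n)) for n in windows]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (v - my) for x, v in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Benchmark: a pure random walk, for which H should be near 0.5
rng = random.Random(42)
walk, s = [], 0.0
for _ in range(20000):
    s += rng.gauss(0, 1)
    walk.append(s)
H = dma_hurst(walk)
```

    Applied to a price path, H ≈ 0.5 is consistent with a random walk (market efficiency), while H < 0.5 signals anti-persistence.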

  9. Predictors of self-rated health: a 12-month prospective study of IT and media workers.

    PubMed

    Hasson, Dan; Arnetz, Bengt B; Theorell, Töres; Anderberg, Ulla Maria

    2006-07-31

    The aim of the present study was to determine health-related risk and salutogenic factors and to use these to construct prediction models for future self-rated health (SRH), i.e. find possible characteristics predicting individuals improving or worsening in SRH over time (0-12 months). A prospective study was conducted with measurements (physiological markers and self-ratings) at 0, 6 and 12 months, involving 303 employees (187 men and 116 women, age 23-64) from four information technology and two media companies. There were a multitude of statistically significant cross-sectional correlations (Spearman's Rho) between SRH and other self-ratings as well as physiological markers. Predictors of future SRH were baseline ratings of SRH, self-esteem and social support (logistic regression), and SRH, sleep quality and sense of coherence (linear regression). The results of the present study indicate that baseline SRH and other self-ratings are predictive of future SRH. It is cautiously implied that SRH, self-esteem, social support, sleep quality and sense of coherence might be predictors of future SRH and therefore possibly also of various future health outcomes.

  10. An operational GLS model for hydrologic regression

    USGS Publications Warehouse

    Tasker, Gary D.; Stedinger, J.R.

    1989-01-01

    Recent Monte Carlo studies have documented the value of generalized least squares (GLS) procedures to estimate empirical relationships between streamflow statistics and physiographic basin characteristics. This paper presents a number of extensions of the GLS method that deal with realities and complexities of regional hydrologic data sets that were not addressed in the simulation studies. These extensions include: (1) a more realistic model of the underlying model errors; (2) smoothed estimates of cross correlation of flows; (3) procedures for including historical flow data; (4) diagnostic statistics describing leverage and influence for GLS regression; and (5) the formulation of a mathematical program for evaluating future gaging activities. © 1989.
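
    The GLS estimator at the heart of this approach is β = (XᵀΛ⁻¹X)⁻¹XᵀΛ⁻¹y, where Λ is the covariance matrix of the model plus sampling errors (its off-diagonal terms carry the cross-correlation of flows at nearby gages). The sketch below is a minimal pure-Python illustration with invented numbers, not the operational model of the paper.

```python
def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def mat_inv(A):
    """Gauss-Jordan inverse (adequate for the small matrices used here)."""
    n = len(A)
    M = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(A)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        piv = M[i][i]
        M[i] = [v / piv for v in M[i]]
        for r in range(n):
            if r != i:
                f = M[r][i]
                M[r] = [v - f * w for v, w in zip(M[r], M[i])]
    return [row[n:] for row in M]

def gls(X, y, Lam):
    """GLS estimator: beta = (X' Lam^-1 X)^-1 X' Lam^-1 y."""
    Li = mat_inv(Lam)
    Xt = [list(col) for col in zip(*X)]
    XtLi = mat_mul(Xt, Li)
    A = mat_inv(mat_mul(XtLi, X))
    b = mat_mul(A, mat_mul(XtLi, [[v] for v in y]))
    return [r[0] for r in b]

# Synthetic regional regression: log flow statistic vs. log drainage area,
# with an off-diagonal Lam term for two cross-correlated gages.
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
y = [2.1, 3.0, 3.9, 5.1]
Lam = [[0.10, 0.05, 0.00, 0.00],
       [0.05, 0.10, 0.00, 0.00],
       [0.00, 0.00, 0.20, 0.00],
       [0.00, 0.00, 0.00, 0.20]]
beta = gls(X, y, Lam)
```

    With Λ diagonal and constant this collapses to ordinary least squares; the GLS weighting is what lets correlated, unequally reliable gage records be combined without bias in the precision estimates.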

  11. Modeling longitudinal data, I: principles of multivariate analysis.

    PubMed

    Ravani, Pietro; Barrett, Brendan; Parfrey, Patrick

    2009-01-01

    Statistical models are used to study the relationship between exposure and disease while accounting for the potential role of other factors' impact on outcomes. This adjustment is useful to obtain unbiased estimates of true effects or to predict future outcomes. Statistical models include a systematic component and an error component. The systematic component explains the variability of the response variable as a function of the predictors and is summarized in the effect estimates (model coefficients). The error element of the model represents the variability in the data unexplained by the model and is used to build measures of precision around the point estimates (confidence intervals).

  12. Carrier statistics and quantum capacitance effects on mobility extraction in two-dimensional crystal semiconductor field-effect transistors

    NASA Astrophysics Data System (ADS)

    Ma, Nan; Jena, Debdeep

    2015-03-01

    In this work, the consequence of the high band-edge density of states on the carrier statistics and quantum capacitance in transition metal dichalcogenide two-dimensional semiconductor devices is explored. The study questions the validity of commonly used expressions for extracting carrier densities and field-effect mobilities from the transfer characteristics of transistors with such channel materials. By comparison to experimental data, a new method for the accurate extraction of carrier densities and mobilities is outlined. The work thus highlights a fundamental difference between these materials and traditional semiconductors that must be considered in future experimental measurements.

  13. Past and future changes in Canadian boreal wildfire activity.

    PubMed

    Girardin, Martin P; Mudelsee, Manfred

    2008-03-01

    Climate change in Canadian boreal forests is usually associated with increased drought severity and fire activity. However, future fire activity could well be within the range of values experienced during the preindustrial period. In this study, we contrast 21st century forecasts of fire occurrence (FireOcc, number of large forest fires per year) in the southern part of the Boreal Shield, Canada, with the historical range of the past 240 years statistically reconstructed from tree-ring width data. First, a historical relationship between drought indices and FireOcc is developed over the calibration period 1959-1998. Next, together with seven tree-ring based drought reconstructions covering the last 240 years and simulations from the CGCM3 and ECHAM4 global climate models, the calibration model is used to estimate past (prior to 1959) and future (post 1999) FireOcc. Last, time-dependent changes in mean FireOcc and in the occurrence rate of extreme fire years are evaluated with the aid of advanced methods of statistical time series analysis. Results suggest that the increase in precipitation projected toward the end of the 21st century will be insufficient to compensate for increasing temperatures and to maintain potential evapotranspiration at current levels. Limited moisture availability would cause FireOcc to increase as well. But will future FireOcc exceed its historical range? The results obtained from our approach suggest high probabilities of seeing future FireOcc reach the upper limit of the historical range. Predictions, which are essentially weighted toward northwestern Ontario and eastern boreal Manitoba, indicate that, by 2061-2100, typical FireOcc could increase by more than 34% when compared with the past two centuries. Increases in fire activity as projected by this study could negatively affect the implementation in the next century of forest management inspired by historical or natural disturbance dynamics.
This approach is indeed feasible only if current and future fire activities are sufficiently low compared with the preindustrial fire activity, so a substitution of fire by forest management could occur without elevating the overall frequency of disturbance. Conceivable management options will likely have to be directed toward minimizing the adverse impacts of the increasing fire activity.

  14. Modelling uncertainties and possible future trends of precipitation and temperature for 10 sub-basins in Columbia River Basin (CRB)

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, A.; Rana, A.; Qin, Y.; Moradkhani, H.

    2014-12-01

    Trends and changes in future climatic parameters, such as precipitation and temperature, have been a central part of climate change studies. In the present work, we analyze seasonal and yearly trends, and the associated prediction uncertainties, in all 10 sub-basins of the Columbia River Basin (CRB) for the future period 2010-2099. The work is carried out using two different sets of statistically downscaled Global Climate Model (GCM) projection datasets, i.e., Bias Correction and Statistical Downscaling (BCSD) generated at Portland State University and the Multivariate Adaptive Constructed Analogs (MACA) generated at the University of Idaho. The analysis is done with 10 downscaled GCM products from each CMIP5 daily dataset, totaling 40 different downscaled products for robust analysis. Summer, winter and yearly trend analyses are performed for all 10 sub-basins using linear regression (significance tested by Student's t test) and the Mann-Kendall test (0.05 significance level) for precipitation (P), maximum temperature (Tmax) and minimum temperature (Tmin). Thereafter, uncertainty across all models is quantified for all parameters in the 10 sub-basins and across the CRB for the future scenario periods. Results indicate varied degrees of trends across the sub-basins, mostly pointing towards a significant increase in all three climatic parameters for all seasons and at the yearly scale. Uncertainty analysis reveals very high variation in all parameters across models and sub-basins. Basin-wide uncertainty analysis is performed to corroborate results from the smaller, sub-basin scale; similar trends and uncertainties are found at the larger scale as well. Interestingly, both trends and uncertainties are higher during the winter period than during summer, contributing a large part of the yearly change.
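
    The Mann-Kendall test used here is a rank-based trend test: S counts concordant minus discordant pairs, and a normal approximation converts S into a z-score. A minimal sketch (ignoring tie corrections for brevity) on an illustrative, synthetic winter-precipitation series with an imposed upward drift:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: returns (S, z). |z| > 1.96 indicates a trend
    significant at the two-sided 0.05 level. Ties are ignored for brevity."""
    n = len(x)
    S = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    if S > 0:
        z = (S - 1) / math.sqrt(var)
    elif S < 0:
        z = (S + 1) / math.sqrt(var)
    else:
        z = 0.0
    return S, z

# Synthetic 40-year series with an upward drift plus alternating variability:
series = [100 + 0.8 * t + ((-1) ** t) * 5 for t in range(40)]
S, z = mann_kendall(series)
```

    Being rank-based, the test is insensitive to outliers and to the distribution of the data, which is why it is a standard companion to linear regression in hydroclimatic trend studies.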

  15. Global Pyrogeography: the Current and Future Distribution of Wildfire

    PubMed Central

    Krawchuk, Meg A.; Moritz, Max A.; Parisien, Marc-André; Van Dorn, Jeff; Hayhoe, Katharine

    2009-01-01

    Climate change is expected to alter the geographic distribution of wildfire, a complex abiotic process that responds to a variety of spatial and environmental gradients. How future climate change may alter global wildfire activity, however, is still largely unknown. As a first step to quantifying potential change in global wildfire, we present a multivariate quantification of environmental drivers for the observed, current distribution of vegetation fires using statistical models of the relationship between fire activity and resources to burn, climate conditions, human influence, and lightning flash rates at a coarse spatiotemporal resolution (100 km, over one decade). We then demonstrate how these statistical models can be used to project future changes in global fire patterns, highlighting regional hotspots of change in fire probabilities under future climate conditions as simulated by a global climate model. Based on current conditions, our results illustrate how the availability of resources to burn and climate conditions conducive to combustion jointly determine why some parts of the world are fire-prone and others are fire-free. In contrast to any expectation that global warming should necessarily result in more fire, we find that regional increases in fire probabilities may be counter-balanced by decreases at other locations, due to the interplay of temperature and precipitation variables. Despite this net balance, our models predict substantial invasion and retreat of fire across large portions of the globe. These changes could have important effects on terrestrial ecosystems since alteration in fire activity may occur quite rapidly, generating ever more complex environmental challenges for species dispersing and adjusting to new climate conditions. 
Our findings highlight the potential for widespread impacts of climate change on wildfire, suggesting severely altered fire regimes and the need for more explicit inclusion of fire in research on global vegetation-climate change dynamics and conservation planning. PMID:19352494

  16. Plaque Echolucency and Stroke Risk in Asymptomatic Carotid Stenosis: A Systematic Review and Meta-Analysis

    PubMed Central

    Gupta, Ajay; Kesavabhotla, Kartik; Baradaran, Hediyeh; Kamel, Hooman; Pandya, Ankur; Giambrone, Ashley E.; Wright, Drew; Pain, Kevin J.; Mtui, Edward E.; Suri, Jasjit S.; Sanelli, Pina C.; Mushlin, Alvin I.

    2014-01-01

    Background and Purpose Ultrasonographic plaque echolucency has been studied as a stroke risk marker in carotid atherosclerotic disease. We performed a systematic review and meta-analysis to summarize the association between ultrasound determined carotid plaque echolucency and future ipsilateral stroke risk. Methods We searched the medical literature for studies evaluating the association between carotid plaque echolucency and future stroke in asymptomatic patients. We included prospective observational studies with stroke outcome ascertainment after baseline carotid plaque echolucency assessment. We performed a meta-analysis and assessed study heterogeneity and publication bias. We also performed subgroup analyses limited to patients with stenosis ≥50%, studies in which plaque echolucency was determined via subjective visual interpretation, studies with a relatively lower risk of bias, and studies published after the year 2000. Results We analyzed data from 7 studies on 7557 subjects with a mean follow up of 37.2 months. We found a significant positive relationship between predominantly echolucent (compared to predominantly echogenic) plaques and the risk of future ipsilateral stroke across all stenosis severities (0-99%) (relative risk [RR], 2.31, 95% CI, 1.58-3.39, P<.001) and in subjects with ≥50% stenosis (RR, 2.61 95% CI, 1.47-4.63, P=.001). A statistically significant increased RR for future stroke was preserved in all additional subgroup analyses. No statistically significant heterogeneity or publication bias was present in any of the meta-analyses. Conclusions The presence of ultrasound-determined carotid plaque echolucency provides predictive information in asymptomatic carotid artery stenosis beyond luminal stenosis. However, the magnitude of the increased risk is not sufficient on its own to identify patients likely to benefit from surgical revascularization. PMID:25406150
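
    The pooled relative risks quoted above come from standard inverse-variance meta-analysis on the log-RR scale. The sketch below shows a fixed-effect version of that pooling; the three cohorts and all counts are hypothetical, not the seven studies analyzed in the paper.

```python
import math

def pooled_rr(studies):
    """Fixed-effect inverse-variance pooling of log relative risks.

    Each study is (events_exposed, n_exposed, events_control, n_control).
    Returns (pooled RR, lower 95% CI, upper 95% CI)."""
    num = den = 0.0
    for a, na, c, nc in studies:
        rr = (a / na) / (c / nc)
        se2 = 1 / a - 1 / na + 1 / c - 1 / nc   # variance of log RR
        w = 1 / se2
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se = math.sqrt(1 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - 1.96 * se),
            math.exp(log_rr + 1.96 * se))

# Three hypothetical cohorts (echolucent vs. echogenic plaques, stroke counts):
rr, lo, hi = pooled_rr([(24, 400, 10, 380),
                        (15, 300, 8, 350),
                        (30, 900, 14, 800)])
```

    A pooled RR whose CI excludes 1 is the kind of result reported in the abstract; a random-effects variant would additionally widen the CI by the between-study variance when heterogeneity is present.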

  17. Statistics of acoustic emissions and stress drops during granular shearing using a stick-slip fiber bundle model

    NASA Astrophysics Data System (ADS)

    Cohen, D.; Michlmayr, G.; Or, D.

    2012-04-01

    Shearing of dense granular materials appears in many engineering and Earth science applications. Under a constant strain rate, the shearing stress at steady state oscillates, with slow rises followed by rapid drops that are linked to the build-up and failure of force chains. Experiments indicate that these drops display exponential statistics. Measurements of acoustic emissions during shearing indicate that the energy liberated by failure of these force chains has power-law statistics. Representing force chains as fibers, we use a stick-slip fiber bundle model to obtain analytical solutions for the statistical distributions of stress drops and failure energy. In the model, fibers stretch, fail, and regain strength during deformation. Fibers have Weibull-distributed threshold strengths with either quenched or annealed disorder. The shapes of the distributions for drops and energy obtained from the model are similar to those measured during shearing experiments. This simple model may be useful to identify failure events linked to force chain failures. Future generalizations of the model that include different types of fiber failure may also allow identification of different types of granular failures that have distinct statistical acoustic emission signatures.
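
    A minimal strain-driven simulation of such a stick-slip bundle might look like the sketch below. Equal load sharing, the parameter values, and instantaneous re-sticking are assumptions for illustration, not the paper's exact model.

```python
import random

def simulate_bundle(n_fibers=200, shape=2.0, scale=1.0,
                    d_strain=1e-3, n_steps=5000, seed=0):
    """Strain-driven stick-slip fiber bundle with equal load sharing.

    Each fiber stretches with the global strain, slips when its extension
    exceeds a Weibull-distributed threshold, and immediately re-sticks at
    zero extension with a freshly drawn threshold (annealed disorder).
    Returns the list of stress drops (total load released per step).
    """
    rng = random.Random(seed)
    ext = [0.0] * n_fibers
    thr = [rng.weibullvariate(scale, shape) for _ in range(n_fibers)]
    drops = []
    for _ in range(n_steps):
        released = 0.0
        for i in range(n_fibers):
            ext[i] += d_strain
            if ext[i] >= thr[i]:
                released += ext[i]                         # load shed by the slipping fiber
                ext[i] = 0.0                               # fiber re-sticks
                thr[i] = rng.weibullvariate(scale, shape)  # annealed disorder
        if released > 0.0:
            drops.append(released)
    return drops
```

    Histogramming the returned drop sizes is then the natural way to compare against the exponential statistics observed in experiments.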

  18. A framework for evaluating statistical downscaling performance under changing climatic conditions (Invited)

    NASA Astrophysics Data System (ADS)

    Dixon, K. W.; Balaji, V.; Lanzante, J.; Radhakrishnan, A.; Hayhoe, K.; Stoner, A. K.; Gaitan, C. F.

    2013-12-01

    Statistical downscaling (SD) methods may be viewed as generating a value-added product - a refinement of global climate model (GCM) output designed to add finer scale detail and to address GCM shortcomings via a process that gleans information from a combination of observations and GCM-simulated climate change responses. Making use of observational data sets and GCM simulations representing the same historical period, cross-validation techniques allow one to assess how well an SD method meets this goal. However, lacking observations of the future, the extent to which a particular SD method's skill might degrade when applied to future climate projections cannot be assessed in the same manner. Here we illustrate and describe extensions to a 'perfect model' experimental design that seeks to quantify aspects of SD method performance both for a historical period (1979-2008) and for late 21st century climate projections. Examples highlighting cases in which downscaling performance deteriorates in future climate projections will be discussed. Also, results will be presented showing how synthetic datasets having known statistical properties may be used to further isolate factors responsible for degradations in SD method skill under changing climatic conditions. We will describe a set of input files used to conduct these analyses that are being made available to researchers who wish to utilize this experimental framework to evaluate SD methods they have developed. The gridded data sets cover a region centered on the contiguous 48 United States with a grid spacing of approximately 25 km, have daily time resolution (e.g., maximum and minimum near-surface temperature and precipitation), and represent a total of 120 years of model simulations.
This effort is consistent with the 2013 National Climate Predictions and Projections Platform Quantitative Evaluation of Downscaling Workshop goal of supporting a community approach to promote the informed use of downscaled climate projections.

  19. Statistical Approaches for Spatiotemporal Prediction of Low Flows

    NASA Astrophysics Data System (ADS)

    Fangmann, A.; Haberlandt, U.

    2017-12-01

    An adequate assessment of regional climate change impacts on streamflow requires the integration of various sources of information and modeling approaches. This study proposes simple statistical tools for inclusion into model ensembles, which are fast and straightforward in their application, yet able to yield accurate streamflow predictions in time and space. Target variables for all approaches are annual low flow indices derived from a data set of 51 records of average daily discharge for northwestern Germany. The models require input of climatic data in the form of meteorological drought indices, derived from observed daily climatic variables, averaged over the streamflow gauges' catchment areas. Four different modeling approaches are analyzed. All are based on multiple linear regression models that estimate low flows as a function of a set of meteorological indices and/or physiographic and climatic catchment descriptors. For the first method, individual regression models are fitted at each station, predicting annual low flow values from a set of annual meteorological indices, which are subsequently regionalized using a set of catchment characteristics. The second method combines temporal and spatial prediction within a single panel data regression model, allowing estimation of annual low flow values from input of both annual meteorological indices and catchment descriptors. The third and fourth methods represent non-stationary low flow frequency analyses and require fitting of regional distribution functions. Method three is subject to a spatiotemporal prediction of an index value, method four to estimation of L-moments that adapt the regional frequency distribution to the at-site conditions. The results show that method two outperforms successive prediction in time and space.
Method three also shows a high performance in the near future period, but since it relies on a stationary distribution, its application for prediction of far future changes may be problematic. Spatiotemporal prediction of L-moments appeared highly uncertain for higher-order moments resulting in unrealistic future low flow values. All in all, the results promote an inclusion of simple statistical methods in climate change impact assessment.

  20. Arsenic contamination of drinking water in Ireland: A spatial analysis of occurrence and potential risk.

    PubMed

    McGrory, Ellen R; Brown, Colin; Bargary, Norma; Williams, Natalya Hunter; Mannix, Anthony; Zhang, Chaosheng; Henry, Tiernan; Daly, Eve; Nicholas, Sarah; Petrunic, Barbara M; Lee, Monica; Morrison, Liam

    2017-02-01

    The presence of arsenic in groundwater has become a global concern due to the health risks from drinking water with elevated concentrations. The Water Framework Directive (WFD) of the European Union calls for drinking water risk assessment for member states. The present study amalgamates readily available national and sub-national scale datasets on arsenic in groundwater in the Republic of Ireland. However, due to the presence of high levels of left censoring (i.e. arsenic values below an analytical detection limit) and changes in detection limits over time, the application of conventional statistical methods would inhibit the generation of meaningful results. In order to handle these issues several arsenic databases were integrated and the data modelled using statistical methods appropriate for non-detect data. In addition, geostatistical methods were used to assess principal risk components of elevated arsenic related to lithology, aquifer type and groundwater vulnerability. Geographic statistical methods were used to overcome some of the geographical limitations of the Irish Environmental Protection Agency (EPA) sample database. Nearest-neighbour inverse distance weighting (IDW) and local indicator of spatial association (LISA) methods were used to estimate risk in non-sampled areas. Significant differences were also noted between different aquifer lithologies, indicating that Rhyolite, Sandstone and Shale (Greywackes), and Impure Limestone potentially presented a greater risk of elevated arsenic in groundwaters. Significant differences also occurred among aquifer types with poorly productive aquifers, locally important fractured bedrock aquifers and regionally important fissured bedrock aquifers presenting the highest potential risk of elevated arsenic. No significant differences were detected among different groundwater vulnerability groups as defined by the Geological Survey of Ireland. 
This research will assist management and future policy directions for groundwater resources at the EU level and guide future research on arsenic mobilisation processes, informing the development, testing and treatment of groundwater resources.
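
    The inverse distance weighting (IDW) step used to estimate risk in non-sampled areas can be sketched generically as below; this is a plain IDW estimator, not the authors' exact nearest-neighbour configuration.

```python
def idw(known, x0, y0, power=2.0):
    """Inverse distance weighted estimate at location (x0, y0).

    known: iterable of (x, y, value) sample points.
    Weights fall off as 1 / distance**power.
    """
    num = den = 0.0
    for x, y, v in known:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        if d2 == 0.0:
            return v                 # exactly at a sampled location
        w = d2 ** (-power / 2.0)     # 1 / distance**power, via squared distance
        num += w * v
        den += w
    return num / den
```

    In practice the sum is restricted to the k nearest samples, which is where the "nearest-neighbour" qualifier in the abstract comes in.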

  1. How will precipitation change in extratropical cyclones as the planet warms? Insights from a large initial condition climate model ensemble

    NASA Astrophysics Data System (ADS)

    Yettella, Vineel; Kay, Jennifer E.

    2017-09-01

    The extratropical precipitation response to global warming is investigated within a 30-member initial condition climate model ensemble. As in observations, modeled cyclonic precipitation contributes a large fraction of extratropical precipitation, especially over the ocean and in the winter hemisphere. When compared to present day, the ensemble projects increased cyclone-associated precipitation under twenty-first century business-as-usual greenhouse gas forcing. While the cyclone-associated precipitation response is weaker in the near-future (2016-2035) than in the far-future (2081-2100), both future periods have similar patterns of response. Though cyclone frequency changes are important regionally, most of the increased cyclone-associated precipitation results from increased within-cyclone precipitation. Consistent with this result, cyclone-centric composites show statistically significant precipitation increases in all cyclone sectors. Decomposition into thermodynamic (mean cyclone water vapor path) and dynamic (mean cyclone wind speed) contributions shows that thermodynamics explains 92 and 95% of the near-future and far-future within-cyclone precipitation increases respectively. Surprisingly, the influence of dynamics on future cyclonic precipitation changes is negligible. In addition, the forced response exceeds internal variability in both future time periods. Overall, this work suggests that future cyclonic precipitation changes will result primarily from increased moisture availability in a warmer world, with secondary contributions from changes in cyclone frequency and cyclone dynamics.

  2. Developing future precipitation events from historic events: An Amsterdam case study.

    NASA Astrophysics Data System (ADS)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2016-04-01

    Due to climate change, the frequency and intensity of extreme precipitation events are expected to increase. It is therefore of high importance to develop climate change scenarios tailored towards the local and regional needs of policy makers in order to develop efficient adaptation strategies to reduce the risks from extreme weather events. Current approaches to tailor climate scenarios are often not well adopted in hazard management, since average changes in climate are not a main concern to policy makers, and tailoring climate scenarios to simulate future extremes can be complex. Therefore, a new concept has been introduced recently that uses known historic extreme events as a basis, and modifies the observed data for these events so that the outcome shows how the same event would occur in a warmer climate. This concept is introduced as 'Future Weather', and appeals to the experience of stakeholders and users. This research presents a novel method of projecting a future extreme precipitation event, based on a historic event. The selected precipitation event took place over the broader area of Amsterdam, the Netherlands in the summer of 2014, which resulted in blocked highways, disruption of air transportation, flooded buildings and public facilities. An analysis of rain monitoring stations showed that an event of such intensity has a 5- to 15-year return period. The method of projecting a future event follows a non-linear delta transformation that is applied directly on the observed event assuming a warmer climate to produce an "up-scaled" future precipitation event. The delta transformation is based on the observed behaviour of the precipitation intensity as a function of the dew point temperature during summers. The outcome is then compared to a benchmark method using the HARMONIE numerical weather prediction model, where the boundary conditions of the event from the Ensemble Prediction System of ECMWF (ENS) are perturbed to indicate a warmer climate.
The two methodologies are statistically compared and evaluated. The comparison between the historic event generated by the model and the observed event will give information on the realism of the model for this event. The comparison between the delta transformation method and the future simulation will provide information on how the dynamics would affect the precipitation field, as compared to the statistical method.
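
    The essence of such a delta transformation can be sketched in a few lines. The uniform ~7%/K Clausius-Clapeyron-like rate below is an illustrative assumption; the paper instead fits the observed (possibly super-CC and non-linear) dependence of intensity on dew point temperature.

```python
def delta_transform(precip_mm, delta_td, rate=0.07):
    """Scale an observed precipitation series for a warmer climate.

    precip_mm: observed precipitation amounts (e.g. mm per time step).
    delta_td:  assumed future increase in dew point temperature (K).
    rate:      fractional intensity increase per kelvin (0.07 is a
               Clausius-Clapeyron-like assumption, not the fitted value).
    """
    factor = (1.0 + rate) ** delta_td   # compound scaling over delta_td kelvin
    return [p * factor for p in precip_mm]
```

    Applying the factor multiplicatively preserves the dry time steps of the historic event while intensifying the wet ones, which is what makes the "Future Weather" event recognizable to stakeholders.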

  3. Patterns of crop cover under future climates.

    PubMed

    Porfirio, Luciana L; Newth, David; Harman, Ian N; Finnigan, John J; Cai, Yiyong

    2017-04-01

    We study changes in crop cover under future climate and socio-economic projections. This study is not only organised around the global and regional adaptation or vulnerability to climate change but also includes the influence of projected changes in socio-economic, technological and biophysical drivers, especially regional gross domestic product. The climatic data are obtained from simulations of RCP4.5 and 8.5 by four global circulation models/earth system models from 2000 to 2100. We use Random Forest, an empirical statistical model, to project the future crop cover. Our results show that, at the global scale, increases and decreases in crop cover cancel each other out. Crop cover in the Northern Hemisphere is projected to be impacted more by future climate than in the Southern Hemisphere because of the disparity in the warming rate and precipitation patterns between the two Hemispheres. We found that crop cover in temperate regions is projected to decrease more than in tropical regions. We identified regions of concern and opportunities for climate change adaptation and investment.

  4. Volatilities, Traded Volumes, and Price Increments in Derivative Securities

    NASA Astrophysics Data System (ADS)

    Kim, Kyungsik; Lim, Gyuchang; Kim, Soo Yong; Scalas, Enrico

    2007-03-01

    We apply detrended fluctuation analysis (DFA) to the statistics of Korean treasury bond (KTB) futures, from which the logarithmic increments, volatilities, and traded volumes are estimated over a specific time lag. In our case, the logarithmic increment of futures prices has no long-memory property, while the volatility and the traded volume do exhibit long memory. To determine whether the volatility clustering is due to an inherent higher-order correlation not detected by applying the DFA directly to the logarithmic increments of the KTB futures, we shuffle the original tick data of futures prices and generate a geometric Brownian random walk with the same mean and standard deviation. Comparison of the three tick data sets shows that the higher-order correlation inherent in the logarithmic increments produces the volatility clustering. In particular, the DFA results on volatilities and traded volumes support the hypothesis of price changes.
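
    The DFA procedure itself (cumulative profile, windowed linear detrending, slope of the log-log fluctuation curve) can be sketched generically; for uncorrelated noise the scaling exponent should come out near 0.5, while long memory shows up as a larger exponent. Window sizes below are illustrative.

```python
import math
import random

def _detrended_ms(seg):
    """Mean squared residual after a least-squares linear fit over t = 0..m-1."""
    m = len(seg)
    t_mean = (m - 1) / 2.0
    y_mean = sum(seg) / m
    s_tt = sum((t - t_mean) ** 2 for t in range(m))
    s_ty = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(seg))
    b = s_ty / s_tt
    a = y_mean - b * t_mean
    return sum((y - (a + b * t)) ** 2 for t, y in enumerate(seg)) / m

def dfa(x, window_sizes=(8, 16, 32, 64)):
    """Return the DFA-1 scaling exponent alpha of series x."""
    mean = sum(x) / len(x)
    profile, s = [], 0.0
    for v in x:                                  # cumulative sum of deviations
        s += v - mean
        profile.append(s)
    log_n, log_f = [], []
    for n in window_sizes:
        n_win = len(profile) // n
        ms = [_detrended_ms(profile[i * n:(i + 1) * n]) for i in range(n_win)]
        log_n.append(math.log(n))
        log_f.append(math.log(math.sqrt(sum(ms) / n_win)))
    nx = sum(log_n) / len(log_n)                 # alpha = slope of log F(n) vs log n
    ny = sum(log_f) / len(log_f)
    return (sum((a - nx) * (b - ny) for a, b in zip(log_n, log_f))
            / sum((a - nx) ** 2 for a in log_n))

rng = random.Random(42)
alpha = dfa([rng.gauss(0.0, 1.0) for _ in range(4096)])  # near 0.5 for white noise
```

    Running the same estimator on increments, absolute increments (a volatility proxy), and shuffled data is the comparison the two KTB studies describe.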

  5. Volatilities, traded volumes, and the hypothesis of price increments in derivative securities

    NASA Astrophysics Data System (ADS)

    Lim, Gyuchang; Kim, SooYong; Scalas, Enrico; Kim, Kyungsik

    2007-08-01

    A detrended fluctuation analysis (DFA) is applied to the statistics of Korean treasury bond (KTB) futures, from which the logarithmic increments, volatilities, and traded volumes are estimated over a specific time lag. In this study, the logarithmic increment of futures prices has no long-memory property, while the volatility and the traded volume do exhibit the long-memory property. To determine whether the volatility clustering is due to an inherent higher-order correlation not detected by the direct application of the DFA to logarithmic increments of KTB futures, it is of importance to shuffle the original tick data of futures prices and to generate a geometric Brownian random walk with the same mean and standard deviation. It was found from a comparison of the three tick data sets that the higher-order correlation inherent in logarithmic increments leads to volatility clustering. In particular, the result of the DFA on volatilities and traded volumes can be supported by the hypothesis of price changes.

  6. Inference from the small scales of cosmic shear with current and future Dark Energy Survey data

    DOE PAGES

    MacCrann, N.; Aleksić, J.; Amara, A.; ...

    2016-11-05

    Cosmic shear is sensitive to fluctuations in the cosmological matter density field, including on small physical scales, where matter clustering is affected by baryonic physics in galaxies and galaxy clusters, such as star formation, supernovae feedback and AGN feedback. While muddying any cosmological information that is contained in small scale cosmic shear measurements, this does mean that cosmic shear has the potential to constrain baryonic physics and galaxy formation. We perform an analysis of the Dark Energy Survey (DES) Science Verification (SV) cosmic shear measurements, now extended to smaller scales, and using the Mead et al. 2015 halo model to account for baryonic feedback. While the SV data has limited statistical power, we demonstrate using a simulated likelihood analysis that the final DES data will have the statistical power to differentiate among baryonic feedback scenarios. We also explore some of the difficulties in interpreting the small scales in cosmic shear measurements, presenting estimates of the size of several other systematic effects that make inference from small scales difficult, including uncertainty in the modelling of intrinsic alignment on nonlinear scales, `lensing bias', and shape measurement selection effects. For the latter two, we make use of novel image simulations. While future cosmic shear datasets have the statistical power to constrain baryonic feedback scenarios, there are several systematic effects that require improved treatments, in order to make robust conclusions about baryonic feedback.

  7. Statistical downscaling of sub-daily (6-hour) temperature in Romania, by means of artificial neural networks

    NASA Astrophysics Data System (ADS)

    Birsan, Marius-Victor; Dumitrescu, Alexandru; Cǎrbunaru, Felicia

    2016-04-01

    The role of statistical downscaling is to model the relationship between large-scale atmospheric circulation and climatic variables on a regional and sub-regional scale, making use of the predictions of future circulation generated by General Circulation Models (GCMs) in order to capture the effects of climate change on smaller areas. The study presents a statistical downscaling model based on multi-layer perceptron neural networks. Sub-daily temperature data series from 81 meteorological stations over Romania, with full data records, are used as predictands. As large-scale predictor, the NCEP/NCAR air temperature data at 850 hPa over the domain 20-30E / 40-50N was used, at a spatial resolution of 2.5×2.5 degrees. The period 1961-1990 was used for calibration, while the validation was performed over the 1991-2010 interval. Further, in order to estimate future changes in air temperature for 2021-2050 and 2071-2100, air temperature data at 850 hPa corresponding to the IPCC A1B scenario was extracted from the CNCM33 model (Meteo-France) and used as predictor. This work was carried out within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian Executive Agency for Higher Education Research, Development and Innovation Funding (UEFISCDI).

  8. Statistical approaches for the determination of cut points in anti-drug antibody bioassays.

    PubMed

    Schaarschmidt, Frank; Hofmann, Matthias; Jaki, Thomas; Grün, Bettina; Hothorn, Ludwig A

    2015-03-01

    Cut points in immunogenicity assays are used to classify future specimens as anti-drug antibody (ADA) positive or negative. To determine a cut point during pre-study validation, drug-naive specimens are often analyzed on multiple microtiter plates, taking sources of future variability into account, such as runs, days, analysts, gender, drug spiking, and the biological variability of un-spiked specimens themselves. Five phenomena may complicate the statistical cut point estimation: i) drug-naive specimens may already contain ADA-positives or lead to signals that erroneously appear to be ADA-positive, ii) mean differences between plates may remain after normalization of observations by negative control means, iii) experimental designs may contain several factors in a crossed or hierarchical structure, iv) low sample sizes in such complex designs lead to low power for pre-tests on distribution, outliers and variance structure, and v) the choice between normal and log-normal distribution has a serious impact on the cut point. We discuss statistical approaches to account for these complex data: i) mixture models, which can be used to analyze sets of specimens containing an unknown, possibly large proportion of ADA-positive specimens, ii) random effects models, followed by the estimation of prediction intervals, which provide cut points while accounting for several factors, and iii) diagnostic plots, which allow the post hoc assessment of model assumptions. All methods discussed are available in the corresponding R add-on package mixADA.
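
    For approach ii), the basic idea of a cut point as an upper prediction bound for one future observation can be sketched as follows. This is a simplified normal-theory version with no random effects and a normal quantile standing in for the t quantile; the paper's models are considerably richer.

```python
import math
import statistics

def screening_cut_point(negatives, coverage=0.95):
    """Upper prediction bound for one future drug-naive observation,
    used as a screening cut point (normal-theory approximation).

    negatives: assay responses of drug-naive (ADA-negative) specimens.
    """
    n = len(negatives)
    mean = statistics.fmean(negatives)
    sd = statistics.stdev(negatives)
    z = statistics.NormalDist().inv_cdf(coverage)   # large-sample stand-in for the t quantile
    return mean + z * sd * math.sqrt(1.0 + 1.0 / n) # prediction (not confidence) interval width
```

    The sqrt(1 + 1/n) factor is what distinguishes a prediction bound for a new specimen from a confidence bound on the mean, and is why small validation samples inflate the cut point.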

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kooperman, Gabriel J.; Pritchard, Michael S.; Burt, Melissa A.

    Changes in the character of rainfall are assessed using a holistic set of statistics based on rainfall frequency and amount distributions in climate change experiments with three conventional and superparameterized versions of the Community Atmosphere Model (CAM and SPCAM). Previous work has shown that high-order statistics of present-day rainfall intensity are significantly improved with superparameterization, especially in regions of tropical convection. Globally, the two modeling approaches project a similar future increase in mean rainfall, especially across the Inter-Tropical Convergence Zone (ITCZ) and at high latitudes, but over land, SPCAM predicts a smaller mean change than CAM. Changes in high-order statistics are similar at high latitudes in the two models but diverge at lower latitudes. In the tropics, SPCAM projects a large intensification of moderate and extreme rain rates in regions of organized convection associated with the Madden Julian Oscillation, ITCZ, monsoons, and tropical waves. In contrast, this signal is missing in all versions of CAM, which are found to be prone to predicting increases in the amount but not intensity of moderate rates. Predictions from SPCAM exhibit a scale-insensitive behavior with little dependence on horizontal resolution for extreme rates, while lower resolution (~2°) versions of CAM are not able to capture the response simulated with higher resolution (~1°). Furthermore, moderate rain rates analyzed by the “amount mode” and “amount median” are found to be especially telling as a diagnostic for evaluating climate model performance and tracing future changes in rainfall statistics to tropical wave modes in SPCAM.
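
    The "amount mode" diagnostic mentioned here, i.e. the rain-rate bin that contributes the most total rainfall, can be sketched as below; the bin edges and rates in the test are illustrative, not the study's.

```python
def rainfall_amount_stats(rates, edges):
    """Bin rain rates and return (amount-mode bin center, per-bin amounts).

    rates: individual rain-rate samples (e.g. daily rates in mm/day).
    edges: ascending rate-bin edges; the 'amount mode' is the center of the
           bin whose members contribute the most total rainfall.
    """
    amounts = [0.0] * (len(edges) - 1)
    for r in rates:
        for i in range(len(edges) - 1):
            if edges[i] <= r < edges[i + 1]:
                amounts[i] += r          # amount = rate accumulated into its bin
                break
    k = max(range(len(amounts)), key=amounts.__getitem__)
    return (edges[k] + edges[k + 1]) / 2.0, amounts
```

    Unlike the frequency mode, which is dominated by drizzle, the amount mode tracks the moderate rates that deliver most of the rain, which is why it is a telling diagnostic.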

  10. Subsonic Aircraft Safety Icing Study

    NASA Technical Reports Server (NTRS)

    Jones, Sharon Monica; Reveley, Mary S.; Evans, Joni K.; Barrientos, Francesca A.

    2008-01-01

    NASA's Integrated Resilient Aircraft Control (IRAC) Project is one of four projects within the agency's Aviation Safety Program (AvSafe) in the Aeronautics Research Mission Directorate (ARMD). The IRAC Project, which was redesigned in the first half of 2007, conducts research to advance the state of the art in aircraft control design tools and techniques. A "Key Decision Point" was established for fiscal year 2007 with the following expected outcomes: document the most currently available statistical/prognostic data associated with icing for subsonic transport, summarize reports by subject matter experts in icing research on current knowledge of icing effects on control parameters, and establish future requirements for icing research for subsonic transports, including the appropriate alignment. This study contains: (1) statistical analyses of accident and incident data conducted by NASA researchers for this "Key Decision Point", (2) an examination of icing in other recent statistically based studies, (3) a summary of aviation safety priority lists that have been developed by various subject-matter experts, including the significance of aircraft icing research in these lists and (4) suggested future requirements for NASA icing research. The review of several studies by subject-matter experts was summarized into four high-priority icing research areas. Based on the Integrated Resilient Aircraft Control (IRAC) Project goals and objectives, the IRAC project was encouraged to conduct work in all of the high-priority icing research areas that were identified, with the exception of developing methods to sense and document actual icing conditions.

  11. Statistical model of exotic rotational correlations in emergent space-time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Craig; Kwon, Ohkyung; Richardson, Jonathan

    2017-06-06

    A statistical model is formulated to compute exotic rotational correlations that arise as inertial frames and causal structure emerge on large scales from entangled Planck scale quantum systems. Noncommutative quantum dynamics are represented by random transverse displacements that respect causal symmetry. Entanglement is represented by covariance of these displacements in Planck scale intervals defined by future null cones of events on an observer's world line. Light that propagates in a nonradial direction inherits a projected component of the exotic rotational correlation that accumulates as a random walk in phase. A calculation of the projection and accumulation leads to exact predictions for statistical properties of exotic Planck scale correlations in an interferometer of any configuration. The cross-covariance for two nearly co-located interferometers is shown to depart only slightly from the autocovariance. Specific examples are computed for configurations that approximate realistic experiments, and show that the model can be rigorously tested.

  12. Geostatistics and GIS: tools for characterizing environmental contamination.

    PubMed

    Henshaw, Shannon L; Curriero, Frank C; Shields, Timothy M; Glass, Gregory E; Strickland, Paul T; Breysse, Patrick N

    2004-08-01

    Geostatistics is a set of statistical techniques used in the analysis of georeferenced data that can be applied to environmental contamination and remediation studies. In this study, the 1,1-dichloro-2,2-bis(p-chlorophenyl)ethylene (DDE) contamination at a Superfund site in western Maryland is evaluated. Concern about the site and its future clean up has triggered interest within the community because residential development surrounds the area. Spatial statistical methods, of which geostatistics is a subset, are becoming increasingly popular, in part due to the availability of geographic information system (GIS) software in a variety of application packages. In this article, the joint use of ArcGIS software and the R statistical computing environment is demonstrated as an approach for comprehensive geostatistical analyses. The spatial regression method, kriging, is used to provide predictions of DDE levels at unsampled locations both within the site and the surrounding areas where residential development is ongoing.

  13. A Model Assessment of Satellite Observed Trends in Polar Sea Ice Extents

    NASA Technical Reports Server (NTRS)

    Vinnikov, Konstantin Y.; Cavalieri, Donald J.; Parkinson, Claire L.

    2005-01-01

    For more than three decades now, satellite passive microwave observations have been used to monitor polar sea ice. Here we utilize sea ice extent trends determined from primarily satellite data for both the Northern and Southern Hemispheres for the period 1972(73)-2004 and compare them with results from simulations by eleven climate models. In the Northern Hemisphere, observations show a statistically significant decrease of sea ice extent and an acceleration of sea ice retreat during the past three decades. However, from the modeled natural variability of sea ice extents in control simulations, we conclude that the acceleration is not statistically significant and should not be extrapolated into the future. Observations and model simulations show that the time scale of climate variability in sea ice extent in the Southern Hemisphere is much larger than in the Northern Hemisphere and that the Southern Hemisphere sea ice extent trends are not statistically significant.

  14. Relative fundamental frequency during vocal onset and offset in older speakers with and without Parkinson's disease.

    PubMed

    Stepp, Cara E

    2013-03-01

    The relative fundamental frequency (RFF) surrounding production of a voiceless consonant has previously been shown to be lower in speakers with hypokinetic dysarthria and Parkinson's disease (PD) relative to age- and sex-matched controls. Here RFF was calculated in 32 speakers with PD without overt hypokinetic dysarthria and 32 age- and sex-matched controls to better understand the relationships between RFF and PD progression, medication status, and sex. Results showed that RFF was statistically significantly lower in individuals with PD compared with healthy age-matched controls and was statistically significantly lower in individuals diagnosed at least 5 yrs prior to experimentation relative to individuals recorded less than 5 yrs past diagnosis. Contrary to previous trends, no effect of medication was found. However, a statistically significant effect of sex on offset RFF was shown, with lower values in males relative to females. Future work examining the physiological bases of RFF is warranted.
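
    RFF is conventionally expressed in semitones relative to a steady-state reference f0; the core unit conversion is simple and can be sketched as below (a generic sketch, not the author's full cycle-selection procedure around voicing onset and offset).

```python
import math

def rff_semitones(cycle_f0s, ref_f0):
    """Express the f0 of voicing cycles near a voiceless consonant in
    semitones relative to a steady-state reference f0.

    cycle_f0s: instantaneous f0 (Hz) of the cycles of interest.
    ref_f0:    steady-state vowel f0 (Hz) used as the reference.
    """
    # 12 semitones per octave, i.e. per doubling of frequency
    return [12.0 * math.log2(f / ref_f0) for f in cycle_f0s]
```

    Negative values mean the cycle's f0 dipped below the steady-state reference, which is the direction of the group difference reported in the abstract.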

  15. An issue of literacy on pediatric arterial hypertension

    NASA Astrophysics Data System (ADS)

    Teodoro, M. Filomena; Romana, Andreia; Simão, Carla

    2017-11-01

    Arterial hypertension in pediatric age is a public health problem whose prevalence has increased significantly over time. Pediatric arterial hypertension (PAH) is a highly prevalent but under-diagnosed disease that appears without warning, with multiple consequences for children's health and for the adults they will become. Children's caregivers and close family must know of the existence of PAH, the negative consequences associated with it and its risk factors, and, finally, must practice prevention. In [12, 13] a statistical analysis can be found that uses a simpler questionnaire, introduced in [4], as part of a preliminary study of caregivers' awareness of PAH. A continuation of that analysis is detailed in [14]. An extension of the questionnaire was built, applied to a distinct population, and filled in online. That statistical approach is partially reproduced in the present work. Several statistical models were estimated using different approaches, namely multivariate analysis (factor analysis) and methods adequate for the kind of data under study.

  16. The Quantum and Fluid Mechanics of Global Warming

    NASA Astrophysics Data System (ADS)

    Marston, Brad

    2008-03-01

    Quantum physics and fluid mechanics are the foundation of any understanding of the Earth's climate. In this talk I invoke three well-known aspects of quantum mechanics to explore what will happen as the concentrations of greenhouse gases such as carbon dioxide continue to increase. Fluid dynamical models of the Earth's atmosphere, demonstrated here in live simulations, yield further insight into past, present, and future climates. Statistics of geophysical flows can, however, be ascertained directly without recourse to numerical simulation, using concepts borrowed from nonequilibrium statistical mechanics [J. B. Marston, E. Conover, and Tapio Schneider, "Statistics of an Unstable Barotropic Jet from a Cumulant Expansion," arXiv:0705.0011, J. Atmos. Sci. (in press)]. I discuss several other ways that theoretical physics may be able to contribute to a deeper understanding of climate change [J. Carlson, J. Harte, G. Falkovich, J. B. Marston, and R. Pierrehumbert, "Physics of Climate Change," 2008 Program of the Kavli Institute for Theoretical Physics].

  17. The Impact of United States Monetary Policy in the Crude Oil futures market

    NASA Astrophysics Data System (ADS)

    Padilla-Padilla, Fernando M.

    This research examines the empirical impact that United States monetary policy, through the federal funds rate, has on the volatility of crude oil prices in the futures market. Prior research has shown how macroeconomic events and variables have affected different financial markets over short- and long-term horizons. After testing and decomposing the variables, the two stationary time series were analyzed using a vector autoregressive (VAR) model. The empirical evidence shows, with statistical significance, a direct relationship when explaining crude oil prices as a function of fed funds rates at lag t-1 and an inverse relationship when explained as a function of fed funds rates at lag t-2. These results partially address lacunae in the literature on the implications of monetary policy for the crude oil futures market.

  18. Beyond refractory obsessions and anxiety states: toward remission.

    PubMed

    Hollander, Eric; Zohar, Joseph

    2004-01-01

    At the Sixth International Obsessive-Compulsive Disorder Conference (IOCDC), held November 13-15, 2003, in Lanzarote, Spain, 2 issues were discussed that are of great importance to future research on obsessive-compulsive disorder (OCD). The first of these is the possible inclusion of obsessive-compulsive spectrum disorders (OCSD) in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders. OCSD resemble OCD in their clinical symptoms, associated features, comorbidity, family/genetics, etiology, and neurocircuitry, as well as their selective response to treatment with serotonin reuptake inhibitors. The second issue is considering remission as the ultimate goal of treatment for OCD instead of just symptom reduction, as has been suggested in other disorders. These and other issues should be discussed at future meetings of the IOCDC and influence how we conceptualize the disorder and design future treatment trials.

  19. Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.

    2013-04-27

    This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, for two common statistically based environmental sampling approaches, the report includes the methods and formulas for calculating (1) the number of samples required to achieve a specified confidence in characterization and clearance decisions, and (2) the confidence in making characterization and clearance decisions for a specified number of samples. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, commonly referred to as the false negative rate (FNR). The two statistical sampling approaches discussed in this report are (1) hotspot sampling to detect small, isolated contaminated locations during the characterization phase, and (2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: (1) qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0; (2) qualitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0; (3) quantitative data (e.g., contaminant concentrations expressed as CFU/cm2) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0; and (4) quantitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0. For Situation 2, the hotspot sampling approach provides for stating with Z% confidence that a hotspot of specified shape and size with detectable contamination will be found. Also for Situation 2, the CJR approach provides for stating with X% confidence that at least Y% of the decision area does not contain detectable contamination. Forms of these statements for the other three situations are discussed in Section 2.2. Statistical methods that account for FNR > 0 currently exist only for the hotspot sampling approach with qualitative data (or quantitative data converted to qualitative data). This report documents the current status of methods and formulas for the hotspot and CJR sampling approaches, and identifies their limitations. Extensions of the methods applicable when FNR = 0 to account for FNR > 0, or to address other limitations, will be documented in future revisions of this report if future funding supports their development. For quantitative data, this report also presents statistical methods and formulas for (1) quantifying the uncertainty in measured sample results; (2) estimating the true surface concentration corresponding to a surface sample; and (3) quantifying the uncertainty of that estimate. All of the methods and formulas discussed in the report were applied to example situations to illustrate application of the methods and interpretation of the results.
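
The X%/Y% clearance statement above has a textbook binomial analogue (simple random sampling, FNR = 0): if all n samples are negative, the confidence that at least a fraction Y of the area is clean satisfies X = 1 - Y**n. This is a standard acceptance-sampling approximation for illustration, not the report's exact CJR formulas.

```python
import math

def samples_needed(confidence_x: float, clean_fraction_y: float) -> int:
    """Smallest n of all-negative random samples giving 1 - y**n >= x."""
    return math.ceil(math.log(1.0 - confidence_x) / math.log(clean_fraction_y))

# 95% confidence that at least 99% of the decision area is clean:
print(samples_needed(0.95, 0.99))   # -> 299
```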

  20. News from the ESO Science Archive Facility

    NASA Astrophysics Data System (ADS)

    Dobrzycki, A.; Arnaboldi, M.; Bierwirth, T.; Boelter, M.; Da Rocha, C.; Delmotte, N.; Forchì, V.; Fourniol, N.; klein Gebbinck, M.; Lange, U.; Mascetti, L.; Micol, A.; Moins, C.; Munte, C.; Pluciennik, C.; Retzlaff, J.; Romaniello, M.; Rosse, N.; Sequeiros, I. V.; Vuong, M.-H.; Zampieri, S.

    2015-09-01

    ESO Science Archive Facility (SAF) - one of the world's biggest astronomical archives - combines two roles: operational (ingest, tallying, safekeeping and distribution to observers of raw data taken with ESO telescopes and processed data generated both internally and externally) and scientific (publication and delivery of all flavours of data to external users). This paper presents the “State of the SAF.” SAF, as a living entity, is constantly implementing new services and upgrading the existing ones. We present recent and future developments related to the Archive's Request Handler and metadata handling as well as performance and usage statistics and trends. We also discuss the current and future datasets on offer at SAF.

  1. Development and Evaluation of a Systems Thinking Education Strategy for Baccalaureate Nursing Curriculum: A Pilot Study.

    PubMed

    Fura, Louise A; Wisser, Kathleen Z

    Nurse educators are charged to develop and evaluate curricula on systems thinking to prepare future nurses to provide safe nursing care. The goal of this pilot study was to design and evaluate a four-hour educational strategy that prepares future professional nurses to use systems thinking approaches in the delivery of safe patient care. This study exposed prelicensure baccalaureate nursing students to systems thinking principles, which included didactic and experiential activities. A descriptive design was used to determine the effect of an on-campus educational strategy. A paired samples t-test revealed statistical significance from pretest to posttest.

  2. The Enrollment Crisis: Factors, Actors, and Impacts. AAHE-ERIC/Higher Education Research Report No. 3, 1982.

    ERIC Educational Resources Information Center

    Baldridge, J. Victor; And Others

    The impact of demographic shifts and enrollment declines for higher education are examined, and possible institutional responses to these problems are studied. After a review of the national statistics and projections of future enrollment trends, attention is directed to the campus level and the dimensions of current enrollment problems. Based on…

  3. Social Networks in Education

    ERIC Educational Resources Information Center

    Klimova, Blanka; Poulova, Petra

    2015-01-01

    At present, social networks are becoming important in all areas of human activity. They are simply part and parcel of everyday life. They are mostly used for advertising, but they have already found their way into education. The future potential of social networks is high, as can be seen from their statistics on a daily, monthly or yearly…

  4. Image analysis library software development

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Bryant, J.

    1977-01-01

    The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.

  5. Assessing the Development of Educational Research Literacy: The Effect of Courses on Research Methods in Studies of Educational Science

    ERIC Educational Resources Information Center

    Groß Ophoff, Jana; Schladitz, Sandra; Leuders, Juliane; Leuders, Timo; Wirtz, Markus A.

    2015-01-01

    The ability to purposefully access, reflect, and use evidence from educational research (Educational Research Literacy) is expected of future professionals in educational practice. Based on the presented conceptual framework, a test instrument was developed to assess the different competency aspects: Information Literacy, Statistical Literacy, and…

  6. Energy: Education and Industry Changes for a New Era Utilization System Modifications.

    ERIC Educational Resources Information Center

    Dille, Earl K.; Dreifke, Gerald E.

    This paper provides data and opinions on long- and short-term challenges and changes required to meet the human resource and educational needs in a nuclear electric era as seen from a utility company's point of view. In particular, statements on engineering education curriculum, statistics on certain future manpower requirements, electric utility…

  7. Engine-Building Contest Inspires Future Automotive Technicians

    ERIC Educational Resources Information Center

    Kraft, Thomas; Morphew, Rick; Norris, Gerald

    2004-01-01

    According to the Bureau of Labor Statistics, demand for automotive technicians will increase 17 percent over the 10-year period from 1998 to 2008. In numbers, this translates to the need for an additional 132,000 technicians over the 790,000 that existed in 1998. The nation is currently losing 10 percent of its automotive…

  8. Can Students Learn Economics and Personal Finance in a Specialized Elementary School?

    ERIC Educational Resources Information Center

    Posnanski, Tracy J.; Schug, Mark C.; Schmitt, Thomas

    2007-01-01

    Statistics from a number of surveys indicate there is a high rate of economic and financial illiteracy in the United States. Several other studies have pointed out that problems related to the widespread lack of economic and financial understanding have serious consequences on the future economic well-being of many citizens. Financial and economic…

  9. Ecstasy: It's the Rave

    ERIC Educational Resources Information Center

    Dennis, Dixie; Ballard, Michael

    2002-01-01

    National statistics reveal an alarming trend concerning the use of 3,4-methylenedioxymethamphetamine, which is better known as ecstasy. Results from the Monitoring the Future survey of 50,000 secondary youth reveal that use among 8th graders rose to 3.1%, 5.4% among 10th graders, and 8.2% among 12th graders. High school faculty and staff must be…

  10. Developing Teachers as Leaders

    ERIC Educational Resources Information Center

    Wetzler, Jeff

    2010-01-01

    According to the national statistics compiled by researchers and the federal government, there is a grim academic future for the students in low-income urban and rural communities across the U.S. They have no more than a 50% chance of graduating from high school, and those who do graduate will perform at the level of eighth graders in high-income…

  11. Meeting Contemporary Statistical Needs of Instructional Communication Research: Modeling Teaching and Learning as a Conditional Process. Forum: The Future of Instructional Communication

    ERIC Educational Resources Information Center

    Goodboy, Alan K.

    2017-01-01

    For decades, instructional communication scholars have relied predominantly on cross-sectional survey methods to generate empirical associations between effective teaching and student learning. These studies typically correlate students' perceptions of their instructor's teaching behaviors with subjective self-report assessments of their own…

  12. Computing the Average Square: An Agent-Based Introduction to Aspects of Current Psychometric Practice

    ERIC Educational Resources Information Center

    Stroup, Walter M.; Hills, Thomas; Carmona, Guadalupe

    2011-01-01

    This paper summarizes an approach to helping future educators to engage with key issues related to the application of measurement-related statistics to learning and teaching, especially in the contexts of science, mathematics, technology and engineering (STEM) education. The approach we outline has two major elements. First, students are asked to…

  13. The Future of Foreign Language Teaching on the North American Continent.

    ERIC Educational Resources Information Center

    Bouton, Charles P.

    Following a brief review of the history of interest in foreign languages in America, facts to be considered when interpreting falling enrollment statistics, such as a drop in the birth rate, are discussed. It is stressed that foreign language teaching cannot be neglected in a world having improved and extensive communication between people…

  14. School Experience as a Potential Determinant of Post-Compulsory Participation

    ERIC Educational Resources Information Center

    Gorard, Stephen

    2010-01-01

    This paper considers the views of young people aged 14-16 about their future education, training and occupation. It is based on a study of around 3000 year 11 pupils in 45 educational settings in England during 2007/2008, supplemented by documentary analysis, official statistics, and interviews and surveys with staff and parents. Pupil-reported…

  15. Metrics, The Measure of Your Future: Evaluation Report, 1977.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh. Div. of Development.

    The primary goal of the Metric Education Project was the systematic development of a replicable educational model to facilitate the system-wide conversion to the metric system during the next five to ten years. This document is an evaluation of that project. Three sets of statistical evidence exist to support the fact that the project has been…

  16. OUTLOOK BY DENVER AREA OCCUPATIONS. OCCUPATIONS IN COLORADO, PART II.

    ERIC Educational Resources Information Center

    Colorado State Univ., Ft. Collins.

    EMPLOYMENT STATISTICS FOR 1960, ESTIMATED EMPLOYMENT FOR 1965 AND 1970, ESTIMATES OF ADDITIONAL WORKERS NEEDED BY 1970, AND SALARY INFORMATION ARE PROVIDED FOR A WIDE RANGE OF OCCUPATIONS IN THE DENVER AREA. DATA WERE OBTAINED FROM A DENVER STUDY, "JOBS AND THE FUTURE," BY ROBERT VAUGHAN OF THE MOUNTAIN STATES TELEPHONE CO., 1962, AND…

  17. Portrait of the Future: 1994 Kansas Kids Count Data Book.

    ERIC Educational Resources Information Center

    Hardman, Sydney, Ed.; And Others

    This Kids Count data book presents a statistical portrait of the well-being of and conditions faced by the children of Kansas, based on key indicators. Nineteen indicators are detailed in five subject areas: (1) economic well-being; (2) physical health and safety; (3) educational achievement; (4) emotional well-being; and (5) social behavior and…

  18. Predictors of NCLEX-PN Success for Practical Nursing Students

    ERIC Educational Resources Information Center

    Eickhoff, Mary Ann

    2016-01-01

    There is currently a nursing shortage in the United States. By 2022, the Bureau of Labor Statistics (BLS) expects, the number of job openings for Practical Nurses (PN) will be 168,500, an increase of 25% over 2012 (BLS, 2014). Nursing education does not currently meet present, much less future needs. Nursing programs have limited space; according…

  19. Closing the Gender Gap: Girls and Computers.

    ERIC Educational Resources Information Center

    Fuchs, Lucy

    While 15 years ago only a few schools had microcomputers, today a majority of public schools have some computers, although an adequate number of computers for students to use is still in the future. Unfortunately, statistics show that, in many states, a higher percentage of male students are enrolled in computer classes than female; boys seem to…

  20. Cyberspace Math Models

    DTIC Science & Technology

    2013-06-01

    or indicators are used as long-range memory measurements. Hurst and Holder exponents are the most important and popular parameters. Traditionally...the relation between two important parameters, the Hurst exponent (measurement of global long-range memory) and the Entropy (measurement of...empirical results and future study. II. BACKGROUND We recall briefly the mathematical and statistical definitions and properties of the Hurst exponents
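
A minimal sketch of estimating the Hurst exponent, the long-range memory parameter named above, via classical rescaled-range (R/S) analysis. The window sizes are arbitrary choices; for uncorrelated noise the estimate should come out near 0.5 (small-sample bias tends to push it slightly higher).

```python
import numpy as np

def hurst_rs(x: np.ndarray, window_sizes=(16, 32, 64, 128, 256)) -> float:
    """Estimate the Hurst exponent as the log-log slope of R/S vs window size."""
    rs_means = []
    for w in window_sizes:
        rs = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())
            r = dev.max() - dev.min()      # range of cumulative deviations
            s = seg.std()
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(0)
h = hurst_rs(rng.normal(size=4096))        # white noise: expect H near 0.5
print(round(h, 2))
```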

  1. Monitoring the Future: Questionnaire Responses from the Nation's High School Seniors, 1980.

    ERIC Educational Resources Information Center

    Bachman, Jerald G.; And Others

    This report presents descriptive statistical results from a 1980 national survey of high school seniors concerning their values, behaviors, and lifestyle. It is the sixth in a series. Questionnaires were filled out by 16,524 seniors in 107 public and 20 private high schools. Student response rate was 82%. Content areas measured include the…

  2. Women Entrepreneurship Across Racial Lines: Current Status, Critical Issues, and Future Implications

    ERIC Educational Resources Information Center

    Smith-Hunter, Andrea

    2004-01-01

    This article begins with a look at women employment over the years and the historical place of women entrepreneurship in today's economy. It continues by analyzing data statistically on women entrepreneurs in the United States across racial lines, with a particular focus on Hispanic women entrepreneurs. The article ends by examining the critical…

  3. A Five Year Study of Selected Demographics of Middlesex Community College Graduates: 1985-1989.

    ERIC Educational Resources Information Center

    Coggins, John H.; Muzeroll, Terry

    This analysis of selected demographic statistics of Middlesex Community College (MxCC) graduates is intended for future academic advising, curriculum planning, and decision making. This demographic profile is comprised of data from studies published between 1985 and 1989. The study focuses on fundamental demographic indicators, such as sex, age,…

  4. For Tests That Are Predictively Powerful and without Social Prejudice

    ERIC Educational Resources Information Center

    Soares, Joseph A.

    2012-01-01

    In Philip Pullman's dark matter sci-fi trilogy, there is a golden compass that in the hands of the right person is predictively powerful; the same was supposed to be true of the SAT/ACT--the statistically indistinguishable standardized tests for college admissions. They were intended to be reliable mechanisms for identifying future trajectories,…

  5. Training in the Food and Beverages Sector in the United Kingdom. Report for the FORCE Programme. First Edition.

    ERIC Educational Resources Information Center

    Burns, Jim A.; King, Richard

    An international team of researchers studied the following aspects of training in the United Kingdom's food and beverage sector: structure and characteristics, business and social context, training and recruitment, and future training requirements. Data were collected from an analysis of social and labor/employment statistics, literature review,…

  6. Repositioning Trends of Latina/o/x Student Enrollments in Community Colleges

    ERIC Educational Resources Information Center

    Zerquera, Desiree D.; Acevedo-Gil, Nancy; Flores, Elizabeth; Marantal, Patrick

    2018-01-01

    This study used descriptive statistics to complicate the national narrative of Latina/o/x student college-going trends and aims to provide directions for future research on Latina/o/x students in the community college. Taking a state-by-state perspective, this study examined whether Latina/o/x college students enrolled in community colleges at…

  7. The 2008-18 Job Outlook in Brief

    ERIC Educational Resources Information Center

    Occupational Outlook Quarterly, 2010

    2010-01-01

    Some occupations will fare better than others over the 2008-18 decade. Although it's impossible to predict the future, one can gain insight into job outlook by analyzing trends in population growth, technological advances, and business practices. This insight is helpful in planning a career. Every 2 years, the U.S. Bureau of Labor Statistics (BLS)…

  8. Gender Differences in Early Mother-Child Interactions: Talking about an Imminent Event.

    ERIC Educational Resources Information Center

    Eisenmann, Barbara

    1997-01-01

    Examines maternal modes of organizing an imminent emotional event, a brief separation from the child. Finds that the mothers displayed two ways of structuring the future event, and these different modes were related statistically to the gender of the child. Investigates how the mother directs the child's mental processes by using augments of…

  9. Future Climate Change in the Baltic Sea Area

    NASA Astrophysics Data System (ADS)

    Bøssing Christensen, Ole; Kjellström, Erik; Zorita, Eduardo; Sonnenborg, Torben; Meier, Markus; Grinsted, Aslak

    2015-04-01

    Regional climate models have been used extensively since the first assessment of climate change in the Baltic Sea region was published in 2008, not least for studies of Europe that include the Baltic Sea catchment area. Conclusions regarding climate model results therefore have a better foundation than was the case for the first BACC report of 2008. This presentation reports model results regarding future climate. What is the state of understanding about future human-driven climate change? We cover regional models, statistical downscaling, hydrological modelling, ocean modelling and sea-level change as projected for the Baltic Sea region. Collections of regional model simulations, for example from the ENSEMBLES project, financed through the European Framework Programmes, and from the World Climate Research Programme Coordinated Regional Climate Downscaling Experiment, have made it possible to obtain an increasingly robust estimation of model uncertainty. While the first Baltic Sea assessment mainly used four simulations from the European 5th Framework Programme PRUDENCE project, an ensemble of 13 transient regional simulations with twice the horizontal resolution, reaching the end of the 21st century, has been available from the ENSEMBLES project; therefore it has been possible to obtain more quantitative assessments of model uncertainty. The literature about future climate change in the Baltic Sea region is largely built upon the ENSEMBLES project. Also within statistical downscaling, a considerable number of papers have been published, now encompassing the application of non-linear statistical models, projected changes in extremes, and correction of climate model biases. The uncertainty of hydrological change has received increasing attention since the previous Baltic Sea assessment. Several studies on the propagation of uncertainties originating in GCMs, RCMs, and emission scenarios are presented.
The number of studies on uncertainties related to downscaling and impact models is relatively small, but more are emerging. A large number of coupled climate-environmental scenario simulations for the Baltic Sea have been performed within the BONUS+ projects (ECOSUPPORT, INFLOW, AMBER and Baltic-C (2009-2011)), using various combinations of output from GCMs, RCMs, hydrological models and scenarios for load and emission of nutrients as forcing for Baltic Sea models. Such a large ensemble of scenario simulations for the Baltic Sea has never before been produced and enables for the first time an estimation of uncertainties.
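
The ensemble-based uncertainty estimates described above reduce, at their simplest, to statistics across members. A toy sketch with 13 synthetic projections (the member count matches the ENSEMBLES set mentioned, but all values are invented):

```python
import numpy as np

rng = np.random.default_rng(7)
# 13 hypothetical RCM projections of end-of-century warming (degC);
# numbers are illustrative, not from any real simulation.
members = rng.normal(loc=3.0, scale=0.5, size=13)

best_estimate = members.mean()
spread = members.std(ddof=1)   # inter-model spread as a simple uncertainty proxy
print(f"{best_estimate:.1f} +/- {spread:.1f} degC")
```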

  10. Predicting lettuce canopy photosynthesis with statistical and neural network models

    NASA Technical Reports Server (NTRS)

    Frick, J.; Precetti, C.; Mitchell, C. A.

    1998-01-01

    An artificial neural network (NN) and a statistical regression model were developed to predict canopy photosynthetic rates (Pn) for 'Waldman's Green' leaf lettuce (Lactuca sativa L.). All data used to develop and test the models were collected for crop stands grown hydroponically under controlled-environment conditions. In the NN and regression models, canopy Pn was predicted as a function of three independent variables: shoot-zone CO2 concentration (600 to 1500 micromoles mol-1), photosynthetic photon flux (PPF) (600 to 1100 micromoles m-2 s-1), and canopy age (10 to 20 days after planting). The models were used to determine the combinations of CO2 and PPF setpoints required each day to maintain maximum canopy Pn. The statistical model (a third-order polynomial) predicted Pn more accurately than the simple NN (a three-layer, fully connected net). Over an 11-day validation period, the average percent difference between predicted and actual Pn was 12.3% and 24.6% for the statistical and NN models, respectively. Both models lost considerable accuracy when used to make relatively long-range Pn predictions (6 or more days into the future).
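
The statistical side of the comparison above, a third-order polynomial response surface in the three inputs, can be sketched with scikit-learn. The data below are synthetic stand-ins generated over the stated input ranges, not the study's measurements, and the true response used here is an invented linear one.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(3)
n = 200
co2 = rng.uniform(600, 1500, n)    # micromol mol^-1
ppf = rng.uniform(600, 1100, n)    # micromol m^-2 s^-1
age = rng.uniform(10, 20, n)       # days after planting
# Invented response surface for Pn with a little noise:
pn = 0.01 * ppf + 0.002 * co2 - 0.1 * age + rng.normal(scale=0.2, size=n)

X = np.column_stack([co2, ppf, age])
model = make_pipeline(PolynomialFeatures(degree=3), LinearRegression())
model.fit(X, pn)
r2 = model.score(X, pn)            # in-sample fit quality
print(round(r2, 2))
```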

  11. Statistical moments of the Strehl ratio

    NASA Astrophysics Data System (ADS)

    Yaitskova, Natalia; Esselborn, Michael; Gladysz, Szymon

    2012-07-01

    Knowledge of the statistical characteristics of the Strehl ratio is essential for the performance assessment of existing and future adaptive optics systems. For a full assessment, not only the mean value of the Strehl ratio but also higher statistical moments are important. Variance is related to the stability of an image, and skewness reflects whether a set of short-exposure images contains more or fewer images whose quality exceeds the mean. Skewness is a central parameter in the domain of lucky imaging. We present a rigorous theory for the calculation of the mean value, the variance, and the skewness of the Strehl ratio. In our approach we represent the residual wavefront as being formed by independent cells. The level of the adaptive optics correction defines the number of cells and the variance of the cells, which are the two main parameters of our theory. The deliverables are the values of the three moments as functions of the correction level. We make no further assumptions except for the statistical independence of the cells.
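
The cell picture described above lends itself to a quick Monte Carlo check: model the residual wavefront as N independent cells with Gaussian phase and take the Strehl ratio as the squared modulus of the average phasor. The cell count and phase variance below are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)
n_cells, sigma, n_frames = 50, 0.8, 20_000   # sigma in radians; values invented

# Each short-exposure frame: N independent cell phases.
phases = rng.normal(scale=sigma, size=(n_frames, n_cells))
strehl = np.abs(np.exp(1j * phases).mean(axis=1)) ** 2

# Mean should sit near the extended Marechal value exp(-sigma^2),
# plus a 1/N correction; variance and skewness follow from the samples.
print(strehl.mean(), strehl.var(), skew(strehl))
```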

  12. Synthesis of instrumentally and historically recorded earthquakes and studying their spatial statistical relationship (A case study: Dasht-e-Biaz, Eastern Iran)

    NASA Astrophysics Data System (ADS)

    Jalali, Mohammad; Ramazi, Hamidreza

    2018-06-01

    Earthquake catalogues are the main source of statistical seismology for long-term studies of earthquake occurrence. Therefore, studying spatiotemporal problems is important for reducing the related uncertainties in statistical seismology. A statistical tool, the time normalization method, was used to revise the time-frequency relationship in one of the most active regions of Asia, Eastern Iran and western Afghanistan (a and b were calculated as approximately 8.84 and 1.99 on the exponential scale, not the logarithmic scale). A geostatistical simulation method was further utilized to reduce uncertainties in the spatial domain, producing a representative synthetic catalogue with 5361 events. The synthetic database was classified in a Geographical Information System (GIS) based on simulated magnitudes to reveal the underlying seismicity patterns. Although some regions of high seismicity correspond to known faults, as far as seismic patterns are concerned the new method significantly highlights possible locations of interest that have not been previously identified. It also reveals some previously unrecognized lineations and clusters in likely future strain release.
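
Taking the abstract's revised time-frequency relationship at face value, N(M) = exp(a - b*M) with a ≈ 8.84 and b ≈ 1.99 on the natural-exponential scale (per the authors' note that these are not logarithmic values), the implied relative event counts by magnitude can be sketched directly. The magnitude grid is an arbitrary illustration.

```python
import math

a, b = 8.84, 1.99   # exponential-scale parameters quoted in the abstract
for magnitude in (3.0, 4.0, 5.0, 6.0):
    n = math.exp(a - b * magnitude)
    print(f"M = {magnitude}: {n:.3f} relative expected events")
```

As expected for a Gutenberg-Richter-type law, each unit increase in magnitude reduces the expected count by a factor of exp(b) ≈ 7.3.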

  13. The suitability of using death certificates as a data source for cancer mortality assessment in Turkey

    PubMed Central

    Ulus, Tumer; Yurtseven, Eray; Cavdar, Sabanur; Erginoz, Ethem; Erdogan, M. Sarper

    2012-01-01

    Aim To compare the quality of the 2008 cancer mortality data of the Istanbul Directorate of Cemeteries (IDC) with the 2008 data of the International Agency for Research on Cancer (IARC) and the Turkish Statistical Institute (TUIK), and discuss the suitability of using this databank for estimating cancer mortality in the future. Methods We used 2008 and 2010 death records of the IDC and compared them to TUIK and IARC data. Results According to the WHO statistics, in Turkey in 2008 there were 67 255 estimated cancer deaths. As the population of Turkey was 71 517 100, the cancer mortality rate was 9.4 per 10 000. According to the IDC statistics, the cancer mortality rate in Istanbul in 2008 was 5.97 per 10 000. Conclusion WHO estimates were higher than IDC estimates probably because WHO bases its estimates on a sample group and because of the restrictions of the IDC data collection method. Death certificates could be a reliable and accurate data source for mortality statistics if the problems of data collection are solved. PMID:23100210
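
The WHO rate quoted above is a direct per-10,000 computation, easy to verify from the figures given:

```python
deaths = 67_255            # estimated cancer deaths, Turkey, 2008
population = 71_517_100    # population of Turkey, 2008

rate_per_10k = deaths / population * 10_000
print(round(rate_per_10k, 1))   # -> 9.4
```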

  14. Development and evaluation of statistical shape modeling for principal inner organs on torso CT images.

    PubMed

    Zhou, Xiangrong; Xu, Rui; Hara, Takeshi; Hirano, Yasushi; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Kido, Shoji; Fujita, Hiroshi

    2014-07-01

    The shapes of the inner organs are important information for medical image analysis. Statistical shape modeling provides a way of quantifying and measuring shape variations of the inner organs across patients. In this study, we developed a universal scheme that can be used to build statistical shape models for different inner organs efficiently. This scheme combines traditional point distribution modeling with a group-wise optimization method based on a measure called minimum description length to provide a practical means for 3D organ shape modeling. In experiments, the proposed scheme was applied to build five statistical shape models, for the heart, liver, spleen, and right and left kidneys, using 50 cases of 3D torso CT images. The performance of these models was evaluated by three measures: model compactness, model generalization, and model specificity. The experimental results showed that the constructed shape models have good "compactness" and satisfactory "generalization" performance for different organ shape representations; however, the "specificity" of these models should be improved in future work.
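
    The point distribution modeling mentioned above reduces, at its core, to a PCA of concatenated landmark coordinates: shapes are represented as mean + P @ b, and "compactness" measures how much variance the leading modes capture. A minimal sketch with synthetic 2-D "shapes" standing in for segmented organ surfaces (the landmark data and latent modes here are invented, not the paper's CT-derived surfaces):

```python
import numpy as np

# Minimal point-distribution-model sketch: each training shape is a
# vector of concatenated landmark coordinates; PCA of the covariance
# gives the modes of variation.
rng = np.random.default_rng(1)
n_shapes, n_landmarks = 50, 30
mean_shape = rng.normal(size=2 * n_landmarks)
# Training shapes = mean plus two latent modes of variation
modes = rng.normal(size=(2, 2 * n_landmarks))
coeffs = rng.normal(size=(n_shapes, 2))
shapes = mean_shape + coeffs @ modes

mu = shapes.mean(axis=0)
cov = np.cov(shapes - mu, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)          # ascending order
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]  # make descending

# "Compactness": variance fraction captured by the first two modes
explained = eigval[:2].sum() / eigval.sum()
print(explained > 0.99)
```

Real pipelines add the group-wise correspondence optimization (the MDL step) before this PCA; that step is what the paper's scheme contributes.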

  15. Water resources management: Hydrologic characterization through hydrograph simulation may bias streamflow statistics

    NASA Astrophysics Data System (ADS)

    Farmer, W. H.; Kiang, J. E.

    2017-12-01

    The development, deployment and maintenance of water resources management infrastructure and practices rely on hydrologic characterization, which requires an understanding of local hydrology. With regard to streamflow, this understanding is typically quantified with statistics derived from long-term streamgage records. However, a fundamental problem is how to characterize local hydrology without the luxury of streamgage records, a problem that complicates water resources management at ungaged locations and for long-term future projections. This problem has typically been addressed through the development of point estimators, such as regression equations, to estimate particular statistics. Physically-based precipitation-runoff models, which are capable of producing simulated hydrographs, offer an alternative to point estimators. The advantage of simulated hydrographs is that they can be used to compute any number of streamflow statistics from a single source (the simulated hydrograph) rather than relying on a diverse set of point estimators. However, the use of simulated hydrographs introduces a degree of model uncertainty that is propagated through to the estimated streamflow statistics and may have drastic effects on management decisions. We compare the accuracy and precision of streamflow statistics (e.g. the mean annual streamflow, the annual maximum streamflow exceeded in 10% of years, and the minimum seven-day average streamflow exceeded in 90% of years, among others) derived from point estimators (e.g. regressions, kriging, machine learning) to those of statistics derived from simulated hydrographs across the continental United States. Initial results suggest that the error introduced through hydrograph simulation may substantially bias the resulting hydrologic characterization.
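
    The appeal of the hydrograph route is that all of the named statistics fall out of one simulated series. A sketch with a synthetic daily hydrograph (lognormal flows, purely illustrative):

```python
import numpy as np

# Deriving several streamflow statistics from one simulated daily
# hydrograph, the single-source approach contrasted with per-statistic
# point estimators. Flows are synthetic, in m^3/s.
rng = np.random.default_rng(2)
years, days = 30, 365
flows = rng.lognormal(mean=3.0, sigma=0.8, size=(years, days))

mean_annual = flows.mean(axis=1).mean()

# Annual maximum exceeded in 10% of years: 90th percentile of annual maxima
q10_annual_max = np.percentile(flows.max(axis=1), 90)

# Minimum 7-day average exceeded in 90% of years (a 7Q-style low-flow stat)
kernel = np.ones(7) / 7
seven_day_min = np.array(
    [np.convolve(y, kernel, mode="valid").min() for y in flows]
)
q90_7day_min = np.percentile(seven_day_min, 10)

print(q90_7day_min < mean_annual < q10_annual_max)
```

The paper's point is precisely that bias in `flows` (the simulated hydrograph) propagates into every statistic computed this way.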

  16. Statistics for the Relative Detectability of Chemicals in Weak Gaseous Plumes in LWIR Hyperspectral Imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Metoyer, Candace N.; Walsh, Stephen J.; Tardiff, Mark F.

    2008-10-30

    The detection and identification of weak gaseous plumes using thermal imaging data is complicated by many factors. These include variability due to atmosphere, ground and plume temperature, and background clutter. This paper presents an analysis of one formulation of the physics-based model that describes the at-sensor observed radiance. The motivating question for the analyses performed in this paper is as follows. Given a set of backgrounds, is there a way to predict the background over which the probability of detecting a given chemical will be the highest? Two statistics were developed to address this question. These statistics incorporate data from the long-wave infrared band to predict the background over which chemical detectability will be the highest. These statistics can be computed prior to data collection. As a preliminary exploration into the predictive ability of these statistics, analyses were performed on synthetic hyperspectral images. Each image contained one chemical (either carbon tetrachloride or ammonia) spread across six distinct background types. The statistics were used to generate predictions for the background ranks. Then, the predicted ranks were compared to the empirical ranks obtained from the analyses of the synthetic images. For the simplified images under consideration, the predicted and empirical ranks showed a promising amount of agreement. One statistic accurately predicted the best and worst background for detection in all of the images. Future work may include explorations of more complicated plume ingredients, background types, and noise structures.

  17. Potential impacts of agricultural drought on crop yield variability under a changing climate in Texas

    NASA Astrophysics Data System (ADS)

    Lee, K.; Leng, G.; Huang, M.; Sheffield, J.; Zhao, G.; Gao, H.

    2017-12-01

    Texas has the largest farm area in the U.S., and its revenue from crop production ranks third overall. With the changing climate, hydrological extremes such as droughts are becoming more frequent and intense, causing significant yield reductions in rainfed agricultural systems. The objective of this study is to investigate the potential impacts of agricultural drought on crop yields (corn, sorghum, and wheat) under a changing climate in Texas. The Variable Infiltration Capacity (VIC) model, calibrated and validated over 10 major Texas river basins for the historical period, is employed in this study. The model is forced by a set of statistically downscaled climate projections from Coupled Model Intercomparison Project Phase 5 (CMIP5) model ensembles at a spatial resolution of 1/8°. The CMIP5 projections contain four Representative Concentration Pathways (RCPs) representing different greenhouse gas concentration trajectories (the RCP4.5 and RCP8.5 pathways, with radiative forcing of 4.5 and 8.5 W/m², are selected in this study). To carry out the analysis, VIC simulations from 1950 to 2099 are first analyzed to investigate how the frequency and severity of agricultural droughts will be altered in Texas under a changing climate. Second, future crop yields are projected using a statistical crop model. Third, the effects of agricultural drought on crop yields are quantitatively analyzed. The results are expected to contribute to future water resources planning, with the goal of mitigating the negative impacts of future droughts on agricultural production in Texas.
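
    One common way to operationalize the drought-frequency step is to flag agricultural drought whenever simulated soil moisture drops below a percentile threshold of the historical baseline, then compare historical and future exceedance frequencies. A sketch under that assumption (the threshold choice and the synthetic soil-moisture series are illustrative, not the paper's VIC output):

```python
import numpy as np

# Flag agricultural drought when soil moisture falls below the 20th
# percentile of the historical baseline; compare the drought frequency
# between a historical and a (drier) future series. Values in mm.
rng = np.random.default_rng(3)
hist = rng.normal(loc=300.0, scale=30.0, size=600)   # historical months
fut = rng.normal(loc=285.0, scale=30.0, size=600)    # drier future scenario

threshold = np.percentile(hist, 20)                  # baseline cutoff
freq_hist = (hist < threshold).mean()
freq_fut = (fut < threshold).mean()
print(freq_hist, freq_fut, freq_fut > freq_hist)
```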

  18. Indicator-based approach to assess sustainability of current and projected water use in Korea

    NASA Astrophysics Data System (ADS)

    Kong, I.; Kim, I., Sr.

    2016-12-01

    Recent failures in water supply systems caused by a lack of rainfall in Korea have raised serious concerns about limited water resources, exacerbated by anthropogenic drivers as well as climatic changes. Since Korea is undergoing unprecedented changes in both social and environmental conditions, social, environmental and climatic factors must be integrated in order to consider the underlying problems and their upcoming impacts on sustainable water use. In this study, we first proposed a framework to assess the multilateral aspects of sustainable water use in support of performance-based monitoring. The framework consists of four thematic indices (climate, infrastructure, pollution, and management capacity) and subordinate indicators. Second, to project future circumstances, climate variability, demographic, and land cover scenarios to 2050 were applied after a statistical analysis identifying correlations between indicators within the framework, since water crises are caused by numerous interrelated factors. The assessment was conducted across 161 administrative districts in Korea for the years 2010, 2030, and 2050. Third, current and future water use status was mapped using a GIS-based methodology, and statistical clustering (K-means and HCA) was used to produce spatially explicit maps and to group administrative regions expected to show similar behaviour in the future. Based on the patterns revealed by the spatial analysis and clustering, we suggested policy implementations to help local communities decide which countermeasures should be supplemented or adopted to increase resilience to upcoming changes in the water use environment.
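
    The K-means step above can be sketched in a few lines: regions become rows of an indicator matrix, and Lloyd's algorithm groups rows with similar profiles. The 161-region x 4-index matrix below is entirely synthetic (the cluster structure and index values are invented for illustration):

```python
import numpy as np

# K-means (Lloyd's algorithm) on a synthetic regions-by-indicators
# matrix: 161 "administrative regions" x 4 thematic indices.
rng = np.random.default_rng(4)
centres_true = np.array([[0.2, 0.8, 0.3, 0.7],
                         [0.7, 0.2, 0.8, 0.3],
                         [0.5, 0.5, 0.5, 0.5]])
sizes = [54, 54, 53]
X = np.vstack([c + 0.05 * rng.normal(size=(n, 4))
               for c, n in zip(centres_true, sizes)])

k = 3
centres = X[rng.choice(len(X), size=k, replace=False)].copy()
for _ in range(50):                                   # Lloyd iterations
    dists = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    for j in range(k):                                # update non-empty clusters
        if np.any(labels == j):
            centres[j] = X[labels == j].mean(axis=0)

print(np.bincount(labels, minlength=k))               # cluster sizes
```

In practice one would standardize the indicators first and choose k via a criterion such as silhouette score rather than fixing it a priori.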

  19. Atmospheric Rivers in the Mid-latitudes: A Modeling Study for Current and Future Climates

    NASA Astrophysics Data System (ADS)

    Shields, C. A.; Kiehl, J. T.

    2015-12-01

    Atmospheric rivers (ARs) are dynamically driven, narrow, intense bands of moisture that transport significant amounts of moisture from the tropics to the mid-latitudes and are thus an important aspect of the Earth's hydrological cycle. They are often associated with extratropical cyclones whose low-level circulation is able to tap into tropical moisture and transport it northward. The "Pineapple Express" is an example of an AR that impacts the west coast of California, predominately in the winter months, and can produce heavy precipitation in a short period of time (hours up to several days). This work focuses on three mid-latitude AR regions, the west coast of California, the Pacific Northwest, and the United Kingdom, as modeled by a suite of high-resolution CESM (Community Earth System Model) simulations for 20th century and RCP8.5 future climate scenarios. The CESM version employed utilizes half-degree resolution atmosphere/land components (~0.5°) coupled to the standard (1°) ocean/ice components. We use the high-resolution atmosphere because it more accurately represents extreme, regional precipitation; spatial and temporal statistics show that CESM realistically captures ARs. Projections of future climate statistics for all three regions, as well as analysis of the dynamical and thermodynamical mechanisms driving ARs, such as vorticity, jets and the steering flow, and water vapor transport, will be presented. Finally, teleconnections to climate variability processes such as ENSO will be explored.

  20. A new market risk model for cogeneration project financing---combined heat and power development without a power purchase agreement

    NASA Astrophysics Data System (ADS)

    Lockwood, Timothy A.

    Following federal legislative changes in 2006, cogeneration project financings are no longer entitled by law to the benefit of a power purchase agreement underwritten by an investment-grade investor-owned utility. Consequently, this research explored the need for a new market-risk model for future cogeneration and combined heat and power (CHP) project financing. CHP project investment represents a potentially enormous energy efficiency benefit, reducing fossil fuel use by up to 55% compared with traditional energy generation while concurrently cutting constituent air emissions, including global warming gases, by up to 50%. As a supplemental approach to a comprehensive technical analysis, quantitative multivariate modeling was also used to test the statistical validity and reliability of host facility energy demand and CHP supply ratios in predicting the economic performance of CHP project financing. The resulting analytical models, although not statistically reliable at this time, suggest a radically simplified CHP design method for future profitable CHP investments using four easily attainable energy ratios. This design method shows that financially successful CHP adoption occurs when the average system heat-to-power-ratio supply is less than or equal to the average host-convertible-energy-ratio, and when the average nominally-rated capacity is less than average host facility-load-factor demands. New CHP investments can play a role in solving the world-wide problem of accommodating growing energy demand while preserving our precious and irreplaceable air quality for future generations.

  1. Statistics teaching in medical school: opinions of practising doctors.

    PubMed

    Miles, Susan; Price, Gill M; Swift, Louise; Shepstone, Lee; Leinster, Sam J

    2010-11-04

    The General Medical Council expects UK medical graduates to gain some statistical knowledge during their undergraduate education, but provides no specific guidance as to amount, content or teaching method. Published work on statistics teaching for medical undergraduates has been dominated by medical statisticians, with little input from the doctors who will actually be using this knowledge and these skills after graduation. Furthermore, doctors' statistical training needs may have changed due to advances in information technology and the increasing importance of evidence-based medicine. Thus there exists a need to investigate the views of practising medical doctors as to the statistical training required for undergraduate medical students, based on their own use of these skills in daily practice. A questionnaire was designed to investigate doctors' views about undergraduate training in statistics and the need for these skills in daily practice, with a view to informing future teaching. The questionnaire was emailed to all clinicians with a link to the University of East Anglia Medical School. Open-ended questions were included to elicit doctors' opinions about both their own undergraduate training in statistics and recommendations for the training of current medical students. Content analysis was performed by two of the authors to systematically categorize and describe all the responses provided by participants. 130 doctors responded, including both hospital consultants and general practitioners. The findings indicated that most had not recognised the value of their undergraduate teaching in statistics and probability at the time, but had subsequently found the skills relevant to their career. Suggestions for improving undergraduate teaching in these areas included referring to actual research and ensuring relevance to, and integration with, clinical practice. Grounding the teaching of statistics in the context of real research studies and including examples of typical clinical work may better prepare medical students for their subsequent career.

  2. Hitting Is Contagious in Baseball: Evidence from Long Hitting Streaks

    PubMed Central

    Bock, Joel R.; Maewal, Akhilesh; Gough, David A.

    2012-01-01

    Data analysis is used to test the hypothesis that “hitting is contagious”. A statistical model is described to study the effect of a hot hitter upon his teammates’ batting during a consecutive game hitting streak. Box score data for entire seasons comprising streaks of length games, including a total observations were compiled. Treatment and control sample groups () were constructed from core lineups of players on the streaking batter’s team. The percentile method bootstrap was used to calculate confidence intervals for statistics representing differences in the mean distributions of two batting statistics between groups. Batters in the treatment group (hot streak active) showed statistically significant improvements in hitting performance, as compared against the control. Mean for the treatment group was found to be to percentage points higher during hot streaks (mean difference increased points), while the batting heat index introduced here was observed to increase by points. For each performance statistic, the null hypothesis was rejected at the significance level. We conclude that the evidence suggests the potential existence of a “statistical contagion effect”. Psychological mechanisms essential to the empirical results are suggested, as several studies from the scientific literature lend credence to contagious phenomena in sports. Causal inference from these results is difficult, but we suggest and discuss several latent variables that may contribute to the observed results, and offer possible directions for future research. PMID:23251507
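
    The percentile-method bootstrap named above is straightforward to sketch: resample each group with replacement, recompute the difference in means, and take percentiles of the resampled differences as the confidence interval. The batting averages below are synthetic placeholders, not the paper's box-score data:

```python
import numpy as np

# Percentile-method bootstrap CI for the difference in mean batting
# average between a "streak active" treatment group and a control
# group. Group data are synthetic.
rng = np.random.default_rng(5)
treatment = rng.normal(0.275, 0.030, size=200)   # hypothetical BAs
control = rng.normal(0.260, 0.030, size=200)

boot = np.empty(10_000)
for i in range(10_000):
    t = rng.choice(treatment, size=len(treatment), replace=True)
    c = rng.choice(control, size=len(control), replace=True)
    boot[i] = t.mean() - c.mean()

lo, hi = np.percentile(boot, [2.5, 97.5])        # 95% percentile CI
print(f"95% CI for mean difference: [{lo:.4f}, {hi:.4f}]")
```

An interval that excludes zero, as here, is the bootstrap analogue of the statistically significant treatment-control difference the paper reports.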

  3. Joint probability of statistical success of multiple phase III trials.

    PubMed

    Zhang, Jianliang; Zhang, Jenny J

    2013-01-01

    In drug development, after completion of phase II proof-of-concept trials, the sponsor needs to make a go/no-go decision to start expensive phase III trials. The probability of statistical success (PoSS) of the phase III trials based on data from earlier studies is an important factor in that decision-making process. Instead of statistical power, the predictive power of a phase III trial, which takes into account the uncertainty in the estimation of treatment effect from earlier studies, has been proposed to evaluate the PoSS of a single trial. However, regulatory authorities generally require statistical significance in two (or more) trials for marketing licensure. We show that the predictive statistics of two future trials are statistically correlated through use of the common observed data from earlier studies. Thus, the joint predictive power should not be evaluated as a simplistic product of the predictive powers of the individual trials. We develop the relevant formulae for the appropriate evaluation of the joint predictive power and provide numerical examples. Our methodology is further extended to the more complex phase III development scenario comprising more than two (K > 2) trials, that is, the evaluation of the PoSS of at least k₀ (k₀ ≤ K) trials from a program of K total trials. Copyright © 2013 John Wiley & Sons, Ltd.
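
    The correlation-through-shared-data point can be demonstrated by Monte Carlo rather than by the paper's formulae: draw the true effect from its phase II posterior, then simulate two phase III trials conditional on that same draw. The effect size and standard errors below are invented for illustration:

```python
import numpy as np

# Monte Carlo sketch: two future trials share the phase II estimate,
# so their successes are positively correlated and the joint success
# probability exceeds the product of the marginals.
rng = np.random.default_rng(6)
n_sim = 200_000
theta_hat, se_phase2 = 0.30, 0.15   # assumed phase II estimate and SE
se_phase3 = 0.10                    # assumed SE of each phase III trial
z_crit = 1.96

theta = rng.normal(theta_hat, se_phase2, n_sim)    # posterior draws
z1 = rng.normal(theta, se_phase3) / se_phase3      # trial 1 z-score
z2 = rng.normal(theta, se_phase3) / se_phase3      # trial 2 z-score
win1, win2 = z1 > z_crit, z2 > z_crit

p1, p2 = win1.mean(), win2.mean()
p_joint = (win1 & win2).mean()
print(p_joint, p1 * p2, p_joint > p1 * p2)
```

The gap between `p_joint` and `p1 * p2` is exactly the dependence the paper argues must not be ignored when evaluating a two-trial licensure requirement.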

  4. Statistical mechanics of protein structural transitions: Insights from the island model

    PubMed Central

    Kobayashi, Yukio

    2016-01-01

    The so-called island model of protein structural transition holds that hydrophobic interactions are the key to both the folding and function of proteins. Herein, the genesis and statistical mechanical basis of the island model of transitions are reviewed, by presenting the results of simulations of such transitions. Elucidating the physicochemical mechanism of protein structural formation is the foundation for understanding the hierarchical structure of life at the microscopic level. Based on the results obtained to date using the island model, remaining problems and future work in the field of protein structures are discussed, referencing Professor Saitô’s views on the hierarchic structure of science. PMID:28409078

  5. Statistics for demodulation RFI in inverting operational amplifier circuits

    NASA Astrophysics Data System (ADS)

    Sutu, Y.-H.; Whalen, J. J.

    An investigation was conducted with the objective of determining statistical variations in RFI demodulation responses in operational amplifier (op amp) circuits. Attention is given to the experimental procedures employed, a three-stage op amp LED experiment, NCAP (Nonlinear Circuit Analysis Program) simulations of demodulation RFI in 741 op amps, and a comparison of RFI in four op amp types. Three major recommendations for future investigations are presented on the basis of the results obtained: to conduct additional measurements of demodulation RFI in inverting amplifiers, to employ an automatic measurement system, and to conduct additional NCAP simulations in which parasitic effects are accounted for more thoroughly.

  6. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user- friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.

  7. Using genetic data to strengthen causal inference in observational research.

    PubMed

    Pingault, Jean-Baptiste; O'Reilly, Paul F; Schoeler, Tabea; Ploubidis, George B; Rijsdijk, Frühling; Dudbridge, Frank

    2018-06-05

    Causal inference is essential across the biomedical, behavioural and social sciences. By progressing from confounded statistical associations to evidence of causal relationships, causal inference can reveal complex pathways underlying traits and diseases and help to prioritize targets for intervention. Recent progress in genetic epidemiology - including statistical innovation, massive genotyped data sets and novel computational tools for deep data mining - has fostered the intense development of methods exploiting genetic data and relatedness to strengthen causal inference in observational research. In this Review, we describe how such genetically informed methods differ in their rationale, applicability and inherent limitations, and outline how they should be integrated in the future to offer a rich causal inference toolbox.

  8. The association between students taking elective courses in chiropractic technique and their anticipated chiropractic technique choices in future practice.

    PubMed

    Wanlass, Paul W; Sikorski, David M; Kizhakkeveettil, Anupama; Tobias, Gene S

    2018-03-12

    To assess students' opinions of the potential influence of taking elective courses in chiropractic techniques on their future practice preferences. An anonymous, voluntary survey was conducted among graduating students from a doctor of chiropractic program. The survey included questions regarding the chiropractic technique elective courses they had completed and the potential influence of these courses on their chiropractic technique choices in future practice. Surveys were pretested for face validity, and data were analyzed using descriptive and inferential statistics. Of the 56 surveys distributed, 46 were completed, for a response rate of 82%. More than half of the students reported having taken at least 1 elective course in diversified technique (80%), Cox technique (76%), Activator Methods (70%), or sacro-occipital technique (63%). Less than half of the respondents reported taking technique elective courses in Gonstead or Thompson techniques. More than half of the students stated they were more likely to use Activator (72%), Thompson (68%), diversified (57%), or Cox (54%) techniques in their future practice after taking an elective course in that technique. Females stated that they were more likely to use Activator Methods (p = .006) in future practice. Chiropractic technique elective courses in the doctor of chiropractic curriculum may influence students' choices of future practice chiropractic technique.

  9. Factors influencing medical students' choice of future specialization in medical sciences: a cross-sectional questionnaire survey from medical schools in china, malaysia and regions of South asian association for regional cooperation.

    PubMed

    Kumar, Arun; Mitra, Kasturi; Nagarajan, Sangeetha; Poudel, Bibek

    2014-03-01

    The future increase in the number of healthcare professionals depends on career interest among present undergraduate medical students, and the availability of medical doctors in each specialty could be estimated from students' interest in pursuing it. This study aimed to identify future career interest and the factors that influence undergraduate medical students in choosing their future specialization. The study was carried out among first-year medical students from five countries, who were asked to complete an 8-item questionnaire. Two thousand one hundred fifty-three participants were enrolled in the study. Data were analyzed in Microsoft Excel and the Statistical Package for the Social Sciences. Of the 2153 participants, 1470 responded; 169 of these were excluded owing to ambiguous responses, leaving 1301 participants. Among them, Anatomy (49.3%), followed by Biochemistry (26.7%) and Physiology (24%), were the most preferred subjects. Anatomy was the most preferred basic science subject, and the students were interested in pursuing surgery in the future. Furthermore, the most preferred future specialties were surgery, internal medicine and pediatrics, with gender variations: males preferred surgery, while females preferred obstetrics and gynecology.

  10. Moxie matters: associations of future orientation with active life expectancy.

    PubMed

    Laditka, Sarah B; Laditka, James N

    2017-10-01

    Being oriented toward the future has been associated with better future health. We studied associations of future orientation with life expectancy and the percentage of life with disability. We used the Panel Study of Income Dynamics (n = 5249). Participants' average age in 1968 was 33.0. Six questions repeatedly measured future orientation, 1968-1976. Seven waves (1999-2011, 33,331 person-years) measured disability in activities of daily living for the same individuals, whose average age in 1999 was 64.0. We estimated monthly probabilities of disability and death with multinomial logistic Markov models adjusted for age, sex, race/ethnicity, childhood health, and education. Using the probabilities, we created large populations with microsimulation, measuring disability in each month for each individual, from age 55 through death. Life expectancy from age 55 for white men with high future orientation was 77.6 years (95% confidence interval 75.5-79.0), with 6.9% (4.9-7.2) of those years with disability; results with low future orientation were 73.6 (72.2-75.4) and 9.6% (7.7-10.7). Comparable results for African American men were 74.8 (72.9-75.3), 8.1 (5.6-9.3), 71.0 (69.6-72.8), and 11.3 (9.1-11.7). For women, there were no significant differences associated with levels of future orientation for life expectancy. For white women with high future orientation, 9.1% of remaining life from age 55 was with disability (6.3-9.9), compared to 12.4% (10.2-13.2) with low future orientation. Disability results for African American women were similar but statistically significant only at age 80 and over. High future orientation during early to middle adult ages may be associated with better health in older age.
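
    The microsimulation step described above can be reduced to a toy: monthly transitions among active, disabled and dead states from age 55, followed to compute life expectancy and the share of remaining life with disability. The transition probabilities below are invented for illustration and, unlike the paper's fitted model, are not adjusted for age or covariates:

```python
import numpy as np

# Toy Markov microsimulation: monthly transitions among three states
# (0 = active, 1 = disabled, 2 = dead) from age 55.
rng = np.random.default_rng(7)
P = np.array([[0.9950, 0.0030, 0.0020],   # from active: stay, disable, die
              [0.0100, 0.9800, 0.0100],   # from disabled: recover, stay, die
              [0.0000, 0.0000, 1.0000]])  # dead is absorbing
cum = P.cumsum(axis=1)

n_people, max_months = 20_000, 12 * 60    # follow the cohort to age 115
states = np.zeros(n_people, dtype=int)    # everyone starts active at 55
months_alive = np.zeros(n_people)
months_disabled = np.zeros(n_people)
for _ in range(max_months):
    months_alive += states != 2
    months_disabled += states == 1
    u = rng.random(n_people)
    states = (u[:, None] > cum[states]).sum(axis=1)  # sample next state

life_exp = 55 + months_alive.mean() / 12
pct_disabled = 100 * months_disabled.sum() / months_alive.sum()
print(round(life_exp, 1), round(pct_disabled, 1))
```

The paper's comparison of high- vs. low-future-orientation groups amounts to running this simulation with two different fitted transition matrices and contrasting the two summary pairs.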

  11. Quantifying the intra-annual uncertainties in climate change assessment over 10 sub-basins across the Pacific Northwest US

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2017-04-01

    Uncertainty is an inevitable feature of climate change impact assessments. Understanding and quantifying the different sources of uncertainty is highly important and can help modeling agencies improve current models and scenarios. In this study, we assessed future changes in three climate variables (precipitation, maximum temperature, and minimum temperature) over 10 sub-basins across the Pacific Northwest US. To conduct the study, 10 statistically downscaled CMIP5 GCMs from two downscaling methods (BCSD and MACA) were utilized at 1/16-degree spatial resolution for the historical period 1970-2000 and the future period 2010-2099. For the future projections, the two scenarios RCP4.5 and RCP8.5 were used. Furthermore, Bayesian Model Averaging (BMA) was employed to develop a probabilistic future projection for each climate variable. Results indicate the superiority of the BMA simulations over the individual models. Increasing temperature and precipitation are projected at the annual timescale; however, the changes are not uniform across seasons. Model uncertainty proves to be the major source of uncertainty, while downscaling uncertainty contributes significantly to the total uncertainty, especially in summer.
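
    A minimal sketch of the BMA step: weight each model by its likelihood against observations over a training window, then form the weighted ensemble. The observations and model output below are synthetic, and real BMA implementations (e.g. the Raftery et al. scheme) also fit the error variance by EM rather than assuming it:

```python
import numpy as np

# Weight three "downscaled models" by Gaussian likelihood against
# observations, then build the BMA-weighted projection.
rng = np.random.default_rng(8)
obs = rng.normal(10.0, 1.0, size=120)                 # training observations
# Three models with increasing error scale around the observations
models = obs + rng.normal(0.0, [[0.5], [1.5], [3.0]], size=(3, 120))

sigma = 1.0                                           # assumed error scale
loglik = -0.5 * ((models - obs) ** 2).sum(axis=1) / sigma**2
w = np.exp(loglik - loglik.max())                     # stable softmax
w /= w.sum()

bma_forecast = w @ models                             # weighted ensemble
print(w.round(3))
```

As expected, the weights concentrate on the model that tracks the observations most closely, which is why a BMA ensemble can outperform any fixed equal-weight average.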

  12. Identification of Vehicle Health Assurance Related Trends

    NASA Technical Reports Server (NTRS)

    Phojanamongkolkij, Nipa; Evans, Joni K.; Barr, Lawrence C.; Leone, Karen M.; Reveley, Mary S.

    2014-01-01

    Trend analysis in aviation as related to vehicle health management (VHM) was performed by reviewing the most current statistical and prognostic data available from the National Transportation Safety Board (NTSB) accident, Federal Aviation Administration (FAA) incident, and NASA Aviation Safety Reporting System (ASRS) incident datasets. In addition, future directions in aviation technology related to VHM research areas were assessed through the Commercial Aviation Safety Team (CAST) Safety Enhancements Reserved for Future Implementations (SERFIs), the NTSB Most-Wanted List and recent open safety recommendations, the National Research Council (NRC) Decadal Survey of Civil Aeronautics, and the Future Aviation Safety Team (FAST) areas of change. Future research interest in the VHM areas is evidently strong, as seen from recent research solicitations from the Naval Air Systems Command (NAVAIR) and from VHM-related technologies actively being developed by aviation industry leaders, including GE, Boeing, Airbus, and UTC Aerospace Systems. Given the high complexity of VHM systems, modifications can be made in the future so that the Vehicle Systems Safety Technology Project (VSST) technical challenges address inadequate maintenance crew training and skills, and the certification methods of such systems, as recommended by the NTSB, NRC, and FAST areas of change.

  13. The Multi-Frequency Correlation Between Eua and sCER Futures Prices: Evidence from the Emd Approach

    NASA Astrophysics Data System (ADS)

    Zhang, Yue-Jun; Huang, Yi-Song

    2015-05-01

    European Union Allowances (EUA) and secondary Certified Emission Reductions (sCER) have become two dominant carbon trading assets for investors, and their linkage has attracted much attention from academia and practitioners in recent years. Against this background, we use the empirical mode decomposition (EMD) approach to decompose the two carbon futures contract prices and discuss their correlation from a multi-frequency perspective. The empirical results indicate that, first, the EUA and sCER futures price movements can be divided into those triggered by long-term, medium-term and short-term market impacts. Second, the price movements in the EUA and sCER futures markets are primarily caused by the long-term impact, while the short-term impact can explain only a small fraction. Finally, the long-term (short-term) effect on EUA prices is statistically uncorrelated with the short-term (long-term) effect on sCER prices, and there is a medium or strong lead-lag correlation between the EUA and sCER price components with the same time scales. These results may provide important insights for price forecasting and arbitrage to carbon futures market investors, analysts and regulators.

  14. Cross-correlations between agricultural commodity futures markets in the US and China

    NASA Astrophysics Data System (ADS)

    Li, Zhihui; Lu, Xinsheng

    2012-08-01

    This paper examines the cross-correlation properties of agricultural futures markets between the US and China using a cross-correlation statistic test and multifractal detrended cross-correlation analysis (MF-DCCA). The results show that the cross-correlations between the two geographically distant markets for four pairs of important agricultural commodity futures are significantly multifractal. By introducing the concept of a “crossover”, we find that the multifractality of cross-correlations between the two markets is not long lasting. The cross-correlations are strongly multifractal in the short term but only weakly so in the long term. Moreover, in the short term, cross-correlations of small fluctuations are persistent and those of large fluctuations are anti-persistent, while in the long term, cross-correlations of all kinds of fluctuations are persistent for soybean and soy meal futures and anti-persistent for corn and wheat futures. We also find that, in the short term, the cross-correlation exponents are smaller than the averaged generalized Hurst exponent when q<0 and larger when q>0, while in the long term they are almost the same.
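    The q = 2 core of the MF-DCCA machinery can be sketched as follows: both series are integrated into profiles, each window is detrended with a linear fit, and the log-log slope of the fluctuation function F(s) estimates the cross-correlation (Hurst-like) exponent. The synthetic series, scales and function names are assumptions for illustration, not the paper's data or code.

    ```python
    import numpy as np

    def dcca_fluctuation(x, y, scales):
        """Detrended cross-covariance fluctuation F(s), the q = 2 case of MF-DCCA."""
        X = np.cumsum(x - x.mean())   # integrated profiles
        Y = np.cumsum(y - y.mean())
        F = []
        for s in scales:
            n = len(X) // s
            covs = []
            for i in range(n):
                seg = slice(i * s, (i + 1) * s)
                t = np.arange(s)
                # remove a linear trend from each window of both profiles
                rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
                ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
                covs.append(np.mean(rx * ry))
            # absolute value guards against negative detrended covariances
            F.append(np.sqrt(np.mean(np.abs(covs))))
        return np.array(F)

    rng = np.random.default_rng(1)
    x = rng.standard_normal(4096)
    y = 0.7 * x + 0.3 * rng.standard_normal(4096)   # correlated white noise
    scales = np.array([16, 32, 64, 128, 256])
    F = dcca_fluctuation(x, y, scales)
    # Log-log slope of F(s) estimates the cross-correlation exponent;
    # for uncorrelated-increment (white noise) series it is near 0.5.
    h_xy = np.polyfit(np.log(scales), np.log(F), 1)[0]
    print(f"estimated cross-correlation exponent: {h_xy:.2f}")
    ```

    Persistent cross-correlations give exponents above 0.5 and anti-persistent ones below, which is the reading applied to the short- and long-term regimes in the abstract.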

  15. Genetic Diversity and Ecological Niche Modelling of Wild Barley: Refugia, Large-Scale Post-LGM Range Expansion and Limited Mid-Future Climate Threats?

    PubMed Central

    Russell, Joanne; van Zonneveld, Maarten; Dawson, Ian K.; Booth, Allan; Waugh, Robbie; Steffenson, Brian

    2014-01-01

    Describing genetic diversity in wild barley (Hordeum vulgare ssp. spontaneum) across geographic and environmental space, in the context of current, past and potential future climates, is important for conservation and for breeding the domesticated crop (Hordeum vulgare ssp. vulgare). Spatial genetic diversity in wild barley was revealed by both nuclear-derived (2,505 SNP, 24 nSSR) and chloroplast-derived (5 cpSSR) markers in 256 widely sampled geo-referenced accessions. Results were compared with MaxEnt-modelled geographic distributions under current, past (Last Glacial Maximum, LGM) and mid-term future (anthropogenic scenario A2, the 2080s) climates. Comparisons suggest large-scale post-LGM range expansion in Central Asia and relatively small, but statistically significant, reductions in range-wide genetic diversity under future climate. Our analyses support the utility of ecological niche modelling for locating genetic diversity hotspots and for determining priority geographic areas for wild barley conservation under anthropogenic climate change. Similar research on other cereal crop progenitors could play an important role in tailoring conservation and crop improvement strategies to support future human food security. PMID:24505252

  16. Rock Statistics at the Mars Pathfinder Landing Site, Roughness and Roving on Mars

    NASA Technical Reports Server (NTRS)

    Haldemann, A. F. C.; Bridges, N. T.; Anderson, R. C.; Golombek, M. P.

    1999-01-01

    Several rock counts have been carried out at the Mars Pathfinder landing site, producing consistent statistics of rock coverage and size-frequency distributions. These rock statistics provide a primary element of "ground truth" for anchoring the remote sensing information used to pick the Pathfinder, and future, landing sites. The observed rock population statistics should also be consistent with the emplacement and alteration processes postulated to govern the landing site landscape. The rock population databases can, however, be used in ways that go beyond the calculation of cumulative number and cumulative area distributions versus rock diameter and height. Since the spatial parameters measured to characterize each rock are determined with stereo image pairs, the rock database serves as a subset of the full landing site digital terrain model (DTM). Because a rock count can be carried out more quickly, albeit more coarsely, than a full DTM analysis, rock counting offers several operational and scientific products in the near term. Quantitative rock mapping adds further information to the geomorphic study of the landing site and can also be used for rover traverse planning. Checked against the full DTM, statistical analysis of surface roughness using the rock-count proxy DTM is sufficiently accurate for comparison with radar remote-sensing roughness measures and with rover traverse profiles.
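    Cumulative number distributions of the kind described above are simple to derive from a rock count. A minimal sketch, with made-up rock diameters and mapped area, computes the cumulative number density N(≥D) per unit area, the standard form fit by rock-abundance models at landing sites.

    ```python
    import numpy as np

    def cumulative_size_frequency(diameters_m, area_m2):
        """Cumulative number of rocks per m^2 with diameter >= D,
        evaluated at each observed diameter."""
        d = np.sort(np.asarray(diameters_m))
        # count of rocks at least as large as each sorted diameter
        n_ge = np.arange(len(d), 0, -1)
        return d, n_ge / area_m2

    # Hypothetical rock count over a 100 m^2 mapped area
    rocks = [0.05, 0.07, 0.07, 0.10, 0.12, 0.15, 0.22, 0.30, 0.45, 0.80]
    d, n_cum = cumulative_size_frequency(rocks, area_m2=100.0)
    for di, ni in zip(d, n_cum):
        print(f"N(>= {di:.2f} m) = {ni:.3f} per m^2")
    ```

    The resulting curve is monotonically non-increasing with diameter, so a fitted model can be compared directly against remote-sensing rock abundance estimates.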

  17. Combining Statistics and Physics to Improve Climate Downscaling

    NASA Astrophysics Data System (ADS)

    Gutmann, E. D.; Eidhammer, T.; Arnold, J.; Nowak, K.; Clark, M. P.

    2017-12-01

    Getting useful information from climate models is an ongoing problem that has plagued climate science and hydrologic prediction for decades. While it is possible to develop statistical corrections for climate models that mimic the current climate almost perfectly, this does not guarantee that future changes are portrayed correctly. In contrast, convection-permitting regional climate models (RCMs) have begun to provide an excellent representation of the regional climate system purely from first principles, providing greater confidence in their change signal. However, the computational cost of such RCMs prohibits the generation of ensembles of simulations or long time periods, limiting their applicability for hydrologic applications. Here we discuss a new approach combining statistical corrections with physical relationships at a modest computational cost. We have developed the Intermediate Complexity Atmospheric Research model (ICAR) to provide a climate and weather downscaling option that is based primarily on physics for a fraction of the computational requirements of a traditional regional climate model. ICAR also enables the incorporation of statistical adjustments directly within the model. We demonstrate that applying even simple corrections to precipitation while the model is running can improve the simulation of land-atmosphere feedbacks in ICAR. For example, by incorporating statistical corrections earlier in the modeling chain, we permit the model physics to better represent the effect of mountain snowpack on air temperature changes.
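    One simple statistical correction of the kind that can be applied within a downscaling chain is empirical quantile mapping, which replaces each model value with the observed value at the same empirical quantile. A minimal sketch with synthetic data (the distributions and the bias are assumptions for illustration, not ICAR's actual correction scheme):

    ```python
    import numpy as np

    def quantile_map(model_train, obs_train, model_new):
        """Map model values onto the observed distribution by matching
        empirical quantiles (simple empirical quantile mapping)."""
        # quantile of each new model value within the model training distribution
        q = np.searchsorted(np.sort(model_train), model_new) / len(model_train)
        q = np.clip(q, 0.0, 1.0)
        # corresponding value in the observed training distribution
        return np.quantile(obs_train, q)

    rng = np.random.default_rng(2)
    obs = rng.gamma(2.0, 2.0, 5000)                  # "observed" precipitation
    model = 1.5 * rng.gamma(2.0, 2.0, 5000) + 1.0    # biased model output
    corrected = quantile_map(model, obs, model)

    print(f"obs mean {obs.mean():.2f}, model mean {model.mean():.2f}, "
          f"corrected mean {corrected.mean():.2f}")
    ```

    After mapping, the corrected series reproduces the observed distribution almost exactly on the training period; the open question the abstract raises is whether such a mapping remains valid under a changed climate, which is why embedding corrections inside a physics-based model is attractive.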

  18. Dynamic modelling of n-of-1 data: powerful and flexible data analytics applied to individualised studies.

    PubMed

    Vieira, Rute; McDonald, Suzanne; Araújo-Soares, Vera; Sniehotta, Falko F; Henderson, Robin

    2017-09-01

    N-of-1 studies are based on repeated observations within an individual or unit over time and are acknowledged as an important research method for generating scientific evidence about the health or behaviour of an individual. Statistical analyses of n-of-1 data require accurate modelling of the outcome while accounting for its distribution, time-related trends and error structures (e.g., autocorrelation), as well as reporting readily usable, contextualised effect sizes for decision-making. A number of statistical approaches have been documented, but no consensus exists on which method is most appropriate for which type of n-of-1 design. We discuss the statistical considerations for analysing n-of-1 studies and briefly review some currently used methodologies. We describe dynamic regression modelling as a flexible and powerful approach, adaptable to different types of outcomes and capable of dealing with the challenges inherent in n-of-1 statistical modelling. Dynamic modelling borrows ideas from longitudinal and event-history methodologies, which explicitly incorporate the role of time and the influence of the past on the future. We also present an illustrative example of the use of dynamic regression in monitoring physical activity during the retirement transition. Dynamic modelling has the potential to expand researchers' access to robust and user-friendly statistical methods for individualised studies.
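    The dynamic-modelling idea, in which the influence of the past on the future enters explicitly, can be sketched as a regression with a lagged outcome. The simulated n-of-1 series and the coefficients below are hypothetical, not the paper's physical-activity data:

    ```python
    import numpy as np

    # Simulate an n-of-1 time series: daily outcome y depends on an
    # intervention indicator x and on yesterday's outcome (dynamics).
    rng = np.random.default_rng(3)
    n = 400
    x = rng.integers(0, 2, n).astype(float)   # e.g. intervention on/off each day
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = 1.0 + 2.0 * x[t] + 0.5 * y[t - 1] + 0.5 * rng.standard_normal()

    # Dynamic regression: y_t ~ intercept + x_t + y_{t-1}, fit by least squares
    design = np.column_stack([np.ones(n - 1), x[1:], y[:-1]])
    coef, *_ = np.linalg.lstsq(design, y[1:], rcond=None)
    print("intercept, intervention effect, carry-over:", np.round(coef, 2))
    ```

    The lagged-outcome term absorbs autocorrelation, so the intervention effect is estimated net of day-to-day carry-over, which is the central point of the dynamic approach.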

  19. Functional brain networks for learning predictive statistics.

    PubMed

    Giorgio, Joseph; Karlaftis, Vasilis M; Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew; Kourtzi, Zoe

    2017-08-18

    Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. This skill depends on extracting regular patterns in space and time by mere exposure to the environment (i.e., without explicit feedback). Yet, we know little about the functional brain networks that mediate this type of statistical learning. Here, we test whether training-related changes in the processing and connectivity of functional brain networks relate to our ability to learn temporal regularities. By combining behavioral training and functional brain connectivity analysis, we demonstrate that individuals adapt to the environment's statistics as they change over time from simple repetition to probabilistic combinations. Further, we show that individual learning of temporal structures relates to decision strategy. Our fMRI results demonstrate that learning-dependent changes in fMRI activation within, and functional connectivity between, brain networks relate to individual variability in strategy. In particular, extracting the exact sequence statistics (i.e., matching) relates to changes in brain networks known to be involved in memory and stimulus-response associations, while selecting the most probable outcomes in a given context (i.e., maximizing) relates to changes in frontal and striatal networks. Thus, our findings provide evidence that dissociable brain networks mediate individual ability in learning behaviorally relevant statistics. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Daniel Goodman’s empirical approach to Bayesian statistics

    USGS Publications Warehouse

    Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina

    2016-01-01

    Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated constructing the prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combines comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. He argued that, if based on a true representation of our knowledge and uncertainty, risk assessment and decision-making could be an exact science despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior from a recent study of Steller sea lion spatial use patterns in Alaska.
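    Goodman's empirical-prior construction can be sketched on a discrete grid: a histogram of rates from "similar" past cases serves as the prior, and Bayes' formula combines it with the case-specific likelihood. All numbers below are made up for illustration.

    ```python
    import numpy as np

    # Grid of candidate success rates
    p = np.linspace(0.01, 0.99, 99)

    # Empirical prior: histogram of observed rates from "similar" past cases
    past_rates = np.array([0.22, 0.25, 0.30, 0.28, 0.35, 0.27, 0.31, 0.24])
    hist, edges = np.histogram(past_rates, bins=10, range=(0, 1))
    prior = hist[np.digitize(p, edges[1:-1])].astype(float)
    prior = prior / prior.sum()

    # Case-specific data: 9 successes in 20 trials -> binomial likelihood
    k, n = 9, 20
    like = p ** k * (1 - p) ** (n - k)

    # Bayes' formula on the grid: posterior ∝ prior × likelihood
    post = prior * like
    post = post / post.sum()

    print(f"prior mean {np.sum(p * prior):.3f}, "
          f"MLE {k / n:.3f}, posterior mean {np.sum(p * post):.3f}")
    ```

    The posterior mean lands between the empirical prior's center of mass and the case-specific maximum likelihood estimate, which is exactly the pooling of comparable past data with current data that Goodman advocated.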

  1. Fast, Statistical Model of Surface Roughness for Ion-Solid Interaction Simulations and Efficient Code Coupling

    NASA Astrophysics Data System (ADS)

    Drobny, Jon; Curreli, Davide; Ruzic, David; Lasa, Ane; Green, David; Canik, John; Younkin, Tim; Blondel, Sophie; Wirth, Brian

    2017-10-01

    Surface roughness greatly impacts material erosion and thus plays an important role in plasma-surface interactions. Developing strategies for efficiently introducing rough surfaces into ion-solid interaction codes will be an important step towards whole-device modeling of plasma devices and future fusion reactors such as ITER. Fractal TRIDYN (F-TRIDYN) is an upgraded version of the Monte Carlo binary collision approximation (BCA) program TRIDYN developed for this purpose; it includes an explicit fractal model of surface roughness and extended input and output options for file-based code coupling. Code coupling with both plasma and material codes has been achieved and allows for multi-scale, whole-device modeling of plasma experiments. These code coupling results will be presented. F-TRIDYN has been further upgraded with an alternative, statistical model of surface roughness. The statistical model is significantly faster than, and compares favorably to, the fractal model. Additionally, the statistical model compares well to alternative computational surface roughness models and to experiments. Theoretical links between the fractal and statistical models are made, and further connections to experimental measurements of surface roughness are explored. This work was supported by the PSI-SciDAC Project funded by the U.S. Department of Energy through contract DOE-DE-SC0008658.

  2. Climate Change Impacts on Worldwide Coffee Production

    NASA Astrophysics Data System (ADS)

    Foreman, T.; Rising, J. A.

    2015-12-01

    Coffee (Coffea arabica and Coffea canephora) plays a vital role in many countries' economies, providing essential income to 25 million people in tropical countries and supporting an $81 billion industry, making it one of the most valuable commodities in the world. At the same time, coffee is at the center of many issues of sustainability. It is vulnerable to climate change, with disease outbreaks becoming more common and suitable growing regions beginning to shift. We develop a statistical production model for coffee that incorporates temperature, precipitation, frost, and humidity effects, using a new database of worldwide coffee production. We then use this model to project coffee yields and production into the future under a variety of climate forecasts. Combined with a market model, it can also forecast the locations of future coffee production as well as future prices, supply, and demand.
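    A statistical production model of this general form can be sketched as a regression of yield on weather covariates. The quadratic temperature response, the coefficients and the data below are hypothetical assumptions for illustration, not the authors' specification.

    ```python
    import numpy as np

    # Hypothetical data: yield responds quadratically to temperature (peak
    # near 22 °C) and linearly to precipitation; all numbers are made up.
    rng = np.random.default_rng(4)
    n = 300
    temp = rng.uniform(15, 30, n)          # growing-season mean temperature (°C)
    precip = rng.uniform(800, 2500, n)     # annual precipitation (mm)
    yields = (2.0 - 0.05 * (temp - 22.0) ** 2 + 0.0005 * precip
              + 0.1 * rng.standard_normal(n))

    # Statistical production model: yield ~ temp + temp^2 + precip
    X = np.column_stack([np.ones(n), temp, temp ** 2, precip])
    beta, *_ = np.linalg.lstsq(X, yields, rcond=None)

    # The fitted quadratic implies an optimal growing temperature
    t_opt = -beta[1] / (2.0 * beta[2])
    print(f"estimated optimal temperature: {t_opt:.1f} °C")
    ```

    Projection then amounts to evaluating the fitted model at climate-forecast values of the covariates; the quadratic term is what lets suitable regions "shift" as temperatures move past the optimum.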

  3. Mapping remote and multidisciplinary learning barriers: lessons from challenge-based innovation at CERN

    NASA Astrophysics Data System (ADS)

    Jensen, Matilde Bisballe; Utriainen, Tuuli Maria; Steinert, Martin

    2018-01-01

    This paper presents the difficulties experienced by students participating in challenge-based innovation at CERN, a multidisciplinary, remote-collaboration engineering design course, with the aim of identifying learning barriers and improving future learning experiences. We statistically analyse the rated differences between distinct design activities, educational backgrounds and remote vs. co-located collaboration. The analysis is based on a quantitative and qualitative questionnaire (N = 37). We found significant ranking differences between remote and co-located activities, which raises the question of whether the remote factor is a barrier to the originally intended learning goals. Further, a correlation between analytical and converging design phases was identified. We therefore suggest that future facilitators help students in the transition from one design phase to the next rather than only teaching methods for the individual design phases. Finally, we discuss how educators can address the identified learning barriers when designing future courses that include multidisciplinary or remote collaboration.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tessore, Nicolas; Metcalf, R. Benton; Winther, Hans A.

    A number of alternatives to general relativity exhibit gravitational screening in the non-linear regime of structure formation. We describe a set of algorithms that can produce weak lensing maps of large-scale structure in such theories and can be used to generate mock surveys for cosmological analysis. By analysing a few basic statistics, we indicate how these alternatives can be distinguished from general relativity with future weak lensing surveys.

  5. Minorities & Women in the Health Fields: Applicants, Students, and Workers. Health Manpower References.

    ERIC Educational Resources Information Center

    Philpot, Wilbertine P.; Bernstein, Stuart

    A comprehensive look at the current and future supply of women and minorities in the health professions and in health professions schools is provided in this statistical report. Its data are more extensive than those presented in either of two earlier reports; hence, it can prove useful in analyzing the composition of the nation's…

  6. Projecting wildfire area burned in the south-eastern United States, 2011-60

    Treesearch

    Jeffrey P. Prestemon; Uma Shankar; Aijun Xiu; K. Talgo; D. Yang; Ernest Dixon; Donald McKenzie; Karen L. Abt

    2016-01-01

    Future changes in society and climate are expected to affect wildfire activity in the south-eastern United States. The objective of this research was to understand how changes in both climate and society may affect wildfire in the coming decades. We estimated a three-stage statistical model of wildfire area burned by ecoregion province for lightning and human causes (...

  7. Comparing Weighted and Unweighted Grade Point Averages in Predicting College Success of Diverse and Low-Income College Students

    ERIC Educational Resources Information Center

    Warne, Russell T.; Nagaishi, Chanel; Slade, Michael K.; Hermesmeyer, Paul; Peck, Elizabeth Kimberli

    2014-01-01

    While research has shown the statistical significance of high school grade point averages (HSGPAs) in predicting future academic outcomes, the systems with which HSGPAs are calculated vary drastically across schools. Some schools employ unweighted grades that carry the same point value regardless of the course in which they are earned; other…

  8. New Labor Force Projections to 1990. Special Labor Force Report 197.

    ERIC Educational Resources Information Center

    Fullerton, Howard N., Jr.; Flaim, Paul O.

    Prepared as part of the Bureau of Labor Statistics' periodic reassessment of its projections of the future growth trends of the various sectors of the American economy, new labor force projections to 1990 are presented based on trends in labor force participation as observed through 1975 and on the most recent population projections of the U.S.…

  9. NASA CONNECT™: Space Suit Science in the Classroom

    NASA Technical Reports Server (NTRS)

    Williams, William B.; Giersch, Chris; Bensen, William E.; Holland, Susan M.

    2003-01-01

    The NASA CONNECT™ program titled Functions and Statistics: Dressed for Space initially aired on Public Broadcasting Stations (PBS) nationwide on May 9, 2002. The program traces the evolution of past space suit technologies in the design of space suits for future flight. It serves as a stage for providing educators, parents, and students with "space suit science" in the classroom.

  10. 2015 Distributed Wind Market Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orrell, Alice C.; Foster, Nikolas A.F.; Homer, Juliet S.

    The U.S. Department of Energy’s (DOE’s) annual Distributed Wind Market Report provides stakeholders with statistics and analysis of the market along with insights into its trends and characteristics. By providing a comprehensive overview of the distributed wind market, this report can help plan and guide future investments and decisions by industry, utilities, federal and state agencies, and other interested parties.

  11. The State of the State: Building a Better Future for Kansas Kids.

    ERIC Educational Resources Information Center

    Kansas Action for Children, Inc., Topeka.

    This special Kids Count report compares the current well-being of Kansas children to that of children in other states. The statistical portrait is based on a composite rank and 10 indicators of child well-being: (1) percent low birthweight infants; (2) infant mortality rate; (3) child death rate; (4) teen death rate by accident, homicide, and…

  12. Measures of Child Well-Being in Utah, 2003: Counting on a Better Future for Utah's Kids.

    ERIC Educational Resources Information Center

    Haven, Terry, Ed.

    This Kids Count report examines statewide trends in the well-being of Utah's children. The statistical portrait is based on 28 indicators of children's well-being in five areas: (1) child health (prenatal care, low birth-weight births, infant mortality, child injury deaths, injury-related hospital discharges, child abuse, childhood immunizations,…

  13. How can my research paper be useful for future meta-analyses on forest restoration practices?

    Treesearch

    Enrique Andivia; Pedro Villar‑Salvador; Juan A. Oliet; Jaime Puertolas; R. Kasten Dumroese

    2018-01-01

    Statistical meta-analysis is a powerful and useful tool to quantitatively synthesize the information conveyed in published studies on a particular topic. It allows identifying and quantifying overall patterns and exploring causes of variation. The inclusion of published works in meta-analyses requires, however, a minimum quality standard of the reported data and...

  14. Status of the NASA Robotic Mission Conjunction Assessment Effort

    NASA Technical Reports Server (NTRS)

    Newman, Lauri Kraft

    2007-01-01

    This viewgraph presentation discusses NASA's processes and tools used to mitigate threats to NASA's robotic assets. The topics include: 1) Background; 2) Goddard Stakeholders and Mission Support; 3) ESC and TDRS Mission Descriptions; 4) TDRS Conjunction Assessment Process; 5) ESMO Conjunction Assessment Process; 6) Recent Operations Experiences; 7) Statistics Collected for ESC Regime; and 8) Current and Future Analysis Items.

  15. Sympathy and Callousness: The Impact of Deliberative Thought on Donations to Identifiable and Statistical Victims

    ERIC Educational Resources Information Center

    Small, Deborah A.; Loewenstein, George; Slovic, Paul

    2007-01-01

    When donating to charitable causes, people do not value lives consistently. Money is often concentrated on a single victim even though more people would be helped if resources were dispersed or spent protecting future victims. We examine the impact of deliberating about donation decisions on generosity. In a series of field experiments, we show…

  16. Uses of Multivariate Analytical Techniques in Online and Blended Business Education: An Assessment of Current Practice and Recommendations for Future Research

    ERIC Educational Resources Information Center

    Arbaugh, J. B.; Hwang, Alvin

    2013-01-01

    Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…

  17. America's Changing Work Force: Statistics in Brief.

    ERIC Educational Resources Information Center

    American Association of Retired Persons, Washington, DC.

    This booklet provides information about the demographics of the changing work force. It offers an at-a-glance profile of workers age 45 and older and considers likely changes in the work force of the future. The document includes topics such as the composition of the work force of today and tomorrow by age and sex, labor force participation rates,…

  18. Tanzania at the Turn of the Century: Background Papers and Statistics. A World Bank Country Study.

    ERIC Educational Resources Information Center

    World Bank, Washington, DC.

    This report presents lessons from Tanzania's development experience of the past four decades, with emphasis on the period since the last report (1996), and assesses the imperatives for higher sustained growth and better livelihood for its citizens in the future. The background papers review and assess Tanzania's actual growth and poverty reduction…

  19. Investing in the Future: Strategic Planning, FY 2001 Appropriations Requests, Supplemental & Statistical Information.

    ERIC Educational Resources Information Center

    Iowa State Board of Regents, Des Moines.

    This document presents the State of Iowa Board of Regents fiscal year (FY) 2001 budget requests and provides information about the Board and its institutions. An introductory section offers an overview of the Board of Regents' functions, including a mission statement and governance process, and explains FY 2000 appropriation reductions and FY 2001…

  20. All Kids Count! Assessing the Well-Being of African-American, American Indian, Asian, and Latino Children.

    ERIC Educational Resources Information Center

    Kids Count Minnesota, Minneapolis.

    This Kids Count data book examines trends in the well-being of Minnesota's African-American, American Indian, Asian, and Latino children. The statistical portrait is based on 22 indicators of child well-being: (1) attitudes about race; (2) housing patterns; (3) future plans; (4) social involvement; (5) park usage; (6) negative treatment; (7) bias…
