Sample records for statistical analysis employed

  1. GIS-Based Spatial Statistical Analysis of College Graduates Employment

    NASA Astrophysics Data System (ADS)

    Tang, R.

    2012-07-01

    Awareness of the distribution and employment status of college graduates is urgently needed for the proper allocation of human resources and the overall planning of strategic industries. This study provides empirical evidence on the use of geocoding and spatial analysis to characterize the distribution and employment status of college graduates, based on 2004-2008 data from the Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed by geocoding in ArcGIS, and stepwise multiple linear regression in SPSS was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that enterprises in the Wuhan East Lake High-Tech Development Zone increased dramatically from 2004 to 2008 and tended to spread southeastward. Furthermore, the regression models suggest that graduates' major has an important impact on the number employed and on the number of graduates entering pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a promising tool for human resource development research.
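
    The record describes geocoding in ArcGIS combined with stepwise multiple linear regression in SPSS. As a rough illustration only, the sketch below implements a simple forward-selection variant in Python with statsmodels on synthetic data; the predictor names and the 0.05 entry threshold are assumptions, not details taken from the study.

    ```python
    # Minimal forward stepwise OLS sketch (synthetic data; not the study's actual variables).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    X = pd.DataFrame({
        "graduates_in_major": rng.normal(size=n),       # hypothetical predictors
        "pillar_industry_jobs": rng.normal(size=n),
        "distance_to_zone_km": rng.normal(size=n),
    })
    y = 2.0 * X["graduates_in_major"] + 0.5 * X["pillar_industry_jobs"] + rng.normal(size=n)

    def forward_select(X, y, alpha=0.05):
        """Add, one at a time, the predictor with the smallest p-value below alpha."""
        selected, remaining = [], list(X.columns)
        while remaining:
            pvals = {}
            for col in remaining:
                model = sm.OLS(y, sm.add_constant(X[selected + [col]])).fit()
                pvals[col] = model.pvalues[col]
            best = min(pvals, key=pvals.get)
            if pvals[best] >= alpha:
                break
            selected.append(best)
            remaining.remove(best)
        return sm.OLS(y, sm.add_constant(X[selected])).fit()

    print(forward_select(X, y).summary())
    ```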

  2. An Analysis of the Navy’s Voluntary Education Program

    DTIC Science & Technology

    2007-03-01

    Table-of-contents excerpt: Naval Analysis VOLED Study (1. Data; 2. Statistical Models; 3. ...); B. Employer-Financed General Training (1. Data; 2. Statistical Model; ...); ... (1. Data; 2. Statistical Model; 3. Findings)

  3. Analysis of Employment Flow of Landscape Architecture Graduates in Agricultural Universities

    ERIC Educational Resources Information Center

    Yao, Xia; He, Linchun

    2012-01-01

    A statistical analysis of the employment flow of landscape architecture graduates was conducted on employment data for graduates majoring in landscape architecture from 2008 to 2011. The employment flow of graduates covered admission to graduate study, industrial direction, regional distribution, etc. Then, the features of talent flow and factors…

  4. Statistical methods in personality assessment research.

    PubMed

    Schinka, J A; LaLone, L; Broeckel, J A

    1997-06-01

    Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.

  5. Recent statistical methods for orientation data

    NASA Technical Reports Server (NTRS)

    Batschelet, E.

    1972-01-01

    The application of statistical methods to the study of animal orientation and navigation is discussed. The method employed is limited to the two-dimensional case. Various tests for determining the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations, and tables of data are developed to show the value of the information obtained by statistical analysis.

  6. Employer Learning and the Signaling Value of Education. National Longitudinal Surveys Discussion Paper.

    ERIC Educational Resources Information Center

    Altonji, Joseph G.; Pierret, Charles R.

    A statistical analysis was performed to test the hypothesis that, if profit-maximizing firms have limited information about the general productivity of new workers, they may choose to use easily observable characteristics such as years of education to discriminate statistically among workers. Information about employer learning was obtained by…

  7. An Analysis of Research Methods and Statistical Techniques Used by Doctoral Dissertation at the Education Sciences in Turkey

    ERIC Educational Resources Information Center

    Karadag, Engin

    2010-01-01

    To assess the research methods and statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods included experimental research, surveys, correlational studies, and case studies. Descriptive statistics, t-test, ANOVA, factor…

  8. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.

  9. A Study on Predictive Analytics Application to Ship Machinery Maintenance

    DTIC Science & Technology

    2013-09-01

    Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be
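
    The record mentions cumulative sum (CUSUM) control charts for detecting shifts in machinery condition data. The snippet below is a generic one-sided tabular CUSUM on synthetic sensor readings; the target mean, slack value k and decision interval h are illustrative assumptions, not parameters from the study.

    ```python
    # Tabular (one-sided upper) CUSUM sketch on synthetic sensor data.
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(10.0, 1.0, 60),      # in-control readings
                        rng.normal(11.5, 1.0, 40)])     # simulated upward shift

    target, k, h = 10.0, 0.5, 5.0   # assumed target mean, slack value, decision interval
    s_hi = 0.0
    for i, xi in enumerate(x):
        s_hi = max(0.0, s_hi + (xi - target - k))       # accumulate deviations above target + k
        if s_hi > h:
            print(f"Upward shift signalled at observation {i}, CUSUM = {s_hi:.2f}")
            break
    ```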

  10. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  11. Technological Tools in the Introductory Statistics Classroom: Effects on Student Understanding of Inferential Statistics

    ERIC Educational Resources Information Center

    Meletiou-Mavrotheris, Maria

    2004-01-01

    While technology has become an integral part of introductory statistics courses, the programs typically employed are professional packages designed primarily for data analysis rather than for learning. Findings from several studies suggest that use of such software in the introductory statistics classroom may not be very effective in helping…

  12. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  13. 41 CFR 60-2.35 - Compliance status.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... workforce (i.e., the employment of minorities or women at a percentage rate below, or above, the goal level... obligations will be determined by analysis of statistical data and other non-statistical information which...

  14. Education, Training, and Employment Outcomes: Analysis of a National Survey of Employers. Final Technical Report.

    ERIC Educational Resources Information Center

    Hollenbeck, Kevin

    A study examined the effect of education and training on the economy and on employment outcomes. Data collected during a 1982 nationwide telephone survey of 3,500 employers were used as the basis for statistical models of voluntary and involuntary job separations and job promotions. Four major conclusions resulted from the modeling process…

  15. The Impact of Social Capital on the Employment of College Graduates

    ERIC Educational Resources Information Center

    Fengqiao, Yan; Dan, Mao

    2015-01-01

    This article addresses the impact of social capital on college graduate employment. After reviewing the literature, the authors analyze data collected by Peking University from 34 universities in 2005 and use statistical analysis to clarify the impact of social capital on students' choice of employment or further study, job placement rate,…

  16. Bureau of Labor Statistics Employment Projections: Detailed Analysis of Selected Occupations and Industries. Report to the Honorable Berkley Bedell, United States House of Representatives.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    To compile its projections of future employment levels, the Bureau of Labor Statistics (BLS) combines the following five interlinked models in a six-step process: a labor force model, an econometric model of the U.S. economy, an industry activity model, an industry labor demand model, and an occupational labor demand model. The BLS was asked to…

  17. Measuring Efficiency and Tradeoffs in Attainment of EEO Goals.

    DTIC Science & Technology

    1982-02-01

    in FY78 and FY79, i.e., these goals are based on undifferentiated Civilian Labor Force (CLF) ratios required for reporting by the Equal Employment...Lewis and R.J. Niehaus, "Design and Development of Equal Employment Opportunity Human Resources Planning Models," NPRDC TR 79-141 (San Diego: Navy...Approach to Analysis of Tradeoffs Among Household Production Outputs," American Statistical Association 1979 Proceedings of the Social Statistics Section

  18. Developing Employability Skills via Extra-Curricular Activities in Vietnamese Universities: Student Engagement and Inhibitors of Their Engagement

    ERIC Educational Resources Information Center

    Tran, Le Huu Nghia

    2017-01-01

    This article reports a study that investigated student engagement and inhibitors of their engagement with developing employability skills via extra-curricular activities in Vietnamese universities. Content analysis of 18 interviews with students and statistical analysis of 423 students' responses to a paper-based survey showed that despite a…

  19. Probability and Statistics in Sensor Performance Modeling

    DTIC Science & Technology

    2010-12-01

    language software program is called Environmental Awareness for Sensor and Emitter Employment. Some important numerical issues in the implementation...Statistical analysis for measuring sensor performance...From the abbreviation list: complementary cumulative distribution function; cdf, cumulative distribution function; DST, decision-support tool; EASEE, Environmental Awareness of...

  20. Role of microstructure on twin nucleation and growth in HCP titanium: A statistical study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arul Kumar, M.; Wroński, M.; McCabe, Rodney James

    In this study, a detailed statistical analysis is performed using Electron Back Scatter Diffraction (EBSD) to establish the effect of microstructure on twin nucleation and growth in deformed commercial purity hexagonal close packed (HCP) titanium. Rolled titanium samples are compressed along rolling, transverse and normal directions to establish statistical correlations for {10–12}, {11–21}, and {11–22} twins. A recently developed automated EBSD-twinning analysis software is employed for the statistical analysis. Finally, the analysis provides the following key findings: (I) grain size and strain dependence is different for twin nucleation and growth; (II) twinning statistics can be generalized for the HCP metals magnesium, zirconium and titanium; and (III) complex microstructure, where grain shape and size distribution is heterogeneous, requires multi-point statistical correlations.

  1. Role of microstructure on twin nucleation and growth in HCP titanium: A statistical study

    DOE PAGES

    Arul Kumar, M.; Wroński, M.; McCabe, Rodney James; ...

    2018-02-01

    In this study, a detailed statistical analysis is performed using Electron Back Scatter Diffraction (EBSD) to establish the effect of microstructure on twin nucleation and growth in deformed commercial purity hexagonal close packed (HCP) titanium. Rolled titanium samples are compressed along rolling, transverse and normal directions to establish statistical correlations for {10–12}, {11–21}, and {11–22} twins. A recently developed automated EBSD-twinning analysis software is employed for the statistical analysis. Finally, the analysis provides the following key findings: (I) grain size and strain dependence is different for twin nucleation and growth; (II) twinning statistics can be generalized for the HCP metals magnesium, zirconium and titanium; and (III) complex microstructure, where grain shape and size distribution is heterogeneous, requires multi-point statistical correlations.

  2. The Multiplier Effect of the Development of Forest Park Tourism on Employment Creation in China

    ERIC Educational Resources Information Center

    Shuifa, Ke; Chenguang, Pan; Jiahua, Pan; Yan, Zheng; Ying, Zhang

    2011-01-01

    The focus of this article was employment creation by developing forest park tourism industries in China. Analysis of the statistical data and an input-output approach showed that 1 direct job opportunity in tourism industries created 1.15 other job opportunities. In the high, middle, and low scenarios, the total predicted employment in forest park…

  3. Rapid analysis of pharmaceutical drugs using LIBS coupled with multivariate analysis.

    PubMed

    Tiwari, P K; Awasthi, S; Kumar, R; Anand, R K; Rai, P K; Rai, A K

    2018-02-01

    Type 2 diabetes drug tablets containing voglibose, with dose strengths of 0.2 and 0.3 mg and from various brands, have been examined using the laser-induced breakdown spectroscopy (LIBS) technique. Statistical methods such as principal component analysis (PCA) and partial least squares regression (PLSR) have been employed on the LIBS spectral data to classify the drug samples and to develop calibration models. We have developed a ratio-based calibration model applying PLSR, in which the relative spectral intensity ratios H/C, H/N and O/N are used. Further, the developed model has been employed to predict the relative concentration of elements in unknown drug samples. The experiment has been performed in air and in an argon atmosphere, and the results obtained have been compared. The present model provides a rapid spectroscopic method for drug analysis with high statistical significance for online control and measurement processes in a wide variety of pharmaceutical industrial applications.
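
    For the PCA classification and PLSR calibration steps described above, a generic scikit-learn sketch is shown below on synthetic stand-in data; the columns standing in for the H/C, H/N and O/N intensity ratios are invented, and scikit-learn itself is an assumed substitute for whatever software the authors used.

    ```python
    # PCA for exploratory grouping and PLS regression for calibration (synthetic stand-in data).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(2)
    ratios = rng.normal(size=(40, 3))                 # stand-ins for H/C, H/N, O/N ratios
    conc = ratios @ np.array([0.8, -0.3, 0.5]) + rng.normal(scale=0.1, size=40)  # known concentration

    scores = PCA(n_components=2).fit_transform(ratios)   # 2-D scores for visual class separation
    print("PCA scores shape:", scores.shape)

    pls = PLSRegression(n_components=2)
    r2 = cross_val_score(pls, ratios, conc, cv=5, scoring="r2")
    print("Cross-validated R^2 of the ratio-based PLS calibration:", r2.mean().round(3))
    pls.fit(ratios, conc)
    print("Predicted concentration of an 'unknown' sample:", pls.predict(ratios[:1]).ravel())
    ```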

  4. Trends in study design and the statistical methods employed in a leading general medicine journal.

    PubMed

    Gosho, M; Sato, Y; Nagashima, K; Takahashi, S

    2018-02-01

    Study design and statistical methods have become core components of medical research, and the methodology has become more multifaceted and complicated over time. The study of the comprehensive details and current trends of study design and statistical methods is required to support the future implementation of well-planned clinical studies providing information about evidence-based medicine. Our purpose was to illustrate the study designs and statistical methods employed in recent medical literature. This was an extension study of Sato et al. (N Engl J Med 2017; 376: 1086-1087), which reviewed 238 articles published in 2015 in the New England Journal of Medicine (NEJM) and briefly summarized the statistical methods employed in NEJM. Using the same database, we performed a new investigation of the detailed trends in study design and individual statistical methods that were not reported in the Sato study. Owing to the CONSORT statement, prespecification and justification of sample size are obligatory in planning intervention studies. Although standard survival methods (e.g., the Kaplan-Meier estimator and the Cox regression model) were most frequently applied, the Gray test and the Fine-Gray proportional hazards model for considering competing risks were sometimes used for a more valid statistical inference. With respect to handling missing data, model-based methods, which are valid for missing-at-random data, were more frequently used than single imputation methods. Single imputation methods are not recommended as a primary analysis, but they have been applied in many clinical trials. Group sequential designs with interim analyses were among the standard designs, and novel designs, such as adaptive dose selection and sample size re-estimation, were sometimes employed in NEJM. Model-based approaches for handling missing data should replace single imputation methods for primary analysis in light of the information found in some publications. Use of adaptive designs with interim analyses is increasing after the presentation of the FDA guidance for adaptive design. © 2017 John Wiley & Sons Ltd.
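
    As a reminder of the standard survival tools the review tallies (Kaplan-Meier estimation and Cox regression), here is a minimal sketch using the lifelines package on simulated trial-like data; lifelines is my choice of library, not one named in the article, and the data are invented.

    ```python
    # Kaplan-Meier curve and Cox proportional hazards fit on simulated data (lifelines assumed).
    import numpy as np
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    rng = np.random.default_rng(3)
    n = 300
    arm = rng.integers(0, 2, n)                          # 0 = control, 1 = treatment
    time = rng.exponential(scale=np.where(arm == 1, 14, 10))
    event = (time < 24).astype(int)                      # administrative censoring at 24 months
    df = pd.DataFrame({"time": np.minimum(time, 24), "event": event, "arm": arm})

    kmf = KaplanMeierFitter()
    kmf.fit(df["time"], event_observed=df["event"])
    print(kmf.median_survival_time_)

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()                                   # hazard ratio for the treatment arm
    ```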

  5. An Analysis of Unemployment and Other Labor Market Indicators in 10 Countries.

    ERIC Educational Resources Information Center

    Moy, Joyanna

    1988-01-01

    Compares unemployment, employment, and related labor market statistics in the United States, Canada, Australia, Japan, France, Germany, Italy, the Netherlands, Sweden, and the United Kingdom. Introduces employment-to-population ratios by sex and discusses unemployment rates published by the Organization for Economic Cooperation and Development and…

  6. Quantitative investigation of inappropriate regression model construction and the importance of medical statistics experts in observational medical research: a cross-sectional study.

    PubMed

    Nojima, Masanori; Tokunaga, Mutsumi; Nagamura, Fumitaka

    2018-05-05

    To investigate under what circumstances inappropriate use of 'multivariate analysis' is likely to occur and to identify the population that needs more support with medical statistics. The frequency of inappropriate regression model construction in multivariate analysis and related factors were investigated in observational medical research publications. The inappropriate algorithm of using only variables that were significant in univariate analysis was estimated to occur in 6.4% of publications (95% CI 4.8% to 8.5%). This was observed in 1.1% of publications with a medical statistics expert (hereinafter 'expert') as the first author, in 3.5% where an expert was included as a coauthor and in 12.2% where experts were not involved. In publications where the number of cases was 50 or less and the study did not include experts, inappropriate algorithm usage was observed at a high proportion of 20.2%. The OR of the involvement of experts for this outcome was 0.28 (95% CI 0.15 to 0.53). A further, nation-level analysis showed that the involvement of experts and the implementation of unfavourable multivariate analysis are associated (R=-0.652). Based on the results of this study, the benefit of participation of medical statistics experts is obvious. Experts should be involved for proper confounding adjustment and interpretation of statistical models. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
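
    The "inappropriate algorithm" counted here, entering only variables that were significant in univariate analysis into the multivariable model, can be illustrated with a small simulation. The sketch below (synthetic data, statsmodels, an assumed cut-off of p < 0.05) shows how a variable with a real adjusted effect can be dropped by univariate screening.

    ```python
    # Why screening on univariate p-values can discard a variable that matters after adjustment.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 500
    z = rng.normal(size=n)                     # suppressor/confounder
    x = z + rng.normal(size=n)
    y = 2 * x - 2 * z + rng.normal(size=n)     # z has a real adjusted effect on y

    def pvals(y, cols):
        X = sm.add_constant(np.column_stack(cols))
        return sm.OLS(y, X).fit().pvalues[1:]   # p-values of the non-constant terms

    print("univariate p-value for z:  ", pvals(y, [z])[0])   # typically > 0.05 -> z screened out
    print("adjusted p-values for x, z:", pvals(y, [x, z]))   # z clearly significant once x is in
    ```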

  7. The Costs of Employing Older Workers. An Information Paper Prepared for Use by the Special Committee on Aging, United States Senate.

    ERIC Educational Resources Information Center

    Morrison, Malcolm; Rappaport, Anna

    Analysis of the costs of employing older workers indicates that some types of employment costs do vary by age and that overall compensation costs increase with age, largely because of increasing employee benefit costs. There is, however, no statistical evidence that direct salary costs increase by age on an economy-wide basis. The belief that…

  8. Qualification and Employment Opportunities. IAB Labour Market Research Topics No. 38.

    ERIC Educational Resources Information Center

    Rauch, Angela; Reinberg, Alexander

    Official German unemployment statistics were analyzed along with data from Germany's microcensus and other published sources to identify recent labor market trends and to clarify the relationship between qualifications and employment opportunities in the new German economy. The analysis revealed that, as has been true for years, the lower the…

  9. Assessing the Effectiveness of Statistical Classification Techniques in Predicting Future Employment of Participants in the Temporary Assistance for Needy Families Program

    ERIC Educational Resources Information Center

    Montoya, Isaac D.

    2008-01-01

    Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…
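
    Two of the three classifiers compared here (CART and discriminant analysis) have direct scikit-learn counterparts; CHAID does not ship with scikit-learn and is omitted from this rough sketch. The data are synthetic and the cross-validated accuracy comparison is only an assumed stand-in for the study's evaluation design.

    ```python
    # Comparing CART and linear discriminant analysis on synthetic "employment" data (CHAID omitted).
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=600, n_features=8, n_informative=4, random_state=0)

    for name, clf in [("CART", DecisionTreeClassifier(max_depth=4, random_state=0)),
                      ("LDA", LinearDiscriminantAnalysis())]:
        acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
        print(f"{name}: mean proportion correctly classified = {acc.mean():.3f}")
    ```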

  10. Selected Statistics on the Status of Asian-American Women

    ERIC Educational Resources Information Center

    Fong, Pauline; Cabezas, Amado

    1977-01-01

    Taken from a paper on "The Economic and Employment Status of Asian Women in America" by Pauline Fong and Amado Cabezas of ASIAN, Inc., this brief analysis of statistics on Asian women indicates that highly educated Asian women do not have higher incomes or better jobs than many of those with less education.

  11. Statistical approaches in published ophthalmic clinical science papers: a comparison to statistical practice two decades ago.

    PubMed

    Zhang, Harrison G; Ying, Gui-Shuang

    2018-02-09

    The aim of this study is to evaluate the current practice of statistical analysis of eye data in clinical science papers published in the British Journal of Ophthalmology (BJO) and to determine whether the practice of statistical analysis has improved in the past two decades. All clinical science papers (n=125) published in BJO in January-June 2017 were reviewed for their statistical analysis approaches for analysing the primary ocular measure. We compared our findings to the results from a previous paper that reviewed BJO papers in 1995. Of 112 papers eligible for analysis, half of the studies analysed the data at an individual level because of the nature of observation, 16 (14%) studies analysed data from one eye only, 36 (32%) studies analysed data from both eyes at the ocular level, one study (1%) analysed the overall summary of ocular findings per individual and three (3%) studies used a paired comparison. Among studies with data available from both eyes, 50 (89%) of 56 papers in 2017 did not analyse data from both eyes or ignored the intereye correlation, as compared with 60 (90%) of 67 papers in 1995 (P=0.96). Among studies that analysed data from both eyes at an ocular level, 33 (92%) of 36 studies completely ignored the intereye correlation in 2017, as compared with 16 (89%) of 18 studies in 1995 (P=0.40). A majority of studies did not analyse the data properly when data from both eyes were available. The practice of statistical analysis did not improve in the past two decades. Collaborative efforts should be made in the vision research community to improve the practice of statistical analysis for ocular data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
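
    One standard way to use data from both eyes without ignoring the intereye correlation is a marginal model clustered on patient. A minimal GEE sketch with statsmodels is shown below on simulated eye-level data; the exchangeable working correlation, the outcome and the variable names are illustrative assumptions, not choices made in the paper.

    ```python
    # GEE with an exchangeable working correlation: two eyes per patient, clustered by patient id.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    n_patients = 150
    patient = np.repeat(np.arange(n_patients), 2)          # two eyes per patient
    treated = rng.integers(0, 2, n_patients)[patient]      # patient-level exposure
    patient_effect = rng.normal(scale=1.0, size=n_patients)[patient]
    iop = 15 + 1.5 * treated + patient_effect + rng.normal(scale=1.0, size=2 * n_patients)
    df = pd.DataFrame({"iop": iop, "treated": treated, "patient": patient})

    model = smf.gee("iop ~ treated", groups="patient", data=df,
                    cov_struct=sm.cov_struct.Exchangeable(),
                    family=sm.families.Gaussian())
    print(model.fit().summary())                            # SEs account for intereye correlation
    ```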

  12. Cluster Analysis of Minnesota School Districts. A Research Report.

    ERIC Educational Resources Information Center

    Cleary, James

    The term "cluster analysis" refers to a set of statistical methods that classify entities with similar profiles of scores on a number of measured dimensions, in order to create empirically based typologies. A 1980 Minnesota House Research Report employed cluster analysis to categorize school districts according to their relative mixtures…

  13. Do-it-yourself statistics: A computer-assisted likelihood approach to analysis of data from genetic crosses.

    PubMed Central

    Robbins, L G

    2000-01-01

    Graduate school programs in genetics have become so full that courses in statistics have often been eliminated. In addition, typical introductory statistics courses for the "statistics user" rather than the nascent statistician are laden with methods for analysis of measured variables while genetic data are most often discrete numbers. These courses are often seen by students and genetics professors alike as largely irrelevant cookbook courses. The powerful methods of likelihood analysis, although commonly employed in human genetics, are much less often used in other areas of genetics, even though current computational tools make this approach readily accessible. This article introduces the MLIKELY.PAS computer program and the logic of do-it-yourself maximum-likelihood statistics. The program itself, course materials, and expanded discussions of some examples that are only summarized here are available at http://www.unisi.it/ricerca/dip/bio_evol/sitomlikely/mlikely.html. PMID:10628965
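
    The program itself is not reproduced here, but the flavor of a likelihood analysis of discrete genetic data can be sketched with a G-test (likelihood-ratio test) of observed F2 counts against an expected 9:3:3:1 Mendelian ratio. The counts below are made up for illustration, and scipy is an assumed tool, not part of MLIKELY.PAS.

    ```python
    # Likelihood-ratio (G) test of made-up F2 counts against a 9:3:3:1 Mendelian expectation.
    import numpy as np
    from scipy.stats import chi2

    observed = np.array([556, 184, 193, 61])                 # hypothetical phenotype counts
    expected = observed.sum() * np.array([9, 3, 3, 1]) / 16.0

    G = 2.0 * np.sum(observed * np.log(observed / expected)) # 2 * log-likelihood ratio
    p = chi2.sf(G, df=len(observed) - 1)
    print(f"G = {G:.2f}, p = {p:.3f}")                        # large p: data consistent with 9:3:3:1
    ```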

  14. In-school service predictors of employment for individuals with intellectual disability.

    PubMed

    Park, Jiyoon; Bouck, Emily

    2018-06-01

    Although there are many secondary data analyses of the National Longitudinal Transition Study-2 (NLTS-2) investigating post-school outcomes for students with disabilities, there has been a lack of research on in-school service predictors and post-school outcomes for students with specific disability categories. This study was a secondary data analysis of NLTS-2 to investigate the relationship between current employment status and in-school services for individuals with intellectual disability. Statistical methods such as descriptive statistics and logistic regression were used to analyze the NLTS-2 data set. The main findings were that in-school services were correlated with current employment status, and that primary disability (i.e., mild intellectual disability and moderate/severe intellectual disability) was associated with current employment status. In-school services are critical in predicting current employment for individuals with intellectual disability. Also, the data suggest additional research is needed to investigate various in-school services and variables that could predict employment differences between individuals with mild and moderate/severe intellectual disability. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. The Labor Market in the Regions of Belarus: An Analysis of Employment Tendencies

    ERIC Educational Resources Information Center

    Sokolova, G. N.

    2013-01-01

    In Belarus, the ways in which statistics are compiled, the complex rules for registering as unemployed, and the segmentation of the labor market and job-seeking activities, all combine to hide the actual levels of employment and unemployment. This in turn makes it difficult to develop appropriate and effective labor policies, and to have support…

  16. An information-theoretic approach to the modeling and analysis of whole-genome bisulfite sequencing data.

    PubMed

    Jenkinson, Garrett; Abante, Jordi; Feinberg, Andrew P; Goutsias, John

    2018-03-07

    DNA methylation is a stable form of epigenetic memory used by cells to control gene expression. Whole-genome bisulfite sequencing (WGBS) has emerged as a gold-standard experimental technique for studying DNA methylation by producing high resolution genome-wide methylation profiles. Statistical modeling and analysis are employed to computationally extract and quantify information from these profiles in an effort to identify regions of the genome that demonstrate crucial or aberrant epigenetic behavior. However, the performance of most currently available methods for methylation analysis is hampered by their inability to directly account for statistical dependencies between neighboring methylation sites, thus ignoring significant information available in WGBS reads. We present a powerful information-theoretic approach for genome-wide modeling and analysis of WGBS data based on the 1D Ising model of statistical physics. This approach takes into account correlations in methylation by utilizing a joint probability model that encapsulates all information available in WGBS methylation reads and produces accurate results even when applied on single WGBS samples with low coverage. Using the Shannon entropy, our approach provides a rigorous quantification of methylation stochasticity in individual WGBS samples genome-wide. Furthermore, it utilizes the Jensen-Shannon distance to evaluate differences in methylation distributions between a test and a reference sample. Differential performance assessment using simulated and real human lung normal/cancer data demonstrates a clear superiority of our approach over DSS, a recently proposed method for WGBS data analysis. Critically, these results demonstrate that marginal methods become statistically invalid when correlations are present in the data. This contribution demonstrates clear benefits and the necessity of modeling joint probability distributions of methylation using the 1D Ising model of statistical physics and of quantifying methylation stochasticity using concepts from information theory. By employing this methodology, substantial improvement of DNA methylation analysis can be achieved by effectively taking into account the massive amount of statistical information available in WGBS data, which is largely ignored by existing methods.
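
    Two of the information-theoretic quantities named above, the Shannon entropy of a methylation-level distribution and the Jensen-Shannon distance between a test and a reference distribution, can be computed directly with scipy. The toy distributions below are invented; this is not the authors' Ising-model machinery, only the summary quantities it builds on.

    ```python
    # Shannon entropy and Jensen-Shannon distance for toy methylation-level distributions.
    import numpy as np
    from scipy.stats import entropy
    from scipy.spatial.distance import jensenshannon

    # Probabilities over discretized methylation levels (toy values; each sums to 1).
    reference = np.array([0.70, 0.15, 0.10, 0.05])
    test      = np.array([0.30, 0.25, 0.25, 0.20])

    print("entropy (bits), reference:", entropy(reference, base=2))
    print("entropy (bits), test:     ", entropy(test, base=2))
    print("Jensen-Shannon distance:  ", jensenshannon(reference, test, base=2))
    ```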

  17. Role strain among male RNs in the critical care setting: Perceptions of an unfriendly workplace.

    PubMed

    Carte, Nicholas S; Williams, Collette

    2017-12-01

    Traditionally, nursing has been a female-dominated profession. Men employed as registered nurses have been in the minority, and little is known about the experiences of this demographic. The purpose of this descriptive, quantitative study was to understand the relationship between demographic variables and causes of role strain among male nurses in critical care settings. The Sherrod Role Strain Scale assesses role strain within the context of role conflict, role overload, role ambiguity and role incongruity. Data analysis of the results included descriptive and inferential statistics. Inferential statistics involved the use of repeated measures ANOVA to test for significant differences in the causes of role strain among male nurses employed in critical care settings and a post hoc comparison of specific demographic data using multivariate analyses of variance (MANOVAs). Data from 37 male nurses in critical care settings in the northeast of the United States were used to calculate descriptive statistics (standard deviations and means) and the results of the repeated measures ANOVA and the post hoc secondary MANOVA analysis. The descriptive data showed that all participants worked full-time. There was a nearly even split between participants who worked day shift (46%) and night shift (43%), and most of the participants indicated they had 15 years or more of experience as a registered nurse (54%). Significant findings of this study include two causes of role strain in male nurses employed in critical care settings: role ambiguity and role overload based on ethnicity. Consistent with previous research findings, the results of this study suggest that male registered nurses employed in critical care settings do experience role strain. The two main causes of role strain in male nurses are role ambiguity and role overload. Copyright © 2017. Published by Elsevier Ltd.

  18. Intraoperative optical biopsy for brain tumors using spectro-lifetime properties of intrinsic fluorophores

    NASA Astrophysics Data System (ADS)

    Vasefi, Fartash; Kittle, David S.; Nie, Zhaojun; Falcone, Christina; Patil, Chirag G.; Chu, Ray M.; Mamelak, Adam N.; Black, Keith L.; Butte, Pramod V.

    2016-04-01

    We have developed and tested a system for real-time intraoperative optical identification and classification of brain tissues using time-resolved fluorescence spectroscopy (TRFS). A supervised learning algorithm using linear discriminant analysis (LDA), employing selected intrinsic fluorescence decay temporal points in 6 spectral bands, was used to maximize the statistical significance of differences between training groups. The linear discriminant analysis of in vivo human tissues obtained by TRFS measurements (N = 35) was validated by histopathologic analysis and by neuronavigation correlation to pre-operative MRI images. These results demonstrate that TRFS can differentiate between normal cortex, white matter and glioma.

  19. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.

  20. Analysis of Publications and Citations from a Geophysics Research Institute.

    ERIC Educational Resources Information Center

    Frohlich, Cliff; Resler, Lynn

    2001-01-01

    Performs an analysis of all 1128 publications produced by scientists during their employment at the University of Texas Institute for Geophysics, thus assessing research performance using as bibliometric indicators such statistics as publications per year, citations per paper, and cited half-lives. Evaluates five different methods for determining…

  1. The Hard but Necessary Task of Gathering Order-One Effect Size Indices in Meta-Analysis

    ERIC Educational Resources Information Center

    Ortego, Carmen; Botella, Juan

    2010-01-01

    Meta-analysis of studies with two groups and two measurement occasions must employ order-one effect size indices to represent study outcomes. Especially with non-random assignment, non-equivalent control group designs, a statistical analysis restricted to post-treatment scores can lead to severely biased conclusions. The 109 primary studies…

  2. The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis

    ERIC Educational Resources Information Center

    Buri, Olga Elizabeth Minchala; Stefos, Efstathios

    2017-01-01

    The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and a multidimensional statistical analysis were carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…

  3. The Outlook for Technological Change and Employment. Technology and the American Economy, Appendix Volume I.

    ERIC Educational Resources Information Center

    National Commission on Technology, Automation and Economic Progress, Washington, DC.

    Findings of a study of the nation's manpower requirements to 1975 are presented. Part I, on the employment outlook, consists of a 10-year projection of manpower requirements by occupation and by industry prepared by the Bureau of Labor Statistics and an analysis of the growth prospects and the state of fiscal policy in the United States economy as…

  4. Monitoring of an antigen manufacturing process.

    PubMed

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih

    2016-06-01

    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated employing principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Then, partial least squares (PLS) regression was employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content of these samples measured by a Kjeldahl test. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could be further used as a method to predict the final protein content.

  5. Spatial analysis on future housing markets: economic development and housing implications.

    PubMed

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand.

  6. Spatial Analysis on Future Housing Markets: Economic Development and Housing Implications

    PubMed Central

    Liu, Xin; Wang, Lizhe

    2014-01-01

    A coupled projection method combining formal modelling and other statistical techniques was developed to delineate the relationship between economic and social drivers for net new housing allocations. Using the example of employment growth in Tyne and Wear, UK, until 2016, the empirical analysis yields housing projections at the macro- and microspatial levels (e.g., region to subregion to elected ward levels). The results have important implications for the strategic planning of locations for housing and employment, demonstrating both intuitively and quantitatively how local economic developments affect housing demand. PMID:24892097

  7. Turnover Among Air Force Nurses.

    DTIC Science & Technology

    1987-03-01

    Statistics. - The U.S. Air Force Institute of Technology, Civilian Institutions, Allied Health Branch, for their assistance and partial financial support...The University of Utah, Computer Center, for their financial support and use of computer equipment used in the statistical analysis. The following...retain the nurses currently employed (Decker, et al., 1982; Weisman, 1982). White (1980) concludes that nursing staff is a considerable financial

  8. Employment of patients receiving maintenance dialysis and after kidney transplant: a cross-sectional study from Finland.

    PubMed

    Helanterä, Ilkka; Haapio, Mikko; Koskinen, Petri; Grönhagen-Riska, Carola; Finne, Patrik

    2012-05-01

    Associations between mode of renal replacement therapy and employment rate have not been well characterized. Cross-sectional registry analysis. The employment status of all prevalent 15- to 64-year-old dialysis and kidney transplant patients in Finland at the end of 2007 (N = 2,637) was analyzed by combining data from the Finnish Registry for Kidney Diseases with individual-level employment statistics of the Finnish government. Prevalence rate ratios (PRRs) of employment according to treatment modality with adjustment for age, sex, cause of end-stage renal disease (ESRD), duration of ESRD, and comorbid conditions were estimated using Cox regression with a constant time at risk. Employment status of patients on dialysis therapy or after transplant. Clinical data were collected from the Finnish Registry for Kidney Diseases, and employment data were acquired from Statistics Finland. 19% of hemodialysis patients, 31% of peritoneal dialysis patients, and 40% of patients with a functioning transplant were employed; the overall employment rate for the Finnish population aged 15-64 years is 67%. Home hemodialysis patients and those treated with automated peritoneal dialysis had employment rates of 39% and 44%, respectively. In adjusted analysis, patients on home hemodialysis therapy (PRR, 1.87), on automated peritoneal dialysis therapy (PRR, 2.14), or with a kidney transplant (PRR, 2.30) had higher probabilities of employment than in-center hemodialysis patients. Patients with type 1 or 2 diabetes as the cause of ESRD had the lowest probability of employment (PRR, 0.48-0.60 compared with glomerulonephritis). Patients aged 25-54 years more frequently were employed than those younger than 25 or older than 54 years. Sex did not predict employment. For transplant recipients, longer time since transplant was associated with higher employment in addition to the mentioned factors. Cross-sectional design. Employment rate of home dialysis patients was similar to that of transplant recipients and higher than that of in-center hemodialysis patients. Patients with diabetes were less likely to be employed. Copyright © 2012 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
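
    The prevalence rate ratios here were estimated with Cox regression using a constant time at risk; an equivalent and perhaps more familiar recipe is modified Poisson regression with robust standard errors. The sketch below uses that alternative on simulated data, so both the data and the substitution of Poisson for Cox are my assumptions.

    ```python
    # Modified Poisson regression with robust standard errors for prevalence rate ratios.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    n = 2000
    home_dialysis = rng.integers(0, 2, n)                  # simulated exposure
    p_employed = np.where(home_dialysis == 1, 0.40, 0.20)  # assumed prevalences, loosely echoing the reported rates
    employed = rng.binomial(1, p_employed)
    df = pd.DataFrame({"employed": employed, "home_dialysis": home_dialysis})

    fit = smf.glm("employed ~ home_dialysis", data=df,
                  family=sm.families.Poisson()).fit(cov_type="HC1")
    print(np.exp(fit.params))           # exp(coef) ~ prevalence rate ratio (about 2 here by construction)
    print(fit.conf_int().apply(np.exp))
    ```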

  9. Statistical significance of trace evidence matches using independent physicochemical measurements

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George

    1997-02-01

    A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to,' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when that object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created and the data in question can be evaluated given these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons is also discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach, and ways of overcoming these obstacles are presented.
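
    A bare-bones version of the chemometric grouping step described here, standardizing refractive index and elemental concentrations and then clustering, might look like the following; the fragment values are fabricated and Ward hierarchical clustering is only one of several algorithms the authors could have meant.

    ```python
    # Hierarchical (Ward) clustering of glass fragments on standardized physicochemical measurements.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Columns: refractive index, Mg, Al, Ca, Fe (fabricated values forming two loose groups).
    fragments = np.array([
        [1.5181, 3.6, 1.3, 8.7, 0.06],
        [1.5180, 3.5, 1.4, 8.8, 0.05],
        [1.5223, 0.2, 2.1, 9.9, 0.24],
        [1.5221, 0.3, 2.0, 9.8, 0.26],
    ])
    z = (fragments - fragments.mean(axis=0)) / fragments.std(axis=0)   # standardize each variable

    labels = fcluster(linkage(z, method="ward"), t=2, criterion="maxclust")
    print("cluster membership:", labels)    # fragments 1-2 and 3-4 should group separately
    ```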

  10. U.S. Manufacturing: Federal Programs Reported Providing Support and Addressing Trends

    DTIC Science & Technology

    2017-03-28

    From the abbreviation list: Bureau of Labor Statistics; CDC, Certified Development Company; CES, Center for Economic Studies; CFDA, Catalog of Federal Domestic Assistance...nation, such as employing 12.3 million U.S. workers and generating $2.2 trillion in economic activity in 2015. U.S. manufacturing is comprised of...Manufacturing, NAICS 31-33: Employment, all employees (seasonally adjusted), 1945-2016; and Bureau of Economic Analysis, GDP by Industry, 1947

  11. An Investigation of Civilians Preparedness to Compete with Individuals with Military Experience for Army Board Select Acquisition Positions

    DTIC Science & Technology

    2017-05-25

    Research Design...research employed a mixed research methodology – quantitative with descriptive statistical analysis and qualitative with a thematic analysis approach...mixed research methodology – quantitative and qualitative, using interviews to collect the data. The interviews included demographic and open-ended

  12. Inquiring the Most Critical Teacher's Technology Education Competences in the Highest Efficient Technology Education Learning Organization

    ERIC Educational Resources Information Center

    Yung-Kuan, Chan; Hsieh, Ming-Yuan; Lee, Chin-Feng; Huang, Chih-Cheng; Ho, Li-Chih

    2017-01-01

    Under the hyper-dynamic education situation, this research, in order to comprehensively explore the interplays between Teacher Competence Demands (TCD) and Learning Organization Requests (LOR), cross-employs the data-refinement methods of Descriptive Statistics (DS), Analysis of Variance (ANOVA) and Principal Components Analysis (PCA)…

  13. Teaching for Art Criticism: Incorporating Feldman's Critical Analysis Learning Model in Students' Studio Practice

    ERIC Educational Resources Information Center

    Subramaniam, Maithreyi; Hanafi, Jaffri; Putih, Abu Talib

    2016-01-01

    This study adopted 30 first-year graphic design students' artwork for critical analysis using Feldman's model of art criticism. Data were analyzed quantitatively; descriptive statistical techniques were employed. The scores were viewed in the form of mean scores and frequencies to determine students' performance in their critical ability…

  14. Vocational Preparation for Women: A Critical Analysis.

    ERIC Educational Resources Information Center

    Steiger, JoAnn

    In this analysis of vocational preparation for women material is presented to substantiate the claim that women are joining the labor force in increasing numbers and their career opportunities are expanding, but that the educational system has failed to respond. Statistical data is cited showing that women have traditionally been employed in just…

  15. Statistics for demodulation RFI in inverting operational amplifier circuits

    NASA Astrophysics Data System (ADS)

    Sutu, Y.-H.; Whalen, J. J.

    An investigation was conducted with the objective of determining statistical variations in RFI demodulation responses in operational amplifier (op amp) circuits. Attention is given to the experimental procedures employed, a three-stage op amp LED experiment, NCAP (Nonlinear Circuit Analysis Program) simulations of demodulation RFI in 741 op amps, and a comparison of RFI in four op amp types. Three major recommendations for future investigations are presented on the basis of the results obtained. One is concerned with conducting additional measurements of demodulation RFI in inverting amplifiers, while another suggests the employment of an automatic measurement system. It is also proposed to conduct additional NCAP simulations in which parasitic effects are accounted for more thoroughly.

  16. Duration on unemployment: geographic mobility and selectivity bias.

    PubMed

    Goss, E P; Paul, C; Wilhite, A

    1994-01-01

    Modeling the factors affecting the duration of unemployment was found to be influenced by the inclusion of migration factors. Traditional models which did not control for migration factors were found to underestimate movers' probability of finding an acceptable job. The empirical test of the theory, based on the analysis of data on US household heads unemployed in 1982 and employed in 1982 and 1983, found that the cumulative probability of reemployment after 30 weeks of searching was .422 in the traditional model and .624 in the migration selectivity model. In addition, controlling for selectivity eliminated the significance of the relationship between race and job search duration in the model. The relationship between search duration and the county unemployment rate in 1982 became statistically significant, and the relationship between search duration and 1980 population per square mile in the 1982 county of residence became statistically insignificant. The finding that non-Whites have a longer duration of unemployment can better be understood in terms of non-Whites' lower geographic mobility and lack of greater job contacts. The statistical significance of a high unemployment rate in the home labor market reducing the probability of finding employment was more in keeping with expectations. The findings assumed that the duration of unemployment accurately reflected the length of job search. The sample was redrawn to exclude discouraged workers and the analysis was repeated. The findings were similar to those for the full sample, with the coefficient for the migration variable being negative and statistically significant and the coefficient for alpha remaining positive and statistically significant. Race in the selectivity model remained statistically insignificant. The findings supported the Schwartz model hypothesizing that expansion of the radius of the search would reduce the duration of unemployment. The exclusion of the migration factor misspecified the equation for unemployment duration. Policy should be directed to the problems of geographic mobility, particularly among non-Whites.

  17. Statistical software applications used in health services research: analysis of published studies in the U.S

    PubMed Central

    2011-01-01

    Background This study aims to identify the statistical software applications most commonly employed for data analysis in health services research (HSR) studies in the U.S. The study also examines the extent to which information describing the specific analytical software utilized is provided in published articles reporting on HSR studies. Methods Data were extracted from a sample of 1,139 articles (including 877 original research articles) published between 2007 and 2009 in three U.S. HSR journals, that were considered to be representative of the field based upon a set of selection criteria. Descriptive analyses were conducted to categorize patterns in statistical software usage in those articles. The data were stratified by calendar year to detect trends in software use over time. Results Only 61.0% of original research articles in prominent U.S. HSR journals identified the particular type of statistical software application used for data analysis. Stata and SAS were overwhelmingly the most commonly used software applications employed (in 46.0% and 42.6% of articles respectively). However, SAS use grew considerably during the study period compared to other applications. Stratification of the data revealed that the type of statistical software used varied considerably by whether authors were from the U.S. or from other countries. Conclusions The findings highlight a need for HSR investigators to identify more consistently the specific analytical software used in their studies. Knowing that information can be important, because different software packages might produce varying results, owing to differences in the software's underlying estimation methods. PMID:21977990

  18. A critique of the usefulness of inferential statistics in applied behavior analysis

    PubMed Central

    Hopkins, B. L.; Cole, Brian L.; Mason, Tina L.

    1998-01-01

    Researchers continue to recommend that applied behavior analysts use inferential statistics in making decisions about effects of independent variables on dependent variables. In many other approaches to behavioral science, inferential statistics are the primary means for deciding the importance of effects. Several possible uses of inferential statistics are considered. Rather than being an objective means for making decisions about effects, as is often claimed, inferential statistics are shown to be subjective. It is argued that the use of inferential statistics adds nothing to the complex and admittedly subjective nonstatistical methods that are often employed in applied behavior analysis. Attacks on inferential statistics that are being made, perhaps with increasing frequency, by those who are not behavior analysts, are discussed. These attackers are calling for banning the use of inferential statistics in research publications and commonly recommend that behavioral scientists should switch to using statistics aimed at interval estimation or the method of confidence intervals. Interval estimation is shown to be contrary to the fundamental assumption of behavior analysis that only individuals behave. It is recommended that authors who wish to publish the results of inferential statistics be asked to justify them as a means for helping us to identify any ways in which they may be useful. PMID:22478304

  19. The economic impact of Mexico City's smoke-free law.

    PubMed

    López, Carlos Manuel Guerrero; Ruiz, Jorge Alberto Jiménez; Shigematsu, Luz Myriam Reynales; Waters, Hugh R

    2011-07-01

    To evaluate the economic impact of Mexico City's 2008 smoke-free law--the Non-Smokers' Health Protection Law--on restaurants, bars and nightclubs. We used the Monthly Services Survey of businesses from January 2005 to April 2009, with revenues, employment and payments to employees as the principal outcomes. The results are estimated using a differences-in-differences regression model with fixed effects. The states of Jalisco, Nuevo León and México, where the law was not in effect, serve as a counterfactual comparison group. In restaurants, after accounting for observable factors and the fixed effects, there was a 24.8% increase in restaurants' revenue associated with the smoke-free law. This difference is not statistically significant but shows that, on average, restaurants did not suffer economically as a result of the law. Total wages increased by 28.2% and employment increased by 16.2%. In nightclubs, bars and taverns there was a decrease of 1.5% in revenues and increases of 0.1% and 3.0%, respectively, in wages and employment. None of these effects are statistically significant in multivariate analysis. There is no statistically significant evidence that the Mexico City smoke-free law had a negative impact on restaurants' income, employees' wages and levels of employment. On the contrary, the results show a positive, though statistically non-significant, impact of the law on most of these outcomes. Mexico City's experience suggests that smoke-free laws in Mexico and elsewhere will not hurt economic productivity in the restaurant and bar industries.
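
    The differences-in-differences estimate described above compares changes in outcomes for Mexico City businesses against the comparison states before and after the law. A generic fixed-effects DiD regression in statsmodels is sketched below on simulated panel data; the variable names and effect size are invented, not taken from the survey.

    ```python
    # Differences-in-differences with unit and month fixed effects on simulated panel data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(7)
    units, months = 80, 24
    df = pd.DataFrame([(u, m) for u in range(units) for m in range(months)],
                      columns=["unit", "month"])
    df["treated"] = (df["unit"] < 40).astype(int)          # city businesses vs comparison states
    df["post"] = (df["month"] >= 12).astype(int)           # law in effect from month 12 onward
    df["log_revenue"] = (0.002 * df["unit"] + 0.01 * df["month"]
                         + 0.20 * df["treated"] * df["post"]
                         + rng.normal(scale=0.1, size=len(df)))

    fit = smf.ols("log_revenue ~ treated:post + C(unit) + C(month)", data=df).fit()
    print(fit.params["treated:post"])                      # DiD estimate, ~0.20 by construction
    ```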

  20. A wavelet-based statistical analysis of FMRI data: I. motivation and data distribution modeling.

    PubMed

    Dinov, Ivo D; Boscardin, John W; Mega, Michael S; Sowell, Elizabeth L; Toga, Arthur W

    2005-01-01

    We propose a new method for statistical analysis of functional magnetic resonance imaging (fMRI) data. The discrete wavelet transformation is employed as a tool for efficient and robust signal representation. We use structural magnetic resonance imaging (MRI) and fMRI to empirically estimate the distribution of the wavelet coefficients of the data both across individuals and spatial locations. An anatomical subvolume probabilistic atlas is used to tessellate the structural and functional signals into smaller regions each of which is processed separately. A frequency-adaptive wavelet shrinkage scheme is employed to obtain essentially optimal estimations of the signals in the wavelet space. The empirical distributions of the signals on all the regions are computed in a compressed wavelet space. These are modeled by heavy-tail distributions because their histograms exhibit slower tail decay than the Gaussian. We discovered that the Cauchy, Bessel K Forms, and Pareto distributions provide the most accurate asymptotic models for the distribution of the wavelet coefficients of the data. Finally, we propose a new model for statistical analysis of functional MRI data using this atlas-based wavelet space representation. In the second part of our investigation, we will apply this technique to analyze a large fMRI dataset involving repeated presentation of sensory-motor response stimuli in young, elderly, and demented subjects.
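    As an illustration of the wavelet-shrinkage idea described above (not the atlas-based pipeline itself), a minimal sketch using PyWavelets and SciPy might denoise a one-dimensional signal with a universal soft threshold and fit a heavy-tailed Cauchy model to the pooled detail coefficients:

    ```python
    # Illustrative sketch of wavelet-domain shrinkage and heavy-tailed modeling of the
    # coefficients; a 1-D signal is assumed for simplicity.
    import numpy as np
    import pywt
    from scipy import stats

    def shrink_and_fit(signal: np.ndarray, wavelet: str = "db4", level: int = 4):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        # Universal (VisuShrink) threshold estimated from the finest detail level.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thresh = sigma * np.sqrt(2.0 * np.log(signal.size))
        denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
        # Fit a heavy-tailed (Cauchy) model to the pooled detail coefficients.
        pooled = np.concatenate([c.ravel() for c in coeffs[1:]])
        loc, scale = stats.cauchy.fit(pooled)
        return pywt.waverec(denoised, wavelet), (loc, scale)
    ```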

  1. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    PubMed

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance detection (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality were used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. However, it was also shown that significant synergy existed between certain parameters, and as such it was possible to interchange certain descriptors (i.e. molecular weight and melting point) without incurring a loss of model quality. Such synergy suggested that a model constructed from discrete terms in an equation may not be the most appropriate way of representing mechanistic understandings of skin absorption.
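    The authors' models were built in MatLab; as a rough analogue of Gaussian process regression with automatic relevance detection, one can fit an anisotropic RBF kernel whose learned per-descriptor length scales indicate relevance. The descriptor and target names below are assumptions, not the study's data:

    ```python
    # Hedged sketch of ARD-style Gaussian process regression for a permeability dataset.
    # X columns are assumed physicochemical descriptors (e.g. log P, melting point,
    # hydrogen bond donor count); y is an assumed log permeability coefficient.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    def fit_ard_gpr(X: np.ndarray, y: np.ndarray) -> GaussianProcessRegressor:
        kernel = RBF(length_scale=np.ones(X.shape[1])) + WhiteKernel(noise_level=1.0)
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
        # After fitting, a large learned length scale for a descriptor suggests it has
        # little influence on the prediction (the ARD interpretation).
        return gpr
    ```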

  2. An analysis of job placement patterns of black and non-black male and female undergraduates at the University of Virginia and Hampton Institute. Ph.D. Thesis - Virginia Univ.

    NASA Technical Reports Server (NTRS)

    Anderson, A. F.

    1974-01-01

    Research questions were proposed to determine the relationship between independent variables (race, sex, and institution attended) and dependent variables (number of job offers received, salary received, and willingness to recommend source of employer contact). The control variables were academic major, grade point average, placement registration, nonemployment activity, employer, and source of employer contact. An analysis of the results revealed no statistical significance of the institution attended as a predictor of job offers or salary, although significant relationships were found between race and sex and number of job offers received. It was found that academic major, grade point average, and source of employer contact were more useful than race in the prediction of salary. Sex and nonemployment activity were found to be the most important variables in the model. The analysis also indicated that Black students received more job offers than non-Black students.

  3. Event time analysis of longitudinal neuroimage data.

    PubMed

    Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce

    2014-08-15

    This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.
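    A hedged two-step sketch of this kind of analysis (not the FreeSurfer or spatial-LME implementation) could use statsmodels for the linear mixed effects step and lifelines for the extended Cox step, with hypothetical column names:

    ```python
    # Two-step sketch: LME for longitudinal imaging, then an extended Cox model with a
    # time-dependent covariate. Column names are assumptions for illustration.
    import pandas as pd
    import statsmodels.formula.api as smf
    from lifelines import CoxTimeVaryingFitter

    def fit_two_step(long_df: pd.DataFrame, surv_df: pd.DataFrame):
        # Step 1: subject-specific trajectories of an imaging biomarker (e.g. volume).
        lme = smf.mixedlm("biomarker ~ time", long_df,
                          groups=long_df["subject"], re_formula="~time").fit()
        # Step 2: extended Cox regression; surv_df holds (start, stop] intervals per
        # subject with the time-dependent biomarker value and an event indicator.
        ctv = CoxTimeVaryingFitter()
        ctv.fit(surv_df, id_col="subject", event_col="event",
                start_col="start", stop_col="stop")
        return lme, ctv
    ```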

  4. Multidimensional Rasch Analysis of a Psychological Test with Multiple Subtests: A Statistical Solution for the Bandwidth-Fidelity Dilemma

    ERIC Educational Resources Information Center

    Cheng, Ying-Yao; Wang, Wen-Chung; Ho, Yi-Hui

    2009-01-01

    Educational and psychological tests are often composed of multiple short subtests, each measuring a distinct latent trait. Unfortunately, short subtests suffer from low measurement precision, which makes the bandwidth-fidelity dilemma inevitable. In this study, the authors demonstrate how a multidimensional Rasch analysis can be employed to take…

  5. Automated Box-Cox Transformations for Improved Visual Encoding.

    PubMed

    Maciejewski, Ross; Pattath, Avin; Ko, Sungahn; Hafen, Ryan; Cleveland, William S; Ebert, David S

    2013-01-01

    The concept of preconditioning data (utilizing a power transformation as an initial step) for analysis and visualization is well established within the statistical community and is employed as part of statistical modeling and analysis. Such transformations condition the data to various inherent assumptions of statistical inference procedures, as well as making the data more symmetric and easier to visualize and interpret. In this paper, we explore the use of the Box-Cox family of power transformations to semiautomatically adjust visual parameters. We focus on time-series scaling, axis transformations, and color binning for choropleth maps. We illustrate the usage of this transformation through various examples, and discuss the value and some issues in semiautomatically using these transformations for more effective data visualization.
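    A minimal sketch of the basic idea, assuming skewed positive-valued map data and hypothetical function names (this is not the paper's semiautomatic system):

    ```python
    # Box-Cox preconditioning before choropleth color binning.
    import numpy as np
    from scipy import stats

    def boxcox_color_bins(values: np.ndarray, n_bins: int = 5):
        values = np.asarray(values, dtype=float)
        shifted = values - values.min() + 1e-9          # Box-Cox needs positive input
        transformed, lam = stats.boxcox(shifted)
        # Equal-width bins in the transformed (more symmetric) space.
        edges = np.linspace(transformed.min(), transformed.max(), n_bins + 1)
        bins = np.clip(np.digitize(transformed, edges[1:-1]), 0, n_bins - 1)
        return bins, lam
    ```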

  6. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.

    PubMed

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J; Intarapanich, Apichart; Tongsima, Sissades; Piriyapongsa, Jittima

    2017-01-01

    Biochemical methods are available for enriching 5' ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends from these data by statistical analysis of the enrichment. Although statistical-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5'ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER).
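    Purely as a hedged illustration of the underlying statistical idea (per-nucleotide enrichment ratios, a Box-Cox transformation toward normality, then z-score thresholding), a simplified sketch might be written as follows; it is not the ToNER implementation:

    ```python
    # Simplified enrichment scoring between an enriched and an unenriched library.
    import numpy as np
    from scipy import stats

    def enrichment_zscores(enriched: np.ndarray, unenriched: np.ndarray,
                           pseudocount: float = 1.0) -> np.ndarray:
        ratio = (enriched + pseudocount) / (unenriched + pseudocount)
        transformed, _ = stats.boxcox(ratio)            # fit lambda, push toward normal
        z = (transformed - transformed.mean()) / transformed.std(ddof=1)
        return z                                        # large z => candidate 5' end
    ```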

  7. An Exploratory Data Analysis System for Support in Medical Decision-Making

    PubMed Central

    Copeland, J. A.; Hamel, B.; Bourne, J. R.

    1979-01-01

    An experimental system was developed to allow retrieval and analysis of data collected during a study of neurobehavioral correlates of renal disease. After retrieving data organized in a relational data base, simple bivariate statistics of parametric and nonparametric nature could be conducted. An “exploratory” mode in which the system provided guidance in selection of appropriate statistical analyses was also available to the user. The system traversed a decision tree using the inherent qualities of the data (e.g., the identity and number of patients, tests, and time epochs) to search for the appropriate analyses to employ.

  8. Economic Impacts of Wind Turbine Development in U.S. Counties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J., Brown; B., Hoen; E., Lantz

    2011-07-25

    The objective is to address the research question using post-project construction, county-level data, and econometric evaluation methods. Wind energy is expanding rapidly in the United States: Over the last 4 years, wind power has contributed approximately 35 percent of all new electric power capacity. Wind power plants are often developed in rural areas where local economic development impacts from the installation are projected, including land lease and property tax payments and employment growth during plant construction and operation. Wind energy represented 2.3 percent of the U.S. electricity supply in 2010, but studies show that penetrations of at least 20 percent are feasible. Several studies have used input-output models to predict direct, indirect, and induced economic development impacts. These analyses have often been completed prior to project construction. Available studies have not yet investigated the economic development impacts of wind development at the county level using post-construction econometric evaluation methods. Analysis of county-level impacts is limited. However, previous county-level analyses have estimated operation-period employment at 0.2 to 0.6 jobs per megawatt (MW) of power installed and earnings at $9,000/MW to $50,000/MW. We find statistically significant evidence of positive impacts of wind development on county-level per capita income from the OLS and spatial lag models when they are applied to the full set of wind and non-wind counties. The total impact on annual per capita income of wind turbine development (measured in MW per capita) in the spatial lag model was $21,604 per MW. This estimate is within the range of values estimated in the literature using input-output models. OLS results for the wind-only counties and matched samples are similar in magnitude, but are not statistically significant at the 10-percent level. We find a statistically significant impact of wind development on employment in the OLS analysis for wind counties only, but not in the other models. Our estimates of employment impacts are not precise enough to assess the validity of employment impacts from input-output models applied in advance of wind energy project construction. The analysis provides empirical evidence of positive income effects at the county level from cumulative wind turbine development, consistent with the range of impacts estimated using input-output models. Employment impacts are less clear.

  9. Precarious employment in Chile: psychometric properties of the Chilean version of Employment Precariousness Scale in private sector workers.

    PubMed

    Vives-Vergara, Alejandra; González-López, Francisca; Solar, Orielle; Bernales-Baksai, Pamela; González, María José; Benach, Joan

    2017-04-20

    The purpose of this study is to perform a psychometric analysis (acceptability, reliability and factor structure) of the Chilean version of the new Employment Precariousness Scale (EPRES). The data is drawn from a sample of 4,248 private salaried workers with a formal contract from the first Chilean Employment Conditions, Work, Health and Quality of Life (ENETS) survey, applied to a nationally representative sample of the Chilean workforce in 2010. Item and scale-level statistics were performed to assess scaling properties, acceptability and reliability. The six-dimensional factor structure was examined with confirmatory factor analysis. The scale exhibited high acceptability (roughly 80%) and reliability (Cronbach's alpha 0.83) and the factor structure was confirmed. One subscale (rights) demonstrated poorer metric properties without compromising the overall scale. The Chilean version of the Employment Precariousness Scale (EPRES-Ch) demonstrated good metric properties, pointing to its suitability for use in epidemiologic and public health research.
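    One of the scale-level statistics reported above, Cronbach's alpha, can be computed directly; a small sketch (not the authors' analysis code) assuming an item-response matrix:

    ```python
    # Cronbach's alpha from an (n_respondents x n_items) matrix of item scores.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
        return (k / (k - 1.0)) * (1.0 - item_vars / total_var)
    ```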

  10. Endpoint in plasma etch process using new modified w-multivariate charts and windowed regression

    NASA Astrophysics Data System (ADS)

    Zakour, Sihem Ben; Taleb, Hassen

    2017-09-01

    Endpoint detection is essential for understanding and verifying whether a plasma etching process has completed correctly, especially when the etched area is very small (0.1%). It is a crucial part of delivering repeatable results on every wafer. The endpoint is reached when the film being etched has been completely cleared. To ensure the desired device performance on the produced integrated circuit, the optical emission spectroscopy (OES) sensor is employed. The large number of gathered wavelengths (profiles) is first analyzed and pre-processed using a newly proposed simple algorithm named Spectra peak selection (SPS) to select the important wavelengths; wavelet analysis (WA) is then employed to enhance detection performance by suppressing noise and redundant information. The selected and treated OES wavelengths are then used in modified multivariate control charts (MEWMA and Hotelling) for three statistics (mean, SD and CV) and in windowed polynomial regression for the mean. These three statistics are used to monitor mean shifts, variance shifts and their ratio (CV) when both the mean and SD are unstable. The control charts demonstrate their ability to detect the endpoint, with the W-mean Hotelling chart performing best and the CV statistic worst. Because the W-Hotelling mean statistic gives the best endpoint detection, it is used to construct a windowed wavelet Hotelling polynomial regression, which can identify only the window containing the endpoint phenomenon.
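    As a hedged illustration of the multivariate monitoring step (a plain Hotelling T-squared chart rather than the paper's modified W-charts), with an assumed reference window of stable etching:

    ```python
    # Hotelling T-squared monitoring for multichannel OES signals: X has one row per
    # time sample and one column per selected wavelength.
    import numpy as np
    from scipy import stats

    def hotelling_t2(X: np.ndarray, n_reference: int, alpha: float = 0.01):
        ref = X[:n_reference]                    # stable etching phase used as baseline
        mu = ref.mean(axis=0)
        S_inv = np.linalg.pinv(np.cov(ref, rowvar=False))
        diffs = X - mu
        t2 = np.einsum("ij,jk,ik->i", diffs, S_inv, diffs)
        p = X.shape[1]
        n = n_reference
        # F-based upper control limit for individual future observations.
        ucl = p * (n + 1) * (n - 1) / (n * (n - p)) * stats.f.ppf(1 - alpha, p, n - p)
        return t2, ucl                           # first sustained t2 > ucl ~ endpoint
    ```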

  11. A study of correlations between crude oil spot and futures markets: A rolling sample test

    NASA Astrophysics Data System (ADS)

    Liu, Li; Wan, Jieqiu

    2011-10-01

    In this article, we investigate the asymmetries of exceedance correlations and cross-correlations between West Texas Intermediate (WTI) spot and futures markets. First, employing the test statistic proposed by Hong et al. [Asymmetries in stock returns: statistical tests and economic evaluation, Review of Financial Studies 20 (2007) 1547-1581], we find that the exceedance correlations were overall symmetric. However, the results from rolling windows show that some occasional events could induce the significant asymmetries of the exceedance correlations. Second, employing the test statistic proposed by Podobnik et al. [Quantifying cross-correlations using local and global detrending approaches, European Physics Journal B 71 (2009) 243-250], we find that the cross-correlations were significant even for large lagged orders. Using the detrended cross-correlation analysis proposed by Podobnik and Stanley [Detrended cross-correlation analysis: a new method for analyzing two nonstationary time series, Physics Review Letters 100 (2008) 084102], we find that the cross-correlations were weakly persistent and were stronger between spot and futures contract with larger maturity. Our results from rolling sample test also show the apparent effects of the exogenous events. Additionally, we have some relevant discussions on the obtained evidence.
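    The detrended cross-correlation analysis mentioned above can be illustrated with a simplified sketch of the DCCA cross-correlation coefficient, using non-overlapping boxes and linear detrending (an approximation, not the authors' rolling-sample procedure):

    ```python
    # Simplified detrended cross-correlation coefficient between two series x and y.
    import numpy as np

    def rho_dcca(x: np.ndarray, y: np.ndarray, box_size: int) -> float:
        # Integrated (profile) series.
        X = np.cumsum(x - np.mean(x))
        Y = np.cumsum(y - np.mean(y))
        n_boxes = len(X) // box_size
        f_xy = f_xx = f_yy = 0.0
        t = np.arange(box_size)
        for b in range(n_boxes):
            sl = slice(b * box_size, (b + 1) * box_size)
            # Remove a linear local trend in each box.
            rx = X[sl] - np.polyval(np.polyfit(t, X[sl], 1), t)
            ry = Y[sl] - np.polyval(np.polyfit(t, Y[sl], 1), t)
            f_xy += np.mean(rx * ry)
            f_xx += np.mean(rx * rx)
            f_yy += np.mean(ry * ry)
        return f_xy / np.sqrt(f_xx * f_yy)
    ```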

  12. Geospatial methods and data analysis for assessing distribution of grazing livestock

    USDA-ARS?s Scientific Manuscript database

    Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...

  13. Counting Penguins.

    ERIC Educational Resources Information Center

    Perry, Mike; Kader, Gary

    1998-01-01

    Presents an activity on the simplification of penguin counting by employing the basic ideas and principles of sampling to teach students to understand and recognize its role in statistical claims. Emphasizes estimation, data analysis and interpretation, and central limit theorem. Includes a list of items for classroom discussion. (ASK)

  14. Statistical analysis of water-quality data containing multiple detection limits II: S-language software for nonparametric distribution modeling and hypothesis testing

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2007-01-01

    Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data-perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply-censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis" where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation and interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
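    The authors provide S-language routines; as a hedged Python illustration of applying Kaplan-Meier to left-censored concentrations, the usual trick is to flip the data so that left-censored values become right-censored:

    ```python
    # Kaplan-Meier for left-censored concentrations via the flipping trick.
    import numpy as np
    from lifelines import KaplanMeierFitter

    def km_left_censored(values: np.ndarray, detected: np.ndarray):
        """values: measured value or detection limit; detected: True if not censored."""
        flip = values.max() + 1.0
        flipped = flip - values                       # left-censored -> right-censored
        kmf = KaplanMeierFitter()
        kmf.fit(flipped, event_observed=detected)
        # The median on the flipped scale maps back to the median concentration.
        median = flip - kmf.median_survival_time_
        return kmf, median
    ```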

  15. Hiring a Gay Man, Taking a Risk?: A Lab Experiment on Employment Discrimination and Risk Aversion.

    PubMed

    Baert, Stijn

    2018-01-01

    We investigate risk aversion as a driver of labor market discrimination against homosexual men. We show that more hiring discrimination by more risk-averse employers is consistent with taste-based and statistical discrimination. To test this hypothesis we conduct a scenario experiment in which experimental employers take a fictitious hiring decision concerning a heterosexual or homosexual male job candidate. In addition, participants are surveyed on their risk aversion and other characteristics that might correlate with this risk aversion. Analysis of the (post-)experimental data confirms our hypothesis. The likelihood of a beneficial hiring decision for homosexual male candidates decreases by 31.7% when employers are a standard deviation more risk-averse.

  16. A glossary for big data in population and public health: discussion and commentary on terminology and research methods.

    PubMed

    Fuller, Daniel; Buote, Richard; Stanley, Kevin

    2017-11-01

    The volume and velocity of data are growing rapidly and big data analytics are being applied to these data in many fields. Population and public health researchers may be unfamiliar with the terminology and statistical methods used in big data. This creates a barrier to the application of big data analytics. The purpose of this glossary is to define terms used in big data and big data analytics and to contextualise these terms. We define the five Vs of big data and provide definitions and distinctions for data mining, machine learning and deep learning, among other terms. We provide key distinctions between big data and statistical analysis methods applied to big data. We contextualise the glossary by providing examples where big data analysis methods have been applied to population and public health research problems and provide brief guidance on how to learn big data analysis methods. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. Simultaneous ocular and muscle artifact removal from EEG data by exploiting diverse statistics.

    PubMed

    Chen, Xun; Liu, Aiping; Chen, Qiang; Liu, Yu; Zou, Liang; McKeown, Martin J

    2017-09-01

    Electroencephalography (EEG) recordings are frequently contaminated by both ocular and muscle artifacts. These are normally dealt with separately, by employing blind source separation (BSS) techniques relying on either second-order or higher-order statistics (SOS & HOS respectively). When HOS-based methods are used, it is usually in the setting of assuming artifacts are statistically independent to the EEG. When SOS-based methods are used, it is assumed that artifacts have autocorrelation characteristics distinct from the EEG. In reality, ocular and muscle artifacts do not completely follow the assumptions of strict temporal independence to the EEG nor completely unique autocorrelation characteristics, suggesting that exploiting HOS or SOS alone may be insufficient to remove these artifacts. Here we employ a novel BSS technique, independent vector analysis (IVA), to jointly employ HOS and SOS simultaneously to remove ocular and muscle artifacts. Numerical simulations and application to real EEG recordings were used to explore the utility of the IVA approach. IVA was superior in isolating both ocular and muscle artifacts, especially for raw EEG data with low signal-to-noise ratio, and also integrated usually separate SOS and HOS steps into a single unified step. Copyright © 2017 Elsevier Ltd. All rights reserved.
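    IVA implementations are not part of the common Python toolkits, so the sketch below shows only a conventional HOS-based step (FastICA) for removing a single artifact component, purely as a point of comparison with the joint SOS/HOS approach described above; the channel and component handling is hypothetical:

    ```python
    # Remove one estimated artifact source from multichannel EEG using FastICA.
    import numpy as np
    from sklearn.decomposition import FastICA

    def remove_component(eeg: np.ndarray, bad_component: int) -> np.ndarray:
        """eeg: (n_samples, n_channels). Zero out one estimated source and reconstruct."""
        ica = FastICA(n_components=eeg.shape[1], random_state=0)
        sources = ica.fit_transform(eeg)              # HOS-based separation
        sources[:, bad_component] = 0.0               # e.g. an ocular or EMG component
        return ica.inverse_transform(sources)
    ```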

  18. The economic impact of Mexico City's smoke-free law

    PubMed Central

    Guerrero López, Carlos Manuel; Jiménez Ruiz, Jorge Alberto; Reynales Shigematsu, Luz Myriam

    2011-01-01

    Objective To evaluate the economic impact of Mexico City's 2008 smoke-free law—The Non-Smokers' Health Protection Law on restaurants, bars and nightclubs. Material and methods We used the Monthly Services Survey of businesses from January 2005 to April 2009—with revenues, employment and payments to employees as the principal outcomes. The results are estimated using a differences-in-differences regression model with fixed effects. The states of Jalisco, Nuevo León and México, where the law was not in effect, serve as a counterfactual comparison group. Results In restaurants, after accounting for observable factors and the fixed effects, there was a 24.8% increase in restaurants' revenue associated with the smoke-free law. This difference is not statistically significant but shows that, on average, restaurants did not suffer economically as a result of the law. Total wages increased by 28.2% and employment increased by 16.2%. In nightclubs, bars and taverns there was a decrease of 1.5% in revenues and an increase of 0.1% and 3.0%, respectively, in wages and employment. None of these effects are statistically significant in multivariate analysis. Conclusions There is no statistically significant evidence that the Mexico City smoke-free law had a negative impact on restaurants' income, employees' wages and levels of employment. On the contrary, the results show a positive, though statistically non-significant, impact of the law on most of these outcomes. Mexico City's experience suggests that smoke-free laws in Mexico and elsewhere will not hurt economic productivity in the restaurant and bar industries. PMID:21292808

  19. Advanced building energy management system demonstration for Department of Defense buildings.

    PubMed

    O'Neill, Zheng; Bailey, Trevor; Dong, Bing; Shashanka, Madhusudana; Luo, Dong

    2013-08-01

    This paper presents an advanced building energy management system (aBEMS) that employs advanced methods of whole-building performance monitoring combined with statistical methods of learning and data analysis to enable identification of both gradual and discrete performance erosion and faults. This system assimilated data collected from multiple sources, including blueprints, reduced-order models (ROM) and measurements, and employed advanced statistical learning algorithms to identify patterns of anomalies. The results were presented graphically in a manner understandable to facilities managers. A demonstration of aBEMS was conducted in buildings at Naval Station Great Lakes. The facility building management systems were extended to incorporate the energy diagnostics and analysis algorithms, producing systematic identification of more efficient operation strategies. At Naval Station Great Lakes, greater than 20% savings were demonstrated for building energy consumption by improving facility manager decision support to diagnose energy faults and prioritize alternative, energy-efficient operation strategies. The paper concludes with recommendations for widespread aBEMS success. © 2013 New York Academy of Sciences.

  20. Gender and Employment. Current Statistics and Their Implications.

    ERIC Educational Resources Information Center

    Equity Issues, 1996

    1996-01-01

    This publication contains three fact sheets on gender and employment statistics and their implications. The fact sheets are divided into two sections--statistics and implications. The statistics present the current situation of men and women workers as they relate to occupations, education, and earnings. The implications express suggestions for…

  1. Statistical Analysis of Solar PV Power Frequency Spectrum for Optimal Employment of Building Loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olama, Mohammed M; Sharma, Isha; Kuruganti, Teja

    In this paper, a statistical analysis of the frequency spectrum of solar photovoltaic (PV) power output is conducted. This analysis quantifies the frequency content that can be used for purposes such as developing optimal employment of building loads and distributed energy resources. One year of solar PV power output data was collected and analyzed using one-second resolution to find ideal bounds and levels for the different frequency components. The annual, seasonal, and monthly statistics of the PV frequency content are computed and illustrated in boxplot format. To examine the compatibility of building loads for PV consumption, a spectral analysis of building loads such as Heating, Ventilation and Air-Conditioning (HVAC) units and water heaters was performed. This defined the bandwidth over which these devices can operate. Results show that nearly all of the PV output (about 98%) is contained within frequencies lower than 1 mHz (equivalent to ~15 min), which is compatible for consumption with local building loads such as HVAC units and water heaters. Medium frequencies in the range of ~15 min to ~1 min are likely to be suitable for consumption by fan equipment of variable air volume HVAC systems that have time constants in the range of few seconds to few minutes. This study indicates that most of the PV generation can be consumed by building loads with the help of proper control strategies, thereby reducing impact on the grid and the size of storage systems.
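    A hedged sketch of the kind of spectral bookkeeping described above (illustrative only, not the study's code): estimate the power spectral density of one-second PV output and report the fraction of power below a 1 mHz cutoff:

    ```python
    # Fraction of PV output variance contained below a cutoff frequency.
    import numpy as np
    from scipy.signal import welch

    def low_frequency_fraction(pv_power: np.ndarray, fs: float = 1.0,
                               cutoff_hz: float = 1e-3) -> float:
        freqs, psd = welch(pv_power, fs=fs, nperseg=min(len(pv_power), 1 << 16))
        total = np.trapz(psd, freqs)
        low = np.trapz(psd[freqs <= cutoff_hz], freqs[freqs <= cutoff_hz])
        return low / total
    ```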

  2. ParallABEL: an R library for generalized parallelization of genome-wide association studies.

    PubMed

    Sangket, Unitsa; Mahasirimongkol, Surakameth; Chantratita, Wasun; Tandayya, Pichaya; Aulchenko, Yurii S

    2010-04-29

    Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. Acquiring the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files is arduous. Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as the linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses. For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL.

  3. Descriptive and inferential statistical methods used in burns research.

    PubMed

    Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars

    2010-05-01

    Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11(22%) were randomised controlled trials, 18(35%) were cohort studies, 11(22%) were case control studies and 11(22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49(96%) articles. Data dispersion was calculated by standard deviation in 30(59%). Standard error of the mean was quoted in 19(37%). The statistical software product was named in 33(65%). Of the 49 articles that used inferential statistics, the tests were named in 47(96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), chi-squared test (27%), Wilcoxon & Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43(88%) and the exact significance levels were reported in 28(57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.

  4. Classification of edible oils by employing ³¹P and ¹H NMR spectroscopy in combination with multivariate statistical analysis. A proposal for the detection of seed oil adulteration in virgin olive oils.

    PubMed

    Vigli, Georgia; Philippidis, Angelos; Spyros, Apostolos; Dais, Photis

    2003-09-10

    A combination of ¹H NMR and ³¹P NMR spectroscopy and multivariate statistical analysis was used to classify 192 samples from 13 types of vegetable oils, namely, hazelnut, sunflower, corn, soybean, sesame, walnut, rapeseed, almond, palm, groundnut, safflower, coconut, and virgin olive oils from various regions of Greece. 1,2-Diglycerides, 1,3-diglycerides, the ratio of 1,2-diglycerides to total diglycerides, acidity, iodine value, and fatty acid composition determined upon analysis of the respective ¹H NMR and ³¹P NMR spectra were selected as variables to establish a classification/prediction model by employing discriminant analysis. This model, obtained from the training set of 128 samples, resulted in a significant discrimination among the different classes of oils, whereas 100% of correct validated assignments for 64 samples were obtained. Different artificial mixtures of olive-hazelnut, olive-corn, olive-sunflower, and olive-soybean oils were prepared and analyzed by ¹H NMR and ³¹P NMR spectroscopy. Subsequent discriminant analysis of the data allowed detection of adulteration as low as 5% w/w, provided that fresh virgin olive oil samples were used, as reflected by their high 1,2-diglycerides to total diglycerides ratio (D ≥ 0.90).

  5. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-01

    This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  6. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis.

    PubMed

    Sergis, Antonis; Hardalupas, Yannis

    2011-05-19

    This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids.

  7. Anomalous heat transfer modes of nanofluids: a review based on statistical analysis

    PubMed Central

    2011-01-01

    This paper contains the results of a concise statistical review analysis of a large amount of publications regarding the anomalous heat transfer modes of nanofluids. The application of nanofluids as coolants is a novel practise with no established physical foundations explaining the observed anomalous heat transfer. As a consequence, traditional methods of performing a literature review may not be adequate in presenting objectively the results representing the bulk of the available literature. The current literature review analysis aims to resolve the problems faced by researchers in the past by employing an unbiased statistical analysis to present and reveal the current trends and general belief of the scientific community regarding the anomalous heat transfer modes of nanofluids. The thermal performance analysis indicated that statistically there exists a variable enhancement for conduction, convection/mixed heat transfer, pool boiling heat transfer and critical heat flux modes. The most popular proposed mechanisms in the literature to explain heat transfer in nanofluids are revealed, as well as possible trends between nanofluid properties and thermal performance. The review also suggests future experimentation to provide more conclusive answers to the control mechanisms and influential parameters of heat transfer in nanofluids. PMID:21711932

  8. Real English Project Report.

    ERIC Educational Resources Information Center

    Cautin, Harvey; Regan, Edward

    Requirements are discussed for an information retrieval language that enables users to employ natural language sentences in interaction with computer-stored files. Anticipated modes of operation of the system are outlined. These are: the search mode, the dictionary mode, the tables mode, and the statistical mode. Analysis of sample sentences…

  9. Objective research of auscultation signals in Traditional Chinese Medicine based on wavelet packet energy and support vector machine.

    PubMed

    Yan, Jianjun; Shen, Xiaojing; Wang, Yiqin; Li, Fufeng; Xia, Chunming; Guo, Rui; Chen, Chunfeng; Shen, Qingwei

    2010-01-01

    This study aims to utilise the Wavelet Packet Transform (WPT) and the Support Vector Machine (SVM) algorithm for objective, quantitative analysis of auscultation in Traditional Chinese Medicine (TCM) diagnosis. First, Wavelet Packet Decomposition (WPD) at level 6 was employed to split the auscultation signals into finer frequency bands. Statistical analysis was then performed on the Wavelet Packet Energy (WPE) features extracted from the WPD coefficients. Pattern recognition with SVM was then used to distinguish the statistical feature values of the different sample groups. Finally, the experimental results showed that the classification accuracies were high.
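    A simplified sketch of this feature pipeline (level-6 wavelet packet decomposition, per-sub-band energy, then an SVM), with hypothetical inputs; it is not the authors' implementation:

    ```python
    # Wavelet packet energy features followed by an SVM classifier.
    import numpy as np
    import pywt
    from sklearn.svm import SVC

    def wpe_features(signal: np.ndarray, wavelet: str = "db4", level: int = 6) -> np.ndarray:
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
        energies = np.array([np.sum(node.data ** 2) for node in wp.get_level(level, "natural")])
        return energies / energies.sum()              # normalized sub-band energies

    def train_classifier(signals, labels) -> SVC:
        X = np.vstack([wpe_features(s) for s in signals])
        return SVC(kernel="rbf").fit(X, labels)
    ```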

  10. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis.

    PubMed

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-07-01

    A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Code is available at https://github.com/aalto-ics-kepaco. Contact: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
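    metaCCA itself operates on summary statistics in R; purely as a hedged illustration of the underlying canonical correlation technique on individual-level data, a minimal sketch could be:

    ```python
    # Leading canonical correlation between genotype (X) and phenotype (Y) matrices.
    # X and Y are hypothetical individual-level inputs, not metaCCA's summary statistics.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    def leading_canonical_correlation(X: np.ndarray, Y: np.ndarray) -> float:
        cca = CCA(n_components=1)
        x_scores, y_scores = cca.fit_transform(X, Y)
        return float(np.corrcoef(x_scores[:, 0], y_scores[:, 0])[0, 1])
    ```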

  11. metaCCA: summary statistics-based multivariate meta-analysis of genome-wide association studies using canonical correlation analysis

    PubMed Central

    Cichonska, Anna; Rousu, Juho; Marttinen, Pekka; Kangas, Antti J.; Soininen, Pasi; Lehtimäki, Terho; Raitakari, Olli T.; Järvelin, Marjo-Riitta; Salomaa, Veikko; Ala-Korpela, Mika; Ripatti, Samuli; Pirinen, Matti

    2016-01-01

    Motivation: A dominant approach to genetic association studies is to perform univariate tests between genotype-phenotype pairs. However, analyzing related traits together increases statistical power, and certain complex associations become detectable only when several variants are tested jointly. Currently, modest sample sizes of individual cohorts, and restricted availability of individual-level genotype-phenotype data across the cohorts limit conducting multivariate tests. Results: We introduce metaCCA, a computational framework for summary statistics-based analysis of a single or multiple studies that allows multivariate representation of both genotype and phenotype. It extends the statistical technique of canonical correlation analysis to the setting where original individual-level records are not available, and employs a covariance shrinkage algorithm to achieve robustness. Multivariate meta-analysis of two Finnish studies of nuclear magnetic resonance metabolomics by metaCCA, using standard univariate output from the program SNPTEST, shows an excellent agreement with the pooled individual-level analysis of original data. Motivated by strong multivariate signals in the lipid genes tested, we envision that multivariate association testing using metaCCA has a great potential to provide novel insights from already published summary statistics from high-throughput phenotyping technologies. Availability and implementation: Code is available at https://github.com/aalto-ics-kepaco Contacts: anna.cichonska@helsinki.fi or matti.pirinen@helsinki.fi Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153689

  12. [Evaluation of using statistical methods in selected national medical journals].

    PubMed

    Sych, Z

    1996-01-01

    The paper covers the performed evaluation of frequency with which the statistical methods were applied in analyzed works having been published in six selected, national medical journals in the years 1988-1992. For analysis the following journals were chosen, namely: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny, Zdrowie Publiczne. Appropriate number of works up to the average in the remaining medical journals was randomly selected from respective volumes of Pol. Tyg. Lek. The studies did not include works wherein the statistical analysis was not implemented, which referred both to national and international publications. That exemption was also extended to review papers, casuistic ones, reviews of books, handbooks, monographies, reports from scientific congresses, as well as papers on historical topics. The number of works was defined in each volume. Next, analysis was performed to establish the mode of finding out a suitable sample in respective studies, differentiating two categories: random and target selections. Attention was also paid to the presence of control sample in the individual works. In the analysis attention was also focussed on the existence of sample characteristics, setting up three categories: complete, partial and lacking. In evaluating the analyzed works an effort was made to present the results of studies in tables and figures (Tab. 1, 3). Analysis was accomplished with regard to the rate of employing statistical methods in analyzed works in relevant volumes of six selected, national medical journals for the years 1988-1992, simultaneously determining the number of works, in which no statistical methods were used. Concurrently the frequency of applying the individual statistical methods was analyzed in the scrutinized works. Prominence was given to fundamental statistical methods in the field of descriptive statistics (measures of position, measures of dispersion) as well as most important methods of mathematical statistics such as parametric tests of significance, analysis of variance (in single and dual classifications). non-parametric tests of significance, correlation and regression. The works, in which use was made of either multiple correlation or multiple regression or else more complex methods of studying the relationship for two or more numbers of variables, were incorporated into the works whose statistical methods were constituted by correlation and regression as well as other methods, e.g. statistical methods being used in epidemiology (coefficients of incidence and morbidity, standardization of coefficients, survival tables) factor analysis conducted by Jacobi-Hotellng's method, taxonomic methods and others. On the basis of the performed studies it has been established that the frequency of employing statistical methods in the six selected national, medical journals in the years 1988-1992 was 61.1-66.0% of the analyzed works (Tab. 3), and they generally were almost similar to the frequency provided in English language medical journals. On a whole, no significant differences were disclosed in the frequency of applied statistical methods (Tab. 4) as well as in frequency of random tests (Tab. 3) in the analyzed works, appearing in the medical journals in respective years 1988-1992. 
The statistical methods most frequently used in the works analyzed for 1988-1992 were measures of position (44.2-55.6% of the works analyzed), measures of dispersion (32.5-38.5%) and parametric tests of significance (26.3-33.1%) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded in medical curricula and in postgraduate training for physicians and academic staff.

  13. Symmetrized Nearest Neighbor Regression Estimates.

    DTIC Science & Technology

    1987-12-01

    The data come from the Family Expenditure Survey, Annual Base Tapes, Department of Employment.

  14. 78 FR 19098 - Wage Methodology for the Temporary Non-Agricultural Employment H-2B Program; Delay of Effective Date

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-29

    ... by dividing the Bureau of Labor Statistics Occupational Employment Statistics Survey (OES survey... DEPARTMENT OF LABOR Employment and Training Administration 20 CFR Part 655 RIN 1205-AB61 Wage Methodology for the Temporary Non-Agricultural Employment H-2B Program; Delay of Effective Date AGENCY...

  15. Worklife expectancies of fixed-term Finnish employees in 1997-2006.

    PubMed

    Nurminen, Markku

    2008-04-01

    Fixed-term employment is prevalent in the Finnish labor force. This form of employment contract is marked by fragmentary work periods, demands for flexibility in workhours, and concern for multiple insecurities. A nonpermanent employee may also incur adverse health consequences. Yet there exist no exact statistics on the duration of fixed-term employment. This paper estimated the future duration of the time that a Finn is expected to be engaged in irregular work. Multistate regression modeling and stochastic analysis were applied to aggregated data from surveys conducted among the labor force by Statistics Finland in 1997-2006. In 2006, a Finnish male was expected to work a total of 3.8 years in fixed-term employment, combined over consecutive or separate time spans; this time amounts to 8% of his remaining work career from entry into the work force until final retirement. For a woman the expectancy was greater, 6.5 years or 13%. For the age interval 20-29 years, the total was 16% for men and 23% for women. The type and duration of employment is influenced by security factors and economic cycles, both of which affect men and women differently. Over the past decade, fixed-term employment increased consistently in the female labor contingent, and it was more pronounced during economic slowdowns. This labor market development calls for standards for flexibility and guarantees for security in the fragmented future worklives of fixed-term employees.

  16. Statistical Learning Analysis in Neuroscience: Aiming for Transparency

    PubMed Central

    Hanke, Michael; Halchenko, Yaroslav O.; Haxby, James V.; Pollmann, Stefan

    2009-01-01

    Encouraged by a rise of reciprocal interest between the machine learning and neuroscience communities, several recent studies have demonstrated the explanatory power of statistical learning techniques for the analysis of neural data. In order to facilitate a wider adoption of these methods, neuroscientific research needs to ensure a maximum of transparency to allow for comprehensive evaluation of the employed procedures. We argue that such transparency requires “neuroscience-aware” technology for the performance of multivariate pattern analyses of neural data that can be documented in a comprehensive, yet comprehensible way. Recently, we introduced PyMVPA, a specialized Python framework for machine learning based data analysis that addresses this demand. Here, we review its features and applicability to various neural data modalities. PMID:20582270

  17. Statistical Analysis of Protein Ensembles

    NASA Astrophysics Data System (ADS)

    Máté, Gabriell; Heermann, Dieter

    2014-04-01

    As 3D protein-configuration data is piling up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of the currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled from ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.
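    As a hedged sketch of the barcode-computation step (using the ripser package as a stand-in for the computational topology algorithms mentioned above), with a protein conformation represented as a 3-D point cloud:

    ```python
    # Persistence barcode for a conformation given as atomic coordinates.
    import numpy as np
    from ripser import ripser

    def barcode(points: np.ndarray, maxdim: int = 1):
        """points: (n_atoms, 3), e.g. assumed C-alpha coordinates.
        Returns persistence diagrams for homology dimensions 0..maxdim."""
        diagrams = ripser(points, maxdim=maxdim)["dgms"]
        # Each diagram is an array of (birth, death) pairs; bar length = death - birth.
        return diagrams
    ```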

  18. Employment status, inflation and suicidal behaviour: an analysis of a stratified sample in Italy.

    PubMed

    Solano, Paola; Pizzorno, Enrico; Gallina, Anna M; Mattei, Chiara; Gabrielli, Filippo; Kayman, Joshua

    2012-09-01

    There is abundant empirical evidence of an excess risk of suicide among the unemployed, although few studies have investigated the influence of economic downturns on suicidal behaviours in an employment status-stratified sample. We investigated how economic inflation affected suicidal behaviours according to employment status in Italy from 2001 to 2008. Data concerning economically active people were provided by the Italian Institute for Statistical Analysis and by the International Monetary Fund. The association between inflation and completed versus attempted suicide with respect to employment status was investigated in every year and quarter-year of the study time frame. We considered three occupational categories: the employed, the unemployed who were previously employed (PE) and the unemployed who had never worked (NE). The unemployed are at higher suicide risk than the employed. Among the PE, a significant association between inflation and suicide attempt was found, whereas no association was reported concerning completed suicides. No association was found between inflation and either completed or attempted suicides among the employed and the NE. Completed suicide in females is significantly associated with unemployment in every quarter-year. The reported vulnerability to suicidal behaviours among the PE as inflation rises underlines the need for effective support strategies for both genders in times of economic downturns.

  19. Illinois Teacher Supply and Demand, 1984-1985.

    ERIC Educational Resources Information Center

    Bartolini, Leandro

    Statistics are presented on the current status of teacher supply and demand trends in Illinois. This report reviews and discusses the factors affecting teacher supply and demand, changes in student enrollment, teacher retirements, changes in state mandates, and opportunity for employment. An analysis of the data collected on teacher employment…

  20. Introducing Mathematics to Information Problem-Solving Tasks: Surface or Substance?

    ERIC Educational Resources Information Center

    Erickson, Ander

    2017-01-01

    This study employs a cross-case analysis in order to explore the demands and opportunities that arise when information problem-solving tasks are introduced into college mathematics classes. Professors at three universities collaborated with me to develop statistics-related activities that required students to engage in research outside the…

  1. Training and Learning in the Knowledge and Service Economy

    ERIC Educational Resources Information Center

    Sloman, Martyn; Philpott, John

    2006-01-01

    Purpose: The purpose of this paper is to consider whether the shift from training to learning is related to employment categories using a categorisation popularised by Robert Reich. Design/methodology/approach: Collation and analysis of existing CIPD research information and assessment of labour statistics. Findings: An examination of the national…

  2. 78 FR 50373 - Proposed Information Collection; Comment Request; Annual Capital Expenditures Survey

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... source of detailed comprehensive statistics on actual business spending for non-farm companies, non-governmental companies, organizations, and associations operating in the United States. Both employer and nonemployer companies are included in the survey. The Bureau of Economic Analysis, the primary Federal user of...

  3. Health status after cancer: does it matter which hospital you belong to?

    PubMed

    Fiva, Jon H; Haegeland, Torbjørn; Rønning, Marte

    2010-07-13

    Survival rates are widely used to compare the quality of cancer care. However, the extent to which cancer survivors regain full physical or cognitive functioning is not captured by this statistic. To address this concern we introduce post-diagnosis employment as a supplemental measure of the quality of cancer care. This study is based on individual level data from the Norwegian Cancer Registry (n = 46,720) linked with data on labor market outcomes and socioeconomic status from Statistics Norway. We study variation across Norwegian hospital catchment areas (n = 55) with respect to survival and employment five years after cancer diagnosis. To handle the selection problem, we exploit the fact that cancer patients in Norway (until 2001) have been allocated to local hospitals based on their place of residence. We document substantial differences across catchment areas with respect to patients' post-diagnosis employment rates. Conventional quality indicators based on survival rates indicate smaller differences. The two sets of indicators are only moderately correlated. This analysis shows that indicators based on survival and post-diagnosis employment may capture different parts of the health status distribution, and that using only one of them to capture quality of care may be insufficient.

  4. Optimization of fermentation medium for the production of atrazine degrading strain Acinetobacter sp. DNS(32) by statistical analysis system.

    PubMed

    Zhang, Ying; Wang, Yang; Wang, Zhi-Gang; Wang, Xi; Guo, Huo-Sheng; Meng, Dong-Fang; Wong, Po-Keung

    2012-01-01

    Statistical experimental designs provided by statistical analysis system (SAS) software were applied to optimize the fermentation medium composition for the production of atrazine-degrading Acinetobacter sp. DNS(32) in shake-flask cultures. A "Plackett-Burman Design" was employed to evaluate the effects of different components in the medium. The concentrations of corn flour, soybean flour, and K₂HPO₄ were found to significantly influence Acinetobacter sp. DNS(32) production. The steepest ascent method was employed to determine the optimal regions of these three significant factors. Then, these three factors were optimized using the central composite design of "response surface methodology." The optimized fermentation medium was composed as follows (g/L): corn flour 39.49, soybean flour 25.64, CaCO₃ 3, K₂HPO₄ 3.27, MgSO₄·7H₂O 0.2, and NaCl 0.2. The predicted and verified values in the medium with the optimized concentrations of components in shake-flask experiments were 7.079 × 10⁸ CFU/mL and 7.194 × 10⁸ CFU/mL, respectively. The validated model can precisely predict the growth of the atrazine-degrading bacterium, Acinetobacter sp. DNS(32).
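
    To illustrate the response-surface step named above, here is a minimal sketch that fits a second-order model to simulated central-composite-style data for three coded factors and locates the predicted optimum. It is not the SAS analysis used in the study; all design points and responses are invented.

        # Sketch of response surface methodology: fit a quadratic model to
        # hypothetical data for three coded factors and find the predicted optimum.
        import numpy as np
        from sklearn.preprocessing import PolynomialFeatures
        from sklearn.linear_model import LinearRegression
        from scipy.optimize import minimize

        rng = np.random.default_rng(2)
        X = rng.uniform(-1, 1, size=(20, 3))                    # coded factor levels
        true = 7 - (X[:, 0] - 0.3) ** 2 - (X[:, 1] + 0.2) ** 2 - X[:, 2] ** 2
        y = true + rng.normal(scale=0.05, size=20)              # simulated cell yield

        poly = PolynomialFeatures(degree=2, include_bias=False)
        model = LinearRegression().fit(poly.fit_transform(X), y)

        # Maximise the fitted surface within the coded region [-1, 1]^3
        res = minimize(lambda x: -model.predict(poly.transform(x.reshape(1, -1)))[0],
                       x0=np.zeros(3), bounds=[(-1, 1)] * 3)
        print("predicted optimum (coded levels):", np.round(res.x, 2))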

  5. Statistical imprints of CMB B -type polarization leakage in an incomplete sky survey analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos, Larissa; Wang, Kai; Hu, Yangrui

    2017-01-01

    One of the main goals of modern cosmology is to search for primordial gravitational waves by looking at their imprints in the B-type polarization of the cosmic microwave background radiation. However, this signal is contaminated by various sources, including cosmic weak lensing, foreground radiation, instrumental noise, as well as the E-to-B leakage caused by partial sky surveys, which should be well understood to avoid misinterpretation of the observed data. In this paper, we adopt the E/B decomposition method suggested by Smith in 2006, and study the imprints of E-to-B leakage residuals in the constructed B-type polarization maps, B(n̂), by employing various statistical tools. We find that the effects of E-to-B leakage are negligible for the B-mode power spectrum, as well as for the skewness and kurtosis analyses of B-maps. However, if employing the morphological statistical tools, including Minkowski functionals and/or Betti numbers, we find the effect of leakage can be detected at very high confidence level, which shows that in the morphological analysis, the leakage can play a significant role as a contaminant for measuring the primordial B-mode signal and must be taken into account for a correct interpretation of the data.

  6. Estimating short-run and long-run interaction mechanisms in interictal state.

    PubMed

    Ozkaya, Ata; Korürek, Mehmet

    2010-04-01

    We address the issue of analyzing electroencephalogram (EEG) recordings from seizure patients in order to test, model and determine the statistical properties that distinguish between EEG states (interictal, pre-ictal, ictal) by introducing a new class of time series analysis methods. In the present study, we first employ statistical methods to determine the non-stationary behavior of focal interictal epileptiform series within very short time intervals; second, for such intervals that are deemed non-stationary we suggest the concept of Autoregressive Integrated Moving Average (ARIMA) process modelling, well known in time series analysis. We finally address the question of causal relationships between epileptic states and between brain areas during epileptiform activity. We estimate the interaction between different EEG series (channels) in short time intervals by performing Granger-causality analysis, and estimate such interaction in long time intervals by employing cointegration analysis; both analysis methods are well known in econometrics. Here we find, first, that the causal relationship between neuronal assemblies can be identified according to the duration and the direction of their possible mutual influences; second, that although the estimated bidirectional causality in short time intervals implies that the neuronal ensembles positively affect each other, in long time intervals neither of them is affected (increasing amplitudes) by this relationship. Moreover, cointegration analysis of the EEG series enables us to identify whether there is a causal link from the interictal state to the ictal state.
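
    The three econometric tools named in this abstract are all available in the Python statsmodels package; the sketch below applies them to two synthetic series standing in for EEG channels. It illustrates the general workflow only and makes no claim about the authors' implementation.

        # Sketch of ARIMA fitting, a Granger-causality test, and an Engle-Granger
        # cointegration test on two synthetic series (statsmodels).
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA
        from statsmodels.tsa.stattools import grangercausalitytests, coint

        rng = np.random.default_rng(3)
        n = 500
        x = np.cumsum(rng.normal(size=n))       # non-stationary "channel 1"
        y = 0.8 * x + rng.normal(size=n)        # "channel 2", driven by channel 1

        # ARIMA(1,1,1) model of one channel (integrated, hence d=1)
        fit = ARIMA(x, order=(1, 1, 1)).fit()
        print(fit.params)

        # Does channel 1 help predict channel 2? Test on differenced (stationary)
        # data; the call prints F- and chi-square test results for each lag.
        grangercausalitytests(np.column_stack([np.diff(y), np.diff(x)]), maxlag=2)

        # Long-run relationship between the raw (non-stationary) series
        t_stat, p_value, _ = coint(x, y)
        print(f"Engle-Granger cointegration p-value: {p_value:.3f}")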

  7. Comparison of contact conditions obtained by direct simulation with statistical analysis for normally distributed isotropic surfaces

    NASA Astrophysics Data System (ADS)

    Uchidate, M.

    2018-09-01

    In this study, with the aim of establishing systematic knowledge of the impact of summit extraction methods and stochastic model selection in rough contact analysis, the contact area ratio (A_r/A_a) obtained by statistical contact models with different summit extraction methods was compared with a direct simulation using the boundary element method (BEM). Fifty areal topography datasets with different autocorrelation functions in terms of the power index and correlation length were used for the investigation. The non-causal 2D auto-regressive model, which can generate datasets with specified parameters, was employed in this research. Three summit extraction methods, Nayak's theory, 8-point analysis and watershed segmentation, were examined. With regard to the stochastic model, Bhushan's model and the BGT (Bush-Gibson-Thomas) model were applied. The values of A_r/A_a from the stochastic models tended to be smaller than those from BEM. The discrepancy between Bhushan's model with the 8-point analysis and BEM was slightly smaller than with Nayak's theory. The results with the watershed segmentation were similar to those with the 8-point analysis. The impact of Wolf pruning on the discrepancy between the stochastic analysis and BEM was not very clear. In the case of the BGT model, which employs surface gradients, good quantitative agreement with BEM was obtained when Nayak's bandwidth parameter was large.

  8. The influence of anthropometrics on physical employment standard performance.

    PubMed

    Reilly, T; Spivock, M; Prayal-Brown, A; Stockbrugger, B; Blacklock, R

    2016-10-01

    The Canadian Armed Forces (CAF) recently implemented the Fitness for Operational Requirements of CAF Employment (FORCE), a new physical employment standard (PES). Data collection throughout development included anthropometric profiles of the CAF. The aim was to determine if anthropometric measurements and demographic information would predict the performance outcomes of the FORCE and/or the Common Military Task Fitness Evaluation (CMTFE). We conducted a secondary analysis of data from FORCE research and obtained bioelectrical impedance and segmental analysis measurements. Statistical analysis included correlation and linear regression analyses. Among the 668 study subjects, as predicted, any task requiring lifting, pulling or moving of an object was significantly and positively correlated (r > 0.67) with lean body mass (LBM) measurements. LBM correlated with stretcher carry (r = 0.78) and with lifting actions such as sand bag drag (r = 0.77), vehicle extrication (r = 0.71), sand bag fortification (r = 0.68) and sand bag lift time (r = -0.67). The difference between the correlation of dead mass (DM) with task performance and that of LBM was not statistically significant. DM and LBM can be used in a PES to predict success on military tasks such as casualty evacuation and manual material handling. However, there is no minimum LBM required to perform these tasks successfully. These data direct future research on how to diversify research participants by anthropometrics, in addition to the traditional demographic variables of gender and age, to highlight potentially important adverse impacts in PES design. In addition, the results can be used to develop better training regimens to facilitate passing a PES. © All rights reserved. 'The Influence of Anthropometrics on Physical Employment Standard Performance' has been reproduced with the permission of DND, 2016.

  9. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    PubMed

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.
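
    As a small illustration of two of the library-based classifiers compared above, the sketch below runs discriminant analysis and nearest-neighbour matching on synthetic fingerprint data with scikit-learn; the source labels, feature counts, and resulting accuracies are purely illustrative.

        # Sketch of library-based source classification with discriminant analysis
        # and nearest-neighbour matching on synthetic "fingerprint" features.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)
        sources = np.repeat(["human", "gull", "cow", "dog"], 50)   # library labels
        X = rng.normal(size=(200, 30)) + (sources == "human")[:, None] * 0.5

        for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                          ("1-NN", KNeighborsClassifier(n_neighbors=1))]:
            acc = cross_val_score(clf, X, sources, cv=5).mean()
            print(f"{name} average rate of correct classification: {acc:.2f}")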

  10. Mapping the global health employment market: an analysis of global health jobs.

    PubMed

    Keralis, Jessica M; Riggin-Pathak, Brianne L; Majeski, Theresa; Pathak, Bogdan A; Foggia, Janine; Cullinen, Kathleen M; Rajagopal, Abbhirami; West, Heidi S

    2018-02-27

    The number of university global health training programs has grown in recent years. However, there is little research on the needs of the global health profession. We therefore set out to characterize the global health employment market by analyzing global health job vacancies. We collected data from advertised, paid positions posted to web-based job boards, email listservs, and global health organization websites from November 2015 to May 2016. Data on requirements for education, language proficiency, technical expertise, physical location, and experience level were analyzed for all vacancies. Descriptive statistics were calculated for the aforementioned job characteristics. Associations between technical specialty area and requirements for non-English language proficiency and overseas experience were calculated using Chi-square statistics. A qualitative thematic analysis was performed on a subset of vacancies. We analyzed the data from 1007 global health job vacancies from 127 employers. Among private and non-profit sector vacancies, 40% (n = 354) were for technical or subject matter experts, 20% (n = 177) for program directors, and 16% (n = 139) for managers, compared to 9.8% (n = 87) for entry-level and 13.6% (n = 120) for mid-level positions. The most common technical focus area was program or project management, followed by HIV/AIDS and quantitative analysis. Thematic analysis demonstrated a common emphasis on program operations, relations, design and planning, communication, and management. Our analysis shows a demand for candidates with several years of experience with global health programs, particularly program managers/directors and technical experts, with very few entry-level positions accessible to recent graduates of global health training programs. It is unlikely that global health training programs equip graduates to be competitive for the majority of positions that are currently available in this field.
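
    The chi-square test of association mentioned above can be reproduced in a few lines; the sketch below uses invented contingency counts (technical specialty area versus an overseas-experience requirement) purely for illustration.

        # Sketch of a chi-square test of association on a hypothetical
        # specialty-by-requirement contingency table.
        from scipy.stats import chi2_contingency

        #                       overseas exp. required   not required
        table = [[120, 60],    # program/project management (hypothetical counts)
                 [80, 40],     # HIV/AIDS
                 [30, 70]]     # quantitative analysis

        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")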

  11. He Who Seeks Shall Find... Or Perhaps Not? Analysis of Firms' Searches for Qualified Personnel, Using Data from the IAB Establishment Panel 2000. IAB Labour Market Research Topics.

    ERIC Educational Resources Information Center

    Kolling, Arnd

    The success of German firms' searches for qualified personnel to fill openings in skilled occupations was examined through a statistical analysis of data from the Institut fur Arbeitsmarkt- und Berufsforschung der Bundesanstalt fur Arbeit's (IAB) establishment panel for 2000. An employer search model was used to explain the current German debate…

  12. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data

    PubMed Central

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J.; Intarapanich, Apichart; Tongsima, Sissades

    2017-01-01

    Background Biochemical methods are available for enriching 5′ ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5′ ends from these data by statistical analysis of the enrichment. Although statistics-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. Results We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase the power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5′ ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5′ ends than TSSAR. In general, the transcript 5′ ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. Conclusion ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5′ ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER). PMID:28542466
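
    The core statistical idea described above (enrichment ratios, a Box-Cox transformation toward normality, then tail-based significance) can be sketched as follows with simulated read counts; this is an illustration of the approach, not the ToNER implementation.

        # Sketch of Box-Cox-transformed enrichment scores with simulated counts.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        unenriched = rng.poisson(20, size=10_000) + 1      # +1 avoids division by zero
        enriched = rng.poisson(20, size=10_000) + 1
        enriched[:50] += rng.poisson(200, size=50)         # spike in 50 "true" 5' ends

        ratios = enriched / unenriched                     # nucleotide enrichment scores
        transformed, lam = stats.boxcox(ratios)            # fit Box-Cox lambda
        z = (transformed - transformed.mean()) / transformed.std()
        sites = np.flatnonzero(z > stats.norm.ppf(0.999))  # upper-tail significance
        print(f"lambda = {lam:.2f}, {sites.size} candidate enriched positions")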

  13. GHEP-ISFG collaborative exercise on mixture profiles (GHEP-MIX06). Reporting conclusions: Results and evaluation.

    PubMed

    Barrio, P A; Crespillo, M; Luque, J A; Aler, M; Baeza-Richer, C; Baldassarri, L; Carnevali, E; Coufalova, P; Flores, I; García, O; García, M A; González, R; Hernández, A; Inglés, V; Luque, G M; Mosquera-Miguel, A; Pedrosa, S; Pontes, M L; Porto, M J; Posada, Y; Ramella, M I; Ribeiro, T; Riego, E; Sala, A; Saragoni, V G; Serrano, A; Vannelli, S

    2018-07-01

    One of the main goals of the Spanish and Portuguese-Speaking Group of the International Society for Forensic Genetics (GHEP-ISFG) is to promote and contribute to the development and dissemination of scientific knowledge in the field of forensic genetics. To this end, GHEP-ISFG maintains several working commissions that develop activities on scientific aspects of general interest. One of them, the Mixture Commission of GHEP-ISFG, has organized annually, since 2009, a collaborative exercise on the analysis and interpretation of autosomal short tandem repeat (STR) mixture profiles. Until now, six exercises have been organized. In the present edition (GHEP-MIX06), with 25 participating laboratories, the main aim of the exercise was to assess mixture-profile results reported for a proposed complex mock case. One conclusion obtained from this exercise is the increasing tendency of participating laboratories to validate DNA mixture-profile analysis following international recommendations. However, the results have shown some differences among laboratories regarding the editing and also the interpretation of mixture profiles. Besides, although the last revision of ISO/IEC 17025:2017 gives indications of how results should be reported, not all laboratories strictly follow its recommendations. Regarding the statistical aspect, all those laboratories that performed statistical evaluation of the data employed the likelihood ratio (LR) as the parameter to evaluate statistical compatibility. However, the LR values obtained show a wide range of variation. This fact could not be attributed to the software employed, since the vast majority of laboratories that performed LR calculation employed the same software (LRmixStudio). Thus, the final allelic composition of the edited mixture profile and the parameters employed in the software could explain this data dispersion. This highlights the need for each laboratory to define, through internal validations, its criteria for editing and interpreting mixtures, and to train continuously in software handling. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. 76 FR 44960 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Report on...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-27

    ... for OMB Review; Comment Request; Report on Current Employment Statistics ACTION: Notice. SUMMARY: The Department of Labor (DOL) is submitting the revised Bureau of Labor Statistics (BLS) sponsored information collection request (ICR) titled, ``Report on Current Employment Statistics,'' to the Office of Management and...

  15. Resting-state fMRI data reflects default network activity rather than null data: A defense of commonly employed methods to correct for multiple comparisons.

    PubMed

    Slotnick, Scott D

    2017-07-01

    Analysis of functional magnetic resonance imaging (fMRI) data typically involves over one hundred thousand independent statistical tests; therefore, it is necessary to correct for multiple comparisons to control familywise error. In a recent paper, Eklund, Nichols, and Knutsson used resting-state fMRI data to evaluate commonly employed methods to correct for multiple comparisons and reported unacceptable rates of familywise error. Eklund et al.'s analysis was based on the assumption that resting-state fMRI data reflect null data; however, their 'null data' actually reflected default network activity that inflated familywise error. As such, Eklund et al.'s results provide no basis to question the validity of the thousands of published fMRI studies that have corrected for multiple comparisons or the commonly employed methods to correct for multiple comparisons.
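
    For context, familywise error control over a large family of voxelwise tests can be sketched as follows with simulated p-values; a Bonferroni correction is shown here as one commonly employed method, without implying it is the specific procedure evaluated by Eklund et al.

        # Sketch of familywise-error control over many simulated voxelwise tests.
        import numpy as np
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(6)
        p_values = rng.uniform(size=100_000)        # ~10^5 voxelwise tests under the null
        p_values[:20] = rng.uniform(0, 1e-7, 20)    # a few genuinely active voxels

        reject, p_adj, _, _ = multipletests(p_values, alpha=0.05, method="bonferroni")
        print(f"voxels surviving familywise correction: {reject.sum()}")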

  16. Graphical tools for network meta-analysis in STATA.

    PubMed

    Chaimani, Anna; Higgins, Julian P T; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results.

  17. Graphical Tools for Network Meta-Analysis in STATA

    PubMed Central

    Chaimani, Anna; Higgins, Julian P. T.; Mavridis, Dimitris; Spyridonos, Panagiota; Salanti, Georgia

    2013-01-01

    Network meta-analysis synthesizes direct and indirect evidence in a network of trials that compare multiple interventions and has the potential to rank the competing treatments according to the studied outcome. Despite its usefulness network meta-analysis is often criticized for its complexity and for being accessible only to researchers with strong statistical and computational skills. The evaluation of the underlying model assumptions, the statistical technicalities and presentation of the results in a concise and understandable way are all challenging aspects in the network meta-analysis methodology. In this paper we aim to make the methodology accessible to non-statisticians by presenting and explaining a series of graphical tools via worked examples. To this end, we provide a set of STATA routines that can be easily employed to present the evidence base, evaluate the assumptions, fit the network meta-analysis model and interpret its results. PMID:24098547

  18. Exploring Preferences of Mentoring Activities among Generational Groups of Registered Nurses in Florida

    ERIC Educational Resources Information Center

    Posey-Goodwin, Patricia Ann

    2013-01-01

    The purpose of this study was to explore differences in perceptions of mentoring activities from four generations of registered nurses in Florida, using the Alleman Mentoring Activities Questionnaire ® (AMAQ ®). Statistical procedures of analysis of variance (ANOVA) were employed to explore differences among 65 registered nurses in Florida from…

  19. STATISTICAL ESTIMATES OF VARIANCE FOR 15N ISOTOPE DILUTION MEASUREMENTS OF GROSS RATES OF NITROGEN CYCLE PROCESSES

    EPA Science Inventory

    It has been fifty years since Kirkham and Bartholomew (1954) presented the conceptual framework and derived the mathematical equations that formed the basis of the now commonly employed method of 15N isotope dilution. Although many advances in methodology and analysis have been ma...

  20. Turbulent Chemically Reacting Flows According to a Kinetic Theory. Ph.D. Thesis; [statistical analysis/gas flow

    NASA Technical Reports Server (NTRS)

    Hong, Z. C.

    1975-01-01

    A review of various methods of calculating turbulent chemically reacting flow such as the Green Function, Navier-Stokes equation, and others is presented. Nonequilibrium degrees of freedom were employed to study the mixing behavior of a multiscale turbulence field. Classical and modern theories are discussed.

  1. Gender, Academic Careers and the Sabbatical: A New Zealand Case Study

    ERIC Educational Resources Information Center

    Smith, D.; Spronken-Smith, R.; Stringer, R.; Wilson, C. A.

    2016-01-01

    This article examines academics' access to and perceptions of sabbaticals at a research-intensive university in New Zealand. Statistical and inductive analysis of survey data from 915 academics (47% of all academics employed) revealed inequalities in access to and experience of sabbaticals, and highlighted academic, personal and gender issues. Men…

  2. Socioeconomic Determinants of Urban Poverty Area Workers' Labor Force Participation and Income.

    ERIC Educational Resources Information Center

    Pinkerton, James R.

    This study examined how the socioeconomic characteristics of male workers from poverty areas in Saint Louis, Missouri, San Antonio, Texas, and Chicago, Illinois, affect their incomes, hours of employment, unemployment, and labor force participation. The research was based on statistical analysis, using an interaction model, of data from the 1970…

  3. Electronic Resource Expenditure and the Decline in Reference Transaction Statistics in Academic Libraries

    ERIC Educational Resources Information Center

    Dubnjakovic, Ana

    2012-01-01

    The current study investigates factors influencing increase in reference transactions in a typical week in academic libraries across the United States of America. Employing multiple regression analysis and general linear modeling, variables of interest from the "Academic Library Survey (ALS) 2006" survey (sample size 3960 academic libraries) were…

  4. Which Industries Are Sensitive to Business Cycles?

    ERIC Educational Resources Information Center

    Berman, Jay; Pfleeger, Janet

    1997-01-01

    An analysis of the 1994-2005 Bureau of Labor Statistics employment projections can be used to identify industries that are projected to move differently with business cycles in the future than with those of the past, and can be used to identify the industries and occupations that are most prone to business cycle swings. (Author)

  5. An Economic Analysis of the Demand for State and Local Government Employees.

    ERIC Educational Resources Information Center

    Ehrenberg, Ronald G.

    This study presents estimates of the wage elasticities of demand for state and local government employees. Almost uniformly, each functional category of state and local government employee's employment level is shown to be statistically significantly negatively related to the category's real and relative wage level. However, the magnitude of these…

  6. Neighbourhood non-employment and daily smoking: a population-based study of women and men in Sweden.

    PubMed

    Ohlander, Emma; Vikström, Max; Lindström, Martin; Sundquist, Kristina

    2006-02-01

    To examine whether neighbourhood non-employment is associated with daily smoking after adjustment for individual characteristics, such as employment status. Cross-sectional study of a simple random sample of 31,164 women and men aged 25-64, representative of the entire population in Sweden. Data were collected from the years 1993-2000. The individual variables included age, sex, employment status, occupation and housing tenure. Logistic regression was used in the analysis, with neighbourhood non-employment rates measured at the small area market statistics level. There was a significant association between neighbourhood non-employment rates and daily smoking for both women and men. After adjustment for employment status and housing tenure, the odds ratios of daily smoking were 1.39 (95% CI = 1.22-1.58) for women and 1.41 (95% CI = 1.23-1.61) for men living in neighbourhoods with the highest non-employment rates. The individual variables of unemployment, low occupational level and renting were associated with daily smoking. Neighbourhood non-employment is associated with daily smoking. Smoking prevention in primary health care should address both individuals and neighbourhoods.
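
    The logistic-regression analysis described above can be sketched as follows with simulated individual-level data; the covariates mirror those named in the abstract, but the coefficients and odds ratios produced are illustrative only.

        # Sketch of logistic regression with odds ratios and 95% CIs on simulated data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n = 5_000
        df = pd.DataFrame({
            "high_nonemployment": rng.integers(0, 2, n),   # neighbourhood indicator
            "unemployed": rng.integers(0, 2, n),
            "renter": rng.integers(0, 2, n),
            "age": rng.integers(25, 65, n),
        })
        logit_p = -2 + 0.35 * df.high_nonemployment + 0.4 * df.unemployed + 0.3 * df.renter
        df["daily_smoker"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        model = smf.logit("daily_smoker ~ high_nonemployment + unemployed + renter + age",
                          data=df).fit(disp=False)
        print(np.exp(model.params))       # odds ratios
        print(np.exp(model.conf_int()))   # 95% confidence intervals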

  7. Collaborative classification of hyperspectral and visible images with convolutional neural network

    NASA Astrophysics Data System (ADS)

    Zhang, Mengmeng; Li, Wei; Du, Qian

    2017-10-01

    Recent advances in remote sensing technology have made multisensor data available for the same area, and it is well-known that remote sensing data processing and analysis often benefit from multisource data fusion. Specifically, low spatial resolution of hyperspectral imagery (HSI) degrades the quality of the subsequent classification task while using visible (VIS) images with high spatial resolution enables high-fidelity spatial analysis. A collaborative classification framework is proposed to fuse HSI and VIS images for finer classification. First, the convolutional neural network model is employed to extract deep spectral features for HSI classification. Second, effective binarized statistical image features are learned as contextual basis vectors for the high-resolution VIS image, followed by a classifier. The proposed approach employs diversified data in a decision fusion, leading to an integration of the rich spectral information, spatial information, and statistical representation information. In particular, the proposed approach eliminates the potential problems of the curse of dimensionality and excessive computation time. The experiments evaluated on two standard data sets demonstrate better classification performance offered by this framework.

  8. The influence of economic business cycles on United States suicide rates.

    PubMed

    Wasserman, I M

    1984-01-01

    A number of social science investigators have shown that a downturn in the economy leads to an increase in the suicide rate. However, the previous works on the subject are flawed by the fact that they employ years as their temporal unit of analysis. This time period is so large that it makes it difficult for investigators to precisely determine the length of the lag effect, while at the same time removing the autocorrelation effects. Also, although most works on suicide and the business cycle employ unemployment as a measure of a downturn in the business cycle, the average duration of unemployment represents a better measure for determining the social impact of an economic downturn. From 1947 to 1977 the average monthly duration of unemployment is statistically related to the suicide rate using multivariate time-series analysis. From 1910 to 1939 the Ayres business index, a surrogate measure for movement in the business cycle, is statistically related to the monthly suicide rate. An examination of the findings confirms that in most cases a downturn in the economy causes an increase in the suicide rate.

  9. Profile Of 'Original Articles' Published In 2016 By The Journal Of Ayub Medical College, Pakistan.

    PubMed

    Shaikh, Masood Ali

    2018-01-01

    Journal of Ayub Medical College (JAMC) is the only Medline-indexed biomedical journal of Pakistan that is edited and published by a medical college. Assessing the trends in study designs employed, statistical methods used, and statistical analysis software used in the articles of medical journals helps in understanding the sophistication of published research. The objective of this descriptive study was to assess all original articles published by JAMC in the year 2016. JAMC published 147 original articles in the year 2016. The most commonly used study design was the cross-sectional study, with 64 (43.5%) articles reporting its use. Statistical tests involving bivariate analysis were most common, reported by 73 (49.6%) articles. Use of SPSS software was reported by 109 (74.1%) of the articles. Most of the original articles published, 138 (93.9%), were based on studies conducted in Pakistan. The number and sophistication of analyses reported in JAMC increased from 2014 to 2016.

  10. Summary Statistics of Public TV Licensees, 1972.

    ERIC Educational Resources Information Center

    Lee, S. Young; Pedone, Ronald J.

    Statistics in the areas of finance, employment, and broadcast and production for public TV licensees in 1972 are given in this report. Tables in the area of finance are presented specifying total funds, income, direct operating costs, and capital expenditures. Employment is divided into all employment, with subdivisions for full- and part-time employees…

  11. Some Aspects of Part-Time Work.

    ERIC Educational Resources Information Center

    Australian Dept. of Labour and National Service, Melbourne. Women's Bureau.

    Of major importance to many married women seeking employment in Australia is the availability of part-time work. To describe the economic aspects of part-time employment for women, a review was made of statistics published by the Commonwealth Bureau of Census and Statistics and of research on part-time employment in overseas countries, and a…

  12. Physics Education: A Significant Backbone of Sustainable Development in Developing Countries

    NASA Astrophysics Data System (ADS)

    Akintola, R. A.

    2006-08-01

    In the quest for technological self-reliance, many policies, programs and projects have been proposed and implemented in order to procure solutions to the problems of technological inadequacy in developing countries. It has been observed that all of these have failed. This research identifies the problems, proposes lasting solutions to emancipate physics education in developing nations, and highlights possible future gains. The statistical analysis employed was based on questionnaires, interviews and data analysis.

  13. ParallABEL: an R library for generalized parallelization of genome-wide association studies

    PubMed Central

    2010-01-01

    Background Genome-Wide Association (GWA) analysis is a powerful method for identifying loci associated with complex traits and drug response. Parts of GWA analyses, especially those involving thousands of individuals and consuming hours to months, will benefit from parallel computation. It is arduous to acquire the necessary programming skills to correctly partition and distribute data, control and monitor tasks on clustered computers, and merge output files. Results Most components of GWA analysis can be divided into four groups based on the types of input data and statistical outputs. The first group contains statistics computed for a particular Single Nucleotide Polymorphism (SNP), or trait, such as SNP characterization statistics or association test statistics. The input data of this group includes the SNPs/traits. The second group concerns statistics characterizing an individual in a study, for example, the summary statistics of genotype quality for each sample. The input data of this group includes individuals. The third group consists of pair-wise statistics derived from analyses between each pair of individuals in the study, for example genome-wide identity-by-state or genomic kinship analyses. The input data of this group includes pairs of individuals. The final group concerns pair-wise statistics derived for pairs of SNPs, such as the linkage disequilibrium characterisation. The input data of this group includes pairs of SNPs/traits. We developed the ParallABEL library, which utilizes the Rmpi library, to parallelize these four types of computations. The ParallABEL library is not only aimed at GenABEL, but may also be employed to parallelize various GWA packages in R. The data set from the North American Rheumatoid Arthritis Consortium (NARAC), which includes 2,062 individuals genotyped at 545,080 SNPs, was used to measure ParallABEL performance. Almost perfect speed-up was achieved for many types of analyses. For example, the computing time for the identity-by-state matrix was linearly reduced from approximately eight hours to one hour when ParallABEL employed eight processors. Conclusions Executing genome-wide association analysis using the ParallABEL library on a computer cluster is an effective way to boost performance and simplify the parallelization of GWA studies. ParallABEL is a user-friendly parallelization of GenABEL. PMID:20429914

  14. Hedonic approaches based on spatial econometrics and spatial statistics: application to evaluation of project benefits

    NASA Astrophysics Data System (ADS)

    Tsutsumi, Morito; Seya, Hajime

    2009-12-01

    This study discusses the theoretical foundation of the application of spatial hedonic approaches—the hedonic approach employing spatial econometrics and/or spatial statistics—to benefits evaluation. The study highlights the limitations of the spatial econometrics approach since it uses a spatial weight matrix that is not employed by the spatial statistics approach. Further, the study presents empirical analyses by applying the Spatial Autoregressive Error Model (SAEM), which is based on the spatial econometrics approach, and the Spatial Process Model (SPM), which is based on the spatial statistics approach. SPMs are conducted based on both isotropy and anisotropy and applied to different mesh sizes. The empirical analysis reveals that the estimated benefits are quite different, especially between isotropic and anisotropic SPM and between isotropic SPM and SAEM; the estimated benefits are similar for SAEM and anisotropic SPM. The study demonstrates that the mesh size does not affect the estimated amount of benefits. Finally, the study provides a confidence interval for the estimated benefits and raises an issue with regard to benefit evaluation.

  15. Prediction of the Electromagnetic Field Distribution in a Typical Aircraft Using the Statistical Energy Analysis

    NASA Astrophysics Data System (ADS)

    Kovalevsky, Louis; Langley, Robin S.; Caro, Stephane

    2016-05-01

    Due to the high cost of experimental EMI measurements, significant attention has been focused on numerical simulation. Classical methods such as the Method of Moments or Finite Difference Time Domain are not well suited for this type of problem, as they require a fine discretisation of space and fail to take uncertainties into account. In this paper, the authors show that Statistical Energy Analysis (SEA) is well suited for this type of application. SEA is a statistical approach employed to solve high-frequency problems of electromagnetically reverberant cavities at a reduced computational cost. The key aspects of this approach are (i) to consider an ensemble of systems that share the same gross parameters, and (ii) to avoid solving Maxwell's equations inside the cavity, using the power balance principle. The output is an estimate of the field magnitude distribution in each cavity. The method is applied to a typical aircraft structure.

  16. Urban-Induced Rainfall Anomalies in an Arid Regime: Evidence from a 108-Year Data Record and Satellite Measurements

    NASA Technical Reports Server (NTRS)

    Shepherd, J. Marshall

    2004-01-01

    The study employs a 108-year precipitation data record to identify statistically significant anomalies in rainfall downwind of the Phoenix urban region. The analysis reveals that during the monsoon season, locations in the northeastern suburbs and exurbs of the Phoenix metropolitan area have experienced statistically significant increases in mean precipitation of 12 to 14 percent from a pre-urban (1895-1949) to a post-urban (1950-2003) period. Mean and median post-urban precipitation totals in the anomaly region are significantly greater, in the statistical sense, than in regions west of the city and in nearby mountainous regions of similar or greater topography. Further analysis of satellite-based rainfall totals for the summer of 2003 also reveals the existence of the anomaly region during a severe drought period. The anomaly cannot simply be attributed to maximum topographic relief and is hypothesized to be related to urban-topographic interactions.
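
    A pre-urban versus post-urban comparison of seasonal precipitation totals of the kind described above can be sketched with standard two-sample tests; the totals below are simulated, not the station record used in the study.

        # Sketch of pre- vs post-urban comparison with parametric and
        # non-parametric two-sample tests on simulated seasonal totals.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)
        pre_urban = rng.gamma(shape=8, scale=10, size=55)          # 1895-1949 monsoon totals (mm)
        post_urban = rng.gamma(shape=8, scale=10, size=54) * 1.13  # 1950-2003, ~13% wetter

        t, p_t = stats.ttest_ind(pre_urban, post_urban, equal_var=False)
        u, p_u = stats.mannwhitneyu(pre_urban, post_urban, alternative="two-sided")
        change = 100 * (post_urban.mean() / pre_urban.mean() - 1)
        print(f"mean change: {change:+.1f}%  Welch p={p_t:.3f}  Mann-Whitney p={p_u:.3f}")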

  17. Factors affecting employment among people with mobility disabilities in South Korea.

    PubMed

    Park, Soo-Kyung; Yoon, Jae-Young; Henderson, Terrence

    2007-03-01

    Employment provides not only income but also opportunities for social participation. This is especially important for people with disabilities, but the employment of disabled people in many countries is subject to significant barriers. This study examines the actual state of employment of people with mobility disabilities in Korea and the characteristics that affect their employment. Analysis of responses to the Community Integration Questionnaire and independent variables among the study participants showed that the rate of employment among people with mobility disabilities (34.2%) is much lower than that of the general population (60.3%), with only 13.2% in full-time positions. Gender appeared to be a statistically significant factor influencing employment. Other demographic characteristics such as age, level of education and cohabitation did not influence employment in this study, but people with less severe disability had a higher probability of being employed. Disability acceptance appeared to be a vital factor in the process of vocational rehabilitation. The use of vocational rehabilitation services did not have a significant effect on employment. These results suggest that the role of the formal services system in the employment process of disabled people is insufficient.

  18. Combined data preprocessing and multivariate statistical analysis characterizes fed-batch culture of mouse hybridoma cells for rational medium design.

    PubMed

    Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup

    2010-10-01

    We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis technique. Initially, specific rates of cell growth, glucose/amino acid consumptions and mAb/metabolite productions were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine, (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine, and (iii) lysine, valine and isoleucine. Further analysis using partial least square (PLS) regression identified key amino acids which were positively or negatively correlated with the cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.
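
    The PCA and PLS steps described above can be sketched with scikit-learn on simulated culture data; the run count, amino-acid panel, and response variable are hypothetical.

        # Sketch of PCA on amino-acid consumption profiles and PLS regression
        # against a response such as mAb titre, using simulated data.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(9)
        n_runs, n_amino_acids = 12, 15
        X = rng.normal(size=(n_runs, n_amino_acids))      # specific consumption rates
        y = X[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=n_runs)  # e.g. mAb titre

        pca = PCA(n_components=2).fit(X)
        print("variance explained by PC1, PC2:", np.round(pca.explained_variance_ratio_, 2))

        pls = PLSRegression(n_components=2).fit(X, y)
        # Large |coefficient| marks amino acids most strongly related to the response
        print("PLS coefficients:", np.round(pls.coef_.ravel(), 2))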

  19. Assessment of Reliable Change Using 95% Credible Intervals for the Differences in Proportions: A Statistical Analysis for Case-Study Methodology.

    PubMed

    Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally

    2015-06-01

    Case-study methodology studying change is often used in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification to the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. The application of this approach to assess change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. This new method addresses a gap in the research design literature, that is, the lack of analysis methods for noncontinuous data (such as counts, rates, proportions of events) that may be used in case-study designs.
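
    As a rough illustration of a 95% credible interval for a difference in proportions, the sketch below uses conjugate Beta posteriors with uniform priors and Monte Carlo sampling; this is a generic Bayesian sketch with invented counts, not the authors' modified Reliable Change Index.

        # Sketch of a 95% credible interval for the change in a binary-outcome
        # proportion, via Beta posteriors and Monte Carlo sampling.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(10)
        pre_success, pre_n = 12, 100     # hypothetical pre-treatment counts
        post_success, post_n = 38, 100   # hypothetical post-treatment counts

        # Beta(1 + successes, 1 + failures) posteriors under uniform priors
        pre = stats.beta(1 + pre_success, 1 + pre_n - pre_success).rvs(100_000, random_state=rng)
        post = stats.beta(1 + post_success, 1 + post_n - post_success).rvs(100_000, random_state=rng)

        diff = post - pre
        lo, hi = np.percentile(diff, [2.5, 97.5])
        print(f"95% credible interval for the change in proportion: ({lo:.3f}, {hi:.3f})")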

  20. Health Status After Cancer: Does It Matter Which Hospital You Belong To?

    PubMed Central

    2010-01-01

    Background Survival rates are widely used to compare the quality of cancer care. However, the extent to which cancer survivors regain full physical or cognitive functioning is not captured by this statistic. To address this concern we introduce post-diagnosis employment as a supplemental measure of the quality of cancer care. Methods This study is based on individual level data from the Norwegian Cancer Registry (n = 46,720) linked with data on labor market outcomes and socioeconomic status from Statistics Norway. We study variation across Norwegian hospital catchment areas (n = 55) with respect to survival and employment five years after cancer diagnosis. To handle the selection problem, we exploit the fact that cancer patients in Norway (until 2001) have been allocated to local hospitals based on their place of residence. Results We document substantial differences across catchment areas with respect to patients' post-diagnosis employment rates. Conventional quality indicators based on survival rates indicate smaller differences. The two sets of indicators are only moderately correlated. Conclusions This analysis shows that indicators based on survival and post-diagnosis employment may capture different parts of the health status distribution, and that using only one of them to capture quality of care may be insufficient. PMID:20626866

  1. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future

    PubMed Central

    Barnes, Stephen; Benton, H. Paul; Casazza, Krista; Cooper, Sara; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H.; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K.; Renfrow, Matthew B.; Tiwari, Hemant K.

    2017-01-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites, and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. PMID:28239968

  2. Analysis/forecast experiments with a multivariate statistical analysis scheme using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Bloom, S. C.; Nestler, M. S.

    1985-01-01

    A three-dimensional, multivariate, statistical analysis method, optimal interpolation (OI) is described for modeling meteorological data from widely dispersed sites. The model was developed to analyze FGGE data at the NASA-Goddard Laboratory of Atmospherics. The model features a multivariate surface analysis over the oceans, including maintenance of the Ekman balance and a geographically dependent correlation function. Preliminary comparisons are made between the OI model and similar schemes employed at the European Center for Medium Range Weather Forecasts and the National Meteorological Center. The OI scheme is used to provide input to a GCM, and model error correlations are calculated for forecasts of 500 mb vertical water mixing ratios and the wind profiles. Comparisons are made between the predictions and measured data. The model is shown to be as accurate as a successive corrections model out to 4.5 days.

  3. Assessment of environmental impacts part one. Intervention analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hipel, Keith William; Lettenmaier, Dennis P.; McLeod, A. Ian

    The use of intervention analysis as a statistical method of gauging the effects of environmental changes is discussed. The Box-Jenkins model serves as the basis for the intervention analysis methodology. Environmental studies of the Aswan Dam, the South Saskatchewan River, and a forest fire near the Pipers Hole River, Canada, are included as case studies in which intervention analysis was employed. Methods of data collection for intervention analysis are found to have a significant impact on model reliability; effective data collection processes for the Box-Jenkins model are provided. (15 graphs, 27 references, 2 tables)

  4. Radioactivity Registered With a Small Number of Events

    NASA Astrophysics Data System (ADS)

    Zlokazov, Victor; Utyonkov, Vladimir

    2018-02-01

    The synthesis of superheavy elements calls for the analysis of low-statistics experimental data presumably obeying an unknown exponential distribution, and for a decision on whether they originate from one source or have admixtures. Here we analyze predictions following from non-parametric methods, employing only such fundamental sample properties as the sample mean, the median and the mode.

  5. Analysis of the Relationship between the Emotional Intelligence and Professional Burnout Levels of Teachers

    ERIC Educational Resources Information Center

    Adilogullari, Ilhan

    2014-01-01

    The purpose of this study is to analyze the relationship between the emotional intelligence and professional burnout levels of teachers. The population of the study consists of high school teachers employed in the city center of Kirsehir Province; 563 volunteer teachers form the sample. The statistical implementation of the study is performed…

  6. An Analysis of Gender Equity in the Federal Labor Relations Career Field.

    ERIC Educational Resources Information Center

    Baker, Bud; Wendt, Ann; Slonaker, William

    2002-01-01

    Government employment statistics indicate that the number of federal labor relations specialists declined 7% from 1991-2000; the proportion of women in the field grew from 42.2% to 50.9%; and the pay gap narrowed. The number of women in upper management rose 18% between 1991 and 1998. (Contains 31 references.) (SK)

  7. Phenotype profiling and multivariate statistical analysis of Spur-pruning type Grapevine in National Clonal Germplasm Repository (NCGR, Davis)

    USDA-ARS?s Scientific Manuscript database

    Most Korean vineyards employ a spur-pruning-type modified-T trellis system. This production system is suitable for spur-pruning-type cultivars, but most European table grape cultivars are not well adapted to it because their fruitfulness is suited to cane-pruning-type systems. Total 20 of fruit ch...

  8. INVESTIGATION OF THE USE OF STATISTICS IN COUNSELING STUDENTS.

    ERIC Educational Resources Information Center

    HEWES, ROBERT F.

    THE OBJECTIVE WAS TO EMPLOY TECHNIQUES OF PROFILE ANALYSIS TO DEVELOP THE JOINT PROBABILITY OF SELECTING A SUITABLE SUBJECT MAJOR AND OF ASSURING TO A HIGH DEGREE GRADUATION FROM COLLEGE WITH THAT MAJOR. THE SAMPLE INCLUDED 1,197 MIT FRESHMEN STUDENTS IN 1952-53, AND THE VALIDATION GROUP INCLUDED 699 ENTRANTS IN 1954. DATA INCLUDED SECONDARY…

  9. ADHD and Method Variance: A Latent Variable Approach Applied to a Nationally Representative Sample of College Freshmen

    ERIC Educational Resources Information Center

    Konold, Timothy R.; Glutting, Joseph J.

    2008-01-01

    This study employed a correlated trait-correlated method application of confirmatory factor analysis to disentangle trait and method variance from measures of attention-deficit/hyperactivity disorder obtained at the college level. The two trait factors were "Diagnostic and Statistical Manual of Mental Disorders-Fourth Edition" ("DSM-IV")…

  10. Testing and Evaluating C3I Systems That Employ AI. Volume 1. Handbook for Testing Expert Systems

    DTIC Science & Technology

    1991-01-31

    Nonequivalent Control Group Design ... does not receive the system; and (c) nonequivalent (and nonrandomized) control group designs that rely on statistical techniques like analysis of ... implementation); (b) multiple time-series designs using a control group; and (c) nonequivalent control group designs that obtain pretest and

  11. Training in the Food and Beverages Sector in the United Kingdom. Report for the FORCE Programme. First Edition.

    ERIC Educational Resources Information Center

    Burns, Jim A.; King, Richard

    An international team of researchers studied the following aspects of training in the United Kingdom's food and beverage sector: structure and characteristics, business and social context, training and recruitment, and future training requirements. Data were collected from an analysis of social and labor/employment statistics, literature review,…

  12. School-to-Work Transition and After: Do Inequalities between the Sexes Defy Diplomas?

    ERIC Educational Resources Information Center

    Couppie, Thomas; Epiphane, Dominique; Fournier, Christine

    1997-01-01

    Sex-related differences between the employment opportunities available in France to males and females with comparable levels of education were examined through an analysis of data from two types of sources: statistics derived from quantitative surveys conducted on broad samples of graduates 2-4 years after the end of their training and in-depth…

  13. Confocal Raman microscopy and multivariate statistical analysis for determination of different penetration abilities of caffeine and propylene glycol applied simultaneously in a mixture on porcine skin ex vivo.

    PubMed

    Mujica Ascencio, Saul; Choe, ChunSik; Meinke, Martina C; Müller, Rainer H; Maksimov, George V; Wigger-Alberti, Walter; Lademann, Juergen; Darvin, Maxim E

    2016-07-01

    Propylene glycol is one of the known substances added in cosmetic formulations as a penetration enhancer. Recently, nanocrystals have also been employed to increase the skin penetration of active components. Caffeine is a component with many applications and its penetration into the epidermis is controversially discussed in the literature. In the present study, the penetration ability of two components - caffeine nanocrystals and propylene glycol, applied topically on porcine ear skin in the form of a gel, was investigated ex vivo using two confocal Raman microscopes operated at different excitation wavelengths (785 nm and 633 nm). Several depth profiles were acquired in the fingerprint region and different spectral ranges, i.e., 526-600 cm⁻¹ and 810-880 cm⁻¹, were chosen for independent analysis of caffeine and propylene glycol penetration into the skin, respectively. Multivariate statistical methods such as principal component analysis (PCA) and linear discriminant analysis (LDA) combined with Student's t-test were employed to calculate the maximum penetration depths of each substance (caffeine and propylene glycol). The results show that propylene glycol penetrates significantly deeper than caffeine (20.7-22.0 μm versus 12.3-13.0 μm) without any penetration enhancement effect on caffeine. The results confirm that different substances, even if applied onto the skin as a mixture, can penetrate differently. The penetration depths of caffeine and propylene glycol obtained using two different confocal Raman microscopes are comparable, showing that both types of microscopes are well suited for such investigations and that multivariate statistical PCA-LDA methods combined with Student's t-test are very useful for analyzing the penetration of different substances into the skin. Copyright © 2016 Elsevier B.V. All rights reserved.
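
    As a rough illustration of the PCA-LDA approach described above, the sketch below classifies synthetic Raman-like spectra from two depth groups and applies Student's t-test to the discriminant scores; the spectra, labels and dimensions are invented for illustration, not the porcine-skin data.

        import numpy as np
        from scipy import stats
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)

        # Synthetic "spectra": 40 samples x 200 wavenumber channels, two depth groups.
        n_per_group, n_channels = 20, 200
        group_with_signal = rng.normal(0.0, 1.0, (n_per_group, n_channels)) + 0.4  # weak substance signal
        group_without = rng.normal(0.0, 1.0, (n_per_group, n_channels))
        X = np.vstack([group_with_signal, group_without])
        y = np.array([0] * n_per_group + [1] * n_per_group)   # 0 = substance present, 1 = absent

        # PCA for dimensionality reduction, then LDA on the PCA scores.
        scores = PCA(n_components=5).fit_transform(X)
        lda_scores = LinearDiscriminantAnalysis(n_components=1).fit_transform(scores, y).ravel()

        # Student's t-test on the discriminant scores: a significant difference suggests
        # the substance signal is still detectable at this depth.
        t_stat, p_value = stats.ttest_ind(lda_scores[y == 0], lda_scores[y == 1])
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")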

  14. Raising the bar for reproducible science at the U.S. Environmental Protection Agency Office of Research and Development.

    PubMed

    George, Barbara Jane; Sobus, Jon R; Phelps, Lara P; Rashleigh, Brenda; Simmons, Jane Ellen; Hines, Ronald N

    2015-05-01

    Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bias in sample selection or subject recruitment, errors in developing data inclusion/exclusion criteria, and flawed statistical analysis. To address some of these issues, several publishers have developed checklists that authors must complete. Others have either enhanced statistical expertise on existing editorial boards, or formed distinct statistics editorial boards. Although the U.S. Environmental Protection Agency, Office of Research and Development, already has a strong Quality Assurance Program, an initiative was undertaken to further strengthen statistics consideration and other factors in study design and also to ensure these same factors are evaluated during the review and approval of study protocols. To raise awareness of the importance of statistical issues and provide a forum for robust discussion, a Community of Practice for Statistics was formed in January 2014. In addition, three working groups were established to develop a series of questions or criteria that should be considered when designing or reviewing experimental, observational, or modeling focused research. This article describes the process used to develop these study design guidance documents, their contents, how they are being employed by the Agency's research enterprise, and expected benefits to Agency science. The process and guidance documents presented here may be of utility for any research enterprise interested in enhancing the reproducibility of its science. © The Author 2015. Published by Oxford University Press on behalf of the Society of Toxicology.

  15. Employing Introductory Statistics Students at "Stats Dairy"

    ERIC Educational Resources Information Center

    Keeling, Kellie

    2011-01-01

    To combat students' fear of statistics, I employ my students at a fictional company, Stats Dairy, run by cows. Almost all examples used in the class notes, exercises, humour and exams use data "collected" from this company.

  16. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
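
    To convey the flavor of the ROS approach, the sketch below regresses the logarithms of the uncensored values on normal quantiles of their plotting positions and uses the fit to fill in a singly censored dataset; this is a simplified Python illustration with invented data and a single detection limit, not the authors' S-language/R implementation for multiple detection limits.

        import numpy as np
        from scipy import stats

        # Synthetic concentrations with one detection limit (simplified; full ROS handles
        # multiple detection limits via survival-function-based plotting positions).
        detected = np.array([0.8, 1.2, 1.9, 2.5, 3.1, 4.8, 7.2])   # measured values
        n_censored = 5                                              # values reported as "<0.5"
        detection_limit = 0.5

        n_total = len(detected) + n_censored
        # Plotting positions for the detected values (here the detections occupy the top ranks).
        ranks = np.arange(n_censored + 1, n_total + 1)
        pp = (ranks - 0.375) / (n_total + 0.25)            # Blom-type plotting positions
        z = stats.norm.ppf(pp)                             # normal quantiles

        # Regress log-concentration on the normal quantiles (lognormal assumption).
        slope, intercept, *_ = stats.linregress(z, np.log(np.sort(detected)))

        # Impute the censored observations from the fitted line at their plotting positions.
        pp_cens = (np.arange(1, n_censored + 1) - 0.375) / (n_total + 0.25)
        imputed = np.exp(intercept + slope * stats.norm.ppf(pp_cens))
        imputed = np.minimum(imputed, detection_limit)     # keep imputations below the limit

        full_sample = np.concatenate([imputed, detected])
        print(f"ROS-style mean = {full_sample.mean():.2f}, std = {full_sample.std(ddof=1):.2f}")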

  17. A cohort mortality study of employees exposed to chlorinated chemicals.

    PubMed

    Wong, O

    1988-01-01

    The cohort of this historical prospective mortality study consisted of 697 male employees at a chlorination plant. A majority of the cohort was potentially exposed to benzotrichloride, benzyl chloride, benzoyl chloride, and other related chemicals. The mortality experience of the cohort was observed from 1943 through 1982. For the cohort as a whole, no statistically significant mortality excess was detected. The overall Standardized Mortality Ratio (SMR) was 100, and the SMR for all cancers combined was 122 (not significant). The respiratory cancer SMR for the cohort as a whole was 246 (7 observed vs. 2.8 expected). The excess was of borderline statistical significance, the lower 95% confidence limit being 99. Analysis by race showed that all 7 respiratory cancer deaths came from the white male employees, with an SMR of 265 (p less than 0.05). The respiratory cancer mortality excess was higher among employees in maintenance (SMR = 229) than among those in operations or production (SMR = 178). The lung cancer mortality excess among the laboratory employees was statistically significant (SMR = 1292). However, this observation should be viewed with caution, since it was based on only 2 deaths. Further analysis indicated that the respiratory cancer mortality excess was limited to the male employees with 15 or more years of employment (SMR = 379, p less than 0.05). Based on animal data as well as other epidemiologic studies, together with the internal consistency of analysis by length of employment, the data suggest an association between the chlorination process of toluene at the plant and an increased risk of respiratory cancer.(ABSTRACT TRUNCATED AT 250 WORDS)

  18. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
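
    A common way to express the standard-deviation routine mentioned above is a moving-block standard deviation of successive spectra: as the blend approaches homogeneity, consecutive spectra stop changing and the statistic levels off. The sketch below uses synthetic spectra and an arbitrary block size; it is not the LabVIEW/SCADA implementation described in the paper.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic NIR spectra collected during blending: early spectra vary strongly,
        # later spectra converge as the powder bed homogenizes.
        n_spectra, n_wavelengths = 60, 100
        target = rng.normal(1.0, 0.05, n_wavelengths)
        spectra = np.array([
            target + rng.normal(0, max(0.2 - 0.003 * t, 0.01), n_wavelengths)
            for t in range(n_spectra)
        ])

        def moving_block_std(spectra, block=5):
            """Mean (over wavelengths) of the std dev across each block of consecutive spectra."""
            return np.array([
                spectra[i:i + block].std(axis=0, ddof=1).mean()
                for i in range(len(spectra) - block + 1)
            ])

        mbsd = moving_block_std(spectra)
        print("final moving-block SD:", round(float(mbsd[-1]), 4))
        # A plateau at a low value is taken as an indication of blend homogeneity.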

  19. [Economic development and married women's employment in Taiwan: a study of female marginalization].

    PubMed

    Lu, Y

    1994-07-01

    As in other developing countries, the industrial development in Taiwan seems to marginalize female workers. This study tries to examine the trend of women's employment status, using both macro- and micro-level data. The statistics suggest that female employment had significantly declined during the early stages of industrialization. Although rapid economic development has expanded women's job opportunities, most women are concentrated in lower-status jobs and the informal sector. Informal employment is especially prevalent among married women. In the micro-level analysis the study examines the factors that led to the marginalization of women's labor force. The empirical analysis applies a multinomial logistic model to a 1980 KAP (knowledge, attitude, and practice) survey sample of 3859 married women. The results suggest that married women's work patterns in terms of formal vs. informal employment are determined by the family organization rather than by labor market conditions. Wives from families with small businesses are more likely to be involved in informal employment. Wives also tend to work informally when they have young children. On the other hand, the effects of labor market conditions are mediated by the types of family economy. Therefore the women's informal employment in Taiwan, as a characteristic of female marginalization, is the result of the sexual division of labor in the family organization and the prevalence of the family business, rather than that of being excluded into the marginal forms of employment through the process of capitalistic production, as argued by the female marginalization theorists.

  20. Tolerancing aspheres based on manufacturing statistics

    NASA Astrophysics Data System (ADS)

    Wickenhagen, S.; Möhl, A.; Fuchs, U.

    2017-11-01

    A standard way of tolerancing optical elements or systems is to perform a Monte Carlo based analysis within a common optical design software package. Although different weightings and distributions are assumed, they all rely on statistics, which usually means several hundred or thousand systems for reliable results. Thus, employing these methods for small batch sizes is unreliable, especially when aspheric surfaces are involved. The huge database of asphericon was used to investigate the correlation between the given tolerance values and measured data sets. The resulting probability distributions of these measured data were analyzed aiming for a robust optical tolerancing process.
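
    For context, a Monte Carlo tolerance analysis of the kind referred to above perturbs each parameter according to an assumed distribution and accumulates the resulting performance metric; the sketch below uses a toy merit function and invented tolerance values rather than any real lens design.

        import numpy as np

        rng = np.random.default_rng(4)

        # Toy example: RMS wavefront error as a function of two perturbed parameters
        # (radius error in %, surface tilt in mrad). The merit function and tolerance
        # values are invented for illustration only.
        def rms_wavefront_error(radius_err_pct, tilt_mrad):
            return np.sqrt((0.02 * radius_err_pct) ** 2 + (0.05 * tilt_mrad) ** 2)

        n_systems = 5000
        radius_err = rng.normal(0.0, 0.1, n_systems)     # assumed 0.1 % standard deviation
        tilt = rng.uniform(-0.5, 0.5, n_systems)         # assumed uniform +/- 0.5 mrad

        errors = np.array([rms_wavefront_error(r, t) for r, t in zip(radius_err, tilt)])
        print(f"median error = {np.median(errors):.4f} waves, "
              f"95th percentile = {np.percentile(errors, 95):.4f} waves")
        # With only a handful of manufactured parts, these Monte Carlo percentiles can be
        # misleading, which motivates the use of measured manufacturing statistics instead.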

  1. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
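
    Of the three techniques named above, control charting is the simplest to sketch: a Shewhart-style individuals chart flags readings that drift outside control limits. The sketch below uses synthetic thermocouple data and conventional 3-sigma limits; it is not the SAS-based NDMAS implementation.

        import numpy as np

        rng = np.random.default_rng(5)

        # Synthetic thermocouple readings (deg C) with a drift injected near the end,
        # mimicking the onset of sensor deterioration.
        readings = np.concatenate([
            rng.normal(1100.0, 4.0, 80),
            rng.normal(1100.0, 4.0, 20) + np.linspace(0, 25, 20),
        ])

        # Shewhart individuals chart: control limits from an in-control baseline period.
        baseline = readings[:50]
        center = baseline.mean()
        sigma = baseline.std(ddof=1)
        upper, lower = center + 3 * sigma, center - 3 * sigma

        out_of_control = np.where((readings > upper) | (readings < lower))[0]
        print(f"center = {center:.1f}, UCL = {upper:.1f}, LCL = {lower:.1f}")
        print("first out-of-control sample index:",
              out_of_control[0] if out_of_control.size else "none")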

  2. A statistical probe into variability within total ozone time series over Arosa, Switzerland (9.68°E, 46.78°N)

    NASA Astrophysics Data System (ADS)

    Chakraborthy, Parthasarathi; Chattopadhyay, Surajit

    2013-02-01

    The endeavor of the present paper is to investigate the statistical properties of the total ozone concentration time series over Arosa, Switzerland (9.68°E, 46.78°N). For this purpose, different statistical data analysis procedures have been employed for analyzing the mean monthly total ozone concentration data, collected over a period of 40 years (1932-1971), at the above location. Based on the computations on the available data set, the study reports different degrees of variation in different months. The month of July is reported as the month of lowest variability. April and May are found to be the most correlated months with respect to total ozone concentration.

  3. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.

  4. Generating a Magellanic star cluster catalog with ASteCA

    NASA Astrophysics Data System (ADS)

    Perren, G. I.; Piatti, A. E.; Vázquez, R. A.

    2016-08-01

    An increasing number of software tools have been employed in recent years for the automated or semi-automated processing of astronomical data. The main advantages of using these tools over a standard by-eye analysis include: speed (particularly for large databases), homogeneity, reproducibility, and precision. At the same time, they enable a statistically correct study of the uncertainties associated with the analysis, in contrast with manually set errors, or the still widespread practice of simply not assigning errors. We present a catalog comprising 210 star clusters located in the Large and Small Magellanic Clouds, observed with Washington photometry. Their fundamental parameters were estimated through a homogeneous, automated, and completely unassisted process, via the Automated Stellar Cluster Analysis package (ASteCA). Our results are compared with two types of studies on these clusters: one where the photometry is the same, and another where the photometric system is different from that employed by ASteCA.

  5. Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review

    PubMed Central

    Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie

    2015-01-01

    Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
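
    As a concrete example of the count-regression approaches the review found most often, the sketch below fits a Poisson GLM of food-outlet counts on a neighbourhood disadvantage score; the data are simulated, and the model deliberately ignores the spatial-autocorrelation issue the authors raise.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)

        # Simulated neighbourhoods: outlet counts increase with a disadvantage score.
        n = 200
        disadvantage = rng.normal(0.0, 1.0, n)               # standardized SES disadvantage
        true_rate = np.exp(1.0 + 0.3 * disadvantage)         # assumed association
        outlet_count = rng.poisson(true_rate)

        X = sm.add_constant(disadvantage)
        poisson_fit = sm.GLM(outlet_count, X, family=sm.families.Poisson()).fit()
        print(poisson_fit.summary())
        # If the Pearson chi2 / df ratio indicates overdispersion, a negative binomial
        # family (sm.families.NegativeBinomial()) is the usual next step.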

  6. Hydrometeorological application of an extratropical cyclone classification scheme in the southern United States

    NASA Astrophysics Data System (ADS)

    Senkbeil, J. C.; Brommer, D. M.; Comstock, I. J.; Loyd, T.

    2012-07-01

    Extratropical cyclones (ETCs) in the southern United States are often overlooked when compared with tropical cyclones in the region and ETCs in the northern United States. Although southern ETCs are significant weather events, there is currently no operational scheme used for identifying and discussing these nameless storms. In this research, we classified 84 ETCs (1970-2009). We manually identified five distinct formation regions and seven unique ETC types using statistical classification. The statistical classification employed principal components analysis and two methods of cluster analysis. Both manual and statistical storm types generally showed positive (negative) relationships with El Niño (La Niña). Manual storm types displayed precipitation swaths consistent with discrete storm tracks, which further legitimizes the existence of multiple modes of southern ETCs. Statistical storm types also displayed unique precipitation intensity swaths, but these swaths were less indicative of track location. It is hoped that, by classifying southern ETCs into types, forecasters, hydrologists, and broadcast meteorologists might be able to better anticipate projected amounts of precipitation at their locations.
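
    The statistical classification step described above (principal components followed by clustering) can be sketched as follows; the storm descriptors and the choice of k-means are invented stand-ins for whichever variables and cluster methods the authors actually used.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(7)

        # Synthetic ETC descriptors: central pressure (hPa), deepening rate (hPa/12 h),
        # track direction (deg), and precipitation swath width (km) for 84 storms.
        storms = np.column_stack([
            rng.normal(1000, 8, 84),
            rng.normal(-6, 3, 84),
            rng.normal(60, 25, 84),
            rng.normal(250, 80, 84),
        ])

        # Standardize, reduce with PCA, then cluster the leading components into 7 types.
        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(storms))
        labels = KMeans(n_clusters=7, n_init=10, random_state=0).fit_predict(scores)
        print("storms per statistical type:", np.bincount(labels))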

  7. Investing in Upskilling: Gains for Individuals, Employers and Government. In Focus: Benefit Receipt Payments

    ERIC Educational Resources Information Center

    Murray, Scott; Shillington, Richard

    2012-01-01

    Examining costs and savings associated with moving every Canadian with a Literacy Level 1 or 2 (on the international literacy scale) to Level 3, this analysis is based upon statistically matched data from the "2003 International Adult Literacy and Skills Survey and the 2005-2009 Surveys of Labour and Income Dynamics." The methods provide…

  8. Correlation-based network analysis of metabolite and enzyme profiles reveals a role of citrate biosynthesis in modulating N and C metabolism in zea mays

    USDA-ARS?s Scientific Manuscript database

    To investigate the natural variability of leaf metabolism and enzymatic activity in a maize inbred population, statistical and network analyses were employed on metabolite and enzyme profiles. The test of coefficient of variation showed that sugars and amino acids displayed opposite trends in their ...

  9. Environmental Studies: Mathematical, Computational and Statistical Analyses

    DTIC Science & Technology

    1993-03-03

    mathematical analysis addresses the seasonally and longitudinally averaged circulation which is under the influence of a steady forcing located asymmetrically...employed, as has been suggested for some situations. A general discussion of how interfacial phenomena influence both the original contamination process...describing the large-scale advective and dispersive behaviour of contaminants transported by groundwater and the uncertainty associated with field-scale

  10. Got Power? A Systematic Review of Sample Size Adequacy in Health Professions Education Research

    ERIC Educational Resources Information Center

    Cook, David A.; Hatala, Rose

    2015-01-01

    Many education research studies employ small samples, which in turn lowers statistical power. We re-analyzed the results of a meta-analysis of simulation-based education to determine study power across a range of effect sizes, and the smallest effect that could be plausibly excluded. We systematically searched multiple databases through May 2011,…

  11. Strategy for Promoting the Equitable Development of Basic Education in Underdeveloped Counties as Seen from Cili County

    ERIC Educational Resources Information Center

    Shihua, Peng; Rihui, Tan

    2009-01-01

    Employing statistical analysis, this study has made a preliminary exploration of promoting the equitable development of basic education in underdeveloped counties through the case study of Cili county. The unequally developed basic education in the county has been made clear, the reasons for the inequitable education have been analyzed, and,…

  12. Genome-wide scans of genetic variants for psychophysiological endophenotypes: a methodological overview.

    PubMed

    Iacono, William G; Malone, Stephen M; Vaidyanathan, Uma; Vrieze, Scott I

    2014-12-01

    This article provides an introductory overview of the investigative strategy employed to evaluate the genetic basis of 17 endophenotypes examined as part of a 20-year data collection effort from the Minnesota Center for Twin and Family Research. Included are characterization of the study samples, descriptive statistics for key properties of the psychophysiological measures, and rationale behind the steps taken in the molecular genetic study design. The statistical approach included (a) biometric analysis of twin and family data, (b) heritability analysis using 527,829 single nucleotide polymorphisms (SNPs), (c) genome-wide association analysis of these SNPs and 17,601 autosomal genes, (d) follow-up analyses of candidate SNPs and genes hypothesized to have an association with each endophenotype, (e) rare variant analysis of nonsynonymous SNPs in the exome, and (f) whole genome sequencing association analysis using 27 million genetic variants. These methods were used in the accompanying empirical articles comprising this special issue, Genome-Wide Scans of Genetic Variants for Psychophysiological Endophenotypes. Copyright © 2014 Society for Psychophysiological Research.

  13. Outcomes of home-based employment service programs for people with disabilities and their related factors--a preliminary study in Taiwan.

    PubMed

    Lin, Yi-Jiun; Huang, I-Chun; Wang, Yun-Tung

    2014-01-01

    The aim of this exploratory study is to gain an understanding of the outcomes of home-based employment service programs for people with disabilities and their related factors in Taiwan. This study used a survey method to collect 132 questionnaires. Descriptive and two-variable statistics including chi-square (χ(2)), independent sample t-test and analysis of variance were employed. The results found that 36.5% of the subjects improved their employment status and 75.8% of them improved in employability. Educational level and vocational categories including "web page production", "e-commerce", "internet marketing", "on-line store" and "website set-up and management" were significantly "positively" associated with either of the two outcome indicators - change of employment status and employability. This study is the first evidence-based study about the outcomes of home-based employment service programs and their related factors for people with disabilities in Taiwan. The outcomes of the home-based employment service programs for people with disabilities were presented. Implications for Rehabilitation: Home-based rehabilitation for people with disabilities can be effective. A programme of this kind supports participants in improving or gaining employment status as well as developing employability skills. Further consideration should be given to developing cost-effective home-based programmes and evaluating their effectiveness.

  14. Cost-benefit analysis for sheltered employment service programs for people with disabilities in Taiwan - a preliminary study.

    PubMed

    Wang, Yun-Tung; Lin, Yi-Jiun; Shu, Ching-Hsien

    2012-01-01

    The aim of this study is to perform a cost-benefit analysis with monetary and non-monetary benefits for sheltered employment service programs and to provide more evidence-based information for policy makers and practitioners to understand the outcomes of sheltered employment services. This study analyzed 3 sheltered employment service programs for people with disabilities (2006-2007) implemented by Sunshine Social Welfare Foundation in Taiwan using cost-benefit analysis (including non-monetary benefits). Three groups were analyzed, including participants in the programs, taxpayers, and society (participants and taxpayers). This study found that the net social monetary benefit was $NT29,432.07 per participant per year and the benefit cost ratio was 1.43. (In 2006-2007, $US1 averaged around $NT32.5.) The net monetary benefit for the participants was between $NT7,890.86 and $NT91,890.86 per participant per year. On the non-monetary benefit side, the physical health (raised 7.49%), social relationship (raised 3.36%) domains, and general quality of life (raised 2.53%) improved. However, the psychological (decreased 1.51%) and working/environment (decreased 3.85%) domains declined. In addition, the differences between pre-test and post-test average scores of all domains were not statistically significant. This study is the first to use monetary and non-monetary cost-benefit analysis methods to analyze sheltered employment service programs for people with disabilities in Taiwan. The findings indicated that sheltered employment service programs for people with disabilities could be efficient and beneficial for the whole society and sheltered employees/clients, and also helpful for raising their quality of life.

  15. Improvement of submerged culture conditions to produce colorants by Penicillium purpurogenum

    PubMed Central

    Santos-Ebinuma, Valéria Carvalho; Roberto, Inês Conceição; Teixeira, Maria Francisca Simas; Pessoa, Adalberto

    2014-01-01

    Safety issues related to the employment of synthetic colorants in different industrial segments have increased the interest in the production of colorants from natural sources, such as microorganisms. Improved cultivation technologies have allowed the use of microorganisms as an alternative source of natural colorants. The objective of this work was to evaluate, employing statistical tools, the influence of several factors on natural colorant production by Penicillium purpurogenum DPUA 1275, a strain recently isolated from the Amazon Forest. For this purpose, the following variables were studied: orbital stirring speed, pH, temperature, sucrose and yeast extract concentrations, and incubation time, using two fractional factorial designs, one full factorial design, and a central composite design. The regression analysis indicated that sucrose and yeast extract concentrations were the variables with the greatest influence on colorant production. Under the best conditions (yeast extract concentration around 10 g/L and sucrose concentration of 50 g/L), increases of 10, 33 and 23% in yellow, orange and red colorant absorbance, respectively, were achieved. These results show that P. purpurogenum is an alternative colorant producer and that the production of these biocompounds can be improved by employing statistical tools. PMID:25242965

  16. Multivariate analysis in thoracic research.

    PubMed

    Mengual-Macenlle, Noemí; Marcos, Pedro J; Golpe, Rafael; González-Rivas, Diego

    2015-03-01

    Multivariate analysis is based on the observation and analysis of more than one statistical outcome variable at a time. In design and analysis, the technique is used to perform trade studies across multiple dimensions while taking into account the effects of all variables on the responses of interest. The development of multivariate methods emerged to analyze large databases and increasingly complex data. Since the best way to represent knowledge of reality is through modeling, we should use multivariate statistical methods. Multivariate methods are designed to simultaneously analyze data sets, i.e., the analysis of different variables for each person or object studied. Keep in mind at all times that all variables must be treated in a way that accurately reflects the reality of the problem addressed. There are different types of multivariate analysis, and each one should be employed according to the type of variables to analyze: dependence, interdependence and structural methods. In conclusion, multivariate methods are ideal for the analysis of large data sets and for finding cause-and-effect relationships between variables; there is a wide range of analysis types that we can use.

  17. Frequency of color blindness in pre-employment screening in a tertiary health care center in Pakistan.

    PubMed

    Chhipa, Shaukat Ali; Hashmi, Farzeen K; Ali, Shehreen; Kamal, Mustafa; Ahmad, Khabir

    2017-01-01

    To describe the frequency of color vision deficiency among Pakistani adults presenting for pre-employment health screening in a tertiary care hospital. The cross-sectional study was carried out at the Aga Khan University Hospital, Karachi, and the data was collected for color vision deficiency, age, gender, and job applied for from pre-employment examination during 2013-2014. IBM SPSS 20 was used for statistical analysis. Three thousand four hundred and thirty seven persons underwent pre-employment screening during 2013 and 2014; 1837 (53.44%) were males and 1600 (46.65%) females. The mean age was 29.01 (±6.53) years. A total of 0.9% (32/3437) persons had color vision deficiency with male being 1.4% and female 0.4%. Color vision deficiency was observed in 0.9% of candidates screened for pre-employment health check up in a tertiary care hospital. The color vision deficiency was predominantly present in male individuals.

  18. Match statistics related to winning in the group stage of 2014 Brazil FIFA World Cup.

    PubMed

    Liu, Hongyou; Gomez, Miguel-Ángel; Lago-Peñas, Carlos; Sampaio, Jaime

    2015-01-01

    Identifying match statistics that strongly contribute to winning in football matches is a very important step towards a more predictive and prescriptive performance analysis. The current study aimed to determine relationships between 24 match statistics and the match outcome (win, loss and draw) in all games and close games of the group stage of the FIFA World Cup (2014, Brazil) by employing the generalised linear model. A cumulative logistic regression was run in the model, taking the value of each match statistic as the independent variable to predict the logarithm of the odds of winning. Relationships were assessed as effects of a two-standard-deviation increase in the value of each variable on the change in the probability of a team winning a match. Non-clinical magnitude-based inferences were employed and were evaluated by using the smallest worthwhile change. Results showed that for all the games, nine match statistics had clearly positive effects on the probability of winning (Shot, Shot on Target, Shot from Counter Attack, Shot from Inside Area, Ball Possession, Short Pass, Average Pass Streak, Aerial Advantage and Tackle), four had clearly negative effects (Shot Blocked, Cross, Dribble and Red Card), and the other 12 statistics had either trivial or unclear effects. For the close games, however, the effects of Aerial Advantage and Yellow Card became trivial and clearly negative, respectively. Information from the tactical modelling can provide a more thorough and objective match understanding to coaches and performance analysts for evaluating post-match performances and for scouting upcoming oppositions.
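
    A cumulative logistic regression of an ordered match outcome on a standardized match statistic can be sketched with statsmodels, assuming a version that provides OrderedModel (roughly 0.12 or later); the data below are simulated, and the two-standard-deviation scaling follows the idea described in the abstract rather than the authors' exact procedure.

        import numpy as np
        from statsmodels.miscmodels.ordinal_model import OrderedModel

        rng = np.random.default_rng(8)

        # Simulated matches: shots on target influence an ordered outcome (0 loss, 1 draw, 2 win).
        n = 300
        shots_on_target = rng.poisson(4.5, n).astype(float)
        latent = 0.6 * shots_on_target + rng.logistic(0, 1, n)
        outcome = np.digitize(latent, bins=[2.0, 3.5])     # 0 = loss, 1 = draw, 2 = win

        # Scale the predictor so its coefficient reflects a two-standard-deviation increase.
        x = (shots_on_target - shots_on_target.mean()) / (2 * shots_on_target.std(ddof=1))

        model = OrderedModel(outcome, x[:, None], distr="logit")
        result = model.fit(method="bfgs", disp=False)
        print(result.summary())   # the slope estimates the change in log-odds of a better outcome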

  19. Mechanical properties of silicate glasses exposed to a low-Earth orbit

    NASA Technical Reports Server (NTRS)

    Wiedlocher, David E.; Tucker, Dennis S.; Nichols, Ron; Kinser, Donald L.

    1992-01-01

    The effects of a 5.8 year exposure to low earth orbit environment upon the mechanical properties of commercial optical fused silica, low iron soda-lime-silica, Pyrex 7740, Vycor 7913, BK-7, and the glass ceramic Zerodur were examined. Mechanical testing employed the ASTM-F-394 piston on 3-ball method in a liquid nitrogen environment. Samples were exposed on the Long Duration Exposure Facility (LDEF) in two locations. Impacts were observed on all specimens except Vycor. Weibull analysis as well as a standard statistical evaluation were conducted. The Weibull analysis revealed no differences between control samples and the two exposed samples. We thus concluded that radiation components of the Earth orbital environment did not degrade the mechanical strength of the samples examined within the limits of experimental error. The upper bound of strength degradation for meteorite impacted samples based upon statistical analysis and observation was 50 percent.
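
    Weibull analysis of fracture-strength data like that described above typically means fitting a two-parameter Weibull distribution to the measured strengths; the sketch below uses synthetic strengths and scipy's generic fitter rather than the authors' procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(9)

        # Synthetic biaxial flexure strengths (MPa) for control and exposed sample sets.
        control = rng.weibull(8.0, 30) * 120.0
        exposed = rng.weibull(8.0, 30) * 118.0

        def weibull_params(strengths):
            """Fit a two-parameter Weibull (location fixed at zero); return (modulus, scale)."""
            shape, _, scale = stats.weibull_min.fit(strengths, floc=0)
            return shape, scale

        for name, data in [("control", control), ("exposed", exposed)]:
            m, s0 = weibull_params(data)
            print(f"{name}: Weibull modulus m = {m:.1f}, characteristic strength = {s0:.1f} MPa")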

  20. Statistical inference for Hardy-Weinberg proportions in the presence of missing genotype information.

    PubMed

    Graffelman, Jan; Sánchez, Milagros; Cook, Samantha; Moreno, Victor

    2013-01-01

    In genetic association studies, tests for Hardy-Weinberg proportions are often employed as a quality control checking procedure. Missing genotypes are typically discarded prior to testing. In this paper we show that inference for Hardy-Weinberg proportions can be biased when missing values are discarded. We propose to use multiple imputation of missing values in order to improve inference for Hardy-Weinberg proportions. For imputation we employ a multinomial logit model that uses information from allele intensities and/or neighbouring markers. Analysis of an empirical data set of single nucleotide polymorphisms possibly related to colon cancer reveals that missing genotypes are not missing completely at random. Deviation from Hardy-Weinberg proportions is mostly due to a lack of heterozygotes. Inbreeding coefficients estimated by multiple imputation of the missings are typically lowered with respect to inbreeding coefficients estimated by discarding the missings. Accounting for missings by multiple imputation qualitatively changed the results of 10 to 17% of the statistical tests performed. Estimates of inbreeding coefficients obtained by multiple imputation showed high correlation with estimates obtained by single imputation using an external reference panel. Our conclusion is that imputation of missing data leads to improved statistical inference for Hardy-Weinberg proportions.
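
    For reference, the basic Hardy-Weinberg test on observed genotype counts (before any treatment of missing data) can be sketched as a one-degree-of-freedom chi-square test; the counts below are invented, and an exact test would usually be preferred for rare alleles.

        import numpy as np
        from scipy import stats

        # Observed genotype counts for a biallelic SNP (invented numbers).
        n_AA, n_AB, n_BB = 420, 95, 12
        n = n_AA + n_AB + n_BB

        # Allele frequency of A and Hardy-Weinberg expected genotype counts.
        p = (2 * n_AA + n_AB) / (2 * n)
        expected = np.array([p**2, 2 * p * (1 - p), (1 - p) ** 2]) * n

        observed = np.array([n_AA, n_AB, n_BB])
        chi2 = ((observed - expected) ** 2 / expected).sum()
        p_value = stats.chi2.sf(chi2, df=1)   # 3 classes - 1 - 1 estimated parameter = 1 df
        print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")

        # Inbreeding coefficient estimate: deficit of heterozygotes relative to expectation.
        f_hat = 1 - n_AB / expected[1]
        print(f"inbreeding coefficient estimate = {f_hat:.3f}")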

  1. Using Computational Modeling to Assess the Impact of Clinical Decision Support on Cancer Screening within Community Health Centers

    PubMed Central

    Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.

    2014-01-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman’s Rho test to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS measured by the rate of organizational learning. We employed previously collected survey data from the community health centers Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241

  2. Exploring the statistics of magnetic reconnection X-points in kinetic particle-in-cell turbulence

    NASA Astrophysics Data System (ADS)

    Haggerty, C. C.; Parashar, T. N.; Matthaeus, W. H.; Shay, M. A.; Yang, Y.; Wan, M.; Wu, P.; Servidio, S.

    2017-10-01

    Magnetic reconnection is a ubiquitous phenomenon in turbulent plasmas. It is an important part of the turbulent dynamics and heating of space and astrophysical plasmas. We examine the statistics of magnetic reconnection using a quantitative local analysis of the magnetic vector potential, previously used in magnetohydrodynamics simulations, and now employed to fully kinetic particle-in-cell (PIC) simulations. Different ways of reducing the particle noise for analysis purposes, including multiple smoothing techniques, are explored. We find that a Fourier filter applied at the Debye scale is an optimal choice for analyzing PIC data. Finally, we find a broader distribution of normalized reconnection rates compared to the MHD limit with rates as large as 0.5 but with an average of approximately 0.1.

  3. Do immigrants working illegally reduce the natives' legal employment? Evidence from Italy.

    PubMed

    Venturini, A

    1999-01-01

    This paper examines how immigrants working illegally in the shadow economy affect the legal employment of native and foreign workers in the official economy of Italy. The data set used was provided by the Central Statistical Office and includes information regarding the units of labor employed both in official production and in underground production; employment in the latter is subdivided into native workers and foreign workers. Estimates were then made as to how "legal employment" has reacted to changes in "illegal employment", with special reference to the effect of the foreign component of "illegal labor". The results of the cross sector-time series analysis of the demand for legal labor in the Italian economy from 1980 to 1995 showed that the increase of illegal units of labor produces a reduction in the use of legal labor, albeit a very limited one. An analysis by sectors shows that the competitive effect of illegal foreign workers is not homogeneous and is strongest in the agricultural sector, while complementarity between the two categories of labor is evident in the nontradable services sector. When the effects of illegal foreign and illegal native workers are compared, the effect of illegal native workers is found to be smaller than that of illegal foreign workers. Despite regularization in Italy and the lack of flexibility in the labor market, neither regular nor nonregular foreign workers have begun to openly displace native workers.

  4. Employment Condition, Economic Deprivation and Self-Evaluated Health in Europe: Evidence from EU-SILC 2009-2012.

    PubMed

    Bacci, Silvia; Pigini, Claudia; Seracini, Marco; Minelli, Liliana

    2017-02-03

    Background: The mixed empirical evidence about employment conditions (i.e., permanent vs. temporary job, full-time vs. part-time job) as well as unemployment has motivated the development of conceptual models with the aim of assessing the pathways leading to effects of employment status on health. Alongside physically and psychologically riskier working conditions, one channel stems from the possibly severe economic deprivation faced by temporary workers. We investigate whether economic deprivation is able to partly capture the effect of employment status on Self-evaluated Health Status (SHS). Methods: Our analysis is based on the European Union Statistics on Income and Living Conditions (EU-SILC) survey, for a balanced sample from 26 countries from 2009 to 2012. We estimate a correlated random-effects logit model for the SHS that accounts for the ordered nature of the dependent variable and the longitudinal structure of the data. Results and Discussion: Material deprivation and economic strain are able to partly account for the negative effects on SHS from precarious and part-time employment as well as from unemployment that, however, exhibits a significant independent negative association with SHS. Conclusions: Some of the indicators used to proxy economic deprivation are significant predictors of SHS and their correlation with the employment condition is such that it should not be neglected in empirical analysis, when available and further to the monetary income.

  5. Employment Condition, Economic Deprivation and Self-Evaluated Health in Europe: Evidence from EU-SILC 2009–2012

    PubMed Central

    Bacci, Silvia; Pigini, Claudia; Seracini, Marco; Minelli, Liliana

    2017-01-01

    Background: The mixed empirical evidence about employment conditions (i.e., permanent vs. temporary job, full-time vs. part-time job) as well as unemployment has motivated the development of conceptual models with the aim of assessing the pathways leading to effects of employment status on health. Alongside physically and psychologically riskier working conditions, one channel stems from the possibly severe economic deprivation faced by temporary workers. We investigate whether economic deprivation is able to partly capture the effect of employment status on Self-evaluated Health Status (SHS). Methods: Our analysis is based on the European Union Statistics on Income and Living Conditions (EU-SILC) survey, for a balanced sample from 26 countries from 2009 to 2012. We estimate a correlated random-effects logit model for the SHS that accounts for the ordered nature of the dependent variable and the longitudinal structure of the data. Results and Discussion: Material deprivation and economic strain are able to partly account for the negative effects on SHS from precarious and part-time employment as well as from unemployment that, however, exhibits a significant independent negative association with SHS. Conclusions: Some of the indicators used to proxy economic deprivation are significant predictors of SHS and their correlation with the employment condition is such that it should not be neglected in empirical analysis, when available and further to the monetary income. PMID:28165375

  6. Education and Employment Patterns of Bioscientists. A Statistical Report.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    This report contains a compilation of manpower statistics describing the education and employment of bioscientists. The tables also include data from other major disciplines to allow for comparisons with other scientists and nonscientists. Bioscientists include those with degrees in anatomy, biochemistry, biophysics, genetics, microbiology,…

  7. OCCUPATIONS IN COLORADO. PART I, OUTLOOK BY INDUSTRIES.

    ERIC Educational Resources Information Center

    1966

    CURRENT AND PROJECTED EMPLOYMENT STATISTICS ARE GIVEN FOR THE STATE AND FOR THE DENVER STANDARD METROPOLITAN STATISTICAL AREA WHICH INCLUDES ADAMS, ARAPAHOE, BOULDER, DENVER, AND JEFFERSON COUNTIES. DATA WERE OBTAINED FROM THE COLORADO DEPARTMENT OF EMPLOYMENT, DENVER RESEARCH INSTITUTE, U.S. CENSUS, UNIVERSITY OF COLORADO, MOUNTAIN STATES…

  8. Spatial Autocorrelation Approaches to Testing Residuals from Least Squares Regression.

    PubMed

    Chen, Yanguang

    2016-01-01

    In geo-statistics, the Durbin-Watson test is frequently employed to detect the presence of residual serial correlation from least squares regression analyses. However, the Durbin-Watson statistic is only suitable for ordered time or spatial series. If the variables comprise cross-sectional data coming from spatial random sampling, the test will be ineffectual because the value of Durbin-Watson's statistic depends on the sequence of data points. This paper develops two new statistics for testing serial correlation of residuals from least squares regression based on spatial samples. By analogy with the new form of Moran's index, an autocorrelation coefficient is defined with a standardized residual vector and a normalized spatial weight matrix. Then by analogy with the Durbin-Watson statistic, two types of new serial correlation indices are constructed. As a case study, the two newly presented statistics are applied to a spatial sample of 29 China's regions. These results show that the new spatial autocorrelation models can be used to test the serial correlation of residuals from regression analysis. In practice, the new statistics can make up for the deficiencies of the Durbin-Watson test.
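
    The construction described above, an autocorrelation coefficient built from standardized regression residuals and a normalized spatial weight matrix, can be sketched as follows; the weight matrix and regression data are synthetic, and this is a generic Moran-style statistic rather than the paper's exact indices.

        import numpy as np

        rng = np.random.default_rng(10)

        # Synthetic cross-sectional sample: 29 regions, one predictor, OLS residuals.
        n = 29
        coords = rng.uniform(0, 10, (n, 2))
        x = rng.normal(0, 1, n)
        y = 2.0 + 1.5 * x + rng.normal(0, 1, n)

        X = np.column_stack([np.ones(n), x])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        residuals = y - X @ beta

        # Row-normalized inverse-distance spatial weight matrix (zero diagonal).
        dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        W = np.where(dist > 0, 1.0 / dist, 0.0)
        W /= W.sum(axis=1, keepdims=True)

        # Moran-style autocorrelation of the standardized residuals.
        e = (residuals - residuals.mean()) / residuals.std(ddof=1)
        I = (e @ W @ e) / (e @ e)
        print(f"spatial autocorrelation of residuals: I = {I:.3f}")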

  9. Analysis of covariance as a remedy for demographic mismatch of research subject groups: some sobering simulations.

    PubMed

    Adams, K M; Brown, G G; Grant, I

    1985-08-01

    Analysis of Covariance (ANCOVA) is often used in neuropsychological studies to effect ex-post-facto adjustment of performance variables amongst groups of subjects mismatched on some relevant demographic variable. This paper reviews some of the statistical assumptions underlying this usage. In an attempt to illustrate the complexities of this statistical technique, three sham studies using actual patient data are presented. These staged simulations have varying relationships between group test performance differences and levels of covariate discrepancy. The results were robust and consistent in their nature, and were held to support the wisdom of previous cautions by statisticians concerning the employment of ANCOVA to justify comparisons between incomparable groups. ANCOVA should not be used in neuropsychological research to equate groups unequal on variables such as age and education or to exert statistical control whose objective is to eliminate consideration of the covariate as an explanation for results. Finally, the report advocates by example the use of simulation to further our understanding of neuropsychological variables.

  10. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin

    We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.

  11. Grid indentation analysis of mechanical properties of composite electrodes in Li-ion batteries

    DOE PAGES

    Vasconcelos, Luize Scalco de; Xu, Rong; Li, Jianlin; ...

    2016-03-09

    We report that electrodes in commercial rechargeable batteries are microscopically heterogeneous materials. The constituent components, including active materials, polymeric binders, and porous conductive matrix, often have large variation in their mechanical properties, making the mechanical characterization of composite electrodes a challenging task. In a model system of LiNi0.5Mn0.3Co0.2O2 cathode, we employ the instrumented grid indentation to determine the elastic modulus and hardness of the constituent phases. The approach relies on a large array of nanoindentation experiments and statistical analysis of the resulting data provided that the maximum indentation depth is carefully chosen. The statistically extracted properties of the active particles and the surrounding medium are in good agreement with the tests of targeted indentation at selected sites. Lastly, the combinatory technique of grid indentation and statistical deconvolution represents a fast and reliable route to quantify the mechanical properties of composite electrodes that feed the parametric input for the mechanics models.
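
    The statistical deconvolution named in the two records above is commonly performed by fitting a mixture of Gaussians to the grid-indentation moduli and assigning each indent to a phase; the sketch below uses simulated moduli and scikit-learn's GaussianMixture as a stand-in for the authors' procedure.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(11)

        # Simulated elastic moduli (GPa) from a grid of nanoindents landing either on
        # active particles or on the softer binder/conductive matrix (invented values).
        moduli = np.concatenate([
            rng.normal(140.0, 15.0, 220),   # active-material phase
            rng.normal(8.0, 2.5, 180),      # binder / porous matrix phase
        ])[:, None]

        gmm = GaussianMixture(n_components=2, random_state=0).fit(moduli)
        means = gmm.means_.ravel()
        weights = gmm.weights_

        for mean, weight in sorted(zip(means, weights)):
            print(f"phase modulus ~ {mean:.1f} GPa, indent fraction ~ {weight:.2f}")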

  12. Polypropylene Production Optimization in Fluidized Bed Catalytic Reactor (FBCR): Statistical Modeling and Pilot Scale Experimental Validation

    PubMed Central

    Khan, Mohammad Jakir Hossain; Hussain, Mohd Azlan; Mujtaba, Iqbal Mohammed

    2014-01-01

    Polypropylene is one type of plastic that is widely used in everyday life. This study focuses on the identification and justification of the optimum process parameters for polypropylene production in a novel pilot-plant-based fluidized bed reactor. This first-of-its-kind statistical modeling with experimental validation for the process parameters of polypropylene production was conducted by applying the ANOVA (analysis of variance) method to Response Surface Methodology (RSM). Three important process variables, i.e., reaction temperature, system pressure and hydrogen percentage, were considered as the important input factors for the polypropylene production in the analysis performed. In order to examine the effect of process parameters and their interactions, the ANOVA method was utilized among a range of other statistical diagnostic tools such as the correlation between actual and predicted values, the residuals and predicted response, outlier t plot, 3D response surface and contour analysis plots. The statistical analysis showed that the proposed quadratic model had a good fit with the experimental results. At optimum conditions with temperature of 75°C, system pressure of 25 bar and hydrogen percentage of 2%, the highest polypropylene production obtained is 5.82% per pass. Hence it is concluded that the developed experimental design and proposed model can be successfully employed with over a 95% confidence level for optimum polypropylene production in a fluidized bed catalytic reactor (FBCR). PMID:28788576
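
    The quadratic response-surface model referred to above can be illustrated by fitting a full second-order polynomial in the three factors by least squares; the runs below are simulated around the reported optimum and are not the pilot-plant measurements.

        import numpy as np

        rng = np.random.default_rng(12)

        # Simulated runs: temperature (C), pressure (bar), hydrogen (%) -> yield (% per pass).
        T = rng.uniform(65, 85, 30)
        P = rng.uniform(20, 30, 30)
        H = rng.uniform(1, 3, 30)
        yield_pct = (5.8 - 0.004 * (T - 75) ** 2 - 0.01 * (P - 25) ** 2
                     - 0.3 * (H - 2) ** 2 + rng.normal(0, 0.05, 30))

        # Full second-order design matrix: intercept, linear, interaction and square terms.
        X = np.column_stack([
            np.ones_like(T), T, P, H,
            T * P, T * H, P * H,
            T**2, P**2, H**2,
        ])
        coef, *_ = np.linalg.lstsq(X, yield_pct, rcond=None)

        predicted = X @ coef
        ss_res = np.sum((yield_pct - predicted) ** 2)
        ss_tot = np.sum((yield_pct - yield_pct.mean()) ** 2)
        print(f"R^2 of the fitted quadratic response surface: {1 - ss_res / ss_tot:.3f}")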

  13. Collaborative Employee Wellness: Living Healthy With Diabetes.

    PubMed

    Hovatter, Joan McGarvev; Cooke, Catherine E; de Bittner, Magaly Rodriguez

    Innovative approaches to managing an employee population with a high prevalence of type 2 diabetes mellitus can mitigate costs for employers by improving employees' health. This article describes such an approach at McCormick & Company, Inc., where participants had statistically significant improvements in weight, average plasma glucose concentration (also called glycated hemoglobin or A1c) and cholesterol. A simulation analysis applying the findings of the study population to Maryland employees with a baseline A1c of greater than 6.0% showed that participation in the program could improve glycemic control in these patients, reducing the A1c by 0.24% on average, with associated cost savings for the employer.

  14. Round-off errors in cutting plane algorithms based on the revised simplex procedure

    NASA Technical Reports Server (NTRS)

    Moore, J. E.

    1973-01-01

    This report statistically analyzes computational round-off errors associated with the cutting plane approach to solving linear integer programming problems. Cutting plane methods require that the inverse of a sequence of matrices be computed. The problem basically reduces to one of minimizing round-off errors in the sequence of inverses. Two procedures for minimizing this problem are presented, and their influence on error accumulation is statistically analyzed. One procedure employs a very small tolerance factor to round computed values to zero. The other procedure is a numerical analysis technique for reinverting or improving the approximate inverse of a matrix. The results indicated that round-off accumulation can be effectively minimized by employing a tolerance factor which reflects the number of significant digits carried for each calculation and by applying the reinversion procedure once to each computed inverse. If 18 significant digits plus an exponent are carried for each variable during computations, then a tolerance value of 0.1 × 10^-12 is reasonable.

  15. Employment and Earnings. Volume 35, Number 3, March 1988.

    ERIC Educational Resources Information Center

    Employment and Earnings, 1988

    1988-01-01

    This document presents the following monthly statistical data for the population of United States: (1) employment status; (2) characteristics of the unemployed; (3) characteristics of the employed and their job categories; (4) seasonally adjusted employment and unemployment; (5) national employment; (6) employment in states and areas; (7) national…

  16. Spatio-temporal variability of droughts and terrestrial water storage over Lake Chad Basin using independent component analysis

    NASA Astrophysics Data System (ADS)

    Ndehedehe, Christopher E.; Agutu, Nathan O.; Okwuashi, Onuwa; Ferreira, Vagner G.

    2016-09-01

    Lake Chad has recently been perceived to be completely desiccated and almost extinct due to insufficient published ground observations. Given the high spatial variability of rainfall in the region, and the fact that extreme climatic conditions (for example, droughts) could be intensifying in the Lake Chad basin (LCB) due to human activities, a spatio-temporal approach to drought analysis becomes essential. This study employed independent component analysis (ICA), a statistical technique based on fourth-order cumulants, to decompose the standardised precipitation index (SPI), standardised soil moisture index (SSI), and terrestrial water storage (TWS) derived from the Gravity Recovery and Climate Experiment (GRACE) into spatial and temporal patterns over the LCB. In addition, this study used satellite altimetry data to estimate variations in the Lake Chad water levels, and further employed relevant climate teleconnection indices (El-Niño Southern Oscillation-ENSO, Atlantic Multi-decadal Oscillation-AMO, and Atlantic Meridional Mode-AMM) to examine their links to the observed drought temporal patterns over the basin. From the spatio-temporal drought analysis, temporal evolutions of SPI at 12 month aggregation show relatively wet conditions in the last two decades (although with marked alterations), with the 2012-2014 period being the wettest. In addition to the improved rainfall conditions during this period, there was a statistically significant increase of 0.04 m/yr in altimetry water levels observed over Lake Chad between 2008 and 2014, which confirms a shift in the hydrological conditions of the basin. The observed trend in TWS changes during the 2002-2014 period shows a statistically insignificant increase of 3.0 mm/yr at the centre of the basin, coinciding with soil moisture deficit indicated by the temporal evolutions of SSI at all monthly accumulations during the 2002-2003 and 2009-2012 periods. Further, SPI at 3 and 6 month scales indicated fluctuating drought conditions at the extreme south of the basin, coinciding with a statistically insignificant decline in TWS of about 4.5 mm/yr at the southern catchment of the basin. Finally, correlation analyses indicate that ENSO, AMO, and AMM are associated with extreme rainfall conditions in the basin, with AMO showing the strongest association (statistically significant correlation of 0.55) with SPI 12 month aggregation. Therefore, this study provides a framework that will support drought monitoring in the LCB.
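
    A minimal sketch of the decomposition step, assuming scikit-learn's FastICA as the ICA implementation and a random array in place of the real gridded SPI/SSI/TWS anomalies: each independent component pairs a temporal mode with a spatial loading pattern.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Hypothetical space-time matrix: rows are monthly epochs, columns are grid cells
# holding a drought index (e.g. SPI-12) or GRACE-derived TWS anomalies over the basin.
rng = np.random.default_rng(42)
n_months, n_cells = 156, 400
X = rng.standard_normal((n_months, n_cells))   # replace with real SPI/SSI/TWS anomalies

# ICA exploits higher-order (fourth-order cumulant) statistics to separate
# statistically independent temporal modes and their associated spatial patterns.
ica = FastICA(n_components=4, random_state=0)
temporal_modes = ica.fit_transform(X)   # shape (n_months, n_components)
spatial_patterns = ica.mixing_          # shape (n_cells, n_components)
print(temporal_modes.shape, spatial_patterns.shape)
```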

  17. Environmental Impact Analysis Process. Environmental Impact Statement for Realignment of Beale Air Force Base

    DTIC Science & Technology

    1990-04-01

    Excerpt from the report's table of contents and list of tables: Average Daily Air Emissions Inventory for Yuba County (Appendix A); Civilian Wage and Salary Employment, Yuba City Metropolitan Statistical Area, 1987 (Yuba and Sutter Counties, CA); Beale AFB Students Enrolled in Yuba and Sutter County Public Schools, FY 1989-90, by Assistance Category and School Capacity.

  18. The Hired Farm Working Force of 1974. A Statistical Report. Agricultural Economic Report No. 297.

    ERIC Educational Resources Information Center

    Rowe, Gene A.

    Information is given on the number, characteristics, employment, and earnings of persons 14 years of age and over who performed hired farm wagework at any time during 1974. The brief analysis highlights some of the most pertinent changes and trends in the size and composition of the hired farm working force. Data were obtained through a survey…

  19. The Economic Aspects of Urbanization, Economic Considerations in Community Action. Kansas State University Short Course Series in Planning and Development, 2.

    ERIC Educational Resources Information Center

    McGraw, Eugene T.

    Statistical data and projections on population, employment, and income in Kansas, as reported in 1966 by the Kansas Office of Economic Analysis, underline the fact that Kansas is changing from a largely agricultural economy to a manufacturing-centered, urban-oriented economy. However, the anticipated pattern of economic growth and development is…

  20. Three Empirical Strategies for Teaching Statistics

    ERIC Educational Resources Information Center

    Marson, Stephen M.

    2007-01-01

    This paper employs a three-step process to analyze three empirically supported strategies for teaching statistics to BSW students. The strategies included: repetition, immediate feedback, and use of original data. First, each strategy is addressed through the literature. Second, the application of employing each of the strategies over the period…

  1. Electrospinning of polyaniline/poly(lactic acid) ultrathin fibers: process and statistical modeling using a non-Gaussian approach

    USDA-ARS?s Scientific Manuscript database

    Cover: The electrospinning technique was employed to obtain conducting nanofibers based on polyaniline and poly(lactic acid). A statistical model was employed to describe how the process factors (solution concentration, applied voltage, and flow rate) govern the fiber dimensions. Nanofibers down to ...

  2. Primary prevention of dental erosion by calcium and fluoride: a systematic review.

    PubMed

    Zini, A; Krivoroutski, Y; Vered, Y

    2014-02-01

    Overviews of the current literature only provide summaries of existing preventive strategies for dental erosion. The objective was to perform a systematic review, following the quantitative meta-analysis method, of the scientific literature on prevention of dental erosion; the focused question addressed primary prevention of dental erosion by calcium and fluoride, and eligible studies were randomized clinical trials (RCTs) on dental erosion prevention. The search included five databases: Embase, the Cochrane Database of Systematic Reviews, PubMed (MEDLINE), FDA publications and the Berman Medical Library of the Hebrew University. The search was limited to data in the English language, with the effect on preventing dental erosion always presented as mean enamel loss measured by profilometer. Statistical meta-analysis was performed with the StatsDirect program and PEPI statistical software. Fixed- and random-effect models were used to analyse the data, and heterogeneity tests were employed to validate the fixed-effect model assumption. A total of 475 articles on dental erosion prevention were located. A four-stage selection process was employed, and 10 RCT articles were found to be suitable for meta-analysis. The number of studies on prevention of dental erosion maintaining standards of evidence-based dentistry remains insufficient to reach any definite conclusions, and the focused question of this review cannot be addressed from the existing literature. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
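
    The pooling itself can be reproduced with a few lines of inverse-variance arithmetic. The sketch below computes fixed-effect and DerSimonian-Laird random-effects estimates together with Cochran's Q; the study effects and standard errors are hypothetical, not the review's data.

```python
import numpy as np

# Hypothetical per-study effects (mean enamel loss, treatment minus control)
# with their standard errors; the review pooled such effects in StatsDirect/PEPI.
effects = np.array([-0.42, -0.31, -0.55, -0.18, -0.40])
se = np.array([0.15, 0.12, 0.20, 0.10, 0.18])

w = 1.0 / se**2                                   # inverse-variance weights
fixed = np.sum(w * effects) / np.sum(w)           # fixed-effect pooled estimate
q = np.sum(w * (effects - fixed) ** 2)            # Cochran's Q (heterogeneity test)
df = len(effects) - 1
tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DerSimonian-Laird
w_re = 1.0 / (se**2 + tau2)                       # random-effects weights
random_effect = np.sum(w_re * effects) / np.sum(w_re)

print(fixed, random_effect, q)   # if Q is small relative to df, the fixed-effect model is defensible
```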

  3. Open Source Tools for Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Powers, P.

    2010-12-01

    The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori law and the Gutenberg-Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The JavaScript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
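
    As an example of the kind of catalog statistic such libraries expose, the sketch below (in Python rather than the Java/JavaScript of the libraries themselves) computes the Gutenberg-Richter b-value with the Aki maximum-likelihood estimator on a synthetic catalog.

```python
import numpy as np

def b_value_mle(magnitudes, m_c):
    """Aki maximum-likelihood estimate of the Gutenberg-Richter b-value for
    events at or above the completeness magnitude m_c (no binning correction)."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# Synthetic catalog drawn with a true b-value of 1.0; a real analysis would use
# magnitudes from an earthquake catalog service instead.
rng = np.random.default_rng(1)
mags = 2.5 + rng.exponential(scale=1.0 / np.log(10), size=5000)
print(round(b_value_mle(mags, m_c=2.5), 2))   # should recover a value near 1.0
```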

  4. Training in metabolomics research. II. Processing and statistical analysis of metabolomics data, metabolite identification, pathway analysis, applications of metabolomics and its future.

    PubMed

    Barnes, Stephen; Benton, H Paul; Casazza, Krista; Cooper, Sara J; Cui, Xiangqin; Du, Xiuxia; Engler, Jeffrey; Kabarowski, Janusz H; Li, Shuzhao; Pathmasiri, Wimal; Prasain, Jeevan K; Renfrow, Matthew B; Tiwari, Hemant K

    2016-08-01

    Metabolomics, a systems biology discipline representing analysis of known and unknown pathways of metabolism, has grown tremendously over the past 20 years. Because of its comprehensive nature, metabolomics requires careful consideration of the question(s) being asked, the scale needed to answer the question(s), collection and storage of the sample specimens, methods for extraction of the metabolites from biological matrices, the analytical method(s) to be employed and the quality control of the analyses, how collected data are correlated, the statistical methods to determine metabolites undergoing significant change, putative identification of metabolites and the use of stable isotopes to aid in verifying metabolite identity and establishing pathway connections and fluxes. This second part of a comprehensive description of the methods of metabolomics focuses on data analysis, emerging methods in metabolomics and the future of this discipline. Copyright © 2016 John Wiley & Sons, Ltd.

  5. Point-by-point compositional analysis for atom probe tomography.

    PubMed

    Stephenson, Leigh T; Ceguerra, Anna V; Li, Tong; Rojhirunsakool, Tanaporn; Nag, Soumya; Banerjee, Rajarshi; Cairney, Julie M; Ringer, Simon P

    2014-01-01

    This new alternate approach to data processing for analyses that traditionally employed grid-based counting methods is necessary because it removes a user-imposed coordinate system that not only limits an analysis but also may introduce errors. We have modified the widely used "binomial" analysis for APT data by replacing grid-based counting with coordinate-independent nearest neighbour identification, improving the measurements and the statistics obtained, allowing quantitative analysis of smaller datasets, and datasets from non-dilute solid solutions. It also allows better visualisation of compositional fluctuations in the data. Our modifications include: (i) using spherical k-atom blocks identified by each detected atom's first k nearest neighbours; (ii) 3D data visualisation of block composition and nearest neighbour anisotropy; and (iii) using z-statistics to directly compare experimental and expected composition curves. Similar modifications may be made to other grid-based counting analyses (contingency table, Langer-Bar-on-Miller, sinusoidal model) and could be instrumental in developing novel data visualisation options.
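
    A simplified sketch of the coordinate-independent idea, assuming a synthetic random solid solution rather than real APT data: build k-atom nearest-neighbour blocks, count solute atoms per block, and compare the count distribution with the binomial expectation via z-scores. The block size, composition, and z-score construction here are illustrative, not the paper's exact procedure.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.stats import binom

# Hypothetical APT-like point cloud: atom positions (nm) plus a solute/matrix label.
rng = np.random.default_rng(7)
pos = rng.uniform(0.0, 20.0, size=(20000, 3))
is_solute = rng.random(20000) < 0.05          # 5 at.% random solid solution

k = 50                                        # atoms per nearest-neighbour block
tree = cKDTree(pos)
_, idx = tree.query(pos, k=k)                 # each atom's k nearest neighbours (incl. itself)
block_counts = is_solute[idx].sum(axis=1)     # solute atoms per spherical k-atom block

# Compare the observed count distribution with the binomial expectation for a
# random solution; large deviations in a bin flag compositional fluctuations.
p0 = is_solute.mean()
observed = np.bincount(block_counts, minlength=k + 1)
expected = len(pos) * binom.pmf(np.arange(k + 1), k, p0)
z = (observed - expected) / np.sqrt(np.maximum(expected, 1e-9))
print(z[:10])   # per-bin deviations (note: blocks overlap, so these z's are correlated)
```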

  6. The effect of telehealth systems and satisfaction with health expenditure among patients with metabolic syndrome.

    PubMed

    Uei, Shu-Lin; Tsai, Chung-Hung; Kuo, Yu-Ming

    2016-04-29

    Telehealth cost analysis has become a crucial issue for governments in recent years. In this study, we examined cases of metabolic syndrome in Hualien County, Taiwan. This research adopted the framework proposed by Marchand to establish a study process. In addition, descriptive statistics, a t test, analysis of variance, and regression analysis were employed to analyze 100 questionnaires. The results of the t test revealed significant differences in medical health expenditure, number of clinical visits for medical treatment, average amount of time spent commuting to clinics, amount of time spent undergoing medical treatment, and average number of people accompanying patients to medical care facilities or assisting with other tasks in the past month, indicating that offering telehealth care services can reduce health expenditure. The statistical analysis results revealed that customer satisfaction has a positive effect on reducing health expenditure. Therefore, this study proves that telehealth care systems can effectively reduce health expenditure and directly improve customer satisfaction with medical treatment.

  7. GWAR: robust analysis and meta-analysis of genome-wide association studies.

    PubMed

    Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G

    2017-05-15

    In the context of genome-wide association studies (GWAS), a variety of statistical techniques can be used to conduct the analysis, but in most cases the underlying genetic model is unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has never been addressed in the past. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under a recessive, additive and dominant model of inheritance, as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2, were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration, resulting in a great gain in computational time without losing accuracy. All the aforementioned approaches were employed in a fixed or a random effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
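
    The classical CATT mentioned above is compact enough to show directly. The sketch below computes the trend statistic and a two-sided p-value for one SNP under a chosen genetic model; the genotype counts are hypothetical, and this is not the GWAR/Stata implementation.

```python
import numpy as np
from scipy.stats import norm

def catt(cases, controls, scores):
    """Cochran-Armitage trend test for a 2 x k genotype table.

    cases/controls: counts per genotype (e.g. AA, Aa, aa); scores encode the
    genetic model: additive (0, 1, 2), dominant (0, 1, 1), recessive (0, 0, 1).
    """
    r, s, t = map(np.asarray, (cases, controls, scores))
    n = r + s
    R, S, N = r.sum(), s.sum(), n.sum()
    T = np.sum(t * (r * S - s * R))
    var_T = R * S * (np.sum(t**2 * n) - np.sum(t * n) ** 2 / N)
    z = T / np.sqrt(var_T)
    return z, 2 * norm.sf(abs(z))

# Hypothetical genotype counts for a single SNP, tested under the additive model.
z, p = catt(cases=[30, 55, 15], controls=[50, 40, 10], scores=[0, 1, 2])
print(z, p)
```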

  8. Psychiatric versus physical disabilities: A comparison of barriers and facilitators to employment.

    PubMed

    Sevak, Purvi; Khan, Shamima

    2017-06-01

    Guided by the social model of disability (Nagi, 1965), this study aims to better identify barriers to and facilitators of employment for individuals with psychiatric disabilities and how these factors may differ for individuals with physical disabilities. Our analysis uses data from the Survey of Disability and Employment on 2,148 individuals with psychiatric disabilities, physical disabilities, or both who in 2014 applied for services from 1 of 3 state vocational rehabilitation (VR) agencies. We identify type of disability based on respondents' open-ended descriptions of their impairments. We use univariate statistics and multivariate regression estimates to compare employment history, and potential barriers to and facilitators of employment between individuals with psychiatric and physical disabilities. VR applicants with psychiatric disabilities have had longer periods of nonemployment than individuals with physical disabilities alone. They are significantly more likely than individuals with physical disabilities alone to report nonhealth reasons, such as getting fired and lacking skills, as barriers to employment. We found that a number of accommodations, including flexible schedules and modified work duties, are significantly associated with continued employment. VR counselors should be aware that although most applicants with psychiatric disabilities place a great deal of importance on being employed, they face additional barriers to employment. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. The impact of the 2007-2009 recession on workers' health coverage.

    PubMed

    Fronstin, Paul

    2011-04-01

    IMPACT OF THE RECESSION: The 2007-2009 recession has taken its toll on the percentage of the population with employment-based health coverage. While, since 2000, there has been a slow erosion in the percentage of individuals under age 65 with employment-based health coverage, 2009 was the first year in which the percentage fell below 60 percent, and marked the largest one-year decline in coverage. FEWER WORKERS WITH COVERAGE: The percentage of workers with coverage through their own job fell from 53.2 percent in 2008 to 52 percent in 2009, a 2.4 percent decline in the likelihood that a worker has coverage through his or her own job. The percentage of workers with coverage as a dependent fell from 17 percent in 2008 to 16.3 percent in 2009, a 4.5 percent drop in the likelihood that a worker has coverage as a dependent. These declines occurred as the unemployment rate increased from an average of 5.8 percent in 2008 to 9.3 percent in 2009 (and reached a high of 10.1 percent during 2009). FIRM SIZE/INDUSTRY: The decline in the percentage of workers with coverage from their own job affected workers in private-sector firms of all sizes. Among public-sector workers, the decline from 73.4 percent to 73 percent was not statistically significant. Workers in all private-sector industries experienced a statistically significant decline in coverage between 2008 and 2009. HOURS WORKED: Full-time workers experienced a decline in coverage that was statistically significant while part-time workers did not. Among full-time workers, those employed full year experienced a statistically significant decline in coverage from their own job. Those employed full time but for only part of the year did not experience a statistically significant change in coverage. Among part-time workers, those employed full year experienced a statistically significant increase in the likelihood of having coverage in their own name, as did part-time workers employed for only part of the year. ANNUAL EARNINGS: The decline in the percentage of workers with coverage through their own job was limited to workers with lower annual earnings. Statistically significant declines were not found among any group of workers with annual earnings of at least $40,000. Workers with a high school education or less experienced a statistically significant decline in the likelihood of having coverage. Neither workers with a college degree nor those with a graduate degree experienced a statistically significant decline in coverage through their own job. Workers of all races experienced statistically significant declines in coverage between 2008 and 2009. Both men and women experienced a statistically significant decline in the percentage with health coverage through their own job. IMPACT OF STRUCTURAL CHANGES TO THE WORK FORCE: The movement of workers from the manufacturing industry to the service sector continued between 2008 and 2009. The percentage of workers employed on a full-time basis decreased while the percentage working part time increased. While there was an overall decline in the percentage of full-time workers, that decline was limited to workers employed full year. The percentage of workers employed on a full-time, part-year basis increased between 2008 and 2009. The distribution of workers by annual earnings shifted from middle-income workers to lower-income workers between 2008 and 2009.

  10. Markov chains and semi-Markov models in time-to-event analysis.

    PubMed

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.

  11. Markov chains and semi-Markov models in time-to-event analysis

    PubMed Central

    Abner, Erin L.; Charnigo, Richard J.; Kryscio, Richard J.

    2014-01-01

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields. PMID:24818062
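
    A minimal sketch of the discrete-time case described in these two records: a three-state chain with an absorbing death state propagated forward to give state-occupancy and cumulative-incidence curves. The transition probabilities are hypothetical.

```python
import numpy as np

# Hypothetical one-cycle transition probabilities between three states:
# 0 = event-free, 1 = non-fatal event (competing outcome), 2 = death (absorbing).
P = np.array([
    [0.85, 0.10, 0.05],
    [0.00, 0.80, 0.20],
    [0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0])     # everyone starts event-free
occupancy = [state]
for _ in range(10):                   # propagate the chain over 10 cycles
    state = state @ P
    occupancy.append(state)

occupancy = np.array(occupancy)
print(occupancy[:, 2])                # cumulative probability of death by cycle
print(1 - occupancy[:, 0])            # probability of having left the event-free state
```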

  12. Statistical mechanics of broadcast channels using low-density parity-check codes.

    PubMed

    Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David

    2003-03-01

    We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.

  13. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    NASA Technical Reports Server (NTRS)

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

    This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel from NATO Science and Technology Organization, with the Task Group number AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and to represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework, and implemented reduced order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.

  14. Flexible employment and nurses' intention to leave the profession: The role of support at work.

    PubMed

    Zeytinoglu, Isik U; Denton, Margaret; Plenderleith, Jennifer Millen

    2011-02-01

    The objectives of this paper are to examine (1) the association between flexible employment and nurses' intention to leave the profession, and (2) whether or not support at work mediates the association between flexible employment and nurses' intention to leave the profession. Flexible employment is analyzed objectively using non-permanent contract, part-time employment status, casual employment status, involuntary hours and on-call work, and subjectively using job insecurity. Support at work refers to organizational, supervisor and peer support. Data come from our survey of 1396 nurses employed in three teaching hospitals in Southern Ontario. Descriptive statistics are provided. Bivariate correlations, hierarchical regression analysis and mediation tests are conducted. Compared to those in full-time employment, nurses in part-time employment do not intend to leave the profession. None of the other objective flexible employment factors are associated with intention to leave the profession. Perceived job insecurity is associated with intention to leave the profession. Low support at work contributes to intention to leave the profession and mediates the association between job insecurity and intention to leave the profession. The study provides evidence to health sector managers and policy makers that part-time employment, perceived job security and support at work are important factors to consider in efforts to retain nurses in the profession. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  15. Regional frequency analysis of extreme rainfalls using partial L moments method

    NASA Astrophysics Data System (ADS)

    Zakaria, Zahrahtul Amani; Shabri, Ani

    2013-07-01

    An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the method of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments would outperform the L and LH moments methods for estimation of large return period events.
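
    For reference, ordinary sample L-moments (the basis that PL moments extend by censoring the smallest observations) can be computed from probability-weighted moments as in the sketch below; the annual-maximum series is simulated, not the Selangor data.

```python
import numpy as np

def sample_l_moments(x):
    """First two sample L-moments plus L-skewness and L-kurtosis ratios,
    computed via probability-weighted moments (Hosking's formulation)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3 / l2, l4 / l2   # mean, L-scale, t3, t4 (used in ratio diagrams)

# Hypothetical annual-maximum rainfall series for one site (mm).
rng = np.random.default_rng(3)
amax = rng.gumbel(loc=80, scale=25, size=40)
print(sample_l_moments(amax))   # compare t3, t4 against theoretical GEV/GLO/GPA curves
```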

  16. Summary Statistics of CPB-Qualified Public Radio Stations, Fiscal Year 1972.

    ERIC Educational Resources Information Center

    Lee, S. Young; Pedone, Ronald J.

    Statistics in the areas of finance, employment, and broadcast and production for CPB-qualified (Corporation for Public Broadcasting) public radio stations are given in this report. Tables in the area of finance are presented specifying total funds, income, direct operating costs, and capital expenditure. Employment is divided into all employment…

  17. A Study of Arizona Labor Market Demand Data for Vocational Education Planning.

    ERIC Educational Resources Information Center

    Gould, Albert W.; Manning, Doris E.

    A study examined the project methodology used by the Bureau of Labor Statistics and the related projections made by the state employment security agencies. Findings from a literature review indicated that the system has steadily improved since 1979. Projections made from the Occupational Employment Statistics Surveys were remarkably accurate.…

  18. Conference Report on Youth Unemployment: Its Measurements and Meaning.

    ERIC Educational Resources Information Center

    Employment and Training Administration (DOL), Washington, DC.

    Thirteen papers presented at a conference on employment statistics and youth are contained in this report. Reviewed are the problems of gathering, interpreting, and applying employment and unemployment data relating to youth. The titles of the papers are as follow: "Counting Youth: A Comparison of Youth Labor Force Statistics in the Current…

  19. Accuracy Evaluation of the Unified P-Value from Combining Correlated P-Values

    PubMed Central

    Alves, Gelio; Yu, Yi-Kuo

    2014-01-01

    Meta-analysis methods that combine p-values into a single unified p-value are frequently employed to improve confidence in hypothesis testing. An assumption made by most meta-analysis methods is that the p-values to be combined are independent, which may not always be true. To investigate the accuracy of the unified p-value from combining correlated p-values, we have evaluated a family of statistical methods that combine: independent, weighted independent, correlated, and weighted correlated p-values. Statistical accuracy evaluation by combining simulated correlated p-values showed that correlation among p-values can have a significant effect on the accuracy of the combined p-value obtained. Among the statistical methods evaluated, those that weight p-values compute more accurate combined p-values than those that do not. Also, statistical methods that utilize the correlation information have the best performance, producing significantly more accurate combined p-values. In our study we have demonstrated that statistical methods that combine p-values based on the assumption of independence can produce inaccurate p-values when combining correlated p-values, even when the p-values are only weakly correlated. Therefore, to prevent drawing false conclusions during hypothesis testing, our study advises caution when interpreting the p-value obtained from combining p-values of unknown correlation. However, when the correlation information is available, the weighting-capable statistical method, first introduced by Brown and recently modified by Hou, seems to perform the best amongst the methods investigated. PMID:24663491
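
    A compact way to see the effect being evaluated is to compare Fisher's method, which assumes independence, with a Brown-style rescaling that accounts for correlation. In the sketch below the covariance of the -2·ln(p) terms is supplied directly as a hypothetical matrix rather than estimated from data, so it illustrates only the correlation adjustment, not the weighted methods evaluated in the paper.

```python
import numpy as np
from scipy.stats import chi2

def fisher_combine(pvals):
    """Fisher's method: assumes the p-values are independent."""
    x = -2.0 * np.sum(np.log(pvals))
    return chi2.sf(x, df=2 * len(pvals))

def brown_combine(pvals, cov):
    """Brown-style correction: rescale Fisher's statistic using the covariance
    matrix of the -2*ln(p) terms (supplied directly here, for illustration)."""
    k = len(pvals)
    x = -2.0 * np.sum(np.log(pvals))
    mean_x = 2.0 * k
    var_x = 4.0 * k + 2.0 * np.sum(cov[np.triu_indices(k, 1)])
    c = var_x / (2.0 * mean_x)          # scale factor
    df = 2.0 * mean_x**2 / var_x        # effective degrees of freedom
    return chi2.sf(x / c, df=df)

p = np.array([0.03, 0.04, 0.05, 0.02])
cov = 1.5 * (np.ones((4, 4)) - np.eye(4))     # hypothetical positive covariances
print(fisher_combine(p), brown_combine(p, cov))   # correlation weakens the combined evidence
```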

  20. Spotlight-8 Image Analysis Software

    NASA Technical Reports Server (NTRS)

    Klimek, Robert; Wright, Ted

    2006-01-01

    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.

  1. On intracluster Faraday rotation. II - Statistical analysis

    NASA Technical Reports Server (NTRS)

    Lawler, J. M.; Dennison, B.

    1982-01-01

    The comparison of a reliable sample of radio source Faraday rotation measurements seen through rich clusters of galaxies with sources seen through the outer parts of clusters, and therefore having little intracluster Faraday rotation, indicates that the distribution of rotation in the former population is broadened, but only at the 80% level of statistical confidence. Employing a physical model for the intracluster medium in which the ratio of magnetic field strength to the square root of the number of turbulent cells per gas core radius is approximately 0.07 microgauss, a Monte Carlo simulation is able to reproduce the observed broadening. An upper-limit analysis giving less than 0.20 microgauss for this field strength to turbulent cell number ratio, combined with lower limits on field strength imposed by limitations on the Compton-scattered flux, shows that intracluster magnetic fields must be tangled on scales greater than about 20 kpc.

  2. Computational tools for multi-linked flexible structures

    NASA Technical Reports Server (NTRS)

    Lee, Gordon K. F.; Brubaker, Thomas A.; Shults, James R.

    1990-01-01

    A software module which designs and tests controllers and filters in Kalman estimator form, based on a polynomial state-space model, is discussed. The user-friendly program employs an interactive graphics approach to simplify the design process. A variety of input methods are provided to test the effectiveness of the estimator. Utilities are provided which address important issues in filter design such as graphical analysis, statistical analysis, and calculation time. The program also provides the user with the ability to save filter parameters, inputs, and outputs for future use.
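
    A minimal predict/update loop of the kind such a module would generate is sketched below for a two-state constant-velocity model with a noisy position measurement; the matrices are hypothetical stand-ins, not a polynomial state-space model of a flexible structure.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition
H = np.array([[1.0, 0.0]])                # measurement matrix (position only)
Q = 1e-3 * np.eye(2)                      # process noise covariance
R = np.array([[0.05]])                    # measurement noise covariance

x = np.zeros((2, 1))                      # state estimate [position, velocity]
P = np.eye(2)                             # estimate covariance

rng = np.random.default_rng(0)
truth = 0.0
for _ in range(100):
    truth += 0.5 * dt                              # true position (constant velocity 0.5)
    z = np.array([[truth + rng.normal(0, 0.2)]])   # noisy measurement
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print(x.ravel())   # filtered position and velocity estimates
```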

  3. Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.

    PubMed

    Counsell, Alyssa; Harlow, Lisa L

    2017-05-01

    With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and less than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of and areas needing improvement for reporting quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.

  4. A Dynamic Intrusion Detection System Based on Multivariate Hotelling's T2 Statistics Approach for Network Environments

    PubMed Central

    Avalappampatty Sivasamy, Aneetha; Sundan, Bose

    2015-01-01

    The ever expanding communication requirements in today's world demand extensive and efficient network systems with equally efficient and reliable security features integrated for safe, confident, and secured communication and data transfer. Providing effective security protocols for any network environment, therefore, assumes paramount importance. Attempts are made continuously for designing more efficient and dynamic network intrusion detection models. In this work, an approach based on Hotelling's T2 method, a multivariate statistical analysis technique, has been employed for intrusion detection, especially in network environments. Components such as preprocessing, multivariate statistical analysis, and attack detection have been incorporated in developing the multivariate Hotelling's T2 statistical model and necessary profiles have been generated based on the T-square distance metrics. With a threshold range obtained using the central limit theorem, observed traffic profiles have been classified either as normal or attack types. Performance of the model, as evaluated through validation and testing using KDD Cup'99 dataset, has shown very high detection rates for all classes with low false alarm rates. Accuracy of the model presented in this work, in comparison with the existing models, has been found to be much better. PMID:26357668

  5. A Dynamic Intrusion Detection System Based on Multivariate Hotelling's T2 Statistics Approach for Network Environments.

    PubMed

    Sivasamy, Aneetha Avalappampatty; Sundan, Bose

    2015-01-01

    The ever expanding communication requirements in today's world demand extensive and efficient network systems with equally efficient and reliable security features integrated for safe, confident, and secured communication and data transfer. Providing effective security protocols for any network environment, therefore, assumes paramount importance. Attempts are made continuously for designing more efficient and dynamic network intrusion detection models. In this work, an approach based on Hotelling's T(2) method, a multivariate statistical analysis technique, has been employed for intrusion detection, especially in network environments. Components such as preprocessing, multivariate statistical analysis, and attack detection have been incorporated in developing the multivariate Hotelling's T(2) statistical model and necessary profiles have been generated based on the T-square distance metrics. With a threshold range obtained using the central limit theorem, observed traffic profiles have been classified either as normal or attack types. Performance of the model, as evaluated through validation and testing using KDD Cup'99 dataset, has shown very high detection rates for all classes with low false alarm rates. Accuracy of the model presented in this work, in comparison with the existing models, has been found to be much better.
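
    The core statistic behind both versions of this record is short to write down. The sketch below computes a one-sample Hotelling's T-squared value and its F-based p-value for a window of feature vectors against a baseline mean; the four-dimensional features are simulated, not KDD Cup'99 traffic, and the paper's threshold scheme is not reproduced.

```python
import numpy as np
from scipy.stats import f as f_dist

def hotelling_t2(sample, mu0):
    """One-sample Hotelling's T-squared statistic and F-based p-value."""
    X = np.asarray(sample, dtype=float)
    n, p = X.shape
    xbar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)
    d = xbar - np.asarray(mu0, dtype=float)
    t2 = n * d @ np.linalg.solve(S, d)
    f_stat = (n - p) / (p * (n - 1)) * t2
    return t2, f_dist.sf(f_stat, p, n - p)

# Hypothetical baseline ("normal traffic") mean and a window of observed feature vectors.
rng = np.random.default_rng(0)
baseline_mean = np.zeros(4)
window = rng.standard_normal((200, 4)) + np.array([0.0, 0.0, 0.5, 0.0])  # one drifted feature
t2, pval = hotelling_t2(window, baseline_mean)
print(t2, pval)   # a large T2 / small p-value flags the window as anomalous
```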

  6. Prevention of the Posttraumatic Fibrotic Response in Joints

    DTIC Science & Technology

    2015-10-01

    used on a regular basis. Major Task 4: Evaluating the efficacy of inhibitory chIgG to reduce the consequences of traumatic joint injury. During...the second year of study, we successfully employed all assays needed to evaluate the utility of the inhibitory antibody to reduce the flexion...1. Major Task 5: Task 4. Data analysis and statistical evaluation of results. All data from the mechanical measurements, from the biochemical

  7. Satellite temperature monitoring and prediction system

    NASA Technical Reports Server (NTRS)

    Barnett, U. R.; Martsolf, J. D.; Crosby, F. L.

    1980-01-01

    The paper describes the Florida Satellite Freeze Forecast System (SFFS) in its current state. All data collection options have been demonstrated, and data collected over a three year period have been stored for future analysis. Presently, specific minimum temperature forecasts are issued routinely from November through March. The procedures for issuing these forecasts are discussed. The automated data acquisition and processing system is described, and the physical and statistical models employed are examined.

  8. [Induced abortion and labor activity. Reflections for discussion].

    PubMed

    Orjuela-Ramírez, María E

    2012-06-01

    Induced abortion is a global phenomenon that, according to various authors, responds to socially constructed patterns of behavior shaped by the social realities of each country. Understanding it requires information on the complex process leading to a woman's decision to opt for abortion, as well as on the social, economic and health factors that may explain this decision. For this purpose, some considerations on voluntary abortion and the labor activity of women who opt for this practice are presented for discussion, with special mention of the situation in Spain. The arguments are supported by statistical analysis of the voluntary interruptions of pregnancy (IVE) reported by the Ministry of Health and Social Policy, data on women's participation in the labor market in Spain obtained from the National Statistics Institute (INE), research results on the association between women's employment status and voluntary termination of pregnancy, and a comprehensive review of the scientific literature on the different perspectives from which voluntary abortion has been approached. Particular attention is given to women's work activity as a possible factor in the decision to terminate a pregnancy, a variable that most investigations have identified as a socioeconomic condition of women who choose this alternative, considering that pregnancy may interfere with, or even prevent, women's employment.

  9. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    PubMed

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-04

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded style searches can only be made accessible if data analysis becomes not only unified but also and most importantly scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed using the Python scripting language using a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem.

  10. Logistic regression for risk factor modelling in stuttering research.

    PubMed

    Reed, Phil; Wu, Yaqiong

    2013-06-01

    The aim is to outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
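
    A minimal example of the kind of model the paper advocates, fitted with statsmodels on simulated data: a binary outcome (recovery vs persistence) regressed on two illustrative risk factors, with coefficients exponentiated to odds ratios. The variable names and effect sizes are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated dataset: recovery (1) vs persistence (0) with two illustrative
# risk factors (age at onset in years, family history of persistence).
rng = np.random.default_rng(5)
n = 200
age_onset = rng.uniform(2, 6, n)
family_hist = rng.integers(0, 2, n)
logit = 1.5 - 0.4 * age_onset - 0.9 * family_hist      # hypothetical true model
prob = 1 / (1 + np.exp(-logit))
recovered = rng.binomial(1, prob)

df = pd.DataFrame({"recovered": recovered, "age_onset": age_onset, "family_hist": family_hist})
model = smf.logit("recovered ~ age_onset + family_hist", data=df).fit(disp=False)
print(model.summary())
print(np.exp(model.params))   # odds ratios for each risk factor
```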

  11. Computer-Assisted Instruction in Statistics. Technical Report.

    ERIC Educational Resources Information Center

    Cooley, William W.

    A paper given at a conference on statistical computation discussed teaching statistics with computers. It concluded that computer-assisted instruction is most appropriately employed in the numerical demonstration of statistical concepts, and for statistical laboratory instruction. The student thus learns simultaneously about the use of computers…

  12. Dermatological and respiratory problems in migrant construction workers of Udupi, Karnataka.

    PubMed

    Banerjee, Mayuri; Kamath, Ramachandra; Tiwari, Rajnarayan R; Nair, Narayana Pillai Sreekumaran

    2015-01-01

    As a developing country, India has a tremendous demand for physical infrastructure and construction work, and as a result there is a rising demand for construction workers. Workers in the construction industry are mainly migratory and employed on a contract or subcontract basis. These workers face a temporary relationship between employer and employee, uncertainty in working hours, a contracting and subcontracting system, lack of continuous employment, lack of basic amenities, and inadequate welfare schemes. The objective was to estimate the prevalence of respiratory and dermatological symptoms among migratory construction workers. This cross-sectional study was conducted in Manipal, Karnataka, among 340 male migratory construction workers. A standard modified questionnaire was administered by the interviewer, and the physical examination of the workers was done by a physician. The statistical analysis was done using Statistical Package for the Social Sciences (SPSS) version 15.0. Eighty percent of the workers belonged to the age group of 18-30 years. The mean age of the workers was 26 ± 8.2 years. Most (43.8%) of the workers were from West Bengal, followed by those from Bihar and Jharkhand. The prevalence rates of respiratory and dermatological symptoms were 33.2% and 36.2%, respectively. The migrant construction workers suffer from a high proportion of respiratory and dermatological problems.

  13. A critical evaluation of ecological indices for the comparative analysis of microbial communities based on molecular datasets.

    PubMed

    Lucas, Rico; Groeneveld, Jürgen; Harms, Hauke; Johst, Karin; Frank, Karin; Kleinsteuber, Sabine

    2017-01-01

    In times of global change and intensified resource exploitation, advanced knowledge of ecophysiological processes in natural and engineered systems driven by complex microbial communities is crucial for both safeguarding environmental processes and optimising rational control of biotechnological processes. To gain such knowledge, high-throughput molecular techniques are routinely employed to investigate microbial community composition and dynamics within a wide range of natural or engineered environments. However, for molecular dataset analyses no consensus about a generally applicable alpha diversity concept and no appropriate benchmarking of corresponding statistical indices exist yet. To overcome this, we listed criteria for the appropriateness of an index for such analyses and systematically scrutinised commonly employed ecological indices describing diversity, evenness and richness based on artificial and real molecular datasets. We identified appropriate indices warranting interstudy comparability and intuitive interpretability. The unified diversity concept based on 'effective numbers of types' provides the mathematical framework for describing community composition. Additionally, the Bray-Curtis dissimilarity as a beta-diversity index was found to reflect compositional changes. The employed statistical procedure is presented comprising commented R-scripts and example datasets for user-friendly trial application. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
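
    The 'effective numbers of types' (Hill numbers) and the Bray-Curtis dissimilarity mentioned above reduce to a few lines of arithmetic; the sketch below applies them to hypothetical read-count profiles from two libraries.

```python
import numpy as np

def hill_number(counts, q):
    """Effective number of types (Hill number) of order q for one community."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    if q == 1:
        return np.exp(-np.sum(p * np.log(p)))      # exponential of Shannon entropy
    return np.sum(p**q) ** (1.0 / (1.0 - q))

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance profiles."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return np.abs(x - y).sum() / (x + y).sum()

# Hypothetical OTU/ASV read counts from two amplicon libraries.
a = np.array([120, 80, 40, 10, 5, 0, 0])
b = np.array([90, 60, 70, 30, 0, 15, 5])
print(hill_number(a, q=0), hill_number(a, q=1), hill_number(a, q=2))  # richness, exp(H'), inverse Simpson
print(bray_curtis(a, b))
```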

  14. Public health workforce employment in US public and private sectors.

    PubMed

    Kennedy, Virginia C

    2009-01-01

    The purpose of this study was to describe the number and distribution of 26 administrative, professional, and technical public health occupations across the array of US governmental and nongovernmental industries. This study used data from the Occupational Employment Statistics program of the US Bureau of Labor Statistics. For each occupation of interest, the investigator determined the number of persons employed in 2006 in five industries and industry groups: government, nonprofit agencies, education, healthcare, and all other industries. Industry-specific employment profiles varied from one occupation to another. However, about three-fourths of all those engaged in these occupations worked in the private healthcare industry. Relatively few worked in nonprofit or educational settings, and less than 10 percent were employed in government agencies. The industry-specific distribution of public health personnel, particularly the proportion employed in the public sector, merits close monitoring. This study also highlights the need for a better understanding of the work performed by public health occupations in nongovernmental work settings. Finally, the Occupational Employment Statistics program has the potential to serve as an ongoing, national data collection system for public health workforce information. If this potential was realized, future workforce enumerations would not require primary data collection but rather could be accomplished using secondary data.

  15. Austerity, precariousness, and the health status of Greek labour market participants: Retrospective cohort analysis of employed and unemployed persons in 2008-2009 and 2010-2011.

    PubMed

    Barlow, Pepita; Reeves, Aaron; McKee, Martin; Stuckler, David

    2015-11-01

    Greece implemented the deepest austerity package in Europe during the Great Recession (from 2008), including reductions in severance pay and redundancy notice periods. To evaluate whether these measures worsened labour market participants' health status, we compared changes in self-reported health using two cohorts of employed individuals in Greece from the European Union Statistics on Income and Living Conditions. During the initial recession (2008-2009) we found that self-reported health worsened both for those remaining in employment and those who lost jobs. Similarly, during the austerity programme (2010-2011) people who lost jobs experienced greater health declines. Importantly, individuals who remained employed in 2011 were also 25 per cent more likely to experience a health decline than in 2009. These harms appeared concentrated in people aged 45-54 who lost jobs. Our study moves beyond existing findings by demonstrating that austerity both exacerbates the negative health consequences of job loss and worsens the health of those still employed.

  16. Job sharing: a retention strategy for nurses.

    PubMed

    Kane, D

    1999-01-01

    Job sharing is a part-time employment alternative which offers advantages for employers interested in retaining experienced staff and nurses who are seeking a more equitable balance between work life and home life responsibilities. This quasi experimental, ex post facto research study was designed to determine if there are differences in job satisfaction, burnout, and desire to leave their position, in nurses who are employed in full-time, part-time, or job sharing positions. The sample (N = 269) was drawn from a large Canadian teaching and referral hospital. Three sample groups were developed, consisting of job sharing, full-time, and part-time nurses, respectively. Descriptive statistics were used to identify characteristics of the selected population. Analysis of variance was used to examine differences between the three employment groups on the outcome measures of job satisfaction, burnout, and desire to leave their position. The results of this study significantly support the belief that job sharing has a positive impact on job satisfaction and job retention. Implications for nursing administrators as well as individual nurses will be discussed.

  17. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design of failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
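
    A highly simplified sketch of the probabilistic ingredient of such an assessment: sample uncertain load, strength, and model-accuracy parameters, evaluate a limit state, and estimate the failure probability by Monte Carlo. The distributions and limit state are hypothetical and far simpler than the PFA fatigue models.

```python
import numpy as np

rng = np.random.default_rng(2024)
n = 200_000
load = rng.lognormal(mean=np.log(300.0), sigma=0.15, size=n)   # applied stress (MPa), hypothetical
strength = rng.normal(loc=450.0, scale=40.0, size=n)           # material strength (MPa), hypothetical
model_error = rng.normal(loc=1.0, scale=0.05, size=n)          # analysis-model accuracy factor

failure = load * model_error > strength                        # limit state: demand exceeds capacity
p_fail = failure.mean()
se = np.sqrt(p_fail * (1 - p_fail) / n)
print(f"estimated failure probability: {p_fail:.2e} +/- {1.96 * se:.1e}")
```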

  18. A large-scale perspective on stress-induced alterations in resting-state networks

    NASA Astrophysics Data System (ADS)

    Maron-Katz, Adi; Vaisvaser, Sharon; Lin, Tamar; Hendler, Talma; Shamir, Ron

    2016-02-01

    Stress is known to induce large-scale neural modulations. However, its neural effect once the stressor is removed and how it relates to subjective experience are not fully understood. Here we used a statistically sound data-driven approach to investigate alterations in large-scale resting-state functional connectivity (rsFC) induced by acute social stress. We compared rsfMRI profiles of 57 healthy male subjects before and after stress induction. Using a parcellation-based univariate statistical analysis, we identified a large-scale rsFC change involving 490 parcel-pairs. Aiming to characterize this change, we employed statistical enrichment analysis, identifying anatomic structures that were significantly interconnected by these pairs. This analysis revealed strengthening of thalamo-cortical connectivity and weakening of cross-hemispheral parieto-temporal connectivity. These alterations were further found to be associated with change in subjective stress reports. Integrating report-based information on stress sustainment 20 minutes post induction revealed a single significant rsFC change between the right amygdala and the precuneus, which inversely correlated with the level of subjective recovery. Our study demonstrates the value of enrichment analysis for exploring large-scale network reorganization patterns, and provides new insight on stress-induced neural modulations and their relation to subjective experience.
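
    Enrichment of significant parcel-pairs within a given pair of anatomical structures can be tested with a hypergeometric (over-representation) test, as sketched below with hypothetical counts; the paper's exact enrichment statistics may differ.

```python
from scipy.stats import hypergeom

# Hypothetical counts: of N_total parcel-pairs tested, K connect a given pair of
# anatomical structures; n_sig pairs showed a significant stress-related change,
# k of which fall within that structure pair. Is k larger than expected by chance?
N_total, K, n_sig, k = 4950, 120, 490, 25

# scipy's hypergeom takes (population size, successes in population, draws).
p_enriched = hypergeom.sf(k - 1, N_total, K, n_sig)   # P(X >= k) under random assignment
print(p_enriched)
```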

  19. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  20. Manpower Resources for Scientific Activities at Universities and Colleges, January 1976. Detailed Statistical Tables, Appendix B.

    ERIC Educational Resources Information Center

    Loycano, Robert J.

    The data presented in these tabulations are based on the 1976 National Science Foundation survey of scientific and engineering personnel employed at universities and colleges. The data are contained in 60 statistical tables organized under the following broad headings: trends; type of institution; field, employment status, control, educational…

  1. Opticians Employed in Health Services; United States--1969. Vital and Health Statistics, Series 14, No. 3.

    ERIC Educational Resources Information Center

    National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.

    First in a series of statistical reports on personnel providing vision and eye care assistance, the report presents data collected by the Bureau of Census (geographic location, age, sex, education, type and place of employment, training, specialties, activities, and time spent at work) concerning opticians actively engaged in their profession…

  2. Maternal Factors Predicting Cognitive and Behavioral Characteristics of Children with Fetal Alcohol Spectrum Disorders

    PubMed Central

    May, Philip A.; Tabachnick, Barbara G.; Gossage, J. Phillip; Kalberg, Wendy O.; Marais, Anna-Susan; Robinson, Luther K.; Manning, Melanie A.; Blankenship, Jason; Buckley, David; Hoyme, H. Eugene; Adnams, Colleen M.

    2013-01-01

    Objective To provide an analysis of multiple predictors of cognitive and behavioral traits for children with fetal alcohol spectrum disorders (FASD). Method Multivariate correlation techniques were employed with maternal and child data from epidemiologic studies in a community in South Africa. Data on 561 first grade children with fetal alcohol syndrome (FAS), partial FAS (PFAS), and not FASD and their mothers were analyzed by grouping 19 maternal variables into categories (physical, demographic, childbearing, and drinking) and employed in structural equation models (SEM) to assess correlates of child intelligence (verbal and non-verbal) and behavior. Results A first SEM utilizing only seven maternal alcohol use variables to predict cognitive/behavioral traits was statistically significant (B = 3.10, p < .05), but explained only 17.3% of the variance. The second model incorporated multiple maternal variables and was statistically significant explaining 55.3% of the variance. Significantly correlated with low intelligence and problem behavior were demographic (B = 3.83, p < .05) (low maternal education, low socioeconomic status (SES), and rural residence) and maternal physical characteristics (B = 2.70, p < .05) (short stature, small head circumference, and low weight). Childbearing history and alcohol use composites were not statistically significant in the final complex model, and were overpowered by SES and maternal physical traits. Conclusions While other analytic techniques have amply demonstrated the negative effects of maternal drinking on intelligence and behavior, this highly-controlled analysis of multiple maternal influences reveals that maternal demographics and physical traits make a significant enabling or disabling contribution to child functioning in FASD. PMID:23751886

  3. Impact of Rating Scale Categories on Reliability and Fit Statistics of the Malay Spiritual Well-Being Scale using Rasch Analysis.

    PubMed

    Daher, Aqil Mohammad; Ahmad, Syed Hassan; Winn, Than; Selamat, Mohd Ikhsan

    2015-01-01

    Few studies have employed item response theory in examining reliability. We conducted this study to examine the effect of Rating Scale Categories (RSCs) on the reliability and fit statistics of the Malay Spiritual Well-Being Scale, employing the Rasch model. The Malay Spiritual Well-Being Scale (SWBS), with the original six RSCs and with newly structured three- and four-category RSCs, was distributed randomly among three different samples of 50 participants each. The mean age of respondents in the three samples ranged between 36 and 39 years old. The majority were female in all samples, and Islam was the most prevalent religion among the respondents. The predominating race was Malay, followed by Chinese and Indian. The original six RSCs indicated better targeting of 0.99 and the smallest model error of 0.24. The Infit Mnsq (mean square) and Zstd (Z standard) of the six RSCs were 1.1 and -0.1, respectively. The six RSCs achieved the highest person and item reliabilities of 0.86 and 0.85, respectively. These reliabilities yielded the highest person (2.46) and item (2.38) separation indices compared to the other RSCs. The person and item reliability and, to a lesser extent, the fit statistics, were better with the six RSCs compared to the four and three RSCs.

  4. Statistical Modelling of Temperature and Moisture Uptake of Biochars Exposed to Selected Relative Humidity of Air.

    PubMed

    Bastistella, Luciane; Rousset, Patrick; Aviz, Antonio; Caldeira-Pires, Armando; Humbert, Gilles; Nogueira, Manoel

    2018-02-09

    New experimental techniques, as well as modern variants on known methods, have recently been employed to investigate the fundamental reactions underlying the oxidation of biochar. The purpose of this paper was to experimentally and statistically study how the relative humidity of air, mass, and particle size of four biochars influenced the adsorption of water and the increase in temperature. A random factorial design was employed using the intuitive statistical software Xlstat. A simple linear regression model and an analysis of variance with a pairwise comparison were performed. The experimental study was carried out on the wood of Quercus pubescens, Cyclobalanopsis glauca, Trigonostemon huangmosun, and Bambusa vulgaris, and involved five relative humidity conditions (22, 43, 75, 84, and 90%), two mass samples (0.1 and 1 g), and two particle sizes (powder and piece). Two response variables, water adsorption and temperature increase, were analyzed and discussed. The temperature did not increase linearly with the adsorption of water. Temperature was modeled by nine explanatory variables, while water adsorption was modeled by eight. Five variables, including factors and their interactions, were found to be common to the two models. Sample mass and relative humidity influenced the two response variables, while particle size and biochar type only influenced the temperature.
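
    A minimal sketch of the kind of linear-regression and analysis-of-variance treatment described (the study used XLSTAT; statsmodels is used here instead, and the mocked-up data frame, column names, and coefficients are assumptions, not the paper's data):

      # Illustrative linear model + ANOVA for temperature increase vs. experimental factors
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      rng = np.random.default_rng(1)
      n = 160
      df = pd.DataFrame({
          "humidity": rng.choice([22, 43, 75, 84, 90], size=n),
          "mass_g": rng.choice([0.1, 1.0], size=n),
          "particle": rng.choice(["powder", "piece"], size=n),
      })
      df["temp_rise"] = (0.02 * df["humidity"] + 1.5 * df["mass_g"]
                         + (df["particle"] == "powder") * 0.8
                         + rng.normal(0, 0.5, n))

      model = smf.ols("temp_rise ~ humidity * mass_g + C(particle)", data=df).fit()
      print(anova_lm(model, typ=2))      # factor and interaction effects
      print(model.params)                # regression coefficients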

  5. Post-operative diffusion weighted imaging as a predictor of posterior fossa syndrome permanence in paediatric medulloblastoma.

    PubMed

    Chua, Felicia H Z; Thien, Ady; Ng, Lee Ping; Seow, Wan Tew; Low, David C Y; Chang, Kenneth T E; Lian, Derrick W Q; Loh, Eva; Low, Sharon Y Y

    2017-03-01

    Posterior fossa syndrome (PFS) is a serious complication faced by neurosurgeons and their patients, especially in paediatric medulloblastoma patients. The uncertain aetiology of PFS, the myriad of cited risk factors, and the therapeutic challenges make this phenomenon an elusive entity. The primary objective of this study was to identify associative factors related to the development of PFS in medulloblastoma patients post-tumour resection. This is a retrospective study based at a single institution. Patient data and all related information were collected from the hospital records, in accordance with a list of possible risk factors associated with PFS. These included pre-operative tumour volume, hydrocephalus, age, gender, extent of resection, metastasis, ventriculoperitoneal shunt insertion, post-operative meningitis and radiological changes in MRI. Additional variables included molecular and histological subtypes of each patient's medulloblastoma tumour. Statistical analysis was employed to determine evidence of each variable's significance in PFS permanence. A total of 19 patients with appropriately complete data were identified. Initial univariate analysis did not show any statistical significance. However, multivariate analysis of MRI-specific changes showed that bilateral DWI restricted-diffusion changes involving both the right and left sides of the surgical cavity were statistically significant for PFS permanence. The authors performed a clinical study that evaluated possible risk factors for permanent PFS in paediatric medulloblastoma patients. Analysis of collated results found that post-operative DWI restriction in bilateral regions within the surgical cavity demonstrated statistical significance as a predictor of PFS permanence, a novel finding in the current literature.

  6. Work Experiences of Patients Receiving Palliative Care at a Comprehensive Cancer Center: Exploratory Analysis.

    PubMed

    Glare, Paul A; Nikolova, Tanya; Alickaj, Alberta; Patil, Sujata; Blinder, Victoria

    2017-07-01

    Employment-related issues have been largely overlooked in cancer patients needing palliative care. These issues may become more relevant as cancer evolves into more of a chronic illness and palliative care is provided independent of stage or prognosis. To characterize the employment situations of working-age palliative care patients. Cross-sectional survey setting/subjects: Consecutive sample of 112 patients followed in palliative care outpatient clinics at a comprehensive cancer center. Thirty-seven-item self-report questionnaire covering demographics, clinical status, and work experiences since diagnosis. The commonest cancer diagnoses were breast, colorectal, gynecological, and lung. Eighty-one percent had active disease. Seventy-four percent were on treatment. Eighty percent recalled being employed at the time of diagnosis, with 65% working full time. At the time of the survey, 44% were employed and 26% were working full time. Most participants said work was important, made them feel normal, and helped them feel they were "beating the cancer". Factors associated with being employed included male gender, self-employment, and taking less than three months off work. Respondents with pain and/or other symptoms were significantly less likely to be working. On multivariate analysis, only pain (odds ratio [OR] 8.16, p < 0.001) and other physical symptoms (OR 5.90, p = 0.012) predicted work status; gender (OR 2.07), self-employment (OR 3.07), and current chemotherapy (OR 1.81) were included in the model, but were not statistically significant in this small sample. Work may be an important issue for some palliative care patients. Additional research is needed to facilitate ongoing employment for those who wish or need to continue working.
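
    A minimal sketch of how odds ratios with confidence intervals can be obtained from a multivariate logistic model of employment status, as reported above (the simulated data frame and variable names are placeholders, not the survey data):

      # Illustrative logistic regression for employment status with odds ratios
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(7)
      n = 112
      df = pd.DataFrame({
          "employed": rng.integers(0, 2, n),
          "pain": rng.integers(0, 2, n),
          "other_symptoms": rng.integers(0, 2, n),
          "male": rng.integers(0, 2, n),
          "self_employed": rng.integers(0, 2, n),
      })

      fit = smf.logit("employed ~ pain + other_symptoms + male + self_employed", data=df).fit(disp=0)
      odds_ratios = np.exp(fit.params)           # exponentiated coefficients = odds ratios
      conf_int = np.exp(fit.conf_int())          # 95% confidence intervals on the OR scale
      print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))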

  7. Client predictors of employment outcomes in high-fidelity supported employment: a regression analysis.

    PubMed

    Campbell, Kikuko; Bond, Gary R; Drake, Robert E; McHugo, Gregory J; Xie, Haiyi

    2010-08-01

    Research on vocational rehabilitation for clients with severe mental illness over the past 2 decades has yielded inconsistent findings regarding client factors statistically related to employment. The present study aimed to elucidate the relationship between baseline client characteristics and competitive employment outcomes (job acquisition and total weeks worked during an 18-month follow-up) in Individual Placement and Support (IPS). Data from 4 recent randomized controlled trials of IPS were aggregated for within-group regression analyses. In the IPS sample (N = 307), work history was the only significant predictor for job acquisition, but receiving Supplemental Security Income (with or without Social Security Disability Insurance) was associated with fewer total weeks worked (2.0%-2.8% of the variance). In the comparison sample (N = 374), clients with a diagnosis of mood disorder or with less severe thought disorder symptoms were more likely to obtain competitive employment. The findings confirm that clients with severe mental illness interested in competitive work benefit most from high-fidelity supported employment regardless of their work history and sociodemographic and clinical background, and highlight the need for changes in federal policies for disability income support and insurance regulations.

  8. Validation of Physics Standardized Test Items

    NASA Astrophysics Data System (ADS)

    Marshall, Jill

    2008-10-01

    The Texas Physics Assessment Team (TPAT) examined the Texas Assessment of Knowledge and Skills (TAKS) to determine whether it is a valid indicator of physics preparation for future course work and employment, and of the knowledge and skills needed to act as an informed citizen in a technological society. We categorized science items from the 2003 and 2004 10th and 11th grade TAKS by content area(s) covered, knowledge and skills required to select the correct answer, and overall quality. We also analyzed a 5000-student sample of item-level results from the 2004 11th grade exam using standard statistical methods employed by test developers (factor analysis and Item Response Theory). Triangulation of our results revealed strengths and weaknesses of the different methods of analysis. The TAKS was found to be only weakly indicative of physics preparation, and we make recommendations for increasing the validity of standardized physics testing.

  9. Florida's Workforce 2005.

    ERIC Educational Resources Information Center

    Florida State Dept. of Labor and Employment Security, Tallahassee.

    This report analyzes projected changes in population, labor force, and employment by industry and occupation for Florida between 1995 and 2005. More than 50 charts and graphs provide statistics on the following: Florida's population, labor force 1975-2005; employment 1975-2005; industry employment 1995-2005; occupational employment (general);…

  10. Analysis of Statistical Methods Currently used in Toxicology Journals

    PubMed Central

    Na, Jihye; Yang, Hyeri

    2014-01-01

    Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by the studies are used consistently and conducted based on sound statistical grounds. The purpose of this paper is to describe statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Science and described methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes less than 10, with the median and the mode being 6 and 3 & 6, respectively. The mean (105/113, 93%) was predominantly used to measure central tendency, and the standard error of the mean (64/113, 57%) and standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why these methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being most popular (52/93, 56%), yet few studies conducted either a normality or an equal-variance test. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health. PMID:25343012
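
    A minimal sketch of the pre-checks the survey found to be largely missing: testing normality and homogeneity of variance before a one-way ANOVA, and falling back to a non-parametric test otherwise (the three groups of n = 6 below are simulated):

      # Illustrative pre-checks before a one-way ANOVA on three treatment groups
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      groups = [rng.normal(10, 2, 6), rng.normal(12, 2, 6), rng.normal(15, 2, 6)]  # n = 6 per group

      normal = all(stats.shapiro(g).pvalue > 0.05 for g in groups)   # Shapiro-Wilk normality test
      equal_var = stats.levene(*groups).pvalue > 0.05                # Levene's test for equal variances

      if normal and equal_var:
          stat, p = stats.f_oneway(*groups)
          print(f"one-way ANOVA: F = {stat:.2f}, p = {p:.3f}")
      else:
          stat, p = stats.kruskal(*groups)
          print(f"Kruskal-Wallis: H = {stat:.2f}, p = {p:.3f}")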

  11. Analysis of Statistical Methods Currently used in Toxicology Journals.

    PubMed

    Na, Jihye; Yang, Hyeri; Bae, SeungJin; Lim, Kyung-Min

    2014-09-01

    Statistical methods are frequently used in toxicology, yet it is not clear whether the methods employed by the studies are used consistently and conducted based on sound statistical grounds. The purpose of this paper is to describe statistical methods used in top toxicology journals. More specifically, we sampled 30 papers published in 2014 from Toxicology and Applied Pharmacology, Archives of Toxicology, and Toxicological Science and described methodologies used to provide descriptive and inferential statistics. One hundred thirteen endpoints were observed in those 30 papers, and most studies had sample sizes less than 10, with the median and the mode being 6 and 3 & 6, respectively. The mean (105/113, 93%) was predominantly used to measure central tendency, and the standard error of the mean (64/113, 57%) and standard deviation (39/113, 34%) were used to measure dispersion, while few studies provided justification for why these methods were selected. Inferential statistics were frequently conducted (93/113, 82%), with one-way ANOVA being most popular (52/93, 56%), yet few studies conducted either a normality or an equal-variance test. These results suggest that more consistent and appropriate use of statistical methods is necessary, which may enhance the role of toxicology in public health.

  12. Spatial Autocorrelation Approaches to Testing Residuals from Least Squares Regression

    PubMed Central

    Chen, Yanguang

    2016-01-01

    In geo-statistics, the Durbin-Watson test is frequently employed to detect the presence of residual serial correlation from least squares regression analyses. However, the Durbin-Watson statistic is only suitable for ordered time or spatial series. If the variables comprise cross-sectional data coming from spatial random sampling, the test will be ineffectual because the value of Durbin-Watson’s statistic depends on the sequence of data points. This paper develops two new statistics for testing serial correlation of residuals from least squares regression based on spatial samples. By analogy with the new form of Moran’s index, an autocorrelation coefficient is defined with a standardized residual vector and a normalized spatial weight matrix. Then, by analogy with the Durbin-Watson statistic, two types of new serial correlation indices are constructed. As a case study, the two newly presented statistics are applied to a spatial sample of 29 regions of China. These results show that the new spatial autocorrelation models can be used to test the serial correlation of residuals from regression analysis. In practice, the new statistics can make up for the deficiencies of the Durbin-Watson test. PMID:26800271
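
    A minimal numpy sketch of a Moran-type autocorrelation coefficient computed from a standardized residual vector and a normalized spatial weight matrix, in the spirit of the statistics described above (this is the classical form, not necessarily the exact indices proposed in the paper, and the weight matrix is randomly generated for illustration):

      # Moran-type autocorrelation of OLS residuals with a normalized spatial weight matrix
      import numpy as np

      rng = np.random.default_rng(11)
      n = 29
      X = np.column_stack([np.ones(n), rng.normal(size=n)])   # design matrix with intercept
      y = X @ np.array([2.0, 1.5]) + rng.normal(size=n)

      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      e = y - X @ beta                                         # regression residuals

      # Symmetric spatial weight matrix (random contiguity-like weights), zero diagonal
      W = rng.random((n, n)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
      W /= W.sum()                                             # normalize weights to sum to 1

      z = (e - e.mean()) / e.std()                             # standardized residual vector
      moran_like = n * (z @ W @ z) / (z @ z)                   # Moran-type index of the residuals
      print(f"Moran-type residual autocorrelation: {moran_like:.3f}")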

  13. Morphological texture assessment of oral bone as a screening tool for osteoporosis

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa; Eggertsson, Hafsteinn; Eckert, George

    2001-07-01

    Three classes of texture analysis approaches have been employed to assess the textural characteristics of oral bone. A set of linear structuring elements was used to compute granulometric features of trabecular bone. Multifractal analysis was also used to compute the fractal dimension of the corresponding tissues. In addition, some statistical features and histomorphometric parameters were computed. To assess the proposed approach we acquired digital intraoral radiographs of 47 subjects (14 males and 33 females). All radiographs were captured at 12 bits/pixel. Images were converted to binary form through a sliding locally adaptive thresholding approach. Each subject was scanned by DEXA for bone densitometry. Subjects were classified into one of the following three categories according to the World Health Organization (WHO) standard: (1) healthy, (2) osteopenia, and (3) osteoporosis. In this study fractal dimension showed very low correlation with bone mineral density (BMD) measurements, which did not reach a level of statistical significance (p<0.5). However, entropy of pattern spectrum (EPS), along with statistical features and histomorphometric parameters, has shown correlation coefficients ranging from low to high, with statistical significance for both males and females. The results of this study indicate the utility of this approach for bone texture analysis. It is conjectured that designing a 2-D structuring element, specially tuned to trabecular bone texture, will increase the efficacy of the proposed method.
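
    A minimal sketch of a granulometric pattern spectrum and its entropy (EPS) computed with morphological openings of increasing size, in the spirit of the approach described (the toy binary texture, structuring element, and number of scales are assumptions, not the radiographic data):

      # Granulometry of a toy binary texture: pattern spectrum and its entropy (EPS)
      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(5)
      img = ndimage.gaussian_filter(rng.random((128, 128)), sigma=2) > 0.5   # toy binary "trabecular" pattern

      struct = ndimage.generate_binary_structure(2, 1)   # 4-connected structuring element
      areas = [img.sum()]
      for k in range(1, 8):
          opened = ndimage.binary_opening(img, structure=struct, iterations=k)
          areas.append(opened.sum())

      spectrum = -np.diff(areas).astype(float)           # area removed at each scale
      p = spectrum / spectrum.sum()
      p = p[p > 0]
      entropy_of_pattern_spectrum = -(p * np.log2(p)).sum()
      print(f"EPS = {entropy_of_pattern_spectrum:.3f} bits")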

  14. Emergence of patterns in random processes

    NASA Astrophysics Data System (ADS)

    Newman, William I.; Turcotte, Donald L.; Malamud, Bruce D.

    2012-08-01

    Sixty years ago, it was observed that any independent and identically distributed (i.i.d.) random variable would produce a pattern of peak-to-peak sequences with, on average, three events per sequence. This outcome was employed to show that randomness could yield, as a null hypothesis for animal populations, an explanation for their apparent 3-year cycles. We show how we can explicitly obtain a universal distribution of the lengths of peak-to-peak sequences in time series and that this can be employed for long data sets as a test of their i.i.d. character. We illustrate the validity of our analysis utilizing the peak-to-peak statistics of a Gaussian white noise. We also consider the nearest-neighbor cluster statistics of point processes in time. If the time intervals are random, we show that cluster size statistics are identical to the peak-to-peak sequence statistics of time series. In order to study the influence of correlations in a time series, we determine the peak-to-peak sequence statistics for the Langevin equation of kinetic theory leading to Brownian motion. To test our methodology, we consider a variety of applications. Using a global catalog of earthquakes, we obtain the peak-to-peak statistics of earthquake magnitudes and the nearest neighbor interoccurrence time statistics. In both cases, we find good agreement with the i.i.d. theory. We also consider the interval statistics of the Old Faithful geyser in Yellowstone National Park. In this case, we find a significant deviation from the i.i.d. theory which we attribute to antipersistence. We consider the interval statistics using the AL index of geomagnetic substorms. We again find a significant deviation from i.i.d. behavior that we attribute to mild persistence. Finally, we examine the behavior of Standard and Poor's 500 stock index's daily returns from 1928-2011 and show that, while it is close to being i.i.d., there is, again, significant persistence. We expect that there will be many other applications of our methodology both to interoccurrence statistics and to time series.
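
    A short numpy check of the i.i.d. result quoted above, that peak-to-peak sequences of a Gaussian white noise contain, on average, three events (the sample length and seed are arbitrary):

      # Peak-to-peak sequence lengths of Gaussian white noise: the mean should be close to 3
      import numpy as np

      rng = np.random.default_rng(8)
      x = rng.normal(size=1_000_000)

      # A "peak" is a point larger than both of its neighbours
      is_peak = (x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])
      peak_idx = np.flatnonzero(is_peak) + 1

      lengths = np.diff(peak_idx)                          # events per peak-to-peak sequence
      print(f"mean sequence length: {lengths.mean():.3f}")
      values, counts = np.unique(lengths, return_counts=True)
      print("distribution of the shortest lengths:",
            dict(zip(values[:5].tolist(), (counts[:5] / counts.sum()).round(3).tolist())))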

  15. A comparison of the views of extension agents and farmers regarding extension education courses in Dezful, Iran

    NASA Astrophysics Data System (ADS)

    Nazarzadeh Zare, Mohsen; Dorrani, Kamal; Gholamali Lavasani, Masoud

    2012-11-01

    Background and purpose: This study examines the views of farmers and extension agents participating in extension education courses in Dezful, Iran, with regard to problems with these courses. It relies upon a descriptive methodology, using a survey as its instrument. Sample: The statistical population consisted of 5060 farmers and 50 extension agents; all extension agents were studied owing to their small population, and a sample of 466 farmers was selected based on the stratified ratio sampling method. For the data analysis, statistical procedures including the t-test and factor analysis were used. Results: The results of factor analysis on the views of farmers indicated that these courses have problems such as inadequate use of instructional materials by extension agents, insufficient employment of knowledgeable and experienced extension agents, bad and inconvenient timing of courses for farmers, lack of logical connection between one curriculum and prior ones, negligence in considering the opinions of farmers in arranging the courses, and lack of information about the time of courses. The findings of factor analysis on the views of extension agents indicated that these courses suffer from problems such as use of consistent methods of instruction for teaching curricula, and lack of continuity between courses and their levels and content. Conclusions: Recommendations include: listening to the views of farmers when planning extension courses; providing audiovisual aids, pamphlets and CDs; arranging courses based on convenient timing for farmers; using incentives to encourage participation; and employing extension agents with knowledge of the latest agricultural issues.

  16. Thermoelastic Stress Analysis: The Mean Stress Effect in Metallic Alloys

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, Andrew L.; Baaklini, George Y.

    1999-01-01

    The primary objective of this study involved the utilization of the thermoelastic stress analysis (TSA) method to demonstrate the mean stress dependence of the thermoelastic constant. Titanium and nickel base alloys, commonly employed in aerospace gas turbines, were the materials of interest. The repeatability of the results was studied through a statistical analysis of the data. Although the mean stress dependence was well established, the ability to confidently quantify it was diminished by the experimental variations. If calibration of the thermoelastic response to mean stress can be successfully implemented, it is feasible to use the relationship to determine a structure's residual stress state.

  17. SSME/side loads analysis for flight configuration, revision A. [structural analysis of space shuttle main engine under side load excitation

    NASA Technical Reports Server (NTRS)

    Holland, W.

    1974-01-01

    This document describes the dynamic loads analysis accomplished for the Space Shuttle Main Engine (SSME) considering the side load excitation associated with transient flow separation on the engine bell during ground ignition. The results contained herein pertain only to the flight configuration. A Monte Carlo procedure was employed to select the input variables describing the side load excitation and the loads were statistically combined. This revision includes an active thrust vector control system representation and updated orbiter thrust structure stiffness characteristics. No future revisions are planned but may be necessary as system definition and input parameters change.
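
    A toy Monte Carlo sketch of the general procedure described, sampling uncertain side-load inputs and statistically combining the resulting loads to obtain a design percentile (the load model, distributions, and numbers are purely illustrative and unrelated to the SSME analysis):

      # Toy Monte Carlo combination of a steady load with a randomly oriented side load
      import numpy as np

      rng = np.random.default_rng(2024)
      n = 100_000

      side_load = rng.lognormal(mean=np.log(50.0), sigma=0.25, size=n)   # kN, uncertain magnitude
      direction = rng.uniform(0.0, 2 * np.pi, size=n)                    # random application direction
      dynamic_factor = rng.normal(1.2, 0.1, size=n)                      # structural amplification

      # Load at one attach point: steady component plus the projected side load
      steady = 30.0                                                      # kN, steady component (assumed)
      attach_load = steady + dynamic_factor * side_load * np.cos(direction)

      print(f"mean load: {attach_load.mean():.1f} kN")
      print(f"3-sigma design load: {np.percentile(attach_load, 99.87):.1f} kN")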

  18. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimation of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements in real-time PCR based transgene copy number determination. Three experimental designs and four data quality control integrated statistical models are presented. For the first method, external calibration curves are established for the transgene based on serially-diluted templates. The Ct number from a control transgenic event and putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two group T-test procedures were combined to model the data from this design. For the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of transgene was compared with that of internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with reference gene without a standard curve, but rather, is based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data based on two different approaches of amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow real-time PCR-based transgene copy number estimation to be more reliable and precise, with a proper statistical estimation. Proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four different statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays including transfection efficiency analysis and pathogen quantification.
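
    A minimal sketch of the first experimental design described: an external calibration curve regressed from a serial dilution, followed by a comparison of replicate Ct values between a control and a putative transgenic event (all Ct values and copy numbers below are invented for illustration):

      # Illustrative standard-curve estimation of transgene copy number from Ct values
      import numpy as np
      from scipy import stats

      # Serial dilution with known template copy numbers (all values invented)
      log10_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
      ct_curve = np.array([31.2, 27.9, 24.5, 21.2, 17.8])

      slope, intercept, r_value, _, _ = stats.linregress(log10_copies, ct_curve)
      efficiency = 10 ** (-1.0 / slope) - 1.0
      print(f"slope = {slope:.2f}, R^2 = {r_value**2:.3f}, efficiency = {efficiency:.1%}")

      def copies(ct):
          # Invert the calibration curve: Ct = slope * log10(copies) + intercept
          return 10 ** ((ct - intercept) / slope)

      # Replicate Ct values for a single-copy control event and a putative event
      ct_control = np.array([24.6, 24.5, 24.7])
      ct_unknown = np.array([23.5, 23.6, 23.4])

      ratio = copies(ct_unknown.mean()) / copies(ct_control.mean())
      t_stat, p_val = stats.ttest_ind(ct_control, ct_unknown)
      print(f"estimated copy ratio vs control: {ratio:.2f} (two-group t-test p = {p_val:.4f})")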

  19. Patterns of medicinal plant use: an examination of the Ecuadorian Shuar medicinal flora using contingency table and binomial analyses.

    PubMed

    Bennett, Bradley C; Husby, Chad E

    2008-03-28

    Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly-employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
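
    A minimal sketch of the per-family binomial test described above, asking whether one family contributes more medicinal species than expected from the flora-wide proportion (all counts are invented placeholders, not the Shuar data):

      # Binomial test for over-representation of medicinal species in one plant family
      from scipy.stats import binomtest

      flora_species = 3000          # total species in the flora (made-up counts)
      medicinal_species = 450       # species with a recorded medicinal use
      family_species = 120          # species in the family of interest
      family_medicinal = 35         # medicinal species in that family

      p_overall = medicinal_species / flora_species
      result = binomtest(family_medicinal, n=family_species, p=p_overall, alternative="greater")
      print(f"expected = {p_overall * family_species:.1f}, observed = {family_medicinal}, "
            f"p = {result.pvalue:.4f}")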

  20. Analysis of Landsat-4 Thematic Mapper data for classification of forest stands in Baldwin County, Alabama

    NASA Technical Reports Server (NTRS)

    Hill, C. L.

    1984-01-01

    A computer-implemented classification has been derived from Landsat-4 Thematic Mapper data acquired over Baldwin County, Alabama on January 15, 1983. One set of spectral signatures was developed from the data by utilizing a 3x3 pixel sliding window approach. An analysis of the classification produced from this technique identified forested areas. Additional information regarding only the forested areas was extracted by employing a pixel-by-pixel signature development program which derived spectral statistics only for pixels within the forested land covers. The spectral statistics from both approaches were integrated and the data classified. This classification was evaluated by comparing the spectral classes produced from the data against corresponding ground verification polygons. This iterative data analysis technique resulted in an overall classification accuracy of 88.4 percent correct for slash pine, young pine, loblolly pine, natural pine, and mixed hardwood-pine. An accuracy assessment matrix has been produced for the classification.

  1. Use of Multivariate Linkage Analysis for Dissection of a Complex Cognitive Trait

    PubMed Central

    Marlow, Angela J.; Fisher, Simon E.; Francks, Clyde; MacPhie, I. Laurence; Cherny, Stacey S.; Richardson, Alex J.; Talcott, Joel B.; Stein, John F.; Monaco, Anthony P.; Cardon, Lon R.

    2003-01-01

    Replication of linkage results for complex traits has been exceedingly difficult, owing in part to the inability to measure the precise underlying phenotype, small sample sizes, genetic heterogeneity, and statistical methods employed in analysis. Often, in any particular study, multiple correlated traits have been collected, yet these have been analyzed independently or, at most, in bivariate analyses. Theoretical arguments suggest that full multivariate analysis of all available traits should offer more power to detect linkage; however, this has not yet been evaluated on a genomewide scale. Here, we conduct multivariate genomewide analyses of quantitative-trait loci that influence reading- and language-related measures in families affected with developmental dyslexia. The results of these analyses are substantially clearer than those of previous univariate analyses of the same data set, helping to resolve a number of key issues. These outcomes highlight the relevance of multivariate analysis for complex disorders for dissection of linkage results in correlated traits. The approach employed here may aid positional cloning of susceptibility genes in a wide spectrum of complex traits. PMID:12587094

  2. Urban environmental health applications of remote sensing, summary report

    NASA Technical Reports Server (NTRS)

    Rush, M.; Goldstein, J.; Hsi, B. P.; Olsen, C. B.

    1975-01-01

    Health and its association with the physical environment was studied based on the hypothesis that there is a relationship between the man-made physical environment and health status of a population. The statistical technique of regression analysis was employed to show the degree of association and aspects of physical environment which accounted for the greater variation in health status. Mortality, venereal disease, tuberculosis, hepatitis, meningitis, shigella/salmonella, hypertension and cardiac arrest/myocardial infarction were examined. The statistical techniques were used to measure association and variation, not necessarily cause and effect. Conclusions drawn show that the association still exists in the decade of the 1970's and that it can be successfully monitored with the methodology of remote sensing.

  3. Instrumental and statistical methods for the comparison of class evidence

    NASA Astrophysics Data System (ADS)

    Liszewski, Elisa Anne

    Trace evidence is a major field within forensic science. Association of trace evidence samples can be problematic due to sample heterogeneity and a lack of quantitative criteria for comparing spectra or chromatograms. The aim of this study is to evaluate different types of instrumentation for their ability to discriminate among samples of various types of trace evidence. Chemometric analysis, including techniques such as Agglomerative Hierarchical Clustering, Principal Components Analysis, and Discriminant Analysis, was employed to evaluate instrumental data. First, automotive clear coats were analyzed by using microspectrophotometry to collect UV absorption data. In total, 71 samples were analyzed with classification accuracy of 91.61%. An external validation was performed, resulting in a prediction accuracy of 81.11%. Next, fiber dyes were analyzed using UV-Visible microspectrophotometry. While several physical characteristics of cotton fiber can be identified and compared, fiber color is considered to be an excellent source of variation, and thus was examined in this study. Twelve dyes were employed, some being visually indistinguishable. Several different analyses and comparisons were done, including an inter-laboratory comparison and external validations. Lastly, common plastic samples and other polymers were analyzed using pyrolysis-gas chromatography/mass spectrometry, and their pyrolysis products were then analyzed using multivariate statistics. The classification accuracy varied dependent upon the number of classes chosen, but the plastics were grouped based on composition. The polymers were used as an external validation and misclassifications occurred with chlorinated samples all being placed into the category containing PVC.

  4. Metabolomic fingerprinting employing DART-TOFMS for authentication of tomatoes and peppers from organic and conventional farming.

    PubMed

    Novotná, H; Kmiecik, O; Gałązka, M; Krtková, V; Hurajová, A; Schulzová, V; Hallmann, E; Rembiałkowska, E; Hajšlová, J

    2012-01-01

    The rapidly growing demand for organic food requires the availability of analytical tools enabling their authentication. Recently, metabolomic fingerprinting/profiling has been demonstrated as a challenging option for a comprehensive characterisation of small molecules occurring in plants, since their pattern may reflect the impact of various external factors. In a two-year pilot study, concerned with the classification of organic versus conventional crops, ambient mass spectrometry consisting of a direct analysis in real time (DART) ion source and a time-of-flight mass spectrometer (TOFMS) was employed. This novel methodology was tested on 40 tomato and 24 pepper samples grown under specified conditions. To calculate statistical models, the obtained data (mass spectra) were processed by the principal component analysis (PCA) followed by linear discriminant analysis (LDA). The results from the positive ionisation mode enabled better differentiation between organic and conventional samples than the results from the negative mode. In this case, the recognition ability obtained by LDA was 97.5% for tomato and 100% for pepper samples and the prediction abilities were above 80% for both sample sets. The results suggest that the year of production had stronger influence on the metabolomic fingerprints compared with the type of farming (organic versus conventional). In any case, DART-TOFMS is a promising tool for rapid screening of samples. Establishing comprehensive (multi-sample) long-term databases may further help to improve the quality of statistical classification models.
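
    A minimal sketch of the PCA-followed-by-LDA classification workflow described, with cross-validation standing in for the external validation (the simulated spectra, class labels, and number of retained components are assumptions, not the DART-TOFMS data):

      # PCA + LDA classification of simulated spectral fingerprints (organic vs. conventional)
      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(9)
      X = rng.normal(size=(40, 500))                  # 40 samples x 500 m/z features (simulated)
      y = np.repeat([0, 1], 20)                       # 0 = conventional, 1 = organic (labels assumed)
      X[y == 1] += rng.normal(0.0, 0.3, size=500)     # inject a small class difference

      clf = make_pipeline(StandardScaler(), PCA(n_components=10), LinearDiscriminantAnalysis())
      scores = cross_val_score(clf, X, y, cv=5)
      print(f"cross-validated recognition ability: {scores.mean():.2%}")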

  5. Effect of unemployment on cardiovascular risk factors and mental health.

    PubMed

    Zagożdżon, P; Parszuto, J; Wrotkowska, M; Dydjow-Bendek, D

    2014-09-01

    Following the economic changes in Poland, increasing health discrepancies were observed during a period of 20 years, which may be partly attributable to the consequences of unemployment. To assess the association between unemployment, major cardiovascular risk factors and mental health. A cross-sectional study in which data were collected between 2009 and 2010 during preventive health examinations by an occupational medicine service in Gdansk, Poland. Data on blood pressure, resting heart rate, information about smoking habits, body mass index and history of use of mental health services were collected during these assessments. Multiple logistic regression was used during data analysis to adjust for age, gender, education and length of employment. Study participants comprised 3052 unemployed and 2059 employed individuals. After adjustment for age, gender, education and number of previous employments, the odds ratio (OR) for hypertension in relation to unemployment was 1.02 [95% confidence interval (95% CI) 0.84-1.23]. There was a statistically significant negative association between being overweight and unemployment (OR = 0.81, 95% CI: 0.66-0.99). Smoking was positively associated with unemployment after adjustment for age and sex (OR = 1.45, 95% CI: 1.25-1.67). There was a positive relationship between mental ill-health and unemployment among study participants (OR = 2.05, 95% CI: 0.91-4.65), but this was not statistically significant. The patterns of major cardiovascular risk factors differed between unemployed and employed individuals in Poland. Our observations suggest employment status is a predictor of specific disease risk profiles; consequently, specific preventive measures are needed in unemployed individuals. © The Author 2014. Published by Oxford University Press on behalf of the Society of Occupational Medicine. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. A two-point diagnostic for the H II galaxy Hubble diagram

    NASA Astrophysics Data System (ADS)

    Leaf, Kyle; Melia, Fulvio

    2018-03-01

    A previous analysis of starburst-dominated H II galaxies and H II regions has demonstrated a statistically significant preference for the Friedmann-Robertson-Walker cosmology with zero active mass, known as the Rh = ct universe, over Λ cold dark matter (ΛCDM) and its related dark-matter parametrizations. In this paper, we employ a two-point diagnostic with these data to present a complementary statistical comparison of Rh = ct with Planck ΛCDM. Our two-point diagnostic compares, in a pairwise fashion, the difference between the distance modulus measured at two redshifts with that predicted by each cosmology. Our results support the conclusion drawn by a previous comparative analysis demonstrating that Rh = ct is statistically preferred over Planck ΛCDM. But we also find that the reported errors in the H II measurements may not be purely Gaussian, perhaps due to a partial contamination by non-Gaussian systematic effects. The use of H II galaxies and H II regions as standard candles may be improved even further with a better handling of the systematics in these sources.
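
    A minimal sketch of the two-point idea: computing the difference in distance modulus between two redshifts as predicted by Rh = ct and by flat ΛCDM, which can then be compared pairwise with observed values (the H0 and matter-density values are assumed Planck-like numbers, and the two redshifts are arbitrary):

      # Two-point comparison of predicted distance moduli in Rh = ct and flat LambdaCDM
      import numpy as np
      from scipy.integrate import quad

      C = 299792.458          # speed of light, km/s
      H0 = 67.8               # km/s/Mpc (assumed Planck-like value)
      OM = 0.31               # matter density parameter (assumed)

      def mu_rhct(z):
          dl = (C / H0) * (1 + z) * np.log(1 + z)            # luminosity distance in Mpc
          return 5 * np.log10(dl) + 25

      def mu_lcdm(z):
          integral, _ = quad(lambda zp: 1 / np.sqrt(OM * (1 + zp) ** 3 + 1 - OM), 0, z)
          dl = (C / H0) * (1 + z) * integral
          return 5 * np.log10(dl) + 25

      # Two-point diagnostic: difference of distance moduli between two redshifts
      z1, z2 = 0.5, 2.0
      for name, mu in [("Rh = ct", mu_rhct), ("LambdaCDM", mu_lcdm)]:
          print(f"{name}: mu(z2) - mu(z1) = {mu(z2) - mu(z1):.3f}")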

  7. Changes in Occupational Employment in the Food and Kindred Products Industry, 1977-1980. Technical Note No. 1.

    ERIC Educational Resources Information Center

    Lewis, Gary

    The extent to which occupational staffing patterns change over time was examined in a study focusing on the Food and Kindred Products industry--Standard Industrial Classification (SIC) 20. Data were taken from the 1977 and 1980 Occupational Employment Statistics program coordinated by the United States Department of Labor Statistics. Actual 1980…

  8. Minorities and Women in State and Local Governments. 1974. Volume V--Township Governments. Research Report No. 52-5.

    ERIC Educational Resources Information Center

    Reshad, Rosalind S.

    One of six volumes summarizing through narrative and statistical tables data collected by the Equal Employment Opportunity Commission in its 1974 survey, this fifth volume details nationwide statistics on the employment status of minorities and women working in township governments. Data from 299 actual units of government in fourteen states were…

  9. Minorities and Women in State and Local Governments. 1974. Volume IV--Municipal Governments. Research Report No. 52-4.

    ERIC Educational Resources Information Center

    Skinner, Alice W.

    One of six volumes summarizing through narrative and statistical tables data collected by the Equal Employment Opportunity Commission in its 1974 survey, this fourth volume details the employment status of minorities and women in municipal governments. Based on reports filed by 2,230 municipalities, statistics in this study are designed to…

  10. Task-based data-acquisition optimization for sparse image reconstruction systems

    NASA Astrophysics Data System (ADS)

    Chen, Yujia; Lou, Yang; Kupinski, Matthew A.; Anastasio, Mark A.

    2017-03-01

    Conventional wisdom dictates that imaging hardware should be optimized by use of an ideal observer (IO) that exploits full statistical knowledge of the class of objects to be imaged, without consideration of the reconstruction method to be employed. However, accurate and tractable models of the complete object statistics are often difficult to determine in practice. Moreover, in imaging systems that employ compressive sensing concepts, imaging hardware and (sparse) image reconstruction are innately coupled technologies. We have previously proposed a sparsity-driven ideal observer (SDIO) that can be employed to optimize hardware by use of a stochastic object model that describes object sparsity. The SDIO and sparse reconstruction method can therefore be "matched" in the sense that they both utilize the same statistical information regarding the class of objects to be imaged. To efficiently compute SDIO performance, the posterior distribution is estimated by use of computational tools developed recently for variational Bayesian inference. Subsequently, the SDIO test statistic can be computed semi-analytically. The advantages of employing the SDIO instead of a Hotelling observer are systematically demonstrated in case studies in which magnetic resonance imaging (MRI) data acquisition schemes are optimized for signal detection tasks.

  11. Prevention of the Posttraumatic Fibrotic Response in Joints

    DTIC Science & Technology

    2015-10-01

    are currently used on a regular basis. Major Task 4: Evaluating the efficacy of inhibitory chIgG to reduce the consequences of traumatic joint...injury. During the second year of study, we successfully employed all assays needed to evaluate the utility of the inhibitory antibody to reduce the...32nd week 1. Major Task 5: Task 4. Data analysis and statistical evaluation of results. All data from the mechanical measurements, from the

  12. Prevention of the Posttraumatic Fibrotic Response in Joints

    DTIC Science & Technology

    2015-10-01

    surgical procedures and subsequent collection of tissues have been developed and are currently used on a regular basis. Major Task 4: Evaluating the...needed to evaluate the utility of the inhibitory antibody to reduce the flexion contracture of injured knee joints. The employed techniques include...second surgery to remove a pin, and it did not change by the end of the 32nd week 1. Major Task 5: Task 4. Data analysis and statistical evaluation

  13. Thermal Decomposition of 1,5-Dinitrobiuret (DNB): Direct Dynamics Trajectory Simulations and Statistical Modeling

    DTIC Science & Technology

    2011-05-03

    ...branching using Rice-Ramsperger-Kassel-Marcus (RRKM) theory, and finally to the analysis of inter-conversions of primary decomposition products...theory was employed to examine the properties of the reactant, intermediate complex and transition states as a function of the total internal energy

  14. Statistical Measurement and Analysis of Claimant and Demographic Variables Affecting Processing and Adjudication Duration in The United States Army Physical Disability Evaluation System.

    DTIC Science & Technology

    1997-02-06

    This retrospective study analyzes relationships of variables to adjudication and processing duration in the Army...Package for Social Scientists (SPSS), Standard Version 6.1, June 1994, to determine relationships among the dependent and independent variables... consanguinity between variables. Content and criterion validity is employed to determine the measure of scientific validity. Reliability is also

  15. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques

    PubMed Central

    Benson, Nsikak U.; Asuquo, Francis E.; Williams, Akan B.; Essien, Joseph P.; Ekong, Cyril I.; Akpabio, Otobong; Olajire, Abaas A.

    2016-01-01

    Trace metals (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in Niger Delta (Nigeria). The degree of contamination was assessed using the individual contamination factors (ICF) and global contamination factor (GCF). Multivariate statistical approaches including principal component analysis (PCA), cluster analysis and correlation test were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. Ecological risk index by ICF showed significant potential mobility and bioavailability for Cu, Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metals contamination in the ecosystems was influenced by multiple pollution sources. PMID:27257934

  16. Precipitation forecast using artificial neural networks. An application to the Guadalupe Valley, Baja California, Mexico

    NASA Astrophysics Data System (ADS)

    Herrera-Oliva, C. S.

    2013-05-01

    In this work we design and implement a method for precipitation forecasting through the application of an elementary neural network (perceptron) to the statistical analysis of the precipitation reported in catalogues. The method is limited mainly by the catalogue length (and, to a smaller degree, by its accuracy). The method performance is measured using grading functions that evaluate a tradeoff between positive and negative aspects of performance. The method is applied to the Guadalupe Valley, Baja California, Mexico, using consecutive intervals of dt = 0.1 year and employing the data of several climatological stations situated in and around this important wine-industry zone. We evaluated the performance of different ANN models, whose input variables are the precipitation depths. The results obtained were satisfactory, except for exceptional rainfall values. Key words: precipitation forecast, artificial neural networks, statistical analysis

  17. Literature review of some selected types of results and statistical analyses of total-ozone data. [for the ozonosphere

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1976-01-01

    The depletion of ozone in the stratosphere is examined, and causes for the depletion are cited. Ground station and satellite measurements of ozone, which are taken on a worldwide basis, are discussed. Instruments used in ozone measurement are discussed, such as the Dobson spectrophotometer, which is credited with providing the longest and most extensive series of observations for ground based observation of stratospheric ozone. Other ground based instruments used to measure ozone are also discussed. The statistical differences of ground based measurements of ozone from these different instruments are compared to each other, and to satellite measurements. Mathematical methods (i.e., trend analysis or linear regression analysis) of analyzing the variability of ozone concentration with respect to time and latitude are described. Various time series models which can be employed in accounting for ozone concentration variability are examined.

  18. Multiscale volatility duration characteristics on financial multi-continuum percolation dynamics

    NASA Astrophysics Data System (ADS)

    Wang, Min; Wang, Jun

    A random stock price model based on the multi-continuum percolation system is developed to investigate the nonlinear dynamics of stock price volatility duration, in an attempt to explain various statistical facts found in financial data and to gain a deeper understanding of mechanisms in the financial market. The continuum percolation system is usually referred to as a random coverage process or a Boolean model; it is a member of a class of statistical physics systems. In this paper, multi-continuum percolation (with different values of radius) is employed to model and reproduce the dispersal of information among investors. To test the validity of the proposed model, nonlinear analyses of the return volatility duration series are performed by multifractal detrending moving average analysis and Zipf analysis. The comparison of empirical results indicates similar nonlinear behaviors for the proposed model and the actual Chinese stock market.

  19. Probing the statistical properties of CMB B-mode polarization through Minkowski functionals

    NASA Astrophysics Data System (ADS)

    Santos, Larissa; Wang, Kai; Zhao, Wen

    2016-07-01

    The detection of the magnetic-type B-mode polarization is the main goal of future cosmic microwave background (CMB) experiments. In the standard model, the B-mode map is a strongly non-Gaussian field due to the CMB lensing component. Besides the two-point correlation function, other statistics are also very important for extracting the information contained in the polarization map. In this paper, we employ the Minkowski functionals to study the morphological properties of the lensed B-mode maps. We find that the deviations from Gaussianity are very significant for both full- and partial-sky surveys. As an application of the analysis, we investigate the morphological imprints of the foreground residuals in the B-mode map. We find that even for very tiny foreground residuals, the effects on the map can be detected by the Minkowski functional analysis. Therefore, it provides a complementary way to investigate foreground contamination in CMB studies.

  20. Response of SiC{sub f}/Si{sub 3}N{sub 4} composites under static and cyclic loading -- An experimental and statistical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahfuz, H.; Maniruzzaman, M.; Vaidya, U.

    1997-04-01

    Monotonic tensile and fatigue response of continuous silicon carbide fiber reinforced silicon nitride (SiC{sub f}/Si{sub 3}N{sub 4}) composites has been investigated. The monotonic tensile tests have been performed at room and elevated temperatures. Fatigue tests have been conducted at room temperature (RT), at a stress ratio of R = 0.1 and a frequency of 5 Hz. It is observed during the monotonic tests that the composites retain only 30% of their room temperature strength at 1,600 C, suggesting a substantial chemical degradation of the matrix at that temperature. The softening of the matrix at elevated temperature also causes a reduction in tensile modulus, and the total reduction in modulus is around 45%. Fatigue data have been generated at three load levels and the fatigue strength of the composite has been found to be considerably high; about 75% of its ultimate room temperature strength. Extensive statistical analysis has been performed to understand the degree of scatter in the fatigue as well as in the static test data. Weibull shape factors and characteristic values have been determined for each set of tests and their relationship with the response of the composites has been discussed. A statistical fatigue life prediction method developed from the Weibull distribution is also presented. A Maximum Likelihood Estimator with censoring techniques and data pooling schemes has been employed to determine the distribution parameters for the statistical analysis. These parameters have been used to generate the S-N diagram with the desired level of reliability. Details of the statistical analysis and the discussion of the static and fatigue behavior of the composites are presented in this paper.
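
    A minimal sketch of a two-parameter Weibull fit to fatigue-life data, returning the shape factor and characteristic value discussed above (the simulated lives are placeholders, and the censoring and data-pooling schemes used in the paper are not shown):

      # Two-parameter Weibull fit to simulated fatigue lives (no censoring or pooling shown)
      from scipy import stats

      cycles_to_failure = stats.weibull_min.rvs(c=2.5, scale=1.2e5, size=15, random_state=4)

      shape, loc, scale = stats.weibull_min.fit(cycles_to_failure, floc=0)
      print(f"Weibull shape factor: {shape:.2f}")
      print(f"characteristic life (scale, 63.2% failure point): {scale:.3g} cycles")

      # Life at 95% reliability (5th percentile of the fitted distribution)
      b5_life = stats.weibull_min.ppf(0.05, shape, loc=0, scale=scale)
      print(f"life at 95% reliability: {b5_life:.3g} cycles")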

  1. Mixture distributions of wind speed in the UAE

    NASA Astrophysics Data System (ADS)

    Shin, J.; Ouarda, T.; Lee, T. S.

    2013-12-01

    Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes when the wind speed distribution presents bimodal and kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without investigation of the wind speed distribution. Due to these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of them. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: Normal, Gamma, Weibull and Extreme Value type-one (EV-1). Three parameter estimation methods, namely the Expectation Maximization algorithm, the Least Squares method and the Meta-Heuristic Maximum Likelihood (MHML) method, were employed to estimate the parameters of the mixture distributions. In order to compare the goodness-of-fit of the tested distributions and parameter estimation methods for the sample wind data, the adjusted coefficient of determination, the Bayesian Information Criterion (BIC) and the Chi-squared statistic were computed. Results indicate that MHML presents the best parameter estimation performance for the mixture distributions used. In most of the 7 stations, mixture distributions give the best fit. When the wind speed regime shows mixture distributional characteristics, most of these regimes present the kurtotic statistical characteristic. In particular, applications of mixture distributions at these stations show a significant improvement in explaining the whole wind speed regime. In addition, the Weibull-Weibull mixture distribution presents the best fit for the wind speed data in the UAE.
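
    The sketch below fits a two-component Weibull mixture to hourly wind speeds by direct maximization of the likelihood with scipy. It is a simplified stand-in for the EM and MHML estimators compared in the study, and the starting values are arbitrary assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def fit_weibull_mixture(v):
    """Two-component Weibull mixture for wind speed v (m/s), fitted by direct MLE."""
    v = np.asarray(v, dtype=float)

    def nll(p):
        w = 1.0 / (1.0 + np.exp(-p[0]))              # mixing weight constrained to (0, 1)
        k1, s1, k2, s2 = np.exp(p[1:])               # shapes and scales kept positive
        pdf = (w * weibull_min.pdf(v, k1, scale=s1)
               + (1.0 - w) * weibull_min.pdf(v, k2, scale=s2))
        return -np.sum(np.log(pdf + 1e-300))

    x0 = [0.0, np.log(2.0), np.log(0.7 * v.mean()), np.log(2.0), np.log(1.3 * v.mean())]
    res = minimize(nll, x0, method="Nelder-Mead", options={"maxiter": 20000})
    w = 1.0 / (1.0 + np.exp(-res.x[0]))
    k1, s1, k2, s2 = np.exp(res.x[1:])
    return w, (k1, s1), (k2, s2)
```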

  2. Facial anthropometric differences among gender, ethnicity, and age groups.

    PubMed

    Zhuang, Ziqing; Landsittel, Douglas; Benson, Stacey; Roberge, Raymond; Shaffer, Ronald

    2010-06-01

    The impact of race/ethnicity upon facial anthropometric data in the US workforce, and hence on the development of personal protective equipment, has not been investigated to any significant degree. The proliferation of minority populations in the US workforce has increased the need to investigate differences in facial dimensions among these workers. The objective of this study was to determine the face shape and size differences among race and age groups from the National Institute for Occupational Safety and Health survey of 3997 US civilian workers. Survey participants were divided into two gender groups, four racial/ethnic groups, and three age groups. Measurements of height, weight, neck circumference, and 18 facial dimensions were collected using traditional anthropometric techniques. A multivariate analysis of the data was performed using Principal Component Analysis. An exploratory analysis of the effect that different demographic factors had on anthropometric features was conducted via a linear model. The 21 anthropometric measurements, body mass index, and the first and second principal component scores were dependent variables, while gender, ethnicity, age, occupation, weight, and height served as independent variables. Gender significantly contributes to size for 19 of 24 dependent variables. African-Americans have statistically shorter, wider, and shallower noses than Caucasians. Hispanic workers have 14 facial features that are significantly larger than those of Caucasians, while their nose protrusion, height, and head length are significantly shorter. The other ethnic group was composed primarily of Asian subjects and has statistically different dimensions from Caucasians for 16 anthropometric values. Nineteen anthropometric values for subjects at least 45 years of age are statistically different from those measured for subjects between 18 and 29 years of age. Workers employed in manufacturing, fire fighting, healthcare, law enforcement, and other occupational groups have facial features that differ significantly from those of workers in construction. Statistically significant differences in facial anthropometric dimensions (P < 0.05) were noted between males and females, among all racial/ethnic groups, and for subjects who were at least 45 years old when compared to workers between 18 and 29 years of age. These findings could be important to the design and manufacture of respirators, as well as to employers responsible for supplying respiratory protective equipment to their employees.

  3. Comparative study of submerged and surface culture acetification process for orange vinegar.

    PubMed

    Cejudo-Bastante, Cristina; Durán-Guerrero, Enrique; García-Barroso, Carmelo; Castro-Mejías, Remedios

    2018-02-01

    The two main acetification methodologies generally employed in the production of vinegar (surface and submerged cultures) were studied and compared for the production of orange vinegar. Polyphenols (UPLC/DAD) and volatile compounds (SBSE-GC/MS) were considered as the main variables in the comparative study. Sensory characteristics of the obtained vinegars were also evaluated. Seventeen polyphenols and 24 volatile compounds were determined in the samples during both acetification processes. For phenolic compounds, analysis of variance showed significantly higher concentrations when surface culture acetification was employed. However, for the majority of volatile compounds higher contents were observed for the submerged culture acetification process, which was also reflected in the sensory analysis, with higher scores for the different descriptors. Multivariate statistical analysis such as principal component analysis demonstrated the possibility of discriminating the samples according to the type of acetification process. Polyphenols such as an apigenin derivative and ferulic acid, and volatile compounds such as 4-vinylguaiacol, decanoic acid, nootkatone, trans-geraniol, β-citronellol and α-terpineol, among others, were the compounds that contributed most to the discrimination of the samples. The acetification process employed in the production of orange vinegar has been demonstrated to be highly significant for the final characteristics of the vinegar obtained, and must therefore be carefully controlled to obtain high-quality products. © 2017 Society of Chemical Industry.

  4. Self-Regulated Learning Strategies in Relation with Statistics Anxiety

    ERIC Educational Resources Information Center

    Kesici, Sahin; Baloglu, Mustafa; Deniz, M. Engin

    2011-01-01

    Dealing with students' attitudinal problems related to statistics is an important aspect of statistics instruction. Employing the appropriate learning strategies may have a relationship with anxiety during the process of statistics learning. Thus, the present study investigated multivariate relationships between self-regulated learning strategies…

  5. Pepitome: evaluating improved spectral library search for identification complementarity and quality assessment

    PubMed Central

    Dasari, Surendra; Chambers, Matthew C.; Martinez, Misti A.; Carpenter, Kristin L.; Ham, Amy-Joan L.; Vega-Montoto, Lorenzo J.; Tabb, David L.

    2012-01-01

    Spectral libraries have emerged as a viable alternative to protein sequence databases for peptide identification. These libraries contain previously detected peptide sequences and their corresponding tandem mass spectra (MS/MS). Search engines can then identify peptides by comparing experimental MS/MS scans to those in the library. Many of these algorithms employ the dot product score for measuring the quality of a spectrum-spectrum match (SSM). This scoring system does not offer a clear statistical interpretation and ignores fragment ion m/z discrepancies in the scoring. We developed a new spectral library search engine, Pepitome, which employs statistical systems for scoring SSMs. Pepitome outperformed the leading library search tool, SpectraST, when analyzing data sets acquired on three different mass spectrometry platforms. We characterized the reliability of spectral library searches by confirming shotgun proteomics identifications through RNA-Seq data. Applying spectral library and database searches on the same sample revealed their complementary nature. Pepitome identifications enabled the automation of quality analysis and quality control (QA/QC) for shotgun proteomics data acquisition pipelines. PMID:22217208
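
    The dot-product score criticized above can be written in a few lines. This sketch bins two peak lists on m/z and uses square-root intensity weighting, a common but here assumed choice rather than Pepitome's or SpectraST's exact implementation.

```python
import numpy as np

def dot_product_score(query_mz, query_int, lib_mz, lib_int, bin_width=1.0005):
    """Normalized dot product between an experimental and a library spectrum after m/z binning."""
    q_mz, q_int = np.asarray(query_mz, float), np.asarray(query_int, float)
    l_mz, l_int = np.asarray(lib_mz, float), np.asarray(lib_int, float)
    edges = np.arange(min(q_mz.min(), l_mz.min()),
                      max(q_mz.max(), l_mz.max()) + bin_width, bin_width)
    q, _ = np.histogram(q_mz, bins=edges, weights=np.sqrt(q_int))   # sqrt intensity weighting
    l, _ = np.histogram(l_mz, bins=edges, weights=np.sqrt(l_int))
    norm = np.linalg.norm(q) * np.linalg.norm(l)
    return float(q @ l / norm) if norm > 0 else 0.0
```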

  6. Enterprise size and return to work after stroke.

    PubMed

    Hannerz, Harald; Ferm, Linnea; Poulsen, Otto M; Pedersen, Betina Holbæk; Andersen, Lars L

    2012-12-01

    It has been hypothesised that return to work rates among sick-listed workers increase with enterprise size. The aim of the present study was to estimate the effect of enterprise size on the odds of returning to work among previously employed stroke patients in Denmark, 2000-2006. We used a prospective design with a 2 year follow-up period. The study population consisted of 13,178 stroke patients divided into four enterprise size categories, according to the place of their employment prior to the stroke: micro (1-9 employees), small (10-49 employees), medium (50-249 employees) and large (>250 employees). The analysis was based on nationwide data on enterprise size from Statistics Denmark merged with data from the Danish occupational hospitalisation register. We found a statistically significant association (p = 0.034); each increase in enterprise size category was followed by an increase in the estimated odds of returning to work. The chance of returning to work after stroke increases as the size of the enterprise increases. Preventive efforts and research aimed at finding ways of mitigating the effect are warranted.

  7. Characterization of ABS specimens produced via the 3D printing technology for drone structural components

    NASA Astrophysics Data System (ADS)

    Ferro, Carlo Giovanni; Brischetto, Salvatore; Torre, Roberto; Maggiore, Paolo

    2016-07-01

    The Fused Deposition Modelling (FDM) technology is widely used in rapid prototyping. 3D printers for home desktop applications are usually employed to make non-structural objects. When the mechanical stresses are not excessive, this technology can also be successfully employed to produce structural objects, not only in the prototyping stage but also in the production of series pieces. The innovative idea of the present work is the application of this technology, implemented in a desktop 3D printer, to the realization of components for aeronautical use, especially for unmanned aerial systems. For this purpose, the paper is devoted to a statistical study of the performance of a desktop 3D printer to understand how the process performs and what the boundary limits of acceptance are. Mechanical and geometrical properties of ABS (Acrylonitrile Butadiene Styrene) specimens, such as tensile strength and stiffness, have been evaluated. ASTM D638 type specimens have been used. A capability analysis has been applied to both mechanical and dimensional performance. Statistically stable limits have been determined using experimentally collected data.
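
    As an illustration of the capability analysis mentioned above, the sketch below computes the usual short-term indices Cp and Cpk from measured tensile strengths against assumed specification limits; the limits and the implicit normality assumption are placeholders, not values from the study.

```python
import numpy as np

def capability_indices(measurements, lsl, usl):
    """Process capability Cp and Cpk for one response (e.g. tensile strength in MPa)."""
    x = np.asarray(measurements, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    cp = (usl - lsl) / (6.0 * sigma)                     # potential capability
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)        # capability accounting for centering
    return cp, cpk
```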

  8. Detection of crossover time scales in multifractal detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Ge, Erjia; Leung, Yee

    2013-04-01

    Fractal analysis is employed in this paper as a scale-based method for the identification of the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective, since it has been made without rigorous statistical procedures and has generally been determined by eyeballing or subjective observation. Crossover time scales so determined may be spurious and problematic and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe the multi-scaling behaviors of fractals. Through the regression analysis and statistical inference, we can (1) identify crossover time scales that cannot be detected by eyeballing, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish a statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall in Hong Kong. Through the proposed model, we can gain a deeper understanding of fractals in general and a statistical approach to identifying multi-scaling behavior under MF-DFA in particular.
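
    A minimal sketch of the idea of statistically locating a crossover: fit two log-log segments over every admissible breakpoint of the MF-DFA fluctuation function and keep the breakpoint with the smallest residual sum of squares. This is a simplified stand-in for the scaling-identification regression model proposed in the paper.

```python
import numpy as np

def find_crossover(scales, fluctuation):
    """Crossover scale from a two-segment least-squares fit of log F(s) against log s."""
    x, y = np.log10(scales), np.log10(fluctuation)
    best_sse, best_scale = np.inf, None
    for b in range(3, len(x) - 3):                      # keep at least 4 points per segment
        sse = 0.0
        for seg in (slice(0, b + 1), slice(b, None)):
            coef = np.polyfit(x[seg], y[seg], 1)
            sse += float(np.sum((y[seg] - np.polyval(coef, x[seg])) ** 2))
        if sse < best_sse:
            best_sse, best_scale = sse, scales[b]
    return best_scale
```

    A confidence interval of the kind discussed in the abstract could then be attached, for example, by bootstrapping this fit.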

  9. Lower back pain and absenteeism among professional public transport drivers.

    PubMed

    Kresal, Friderika; Roblek, Vasja; Jerman, Andrej; Meško, Maja

    2015-01-01

    Drivers in public transport are subjected to lower back pain. The reason for the pain is associated with the characteristics of the physical position imposed on the worker while performing the job. Lower back pain is the main cause of absenteeism among drivers. The present study includes 145 public transport drivers employed as professional drivers for an average of 14.14 years. Analysis of the data obtained in the study includes basic descriptive statistics, the χ² test and multiple regression analysis. Analysis of the incidence of lower back pain showed that the majority of our sample population suffered from pain in the lower back. We found that there are no statistically significant differences between the groups formed by the length of service as a professional driver and the incidence of lower back pain; we were also interested in whether or not the risk factors for lower back pain affect the absenteeism of city bus drivers. Analysis of the data has shown that the risk factors for pain in the lower part of the spine do affect the absenteeism of city bus drivers.

  10. Assessing Statistically Significant Heavy-Metal Concentrations in Abandoned Mine Areas via Hot Spot Analysis of Portable XRF Data

    PubMed Central

    Kim, Sung-Min; Choi, Yosoon

    2017-01-01

    To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1–4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required. PMID:28629168
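
    A minimal NumPy sketch of the Getis-Ord Gi* statistic used above, assuming a fixed distance-band weight matrix that includes the focal sample itself; field implementations typically rely on GIS software, so this is only illustrative.

```python
import numpy as np

def getis_ord_gi_star(coords, values, band):
    """Gi* z-score per sample for binary weights w_ij = 1 if d(i, j) <= band (so w_ii = 1)."""
    coords = np.asarray(coords, dtype=float)          # shape (n, 2): x, y positions
    x = np.asarray(values, dtype=float)
    n = x.size
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = (d <= band).astype(float)
    xbar, s = x.mean(), x.std()                       # global mean and (population) std
    sw, sw2 = w.sum(axis=1), (w ** 2).sum(axis=1)
    num = w @ x - xbar * sw
    den = s * np.sqrt((n * sw2 - sw ** 2) / (n - 1))
    return num / den
```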

  11. Assessing Statistically Significant Heavy-Metal Concentrations in Abandoned Mine Areas via Hot Spot Analysis of Portable XRF Data.

    PubMed

    Kim, Sung-Min; Choi, Yosoon

    2017-06-18

    To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1-4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.

  12. Cancer mortality in cohorts of workers in the European rubber manufacturing industry first employed since 1975.

    PubMed

    Boniol, M; Koechlin, A; Świątkowska, B; Sorahan, T; Wellmann, J; Taeger, D; Jakobsson, K; Pira, E; Boffetta, P; La Vecchia, C; Pizot, C; Boyle, P

    2016-05-01

    Increased cancer risk has been reported among workers in the rubber manufacturing industry employed before the 1960s. It is unclear whether risk remains increased among workers hired subsequently. The present study focused on the risk of cancer mortality for rubber workers first employed since 1975 in 64 factories. Anonymized data from cohorts of rubber workers employed for at least 1 year from Germany, Italy, Poland, Sweden, and the UK were pooled. Standardized mortality ratios (SMRs), based on country-specific death rates, were reported for bladder and lung cancer (primary outcomes of interest), for other selected cancer sites, and for cancer sites with a minimum of 10 deaths in men or women. Analyses stratified by type of industry, period, and duration of employment were carried out. A total of 38 457 individuals (29 768 men; 8689 women) contributed 949 370 person-years. No increased risk of bladder cancer was observed [SMR = 0.80, 95% confidence interval (CI) 0.46; 1.38]. The risk of lung cancer death was reduced (SMR = 0.81, 95% CI 0.70; 0.94). No statistically significant increased risk was observed for any other cause of death. A reduced risk was evident for total cancer mortality (SMR = 0.81, 95% CI 0.76; 0.87). Risks were lower for workers in the tyre industry compared with workers in the general rubber goods sector. Analysis by employment duration showed a negative trend, with SMRs decreasing with increasing duration of employment. In an analysis of secondary end points, when stratified by type of industry and period of first employment, excess risks of myeloma and gastric cancer were observed, each due essentially to results from one centre. No consistent increased risk of cancer death was observed among rubber workers first employed since 1975, and no overall analysis of the pooled cohort produced a significantly increased risk. Continued surveillance of the present cohorts is required to confirm the absence of long-term risk. © The Author 2016. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  13. Fully Bayesian tests of neutrality using genealogical summary statistics.

    PubMed

    Drummond, Alexei J; Suchard, Marc A

    2008-10-31

    Many data summary statistics have been developed to detect departures from neutral expectations of evolutionary models. However, questions about the neutrality of the evolution of genetic loci within natural populations remain difficult to assess. One critical cause of this difficulty is that most methods for testing neutrality make simplifying assumptions simultaneously about the mutational model and the population size model. Consequently, rejecting the null hypothesis of neutrality under these methods could result from violations of either or both assumptions, making interpretation troublesome. Here we harness posterior predictive simulation to exploit summary statistics of both the data and model parameters to test the goodness-of-fit of standard models of evolution. We apply the method to test the selective neutrality of molecular evolution in non-recombining gene genealogies and we demonstrate the utility of our method on four real data sets, identifying significant departures from neutrality in human influenza A virus, even after controlling for variation in population size. Importantly, by employing a full model-based Bayesian analysis, our method separates the effects of demography from the effects of selection. The method also allows multiple summary statistics to be used in concert, thus potentially increasing sensitivity. Furthermore, our method remains useful in situations where analytical expectations and variances of summary statistics are not available. This aspect has great potential for the analysis of temporally spaced data, an expanding area previously neglected owing to the limited availability of theory and methods.

  14. Limb-darkening and the structure of the Jovian atmosphere

    NASA Technical Reports Server (NTRS)

    Newman, W. I.; Sagan, C.

    1978-01-01

    By observing the transit of various cloud features across the Jovian disk, limb-darkening curves were constructed for three regions in the 4.6 to 5.1 μm band. Several models currently employed in describing the radiative or dynamical properties of planetary atmospheres are examined here to understand their implications for limb-darkening. The statistical problem of fitting these models to the observed data is reviewed and methods for applying multiple regression analysis are discussed. Analysis of variance techniques are introduced to test the viability of a given physical process as a cause of the observed limb-darkening.

  15. Statistical downscaling of general-circulation-model- simulated average monthly air temperature to the beginning of flowering of the dandelion (Taraxacum officinale) in Slovenia

    NASA Astrophysics Data System (ADS)

    Bergant, Klemen; Kajfež-Bogataj, Lučka; Črepinšek, Zalika

    2002-02-01

    Phenological observations are a valuable source of information for investigating the relationship between climate variation and plant development. Potential climate change in the future will shift the occurrence of phenological phases, and information about future climate conditions is needed in order to estimate this shift. General circulation models (GCMs) provide the best information about future climate change. They are able to simulate reliably the most important mean features on a large scale, but they fail on a regional scale because of their low spatial resolution. A common approach to bridging the scale gap is statistical downscaling, which was used to relate the beginning of flowering of Taraxacum officinale in Slovenia to the monthly mean near-surface air temperature for January, February and March in Central Europe. Statistical models were developed and tested with NCAR/NCEP Reanalysis predictor data and EARS predictand data for the period 1960-1999. Prior to developing the statistical models, empirical orthogonal function (EOF) analysis was applied to the predictor data. Multiple linear regression was used to relate the beginning of flowering to the expansion coefficients of the first three EOFs of the January, February and March air temperatures, and a strong correlation was found between them. The developed statistical models were applied to the results of two GCMs (HadCM3 and ECHAM4/OPYC3) to estimate the potential shifts in the beginning of flowering for the periods 1990-2019 and 2020-2049 in comparison with the period 1960-1989. The HadCM3 model predicts, on average, a 4-day earlier occurrence of flowering in the period 1990-2019 and ECHAM4/OPYC3 a 5-day earlier occurrence. The analogous results for the period 2020-2049 are a 10- and an 11-day earlier occurrence, respectively.
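
    The sketch below illustrates the downscaling chain described above: an EOF decomposition of a (years x grid-points) temperature anomaly matrix followed by multiple linear regression of flowering day-of-year on the first three expansion coefficients. The array shapes and the SVD-based EOF computation are assumptions made for illustration.

```python
import numpy as np

def fit_downscaling_model(temps, flowering_doy, n_eof=3):
    """Regress flowering day-of-year on leading EOF expansion coefficients of Jan-Mar temperatures.

    temps: array (n_years, n_gridpoints); flowering_doy: array (n_years,).
    Returns the regression coefficients and the EOF spatial patterns."""
    anomalies = temps - temps.mean(axis=0)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    scores = u[:, :n_eof] * s[:n_eof]                 # expansion coefficients per year
    design = np.column_stack([np.ones(scores.shape[0]), scores])
    beta, *_ = np.linalg.lstsq(design, flowering_doy, rcond=None)
    return beta, vt[:n_eof]
```

    Applying the fitted coefficients to EOF scores computed from GCM output would then give the projected shift in flowering dates.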

  16. Post-migration employment changes and health: A dyadic spousal analysis.

    PubMed

    Ro, Annie; Goldberg, Rachel E

    2017-10-01

    Prospective studies have found unemployment and job loss to be associated with negative psychological and physical health outcomes. For immigrants, the health implications of employment change cannot be considered apart from pre-migration experiences. While immigrants demonstrate relative success in securing employment in the United States, their work is often not commensurate with their education or expertise. Previous research has linked downward employment with adverse health outcomes among immigrants, but with gender differences. We extended this literature by considering a wider range of employment states and accounting for the interdependence of husbands' and wives' employment trajectories. We examined the relationships between personal and spousal post-migration employment changes and self-rated health and depressive symptoms using dyadic data from the 2003 New Immigrant Survey (NIS) (n = 5682 individuals/2841 spousal pairs). We used the Actor Partner Interdependence Model (APIM) to model cross-partner effects and account for spousal interdependence. In general, men's downward employment trajectories were associated with poorer health for themselves. Women's employment trajectories had fewer statistically significant associations with their own or their husbands' health, underscoring the generally more peripheral nature of women's work in the household. However, women's current unemployment in particular was associated with poorer health outcomes for themselves and their husbands, suggesting that unmet need for women's work can produce health risks within immigrant households. Our findings suggest that employment change should be considered a household event that can impact the wellbeing of the linked individuals within it. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. A Multi-Class, Interdisciplinary Project Using Elementary Statistics

    ERIC Educational Resources Information Center

    Reese, Margaret

    2012-01-01

    This article describes a multi-class project that employs statistical computing and writing in a statistics class. Three courses, General Ecology, Meteorology, and Introductory Statistics, cooperated on a project for the EPA's Student Design Competition. The continuing investigation has also spawned several undergraduate research projects in…

  18. Regional analysis of annual maximum rainfall using TL-moments method

    NASA Astrophysics Data System (ADS)

    Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd

    2011-06-01

    Information related to the distribution of rainfall amounts is of great importance for the design of water-related structures. One of the concerns of hydrologists and engineers is the choice of probability distribution for modeling regional data. In this study, an approach to regional frequency analysis using L-moments is revisited. Subsequently, an alternative regional frequency analysis using the TL-moments method is employed, and the results from both methods are compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and the Z-test were employed in determining the best-fit distribution. Comparison between the two approaches showed that the L-moments and TL-moments produced equivalent results. The GLO and GEV distributions were identified as the most suitable distributions for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the method of TL-moments was more efficient for lower quantile estimation compared with the L-moments.
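
    For orientation, the sketch below computes ordinary sample L-moments from probability-weighted moments; TL-moments generalize these by trimming the smallest and largest order statistics, which is not shown here.

```python
import numpy as np
from math import comb

def sample_lmoments(data):
    """First four sample L-moments, returned as (mean, L-scale, L-skewness, L-kurtosis)."""
    x = np.sort(np.asarray(data, dtype=float))
    n = x.size
    # probability-weighted moments b_0..b_3
    b = [np.mean([comb(i, r) / comb(n - 1, r) * x[i] for i in range(n)]) for r in range(4)]
    l1 = b[0]
    l2 = 2 * b[1] - b[0]
    l3 = 6 * b[2] - 6 * b[1] + b[0]
    l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
    return l1, l2, l3 / l2, l4 / l2
```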

  19. Body mass index and employment-based health insurance.

    PubMed

    Fong, Ronald L; Franks, Peter

    2008-05-09

    Obese workers incur greater health care costs than normal weight workers. Possibly viewed by employers as an increased financial risk, they may be at a disadvantage in procuring employment that provides health insurance. This study aims to evaluate the association between the body mass index [BMI, weight in kilograms divided by the square of height in meters] of employees and their likelihood of holding jobs that include employment-based health insurance [EBHI]. We used the 2004 Household Component of the nationally representative Medical Expenditure Panel Survey. We utilized logistic regression models with provision of EBHI as the dependent variable in this descriptive analysis. The key independent variable was BMI, with adjustments for the domains of demographics, social-economic status, workplace/job characteristics, and health behavior/status. BMI was classified as normal weight (18.5-24.9), overweight (25.0-29.9), or obese (≥ 30.0). There were 11,833 eligible respondents in the analysis. Among employed adults, obese workers [adjusted probability (AP) = 0.62, (0.60, 0.65)] (P = 0.005) were more likely to be employed in jobs with EBHI than their normal weight counterparts [AP = 0.57, (0.55, 0.60)]. Overweight workers were also more likely to hold jobs with EBHI than normal weight workers, but the difference did not reach statistical significance [AP = 0.61 (0.58, 0.63)] (P = 0.052). There were no interaction effects between BMI and gender or age. In this nationally representative sample, we detected an association between workers' increasing BMI and their likelihood of being employed in positions that include EBHI. These findings suggest that obese workers are more likely to have EBHI than other workers.

  20. Load balancing for massively-parallel soft-real-time systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hailperin, M.

    1988-09-01

    Global load balancing, if practical, would allow the effective use of massively-parallel ensemble architectures for large soft-real-time problems. The challenge is to replace quick global communication, which is impractical in a massively-parallel system, with statistical techniques. In this vein, the author proposes a novel approach to decentralized load balancing based on statistical time-series analysis. Each site estimates the system-wide average load using information about past loads of individual sites and attempts to equal that average. This estimation process is practical because the soft-real-time systems of interest naturally exhibit loads that are periodic, in a statistical sense akin to seasonality in econometrics. It is shown how this load-characterization technique can be the foundation for a load-balancing system in an architecture employing cut-through routing and an efficient multicast protocol.
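
    A toy sketch of the load-characterization idea, assuming each site keeps a (sites x time) history of sampled loads and that the workload repeats with a known statistical period; the averaging scheme shown is an illustration, not the author's estimator.

```python
import numpy as np

def estimate_system_average(load_history, period):
    """Predict the next system-wide mean load from past samples taken at the same phase
    of the (statistically) periodic workload."""
    h = np.asarray(load_history, dtype=float)        # shape (n_sites, n_steps)
    next_phase = h.shape[1] % period                 # phase of the step being predicted
    per_site = h[:, next_phase::period].mean(axis=1) # seasonal mean of each site at this phase
    return float(per_site.mean())                    # target each site would try to converge to
```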

  1. A Management Information System Model for Program Management. Ph.D. Thesis - Oklahoma State Univ.; [Computerized Systems Analysis

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1972-01-01

    The development of a model to simulate the information system of a program management type of organization is reported. The model statistically determines the following parameters: type of messages, destinations, delivery durations, type of processing, processing durations, communication channels, outgoing messages, and priorities. The total management information system of the program management organization is considered, including formal and informal information flows and both facilities and equipment. The model is written in the General Purpose System Simulation 2 computer programming language for use on the Univac 1108, Executive 8 computer. The model is simulated on a daily basis and collects queue and resource utilization statistics for each decision point. The statistics are then used by management to evaluate proposed resource allocations, to evaluate proposed changes to the system, and to identify potential problem areas. The model employs both empirical and theoretical distributions, which are adjusted to simulate the information flow being studied.

  2. Propagation of a Free Flame in a Turbulent Gas Stream

    NASA Technical Reports Server (NTRS)

    Mickelsen, William R; Ernstein, Norman E

    1956-01-01

    Effective flame speeds of free turbulent flames were measured by photographic, ionization-gap, and photomultiplier-tube methods, and were found to have a statistical distribution attributed to the nature of the turbulent field. The effective turbulent flame speeds for the free flame were less than those previously measured for flames stabilized on nozzle burners, Bunsen burners, and bluff bodies. The statistical spread of the effective turbulent flame speeds was markedly wider in the lean and rich fuel-air-ratio regions, which might be attributed to the greater sensitivity of laminar flame speed to flame temperature in those regions. Values calculated from the turbulent free-flame-speed analysis proposed by Tucker apparently form upper limits for the statistical spread of free-flame-speed data. Hot-wire anemometer measurements of the longitudinal velocity fluctuation intensity and longitudinal correlation coefficient were made and were employed in the comparison of data and in the theoretical calculation of turbulent flame speed.

  3. From random microstructures to representative volume elements

    NASA Astrophysics Data System (ADS)

    Zeman, J.; Šejnoha, M.

    2007-06-01

    A unified treatment of random microstructures proposed in this contribution opens the way to efficient solutions of large-scale real-world problems. The paper introduces the notion of a statistically equivalent periodic unit cell (SEPUC) that replaces, in a computational step, the actual complex geometries on an arbitrary scale. A SEPUC is constructed such that its morphology conforms with images of real microstructures. Here, the widely used two-point probability function and the lineal path function are employed to classify, from the statistical point of view, the geometrical arrangement of various material systems. Examples of statistically equivalent unit cells constructed for a unidirectional fibre tow, a plain weave textile composite and an irregular-coursed masonry wall are given. A specific result promoting the applicability of the SEPUC as a tool for the derivation of homogenized effective properties that are subsequently used in an independent macroscopic analysis is also presented.
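
    One common way to evaluate the two-point probability function S2(r) of a binary microstructure image is the FFT autocorrelation shown below; it assumes periodic boundaries and a 0/1 phase-indicator array, which may differ from the estimator actually used by the authors.

```python
import numpy as np

def two_point_probability(indicator):
    """S2(r): probability that two points separated by r both lie in the phase of interest."""
    f = np.asarray(indicator, dtype=float)           # 0/1 image of the phase of interest
    spectrum = np.abs(np.fft.fftn(f)) ** 2
    s2 = np.fft.ifftn(spectrum).real / f.size        # Wiener-Khinchin: circular autocorrelation
    return np.fft.fftshift(s2)                       # zero lag moved to the array centre
```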

  4. [Employment opportunities and education needs of physicians with specialty training in Hygiene and Preventive Medicine.].

    PubMed

    Fara, Gaetano M; Nardi, Giuseppe; Signorelli, Carlo; Fanti, Mila

    2005-01-01

    This survey was carried out under the sponsorship of the Italian Society of Hygiene (SItI), to evaluate the current professional position of physicians who completed their post-graduate professional training in Hygiene and Preventive Medicine in the years 2000 through 2003. An ad-hoc questionnaire was administered to 689 such specialists across Italy with a response rate of 40%. The results show that specialists in Hygiene and Preventive Medicine are generally satisfied with their professional choice though most specialists were found to have only temporary employment. Post-specialty training courses of major interest to specialists in Hygiene and Preventive medicine are those regarding occupational health, statistical analysis and epidemiology, and quality of health care.

  5. Interventional study to improve diabetic guidelines adherence using mobile health (m-Health) technology in Lahore, Pakistan.

    PubMed

    Hashmi, Noreen Rahat; Khan, Shazad Ali

    2018-05-31

    To check whether mobile health (m-Health) short message service (SMS) reminders can improve physicians' knowledge and practice of the American Diabetic Association preventive care guidelines (ADA guidelines). Quasi-experimental pre-post study design with a control group. The participants of the study were 62 medical officers/medical postgraduate trainees from two hospitals in Lahore, Pakistan. A pretested questionnaire was used to collect baseline information about physicians' knowledge and adherence to the ADA guidelines. All the respondents attended a 1-day workshop about the guidelines. The intervention group received regular SMS reminders about the ADA guidelines for the next 5 months. Postintervention knowledge and practice scores for 13 variables were checked again using the same questionnaire. Statistical analysis included χ² and McNemar's tests for categorical variables and the t-test for continuous variables. Pearson's correlation analysis was done to check the correlation between knowledge and practice scores in the intervention group. P values of <0.05 were considered statistically significant. The total number of participating physicians was 62. Fifty-three (85.5%) respondents completed the study. Composite scores within the intervention group showed statistically significant improvement in knowledge (p<0.001) and practice (p<0.001) postintervention. The overall composite scores preintervention and postintervention also showed a statistically significant difference in the improvement of knowledge (p=0.002) and practice (p=0.001) between the non-intervention and intervention groups. Adherence to the 13 individual ADA preventive care guidelines was noted to be suboptimal at baseline. Statistically significant improvement in the intervention group was seen in the following individual variables: review of symptoms of hypoglycaemia and hyperglycaemia, eye examination, neurological examination, lipid examination, referral to an ophthalmologist, and counselling about non-smoking. m-Health technology can be a useful educational tool to help improve knowledge and practice of diabetic guidelines. Future multicentre trials will help to scale this intervention for wider use in resource-limited countries. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  6. Statistical Optimization of 1,3-Propanediol (1,3-PD) Production from Crude Glycerol by Considering Four Objectives: 1,3-PD Concentration, Yield, Selectivity, and Productivity.

    PubMed

    Supaporn, Pansuwan; Yeom, Sung Ho

    2018-04-30

    This study investigated the biological conversion of crude glycerol, generated as a by-product from a commercial biodiesel production plant, to 1,3-propanediol (1,3-PD). Statistical analysis was employed to derive a statistical model for the individual and interactive effects of glycerol, (NH4)2SO4, trace elements, pH, and cultivation time on four objectives: 1,3-PD concentration, yield, selectivity, and productivity. Optimum conditions for each objective with its maximum value were predicted by statistical optimization, and experiments under the optimum conditions verified the predictions. In addition, by systematic analysis of the values of the four objectives, the optimum conditions for 1,3-PD concentration (49.8 g/L initial glycerol, 4.0 g/L of (NH4)2SO4, 2.0 mL/L of trace elements, pH 7.5, and 11.2 h of cultivation time) were determined to be the global optimum culture conditions for 1,3-PD production. Under these conditions, we could achieve a high 1,3-PD yield (47.4%), 1,3-PD selectivity (88.8%), and 1,3-PD productivity (2.1 g/L/h) as well as a high 1,3-PD concentration (23.6 g/L).
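
    The statistical model referred to above is typically a second-order response surface; the sketch below fits one by ordinary least squares for the five factors, purely as an illustration of the model form. The design matrix, response values and any coding of units are assumptions, not data from the study.

```python
import numpy as np
from itertools import combinations

def fit_quadratic_response(factors, response):
    """Least-squares fit of intercept + linear + two-factor interaction + quadratic terms."""
    X = np.asarray(factors, dtype=float)             # shape (n_runs, 5): glycerol, (NH4)2SO4, ...
    cols = [np.ones(X.shape[0])]
    cols += [X[:, j] for j in range(X.shape[1])]                                  # linear terms
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]    # interactions
    cols += [X[:, j] ** 2 for j in range(X.shape[1])]                             # quadratics
    design = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(design, np.asarray(response, dtype=float), rcond=None)
    return beta
```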

  7. Factors Associated with Low Birth Weight of Children Among Employed Mothers in Pakistan.

    PubMed

    Jafree, Sara Rizvi; Zakar, Rubeena; Zakar, Muhammad Zakria

    2015-09-01

    Evidence shows that Pakistan has an increasing rate of children with low birth weight (LBW). Employed mothers in paid work (EMPW) in the country have predominantly been disadvantaged in terms of access to education and low-income employment, with negative consequences for maternal and child health. The objective of this study was to determine the socio-demographic characteristics of EMPW and identify the association between maternal employment and child birth weight in Pakistan. Secondary data from the Pakistan Demographic Health Survey (PDHS) conducted for the year 2006-2007 were used. The PDHS is a nationally representative household survey. Relevant data needed from the PDHS data file were coded and filtered. The sample size of EMPW with at least one child born in the last 5 years was 2,515. Data were analyzed using SPSS. Descriptive and inferential statistics were used to examine the association between EMPW characteristics and LBW. Findings confirm that the majority of EMPW in Pakistan are illiterate, poor, employed in unskilled work, and based in rural regions. Multivariate regression analysis revealed a statistically significant association between EMPW and LBW among mothers who did not receive prenatal care from an unskilled healthcare provider (AOR 1.92; 95% CI 1.12-3.30), lacked access to information sources such as radio (AOR 1.88; 95% CI 1.28-2.77), did not receive calcium (AOR 1.19; 95% CI 1.05-1.34) and iron (AOR 1.33; 95% CI 1.05-1.69) during pregnancy, had experienced headaches during pregnancy (AOR 1.41; 95% CI 1.12-1.76), and were not paid in cash for their work (AOR 1.41; 95% CI 1.04-1.90). EMPW in Pakistan, especially those in low-income jobs and rural regions, need urgent support for healthcare awareness, free supplementation of micronutrients and frequent consultation with a trained practitioner during the prenatal period. Long-term mobilization of social structures and governance is needed to encourage maternal health awareness, hospital deliveries, and formal sector employment for EMPW.

  8. Community periodontal treatment needs in South Korea.

    PubMed

    Lee, M-Y; Chang, S-J; Kim, C-B; Chung, W-G; Choi, E-M; Kim, N-H

    2015-11-01

    This study aimed to assess the relationship between socio-economic factors and community periodontal treatment needs in Korea. Data were obtained from the year 2009 Korean National Health and Nutrition Examination Survey. Our analysis included 7510 survey participants over the age of 19 years. To assess the relationship between socio-economic factors and the need for periodontal scaling, we performed multivariate logistic regression analyses for data with a complex sampling structure. PASW statistics 19.0 (SPSS Inc., Chicago, IL, USA) was used to perform the statistical analyses, and the results were expressed as odds ratios (OR) with corresponding 95% confidence intervals (CIs). A very high percentage of Korean adults required periodontal scaling (71.5%). After adjusting for sex, age, and socio-economic factors, the need for periodontal scaling was associated with low levels of education (OR: 1.41, 95% CI: 1.03-1.93), low incomes (OR: 1.27, 95% CI: 1.01-1.60), employment as a service and sales worker (OR: 1.39, 95% CI: 1.10-1.77), and employment as a manual worker (OR: 1.31, 95% CI: 1.02-1.69). In South Korea, the need for periodontal scaling was associated with socio-economic factors, such as low levels of education, low incomes, employment as a service and sales worker and employment as a manual worker. Consequently, clinical and community dental hygienists should consider adults with these risk factors as belonging to high-priority subgroups to whom they should respond first. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. 29 CFR 1614.601 - EEO group statistics.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false EEO group statistics. 1614.601 Section 1614.601 Labor... EMPLOYMENT OPPORTUNITY Matters of General Applicability § 1614.601 EEO group statistics. (a) Each agency... provided by an employee is inaccurate, the agency shall advise the employee about the solely statistical...

  10. Mortality in the British rubber industry 1946-85.

    PubMed Central

    Sorahan, T; Parkes, H G; Veys, C A; Waterhouse, J A; Straughan, J K; Nutt, A

    1989-01-01

    The mortality experienced by a cohort of 36,691 rubber workers during 1946-85 has been investigated. These workers were all male operatives first employed in any one of the 13 participating factories in 1946-60; all had worked continuously in the industry for a minimum period of one year. Compared with the general population, statistically significant excesses relating to cancer mortality were found for cancer of the pharynx (E = 20.2, O = 30, SMR = 149), oesophagus (E = 87.6, O = 107, SMR = 122), stomach (E = 316.5, O = 359, SMR = 113), lung (E = 1219.2, O = 1592, SMR = 131), and all neoplasms (E = 2965.6, O = 3344, SMR = 113). Statistically significant deficits were found for cancer of the prostate (E = 128.2, O = 91, SMR = 71), testis (E = 11.0, O = 4, SMR = 36), and Hodgkin's disease (E = 26.9, O = 16, SMR = 59). Involvement of occupational exposures was assessed by the method of regression models and life tables (RMLT). This method was used to compare the duration of employment in the industry, the duration in "dust exposed" jobs, and the duration in "fume and/or solvent exposed" jobs of those dying from causes of interest with those of all matching survivors. Positive associations (approaching formal levels of statistical significance) were found only for cancers of the stomach and the lung. The results of the RMLT analysis are independent of those from the SMR analysis, and the study continues to provide limited evidence of a causal association between the risk of stomach cancer and dust exposures, and the risk of lung cancer and fume or solvent exposures, in the rubber industry during the period under study. PMID:2920137
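
    For reference, an SMR with an exact Poisson confidence interval can be computed as sketched below; the chi-square construction is the standard textbook one, not code from the study.

```python
from scipy.stats import chi2

def smr_with_ci(observed, expected, alpha=0.05):
    """Standardized mortality ratio O/E with an exact Poisson confidence interval."""
    smr = observed / expected
    lower = chi2.ppf(alpha / 2.0, 2 * observed) / (2.0 * expected) if observed > 0 else 0.0
    upper = chi2.ppf(1.0 - alpha / 2.0, 2 * (observed + 1)) / (2.0 * expected)
    return smr, (lower, upper)
```

    For the lung cancer figures quoted above (O = 1592, E = 1219.2) this gives an SMR of about 1.31, consistent with the reported SMR of 131.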

  11. Prevalence of peritonitis and mortality in patients treated with continuous ambulatory peritoneal dialysis (CAPD) in Africa: a protocol for a systematic review and meta-analysis.

    PubMed

    Moloi, Mothusi Walter; Kajawo, Shepherd; Noubiap, Jean Jacques; Mbah, Ikechukwu O; Ekrikpo, Udeme; Kengne, Andre Pascal; Bello, Aminu K; Okpechi, Ikechi G

    2018-05-24

    Continuous ambulatory peritoneal dialysis (CAPD) is the ideal modality for renal replacement therapy in most African settings, given that it is relatively cheaper than haemodialysis (HD) and does not require in-centre care. CAPD is, however, not readily utilised, as it is often complicated by peritonitis leading to high rates of technique failure. The objective of this study is to assess the prevalence of CAPD-related peritonitis and all-cause mortality in patients treated with CAPD in Africa. We will search PubMed, EMBASE, SCOPUS, Africa Journal Online and Google Scholar for studies conducted in Africa from 1 January 1980 to 30 June 2017 with no language restrictions. Eligible studies will include cross-sectional, prospective observational and cohort studies of patients treated with CAPD. Two authors will independently screen and select studies, extract data and conduct risk of bias assessment. Data consistently reported across studies will be pooled using random-effects meta-analysis. Heterogeneity will be evaluated using Cochran's Q statistic and quantified using the I² statistic. Graphical and formal statistical tests will be used to assess for publication bias. Ethical approval will not be needed for this study as the data used will be extracted from already published studies. Results of this review will be published in a peer-reviewed journal and presented at conferences. The Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015) framework guided the development of this protocol. CRD42017072966. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
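
    A minimal sketch of the heterogeneity statistics named in the protocol: Cochran's Q and I² computed from study-level estimates and variances under fixed-effect pooling (the random-effects pooling itself is omitted, and the input values are assumed).

```python
import numpy as np

def heterogeneity(estimates, variances):
    """Cochran's Q and I^2 (%) for study-level estimates with known variances."""
    y = np.asarray(estimates, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)     # inverse-variance weights
    pooled = np.sum(w * y) / np.sum(w)               # fixed-effect pooled estimate
    q = float(np.sum(w * (y - pooled) ** 2))
    df = y.size - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return pooled, q, i2
```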

  12. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 3: Structure and listing of programs

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  13. Detailed Spectral Analysis of the 260 ks XMM-Newton Data of 1E 1207.4-5209 and Significance of a 2.1 keV Absorption Feature

    NASA Astrophysics Data System (ADS)

    Mori, Kaya; Chonko, James C.; Hailey, Charles J.

    2005-10-01

    We have reanalyzed the 260 ks XMM-Newton observation of 1E 1207.4-5209. There are several significant improvements over previous work. First, a much broader range of physically plausible spectral models was used. Second, we have used a more rigorous statistical analysis. The standard F-distribution was not employed, but rather the exact finite statistics F-distribution was determined by Monte Carlo simulations. This approach was motivated by the recent work of Protassov and coworkers and Freeman and coworkers. They demonstrated that the standard F-distribution is not even asymptotically correct when applied to assess the significance of additional absorption features in a spectrum. With our improved analysis we do not find a third and fourth spectral feature in 1E 1207.4-5209 but only the two broad absorption features previously reported. Two additional statistical tests, one line model dependent and the other line model independent, confirmed our modified F-test analysis. For all physically plausible continuum models in which the weak residuals are strong enough to fit, the residuals occur at the instrument Au M edge. As a sanity check we confirmed that the residuals are consistent in strength and position with the instrument Au M residuals observed in 3C 273.
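
    The finite-statistics calibration described above can be sketched as a parametric bootstrap: refit the null and alternative models to many spectra simulated from the best-fit null continuum and compare the observed F statistic with that empirical distribution. The fitter callables (`fit_null`, `fit_alt`) and the Gaussian error model are assumptions made for illustration, not the authors' fitting code.

```python
import numpy as np

def monte_carlo_f_test(x, y, yerr, fit_null, fit_alt, dof_null, dof_alt, n_sim=1000, seed=0):
    """Empirical significance of an added spectral component.

    fit_null / fit_alt must return (chi2, model_prediction) for the supplied data."""
    rng = np.random.default_rng(seed)

    def f_stat(chi2_0, chi2_1):
        return ((chi2_0 - chi2_1) / (dof_null - dof_alt)) / (chi2_1 / dof_alt)

    chi2_0, model_0 = fit_null(x, y, yerr)
    chi2_1, _ = fit_alt(x, y, yerr)
    f_obs = f_stat(chi2_0, chi2_1)

    f_sim = np.empty(n_sim)
    for i in range(n_sim):                            # spectra drawn from the best-fit null model
        y_sim = model_0 + rng.normal(0.0, yerr)
        c0, _ = fit_null(x, y_sim, yerr)
        c1, _ = fit_alt(x, y_sim, yerr)
        f_sim[i] = f_stat(c0, c1)

    return f_obs, float(np.mean(f_sim >= f_obs))      # observed F and Monte Carlo p-value
```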

  14. Analysis strategies for longitudinal attachment loss data.

    PubMed

    Beck, J D; Elter, J R

    2000-02-01

    The purpose of this invited review is to describe and discuss methods currently in use to quantify the progression of attachment loss in epidemiological studies of periodontal disease, and to make recommendations for specific analytic methods based upon the particular design of the study and the structure of the data. The review concentrates on the definition of incident attachment loss (ALOSS) and its component parts; measurement issues including thresholds and regression to the mean; methods of accounting for longitudinal change, including changes in means, changes in proportions of affected sites, incidence density, the effect of tooth loss and reversals, and repeated events; statistical models of longitudinal change, including the incorporation of the time element, the use of linear, logistic or Poisson regression or survival analysis, and statistical tests; site vs person level of analysis, including statistical adjustment for correlated data; and the strengths and limitations of ALOSS data. Examples from the Piedmont 65+ Dental Study are used to illustrate specific concepts. We conclude that incidence density is the preferred methodology for periodontal studies with more than one period of follow-up and that analyses not employing methods for dealing with complex samples, correlated data, and repeated measures do not take advantage of our current understanding of the site- and person-level variables important in periodontal disease and may generate biased results.

  15. Data Flow Analysis and Visualization for Spatiotemporal Statistical Data without Trajectory Information.

    PubMed

    Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S

    2018-03-01

    Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset, apply our technique to the data, and compare the derived trajectories with the originals. Finally, we present spatiotemporal trend analysis for statistical datasets including twitter data, maritime search and rescue events, and syndromic surveillance.

  16. Statistical mixture design and multivariate analysis of inkjet printed a-WO3/TiO2/WOX electrochromic films.

    PubMed

    Wojcik, Pawel Jerzy; Pereira, Luís; Martins, Rodrigo; Fortunato, Elvira

    2014-01-13

    An efficient mathematical strategy in the field of solution processed electrochromic (EC) films is outlined as a combination of an experimental work, modeling, and information extraction from massive computational data via statistical software. Design of Experiment (DOE) was used for statistical multivariate analysis and prediction of mixtures through a multiple regression model, as well as the optimization of a five-component sol-gel precursor subjected to complex constraints. This approach significantly reduces the number of experiments to be realized, from 162 in the full factorial (L=3) and 72 in the extreme vertices (D=2) approach down to only 30 runs, while still maintaining a high accuracy of the analysis. By carrying out a finite number of experiments, the empirical modeling in this study shows reasonably good prediction ability in terms of the overall EC performance. An optimized ink formulation was employed in a prototype of a passive EC matrix fabricated in order to test and trial this optically active material system together with a solid-state electrolyte for the prospective application in EC displays. Coupling of DOE with chromogenic material formulation shows the potential to maximize the capabilities of these systems and ensures increased productivity in many potential solution-processed electrochemical applications.
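
    As an illustration of the kind of model a mixture DOE produces, the sketch below fits a second-order Scheffé mixture polynomial by ordinary least squares. The three-component design points and response values are invented for illustration and are not the paper's five-component sol-gel system.

        # Generic least-squares fit of a second-order Scheffe mixture polynomial,
        # the type of response-surface model used in mixture DOE.
        import numpy as np
        from itertools import combinations

        # Mixture proportions (rows sum to 1) and a measured response (illustrative).
        X = np.array([
            [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
            [0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5],
            [1/3, 1/3, 1/3],
        ])
        y = np.array([4.1, 5.8, 3.2, 6.0, 4.0, 5.1, 5.3])

        def scheffe_design(X):
            """Columns: x_i main-effect terms plus all pairwise x_i*x_j terms."""
            pairs = [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
            return np.column_stack([X] + pairs)

        D = scheffe_design(X)
        beta, *_ = np.linalg.lstsq(D, y, rcond=None)
        print("fitted coefficients:", np.round(beta, 3))

        # Predict the response of a new candidate formulation.
        candidate = np.array([[0.4, 0.4, 0.2]])
        print("predicted response:", (scheffe_design(candidate) @ beta).item())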

  17. Application of a data-mining method based on Bayesian networks to lesion-deficit analysis

    NASA Technical Reports Server (NTRS)

    Herskovits, Edward H.; Gerring, Joan P.

    2003-01-01

    Although lesion-deficit analysis (LDA) has provided extensive information about structure-function associations in the human brain, LDA has suffered from the difficulties inherent to the analysis of spatial data, i.e., there are many more variables than subjects, and data may be difficult to model using standard distributions, such as the normal distribution. We herein describe a Bayesian method for LDA; this method is based on data-mining techniques that employ Bayesian networks to represent structure-function associations. These methods are computationally tractable, and can represent complex, nonlinear structure-function associations. When applied to the evaluation of data obtained from a study of the psychiatric sequelae of traumatic brain injury in children, this method generates a Bayesian network that demonstrates complex, nonlinear associations among lesions in the left caudate, right globus pallidus, right side of the corpus callosum, right caudate, and left thalamus, and subsequent development of attention-deficit hyperactivity disorder, confirming and extending our previous statistical analysis of these data. Furthermore, analysis of simulated data indicates that methods based on Bayesian networks may be more sensitive and specific for detecting associations among categorical variables than methods based on chi-square and Fisher exact statistics.

  18. The Raman spectrum character of skin tumor induced by UVB

    NASA Astrophysics Data System (ADS)

    Wu, Shulian; Hu, Liangjun; Wang, Yunxia; Li, Yongzeng

    2016-03-01

    In our study, the skin canceration process induced by UVB was analyzed from the perspective of the tissue spectrum. A home-made Raman spectral system with a millimeter-order excitation laser spot size, combined with multivariate statistical analysis, was used to monitor skin changes induced by UVB irradiation, and its discrimination ability was evaluated. Raman scattering signals of SCC and normal skin were acquired, and spectral differences between them were revealed. Linear discriminant analysis (LDA) based on principal component analysis (PCA) was employed to generate diagnostic algorithms for classifying SCC and normal skin. The results indicated that Raman spectroscopy combined with PCA-LDA has good potential for improving the diagnosis of skin cancers.
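
    A minimal PCA-LDA classification sketch of the sort described, using synthetic "spectra" rather than measured Raman data:

        # PCA for dimensionality reduction followed by linear discriminant
        # analysis as the classifier, evaluated with cross-validation.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(42)
        n_per_class, n_wavenumbers = 40, 600
        normal = rng.normal(0.0, 1.0, (n_per_class, n_wavenumbers))
        tumor = rng.normal(0.0, 1.0, (n_per_class, n_wavenumbers))
        tumor[:, 200:220] += 1.5          # a band that differs between the groups

        X = np.vstack([normal, tumor])
        y = np.array([0] * n_per_class + [1] * n_per_class)

        model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
        scores = cross_val_score(model, X, y, cv=5)
        print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")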

  19. Neural net diagnostics for VLSI test

    NASA Technical Reports Server (NTRS)

    Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.

    1990-01-01

    This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.
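
    A small sketch of the approach using a modern off-the-shelf feedforward network (not the authors' software) on synthetic test-response vectors: "good" circuits cluster around the nominal response, while "faulty" circuits each have one distorted measurement.

        # Feedforward-network fault detector trained on synthetic measurement
        # vectors that mimic process variation plus localized fault distortion.
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(7)
        n_meas = 20                                          # measurements per circuit
        nominal = rng.normal(1.0, 0.05, (300, n_meas))       # fabrication variation only
        faulty = rng.normal(1.0, 0.05, (300, n_meas))
        faulty[np.arange(300), rng.integers(0, n_meas, 300)] += 0.5   # one distorted node each

        X = np.vstack([nominal, faulty])
        y = np.array([0] * 300 + [1] * 300)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        clf.fit(X_tr, y_tr)
        print(f"held-out fault-detection accuracy: {clf.score(X_te, y_te):.2f}")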

  20. Visual and Statistical Analysis of Digital Elevation Models Generated Using Idw Interpolator with Varying Powers

    NASA Astrophysics Data System (ADS)

    Asal, F. F.

    2012-07-01

    Digital elevation data obtained from different engineering surveying techniques is utilized in generating a Digital Elevation Model (DEM), which is employed in many engineering and environmental applications. This data is usually in discrete point format, making it necessary to utilize an interpolation approach for the creation of the DEM. Quality assessment of the DEM is a vital issue controlling its use in different applications; however, this assessment relies heavily on statistical methods while neglecting visual methods. This research applies visual analysis to DEMs generated using the IDW interpolator with varying powers in order to examine its potential for assessing the effects of the variation of the IDW power on DEM quality. Real elevation data were collected in the field using a total station instrument in corrugated terrain. DEMs were generated from the data at a unified cell size using the IDW interpolator with power values ranging from one to ten. Visual analysis was undertaken using 2D and 3D views of the DEM; in addition, statistical analysis was performed to assess the validity of the visual techniques. Visual analysis showed that smoothing of the DEM decreases as the power value increases up to a power of four; however, increasing the power beyond four does not produce noticeable changes in the 2D and 3D views of the DEM. The statistical analysis supported these results, as the standard deviation (SD) of the DEM increased with increasing power. More specifically, changing the power from one to two produced 36% of the total increase in SD (the increase due to changing the power from one to ten), and changing to powers of three and four gave 60% and 75%, respectively. This reflects the decrease in DEM smoothing as the IDW power increases. The study also showed that visual methods supported by statistical analysis have good potential for DEM quality assessment.
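
    The experiment is easy to emulate on synthetic data. The sketch below implements basic IDW and reports the standard deviation of the interpolated grid for several power values; the survey points and surface are invented, not the study's field data.

        # Minimal inverse-distance-weighting (IDW) interpolator, used to show how
        # the power parameter changes DEM smoothness (SD of the interpolated grid).
        import numpy as np

        rng = np.random.default_rng(3)
        pts = rng.uniform(0, 100, (200, 2))                            # surveyed (x, y) in metres
        z = 50 + 5 * np.sin(pts[:, 0] / 10) + rng.normal(0, 1, 200)    # elevations

        def idw(pts, z, grid_xy, power):
            """IDW prediction at grid_xy (n, 2) from scattered points pts (m, 2)."""
            d = np.linalg.norm(grid_xy[:, None, :] - pts[None, :, :], axis=2)
            d = np.maximum(d, 1e-9)                # avoid division by zero at data points
            w = 1.0 / d ** power
            return (w @ z) / w.sum(axis=1)

        gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
        grid_xy = np.column_stack([gx.ravel(), gy.ravel()])

        for power in (1, 2, 4, 10):
            dem = idw(pts, z, grid_xy, power)
            print(f"power={power:2d}  DEM standard deviation = {dem.std():.3f}")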

  1. Distribution of ULF energy (f is less than 80 mHz) in the inner magnetosphere - A statistical analysis of AMPTE CCE magnetic field data

    NASA Technical Reports Server (NTRS)

    Takahashi, Kazue; Anderson, Brian J.

    1992-01-01

    Magnetic field measurements made with the AMPTE CCE spacecraft are used to investigate the distribution of ULF energy in the inner magnetosphere. The resulting data base is employed to examine the spatial distribution of ULF energy. The spatial distribution of wave power and the spectral structure are used to identify several pulsation types, including multiharmonic toroidal oscillations, equatorial compressional Pc 3 oscillations, second-harmonic poloidal oscillations, and nightside compressional oscillations. The frequencies of the toroidal oscillations are used to determine the statistical radial profiles of the plasma mass density and Alfven velocity. A clear signature of the plasmapause is found in the profiles of these average parameters.

  2. An Overview of Interrater Agreement on Likert Scales for Researchers and Practitioners

    PubMed Central

    O'Neill, Thomas A.

    2017-01-01

    Applications of interrater agreement (IRA) statistics for Likert scales are plentiful in research and practice. IRA may be implicated in job analysis, performance appraisal, panel interviews, and any other approach to gathering systematic observations. Any rating system involving subject-matter experts can also benefit from IRA as a measure of consensus. Further, IRA is fundamental to aggregation in multilevel research, which is becoming increasingly common in order to address nesting. Although, several technical descriptions of a few specific IRA statistics exist, this paper aims to provide a tractable orientation to common IRA indices to support application. The introductory overview is written with the intent of facilitating contrasts among IRA statistics by critically reviewing equations, interpretations, strengths, and weaknesses. Statistics considered include rwg, rwg*, r′wg, rwg(p), average deviation (AD), awg, standard deviation (Swg), and the coefficient of variation (CVwg). Equations support quick calculation and contrasting of different agreement indices. The article also includes a “quick reference” table and three figures in order to help readers identify how IRA statistics differ and how interpretations of IRA will depend strongly on the statistic employed. A brief consideration of recommended practices involving statistical and practical cutoff standards is presented, and conclusions are offered in light of the current literature. PMID:28553257
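
    For concreteness, a quick calculation of two of the indices discussed, rwg and AD, for a single item rated by six judges on a 5-point scale (ratings are illustrative):

        # r_wg compares the observed rater variance with the variance of a
        # uniform (no-agreement) null; AD is the average absolute deviation
        # from the group mean.
        import numpy as np

        ratings = np.array([4, 4, 5, 4, 3, 4])   # six raters, 5-point Likert item
        A = 5                                     # number of scale anchors

        obs_var = ratings.var(ddof=1)             # observed variance among raters
        null_var = (A ** 2 - 1) / 12.0            # variance of the uniform null distribution
        r_wg = 1.0 - obs_var / null_var           # single-item r_wg

        ad_mean = np.mean(np.abs(ratings - ratings.mean()))   # average deviation index

        print(f"r_wg = {r_wg:.2f}, AD = {ad_mean:.2f}")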

  3. Employment preferences of public sector nurses in Malawi: results from a discrete choice experiment.

    PubMed

    Mangham, Lindsay J; Hanson, Kara

    2008-12-01

    To understand the employment preferences of Malawian public sector registered nurses, and to ascertain whether salary increases significantly affect how nurses regard their employment. A discrete choice experiment was used to assess the significance of six job attributes on nurses' preferences over pairs of job descriptions: net monthly pay, provision of government housing, opportunities to upgrade their qualifications, typical workload, availability of resources and place of work. A multivariate model was used to estimate the extent to which nurses were willing to trade between their monetary benefits, non-monetary benefits, and working conditions, and to determine the relative importance of the job attributes. Most nurses were willing to trade among attributes, and very few appeared to have preferences that were dominated by a single job attribute. All attributes had a statistically significant influence on nurses' preferences, and further analysis showed the rate at which they were willing to forego pay increases for other improvements in their employment conditions. Opportunities to upgrade professional qualifications, government housing and increases in net monthly pay had the greatest impact on nurses' employment choices. Salary enhancement, as well as improvements in employment conditions, can improve the motivation and retention of nurses, which supports existing efforts to address the health worker shortage.
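
    The trade-off calculation behind such results is a simple ratio of estimated utility coefficients: the amount of pay a respondent would forgo for an attribute improvement is the attribute coefficient divided by the pay coefficient (when both enter utility positively). A hypothetical example with invented coefficients, not those estimated in the study:

        # Back-of-envelope willingness-to-forgo-pay calculation from a fitted
        # conditional logit.  All coefficient values below are invented.
        beta_pay = 0.00012          # utility per unit of net monthly pay
        betas = {
            "government_housing":    0.65,
            "upgrade_qualification": 0.90,
            "adequate_resources":    0.40,
        }

        for attribute, beta in betas.items():
            wtp = beta / beta_pay    # pay the respondent would forgo for this improvement
            print(f"{attribute:>22s}: would forgo about {wtp:,.0f} in monthly pay")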

  4. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Small Business Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake

    USGS Publications Warehouse

    Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.

    2008-01-01

    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of small businesses in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each business establishment size category to each Instrumental Intensity level. The analysis concerns the direct effect of the earthquake on small businesses. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by business establishment size.

  5. A hybrid fault diagnosis approach based on mixed-domain state features for rotating machinery.

    PubMed

    Xue, Xiaoming; Zhou, Jianzhong

    2017-01-01

    To further improve diagnosis accuracy and efficiency, a hybrid fault diagnosis approach based on mixed-domain state features, which systematically blends the statistical analysis approach and artificial intelligence technology, is proposed in this work for rolling element bearings. To simplify the fault diagnosis problem, the execution of the proposed method is divided into three steps, i.e., fault preliminary detection, fault type recognition and fault degree identification. In the first step, a preliminary judgment about the health status of the equipment is made by the statistical analysis method based on permutation entropy theory. If a fault exists, the following two processes based on the artificial intelligence approach are performed to further recognize the fault type and then identify the fault degree. For the two subsequent steps, mixed-domain state features containing time-domain, frequency-domain and multi-scale features are extracted to represent the fault peculiarity under different working conditions. As a powerful time-frequency analysis method, the fast EEMD method was employed to obtain the multi-scale features. Furthermore, due to information redundancy and the submergence of the original feature space, a novel manifold learning method (modified LGPCA) is introduced to realize low-dimensional representations of the high-dimensional feature space. Finally, two cases, each with 12 working conditions, were employed to evaluate the performance of the proposed method, where vibration signals were measured from an experimental bench of rolling element bearings. The analysis results showed the effectiveness and superiority of the proposed method, whose diagnostic approach is more suitable for practical application. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
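
    The preliminary-detection step rests on permutation entropy, which is compact enough to sketch directly; the test signals below are synthetic, and the order and delay are illustrative choices.

        # Compact normalised permutation-entropy calculation (Bandt-Pompe style):
        # a noisier, less regular signal yields a higher entropy value.
        import numpy as np
        from math import factorial
        from itertools import permutations

        def permutation_entropy(x, order=3, delay=1):
            """Normalised permutation entropy of a 1-D signal."""
            n = len(x) - (order - 1) * delay
            patterns = {p: 0 for p in permutations(range(order))}
            for i in range(n):
                window = x[i:i + order * delay:delay]
                patterns[tuple(int(v) for v in np.argsort(window))] += 1
            probs = np.array([c for c in patterns.values() if c > 0], dtype=float) / n
            return -(probs * np.log(probs)).sum() / np.log(factorial(order))

        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 2000)
        healthy = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.normal(size=t.size)
        faulty = healthy + 0.8 * rng.normal(size=t.size)    # extra broadband noise

        print("PE healthy:", round(permutation_entropy(healthy), 3))
        print("PE faulty :", round(permutation_entropy(faulty), 3))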

  6. Service quality and corporate social responsibility, influence on post-purchase intentions of sheltered employment institutions.

    PubMed

    Chen, Chao-Chien; Lin, Shih-Yen; Cheng, Chia-Hsin; Tsai, Chia-Ching

    2012-01-01

    The main purpose of this study is to investigate the impact of service quality and corporate social responsibility (CSR) on customer satisfaction, and of customer satisfaction on post-purchase intentions, for sheltered employment institutions. Work experience plays an important role in career development for people with intellectual disabilities. When they are not yet capable of obtaining a job in the open market, they must receive job training and daily care in sheltered employment institutions. If the sheltered employment institutions cannot operate properly, the people with intellectual disabilities they serve are greatly affected. The "Children Are Us Bakeries and Restaurants" sheltered employment institutions studied here are a food service business that has sought to improve its service quality and its execution of CSR; these are two main factors which can enhance brand value and create a good reputation for sheltered employment institutions. The questionnaire results indicate that perceived service quality has a positive relationship with customer satisfaction, and that the reliability dimension is the most important factor customers use to assess service quality. Meanwhile, correlation analysis shows that customer satisfaction regarding service quality influences post-purchase intentions, indicating that friendly and helpful employees can please customers, enhance their satisfaction level, and induce positive post-purchase intentions. Regarding the CSR of the "Children Are Us Bakeries and Restaurants" sheltered employment institutions, the analysis reveals a statistically significant relationship: the greater the customer satisfaction with CSR, the higher the post-purchase intention. In addition, paired-sample t test analysis reveals a significant difference (p<.05) between "perceived" and "expected" responses for both service quality and CSR. In summary, since those with intellectual disabilities usually are enthusiastic at work and do their best to provide good service and execute CSR well, the value of sheltered employment institutions should be recognized; they should receive continued support, along with a willingness to hire these intellectually disabled citizens. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1985-01-01

    A large array of models was applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consists of an automated data base, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.

  8. Size and shape measurement in contemporary cephalometrics.

    PubMed

    McIntyre, Grant T; Mossey, Peter A

    2003-06-01

    The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse the craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.

  9. Working to eat: Vulnerability, food insecurity, and obesity among migrant and seasonal farmworker families.

    PubMed

    Borre, Kristen; Ertle, Luke; Graff, Mariaelisa

    2010-04-01

    Food insecurity and obesity have potential health consequences for migrant and seasonal farm workers (MSFW). Thirty-six Latino MSFW working in eastern North Carolina, whose children attended Migrant Head Start, completed interviews, focus groups, and home visits. Data were analyzed using content analysis, nutrient analysis, and non-parametric statistics. Of the MSFW families, 63.8% were food insecure; of those, 34.7% experienced hunger. Thirty-two percent of pre-school children were food insecure. Food-secure families spent more money on food. Obesity was prevalent in adults and children, but its relationship to food insecurity remains unclear. MSFW employed strategies to reduce the risk of food insecurity, but employer and community assistance is needed to reduce their risk further. Food insecurity is rooted in the cultural lifestyle of farmwork, poverty, and dependency. MSFW obesity and food insecurity require further study to determine their relationship with migration and working conditions. Networking and social support are important for MSFW families to improve food security. Policies and community/workplace interventions could reduce the risk of food insecurity and improve the health of workers. (c) 2010 Wiley-Liss, Inc.

  10. [The cases of accident in the ceramic tile industry in relation to the age and job seniority of the workers].

    PubMed

    Candela, S; Duca, P; Bedogni, L

    1993-01-01

    A study was made of 3,368 workers in 36 ceramic plants in the Scandiano area (Reggio Emilia, Italy) during the year 1990; 403 had an accident during the observation period. The incidence and severity of accidents were related to age, job (low, intermediate, or high accident risk), and duration of employment as at 1.1.1990 (> 24 months, 12-24 months, < 12 months, or engaged during 1990). Logistic regression analysis, survival analysis and RIDIT analysis were performed using the GLIM and EGRET statistical packages. Risk: ORs for intermediate and high-risk jobs vs low-risk jobs were 1.3 (95% confidence limits 1.0-1.7) and 1.7 (1.3-2.3), respectively, adjusted for duration of employment; ORs for the 12-24 months, < 12 months, and engaged-during-1990 categories vs the > 24 months category were 1.5 (1.1-2.0), 1.7 (1.3-2.2), and 2.1 (1.6-2.8), respectively. Severity: the mean RIDIT of the > 44-year-old vs < 30-year-old workers was 0.6 (0.53-0.67).

  11. Physics Manpower, 1973, Education and Employment Studies.

    ERIC Educational Resources Information Center

    American Inst. of Physics, New York, NY.

    Discussed in this document are the changes within the physics profession, their causes and effects. Detailed statistical data are supplied concerning physics enrollments, the institutions where physics is taught, the faculty in physics departments, and the nonacademic employment of physicists. Other topics include employment, education, minority…

  12. [Employment status and perceived health in Italy: data from the European Union Statistics on Income and Living Conditions (EU-SILC) longitudinal study].

    PubMed

    Bacci, Silvia; Seracini, Marco; Chiavarini, Manuela; Bartolucci, Francesco; Minelli, Liliana

    2017-01-01

    The aim of this study was to investigate the relationship between employment status (permanent employment, fixed-term employment, unemployment, other) and perceived health status in a sample of the Italian population. Data were obtained from the European Union Statistics on Income and Living Conditions (EU-SILC) study during the period 2009-2012. The sample consists of 4,848 individuals, each with a complete record of observations over four years, for a total of 19,392 observations. The causal relationship between perceived/self-reported health status and employment status was tested using a global logit model (STATA). Our results confirm a significant association between employment status and perceived health, as well as between perceived health status and economic status. Unemployment that depended on an actual lack of work opportunities, and not on individual disability, was found to be the most significant determinant of perceived health status; a higher educational level produces a better perceived health status.

  13. DATA ON YOUTH, 1967, A STATISTICAL DOCUMENT.

    ERIC Educational Resources Information Center

    SCHEIDER, GEORGE

    THE DATA IN THIS REPORT ARE STATISTICS ON YOUTH THROUGHOUT THE UNITED STATES AND IN NEW YORK STATE. INCLUDED ARE DATA ON POPULATION, SCHOOL STATISTICS, EMPLOYMENT, FAMILY INCOME, JUVENILE DELINQUENCY AND YOUTH CRIME (INCLUDING NEW YORK CITY FIGURES), AND TRAFFIC ACCIDENTS. THE STATISTICS ARE PRESENTED IN THE TEXT AND IN TABLES AND CHARTS. (NH)

  14. The Preparedness of Preservice Secondary Mathematics Teachers to Teach Statistics: A Cross-Institutional Mixed Methods Study

    ERIC Educational Resources Information Center

    Lovett, Jennifer Nickell

    2016-01-01

    The purpose of this study is to provide researchers, mathematics educators, and statistics educators information about the current state of preservice secondary mathematics teachers' preparedness to teach statistics. To do so, this study employed an explanatory mixed methods design to quantitatively examine the statistical knowledge and statistics…

  15. Employment program for patients with severe mental illness in Malaysia: a 3-month outcome.

    PubMed

    Wan Kasim, Syarifah Hafizah; Midin, Marhani; Abu Bakar, Abdul Kadir; Sidi, Hatta; Nik Jaafar, Nik Ruzyanei; Das, Srijit

    2014-01-01

    This study aimed to examine the rate and predictive factors of successful employment at 3 months upon enrolment into an employment program among patients with severe mental illness (SMI). A cross-sectional study using a universal sampling technique was conducted on patients with SMI who had completed a 3-month period of employment at Hospital Permai, Malaysia. A total of 147 patients were approached and 126 were finally included in the statistical analyses. Successful employment was defined as the ability to work 40 or more hours per month. Factors significantly associated with successful employment in bivariate analyses were entered into a multiple logistic regression analysis to identify predictors of successful employment. The rate of successful employment at 3 months was 68.3% (n=81). Significant factors associated with successful employment in the bivariate analyses were having a past history of working, good family support, fewer psychiatric admissions, good compliance with medication, good interest in work, living in a hostel, being motivated to work, being satisfied with the job or salary, getting a preferred job, being in competitive or supported employment, and having higher-than-median scores on the PANSS positive, negative and general psychopathology subscales. Significant predictors of employment from the logistic regression model were having a good past history of working (p<0.021; OR 6.12; 95% CI 2.1-11.9) and getting a preferred job (p<0.032; OR 4.021; 95% CI 1.83-12.1). Results showed a high employment rate among patients with SMI. A good past history of working and getting a preferred job were significant predictors of successful employment. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Meta-analyses on intra-aortic balloon pump in cardiogenic shock complicating acute myocardial infarction may provide biased results.

    PubMed

    Acconcia, M C; Caretta, Q; Romeo, F; Borzi, M; Perrone, M A; Sergi, D; Chiarotti, F; Calabrese, C M; Sili Scavalli, A; Gaudio, C

    2018-04-01

    Intra-aortic balloon pump (IABP) is the device most commonly investigated in patients with cardiogenic shock (CS) complicating acute myocardial infarction (AMI). Recent meta-analyses on this topic have shown opposite results: some complied with the current guideline recommendations, while others did not, owing to the presence of bias. We investigated the reasons for the discrepancy among meta-analyses and the strategies employed to avoid potential sources of bias. Scientific databases were searched for meta-analyses of IABP support in AMI complicated by CS. The presence of clinical diversity, methodological diversity and statistical heterogeneity was analyzed. When we found clinical or methodological diversity, we reanalyzed the data by comparing patients selected into homogeneous groups. When the fixed-effect model had been employed despite the presence of statistical heterogeneity, the meta-analysis was repeated adopting the random-effects model, with the same estimator used in the original meta-analysis. Twelve meta-analyses were selected. Six meta-analyses of randomized controlled trials (RCTs) were inconclusive because they were underpowered to detect the IABP effect. Five included RCTs and observational studies (Obs), and one included only Obs. Some meta-analyses of RCTs and Obs had biased results due to the presence of clinical and/or methodological diversity. The reanalysis of data reallocated into homogeneous groups was no longer in conflict with the guideline recommendations. Meta-analyses performed without controlling for clinical and/or methodological diversity send a confounding message that works against good clinical practice. The reanalysis of the data demonstrates the validity of the current guideline recommendations in addressing clinical decision making on providing IABP support in AMI complicated by CS.
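
    The fixed-effect versus random-effects choice at issue can be illustrated with a DerSimonian-Laird calculation on invented per-study effect estimates (not data from the reviewed meta-analyses):

        # Fixed-effect vs. random-effects (DerSimonian-Laird) pooling of
        # per-study log odds ratios; tau^2 captures between-study heterogeneity.
        import numpy as np

        yi = np.array([-0.10, 0.25, -0.30, 0.40, 0.05])   # per-study log odds ratios
        vi = np.array([0.04, 0.06, 0.05, 0.09, 0.03])     # their sampling variances

        # Fixed-effect pooling with inverse-variance weights.
        w = 1.0 / vi
        theta_fe = np.sum(w * yi) / np.sum(w)

        # Cochran's Q and the DerSimonian-Laird between-study variance estimate.
        Q = np.sum(w * (yi - theta_fe) ** 2)
        df = len(yi) - 1
        tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

        # Random-effects pooling.
        w_re = 1.0 / (vi + tau2)
        theta_re = np.sum(w_re * yi) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))

        print(f"Q = {Q:.2f} on {df} df, tau^2 = {tau2:.3f}")
        print(f"fixed-effect log OR   = {theta_fe:.3f}")
        print(f"random-effects log OR = {theta_re:.3f} (SE {se_re:.3f})")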

  17. Online Variational Bayesian Filtering-Based Mobile Target Tracking in Wireless Sensor Networks

    PubMed Central

    Zhou, Bingpeng; Chen, Qingchun; Li, Tiffany Jing; Xiao, Pei

    2014-01-01

    The received signal strength (RSS)-based online tracking for a mobile node in wireless sensor networks (WSNs) is investigated in this paper. Firstly, a multi-layer dynamic Bayesian network (MDBN) is introduced to characterize the target mobility with either directional or undirected movement. In particular, it is proposed to employ the Wishart distribution to approximate the randomness of the time-varying RSS measurement precision due to the target movement. It is shown that the proposed MDBN offers a more general analysis model by incorporating the underlying statistical information of both the target movement and observations, which can be utilized to improve the online tracking capability by exploiting Bayesian statistics. Secondly, based on the MDBN model, a mean-field variational Bayesian filtering (VBF) algorithm is developed to realize the online tracking of a mobile target in the presence of nonlinear observations and time-varying RSS precision, wherein the traditional Bayesian filtering scheme cannot be directly employed. Thirdly, a joint optimization between the real-time velocity and its prior expectation is proposed to enable online velocity tracking in the proposed online tracking scheme. Finally, the associated Bayesian Cramer–Rao Lower Bound (BCRLB) analysis and numerical simulations are conducted. Our analysis unveils that, by exploiting the potential state information via the general MDBN model, the proposed VBF algorithm provides a promising solution to the online tracking of a mobile node in WSNs. In addition, it is shown that the final tracking accuracy scales linearly with its expectation when the RSS measurement precision is time-varying. PMID:25393784

  18. Measurement of the relationship between perceived and computed color differences

    NASA Astrophysics Data System (ADS)

    García, Pedro A.; Huertas, Rafael; Melgosa, Manuel; Cui, Guihua

    2007-07-01

    Using simulated data sets, we have analyzed some mathematical properties of different statistical measurements that have been employed in previous literature to test the performance of different color-difference formulas. Specifically, the properties of the combined index PF/3 (performance factor obtained as average of three terms), widely employed in current literature, have been considered. A new index named standardized residual sum of squares (STRESS), employed in multidimensional scaling techniques, is recommended. The main difference between PF/3 and STRESS is that the latter is simpler and allows inferences on the statistical significance of two color-difference formulas with respect to a given set of visual data.
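
    One commonly used form of the STRESS index (assumed here for illustration; consult the paper for the exact definition adopted) fits a least-squares scale factor between computed and visual colour differences and measures the residual disagreement, so the index is invariant to the overall scaling of either data set.

        # STRESS between computed colour differences (dE) and visual data (dV):
        # 0 means perfect agreement after optimal scaling.  Values are invented.
        import numpy as np

        dE = np.array([1.2, 0.8, 2.5, 1.9, 3.1, 0.5])   # computed colour differences
        dV = np.array([1.0, 1.0, 2.2, 2.0, 2.8, 0.7])   # corresponding visual differences

        F = np.sum(dE * dV) / np.sum(dV ** 2)            # least-squares scale factor
        stress = 100.0 * np.sqrt(np.sum((dE - F * dV) ** 2) / np.sum((F * dV) ** 2))
        print(f"F = {F:.3f}, STRESS = {stress:.1f}  (0 = perfect agreement)")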

  19. Equal Employment + Equal Pay = Multiple Problems for Colleges and Universities

    ERIC Educational Resources Information Center

    Steinbach, Sheldon Elliot; Reback, Joyce E.

    1974-01-01

    Issues involved in government regulation of university employment practices are discussed: confidentiality of records, pregnancy as a disability, alleged discrimination in benefits, tests and other employment criteria, seniority and layoff, reverse discrimination, use of statistics for determination of discrimination, and the Equal Pay Act. (JT)

  20. Characterizing Giant Exoplanets through Multiwavelength Transit Observations: HD 189733b

    NASA Astrophysics Data System (ADS)

    Kar, Aman; Cole, Jackson Lane; Gardner, Cristilyn N.; Garver, Bethany Ray; Jarka, Kyla L.; McGough, Aylin Marie; PeQueen, David Jeffrey; Rivera, Daniel Ivan; Kasper, David; Jang-Condell, Hannah; Kobulnicky, Henry; Dale, Daniel

    2018-01-01

    Observing the transits of exoplanets in multiple wavelengths enables the characterization of their atmospheres. We used the Wyoming Infrared Observatory to obtain high precision photometry on HD 189733b, one of the most studied exoplanets. We employed the photometry package AIJ and Bayesian statistics in our analysis. Preliminary results suggest a wavelength dependence in the size of the exoplanet, indicative of scattering in the atmosphere. This work is supported by the National Science Foundation under REU grant AST 1560461.

  1. A design methodology for nonlinear systems containing parameter uncertainty

    NASA Technical Reports Server (NTRS)

    Young, G. E.; Auslander, D. M.

    1983-01-01

    In the present design methodology for nonlinear systems containing parameter uncertainty, a generalized sensitivity analysis is incorporated which employs parameter space sampling and statistical inference. For the case of a system with j adjustable and k nonadjustable parameters, this methodology (which includes an adaptive random search strategy) is used to determine the combination of j adjustable parameter values which maximize the probability of those performance indices which simultaneously satisfy design criteria in spite of the uncertainty due to k nonadjustable parameters.
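
    A minimal sketch of the idea, with an invented performance index and parameter ranges: sample the k nonadjustable (uncertain) parameters, score a candidate setting of the j adjustable parameters by the fraction of samples that meet the design criterion, and search randomly for the setting that maximizes that probability.

        # Crude random search over adjustable parameters, scored by a Monte Carlo
        # estimate of the probability of meeting a design criterion despite
        # uncertainty in the nonadjustable parameters.  Everything is a toy model.
        import numpy as np

        rng = np.random.default_rng(5)

        def performance(adjustable, nonadjustable):
            """Toy performance index: smaller is better."""
            k1, k2 = nonadjustable
            x1, x2 = adjustable
            return (x1 * k1 - 1.0) ** 2 + (x2 * k2 - 2.0) ** 2

        def success_probability(adjustable, n_samples=500, criterion=0.25):
            # Nonadjustable parameters are uncertain within +/- 20 % of nominal.
            k = rng.uniform([0.8, 0.8], [1.2, 1.2], size=(n_samples, 2))
            scores = np.array([performance(adjustable, ki) for ki in k])
            return np.mean(scores < criterion)

        best_x, best_p = None, -1.0
        for _ in range(300):
            x = rng.uniform(0.0, 3.0, size=2)
            p = success_probability(x)
            if p > best_p:
                best_x, best_p = x, p

        print("best adjustable parameters:", np.round(best_x, 2),
              "estimated success probability:", round(float(best_p), 2))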

  2. Authoritarianism as a Driver of U.S. Foreign Policy: The Cases of Myanmar, Vietnam, and North Korea

    DTIC Science & Technology

    2016-12-01

    The methodology of this paper employs statistical analysis and comparative case studies, with proxy scales that… (Master's thesis by Rang Lee, December 2016; thesis co-advisor: Tristan Mabry.)

  3. High Accuracy Passive Magnetic Field-Based Localization for Feedback Control Using Principal Component Analysis.

    PubMed

    Foong, Shaohui; Sun, Zhenglong

    2016-08-12

    In this paper, a novel magnetic field-based sensing system employing statistically optimized concurrent multiple sensor outputs for precise field-position association and localization is presented. This method capitalizes on the independence between simultaneous spatial field measurements at multiple locations to induce unique correspondences between field and position. This single-source-multi-sensor configuration is able to achieve accurate and precise localization and tracking of translational motion without contact over large travel distances for feedback control. Principal component analysis (PCA) is used as a pseudo-linear filter to optimally reduce the dimensions of the multi-sensor output space for computationally efficient field-position mapping with artificial neural networks (ANNs). Numerical simulations are employed to investigate the effects of geometric parameters and Gaussian noise corruption on PCA assisted ANN mapping performance. Using a 9-sensor network, the sensing accuracy and closed-loop tracking performance of the proposed optimal field-based sensing system is experimentally evaluated on a linear actuator with a significantly more expensive optical encoder as a comparison.
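
    A simplified stand-in for the sensing pipeline (the field model, geometry, and network size are all assumptions, not the authors' apparatus): simulate 9-sensor readings as a nonlinear function of a 1-D position, compress the sensor space with PCA, and regress position with a small neural network.

        # PCA as a pseudo-linear filter on concurrent multi-sensor outputs,
        # followed by a neural-network field-to-position mapping.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        positions = rng.uniform(0.0, 100.0, 3000)          # mm along the actuator (assumed range)
        sensor_x = np.linspace(0.0, 100.0, 9)              # 9 sensor locations
        # Field strength decays nonlinearly with distance, plus measurement noise.
        readings = 1.0 / (1.0 + np.abs(positions[:, None] - sensor_x[None, :]) ** 2)
        readings += rng.normal(0.0, 0.002, readings.shape)

        X_tr, X_te, y_tr, y_te = train_test_split(readings, positions, random_state=0)
        model = make_pipeline(StandardScaler(), PCA(n_components=4),
                              MLPRegressor(hidden_layer_sizes=(32, 32),
                                           max_iter=3000, random_state=0))
        model.fit(X_tr, y_tr)
        err = np.abs(model.predict(X_te) - y_te)
        print(f"median localisation error: {np.median(err):.2f} mm")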

  4. On statistical analysis of factors affecting anthocyanin extraction from Ixora siamensis

    NASA Astrophysics Data System (ADS)

    Mat Nor, N. A.; Arof, A. K.

    2016-10-01

    This study focused on designing an experimental model in order to evaluate the influence of the operative extraction parameters employed for anthocyanin extraction from Ixora siamensis on CIE color measurements (a*, b* and color saturation). Extractions were conducted at temperatures of 30, 55 and 80°C and soaking times of 60, 120 and 180 min, using acidified methanol solvent with different trifluoroacetic acid (TFA) contents of 0.5, 1.75 and 3% (v/v). The statistical evaluation was performed by running analysis of variance (ANOVA) and regression calculations to investigate the significance of the generated model. Results show that the generated regression models adequately explain the data variation and significantly represent the actual relationship between the independent variables and the responses. Analysis of variance (ANOVA) showed high coefficients of determination (R2) of 0.9687 for a*, 0.9621 for b* and 0.9758 for color saturation, thus ensuring a satisfactory fit of the developed models to the experimental data. The interaction between TFA content and extraction temperature exhibited the highest significant influence on the CIE color parameters.

  5. Computational Analysis for Rocket-Based Combined-Cycle Systems During Rocket-Only Operation

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.; Smith, T. D.; Yungster, S.; Keller, D. J.

    2000-01-01

    A series of Reynolds-averaged Navier-Stokes calculations was employed to study the performance of rocket-based combined-cycle systems operating in an all-rocket mode. This parametric series of calculations was executed within a statistical framework, commonly known as design of experiments. The parametric design space included four geometric and two flowfield variables set at three levels each, for a total of 729 possible combinations. A D-optimal design strategy was selected. It required that only 36 separate computational fluid dynamics (CFD) solutions be performed to develop a full response surface model, which quantified the linear, bilinear, and curvilinear effects of the six experimental variables. The axisymmetric, Reynolds-averaged Navier-Stokes simulations were executed with the NPARC v3.0 code. The response used in the statistical analysis was created from Isp efficiency data integrated from the 36 CFD simulations. The influence of turbulence modeling was analyzed by using both one- and two-equation models. Careful attention was also given to quantifying the influence of mesh dependence, iterative convergence, and artificial viscosity upon the resulting statistical model. Thirteen statistically significant effects were observed to have an influence on rocket-based combined-cycle nozzle performance. It was apparent that the free-expansion process, directly downstream of the rocket nozzle, can influence the Isp efficiency. Numerical schlieren images and particle traces have been used to further understand the physical phenomena behind several of the statistically significant results.

  6. Employers' Use and Views of the VET System 2017. Australian Vocational Education and Training Statistics

    ERIC Educational Resources Information Center

    National Centre for Vocational Education Research (NCVER), 2017

    2017-01-01

    This publication presents information on employers' use and views of the vocational education and training (VET) system. The findings relate to the various ways in which Australian employers use the VET system and unaccredited training to meet their skill needs, and their satisfaction with these methods of training. Australian employers can engage…

  7. Finite Element Analysis of Reverberation Chambers

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Nguyen, Duc T.

    2000-01-01

    The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved, four primary focus areas emerged: (1) the eigenvalue problem for the source-free case; (2) the development of an efficient complex eigensolver; (3) the application of a source for the TE and TM fields for statistical characterization; and (4) the examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low-frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low-frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low-frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.

  8. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants.

    PubMed

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-04-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear or quadratic pattern, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided.
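
    A toy prospective power calculation in the spirit of this framework, with invented effect size, block variance, zero-inflation level, and a deliberately simple paired analysis (not the paper's simulation model):

        # Simulate zero-inflated Poisson counts for a GM variety and its
        # comparator in a randomized block design and estimate the power of a
        # paired difference test on log-transformed counts.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        n_blocks, n_sims = 8, 1000
        mean_comparator, effect_ratio, p_zero = 10.0, 0.6, 0.2   # 40 % reduction, 20 % excess zeros

        def simulate_counts(mu, block):
            lam = mu * block
            y = rng.poisson(lam)
            return np.where(rng.random(n_blocks) < p_zero, 0, y)  # inject excess zeros

        rejections = 0
        for _ in range(n_sims):
            block = np.exp(rng.normal(0.0, 0.3, n_blocks))        # random block effects
            y_comp = simulate_counts(mean_comparator, block)
            y_gm = simulate_counts(mean_comparator * effect_ratio, block)
            _, p = stats.ttest_rel(np.log1p(y_comp), np.log1p(y_gm))
            rejections += p < 0.05

        print(f"estimated power of the difference test with {n_blocks} blocks: "
              f"{rejections / n_sims:.2f}")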

  9. A statistical simulation model for field testing of non-target organisms in environmental risk assessment of genetically modified plants

    PubMed Central

    Goedhart, Paul W; van der Voet, Hilko; Baldacchino, Ferdinando; Arpaia, Salvatore

    2014-01-01

    Genetic modification of plants may result in unintended effects causing potentially adverse effects on the environment. A comparative safety assessment is therefore required by authorities, such as the European Food Safety Authority, in which the genetically modified plant is compared with its conventional counterpart. Part of the environmental risk assessment is a comparative field experiment in which the effect on non-target organisms is compared. Statistical analysis of such trials comes in two flavors: difference testing and equivalence testing. It is important to know the statistical properties of these, for example, the power to detect environmental change of a given magnitude, before the start of an experiment. Such prospective power analysis can best be studied by means of a statistical simulation model. This paper describes a general framework for simulating data typically encountered in environmental risk assessment of genetically modified plants. The simulation model, available as Supplementary Material, can be used to generate count data having different statistical distributions, possibly with excess zeros. In addition, the model employs completely randomized or randomized block experiments, can be used to simulate single or multiple trials across environments, enables genotype-by-environment interaction by adding random variety effects, and includes repeated measures in time following a constant, linear or quadratic pattern, possibly with some form of autocorrelation. The model also allows a set of reference varieties to be added to the GM plant and its comparator to assess the natural variation, which can then be used to set limits of concern for equivalence testing. The different count distributions are described in some detail, and some examples of how to use the simulation model to study various aspects, including a prospective power analysis, are provided. PMID:24834325

  10. Employment status and heart disease risk factors in middle-aged women: the Rancho Bernardo Study.

    PubMed Central

    Kritz-Silverstein, D; Wingard, D L; Barrett-Connor, E

    1992-01-01

    BACKGROUND. In recent years, an increasing number of women have been entering the labor force. It is known that in men, employment is related to heart disease risk, but there are few studies examining this association among women. METHODS. The relation between employment status and heart disease risk factors including lipid and lipoprotein levels, systolic and diastolic blood pressure, fasting and postchallenge plasma glucose and insulin levels, was examined in 242 women aged 40 to 59 years, who were participants in the Rancho Bernardo Heart and Chronic Disease Survey. At the time of a follow-up clinic visit between 1984 and 1987, 46.7% were employed, primarily in managerial positions. RESULTS. Employed women smoked fewer cigarettes, drank less alcohol, and exercised more than unemployed women, but these differences were not statistically significant. After adjustment for covariates, employed women had significantly lower total cholesterol and fasting plasma glucose levels than unemployed women. Differences on other biological variables, although not statistically significant, also favored the employed women. CONCLUSIONS. Results of this study suggest that middle-aged women employed in managerial positions are healthier than unemployed women. PMID:1739150

  11. Relationship of Wound, Ostomy, and Continence Certified Nurses and Healthcare-Acquired Conditions in Acute Care Hospitals.

    PubMed

    Boyle, Diane K; Bergquist-Beringer, Sandra; Cramer, Emily

    The purpose of this study was to describe the (a) number and types of employed WOC certified nurses in acute care hospitals, (b) rates of hospital-acquired pressure injury (HAPI) and catheter-associated urinary tract infection (CAUTI), and (c) effectiveness of WOC certified nurses with respect to lowering HAPI and CAUTI occurrences. Retrospective analysis of data from National Database of Nursing Quality Indicators. The sample comprised 928 National Database of Nursing Quality Indicators (NDNQI) hospitals that participated in the 2012 NDNQI RN Survey (source of specialty certification data) and collected HAPI, CAUTI, and nurse staffing data during the years 2012 to 2013. We analyzed years 2012 to 2013 data from the NDNQI. Descriptive statistics summarized the number and types of employed WOC certified nurses, the rate of HAPI and CAUTI, and HAPI risk assessment and prevention intervention rates. Chi-square analyses were used to compare the characteristics of hospitals that do and do not employ WOC certified nurses. Analysis-of-covariance models were used to test the association between WOC certified nurses and HAPI and CAUTI occurrences. Just more than one-third of the study hospitals (36.6%) employed WOC certified nurses. Certified continence care nurses (CCCNs) were employed in fewest number. Hospitals employing wound care specialty certified nurses (CWOCN, CWCN, and CWON) had lower HAPI rates and better pressure injury risk assessment and prevention practices. Stage 3 and 4 HAPI occurrences among hospitals employing CWOCNs, CWCNs, and CWONs (0.27%) were nearly half the rate of hospitals not employing these nurses (0.51%). There were no significant relationships between nurses with specialty certification in continence care (CWOCN, CCCN) or ostomy care (CWOCN, COCN) and CAUTI rates. CWOCNs, CWCNs, and CWONs are an important factor in achieving better HAPI outcomes in acute care settings. The role of CWOCNs, CCCNs, and COCNs in CAUTI prevention warrants further investigation.

  12. Relationship of Wound, Ostomy, and Continence Certified Nurses and Healthcare-Acquired Conditions in Acute Care Hospitals

    PubMed Central

    Bergquist-Beringer, Sandra; Cramer, Emily

    2017-01-01

    PURPOSE: The purpose of this study was to describe the (a) number and types of employed WOC certified nurses in acute care hospitals, (b) rates of hospital-acquired pressure injury (HAPI) and catheter-associated urinary tract infection (CAUTI), and (c) effectiveness of WOC certified nurses with respect to lowering HAPI and CAUTI occurrences. DESIGN: Retrospective analysis of data from National Database of Nursing Quality Indicators. SUBJECTS AND SETTINGS: The sample comprised 928 National Database of Nursing Quality Indicators (NDNQI) hospitals that participated in the 2012 NDNQI RN Survey (source of specialty certification data) and collected HAPI, CAUTI, and nurse staffing data during the years 2012 to 2013. METHODS: We analyzed years 2012 to 2013 data from the NDNQI. Descriptive statistics summarized the number and types of employed WOC certified nurses, the rate of HAPI and CAUTI, and HAPI risk assessment and prevention intervention rates. Chi-square analyses were used to compare the characteristics of hospitals that do and do not employ WOC certified nurses. Analysis-of-covariance models were used to test the association between WOC certified nurses and HAPI and CAUTI occurrences. RESULTS: Just more than one-third of the study hospitals (36.6%) employed WOC certified nurses. Certified continence care nurses (CCCNs) were employed in fewest number. Hospitals employing wound care specialty certified nurses (CWOCN, CWCN, and CWON) had lower HAPI rates and better pressure injury risk assessment and prevention practices. Stage 3 and 4 HAPI occurrences among hospitals employing CWOCNs, CWCNs, and CWONs (0.27%) were nearly half the rate of hospitals not employing these nurses (0.51%). There were no significant relationships between nurses with specialty certification in continence care (CWOCN, CCCN) or ostomy care (CWOCN, COCN) and CAUTI rates. CONCLUSIONS: CWOCNs, CWCNs, and CWONs are an important factor in achieving better HAPI outcomes in acute care settings. The role of CWOCNs, CCCNs, and COCNs in CAUTI prevention warrants further investigation. PMID:28328645

  13. Certification Can Count: The Case of Aircraft Mechanics. Issues in Labor Statistics. Summary 02-03.

    ERIC Educational Resources Information Center

    Bureau of Labor Statistics, Washington, DC.

    This document is a summary of aerospace industry technician statistics gathered by the Occupational Employment Statistics Survey for the year 2000 by the Department of Labor, Bureau of Labor Statistics. The data includes the following: (1) a comparison of wages earned by Federal Aviation Administration (FAA) certified and non-FAA certified…

  14. 49 CFR 40.111 - When and how must a laboratory disclose statistical summaries and other information it maintains?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    § 40.111 When and how must a laboratory disclose statistical summaries and other information it maintains? (a) As a laboratory, you must transmit an aggregate statistical summary, by employer...

  15. The Functional Relationship between Maternal Employment, Self-Concept; and Family Orientation.

    ERIC Educational Resources Information Center

    Goodwin, Paul; Newman, Isadore

    This study investigated the relationships between maternal employment during three periods in the child's life, the child's self-concept, and family orientation. Variables statistically controlled were intactness of the family, father's employment status, the child's sex, the child's race, and the family's socioeconomic status. It was hypothesized…

  16. Women and Nontraditional Work.

    ERIC Educational Resources Information Center

    Mort, Heidi; Reisman, Janet

    This fact sheet summarizes labor market statistics on nontraditional jobs for women and public policy, barriers, and strategies regarding such employment. Among the data presented are the following: nontraditional jobs for women are jobs in which 75 percent or more of those employed are men; 9 percent of all working women are employed in…

  17. A Virtual Study of Grid Resolution on Experiments of a Highly-Resolved Turbulent Plume

    NASA Astrophysics Data System (ADS)

    Maisto, Pietro M. F.; Marshall, Andre W.; Gollner, Michael J.; Fire Protection Engineering Department Collaboration

    2017-11-01

    An accurate representation of sub-grid scale turbulent mixing is critical for modeling fire plumes and smoke transport. In this study, PLIF and PIV diagnostics are used with the saltwater modeling technique to provide highly-resolved instantaneous field measurements in unconfined turbulent plumes useful for statistical analysis, physical insight, and model validation. The effect of resolution was investigated employing a virtual interrogation window (of varying size) applied to the high-resolution field measurements. Motivated by LES low-pass filtering concepts, the high-resolution experimental data in this study can be analyzed within the interrogation windows (i.e. statistics at the sub-grid scale) and on interrogation windows (i.e. statistics at the resolved scale). A dimensionless resolution threshold (L/D*) criterion was determined to achieve converged statistics on the filtered measurements. Such a criterion was then used to establish the relative importance between large and small-scale turbulence phenomena while investigating specific scales for the turbulent flow. First order data sets start to collapse at a resolution of 0.3D*, while for second and higher order statistical moments the interrogation window size drops down to 0.2D*.

  18. A new statistical methodology predicting chip failure probability considering electromigration

    NASA Astrophysics Data System (ADS)

    Sun, Ted

    In this research thesis, we present a new approach to analyzing chip reliability subject to electromigration (EM); the fundamental causes of EM and how the phenomenon manifests in different materials are also presented. This new approach exploits the statistical nature of EM failure in order to assess overall EM risk. It incorporates within-die temperature variations, taken from the chip's temperature map extracted by an Electronic Design Automation (EDA) tool, to estimate the failure probability of a design. Both the power estimation and the thermal analysis are performed in the EDA flow. We first used the traditional EM approach to analyze the design, which involves 6 metal and 5 via layers, with a single temperature applied across the entire chip. Next, we repeated the traditional approach with a realistic temperature map. The traditional EM analysis approach, the temperature-map-coupled variant, and a comparison of their results are presented in this research. The comparison confirms that using a temperature map yields a less pessimistic estimate of the chip's EM risk. Finally, we employed the statistical methodology we developed, considering a temperature map and different use-condition voltages and frequencies, to estimate the overall failure probability of the chip. The statistical model accounts for scaling through the traditional Black equation and four major use conditions, and the statistical result comparisons are within our expectations. The results of this statistical analysis confirm that the chip-level failure probability is higher i) at higher use-condition frequencies for all use-condition voltages, and ii) when a single temperature rather than a temperature map across the chip is considered. In this thesis, I start with an overall review of current design types, common flows, and the verification and reliability-checking steps used in the IC design industry. Furthermore, the important concept of "scripting automation," used to integrate the diverse EDA tools in this research, is described in detail with several examples, and my completed code is included in the appendix for reference. This structure is intended to give readers a thorough understanding of the research work, from the automation of EDA tools to the statistical data generation, from the nature of EM to the construction of the statistical model, and the comparisons between the traditional and statistical EM analysis approaches.
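
    The thesis does not spell out its exact failure model here, but a minimal sketch of the general idea is possible: assume each interconnect segment has a lognormal time-to-failure whose median follows Black's equation evaluated at that segment's temperature from the temperature map, and combine segments as a weakest-link (series) system. All constants, current densities, and temperatures below are illustrative placeholders, not values from the thesis.

    ```python
    # Sketch only: per-segment EM failure probability from Black's equation with a
    # temperature map, combined as a series (weakest-link) system.
    import numpy as np
    from scipy import stats

    k_B = 8.617e-5              # Boltzmann constant, eV/K
    A, n, Ea = 1e10, 2.0, 0.9   # Black's-equation constants (hypothetical)
    sigma_ln = 0.4              # lognormal shape of time-to-failure (hypothetical)
    t_use = 10 * 365 * 24       # mission time in hours (10 years)

    rng = np.random.default_rng(0)
    J = rng.uniform(0.5e6, 2.0e6, size=1000)        # current densities, A/cm^2
    T_map = rng.uniform(320.0, 380.0, size=1000)    # per-segment temperatures, K

    # Black's equation: median time to failure of each segment
    mttf = A * J**(-n) * np.exp(Ea / (k_B * T_map))

    # Lognormal failure probability of each segment at the mission time
    p_seg = stats.lognorm.cdf(t_use, s=sigma_ln, scale=mttf)

    # Chip fails if any segment fails (series system)
    p_chip = 1.0 - np.prod(1.0 - p_seg)
    print(f"estimated chip EM failure probability: {p_chip:.3e}")
    ```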

  19. Disability, employment and work performance among people with ICD-10 anxiety disorders.

    PubMed

    Waghorn, Geoff; Chant, David; White, Paul; Whiteford, Harvey

    2005-01-01

    To ascertain at a population level, patterns of disability, labour force participation, employment and work performance among people with ICD-10 anxiety disorders in comparison to people without disability or long-term health conditions. A secondary analysis was conducted of a probability sample of 42 664 individuals collected in an Australian Bureau of Statistics (ABS) national survey in 1998. Trained lay interviewers using ICD-10 computer-assisted interviews identified household residents with anxiety disorders. Anxiety disorders were associated with: reduced labour force participation, degraded employment trajectories and impaired work performance compared to people without disabilities or long-term health conditions. People with anxiety disorders may need more effective treatments and assistance with completing education and training, joining and rejoining the workforce, developing career pathways, remaining in the workforce and sustaining work performance. A whole-of-government approach appears needed to reduce the burden of disease and increase community labour resources. Implications for clinicians, vocational professionals and policy makers are discussed.

  20. Vocational training and employability: Evaluation evidence from Romania.

    PubMed

    Popescu, Madalina Ecaterina; Roman, Monica

    2018-04-01

    This study evaluates the direct effects of vocational training, a popular active labour market policy in a European developing country such as Romania. Since the available official statistical microdata were insufficient to conduct reliable impact evaluations, the main findings were obtained through a counterfactual impact evaluation using newly produced micro survey data. Moreover, the research provides a heterogeneity analysis of groups of trainees, in order to identify the categories for which the programme performs best. The main results reveal that the training measure has a positive but modest impact upon employability in Romania: when other factors are properly controlled for, participation increases employment chances by 15%. It is most successful for women and for people living in urban areas. Measures for increasing the impact of the vocational training programme in Romania are identified in terms of better targeting and profiling of trainees and closer adjustment of the programme to the specific needs of the labour market. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Distribution of water quality parameters in Dhemaji district, Assam (India).

    PubMed

    Buragohain, Mridul; Bhuyan, Bhabajit; Sarma, H P

    2010-07-01

    The primary objective of this study is to present a statistically significant water quality database of Dhemaji district, Assam (India), with special reference to pH, fluoride, nitrate, arsenic, iron, sodium and potassium. 25 water samples collected from different locations of five development blocks in Dhemaji district have been studied separately. The implications presented are based on statistical analyses of the raw data. Normal distribution statistics and reliability analysis (correlation and covariance matrix) have been employed to find out the distribution pattern, localisation of data, and other related information. Statistical observations show that all the parameters under investigation exhibit a non-uniform distribution with a long asymmetric tail on either the right or the left side of the median. The width of the third quartile was consistently found to be greater than that of the second quartile for each parameter. Differences among mean, mode and median, together with significant skewness and kurtosis values, indicate that the distribution of the various water quality parameters in the study area is widely off normal. Thus, the intrinsic water quality is not encouraging, owing to the asymmetric distribution of the various water quality parameters in the study area.
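
    A minimal sketch of the kind of distributional summary described above (mean versus median, quartile widths, skewness and kurtosis, plus a formal normality check) is shown below. The fluoride values are made-up placeholders, not the Dhemaji data.

    ```python
    # Distributional checks for one water-quality parameter (placeholder values).
    import numpy as np
    from scipy import stats

    fluoride = np.array([0.2, 0.3, 0.3, 0.4, 0.5, 0.6, 0.8, 1.1, 1.5, 2.4])  # mg/L

    q1, q2, q3 = np.percentile(fluoride, [25, 50, 75])
    print("mean, median:", fluoride.mean(), q2)
    print("2nd-quartile width (Q2 - Q1):", q2 - q1)
    print("3rd-quartile width (Q3 - Q2):", q3 - q2)
    print("skewness:", stats.skew(fluoride))
    print("excess kurtosis:", stats.kurtosis(fluoride))

    # Formal check that the distribution is "off normal"
    w, p = stats.shapiro(fluoride)
    print("Shapiro-Wilk p-value:", p)
    ```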

  2. Are conventional statistical techniques exhaustive for defining metal background concentrations in harbour sediments? A case study: The Coastal Area of Bari (Southeast Italy).

    PubMed

    Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio

    2015-11-01

    Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of the background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitation of analysing port sediments through the use of conventional statistical techniques (such as: linear regression analysis, construction of cumulative frequency curves and the iterative 2σ technique), that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques in determining the RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit well with statistical equations), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
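
    One of the conventional techniques named above, the iterative 2σ technique, has a common formulation that is easy to sketch: repeatedly discard values outside mean ± 2 standard deviations until no further values are removed, and report the remaining mean ± 2σ as the background range. The concentrations below are placeholders, not data from the Bari study.

    ```python
    # Iterative 2-sigma estimate of a geochemical background range (placeholder data).
    import numpy as np

    def iterative_2sigma(x, max_iter=50):
        x = np.asarray(x, dtype=float)
        for _ in range(max_iter):
            mu, sd = x.mean(), x.std(ddof=1)
            kept = x[np.abs(x - mu) <= 2 * sd]
            if kept.size == x.size:          # converged: nothing removed
                break
            x = kept
        return x.mean(), x.std(ddof=1), x    # background mean, spread, retained data

    pb = np.array([12, 14, 15, 16, 18, 19, 21, 22, 25, 60, 85, 120])  # mg/kg Pb
    mean_bg, sd_bg, retained = iterative_2sigma(pb)
    print(f"background range: {mean_bg - 2*sd_bg:.1f} - {mean_bg + 2*sd_bg:.1f} mg/kg")
    ```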

  3. How Minorities Continue to Be Excluded from Equal Employment Opportunities: Research on Labor Market and Institutional Barriers.

    DTIC Science & Technology

    1987-04-01

    of jobs, four types of exclusionary barriers are investigated: "segregated networks" at the candidate stage, "information bias" and " statistical ...constitutional law, and socio-economic theory (for example, Glazer, 1975; Maguire, 1980). Disagreements have been particularly strong about the preferen...will present statistics on current labor market processes that can be used to assess the continuing need for strong policies of equal employment

  4. Is Functional Independence Associated With Improved Long-Term Survival After Lung Transplantation?

    PubMed

    Osho, Asishana; Mulvihill, Michael; Lamba, Nayan; Hirji, Sameer; Yerokun, Babatunde; Bishawi, Muath; Spencer, Philip; Panda, Nikhil; Villavicencio, Mauricio; Hartwig, Matthew

    2018-07-01

    Existing research demonstrates superior short-term outcomes (length of stay, 1-year survival) after lung transplantation in patients with preoperative functional independence. The aim of this study was to determine whether advantages remain significant in the long-term. The United Network for Organ Sharing database was queried for adult, first-time, isolated lung transplantation records from January 2005 to December 2015. Stratification was performed based on Karnofsky Performance Status Score (3 groups) and on employment at the time of transplantation (2 groups). Kaplan-Meier and Cox analyses were performed to determine the association between these factors and survival in the long-term. Of 16,497 patients meeting criteria, 1,581 (9.6%) were almost completely independent at the time of transplant vs 5,662 (34.3%) who were disabled (completely reliant on others for activities of daily living). Cox models adjusting for recipient, donor, and transplant factors demonstrated a statistically significant association between disability at the time of transplant and long-term death (hazard ratio, 1.26; 95% confidence interval, 1.14 to 1.40; p < 0.001). There were 15,931 patients with available data on paid employment at the time of transplantation. Multivariable analysis demonstrated a statistically significant association between employment at the time of transplantation and death (hazard ratio, 0.86; 95% confidence interval, 0.75 to 0.91; p < 0.001). Preoperative functional independence and maintenance of employment are associated with superior long-term outcomes in lung recipients. The results highlight potential benefits of pretransplant functional rehabilitation for patients on the waiting list for lungs. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
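
    A hedged sketch of the kind of Cox proportional-hazards model described above (survival as a function of functional status and employment, adjusting for other covariates) can be written with the lifelines package. The DataFrame, column names, and toy values below are hypothetical, not the UNOS registry data; the small penalizer is only there to stabilize the fit on the tiny example.

    ```python
    # Cox proportional-hazards sketch on toy data (lifelines package).
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "months_survived": [12, 48, 60, 7, 90, 30, 24, 55, 18, 70],
        "died":            [1,  0,  1,  1, 0,  0,  1,  0,  1,  0],
        "disabled":        [1,  0,  0,  1, 0,  1,  1,  0,  1,  0],  # reliant on others for ADLs
        "employed":        [0,  1,  1,  0, 1,  1,  0,  0,  0,  1],  # working at transplant
        "age":             [62, 45, 50, 66, 38, 58, 61, 47, 64, 41],
    })

    cph = CoxPHFitter(penalizer=0.1)     # light penalty for stability on toy data
    cph.fit(df, duration_col="months_survived", event_col="died")
    cph.print_summary()                  # hazard ratios = exp(coef), with 95% CIs
    ```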

  5. Employment loss during economic crisis and suicidal thoughts in Belgium: a survey in general practice

    PubMed Central

    Vanderoost, Filip; van der Wielen, Susan; van Nunen, Karolien; Van Hal, Guido

    2013-01-01

    Background The economic crisis of 2009 led to a wave of corporate reorganisations and bankruptcies, with many dismissals of employees. GPs were confronted with subsequent health consequences. Aim To assess the possible relationship between losing one’s job and having suicidal thoughts. Design and setting A survey of patients aged 18–49 years recruited from GP practices in Belgium in Deurne (Flemish region) and La Louvière (Walloon region) from September to December 2010. Method Anonymous self-administered questionnaire. Results Of all eligible patients (n = 1818), 831 were offered the questionnaire and 377 completed it (45.4%). More than one in five had been confronted with employment loss in the past year (the responder or someone close losing their job). Almost one in ten had lost their job themselves in the past year. More than one in four had experienced suicidal thoughts and 11.7% had seriously considered ending their life in the past year. In the logistic regression analysis, the following characteristics showed a statistically significant relationship with having suicidal thoughts: being single (odds ratio [OR] = 4.8, 95% confidence interval [CI] = 1.7 to 13.8), not having satisfying social contacts (OR = 5.1, 95% CI = 1.6 to 16.2), having depressive complaints (OR = 18.4, 95% CI = 5.8 to 58.4), and having lost one’s employment in the past year (OR = 8.8, 95% CI = 2.0 to 39.3). Conclusion This study points to a statistically significant relationship between losing one’s employment in the past year and having suicidal thoughts. It emphasises the important role of the GP in the continuous and reinforced assessment of suicidal risk in times of recession. PMID:24152484
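
    Odds ratios with 95% confidence intervals of the kind reported above are typically obtained from a logistic regression. A minimal sketch with statsmodels follows; the data are synthetic and the variable names are illustrative, not the Belgian survey variables.

    ```python
    # Logistic regression yielding odds ratios and 95% CIs (synthetic data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 377
    df = pd.DataFrame({
        "job_loss":   rng.integers(0, 2, n),
        "single":     rng.integers(0, 2, n),
        "depressive": rng.integers(0, 2, n),
    })
    logit = 0.6 * df.job_loss + 1.2 * df.depressive + 0.4 * df.single - 2.0
    df["suicidal_thoughts"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    fit = smf.logit("suicidal_thoughts ~ job_loss + single + depressive", data=df).fit()
    odds_ratios = np.exp(fit.params)
    ci = np.exp(fit.conf_int())
    print(pd.concat([odds_ratios.rename("OR"),
                     ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
    ```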

  6. 75 FR 41579 - Submitting Airline Data via the Internet

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-16

    ... Airline Information, RTS-42, Bureau of Transportation Statistics, Research and Innovative Technology... Statistics (BTS), must be submitted electronically (e- filing). The new e-filing system is designed to be... November 30, 2010. P-10 Employment Statistics by Labor Category--due February 20, 2011. A Certification...

  7. 75 FR 3926 - Submission for OMB Emergency Review: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-25

    ... DEPARTMENT OF LABOR Bureau of Labor Statistics Submission for OMB Emergency Review: Comment.... Agency: Bureau of Labor Statistics. Type of Review: New collection. Title of Collection: Quarterly Census... appropriation tasks the Bureau of Labor Statistics (BLS) Quarterly Census of Employment and Wages (QCEW) program...

  8. Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output

    NASA Astrophysics Data System (ADS)

    Milroy, D.; Hammerling, D.; Baker, A. H.

    2017-12-01

    Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical difference caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprised of millions of lines of code which is developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify CESM components that define or compute the variables found in the first step. Finally, we employ the application Kernel GENerator (KGEN) created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.
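
    Randomized (stability-selection style) logistic regression of the kind named above can be emulated by fitting an L1-penalized logistic model on many random subsamples and counting how often each variable receives a nonzero coefficient. The sketch below uses synthetic data; the selection threshold and penalty strength are illustrative choices, not the CESM-ECT settings.

    ```python
    # Stability-selection emulation of randomized logistic regression (synthetic data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n, p = 200, 30
    X = rng.normal(size=(n, p))
    y = (X[:, 0] - 1.5 * X[:, 3] + 0.3 * rng.normal(size=n) > 0).astype(int)

    n_rounds, frac = 200, 0.75
    selected = np.zeros(p)
    for _ in range(n_rounds):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
        clf.fit(X[idx], y[idx])
        selected += (np.abs(clf.coef_[0]) > 1e-8)

    selection_freq = selected / n_rounds
    print("variables selected in >80% of rounds:", np.where(selection_freq > 0.8)[0])
    ```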

  9. Bonneville Power Administration Communication Alarm Processor expert system:

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goeltz, R.; Purucker, S.; Tonn, B.

    This report describes the Communications Alarm Processor (CAP), a prototype expert system developed for the Bonneville Power Administration by Oak Ridge National Laboratory. The system is designed to receive and diagnose alarms from Bonneville's Microwave Communications System (MCS). The prototype encompasses one of seven branches of the communications network and a subset of alarm systems and alarm types from each system. The expert system employs a backward chaining approach to diagnosing alarms. Alarms are fed into the expert system directly from the communication system via RS232 ports and sophisticated alarm filtering and mailbox software. Alarm diagnoses are presented to operators for their review and concurrence before the diagnoses are archived. Statistical software is incorporated to allow analysis of archived data for report generation and maintenance studies. The delivered system resides on a Digital Equipment Corporation VAX 3200 workstation and utilizes Nexpert Object and SAS for the expert system and statistical analysis, respectively. 11 refs., 23 figs., 7 tabs.

  10. Simulation of parametric model towards the fixed covariate of right censored lung cancer data

    NASA Astrophysics Data System (ADS)

    Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila

    2017-09-01

    In this study, a simulation procedure was applied to measure the fixed covariate of right-censored data using a parametric survival model. The scale and shape parameters were modified to differentiate the analysis of the parametric regression survival model. Statistically, the biases, mean biases and the coverage probability were used in this analysis. Consequently, different sample sizes of 50, 100, 150 and 200 were employed to distinguish the impact of the parametric regression model on right-censored data. R statistical software was utilised to develop the simulation code with right-censored data. In addition, the final model of the right-censored simulation was compared with right-censored lung cancer data from Malaysia. It was found that different values of the shape and scale parameters with different sample sizes help to improve the simulation strategy for right-censored data, and that the Weibull regression survival model provides a suitable fit for the survival of lung cancer patients in Malaysia.
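
    The core simulation idea described above can be sketched as: draw Weibull survival times, apply right censoring at a fixed follow-up time, and refit a parametric Weibull model. The sketch below uses Python's lifelines package rather than R, and the shape/scale values, sample size, and censoring time are illustrative choices, not those of the study.

    ```python
    # Simulate right-censored Weibull survival data and refit a Weibull model.
    import numpy as np
    from lifelines import WeibullFitter

    rng = np.random.default_rng(7)
    shape, scale, n = 1.5, 24.0, 200              # "true" parameters, sample size
    t_true = scale * rng.weibull(shape, size=n)   # latent survival times (months)

    followup = 36.0                               # administrative right-censoring time
    observed = t_true <= followup
    durations = np.minimum(t_true, followup)

    wf = WeibullFitter().fit(durations, event_observed=observed)
    print("estimated scale (lambda_):", wf.lambda_)
    print("estimated shape (rho_):", wf.rho_)
    ```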

  11. Effect of environment and genotype on commercial maize hybrids using LC/MS-based metabolomics.

    PubMed

    Baniasadi, Hamid; Vlahakis, Chris; Hazebroek, Jan; Zhong, Cathy; Asiago, Vincent

    2014-02-12

    We recently applied gas chromatography coupled to time-of-flight mass spectrometry (GC/TOF-MS) and multivariate statistical analysis to measure biological variation of many metabolites due to environment and genotype in forage and grain samples collected from 50 genetically diverse nongenetically modified (non-GM) DuPont Pioneer commercial maize hybrids grown at six North American locations. In the present study, the metabolome coverage was extended using a core subset of these grain and forage samples employing ultra high pressure liquid chromatography (uHPLC) mass spectrometry (LC/MS). A total of 286 and 857 metabolites were detected in grain and forage samples, respectively, using LC/MS. Multivariate statistical analysis was utilized to compare and correlate the metabolite profiles. Environment had a greater effect on the metabolome than genetic background. The results of this study support and extend previously published insights into the environmental and genetic associated perturbations to the metabolome that are not associated with transgenic modification.

  12. Evidence for social learning in wild lemurs (Lemur catta).

    PubMed

    Kendal, Rachel L; Custance, Deborah M; Kendal, Jeremy R; Vale, Gillian; Stoinski, Tara S; Rakotomalala, Nirina Lalaina; Rasamimanana, Hantanirina

    2010-08-01

    Interest in social learning has been fueled by claims of culture in wild animals. These remain controversial because alternative explanations to social learning, such as asocial learning or ecological differences, remain difficult to refute. Compared with laboratory-based research, the study of social learning in natural contexts is in its infancy. Here, for the first time, we apply two new statistical methods, option-bias analysis and network-based diffusion analysis, to data from the wild, complemented by standard inferential statistics. Contrary to common thought regarding the cognitive abilities of prosimian primates, our evidence is consistent with social learning within subgroups in the ring-tailed lemur (Lemur catta), supporting the theory of directed social learning (Coussi-Korbel & Fragaszy, 1995). We also caution that, as the toolbox for capturing social learning in natural contexts grows, care is required in ensuring that the methods employed are appropriate, particularly regarding social dynamics among study subjects. Supplemental materials for this article may be downloaded from http://lb.psychonomic-journals.org/content/supplemental.

  13. Validation tools for image segmentation

    NASA Astrophysics Data System (ADS)

    Padfield, Dirk; Ross, James

    2009-02-01

    A large variety of image analysis tasks require the segmentation of various regions in an image. For example, segmentation is required to generate accurate models of brain pathology that are important components of modern diagnosis and therapy. While the manual delineation of such structures gives accurate information, the automatic segmentation of regions such as the brain and tumors from such images greatly enhances the speed and repeatability of quantifying such structures. The ubiquitous need for such algorithms has led to a wide range of image segmentation algorithms with various assumptions, parameters, and robustness. The evaluation of such algorithms is an important step in determining their effectiveness. Therefore, rather than developing new segmentation algorithms, we here describe validation methods for segmentation algorithms. Using similarity metrics comparing the automatic to manual segmentations, we demonstrate methods for optimizing the parameter settings for individual cases and across a collection of datasets using the Design of Experiment framework. We then employ statistical analysis methods to compare the effectiveness of various algorithms. We investigate several region-growing algorithms from the Insight Toolkit and compare their accuracy to that of a separate statistical segmentation algorithm. The segmentation algorithms are used with their optimized parameters to automatically segment the brain and tumor regions in MRI images of 10 patients. The validation tools indicate that none of the ITK algorithms studied is able to outperform the statistical segmentation algorithm with statistical significance, although they perform reasonably well considering their simplicity.
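
    Validation of the kind described above rests on overlap metrics between an automatic and a manual segmentation. Two standard ones, the Dice coefficient and the Jaccard index, are sketched below on toy binary masks (not the study's data).

    ```python
    # Dice and Jaccard overlap between an automatic and a manual binary mask.
    import numpy as np

    def dice(a, b):
        a, b = a.astype(bool), b.astype(bool)
        inter = np.logical_and(a, b).sum()
        return 2.0 * inter / (a.sum() + b.sum())

    def jaccard(a, b):
        a, b = a.astype(bool), b.astype(bool)
        inter = np.logical_and(a, b).sum()
        union = np.logical_or(a, b).sum()
        return inter / union

    manual = np.zeros((64, 64), dtype=bool); manual[20:40, 20:40] = True
    auto   = np.zeros((64, 64), dtype=bool); auto[22:42, 18:38] = True
    print("Dice:", dice(auto, manual), "Jaccard:", jaccard(auto, manual))
    ```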

  14. The nexus between geopolitical uncertainty and crude oil markets: An entropy-based wavelet analysis

    NASA Astrophysics Data System (ADS)

    Uddin, Gazi Salah; Bekiros, Stelios; Ahmed, Ali

    2018-04-01

    The global financial crisis and the subsequent geopolitical turbulence in energy markets have brought increased attention to proper statistical modeling, especially of the crude oil markets. In particular, we utilize a time-frequency decomposition approach based on wavelet analysis to explore the inherent dynamics and the causal interrelationships between various types of geopolitical, economic and financial uncertainty indices and oil markets. Via the introduction of a mixed discrete-continuous multiresolution analysis, we employ the entropic criterion for the selection of the optimal decomposition level of a MODWT as well as the continuous-time coherency and phase measures for the detection of business cycle (a)synchronization. Overall, a strong heterogeneity in the revealed interrelationships is detected over time and across scales.

  15. Looking for a Job While Employed. Issues in Labor Statistics. Summary 97-14.

    ERIC Educational Resources Information Center

    Bureau of Labor Statistics, Washington, DC.

    In February 1995, a supplement to the Current Population Survey examined job search among employed persons. A sample of 108,876 employed persons (excluding unpaid family workers) who had worked for their employer for at least 3 months was asked whether they had looked for other employment since December 1994. Of those surveyed, 6,044 (5.6%) had actively searched…

  16. Workplace Lactation Support in Milwaukee County 5 Years After the Affordable Care Act.

    PubMed

    Lennon, Tyler; Willis, Earnestine

    2017-02-01

    Workplace lactation support has become increasingly important because returning to work is associated with discontinuing breastfeeding and women in the workforce are increasing. Research aim: This study examined workplace lactation support among Milwaukee County businesses 5 years after implementation of the Affordable Care Act's Break Time for Nursing Mothers provision. A cross-sectional survey of Milwaukee County businesses was conducted in the summer of 2015 that inquired about workplace policies, lactation spaces, and other lactation resources offered. Business supports were stratified based on employer sizes: large (> 500 employees), medium (50-499 employees), and small (20-49 employees). A lactation amenity score was calculated for each business based on lactation resources available. Three hundred surveys were distributed and 71 businesses voluntarily completed the survey. Small employers were excluded from statistical analysis due to fewer responses ( n = 8). Overall, 87.3% ( n = 55) of respondents reported providing access to a multiuser space for lactation and 65.1% ( n = 41) reported providing a designated lactation space. Large employers ( n = 30) were more likely than medium employers ( n = 33) to provide a designated lactation space for breastfeeding or expressing (86.7% vs. 45.5%, p < .001). Large employers' mean amenity score was significantly higher than that of medium employers (3.37 vs. 2.57, p = .014), and they were also more likely to offer additional supports including access to a lactation consultant, classes, and materials (46.7% vs. 12.1%, p < .01). Large employers provide more lactation support than medium employers in Milwaukee County. All employers, regardless of size, need to increase additional lactation support for women in the workplace.

  17. US productivity slowdown: a case of statistical myopia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darby, M.R.

    1984-06-01

    The author argues that the productivity panic is based upon statistical myopia, and that a careful analysis within the perspective of the entire 20th century discloses no substantial variation in what is described as growth in total factor productivity or technical progress. He finds no substantial variations in trend growth rates of private labor productivity since 1900 if reasonable adjustments are made for the effects of demographic trends on the average quality of labor. Even if one were to ignore the effects of demographic shifts, the measured growth rates of productivity, total private hours, and private employment have essentially the same values in 1956-79 as for 1900-29. Some of the primary data base for the paper appears in the appendix. 39 references, 3 figures, 9 tables.

  18. Transitions between superstatistical regimes: Validity, breakdown and applications

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan; Lavička, Hynek; Prokš, Martin; Svoboda, Václav; Beck, Christian

    2018-03-01

    Superstatistics is a widely employed tool of non-equilibrium statistical physics which plays an important rôle in the analysis of hierarchical complex dynamical systems. Yet, its "canonical" formulation in terms of a single nuisance parameter is often too restrictive when applied to complex empirical data. Here we show that a multi-scale generalization of the superstatistics paradigm is more versatile, allowing us to address such pertinent issues as the transmutation of statistics or inter-scale stochastic behavior. To put some flesh on the bare bones, we provide numerical evidence for a transition between two superstatistics regimes by analyzing high-frequency (minute-tick) data for the share-price returns of seven selected companies. Salient issues, such as the breakdown of superstatistics in fractional diffusion processes or the connection with Brownian subordination, are also briefly discussed.

  19. Intention to leave the profession: antecedents and role in nurse turnover.

    PubMed

    Parry, Julianne

    2008-10-01

    This paper is a report of a study to examine the relationship between intention to change profession and intention to change employer among newly graduated nurses. Few studies of the worldwide nursing workforce shortage consider the contribution of changing professions to the shortage. Organizational behaviour research has identified that professional commitment and organizational commitment play an important role in organizational turnover, and that professional commitment and intention to change professions may have a greater role in organizational turnover than is presently understood. A model of the relationships between affective professional commitment, job satisfaction, organizational commitment, intention to change professions and organizational turnover intention was developed through a review of the organizational behaviour literature and tested using path analysis. The sample was drawn from all nurses in Queensland, Australia, entering the workforce for the first time in 2005. The model was tested with a final sample size of 131 nurses in the initial period of exposure to the workplace. Affective professional commitment and organizational commitment were statistically significantly related to intention to change professions. Job satisfaction, organizational commitment and intention to change professions were statistically significantly related to intention to change employer. Turnover research in nursing should include intention to change professions as well as intention to change employer. Policies and practices that enhance the development of affective professional commitment prior to exposure to the workplace, and that support affective professional commitment, job satisfaction and organizational commitment in the workplace, are needed to help reduce nurse turnover.

  20. Working 9-5: Causal Relationships Between Singers' "Day Jobs" and Their Performance Work, With Implications for Vocal Health.

    PubMed

    Bartlett, Irene; Wilson, Pat H

    2017-03-01

    It is generally acknowledged that professional contemporary commercial music (CCM) singers engage in supplementary employment ("the day job") to achieve and maintain a reliable living wage. In this paper, consideration is given to the impact of such nonperformance employment on CCM singers' sustainable vocal health. Collected data from a survey of 102 professional contemporary gig singers were analysed using descriptive statistical procedures from the Statistical Package for the Social Sciences. Although these data provided descriptions of the personal characteristics of individuals in the sample, the inclusion of open-format questions encouraged participants to report details of their "lived" experience. Additionally, a meta-analysis of a range of associated literature was undertaken. Sixty-five of the 102 participants reported that, in addition to their heavy performance voice use, they were employed in "other" work (the "day job") where their speaking voice loads were high. In responding to open-ended questions, many proffered unprompted written comments. The collected data from this element of the research study are reported here. We propose that at least some causal factors of singers' reported voice problems may lie in the misuse or overuse of their everyday speaking voice (as demanded by their "day job") rather than a misuse of their singing voice. These findings have practical application to all whose concern is care for the vocal or emotional health and performance longevity of professional singers. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  1. A Critique of Divorce Statistics and Their Interpretation.

    ERIC Educational Resources Information Center

    Crosby, John F.

    1980-01-01

    Increasingly, appeals to the divorce statistic are employed to substantiate claims that the family is in a state of breakdown and marriage is passe. This article contains a consideration of reasons why the divorce statistics are invalid and/or unreliable as indicators of the present state of marriage and family. (Author)

  2. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
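
    Of the SPC tools listed above, the control chart is the one most directly tied to data-based decisions. A minimal sketch of an individuals (X) chart follows: the centerline is the mean, the control limits sit three estimated standard deviations away (estimated from the average moving range), and points beyond the limits are flagged. The counts are toy behavioral data, not from the article.

    ```python
    # Individuals (X) control chart with 3-sigma limits from the average moving range.
    import numpy as np

    counts = np.array([7, 5, 6, 8, 6, 7, 5, 6, 14, 6, 7, 5])   # e.g. daily incident counts
    center = counts.mean()
    mr_bar = np.mean(np.abs(np.diff(counts)))                   # average moving range
    sigma_hat = mr_bar / 1.128                                  # d2 constant for n = 2
    ucl, lcl = center + 3 * sigma_hat, max(center - 3 * sigma_hat, 0)

    print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
    print("out-of-control points at indices:", np.where((counts > ucl) | (counts < lcl))[0])
    ```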

  3. Handbook of Labor Statistics. Bulletin 2175.

    ERIC Educational Resources Information Center

    Springsteen, Rosalind, Comp.; Epstein, Rosalie, Comp.

    This publication makes available in one volume the major series produced by the Bureau of Labor Statistics. Technical notes preceding each major section contain information on data changes and explain the series. Forty-four tables derived from the Current Population Survey (CPS) provide statistics on labor force and employment status,…

  4. 76 FR 34385 - Program Integrity: Gainful Employment-Debt Measures

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-13

    ... postsecondary education at a public institution. National Center for Education Statistics, 2004/2009 Beginning... reliable earnings information, including use of State data, survey data, or Bureau of Labor Statistics (BLS...

  5. Pesticides and public health: an analysis of the regulatory approach to assessing the carcinogenicity of glyphosate in the European Union.

    PubMed

    Clausing, Peter; Robinson, Claire; Burtscher-Schaden, Helmut

    2018-03-13

    The present paper scrutinises the European authorities' assessment of the carcinogenic hazard posed by glyphosate based on Regulation (EC) 1272/2008. We use the authorities' own criteria as a benchmark to analyse their weight of evidence (WoE) approach. Therefore, our analysis goes beyond the comparison of the assessments made by the European Food Safety Authority and the International Agency for Research on Cancer published by others. We show that not classifying glyphosate as a carcinogen by the European authorities, including the European Chemicals Agency, appears to be not consistent with, and in some instances, a direct violation of the applicable guidance and guideline documents. In particular, we criticise an arbitrary attenuation by the authorities of the power of statistical analyses; their disregard of existing dose-response relationships; their unjustified claim that the doses used in the mouse carcinogenicity studies were too high and their contention that the carcinogenic effects were not reproducible by focusing on quantitative and neglecting qualitative reproducibility. Further aspects incorrectly used were historical control data, multisite responses and progression of lesions to malignancy. Contrary to the authorities' evaluations, proper application of statistical methods and WoE criteria inevitably leads to the conclusion that glyphosate is 'probably carcinogenic' (corresponding to category 1B in the European Union). © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  6. Effect of grinding with diamond-disc and -bur on the mechanical behavior of a Y-TZP ceramic.

    PubMed

    Pereira, G K R; Amaral, M; Simoneti, R; Rocha, G C; Cesar, P F; Valandro, L F

    2014-09-01

    This study compared the effects of grinding on the surface micromorphology, phase transformation (t→m), biaxial flexural strength and structural reliability (Weibull analysis) of a Y-TZP (Lava) ceramic using diamond-discs and -burs. 170 discs (15×1.2mm) were produced and divided into 5 groups: without treatment (Ctrl, as-sintered), and ground with 4 different systems: extra-fine (25µm, Xfine) and coarse diamond-bur (181µm, Coarse), 600-grit (25µm, D600) and 120-grit diamond-disc (160µm, D120). Grinding with burs was performed using a contra-angle handpiece (T2-Revo R170, Sirona), while for discs (Allied) a Polishing Machine (Ecomet, Buehler) was employed, both under water-cooling. Micromorphological analysis showed distinct patterns generated by grinding with discs and burs, independent of grit size. There was no statistical difference for characteristic strength values (MPa) between smaller grit sizes (D600 - 1050.08 and Xfine - 1171.33), although they presented higher values compared to Ctrl (917.58). For bigger grit sizes, a significant difference was observed (Coarse - 1136.32>D120 - 727.47). Weibull Modules were statistically similar between the tested groups. Within the limits of this study, from a micromorphological point-of-view, the treatments performed did not generate similar effects, so from a methodological point-of-view, diamond-discs should not be employed to simulate clinical abrasion performed with diamond-burs on Y-TZP ceramics. Copyright © 2014 Elsevier Ltd. All rights reserved.
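
    The Weibull quantities reported above (characteristic strength and Weibull modulus) are commonly estimated by linearizing the Weibull CDF: plotting ln(-ln(1-F)) against ln(strength) gives a line whose slope is the modulus m and whose intercept yields the characteristic strength σ0. A sketch follows; the strength values are placeholders, not the Y-TZP measurements.

    ```python
    # Weibull modulus m and characteristic strength sigma_0 from flexural strengths.
    import numpy as np

    strength = np.sort(np.array([870, 910, 945, 980, 1005,
                                 1030, 1060, 1100, 1150, 1210.0]))  # MPa, placeholders
    n = strength.size
    F = (np.arange(1, n + 1) - 0.5) / n          # simple (i - 0.5)/n probability estimator

    x = np.log(strength)
    y = np.log(-np.log(1.0 - F))                 # ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma_0)
    m, c = np.polyfit(x, y, 1)                   # slope = Weibull modulus
    sigma_0 = np.exp(-c / m)                     # strength at 63.2% failure probability

    print(f"Weibull modulus m = {m:.2f}, characteristic strength = {sigma_0:.0f} MPa")
    ```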

  7. Cancer mortality in the British rubber industry: 1946-80.

    PubMed Central

    Sorahan, T; Parkes, H G; Veys, C A; Waterhouse, J A

    1986-01-01

    The mortality experienced by a cohort of 36445 rubber workers during 1946-80 has been investigated. These workers were all male operatives first employed in any one of the 13 participating factories in 1946-60; all had worked continuously in the industry for a minimum period of one year. Compared with the general population, statistically significant excesses relating to cancer mortality were found for cancer of the stomach (E = 245.9, O = 282, SMR = 115), primary cancer of the liver (E = 12.8, O = 22, SMR = 172), cancer of the lung (E = 892.7, O = 1191, SMR = 133), and all neoplasms (E = 2165.2, O = 2487, SMR = 115). Statistically significant deficits were found for cancer of the prostate (E = 79.7, O = 59, SMR = 74) and cancer of the testis (E = 10.3, O = 4, SMR = 39). The method of regression models in life tables (RMLT) was used to compare the duration of employment in the industry, the duration in "dust exposed" jobs, and the duration in "fume and/or solvent exposed" jobs of those dying from causes of interest with those of all matching survivors. Significant positive associations were found only for cancer of the stomach and cancer of the lung. The results of the RMLT analysis are independent of those from the SMR analysis, and the study has provided further evidence of a causal association between the risks of lung and stomach cancer and certain occupational exposures in the rubber industry. PMID:3718880
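
    The SMRs quoted above are the ratio of observed to expected deaths, scaled by 100. A short sketch using the lung-cancer figures from the abstract (O = 1191, E = 892.7) is shown below; the exact Poisson confidence interval via the chi-square formulation is a standard companion calculation, not one reported in the abstract.

    ```python
    # SMR = 100 * O/E with an exact Poisson 95% CI for the observed count.
    from scipy import stats

    O, E = 1191, 892.7                    # lung cancer figures quoted above
    smr = 100.0 * O / E

    alpha = 0.05
    lower_O = stats.chi2.ppf(alpha / 2, 2 * O) / 2
    upper_O = stats.chi2.ppf(1 - alpha / 2, 2 * (O + 1)) / 2
    ci = (100.0 * lower_O / E, 100.0 * upper_O / E)

    print(f"SMR = {smr:.0f}, 95% CI = ({ci[0]:.0f}, {ci[1]:.0f})")
    ```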

  8. The impacts of renewable energy policies on renewable energy sources for electricity generating capacity

    NASA Astrophysics Data System (ADS)

    Koo, Bryan Bonsuk

    Electricity generation from non-hydro renewable sources has increased rapidly in the last decade. For example, Renewable Energy Sources for Electricity (RES-E) generating capacity in the U.S. almost doubled over the three years from 2009 to 2012. Multiple papers point out that RES-E policies implemented by state governments play a crucial role in increasing RES-E generation or capacity. This study examines the effects of state RES-E policies on state RES-E generating capacity, using a fixed effects model. The research employs panel data from the 50 states and the District of Columbia for the period 1990 to 2011, and uses a two-stage approach to control for the endogeneity embedded in the policies adopted by state governments, and a Prais-Winsten estimator to correct for any autocorrelation in the panel data. The analysis finds that Renewable Portfolio Standards (RPS) and Net-metering are significantly and positively associated with RES-E generating capacity, but neither Public Benefit Funds nor the Mandatory Green Power Option has a statistically significant relation to RES-E generating capacity. Results of the two-stage model are quite different from models which do not employ predicted policy variables. Analysis using non-predicted variables finds that the RPS and Net-metering policies are statistically insignificant and negatively associated with RES-E generating capacity. On the other hand, the Green Energy Purchasing policy is insignificant in the two-stage model, but significant in the model without predicted values.

  9. Body Image of Women Submitted to Breast Cancer Treatment

    PubMed

    Guedes, Thais Sousa Rodrigues; Dantas de Oliveira, Nayara Priscila; Holanda, Ayrton Martins; Reis, Mariane Albuquerque; Silva, Clécia Patrocínio da; Rocha e Silva, Bárbara Layse; Cancela, Marianna de Camargo; de Souza, Dyego Leandro Bezerra

    2018-06-25

    Background: The study of body image includes the perception of women regarding the physical appearance of their own body. The objective of the present study was to verify the prevalence of body image dissatisfaction and its associated factors in women submitted to breast cancer treatment. Methods: A cross-sectional study carried out with 103 female residents of the municipality of Natal (Northeast Brazil), diagnosed with breast cancer who had undergone cancer treatment for at least 12 months prior to the study, and remained under clinical monitoring. The variable body image was measured through the validated Body Image Scale (BIS). Socioeconomic variables and clinical history were also collected through an individual interview with each participant. The Pearson’s chi-squared test (Fisher’s Exact) was utilized for bivariate analysis, calculating the prevalence ratio with 95% confidence interval. Poisson regression with robust variance was utilized for multivariate analysis. The statistical significance considered was 0.05. Results: The prevalence of body image dissatisfaction was 74.8% CI (65%-82%). Statistically significant associations were observed between body image and multi-professional follow-up (p=0.009) and return to employment after treatment (p=0.022). Conclusion: It was concluded that women who reported employment after cancer treatment presented more alterations in self-perception concerning their appearance. Patients who did not receive multi-professional follow-up reported negative body image, evidencing the need for strategies that increase and improve healthcare, aiming to meet the demands of this population. Creative Commons Attribution License
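
    Poisson regression with robust variance, as used above, is the standard way to estimate prevalence ratios for a binary outcome in cross-sectional data (the "modified Poisson" approach). A minimal sketch with statsmodels follows; the data are synthetic and the variable names are illustrative, not the study's variables.

    ```python
    # Modified Poisson regression with robust (sandwich) variance for prevalence ratios.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 103
    df = pd.DataFrame({
        "returned_to_work":    rng.integers(0, 2, n),
        "multiprof_followup":  rng.integers(0, 2, n),
    })
    p = 1 / (1 + np.exp(-(0.4 + 0.5 * df.returned_to_work - 0.6 * df.multiprof_followup)))
    df["dissatisfied"] = (rng.random(n) < p).astype(int)

    model = smf.glm("dissatisfied ~ returned_to_work + multiprof_followup",
                    data=df, family=sm.families.Poisson())
    fit = model.fit(cov_type="HC0")               # robust variance
    print(np.exp(fit.params))                     # prevalence ratios
    print(np.exp(fit.conf_int()))                 # 95% CIs
    ```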

  10. Overworked? An Observation of the Relationship between Student Employment and Academic Performance

    ERIC Educational Resources Information Center

    Logan, Jennifer; Hughes, Traci; Logan, Brian

    2016-01-01

    Current observations from the National Center for Education Statistics demonstrate the dramatic increase in college student employment over the past few decades. Not only are more students employed than in previous decades, students are working more hours. This could lead to declines in academic performance as hours worked increase, resulting in…

  11. Employment and Unemployment in 1976. Special Labor Force Report 199.

    ERIC Educational Resources Information Center

    Bednarzik, Robert W.; St. Marie, Stephen M.

    Changes in employment and unemployment in 1976, presented through the use of statistical data in tabular and chart forms, are the focus of this report. Protection for the unemployed, labor force trends, and persons of Spanish origin are also discussed under separate minor headings. Under the section on employment, the following subsections are…

  12. Sex Discrimination in Employment. Research Report No. 171.

    ERIC Educational Resources Information Center

    Morris, J. David; Wood, Linda B.

    This report examines the status of women and the laws that have been enacted to protect women from discrimination in employment. Written in lay language, it examines employment and occupational statistics for women in the United States and in Kentucky. Following an introduction in Chapter 1, the report presents four chapters surveying the problem,…

  13. The 1988-89 Job Outlook in Brief.

    ERIC Educational Resources Information Center

    White, Martha C.

    1988-01-01

    This article summarizes the employment outlook in 225 occupations as projected by the Bureau of Labor Statistics. It provides thumbnail sketches of employment data for each of the occupations in the 1988-89 "Occupational Outlook Handbook," on which it is based. Each entry presents the occupation's title, 1986 employment numbers, the percent change…

  14. Employers and Child Care: What Roles Do They Play?

    ERIC Educational Resources Information Center

    Hayghe, Howard V.

    1988-01-01

    The Bureau of Labor Statistics conducted a nationwide survey of approximately 10,000 businesses and government agencies in 1987. Results show that about 2 percent of employers sponsored day-care centers and 3 percent provide financial assistance toward expenses. However, employers are doing other things to aid employees with growing children. (JOW)

  15. 20 CFR 656.40 - Determination of prevailing wage for labor certification purposes.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Occupational Employment Statistics Survey shall be used to determine the arithmetic mean, unless the employer provides an acceptable survey under paragraph (g) of this section. (3) If the employer provides a survey... education and research entities. In computing the prevailing wage for a job opportunity in an occupational...

  16. 20 CFR 656.40 - Determination of prevailing wage for labor certification purposes.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Occupational Employment Statistics Survey shall be used to determine the arithmetic mean, unless the employer provides an acceptable survey under paragraph (g) of this section. (3) If the employer provides a survey... education and research entities. In computing the prevailing wage for a job opportunity in an occupational...

  17. 20 CFR 656.40 - Determination of prevailing wage for labor certification purposes.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Occupational Employment Statistics Survey shall be used to determine the arithmetic mean, unless the employer provides an acceptable survey under paragraph (g) of this section. (3) If the employer provides a survey... education and research entities. In computing the prevailing wage for a job opportunity in an occupational...

  18. 20 CFR 656.40 - Determination of prevailing wage for labor certification purposes.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Occupational Employment Statistics Survey shall be used to determine the arithmetic mean, unless the employer provides an acceptable survey under paragraph (g) of this section. (3) If the employer provides a survey... education and research entities. In computing the prevailing wage for a job opportunity in an occupational...

  19. 20 CFR 656.40 - Determination of prevailing wage for labor certification purposes.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Occupational Employment Statistics Survey shall be used to determine the arithmetic mean, unless the employer provides an acceptable survey under paragraph (g) of this section. (3) If the employer provides a survey... education and research entities. In computing the prevailing wage for a job opportunity in an occupational...

  20. Quality of life in breast cancer patients--a quantile regression analysis.

    PubMed

    Pourhoseingholi, Mohamad Amin; Safaee, Azadeh; Moghimi-Dehkordi, Bijan; Zeighami, Bahram; Faghihzadeh, Soghrat; Tabatabaee, Hamid Reza; Pourhoseingholi, Asma

    2008-01-01

    Quality of life studies have an important role in health care, especially in chronic diseases, in clinical judgment and in the supply of medical resources. Statistical tools like linear regression are widely used to assess the predictors of quality of life, but when the response is not normal the results can be misleading. The aim of this study is to determine the predictors of quality of life in breast cancer patients using a quantile regression model and to compare the results to linear regression. A cross-sectional study was conducted on 119 breast cancer patients admitted and treated in the chemotherapy ward of Namazi hospital in Shiraz. We used the QLQ-C30 questionnaire to assess quality of life in these patients. A quantile regression was employed to assess the associated factors and the results were compared to linear regression. All analyses were carried out using SAS. The mean score for global health status for breast cancer patients was 64.92 +/- 11.42. Linear regression showed that only grade of tumor, occupational status, menopausal status, financial difficulties and dyspnea were statistically significant. In contrast to linear regression, financial difficulties were not significant in the quantile regression analysis and dyspnea was significant only for the first quartile. Also, emotional functioning and duration of disease statistically predicted the QOL score in the third quartile. The results demonstrate that using quantile regression leads to better interpretation and richer inference about the predictors of quality of life in breast cancer patients.
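
    The comparison described above can be sketched in a few lines: fit an ordinary least-squares model and quantile regressions at the first and third quartiles, then compare coefficients. Although the study used SAS, the sketch below uses Python's statsmodels; the data are synthetic and the variable names are illustrative.

    ```python
    # OLS vs quantile regression at the 25th and 75th percentiles (synthetic data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    n = 119
    df = pd.DataFrame({
        "financial_difficulty": rng.integers(0, 2, n),
        "dyspnea":              rng.integers(0, 2, n),
    })
    df["qol"] = (65 - 6 * df.financial_difficulty - 4 * df.dyspnea
                 + rng.standard_t(3, size=n) * 8)        # heavy-tailed (non-normal) noise

    ols = smf.ols("qol ~ financial_difficulty + dyspnea", data=df).fit()
    q25 = smf.quantreg("qol ~ financial_difficulty + dyspnea", data=df).fit(q=0.25)
    q75 = smf.quantreg("qol ~ financial_difficulty + dyspnea", data=df).fit(q=0.75)

    print(ols.params, q25.params, q75.params, sep="\n")
    ```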

  1. iTTVis: Interactive Visualization of Table Tennis Data.

    PubMed

    Wu, Yingcai; Lan, Ji; Shu, Xinhuan; Ji, Chenyang; Zhao, Kejian; Wang, Jiachen; Zhang, Hui

    2018-01-01

    The rapid development of information technology paved the way for the recording of fine-grained data, such as stroke techniques and stroke placements, during a table tennis match. This data recording creates opportunities to analyze and evaluate matches from new perspectives. Nevertheless, the increasingly complex data poses a significant challenge to make sense of and gain insights into. Analysts usually employ tedious and cumbersome methods which are limited to watching videos and reading statistical tables. However, existing sports visualization methods cannot be applied to visualizing table tennis competitions due to different competition rules and particular data attributes. In this work, we collaborate with data analysts to understand and characterize the sophisticated domain problem of analysis of table tennis data. We propose iTTVis, a novel interactive table tennis visualization system, which to our knowledge, is the first visual analysis system for analyzing and exploring table tennis data. iTTVis provides a holistic visualization of an entire match from three main perspectives, namely, time-oriented, statistical, and tactical analyses. The proposed system with several well-coordinated views not only supports correlation identification through statistics and pattern detection of tactics with a score timeline but also allows cross analysis to gain insights. Data analysts have obtained several new insights by using iTTVis. The effectiveness and usability of the proposed system are demonstrated with four case studies.

  2. Quantitative comparison of tympanic membrane displacements using two optical methods to recover the optical phase

    NASA Astrophysics Data System (ADS)

    Santiago-Lona, Cynthia V.; Hernández-Montes, María del Socorro; Mendoza-Santoyo, Fernando; Esquivel-Tejeda, Jesús

    2018-02-01

    The study and quantification of the tympanic membrane (TM) displacements add important information to advance the knowledge about the hearing process. A comparative statistical analysis between two commonly used demodulation methods employed to recover the optical phase in digital holographic interferometry, namely the fast Fourier transform and phase-shifting interferometry, is presented as applied to study thin tissues such as the TM. The resulting experimental TM surface displacement data are used to contrast both methods through the analysis of variance and F tests. Data are gathered when the TMs are excited with continuous sound stimuli at levels 86, 89 and 93 dB SPL for the frequencies of 800, 1300 and 2500 Hz under the same experimental conditions. The statistical analysis shows repeatability in z-direction displacements with a standard deviation of 0.086, 0.098 and 0.080 μm using the Fourier method, and 0.080, 0.104 and 0.055 μm with the phase-shifting method at a 95% confidence level for all frequencies. The precision and accuracy are evaluated by means of the coefficient of variation; the results with the Fourier method are 0.06143, 0.06125, 0.06154 and 0.06154, 0.06118, 0.06111 with phase-shifting. The relative error between both methods is 7.143, 6.250 and 30.769%. On comparing the measured displacements, the results indicate that there is no statistically significant difference between both methods for frequencies at 800 and 1300 Hz; however, errors and other statistics increase at 2500 Hz.
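
    The kind of comparison reported above (analysis of variance and F tests between the two phase-recovery methods, plus coefficients of variation and relative error) can be sketched as follows. The displacement values are placeholders in micrometres, not the measured TM data.

    ```python
    # One-way ANOVA (F test) between two methods, coefficient of variation, relative error.
    import numpy as np
    from scipy import stats

    fourier     = np.array([1.40, 1.31, 1.47, 1.38, 1.44, 1.35])   # placeholder, um
    phase_shift = np.array([1.36, 1.42, 1.39, 1.33, 1.45, 1.41])   # placeholder, um

    F, p = stats.f_oneway(fourier, phase_shift)
    print(f"one-way ANOVA: F = {F:.3f}, p = {p:.3f}")

    cv = lambda x: x.std(ddof=1) / x.mean()                        # coefficient of variation
    print("CV (Fourier, phase-shifting):", cv(fourier), cv(phase_shift))
    print("relative error (%):",
          100 * abs(fourier.mean() - phase_shift.mean()) / phase_shift.mean())
    ```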

  3. Statistical link between external climate forcings and modes of ocean variability

    NASA Astrophysics Data System (ADS)

    Malik, Abdul; Brönnimann, Stefan; Perona, Paolo

    2017-07-01

    In this study we investigate statistical link between external climate forcings and modes of ocean variability on inter-annual (3-year) to centennial (100-year) timescales using de-trended semi-partial-cross-correlation analysis technique. To investigate this link we employ observations (AD 1854-1999), climate proxies (AD 1600-1999), and coupled Atmosphere-Ocean-Chemistry Climate Model simulations with SOCOL-MPIOM (AD 1600-1999). We find robust statistical evidence that Atlantic multi-decadal oscillation (AMO) has intrinsic positive correlation with solar activity in all datasets employed. The strength of the relationship between AMO and solar activity is modulated by volcanic eruptions and complex interaction among modes of ocean variability. The observational dataset reveals that El Niño southern oscillation (ENSO) has statistically significant negative intrinsic correlation with solar activity on decadal to multi-decadal timescales (16-27-year) whereas there is no evidence of a link on a typical ENSO timescale (2-7-year). In the observational dataset, the volcanic eruptions do not have a link with AMO on a typical AMO timescale (55-80-year) however the long-term datasets (proxies and SOCOL-MPIOM output) show that volcanic eruptions have intrinsic negative correlation with AMO on inter-annual to multi-decadal timescales. The Pacific decadal oscillation has no link with solar activity, however, it has positive intrinsic correlation with volcanic eruptions on multi-decadal timescales (47-54-year) in reconstruction and decadal to multi-decadal timescales (16-32-year) in climate model simulations. We also find evidence of a link between volcanic eruptions and ENSO, however, the sign of relationship is not consistent between observations/proxies and climate model simulations.

  4. The US healthcare workforce and the labor market effect on healthcare spending and health outcomes.

    PubMed

    Pellegrini, Lawrence C; Rodriguez-Monguio, Rosa; Qian, Jing

    2014-06-01

    The healthcare sector was one of the few sectors of the US economy that created new positions in spite of the recent economic downturn. Economic contractions are associated with worsening morbidity and mortality, declining private health insurance coverage, and budgetary pressure on public health programs. This study examines the causes of healthcare employment growth and workforce composition in the US and evaluates the labor market's impact on healthcare spending and health outcomes. Data are collected for 50 states and the District of Columbia from 1999-2009. Labor market and healthcare workforce data are obtained from the Bureau of Labor Statistics. Mortality and health status data are collected from the Centers for Disease Control and Prevention's Vital Statistics program and Behavioral Risk Factor Surveillance System. Healthcare spending data are derived from the Centers for Medicare and Medicaid Services. Dynamic panel data regression models, with instrumental variables, are used to examine the effect of the labor market on healthcare spending, morbidity, and mortality. Regression analysis is also performed to model the effects of healthcare spending on the healthcare workforce composition. All statistical tests are two-sided, with a significance level of .05. Analyses are performed with STATA and SAS. The labor force participation rate shows a more robust effect on healthcare spending, morbidity, and mortality than the unemployment rate. Study results also show that declining labor force participation negatively impacts overall health status (p < .01), and mortality for males (p < .05) and females (p < .001), aged 16-64. Further, the Medicaid and Medicare spending share increases as labor force participation declines (p < .001), whereas the private healthcare spending share decreases (p < .001). Public and private healthcare spending also has a differing effect on healthcare occupational employment per 100,000 people. Private healthcare spending positively impacts primary care physician employment (p < .001), whereas Medicare spending drives up employment of physician assistants, registered nurses, and personal care attendants (p < .001). Medicaid and Medicare spending has a negative effect on surgeon employment (p < .05); the effect of private healthcare spending is positive but not statistically significant. Labor force participation, as opposed to unemployment, is a better proxy for measuring the effect of the economic environment on healthcare spending and health outcomes. Further, during economic contractions, Medicaid and Medicare's share of overall healthcare spending increases, with meaningful effects on the configuration of state healthcare workforces and, subsequently, the provision of care for populations at risk for worsening morbidity and mortality.

  5. Longitudinal statistics on work activity and use of employment supports for new Social Security Disability Insurance beneficiaries.

    PubMed

    Liu, Su; Stapleton, David C

    2011-01-01

    We present longitudinal employment and work-incentive statistics for individuals who began receiving Social Security Disability Insurance (DI) benefits from 1996 through 2006. For the longest-observed cohort, 28 percent returned to work, 6.5 percent had their benefits suspended for work in at least 1 month, and 3.7 percent had their benefits terminated for work. The corresponding percentages are much higher for those who were younger than age 40 when they entered the DI program. Most first suspensions occurred within 5 years after entry. Cross-state variation in outcomes is high, and, to the extent observed, statistics for more recent cohorts are lower.

  6. Diffuse optical spectroscopy monitoring of oxygen state and hemoglobin concentration during SKBR-3 tumor model growth

    NASA Astrophysics Data System (ADS)

    Orlova, A. G.; Kirillin, M. Yu; Volovetsky, A. B.; Shilyagina, N. Yu; Sergeeva, E. A.; Golubiatnikov, G. Yu; Turchin, I. V.

    2017-01-01

    Tumor oxygenation and hemoglobin content are key indicators of tumor status which can be efficiently employed for prognosis of tumor development and choice of treatment strategy. We report on monitoring of these parameters in SKBR-3 (human breast adenocarcinoma) tumors established as subcutaneous tumor xenografts in athymic nude mice by diffuse optical spectroscopy (DOS). A simple continuous wave fiber probe DOS system is employed. The optical properties extraction approach is based on the diffusion approximation. Statistically significant differences between measured values for normal tissue and tumor are demonstrated. Hemoglobin content in the tumor increases from 7.0 ± 4.2 μM to 30.1 ± 16.1 μM with tumor growth from 150 ± 80 mm3 to 1300 ± 650 mm3, which is determined by a gradual increase of deoxyhemoglobin content, while the measured oxyhemoglobin content does not demonstrate any statistically significant variations. Oxygenation in the tumor falls quickly from 52.8 ± 24.7% to 20.2 ± 4.8%, preceding acceleration of tumor growth. Statistical analysis indicated dependence of oxy-, deoxy- and total hemoglobin on tumor volume (p < 0.01). DOS measurements of oxygen saturation are in agreement with independent measurements of oxygen partial pressure by polarography (Pearson's correlation coefficient equals 0.8).
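
    Recovering oxy- and deoxyhemoglobin concentrations from wavelength-dependent absorption is, in essence, a small linear least-squares problem. The sketch below assumes the absorption coefficients have already been extracted with a diffusion model; the extinction coefficients shown are placeholders rather than tabulated values.

    ```python
    import numpy as np

    # Placeholder molar extinction coefficients [1/(cm*M)] at the probe wavelengths
    # (rows: wavelengths, columns: HbO2, Hb) -- real values come from tables.
    E = np.array([[1000.0, 3000.0],
                  [2500.0, 1800.0],
                  [3200.0,  900.0]])

    # Absorption coefficients mu_a [1/cm] recovered at the same wavelengths
    mu_a = np.array([0.10, 0.12, 0.09])

    # Solve E @ [HbO2, Hb] = mu_a in the least-squares sense
    conc, *_ = np.linalg.lstsq(E, mu_a, rcond=None)
    hbo2, hb = conc
    saturation = 100.0 * hbo2 / (hbo2 + hb)
    print(f"HbO2={hbo2:.2e} M, Hb={hb:.2e} M, StO2={saturation:.1f}%")
    ```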

  7. Regulatory considerations in the design of comparative observational studies using propensity scores.

    PubMed

    Yue, Lilly Q

    2012-01-01

    In the evaluation of medical products, including drugs, biological products, and medical devices, comparative observational studies could play an important role when properly conducted randomized, well-controlled clinical trials are infeasible due to ethical or practical reasons. However, various biases could be introduced at every stage and into every aspect of an observational study, and consequently the interpretation of the resulting statistical inference would be of concern. While there do exist statistical techniques for addressing some of the challenging issues, often based on propensity score methodology, these statistical tools probably have not been as widely employed in prospectively designing observational studies as they should be. There are also times when they are implemented in an unscientific manner, such as performing propensity score model selection on a dataset that includes the outcome data to be analyzed, so that the integrity of observational study design and the interpretability of outcome analysis results could be compromised. In this paper, regulatory considerations on prospective study design using propensity scores are shared and illustrated with hypothetical examples.
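
    A hedged sketch of the basic propensity score workflow: fit the propensity model on covariates and treatment labels only (no outcome data, in keeping with the design principle above), then match treated units to controls. The covariates, labels, and the greedy 1:1 matcher are illustrative simplifications.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def estimate_propensity(covariates, treatment):
        """Fit the propensity model on covariates and treatment only."""
        model = LogisticRegression(max_iter=1000).fit(covariates, treatment)
        return model.predict_proba(covariates)[:, 1]

    def nearest_neighbor_match(ps, treatment):
        """Greedy 1:1 matching of each treated unit to the closest control."""
        treated = np.where(treatment == 1)[0]
        controls = list(np.where(treatment == 0)[0])
        pairs = []
        for t in treated:
            if not controls:
                break
            j = min(controls, key=lambda c: abs(ps[c] - ps[t]))
            pairs.append((t, j))
            controls.remove(j)
        return pairs

    # Illustrative synthetic covariates and treatment assignment
    rng = np.random.default_rng(2)
    X = rng.standard_normal((200, 3))
    treat = (X[:, 0] + rng.standard_normal(200) > 0).astype(int)
    ps = estimate_propensity(X, treat)
    print(nearest_neighbor_match(ps, treat)[:5])
    ```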

  8. A procedure for combining acoustically induced and mechanically induced loads (first passage failure design criterion)

    NASA Technical Reports Server (NTRS)

    Crowe, D. R.; Henricks, W.

    1983-01-01

    The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.
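
    A toy illustration of one ingredient of that combination: treating the acoustically induced load as a zero-mean stationary Gaussian process whose variance is the area under its PSD, and asking how likely the combined load is to stay at or below a design level while a deterministic mechanical transient is applied. This is a simplified pointwise sketch with made-up numbers, not the paper's First Passage formulation.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Placeholder one-sided PSD of the acoustic load response [load^2 / Hz]
    freq = np.linspace(1.0, 200.0, 400)
    psd = 1e-2 / (1.0 + (freq / 50.0) ** 2)

    # Variance of the stationary acoustic load = area under the PSD
    sigma = np.sqrt(np.trapz(psd, freq))

    # Deterministic mechanical transient (illustrative half-sine pulse)
    t = np.linspace(0.0, 1.0, 200)
    mech = 5.0 * np.sin(np.pi * t)

    # Pointwise probability that the combined load is at or below the design
    # level at each instant (ignores crossing statistics over time)
    design_level = 8.0
    p_below = norm.cdf((design_level - mech) / sigma)
    print(f"worst-case instantaneous P(load <= level): {p_below.min():.4f}")
    ```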

  9. Analysis of the procedures used to evaluate suicide crime scenes in Brazil: a statistical approach to interpret reports.

    PubMed

    Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira

    2014-08-01

    This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the detected traces (blood, instruments and clothes) that were found, and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution as to how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  10. Time-Frequency Cross Mutual Information Analysis of the Brain Functional Networks Underlying Multiclass Motor Imagery.

    PubMed

    Gong, Anmin; Liu, Jianping; Chen, Si; Fu, Yunfa

    2018-01-01

    To study the physiologic mechanism of the brain during different motor imagery (MI) tasks, the authors employed a method of brain-network modeling based on time-frequency cross mutual information obtained from 4-class (left hand, right hand, feet, and tongue) MI tasks recorded as brain-computer interface (BCI) electroencephalography data. The authors explored the brain network revealed by these MI tasks using statistical analysis and the analysis of topologic characteristics, and observed significant differences in the reaction level, reaction time, and activated target during 4-class MI tasks. There was a great difference in the reaction level between the execution and resting states during different tasks: the reaction level of the left-hand MI task was the greatest, followed by that of the right-hand, feet, and tongue MI tasks. The reaction time required to perform the tasks also differed: during the left-hand and right-hand MI tasks, the brain networks of subjects reacted promptly and strongly, but there was a delay during the feet and tongue MI task. Statistical analysis and the analysis of network topology revealed the target regions of the brain network during different MI processes. In conclusion, our findings suggest a new way to explain the neural mechanism behind MI.

  11. Job Patterns for Minorities and Women in Private Industry: Equal Employment Opportunity Report, 1969. Volume 1: The Nation, States, Industries. Volume 2: Metropolitan Areas.

    ERIC Educational Resources Information Center

    Equal Employment Opportunity Commission, Washington, DC.

    The Equal Employment Opportunity Report for 1969 documents the results of job discrimination, based on more than 150,000 reports submitted by 44,000 employers covering more than 28 million workers. These reports provide statistics of employment by sex, race, and national origin in nine standard occupational categories: officials and managers,…

  12. Detection of a gravitropism phenotype in glutamate receptor-like 3.3 mutants of Arabidopsis thaliana using machine vision and computation.

    PubMed

    Miller, Nathan D; Durham Brooks, Tessa L; Assadi, Amir H; Spalding, Edgar P

    2010-10-01

    Gene disruption frequently produces no phenotype in the model plant Arabidopsis thaliana, complicating studies of gene function. Functional redundancy between gene family members is one common explanation but inadequate detection methods could also be responsible. Here, newly developed methods for automated capture and processing of time series of images, followed by computational analysis employing modified linear discriminant analysis (LDA) and wavelet-based differentiation, were employed in a study of mutants lacking the Glutamate Receptor-Like 3.3 gene. Root gravitropism was selected as the process to study with high spatiotemporal resolution because the ligand-gated Ca(2+)-permeable channel encoded by GLR3.3 may contribute to the ion fluxes associated with gravity signal transduction in roots. Time series of root tip angles were collected from wild type and two different glr3.3 mutants across a grid of seed-size and seedling-age conditions previously found to be important to gravitropism. Statistical tests of average responses detected no significant difference between populations, but LDA separated both mutant alleles from the wild type. After projecting the data onto LDA solution vectors, glr3.3 mutants displayed greater population variance than the wild type in all four conditions. In three conditions the projection means also differed significantly between mutant and wild type. Wavelet analysis of the raw response curves showed that the LDA-detected phenotypes related to an early deceleration and subsequent slower-bending phase in glr3.3 mutants. These statistically significant, heritable, computation-based phenotypes generated insight into functions of GLR3.3 in gravitropism. The methods could be generally applicable to the study of phenotypes and therefore gene function.

  13. Detection of a Gravitropism Phenotype in glutamate receptor-like 3.3 Mutants of Arabidopsis thaliana Using Machine Vision and Computation

    PubMed Central

    Miller, Nathan D.; Durham Brooks, Tessa L.; Assadi, Amir H.; Spalding, Edgar P.

    2010-01-01

    Gene disruption frequently produces no phenotype in the model plant Arabidopsis thaliana, complicating studies of gene function. Functional redundancy between gene family members is one common explanation but inadequate detection methods could also be responsible. Here, newly developed methods for automated capture and processing of time series of images, followed by computational analysis employing modified linear discriminant analysis (LDA) and wavelet-based differentiation, were employed in a study of mutants lacking the Glutamate Receptor-Like 3.3 gene. Root gravitropism was selected as the process to study with high spatiotemporal resolution because the ligand-gated Ca2+-permeable channel encoded by GLR3.3 may contribute to the ion fluxes associated with gravity signal transduction in roots. Time series of root tip angles were collected from wild type and two different glr3.3 mutants across a grid of seed-size and seedling-age conditions previously found to be important to gravitropism. Statistical tests of average responses detected no significant difference between populations, but LDA separated both mutant alleles from the wild type. After projecting the data onto LDA solution vectors, glr3.3 mutants displayed greater population variance than the wild type in all four conditions. In three conditions the projection means also differed significantly between mutant and wild type. Wavelet analysis of the raw response curves showed that the LDA-detected phenotypes related to an early deceleration and subsequent slower-bending phase in glr3.3 mutants. These statistically significant, heritable, computation-based phenotypes generated insight into functions of GLR3.3 in gravitropism. The methods could be generally applicable to the study of phenotypes and therefore gene function. PMID:20647506

  14. A systematic review of the quality of statistical methods employed for analysing quality of life data in cancer randomised controlled trials.

    PubMed

    Hamel, Jean-Francois; Saulnier, Patrick; Pe, Madeline; Zikos, Efstathios; Musoro, Jammbe; Coens, Corneel; Bottomley, Andrew

    2017-09-01

    Over the last decades, Health-related Quality of Life (HRQoL) end-points have become an important outcome of the randomised controlled trials (RCTs). HRQoL methodology in RCTs has improved following international consensus recommendations. However, no international recommendations exist concerning the statistical analysis of such data. The aim of our study was to identify and characterise the quality of the statistical methods commonly used for analysing HRQoL data in cancer RCTs. Building on our recently published systematic review, we analysed a total of 33 published RCTs studying the HRQoL methods reported in RCTs since 1991. We focussed on the ability of the methods to deal with the three major problems commonly encountered when analysing HRQoL data: their multidimensional and longitudinal structure and the commonly high rate of missing data. All studies reported HRQoL being assessed repeatedly over time for a period ranging from 2 to 36 months. Missing data were common, with compliance rates ranging from 45% to 90%. From the 33 studies considered, 12 different statistical methods were identified. Twenty-nine studies analysed each of the questionnaire sub-dimensions without type I error adjustment. Thirteen studies repeated the HRQoL analysis at each assessment time again without type I error adjustment. Only 8 studies used methods suitable for repeated measurements. Our findings show a lack of consistency in statistical methods for analysing HRQoL data. Problems related to multiple comparisons were rarely considered leading to a high risk of false positive results. It is therefore critical that international recommendations for improving such statistical practices are developed. Copyright © 2017. Published by Elsevier Ltd.
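
    For the multiplicity issue raised here (many HRQoL sub-dimensions and assessment times tested without type I error control), one standard remedy is a family-wise or false-discovery-rate adjustment of the per-test p-values. A minimal sketch with statsmodels, using made-up p-values.

    ```python
    from statsmodels.stats.multitest import multipletests

    # Hypothetical p-values: one per HRQoL sub-dimension at a given assessment time
    p_values = [0.001, 0.020, 0.030, 0.045, 0.200, 0.410, 0.700]

    for method in ("bonferroni", "holm", "fdr_bh"):
        reject, p_adj, _, _ = multipletests(p_values, alpha=0.05, method=method)
        print(method, [round(p, 3) for p in p_adj], reject.tolist())
    ```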

  15. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE.

    PubMed

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.

  16. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE

    PubMed Central

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M.; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis. PMID:28596729

  17. Machine Learning Methods for Attack Detection in the Smart Grid.

    PubMed

    Ozay, Mete; Esnaola, Inaki; Yarman Vural, Fatos Tunay; Kulkarni, Sanjeev R; Poor, H Vincent

    2016-08-01

    Attack detection problems in the smart grid are posed as statistical learning problems for different attack scenarios in which the measurements are observed in batch or online settings. In this approach, machine learning algorithms are used to classify measurements as being either secure or attacked. An attack detection framework is provided to exploit any available prior knowledge about the system and surmount constraints arising from the sparse structure of the problem in the proposed approach. Well-known batch and online learning algorithms (supervised and semisupervised) are employed with decision- and feature-level fusion to model the attack detection problem. The relationships between statistical and geometric properties of attack vectors employed in the attack scenarios and learning algorithms are analyzed to detect unobservable attacks using statistical learning methods. The proposed algorithms are examined on various IEEE test systems. Experimental analyses show that machine learning algorithms can detect attacks with performances higher than attack detection algorithms that employ state vector estimation methods in the proposed attack detection framework.
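
    A minimal supervised sketch of the classification framing described above: measurement vectors, some perturbed by a sparse attack, are labeled secure or attacked and classified with a standard learner. The synthetic data, attack pattern, and choice of an RBF SVM are illustrative assumptions, not the paper's setup.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    # Synthetic measurement vectors: label 1 = attacked (a sparse attack vector
    # added to the secure measurement), label 0 = secure.
    rng = np.random.default_rng(3)
    n, d = 1000, 30                      # samples, measurement dimension
    secure = rng.standard_normal((n, d))
    attack = np.zeros((n, d))
    attacked_rows = rng.random(n) < 0.5
    attack[attacked_rows, :3] = 2.0      # sparse attack on a few measurements
    X = secure + attack
    y = attacked_rows.astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_tr, y_tr)
    print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    ```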

  18. A Review of Statistical Disclosure Control Techniques Employed by Web-Based Data Query Systems.

    PubMed

    Matthews, Gregory J; Harel, Ofer; Aseltine, Robert H

    We systematically reviewed the statistical disclosure control techniques employed for releasing aggregate data in Web-based data query systems listed in the National Association for Public Health Statistics and Information Systems (NAPHSIS). Each Web-based data query system was examined to see whether (1) it employed any type of cell suppression, (2) it used secondary cell suppression, and (3) suppressed cell counts could be calculated. No more than 30 minutes was spent on each system. Of the 35 systems reviewed, no suppression was observed in more than half (n = 18); counts below the threshold were observed in 2 sites; and suppressed values were recoverable in 9 sites. Six sites effectively suppressed small counts. This inquiry has revealed substantial weaknesses in the protective measures used in data query systems containing sensitive public health data. Many systems utilized no disclosure control whatsoever, and the vast majority of those that did deployed it inconsistently or inadequately.
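
    The primary/secondary suppression logic the review checks for can be illustrated in a few lines: suppress any cell below a threshold, then suppress a second cell in the same row so the primary suppression cannot be recovered from the row total. The threshold, table values, and the choice of the smallest remaining cell as the complement are illustrative assumptions.

    ```python
    def suppress_row(counts, threshold=5):
        """Return counts with primary and (if needed) secondary suppression."""
        out = list(counts)
        primary = [i for i, c in enumerate(counts) if 0 < c < threshold]
        for i in primary:
            out[i] = None                     # primary suppression
        if len(primary) == 1:                 # a lone suppressed cell is recoverable
            # from the row total, so suppress the smallest remaining cell too
            remaining = [i for i, c in enumerate(out) if c is not None]
            if remaining:
                j = min(remaining, key=lambda i: counts[i])
                out[j] = None                 # secondary (complementary) suppression
        return out

    print(suppress_row([120, 3, 47, 18]))     # -> [120, None, 47, None]
    ```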

  19. No Fear Act Data Report

    EPA Pesticide Factsheets

    Pursuant to the No Fear Act, a federal agency must post on its public Web site summary statistical data pertaining to complaints of employment discrimination filed by employees, former employees and applicants for employment under 29 CFR part 1614

  20. Industry is Largest Employer of Scientists

    ERIC Educational Resources Information Center

    Chemical and Engineering News, 1977

    1977-01-01

    Cites statistics of a National Science Foundation report on scientists and engineers in 1974. Reports that chemists are better educated, older, have a better chance of being employed, and do more work for industry, than other scientific personnel. (MLH)

  1. Dynamic analysis environment for nuclear forensic analyses

    NASA Astrophysics Data System (ADS)

    Stork, C. L.; Ummel, C. C.; Stuart, D. S.; Bodily, S.; Goldblum, B. L.

    2017-01-01

    A Dynamic Analysis Environment (DAE) software package is introduced to facilitate group inclusion/exclusion method testing, evaluation and comparison for pre-detonation nuclear forensics applications. Employing DAE, the multivariate signatures of a questioned material can be compared to the signatures for different, known groups, enabling the linking of the questioned material to its potential process, location, or fabrication facility. Advantages of using DAE for group inclusion/exclusion include built-in query tools for retrieving data of interest from a database, the recording and documentation of all analysis steps, a clear visualization of the analysis steps intelligible to a non-expert, and the ability to integrate analysis tools developed in different programming languages. Two group inclusion/exclusion methods are implemented in DAE: principal component analysis, a parametric feature extraction method, and k nearest neighbors, a nonparametric pattern recognition method. Spent Fuel Isotopic Composition (SFCOMPO), an open source international database of isotopic compositions for spent nuclear fuels (SNF) from 14 reactors, is used to construct PCA and KNN models for known reactor groups, and 20 simulated SNF samples are utilized in evaluating the performance of these group inclusion/exclusion models. For all 20 simulated samples, PCA in conjunction with the Q statistic correctly excludes a large percentage of reactor groups and correctly includes the true reactor of origination. Employing KNN, 14 of the 20 simulated samples are classified to their true reactor of origination.
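
    A compact sketch of the two inclusion/exclusion ingredients named above -- PCA with a Q (squared prediction error) statistic and a k-nearest-neighbors classifier -- on synthetic composition vectors. The data are made up and the Q cut-off is set empirically here rather than from a formal Q-distribution.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(4)
    # Synthetic "isotopic composition" vectors for two known reactor groups
    group_a = rng.normal(0.0, 1.0, size=(40, 8))
    group_b = rng.normal(3.0, 1.0, size=(40, 8))
    X = np.vstack([group_a, group_b])
    labels = np.array([0] * 40 + [1] * 40)

    # PCA model of the known groups; Q = squared residual after projection
    pca = PCA(n_components=3).fit(X)

    def q_statistic(samples):
        recon = pca.inverse_transform(pca.transform(samples))
        return np.sum((samples - recon) ** 2, axis=1)

    q_threshold = np.percentile(q_statistic(X), 95)   # empirical cut-off

    # KNN classifier for group inclusion
    knn = KNeighborsClassifier(n_neighbors=5).fit(X, labels)

    questioned = rng.normal(0.0, 1.0, size=(1, 8))    # simulated questioned sample
    print("Q consistent with model:", q_statistic(questioned)[0] <= q_threshold)
    print("KNN group:", knn.predict(questioned)[0])
    ```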

  2. Statistics and Title VII Proof: Prima Facie Case and Rebuttal.

    ERIC Educational Resources Information Center

    Whitten, David

    1978-01-01

    The method and means by which statistics can raise a prima facie case of Title VII violation are analyzed. A standard is identified that can be applied to determine whether a statistical disparity is sufficient to shift the burden to the employer to rebut a prima facie case of discrimination. (LBH)

  3. Origin-based polyphenolic fingerprinting of Theobroma cacao in unfermented and fermented beans.

    PubMed

    D'Souza, Roy N; Grimbs, Sergio; Behrends, Britta; Bernaert, Herwig; Ullrich, Matthias S; Kuhnert, Nikolai

    2017-09-01

    A comprehensive analysis of cocoa polyphenols from unfermented and fermented cocoa beans from a wide range of geographic origins was carried out to catalogue systematic differences based on their origin as well as fermentation status. This study identifies previously unknown compounds with the goal of ascertaining which of these are responsible for the largest differences between bean types. UHPLC coupled with ultra-high resolution time-of-flight mass spectrometry was employed to identify and relatively quantify various oligomeric proanthocyanidins and their glycosides amongst several other unreported compounds. A series of biomarkers allowing a clear distinction between unfermented and fermented cocoa beans and for beans of different origins was identified. The large sample set employed allowed comparison of statistically significant variations of key cocoa constituents. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Perceptions of employment relations and permanence in the organization: mediating effects of affective commitment in relations of psychological contract and intention to quit.

    PubMed

    Alcover, Carlos-María; Martínez-Iñigo, David; Chambel, Maria José

    2012-06-01

    Working conditions in call/contact centers influence employees' perceptions of their relations with the organization and their attitudes to work. Such perceptions can be analyzed through the psychological contract. The association between the relational/transactional orientation of the psychological contract and intention to quit the organization was examined, as well as the mediating role of affective commitment in employment relations. Data were collected from 973 employees in a cross-sectional survey. Analysis confirmed that there was a statistically significant relation between the orientation of the psychological contract and intention to quit, which was positive for transactionally oriented and negative for relationally oriented contracts. A mediating role for affective commitment was also confirmed, and a full mediating effect was reported for both orientations.

  5. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors to correctly identify pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
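
    The additive method referenced here combines p-values by summing them: under the null, the sum of n independent uniform p-values has mean n/2 and variance n/12, so the Central Limit Theorem yields a combined z-score. A minimal sketch of that single step, with made-up p-values; the paper's bi-level intra-/inter-experiment structure is not reproduced.

    ```python
    import math
    from scipy.stats import norm

    def additive_combine(p_values):
        """Combine p-values with the additive (sum of p-values) method,
        using the CLT approximation to the Irwin-Hall distribution."""
        n = len(p_values)
        z = (sum(p_values) - n / 2.0) / math.sqrt(n / 12.0)
        return norm.cdf(z)   # small combined p when the individual p-values are small

    print(additive_combine([0.01, 0.04, 0.03, 0.20]))
    ```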

  6. Study of Employment of Women in the Federal Government 1971.

    ERIC Educational Resources Information Center

    Civil Service Commission, Washington, DC. Manpower Statistics Div.

    This study presents statistical information gained from a survey of women employed full-time in Federal civilian white-collar employment as of October 31, 1971 in the Washington, D.C., metropolitan area, the 50 states and the territories of the U.S., and foreign countries. Excluded from the survey are members and employees of the Congress, employees of…

  7. Youth Employment in the Hospitality Sector.

    ERIC Educational Resources Information Center

    Schiller, Bradley R.

    A study used data from the National Longitudinal Surveys of Youth to analyze the long-term effects of hospitality industry employment on youth. The subsample extracted for the study included all youth who were aged 16-24 in 1980 and employed in the civilian sector for pay at any time in the year. Statistics indicated the hospitality sector was…

  8. Labor Trends: Overview of the United States, New York City, and Long Island. Revised Edition.

    ERIC Educational Resources Information Center

    Goldstein, Cheryl

    This document summarizes employment statistics and trends, with a geographic emphasis on areas where Queensborough Community College (New York) students and graduates seek employment. Data are presented on the following: (1) current and projected United States labor force; (2) occupational outlook; (3) employment status of civilian labor force 25…

  9. Association between being employed in a smoke-free workplace and living in a smoke-free home: evidence from 15 low and middle income countries.

    PubMed

    Nazar, Gaurang P; Lee, John Tayu; Glantz, Stanton A; Arora, Monika; Pearce, Neil; Millett, Christopher

    2014-02-01

    To assess whether being employed in a smoke-free workplace is associated with living in a smoke-free home in 15 low and middle income countries (LMICs). Country-specific individual level analyses of cross-sectional Global Adult Tobacco Survey data (2008-2011) from 15 LMICs was conducted using multiple logistic regression. The dependent variable was living in a smoke-free home; the independent variable was being employed in a smoke-free workplace. Analyses were adjusted for age, gender, residence, region, education, occupation, current smoking, current smokeless tobacco use and number of household members. Individual country results were combined in a random effects meta-analysis. In each country, the percentage of participants employed in a smoke-free workplace who reported living in a smoke-free home was higher than those employed in a workplace not smoke-free. The adjusted odds ratios (AORs) of living in a smoke-free home among participants employed in a smoke-free workplace (vs. those employed where smoking occurred) were statistically significant in 13 of the 15 countries, ranging from 1.12 [95% CI 0.79-1.58] in Uruguay to 2.29 [1.37-3.83] in China. The pooled AOR was 1.61 [1.46-1.79]. In LMICs, employment in a smoke-free workplace is associated with living in a smoke-free home. Accelerated implementation of comprehensive smoke-free policies is likely to result in substantial population health benefits in these settings. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
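
    The country-specific adjusted odds ratios are pooled with a random-effects model; a minimal DerSimonian-Laird sketch on the log odds ratio scale is shown below. Only the first two AOR values come from the abstract; the remaining estimates and all standard errors are made up for illustration.

    ```python
    import numpy as np

    def dersimonian_laird(log_or, se):
        """Random-effects pooling of log odds ratios (DerSimonian-Laird)."""
        w = 1.0 / se**2                                   # fixed-effect weights
        fixed = np.sum(w * log_or) / np.sum(w)
        q = np.sum(w * (log_or - fixed) ** 2)             # Cochran's Q
        df = len(log_or) - 1
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                     # between-study variance
        w_star = 1.0 / (se**2 + tau2)
        pooled = np.sum(w_star * log_or) / np.sum(w_star)
        se_pooled = np.sqrt(1.0 / np.sum(w_star))
        lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
        return np.exp(pooled), np.exp(lo), np.exp(hi)

    # Illustrative inputs: AORs (first two from the abstract) and made-up SEs
    aor = np.array([1.12, 2.29, 1.50, 1.70, 1.45])
    se = np.array([0.18, 0.26, 0.15, 0.20, 0.17])
    print(dersimonian_laird(np.log(aor), se))
    ```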

  10. Association between being employed in a smoke-free workplace and living in a smoke-free home: Evidence from 15 low and middle income countries☆

    PubMed Central

    Nazar, Gaurang P.; Lee, John Tayu; Glantz, Stanton A.; Arora, Monika; Pearce, Neil; Millett, Christopher

    2014-01-01

    Objective To assess whether being employed in a smoke-free workplace is associated with living in a smoke-free home in 15 low and middle income countries (LMICs). Methods Country-specific individual level analyses of cross-sectional Global Adult Tobacco Survey data (2008–2011) from 15 LMICs was conducted using multiple logistic regression. The dependent variable was living in a smoke-free home; the independent variable was being employed in a smoke-free workplace. Analyses were adjusted for age, gender, residence, region, education, occupation, current smoking, current smokeless tobacco use and number of household members. Individual country results were combined in a random effects meta-analysis. Results In each country, the percentage of participants employed in a smoke-free workplace who reported living in a smoke-free home was higher than those employed in a workplace not smoke-free. The adjusted odds ratios (AORs) of living in a smoke-free home among participants employed in a smoke-free workplace (vs. those employed where smoking occurred) were statistically significant in 13 of the 15 countries, ranging from 1.12 [95% CI 0.79–1.58] in Uruguay to 2.29 [1.37–3.83] in China. The pooled AOR was 1.61 [1.46–1.79]. Conclusion In LMICs, employment in a smoke-free workplace is associated with living in a smoke-free home. Accelerated implementation of comprehensive smoke-free policies is likely to result in substantial population health benefits in these settings. PMID:24287123

  11. Efficacy of Acupuncture in Reducing Preoperative Anxiety: A Meta-Analysis

    PubMed Central

    Bae, Hyojeong; Bae, Hyunsu; Min, Byung-Il; Cho, Seunghun

    2014-01-01

    Background. Acupuncture has been shown to reduce preoperative anxiety in several previous randomized controlled trials (RCTs). In order to assess the preoperative anxiolytic efficacy of acupuncture therapy, this study conducted a meta-analysis of an array of appropriate studies. Methods. Four electronic databases (MEDLINE, EMBASE, CENTRAL, and CINAHL) were searched up to February 2014. In the meta-analysis data were included from RCT studies in which groups receiving preoperative acupuncture treatment were compared with control groups receiving a placebo for anxiety. Results. Fourteen publications (N = 1,034) were included. Six publications, using the State-Trait Anxiety Inventory-State (STAI-S), reported that acupuncture interventions led to greater reductions in preoperative anxiety relative to sham acupuncture (mean difference = 5.63, P < .00001, 95% CI [4.14, 7.11]). A further eight publications, employing visual analogue scales (VAS), also indicated significant differences in preoperative anxiety amelioration between acupuncture and sham acupuncture (mean difference = 19.23, P < .00001, 95% CI [16.34, 22.12]). Conclusions. Acupuncture therapy aiming at reducing preoperative anxiety has a statistically significant effect relative to placebo or nontreatment conditions. Well-designed and rigorous studies that employ large sample sizes are necessary to corroborate this finding. PMID:25254059

  12. Statistics-based email communication security behavior recognition

    NASA Astrophysics Data System (ADS)

    Yi, Junkai; Su, Yueyang; Zhao, Xianghui

    2017-08-01

    With the development of information technology, e-mail has become a popular communication medium, and it is of great significance to determine the relationship between the two parties of a communication. First, this paper analyses and processes the content and attachments of e-mails using steganalysis and malware analysis techniques. It then performs feature extraction and establishes a behaviour model based on Naive Bayesian theory. A behaviour analysis method is subsequently employed to calculate and evaluate communication security. Finally, experiments on the accuracy of identifying the behavioural relationship of a communication were carried out. The results show that this method is effective, with a correctness of eighty-four percent.
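
    A hedged sketch of the Naive Bayes step: count features extracted from e-mail content and attachments (here purely synthetic) are used to classify the security behaviour of a communication. The feature meanings, labels, and classifier choice are illustrative assumptions, not the paper's pipeline.

    ```python
    import numpy as np
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Synthetic count features per e-mail (e.g. suspicious-keyword hits,
    # attachment flags, steganalysis indicators) and a security label.
    rng = np.random.default_rng(5)
    n = 500
    X = rng.poisson(lam=[1.0, 0.5, 0.2], size=(n, 3))
    y = (X.sum(axis=1) + rng.poisson(0.5, n) > 2).astype(int)   # 1 = risky

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = MultinomialNB().fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
    ```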

  13. The unauthorized Mexican immigrant population and welfare in Los Angeles County: a comparative statistical analysis.

    PubMed

    Marcelli, E A; Heer, D M

    1998-01-01

    "Using a unique 1994 Los Angeles County Household Survey of foreign-born Mexicans and the March 1994 and 1995 Current Population Surveys, we estimate the number of unauthorized Mexican immigrants (UMIs) residing in Los Angeles County, and compare their use of seven welfare programs with that of other non-U.S. citizens and U.S. citizens. Non-U.S. citizens were found to be no more likely than U.S. citizens to have used welfare, and UMIs were 11% (14%) less likely than other non-citizens (U.S.-born citizens).... We demonstrate how results differ depending on the unit of analysis employed, and on which programs constitute ¿welfare'." excerpt

  14. The composite sequential clustering technique for analysis of multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
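
    The two-part structure -- an initial partition refined by generalized K-means -- can be mimicked with scikit-learn by passing the initial cluster centres into KMeans. In this sketch a crude variance-based split stands in for the sequential statistical clustering, and the multispectral pixels are synthetic.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(6)
    # Synthetic multispectral pixels (rows) with 4 bands (columns)
    pixels = np.vstack([rng.normal(0, 1, (300, 4)), rng.normal(4, 1, (300, 4))])

    # Part 1 (stand-in): split on the band with the largest variance to form
    # initial clusters, then use their means as starting centres.
    band = np.argmax(pixels.var(axis=0))
    split = np.median(pixels[:, band])
    initial_centers = np.vstack([pixels[pixels[:, band] <= split].mean(axis=0),
                                 pixels[pixels[:, band] > split].mean(axis=0)])

    # Part 2: K-means refinement starting from those centres
    km = KMeans(n_clusters=2, init=initial_centers, n_init=1).fit(pixels)
    print(np.bincount(km.labels_))
    ```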

  15. Monitoring of pistachio (Pistacia Vera) ripening by high field nuclear magnetic resonance spectroscopy.

    PubMed

    Sciubba, Fabio; Avanzato, Damiano; Vaccaro, Angela; Capuani, Giorgio; Spagnoli, Mariangela; Di Cocco, Maria Enrica; Tzareva, Irina Nikolova; Delfini, Maurizio

    2017-04-01

    The metabolic profiling of pistachio (Pistacia vera) aqueous extracts from two different cultivars, namely 'Bianca' and 'Gloria', was monitored over the months from May to September employing high field NMR spectroscopy. A large number of water-soluble metabolites were assigned by means of 1D and 2D NMR experiments. The change in the metabolic profiles monitored over time allowed the pistachio development to be investigated. Specific temporal trends of amino acids, sugars, organic acids and other metabolites were observed and analysed by multivariate Partial Least Squares (PLS) analysis. Statistical analysis showed that while in the period from May to September there were few differences between the two cultivars, the ripening rate was different.
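
    A minimal Partial Least Squares sketch in the spirit of the analysis described: metabolite intensities (synthetic here) regressed against sampling month to expose a temporal trend. Sample counts, the month range, and the "ripening" metabolite are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(7)
    n_samples, n_metabolites = 60, 25
    month = rng.integers(5, 10, size=n_samples)            # May..September
    X = rng.standard_normal((n_samples, n_metabolites))
    X[:, 0] += 0.8 * (month - 7)                           # a metabolite that tracks ripening

    pls = PLSRegression(n_components=2).fit(X, month.astype(float))
    scores = pls.transform(X)                              # sample scores on the latent variables
    loadings = pls.x_weights_[:, 0]                        # which metabolites drive LV1
    print("top metabolite index on LV1:", int(np.argmax(np.abs(loadings))))
    ```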

  16. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    NASA Astrophysics Data System (ADS)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.

  17. Effect of components of a workplace lactation program on breastfeeding duration among employees of a public-sector employer.

    PubMed

    Balkam, Jane A Johnston; Cadwell, Karin; Fein, Sara B

    2011-07-01

    The purpose of this study was to evaluate the impact of the individual services offered via a workplace lactation program of one large public-sector employer on the duration of any breastfeeding and exclusive breastfeeding. Exclusive breastfeeding was defined as feeding only human milk at milk feedings. A cross-sectional mailed survey approach was used. The sample (n = 128) consisted of women who had used at least one component of the lactation program in the past 3 years and who were still employed at the same organization when data were collected. Descriptive statistics included frequency distributions and contingency table analysis. Chi-square analysis was used for comparison of groups, and both analysis of variance (ANOVA) and univariate analysis of variance from a general linear model were used for comparison of means. The survey respondents were primarily older, white, married, well-educated, high-income women. More of the women who received each lactation program service were exclusively breastfeeding at 6 months of infant age in all categories of services, with significant differences in the categories of telephone support and return to work consultation. After adjusting for race and work status, logistic regression analysis showed the number of services received was positively related to exclusive breastfeeding at 6 months and participation in a return to work consultation was positively related to any breastfeeding at 6 months. The study demonstrated that the workplace lactation program had a positive impact on duration of breastfeeding for the women who participated. Participation in the telephone support and return to work consultation services, and the total number of services used were related to longer duration of exclusive and/or any breastfeeding.

  18. Application of linear regression analysis in accuracy assessment of rolling force calculations

    NASA Astrophysics Data System (ADS)

    Poliak, E. I.; Shim, M. K.; Kim, G. S.; Choo, W. Y.

    1998-10-01

    Efficient operation of the computational models employed in process control systems requires periodic assessment of the accuracy of their predictions. Linear regression is proposed as a tool that allows systematic and random prediction errors to be separated from those related to measurements. A quantitative characteristic of the model's predictive ability is introduced in addition to standard statistical tests for model adequacy. Rolling force calculations are considered as an example application. However, the outlined approach can be used to assess the performance of any computational model.
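
    One way to realize the assessment described is to regress measured rolling forces on model predictions: the slope and intercept capture systematic error, the residual scatter captures random error, and R^2 summarizes predictive ability. A sketch with scipy on synthetic forces; the bias, noise level, and units are made up.

    ```python
    import numpy as np
    from scipy.stats import linregress

    rng = np.random.default_rng(8)
    predicted = rng.uniform(800, 1600, 200)                     # model forces (arbitrary units)
    measured = 1.03 * predicted - 15 + rng.normal(0, 25, 200)   # biased + noisy "measurements"

    fit = linregress(predicted, measured)
    residual_std = np.std(measured - (fit.intercept + fit.slope * predicted), ddof=2)
    print(f"slope={fit.slope:.3f} (systematic gain error)")
    print(f"intercept={fit.intercept:.1f} (systematic offset)")
    print(f"residual std={residual_std:.1f} (random error)")
    print(f"R^2={fit.rvalue**2:.3f} (predictive ability)")
    ```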

  19. An unsupervised classification technique for multispectral remote sensing data.

    NASA Technical Reports Server (NTRS)

    Su, M. Y.; Cummings, R. E.

    1973-01-01

    Description of a two-part clustering technique consisting of (a) a sequential statistical clustering, which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum-likelihood classification techniques.

  20. Unsupervised classification of earth resources data.

    NASA Technical Reports Server (NTRS)

    Su, M. Y.; Jayroe, R. R., Jr.; Cummings, R. E.

    1972-01-01

    A new clustering technique is presented. It consists of two parts: (a) a sequential statistical clustering which is essentially a sequential variance analysis and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by existing supervised maximum likelihood classification techniques.

  1. Meta-analysis in applied ecology.

    PubMed

    Stewart, Gavin

    2010-02-23

    This overview examines research synthesis in applied ecology and conservation. Vote counting and pooling unweighted averages are widespread despite the superiority of syntheses based on weighted combination of effects. Such analyses allow exploration of methodological uncertainty in addition to consistency of effects across species, space and time, but exploring heterogeneity remains controversial. Meta-analyses are required to generalize in ecology, and to inform evidence-based decision-making, but the more sophisticated statistical techniques and registers of research used in other disciplines must be employed in ecology to fully realize their benefits.

  2. The effect of different types of employment on quality of life.

    PubMed

    Kober, R; Eggleton, I R C

    2005-10-01

    Despite research that has investigated whether the financial benefits of open employment exceed the costs, there has been scant research as to the effect sheltered and open employment have upon the quality of life of participants. The importance of this research is threefold: it investigates outcomes explicitly in terms of quality of life; the sample size is comparatively large; and it uses an established and validated questionnaire. One hundred and seventeen people with intellectual disability (ID) who were employed in either open or sheltered employment by disability employment agencies were interviewed. Quality of life was assessed using the Quality of Life Questionnaire. After making an initial assessment to see whether the outcomes achieved depended on type of employment, quality of life scores were analyzed controlling for participants' level of functional work ability (assessed via the Functional Assessment Inventory). The results showed that participants placed in open employment reported statistically significantly higher quality of life scores. When the sample was split based upon participants' functional work ability, the type of employment had no effect on the reported quality of life for participants with a low functional work ability. However, for those participants with a high functional work ability, those in open employment reported statistically significantly higher quality of life. The results of this study support the placement of people with ID with high functional work ability into open employment. However, a degree of caution needs to be taken in interpreting the results presented given the disparity in income levels between the two types of employment.

  3. Analysis of microarray leukemia data using an efficient MapReduce-based K-nearest-neighbor classifier.

    PubMed

    Kumar, Mukesh; Rath, Nitish Kumar; Rath, Santanu Kumar

    2016-04-01

    Microarray-based gene expression profiling has emerged as an efficient technique for classification, prognosis, diagnosis, and treatment of cancer. Frequent changes in the behavior of this disease generate an enormous volume of data. Microarray data satisfies both the veracity and velocity properties of big data, as it keeps changing with time. Therefore, the analysis of microarray datasets in a small amount of time is essential. They often contain a large amount of expression data, but only a fraction of it comprises genes that are significantly expressed. The precise identification of genes of interest that are responsible for causing cancer is imperative in microarray data analysis. Most existing schemes employ a two-phase process such as feature selection/extraction followed by classification. In this paper, various statistical methods (tests) based on MapReduce are proposed for selecting relevant features. After feature selection, a MapReduce-based K-nearest neighbor (mrKNN) classifier is also employed to classify microarray data. These algorithms are successfully implemented in a Hadoop framework. A comparative analysis is done on these MapReduce-based models using microarray datasets of various dimensions. From the obtained results, it is observed that these models consume much less execution time than conventional models in processing big data. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Economic situation and occupational accidents in Poland: 2002-2014 panel data regional study.

    PubMed

    Łyszczarz, Błażej; Nojszewska, Ewelina

    2018-01-07

    Occupational accidents constitute a substantial health and economic burden for societies around the world and a variety of factors determine the frequency of accidents at work. The aim of this paper is to investigate the relationship between the economic situation and the rate of occupational accidents in Poland. The analysis comprised data for 66 Polish sub-regions taken from the Central Statistical Office's Local Data Bank. The regression analysis with panel data for period 2002-2014 was applied to identify the relationships involved. Four measures of accidents were used: the rates of total occupational accidents, accidents among men and women separately as well as days of incapacity to work due to accidents at work per employee. Four alternative measures assessed the economic situation: gross domestic product (GDP) per capita, average remuneration, the unemployment rate and number of dwelling permits. The confounding variables included were: employment in hazardous conditions and the size of enterprises. The results of the regression estimates show that the number of occupational accidents in Poland exhibits procyclical behavior, which means that more accidents are observed during the times of economic expansion. Stronger relationships were observed in the equations explaining men's accident rates as well as total rates. A weaker and not always statistically significant impact of economic situation was identified for women's accident rates and days of incapacity to work. The results have important implications for occupational health and safety actions. In the periods of higher work intensity employers should focus on appropriate training and supervision of inexperienced workers as well as on ensuring enough time for already experienced employees to recuperate. In terms of public health actions, policy makers should focus on scrutinizing working conditions, educating employers and counteracting possible discrimination of injured employees. Int J Occup Med Environ Health 2018;31(2):151-164. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.

  5. Application of time series analysis in modelling and forecasting emergency department visits in a medical centre in Southern Taiwan.

    PubMed

    Juang, Wang-Chuan; Huang, Sin-Jhih; Huang, Fong-Dee; Cheng, Pei-Wen; Wann, Shue-Ren

    2017-12-01

    Emergency department (ED) overcrowding is acknowledged as an increasingly important issue worldwide. Hospital managers are increasingly paying attention to ED crowding in order to provide higher quality medical services to patients. One of the crucial elements for a good management strategy is demand forecasting. Our study sought to construct an adequate model and to forecast monthly ED visits. We retrospectively gathered monthly ED visits from January 2009 to December 2016 to carry out a time series autoregressive integrated moving average (ARIMA) analysis. Initial development of the model was based on past ED visits from 2009 to 2016. A best-fit model was further employed to forecast the monthly data of ED visits for the next year (2016). Finally, we evaluated the predicted accuracy of the identified model with the mean absolute percentage error (MAPE). The software packages SAS/ETS V.9.4 and Office Excel 2016 were used for all statistical analyses. A series of statistical tests showed that six models, including ARIMA (0, 0, 1), ARIMA (1, 0, 0), ARIMA (1, 0, 1), ARIMA (2, 0, 1), ARIMA (3, 0, 1) and ARIMA (5, 0, 1), were candidate models. The model that gave the minimum Akaike information criterion and Schwartz Bayesian criterion and followed the assumptions of residual independence was selected as the adequate model. Finally, a suitable ARIMA (0, 0, 1) structure, yielding a MAPE of 8.91%, was identified and obtained as Visit_t = 7111.161 + (a_t + 0.37462 a_(t-1)). The ARIMA (0, 0, 1) model can be considered adequate for predicting future ED visits, and its forecast results can be used to aid decision-making processes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
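
    A hedged sketch of fitting an ARIMA(0, 0, 1) model to monthly visit counts and scoring a hold-out year with MAPE, using statsmodels; the series below is synthetic, not the hospital's data, and the train/test split is illustrative.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(9)
    idx = pd.date_range("2009-01-01", periods=96, freq="MS")      # 8 years, monthly
    visits = pd.Series(7000 + rng.normal(0, 300, 96), index=idx)

    train, test = visits[:-12], visits[-12:]                      # hold out the last year
    model = ARIMA(train, order=(0, 0, 1)).fit()
    forecast = model.forecast(steps=12)

    mape = np.mean(np.abs((test.values - forecast.values) / test.values)) * 100
    print(f"MAPE on hold-out year: {mape:.2f}%")
    ```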

  6. Employability and career experiences of international graduates of MSc Public Health: a mixed methods study.

    PubMed

    Buunaaisie, C; Manyara, A M; Annett, H; Bird, E L; Bray, I; Ige, J; Jones, M; Orme, J; Pilkington, P; Evans, D

    2018-05-08

    This article aims to describe the public health career experiences of international graduates of a Master of Science in Public Health (MSc PH) programme and to contribute to developing the evidence base on international public health workforce capacity development. A sequential mixed methods study was conducted between January 2017 and April 2017. Ninety-seven international graduates of one UK university's MSc PH programme were invited to take part in an online survey followed by semistructured interviews, for respondents who consented to be interviewed. We computed the descriptive statistics of the quantitative data obtained, and qualitative data were thematically analysed. The response rate was 48.5%. Most respondents (63%) were employed by various agencies within 1 year after graduation. Others (15%) were at different stages of doctor of philosophy studies. Respondents reported enhanced roles after graduation in areas such as public health policy analysis (74%); planning, implementation and evaluation of public health interventions (74%); leadership roles (72%); and research (70%). The common perceived skills that were relevant to the respondents' present jobs were critical analysis (87%), multidisciplinary thinking (86%), demonstrating public health leadership skills (84%) and research (77%). Almost all respondents (90%) were confident in conducting research. Respondents recommended the provision of longer public health placement opportunities, elective courses on project management and advanced statistics, and 'internationalisation' of the programme's curriculum. The study has revealed the relevance of higher education in public health in developing the career prospects and skills of graduates. International graduates of this MSc PH programme were satisfied with the relevance and impact of the skills they acquired during their studies. The outcomes of this study can be used for curriculum reformation. Employers' perspectives of the capabilities of these graduates, however, need further consideration. Copyright © 2018 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  7. Exploratory Analysis of Survey Data for Understanding Adoption of Novel Aerospace Systems

    NASA Astrophysics Data System (ADS)

    Reddy, Lauren M.

    In order to meet the increasing demand for manned and unmanned flight, the air transportation system must constantly evolve. As new technologies or operational procedures are conceived, we must determine their effect on humans in the system. In this research, we introduce a strategy to assess how individuals or organizations would respond to a novel aerospace system. We apply exploratory analysis techniques to the survey data to generate insight and identify significant variables. We employ three different methods for eliciting views from individuals or organizations who are affected by a system: an opinion survey, a stated preference survey, and structured interviews. We conduct an opinion survey of both the general public and stakeholders in the unmanned aircraft industry to assess their knowledge, attitude, and practices regarding unmanned aircraft. We complete a statistical analysis of the multiple-choice questions using multinomial logit and multivariate probit models and conduct qualitative analysis on free-text questions. We next present a stated preference survey of the general public on the use of an unmanned aircraft package delivery service. We complete a statistical analysis of the questions using multinomial logit, ordered probit, linear regression, and negative binomial models. Finally, we discuss structured interviews conducted with stakeholders from ANSPs and airlines operating in the North Atlantic. We describe how these groups may choose to adopt a new technology (space-based ADS-B) or operational procedure (in-trail procedures). We discuss similarities and differences between the stakeholder groups, the benefits and costs of in-trail procedures and space-based ADS-B as reported by the stakeholders, and interdependencies between the groups interviewed. To demonstrate the value of the data we generated, we explore how the findings from the surveys can be used to better characterize uncertainty in the cost-benefit analysis of aerospace systems. We demonstrate how the findings from the opinion and stated preference surveys can be infused into the cost-benefit analysis of an unmanned aircraft delivery system. We also demonstrate how to apply the findings from the interviews to characterize uncertainty in the estimation of the benefits of space-based ADS-B.
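
    As a hedged sketch of one model class named above (not the dissertation's actual analysis), the example below fits a multinomial logit to a synthetic three-level survey response; the variables "age", "familiarity" and "response" are hypothetical stand-ins.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "age": rng.integers(18, 75, n),
    "familiarity": rng.integers(0, 5, n),   # self-rated familiarity with unmanned aircraft
})
# Hypothetical three-level response: 0 = oppose, 1 = neutral, 2 = support.
latent = 0.02 * df["age"] + 0.6 * df["familiarity"] + rng.normal(0, 1, n)
df["response"] = pd.cut(latent, bins=3, labels=[0, 1, 2]).astype(int)

X = sm.add_constant(df[["age", "familiarity"]])
model = sm.MNLogit(df["response"], X).fit(disp=False)
print(model.summary())   # one coefficient set per non-reference category
```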

  8. Interpreting Conditions in the Job Market for College Graduates.

    ERIC Educational Resources Information Center

    Alsalam, Nabeel

    1993-01-01

    Indicates that occupational and employment statistics would be more beneficial if users had a better understanding of how occupations are changing and how employers are redefining jobs to use the education and skills of their employees. (JOW)

  9. North American transportation : statistics on Canadian, Mexican, and United States transportation

    DOT National Transportation Integrated Search

    1994-05-01

    North American Transportation: Statistics on Canadian, Mexican, and United States transportation contains extensive data on the size and scope, use, employment, fuel consumption, and economic role of each country's transportation system. It was publi...

  10. Selected papers in the hydrologic sciences, 1986

    USGS Publications Warehouse

    Subitzky, Seymour

    1987-01-01

    Water-quality data from long-term (24 years), fixed-station monitoring at the Cape Fear River at Lock 1 near Kelly, N.C., and various measures of basin development are correlated. Subbasin population, number of acres of cropland in the subbasin, number of people employed in manufacturing, and tons of fertilizer applied in the basin are considered as measures of basinwide development activity. Linear correlations show statistically significant positive relations between both population and manufacturing activity and most of the dissolved constituents considered. Negative correlations were found between the acres of harvested cropland and most of the water-quality measures. The amount of fertilizer sold in the subbasin was not statistically related to the water-quality measures considered in this report. The statistical analysis was limited to several commonly used measures of water quality including specific conductance, pH, dissolved solids, several major dissolved ions, and a few nutrients. The major dissolved ions included in the analysis were calcium, sodium, potassium, magnesium, chloride, sulfate, silica, bicarbonate, and fluoride. The nutrients included were dissolved nitrite plus nitrate nitrogen, dissolved ammonia nitrogen, total nitrogen, dissolved phosphates, and total phosphorus. For the chemicals evaluated, manufacturing and population sources are more closely associated with water quality in the Cape Fear River at Lock 1 than are agricultural variables.
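
    A minimal sketch of the kind of linear correlation reported above, using hypothetical subbasin values rather than the USGS dataset:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical annual values for 24 years of record.
population = rng.uniform(50_000, 200_000, 24)
dissolved_solids = 40 + 4e-4 * population + rng.normal(0, 5, 24)

r, p = stats.pearsonr(population, dissolved_solids)
print(f"r = {r:.2f}, p = {p:.4f}")   # a significant positive relation is expected here
```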

  11. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
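
    A much-simplified illustration of the underlying idea (not the NASA software): propagate uncertainty in the parameters of an analytical failure model by Monte Carlo sampling to obtain a failure probability. The crack-growth-style model and all parameter values below are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sim = 100_000

# Uncertain inputs: initial flaw size (m), stress range (MPa), model accuracy factor.
a0 = rng.lognormal(mean=np.log(2e-4), sigma=0.3, size=n_sim)
d_sigma = rng.normal(120, 15, n_sim)
model_error = rng.lognormal(mean=0.0, sigma=0.2, size=n_sim)

# Toy "cycles to failure" model: fewer cycles for larger flaws and stresses.
cycles_to_failure = model_error * 1e12 / (d_sigma**3 * np.sqrt(a0))

service_cycles = 5e6
p_failure = np.mean(cycles_to_failure < service_cycles)
print(f"Estimated failure probability over the service life: {p_failure:.4f}")
```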

  12. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  13. Statistical analysis of kinetic energy entrainment in a model wind turbine array boundary layer

    NASA Astrophysics Data System (ADS)

    Cal, Raul Bayoan; Hamilton, Nicholas; Kang, Hyung-Suk; Meneveau, Charles

    2012-11-01

    For large wind farms, kinetic energy must be entrained from the flow above the wind turbines to replenish wakes and enable power extraction in the array. Various statistical features of turbulence causing vertical entrainment of mean-flow kinetic energy are studied using hot-wire velocimetry data taken in a model wind farm in a scaled wind tunnel experiment. Conditional statistics and spectral decompositions are employed to characterize the most relevant turbulent flow structures and determine their length-scales. Sweep and ejection events are shown to be the largest contributors to the vertical kinetic energy flux, although their relative contribution depends upon the location in the wake. Sweeps are shown to be dominant in the region above the wind turbine array. A spectral analysis of the data shows that large scales of the flow, about the size of the rotor diameter in length or larger, dominate the vertical entrainment. The flow is more incoherent below the array, causing decreased vertical fluxes there. The results show that improving the rate of vertical kinetic energy entrainment into wind turbine arrays is a standing challenge and would require modifying the large-scale structures of the flow. This work was funded in part by the National Science Foundation (CBET-0730922, CBET-1133800 and CBET-0953053).
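
    A minimal sketch of the quadrant (sweep/ejection) decomposition described above, applied to a synthetic velocity series rather than the hot-wire data; the correlation structure is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20_000
u = 8.0 + rng.normal(0, 1.0, n)                     # streamwise velocity (m/s)
w = -0.05 * (u - 8.0) + rng.normal(0, 0.4, n)       # weakly anti-correlated vertical velocity

u_p, w_p = u - u.mean(), w - w.mean()               # velocity fluctuations
flux = u_p * w_p                                    # instantaneous u'w' flux proxy

sweeps = (u_p > 0) & (w_p < 0)                      # Q4: fast fluid moving downward
ejections = (u_p < 0) & (w_p > 0)                   # Q2: slow fluid moving upward

print(f"total <u'w'>:          {flux.mean():.4f}")
print(f"sweep contribution:    {flux[sweeps].sum() / flux.sum():.2f}")
print(f"ejection contribution: {flux[ejections].sum() / flux.sum():.2f}")
```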

  14. Thermal heterogeneity within aqueous materials quantified by 1H NMR spectroscopy: Multiparametric validation in silico and in vitro

    NASA Astrophysics Data System (ADS)

    Lutz, Norbert W.; Bernard, Monique

    2018-02-01

    We recently suggested a new paradigm for statistical analysis of thermal heterogeneity in (semi-)aqueous materials by 1H NMR spectroscopy, using water as a temperature probe. Here, we present a comprehensive in silico and in vitro validation that demonstrates the ability of this new technique to provide accurate quantitative parameters characterizing the statistical distribution of temperature values in a volume of (semi-)aqueous matter. First, line shape parameters of numerically simulated water 1H NMR spectra are systematically varied to study a range of mathematically well-defined temperature distributions. Then, corresponding models based on measured 1H NMR spectra of agarose gel are analyzed. In addition, dedicated samples based on hydrogels or biological tissue are designed to produce temperature gradients changing over time, and dynamic NMR spectroscopy is employed to analyze the resulting temperature profiles at sub-second temporal resolution. Accuracy and consistency of the previously introduced statistical descriptors of temperature heterogeneity are determined: weighted median and mean temperature, standard deviation, temperature range, temperature mode(s), kurtosis, skewness, entropy, and relative areas under temperature curves. Potential and limitations of this method for quantitative analysis of thermal heterogeneity in (semi-)aqueous materials are discussed in view of prospective applications in materials science as well as biology and medicine.
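
    The statistical descriptors named above (weighted median and mean, standard deviation, skewness, kurtosis, entropy) can be computed from any weighted temperature distribution; the sketch below uses a hypothetical Gaussian intensity profile standing in for a water-line-derived distribution.

```python
import numpy as np

# Hypothetical temperature distribution derived from a water-line shape.
temps = np.linspace(30.0, 45.0, 301)                    # temperature axis (deg C)
intensity = np.exp(-0.5 * ((temps - 37.0) / 1.5) ** 2)  # stand-in spectral profile
w = intensity / intensity.sum()                         # normalised weights

mean = np.sum(w * temps)
std = np.sqrt(np.sum(w * (temps - mean) ** 2))
skew = np.sum(w * ((temps - mean) / std) ** 3)
kurt = np.sum(w * ((temps - mean) / std) ** 4) - 3.0    # excess kurtosis
entropy = -np.sum(w[w > 0] * np.log(w[w > 0]))
median = temps[np.searchsorted(np.cumsum(w), 0.5)]      # weighted median

print(mean, median, std, skew, kurt, entropy)
```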

  15. Predictors of surgeons' efficiency in the operating rooms.

    PubMed

    Nakata, Yoshinori; Watanabe, Yuichi; Narimatsu, Hiroto; Yoshimura, Tatsuya; Otake, Hiroshi; Sawa, Tomohiro

    2017-02-01

    The sustainability of the Japanese healthcare system is questionable because of a huge fiscal debt. One of the solutions is to improve the efficiency of healthcare. The purpose of this study is to determine what factors are predictive of surgeons' efficiency scores. The authors collected data from all the surgical procedures performed at Teikyo University Hospital from April 1 through September 30 in 2013-2015. An output-oriented Charnes-Cooper-Rhodes model of data envelopment analysis was employed to calculate each surgeon's efficiency score. Seven independent variables that may predict their efficiency scores were selected: experience, medical school, surgical volume, gender, academic rank, surgical specialty, and the surgical fee schedule. Multiple regression analysis using a random-effects Tobit model was used for our panel data. Data from a total of 8722 surgical cases were obtained over the 18-month study period. The authors analyzed 134 surgeons. The only statistically significant coefficients were surgical specialty and surgical fee schedule (p = 0.000 and p = 0.016, respectively). Experience had some positive association with efficiency scores but did not reach statistical significance (p = 0.062). The other coefficients were not statistically significant. These results demonstrated that the surgical reimbursement system, not surgeons' personal characteristics, is a significant predictor of surgeons' efficiency.

  16. Identifying Pleiotropic Genes in Genome-Wide Association Studies for Multivariate Phenotypes with Mixed Measurement Scales

    PubMed Central

    Williams, L. Keoki; Buu, Anne

    2017-01-01

    We propose a multivariate genome-wide association test for mixed continuous, binary, and ordinal phenotypes. A latent response model is used to estimate the correlation between phenotypes with different measurement scales so that the empirical distribution of Fisher's combination statistic under the null hypothesis is estimated efficiently. The simulation study shows that our proposed correlation estimation methods have high levels of accuracy. More importantly, our approach conservatively estimates the variance of the test statistic so that the type I error rate is controlled. The simulation also shows that the proposed test maintains power at a level very close to that of the ideal analysis based on known latent phenotypes while controlling the type I error. In contrast, conventional approaches (dichotomizing all observed phenotypes or treating them as continuous variables) could either reduce power or employ a linear regression model unfit for the data. Furthermore, the statistical analysis on the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that conducting a multivariate test on multiple phenotypes can increase the power to identify markers that might not otherwise be chosen using marginal tests. The proposed method also offers a new approach to analyzing the Fagerström Test for Nicotine Dependence as multivariate phenotypes in genome-wide association studies. PMID:28081206
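
    For reference, a generic sketch of Fisher's combination statistic over per-phenotype p-values; note that the paper's contribution (estimating the null distribution under correlated, mixed-scale phenotypes) is not reproduced here, and the p-values below are hypothetical.

```python
import numpy as np
from scipy import stats

p_values = np.array([0.03, 0.20, 0.008])        # hypothetical per-phenotype p-values
fisher_stat = -2.0 * np.sum(np.log(p_values))

# Under independence, the statistic is chi-squared with 2k degrees of freedom.
p_combined = stats.chi2.sf(fisher_stat, df=2 * len(p_values))
print(f"Fisher statistic = {fisher_stat:.2f}, combined p = {p_combined:.4f}")
```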

  17. Study of the structure changes caused by volcanic activity in Mexico applying the lineament analysis to the Aster (Terra) satellite data.

    NASA Astrophysics Data System (ADS)

    Arellano-Baeza, A. A.; Garcia, R. V.; Trejo-Soto, M.; Molina-Sauceda, E.

    Mexico is one of the most volcanically active regions in North America. Volcanic activity in central Mexico is associated with the subduction of the Cocos and Rivera plates beneath the North American plate. Periods of enhanced microseismic activity associated with the volcanic activity of the Colima and Popocatépetl volcanoes are compared with periods of low microseismic activity. We detected changes in the number and orientation of lineaments associated with the microseismic activity through lineament analysis of a temporal sequence of high-resolution satellite images of both volcanoes. 15 m resolution multispectral images provided by the ASTER VNIR instrument were used. The Lineament Extraction and Stripes Statistic Analysis (LESSA) software package was employed for the lineament extraction.

  18. Public opinion about smoking and smoke free legislation in a district of North India.

    PubMed

    Goel, S; Singh, R J; D, Sharma; A, Singh

    2014-01-01

    Context: A growing number of cities, districts, counties and states across the globe are going smoke-free. While an Indian national law, the Cigarettes and Other Tobacco Products Act (COTPA), has existed since 2003 and aims at protecting all people in the country, people still smoke in public places. Aim: This study assessed knowledge and perceptions about smoking and second-hand smoke (SHS), and support for smoke-free laws, among people residing in Mohali district, Punjab. Materials and Methods: This cross-sectional study was conducted in Mohali district of Punjab, India. A sample size of 1600 people was obtained. The Probability Proportional to Size technique was used for selecting the number of individuals to be interviewed from each block and also from the urban and rural populations. Statistical Analysis Used: We estimated proportions and tested for significant differences by residence, smoking status, literacy level and employment level by means of the chi-square statistic. The statistical software SPSS for Windows version 20 was used for analysing the data. Results: The overall prevalence of current smoking among study participants was 25%. Around 96% were aware of the fact that smoking is harmful to health, 45% viewed second-hand smoke as being as harmful as active smoking, 84.2% knew that smoking is prohibited in public places and 88.3% wanted the government to take strict actions to control the menace of public smoking. Multivariate logistic regression analysis showed that people aged 20 years and above, unemployed, urban, literate and non-smokers had significantly better perceptions of the harms of smoking. Knowledge about the smoke-free provisions of COTPA was significantly better among males, employed individuals, urban residents, and literate people. Conclusions: There was high knowledge about the deleterious multi-dimensional effects of smoking among residents and high support for the implementation of COTPA. Efforts should be taken to make Mohali a "smoke-free district".

  19. Comparative analysis of profitability of honey production using traditional and box hives.

    PubMed

    Al-Ghamdi, Ahmed A; Adgaba, Nuru; Herab, Ahmed H; Ansari, Mohammad J

    2017-07-01

    Information on the profitability and productivity of box hives is important to encourage beekeepers to adopt the technology. However, comparative analysis of the profitability and productivity of box and traditional hives is not adequately available. The study was carried out on 182 beekeepers using a cross-sectional survey and a random sampling technique. The data were analyzed using descriptive statistics, analysis of variance (ANOVA), the Cobb-Douglas (CD) production function and partial budgeting. The CD production function revealed that supplementary bee feeds, labor and medication were statistically significant for both box and traditional hives. Generally, labor for bee management, supplementary feeding, and medication led to productivity differences of approximately 42.83%, 7.52%, and 5.34%, respectively, between box and traditional hives. The study indicated that the productivity of box hives was 72% higher than that of traditional hives. The average net incomes of beekeepers using box and traditional hives were 33,699.7 SR/annum and 16,461.4 SR/annum, respectively. The incremental net benefit of box hives over traditional hives was nearly double. Our study results clearly showed the importance of adoption of box hives for better productivity of the beekeeping subsector.
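
    A hedged sketch of a Cobb-Douglas production function estimated by OLS on log-transformed inputs, as referenced above; the data frame, input names and elasticities are hypothetical, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 182
df = pd.DataFrame({
    "labor": rng.uniform(10, 200, n),        # hours of hive management
    "feed": rng.uniform(1, 50, n),           # supplementary feed (kg)
    "medication": rng.uniform(0.5, 10, n),   # medication expenditure
})
df["honey_yield"] = (
    5.0 * df["labor"] ** 0.4 * df["feed"] ** 0.1 * df["medication"] ** 0.05
    * np.exp(rng.normal(0, 0.2, n))
)

X = sm.add_constant(np.log(df[["labor", "feed", "medication"]]))
ols = sm.OLS(np.log(df["honey_yield"]), X).fit()
print(ols.params)   # estimated output elasticities of each input
```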

  20. Prognostic value of inflammation-based scores in patients with osteosarcoma

    PubMed Central

    Liu, Bangjian; Huang, Yujing; Sun, Yuanjue; Zhang, Jianjun; Yao, Yang; Shen, Zan; Xiang, Dongxi; He, Aina

    2016-01-01

    Systemic inflammation responses have been associated with cancer development and progression. C-reactive protein (CRP), Glasgow prognostic score (GPS), neutrophil-lymphocyte ratio (NLR), platelet-lymphocyte ratio (PLR), lymphocyte-monocyte ratio (LMR), and neutrophil-platelet score (NPS) have been shown to be independent risk factors in various types of malignant tumors. This retrospective analysis of 162 osteosarcoma cases was performed to estimate their predictive value for survival in osteosarcoma. All statistical analyses were performed with SPSS statistical software. Receiver operating characteristic (ROC) analysis was performed to set optimal thresholds; area under the curve (AUC) was used to show the discriminatory abilities of inflammation-based scores; Kaplan-Meier analysis was performed to plot the survival curve; Cox regression models were employed to determine the independent prognostic factors. The optimal cut-off points of NLR, PLR, and LMR were 2.57, 123.5 and 4.73, respectively. GPS and NLR had a markedly larger AUC than CRP, PLR and LMR. High levels of CRP, GPS, NLR, PLR, and a low level of LMR were significantly associated with adverse prognosis (P < 0.05). Multivariate Cox regression analyses revealed that GPS, NLR, and occurrence of metastasis were the top risk factors associated with death in osteosarcoma patients. PMID:28008988
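
    As a hedged sketch of the ROC cut-off step described above (commonly done via the Youden index, which the abstract does not specify), the example below uses synthetic NLR values and outcomes rather than the study cohort.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(8)
event = rng.integers(0, 2, 162)                    # 1 = death, 0 = survival (hypothetical)
nlr = rng.normal(2.2, 0.6, 162) + 0.8 * event      # higher NLR among events

fpr, tpr, thresholds = roc_curve(event, nlr)
cutoff = thresholds[np.argmax(tpr - fpr)]          # Youden's J statistic
print(f"AUC = {roc_auc_score(event, nlr):.2f}, optimal cut-off = {cutoff:.2f}")
```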

  1. Age-related differences in working hours among male and female GPs: an SMS-based time use study.

    PubMed

    van Hassel, Daniël; van der Velden, Lud; de Bakker, Dinny; Batenburg, Ronald

    2017-12-19

    In several countries, the number of hours worked by general practitioners (GPs) has decreased, raising concern about current and impending workforce shortages. This shorter working week has been ascribed both to the feminisation of the workforce and to a younger generation of GPs who prefer more flexible working arrangements. There is, however, limited insight into how the impacts of these determinants interact. We investigated the relative importance of differences in GPs' working hours in relation to gender, age, and employment position. An analysis was performed on real-time monitoring data collected by sending SMS text messages to 1051 Dutch GPs who participated in a 1-week time use study. We used descriptive statistics, independent sample t-tests, and one-way ANOVA to compare the working time of different GP groups. A path analysis was conducted to examine the difference in working time by gender, age, employment position, and their combinations. Female GPs worked significantly fewer hours than their male peers. GPs in their 50s worked the highest number of hours, followed by GPs aged 60 and older. GPs younger than 40 worked the lowest number of hours. This relationship between working hours and age was not significantly different for women and men. As shown by path analysis, female GPs consistently worked fewer hours than their male counterparts, regardless of their age and employment position. The relationship between age and working hours was largely influenced by gender and employment position. The variation in working hours among GPs can be explained by the combination of gender, age, and employment position. Gender appears to be the most important predictor, as the largest part of the variation in working hours is explained by a direct effect of this variable. It has previously been reported that the difference in working hours between male and female GPs had decreased over time. However, our findings suggest that gender remains a critical factor for variation in time use and for policy instruments such as health workforce planning.

  2. A Classroom Note on the Binomial and Poisson Distributions: Biomedical Examples for Use in Teaching Introductory Statistics

    ERIC Educational Resources Information Center

    Holland, Bart K.

    2006-01-01

    A generally-educated individual should have some insight into how decisions are made in the very wide range of fields that employ statistical and probabilistic reasoning. Also, students of introductory probability and statistics are often best motivated by specific applications rather than by theory and mathematical development, because most…

  3. Huge Increase in Day-Care Workers: A Result of Multiple Societal Changes.

    ERIC Educational Resources Information Center

    Bureau of Labor Statistics (DOL), Washington, DC.

    Using Bureau of Labor Statistics estimates of employment in day-care establishments, this study analyzes changes in day care over the past 20 years. Growth in day-care employment has been much stronger than that of other industries. Since 1972, employment has increased by nearly 250 per cent. Causes of growth include changing trends in enrollment…

  4. Annual Survey of Public Employment & Payroll Summary Report: 2013. Economy-Wide Statistics Briefs: Public Sector

    ERIC Educational Resources Information Center

    Willhide, Robert Jesse

    2014-01-01

    This report is part of a series of reports that provides information on the structure, function, finances, taxation, employment, and pension systems of the United States' approximately 90,000 state and local governments. This report presents data on state and local government employment and payroll based on information collected by the 2013 Annual…

  5. Voices from the Field: Developing Employability Skills for Archaeological Students Using a Project Based Learning Approach

    ERIC Educational Resources Information Center

    Wood, Gaynor

    2016-01-01

    Graduate employment statistics are receiving considerable attention in UK universities. This paper looks at how a wide range of employability attributes can be developed with students, through the innovative use of the Project Based Learning (PjBL) approach. The case study discussed here involves a group of archaeology students from the University…

  6. Gender, professional and non-professional work, and the changing pattern of employment-related inequality in poor self-rated health, 1995-2006 in South Korea.

    PubMed

    Kim, Il Ho; Khang, Young Ho; Cho, Sung Il; Chun, Heeran; Muntaner, Carles

    2011-01-01

    We examined gender differential changes in employment-related health inequalities according to occupational position (professional/nonprofessional) in South Korea during the last decade. Data were taken from four rounds of Social Statistical Surveys of South Korea (1995, 1999, 2003, and 2006) from the Korean National Statistics Office. The total study population was 55,435 male and 33,913 female employees aged 25-64. Employment arrangements were divided into permanent, fixed-term, and daily employment. After stratification according to occupational position (professional/nonprofessional) and gender, different patterns in employment-related health inequalities were observed. In the professional group, the gaps in absolute and relative employment inequalities for poor self-rated health were more likely to widen following Korea's 1997 economic downturn. In the nonprofessional group, during the study period, graded patterns of employment-related health inequalities were continuously observed in both genders. Absolute health inequalities by employment status, however, decreased among men but increased among women. In addition, a remarkable increase in relative health inequalities was found among female temporary and daily employees (p = 0.009, < 0.001, respectively), but only among male daily employees (p = 0.001). Relative employment-related health inequalities had clearly widened for female daily workers between 2003 and 2006 (p = 0.047). The 1997 Korean economic downturn, in particular, seemingly stimulated a widening gap in employment health inequalities. Our study revealed that whereas absolute health inequalities in relation to employment status increased in the professional group, relative employment-related health inequalities increased in the nonprofessional group, especially among women. In view of the high concentration of female nonstandard employees, further monitoring of inequality should consider gender-specific patterns according to employees' occupational and employment status.

  7. Testing for Mutagens Using Fruit Flies.

    ERIC Educational Resources Information Center

    Liebl, Eric C.

    1998-01-01

    Describes a laboratory employed in undergraduate teaching that uses fruit flies to test student-selected compounds for their ability to cause mutations. Requires no prior experience with fruit flies, incorporates a student design component, and employs both rigorous controls and statistical analyses. (DDR)

  8. Monitoring of bone regeneration process by means of texture analysis

    NASA Astrophysics Data System (ADS)

    Kokkinou, E.; Boniatis, I.; Costaridou, L.; Saridis, A.; Panagiotopoulos, E.; Panayiotakis, G.

    2009-09-01

    An image analysis method is proposed for the monitoring of the regeneration of the tibial bone. For this purpose, 130 digitized radiographs of 13 patients, who had undergone tibial lengthening by the Ilizarov method, were studied. For each patient, 10 radiographs, taken at an equal number of successive postoperative time points, were available. Employing available software, three regions of interest (ROIs), corresponding to the (a) upper, (b) central, and (c) lower aspects of the gap where bone regeneration was expected to occur, were determined on each radiograph. Employing custom-developed algorithms, (i) a number of textural features were generated from each of the ROIs, and (ii) a texture-feature based regression model was designed for the quantitative monitoring of the bone regeneration process. Statistically significant differences (p < 0.05) were found between the initial and final textural feature values, generated from the first and last postoperative radiographs, respectively. A quadratic polynomial regression equation fitted the data adequately (r² = 0.9, p < 0.001). The suggested method may contribute to the monitoring of the tibial bone regeneration process.

  9. Using complex networks for text classification: Discriminating informative and imaginative documents

    NASA Astrophysics Data System (ADS)

    de Arruda, Henrique F.; Costa, Luciano da F.; Amancio, Diego R.

    2016-01-01

    Statistical methods have been widely employed in recent years to grasp many language properties. The application of such techniques has allowed improvements in several linguistic applications, such as machine translation and document classification. In the latter, many approaches have emphasised the semantic content of texts, as is the case with bag-of-words language models. These approaches have certainly yielded reasonable performance. However, some potential features such as the structural organization of texts have been used only in a few studies. In this context, we probe how features derived from textual structure analysis can be effectively employed in a classification task. More specifically, we performed a supervised classification aiming at discriminating informative from imaginative documents. Using a networked model that describes the local topological/dynamical properties of function words, we achieved an accuracy rate of up to 95%, which is much higher than that of similar networked approaches. A systematic analysis of feature relevance revealed that symmetry and accessibility measurements are among the most prominent network measurements. Our results suggest that these measurements could be used in related language applications, as they play a complementary role in characterising texts.

  10. Photoacoustic Analysis of the Penetration Kinetics of Cordia verbenacea DC in Human Skin

    NASA Astrophysics Data System (ADS)

    Carvalho, S. S.; Barja, P. R.

    2012-11-01

    Phonophoresis consists of the utilization of ultrasound radiation associated with pharmacological agents in order to enhance transdermal penetration of applied drugs. It is a widely employed resource in physiotherapy practice, normally associated with anti-inflammatory drugs, such as Acheflan. This drug was developed in Brazil from the essential oil of Cordia verbenacea DC, a native plant of the Brazilian southern coast. In previous studies, the photoacoustic (PA) technique proved effective in the study of the penetration kinetics of topically applied products and in the evaluation of drug delivery after phonophoresis application. The present work aimed to evaluate the penetration kinetics of Acheflan in human skin, employing in vivo PA measurements after massage or phonophoresis application. Ten volunteers (aged between 18 and 30 years) took part in the study. The time evolution of the PA signal was fitted to an S-shaped Boltzmann curve. After statistical analysis, PA measurements showed drug penetration for both application forms, but drug delivery was more evident after phonophoresis application, with a characteristic penetration time of less than 15 min for the stratum corneum.
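
    A minimal sketch of fitting the S-shaped Boltzmann curve mentioned above to a signal time series with scipy; the time series and parameter values are synthetic, not the volunteers' PA data.

```python
import numpy as np
from scipy.optimize import curve_fit

def boltzmann(t, a1, a2, t0, dt):
    """S-shaped Boltzmann function: transition from a1 to a2 centred at t0."""
    return a2 + (a1 - a2) / (1.0 + np.exp((t - t0) / dt))

rng = np.random.default_rng(9)
t = np.linspace(0, 30, 61)                                      # minutes
signal = boltzmann(t, 1.0, 2.5, 12.0, 2.0) + rng.normal(0, 0.05, t.size)

params, _ = curve_fit(boltzmann, t, signal, p0=[1.0, 2.5, 10.0, 2.0])
print(dict(zip(["a1", "a2", "t0", "dt"], params)))              # t0 ~ characteristic time
```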

  11. Analysis of Feature Intervisibility and Cumulative Visibility Using GIS, Bayesian and Spatial Statistics: A Study from the Mandara Mountains, Northern Cameroon

    PubMed Central

    Wright, David K.; MacEachern, Scott; Lee, Jaeyong

    2014-01-01

    The locations of diy-geδ-bay (DGB) sites in the Mandara Mountains, northern Cameroon are hypothesized to occur as a function of their ability to see and be seen from points on the surrounding landscape. A series of geostatistical, two-way and Bayesian logistic regression analyses were performed to test two hypotheses related to the intervisibility of the sites to one another and their visual prominence on the landscape. We determine that the intervisibility of the sites to one another is highly statistically significant when compared to 10 stratified-random permutations of DGB sites. Bayesian logistic regression additionally demonstrates that the visibility of the sites to points on the surrounding landscape is statistically significant. The location of sites appears to have also been selected on the basis of lower slope than random permutations of sites. Using statistical measures, many of which are not commonly employed in archaeological research, to evaluate aspects of visibility on the landscape, we conclude that the placement of DGB sites improved their conspicuousness for enhanced ritual, social cooperation and/or competition purposes. PMID:25383883

  12. A fully Bayesian before-after analysis of permeable friction course (PFC) pavement wet weather safety.

    PubMed

    Buddhavarapu, Prasad; Smit, Andre F; Prozzi, Jorge A

    2015-07-01

    Permeable friction course (PFC), a porous hot-mix asphalt, is typically applied to improve wet weather safety on high-speed roadways in Texas. In order to warrant expensive PFC construction, a statistical evaluation of its safety benefits is essential. Generally, the literature on the effectiveness of porous mixes in reducing wet-weather crashes is limited and often inconclusive. In this study, the safety effectiveness of PFC was evaluated using a fully Bayesian before-after safety analysis. First, two groups of road segments overlaid with PFC and non-PFC material were identified across Texas; the non-PFC or reference road segments selected were similar to their PFC counterparts in terms of site specific features. Second, a negative binomial data generating process was assumed to model the underlying distribution of crash counts of PFC and reference road segments to perform Bayesian inference on the safety effectiveness. A data-augmentation based computationally efficient algorithm was employed for a fully Bayesian estimation. The statistical analysis shows that PFC is not effective in reducing wet weather crashes. It should be noted that the findings of this study are in agreement with the existing literature, although these studies were not based on a fully Bayesian statistical analysis. Our study suggests that the safety effectiveness of PFC road surfaces, or any other safety infrastructure, largely relies on its interrelationship with the road user. The results suggest that the safety infrastructure must be properly used to reap the benefits of the substantial investments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Assessment of water quality parameters using multivariate analysis for Klang River basin, Malaysia.

    PubMed

    Mohamed, Ibrahim; Othman, Faridah; Ibrahim, Adriana I N; Alaa-Eldin, M E; Yunus, Rossita M

    2015-01-01

    This case study uses several univariate and multivariate statistical techniques to evaluate and interpret a water quality data set obtained from the Klang River basin located within the state of Selangor and the Federal Territory of Kuala Lumpur, Malaysia. The river drains an area of 1,288 km², from the steep mountain rainforests of the main Central Range along Peninsular Malaysia to the river mouth in Port Klang, into the Straits of Malacca. Water quality was monitored at 20 stations, nine of which are situated along the main river and 11 along six tributaries. Data were collected from 1997 to 2007 for seven parameters used to evaluate the status of the water quality, namely dissolved oxygen, biochemical oxygen demand, chemical oxygen demand, suspended solids, ammoniacal nitrogen, pH, and temperature. The data were first investigated using descriptive statistical tools, followed by two practical multivariate analyses that reduced the data dimensions for better interpretation. The analyses employed were factor analysis and principal component analysis, which explain 60 and 81.6% of the total variation in the data, respectively. We found that the resulting latent variables from the factor analysis are interpretable and beneficial for describing the water quality in the Klang River. This study demonstrates the usefulness of several statistical methods in evaluating and interpreting water quality data for the purpose of monitoring the effectiveness of water resource management. The results should provide more straightforward data interpretation as well as valuable insight for managers to conceive optimum action plans for controlling pollution in river water.
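
    A hedged sketch of the dimension-reduction step described above: PCA on standardized water-quality parameters. The data frame below is random filler with the parameter names from the abstract, not the Klang River monitoring data.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(10)
params = ["DO", "BOD", "COD", "SS", "NH3N", "pH", "temperature"]
data = pd.DataFrame(rng.normal(size=(200, len(params))), columns=params)

pca = PCA(n_components=3).fit(StandardScaler().fit_transform(data))
print(pca.explained_variance_ratio_.cumsum())             # cumulative variance explained
print(pd.DataFrame(pca.components_, columns=params))      # loadings of each parameter
```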

  14. Simulation-based estimation of mean and standard deviation for meta-analysis via Approximate Bayesian Computation (ABC).

    PubMed

    Kwon, Deukwoo; Reis, Isildinha M

    2015-08-12

    When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. For data generated from the normal distribution, our ABC method performs well. However, the Wan et al. method is best for estimating the standard deviation under the normal distribution. In the estimation of the mean, our ABC method is best regardless of the assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95% credible interval when Bayesian analysis has been employed.
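
    A simplified rejection-ABC sketch in the spirit of the approach above: recover a study's mean and SD from reported quartiles by simulating studies from prior draws and keeping draws whose simulated summaries match. The priors, tolerance, reported summaries and normal data-generating assumption are all hypothetical choices, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(11)
reported = np.array([12.0, 18.0, 26.0])            # reported Q1, median, Q3
n_study = 80                                       # reported sample size

accepted = []
for _ in range(100_000):
    mu = rng.uniform(0.0, 50.0)                    # prior draw for the mean
    sigma = rng.uniform(0.1, 30.0)                 # prior draw for the SD
    sim = rng.normal(mu, sigma, n_study)           # simulate a study of the same size
    if np.max(np.abs(np.percentile(sim, [25, 50, 75]) - reported)) < 1.5:
        accepted.append((mu, sigma))

accepted = np.array(accepted)
print("accepted draws:", len(accepted))
print("posterior means of (mu, sigma):", accepted.mean(axis=0))
```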

  15. Pedagogical monitoring as a tool to reduce dropout in distance learning in family health.

    PubMed

    de Castro E Lima Baesse, Deborah; Grisolia, Alexandra Monteiro; de Oliveira, Ana Emilia Figueiredo

    2016-08-22

    This paper presents the results of a study of the Monsys monitoring system, an educational support tool designed to prevent and control the dropout rate in a distance learning course in family health. Developed by UNA-SUS/UFMA, Monsys was created to enable data mining in the virtual learning environment known as Moodle. This is an exploratory study using documentary and bibliographic research and analysis of the Monsys database. Two classes (2010 and 2011) were selected as research subjects, one with Monsys intervention and the other without. The samples were matched (using a ratio of 1:1) by gender, age, marital status, graduation year, previous graduation status, location and profession. Statistical analysis was performed using the chi-square test and a multivariate logistic regression model with a 5% significance level. The findings show that the dropout rate in the class in which Monsys was not employed (2010) was 43.2%. However, the dropout rate in the class of 2011, in which the tool was employed as a pedagogical team aid, was 30.6%. After statistical adjustment, the Monsys monitoring system remained correlated with the course completion variable (adjusted OR = 1.74, 95% CI = 1.17-2.59; p = 0.005), suggesting that the use of the Monsys tool, independently of the adjusted variables, can increase the likelihood that students will complete the course. Using the chi-square test, a profile analysis of students revealed a higher completion rate among women (67.7%) than men (52.2%). Analysis of age demonstrated that students between 40 and 49 years dropped out the least (32.1%) and, with regard to professional training, nurses had the lowest dropout rate (36.3%). The use of Monsys significantly reduced dropout, with completion most strongly associated with the presence of the monitoring system and with female gender.

  16. A framework for incorporating DTI Atlas Builder registration into Tract-Based Spatial Statistics and a simulated comparison to standard TBSS.

    PubMed

    Leming, Matthew; Steiner, Rachel; Styner, Martin

    2016-02-27

    Tract-based spatial statistics (TBSS) [6] is a software pipeline widely employed in comparative analysis of white matter integrity from diffusion tensor imaging (DTI) datasets. In this study, we seek to evaluate the relationship between different methods of atlas registration for use with TBSS and different measurements of DTI (fractional anisotropy (FA), axial diffusivity (AD), radial diffusivity (RD) and mean diffusivity (MD)). To do so, we have developed a novel tool that builds on existing diffusion atlas building software, integrating it into an adapted version of TBSS called DAB-TBSS (DTI Atlas Builder-Tract-Based Spatial Statistics) by using the advanced registration offered in DTI Atlas Builder [7]. To compare the effectiveness of these two versions of TBSS, we also propose a framework for simulating population differences for diffusion tensor imaging data, providing a more substantive means of empirically comparing DTI group analysis programs such as TBSS. In this study, we used 33 diffusion tensor imaging datasets and simulated group-wise changes in these data by increasing, in three different simulations, the principal eigenvalue (directly altering AD), the second and third eigenvalues (RD), and all three eigenvalues (MD) in the genu, the right uncinate fasciculus, and the left IFO. Additionally, we assessed the benefits of comparing the tensors directly using a functional analysis of diffusion tensor tract statistics (FADTTS [10]). Our results indicate comparable levels of FA-based detection between DAB-TBSS and TBSS, with standard TBSS registration reporting a higher rate of false positives in the other DTI measurements. Within the simulated changes investigated here, this study suggests that the use of DTI Atlas Builder's registration enhances TBSS group-based studies.

  17. Detecting most influencing courses on students grades using block PCA

    NASA Astrophysics Data System (ADS)

    Othman, Osama H.; Gebril, Rami Salah

    2014-12-01

    One of the modern solutions adopted for dealing with the problem of a large number of variables in statistical analyses is Block Principal Component Analysis (Block PCA). This modified technique can be used to reduce the vertical dimension (variables) of the data matrix X (n × p) by selecting a smaller number of variables (say m) containing most of the statistical information. These selected variables can then be employed in further investigations and analyses. Block PCA is an adapted multistage version of the original PCA. It involves the application of cluster analysis (CA) and variable selection through sub-principal component scores (PCs). The application of Block PCA in this paper is a modified version of the original work of Liu et al. (2002). The main objective was to apply PCA to each group of variables (established using cluster analysis) instead of the whole large set of variables, which has proved unreliable. In this work, Block PCA is used to reduce the size of a large data matrix (n = 41 students × p = 251 courses) consisting of the grade point averages (GPAs) of students in 251 courses (variables) in the Faculty of Science at Benghazi University. In other words, we construct a smaller analytical data matrix of student GPAs with fewer variables that contains most of the variation (statistical information) in the original database. By applying Block PCA, 12 courses were found to 'absorb' most of the variation, or influence, in the original data matrix and are hence worth keeping for future statistical exploration and analytical studies. In addition, the course Independent Study (Math.) was found to be the most influential of the 12 selected courses on students' GPAs.
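
    A hedged sketch of the Block PCA workflow described above: cluster the variables, run PCA within each block, and keep the variable with the largest loading on that block's first component. The data, block count and distance choice below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform
from sklearn.decomposition import PCA

rng = np.random.default_rng(12)
X = rng.normal(size=(41, 60))                      # toy matrix: 41 students x 60 course GPAs

# Cluster the variables (courses) using a correlation-based distance.
dist = np.clip(1.0 - np.abs(np.corrcoef(X.T)), 0.0, None)
labels = fcluster(linkage(squareform(dist, checks=False), method="average"),
                  t=12, criterion="maxclust")

selected = []
for c in np.unique(labels):
    block = X[:, labels == c]
    pca = PCA(n_components=1).fit(block)           # first PC of each block
    top = np.argmax(np.abs(pca.components_[0]))    # highest-loading variable in the block
    selected.append(int(np.where(labels == c)[0][top]))

print("selected course indices:", sorted(selected))
```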

  18. On statistical inference in time series analysis of the evolution of road safety.

    PubMed

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Determinants of helmet use behaviour among employed motorcycle riders in Yazd, Iran based on theory of planned behaviour.

    PubMed

    Ali, Mehri; Saeed, Mazloomy Mahmoodabad Seyed; Ali, Morowatisharifabad Mohammad; Haidar, Nadrian

    2011-09-01

    This paper reports on predictors of helmet use behaviour, using variables based on the theory of planned behaviour (TPB) model among employed motorcycle riders in Yazd, Iran, in an attempt to identify influential factors that may be addressed through intervention efforts. In 2007, a cluster random sample of 130 employed motorcycle riders in the city of Yazd in central Iran participated in the study. Appropriate instruments were designed to measure the variables of interest (attitude, subjective norms, perceived behavioural control and intention, along with helmet use behaviour). Reliability and validity of the instruments were examined and approved. The statistical analysis of the data included descriptive statistics, bivariate correlations, and multiple regression. Based on the results, 56 of the respondents (43.1%) had a history of motorcycle accidents. Of these motorcycle riders, only 10.7% were wearing their helmet at the time of their accident. Intention and perceived behavioural control showed a significant relationship with helmet use behaviour, and perceived behavioural control was the strongest predictor of helmet use intention, followed by subjective norms and attitude. It was found that the helmet use rate among motorcycle riders was very low. The findings of the present study provide preliminary support for the TPB model as an effective framework for examining helmet use in motorcycle riders. Understanding motorcycle riders' thoughts, feelings and beliefs about helmet use behaviour can assist intervention specialists to develop and implement effective programs in order to promote helmet use among motorcycle riders. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. TV Viewing and BMI by Race/Ethnicity and Socio-Economic Status

    PubMed Central

    Shuval, Kerem; Gabriel, Kelley Pettee; Leonard, Tammy

    2013-01-01

    Objective To assess the association between TV viewing and obesity by race/ethnicity and socio-economic status. Design Cross-sectional analysis of 5,087 respondents to the Health Information National Trends Survey (HINTS), a nationally representative sample of US adults. Multivariate regression models were computed to assess the association between quartiles of TV viewing and BMI, stratified by race/ethnicity, educational attainment, employment and health insurance status. Results Findings indicate that increased TV viewing was associated with higher odds for being overweight/obese in the entire sample, while adjusting for physical activity and other confounders. After stratification by race/ethnicity, increased odds for overweight/obesity in the 3rd and 4th quartiles of TV viewing (e.g., 3rd quartile: cumulative OR = 1.43, 95% CI 1.07–1.92) were observed in non-Hispanic whites, with statistical significance. In non-Hispanic blacks and Hispanics, the odds were similar to whites, but did not reach statistical significance. Significant relations between greater TV viewing and increased BMI were observed in college graduates and non-graduates, those with health insurance and the employed. Conclusions This study extends previous research by examining potential inconsistencies in this association between various racial/ethnic groups and some socio-economic variables, which primarily were not found. PMID:23691070

  1. Factors that influence the tribocharging of pulverulent materials in compressed-air devices

    NASA Astrophysics Data System (ADS)

    Das, S.; Medles, K.; Mihalcioiu, A.; Beleca, R.; Dragan, C.; Dascalescu, L.

    2008-12-01

    Tribocharging of pulverulent materials in compressed-air devices is a typical multi-factorial process. This paper aims at demonstrating the value of using the design-of-experiments methodology in association with virtual instrumentation for quantifying the effects of various process variables and of their interactions, as a prerequisite for the development of new tribocharging devices for industrial applications. The study is focused on the tribocharging of PVC powders in compressed-air devices similar to those employed in electrostatic painting. A classical 2³ full-factorial design (three factors at two levels) was employed for conducting the experiments. The response function was the charge/mass ratio of the material collected in a modified Faraday cage at the exit of the tribocharging device. The charge/mass ratio was found to increase with the injection pressure and the vortex pressure in the tribocharging device, and to decrease with increasing feed rate. In the present study, in-house design-of-experiments software was employed for statistical analysis of the experimental data and validation of the experimental model.
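
    A small sketch of the kind of 2³ full-factorial analysis described above: build the coded design matrix for the three factors and estimate the main effects on the charge/mass response. The eight response values are hypothetical, chosen only to mirror the reported trends (positive pressure effects, negative feed-rate effect).

```python
from itertools import product

import pandas as pd

levels = [-1, 1]
design = pd.DataFrame(list(product(levels, repeat=3)),
                      columns=["injection_p", "vortex_p", "feed_rate"])

# Hypothetical charge/mass ratios for the 8 runs, in standard order.
design["q_m"] = [0.65, 0.35, 1.05, 0.75, 1.25, 0.95, 1.65, 1.35]

effects = {}
for col in ["injection_p", "vortex_p", "feed_rate"]:
    effects[col] = (design.loc[design[col] == 1, "q_m"].mean()
                    - design.loc[design[col] == -1, "q_m"].mean())
print(effects)   # positive for both pressures, negative for feed rate
```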

  2. Employer reasons for failing to report eligible workers’ compensation claims in the BLS survey of occupational injuries and illnesses

    PubMed Central

    Wuellner, Sara E.; Bonauto, David K.

    2016-01-01

    Background Little research has been done to identify reasons employers fail to report some injuries and illnesses in the Bureau of Labor Statistics Survey of Occupational Injuries and Illnesses (SOII). Methods We interviewed the 2012 Washington SOII respondents from establishments that had failed to report one or more eligible workers’ compensation claims in the SOII about their reasons for not reporting specific claims. Qualitative content analysis methods were used to identify themes and patterns in the responses. Results Non‐compliance with OSHA recordkeeping or SOII reporting instructions and data entry errors led to unreported claims. Some employers refused to include claims because they did not consider the injury to be work‐related, despite workers’ compensation eligibility. Participant responses brought the SOII eligibility of some claims into question. Conclusion Systematic and non‐systematic errors lead to SOII underreporting. Insufficient recordkeeping systems and limited knowledge of reporting requirements are barriers to accurate workplace injury records. Am. J. Ind. Med. 59:343–356, 2016. © 2016 The Authors. American Journal of Industrial Medicine Published by Wiley Periodicals, Inc. PMID:26970051

  3. Use of statistical and pharmacokinetic-pharmacodynamic modeling and simulation to improve decision-making: A section summary report of the trends and innovations in clinical trial statistics conference.

    PubMed

    Kimko, Holly; Berry, Seth; O'Kelly, Michael; Mehrotra, Nitin; Hutmacher, Matthew; Sethuraman, Venkat

    2017-01-01

    The application of modeling and simulation (M&S) methods to improve decision-making was discussed during the Trends & Innovations in Clinical Trial Statistics Conference held in Durham, North Carolina, USA on May 1-4, 2016. Uses of both pharmacometric and statistical M&S were presented during the conference, highlighting the diversity of the methods employed by pharmacometricians and statisticians to address a broad range of quantitative issues in drug development. Five presentations are summarized herein, which cover the development strategy of employing M&S to drive decision-making; European initiatives on best practice in M&S; case studies of pharmacokinetic/pharmacodynamics modeling in regulatory decisions; estimation of exposure-response relationships in the presence of confounding; and the utility of estimating the probability of a correct decision for dose selection when prior information is limited. While M&S has been widely used during the last few decades, it is expected to play an essential role as more quantitative assessments are employed in the decision-making process. By integrating M&S as a tool to compile the totality of evidence collected throughout the drug development program, more informed decisions will be made.

  4. What are the experiences of people with dementia in employment?

    PubMed

    Chaplin, Ruth; Davidson, Ian

    2016-03-01

    Statistics show that an increase in the statutory retirement age in the UK will mean that many more people will develop a dementia while still in employment. A review of the literature confirmed that there are no existing studies in the UK which examine this issue in any detail. The aim of this study was to investigate the experiences of people who develop a dementia while still in employment and to understand how they make sense of these experiences; therefore a qualitative explorative inquiry based on an Interpretive Phenomenological Analysis methodology was used. Interviews with five people who had developed a dementia while still in employment were carried out, with ages ranging from 58 to 74 years. Interview transcripts were analysed and four super-ordinate themes were identified: the realization that something is wrong; managing the situation in the workplace; trying to make sense of change; and coming to terms with retirement or unemployment. The results showed that people who develop a dementia while still in employment do not always receive the 'reasonable adjustments' in the workplace to which they are entitled under the Equality Act (2010). Some of the participants felt that they were poorly treated by their workplace and described some distressing experiences. The study highlights the need for more effective specialized advice and support regarding employment issues and more research into the numbers of people in the UK that are affected by this issue. © The Author(s) 2014.

  5. Employment status five years after a randomised controlled trial comparing multidisciplinary and brief intervention in employees on sick leave due to low back pain.

    PubMed

    Pedersen, Pernille; Nielsen, Claus Vinther; Jensen, Ole Kudsk; Jensen, Chris; Labriola, Merete

    2018-05-01

    To evaluate differences in employment status during a five-year follow-up period in patients on sick leave due to low back pain who had participated in a trial comparing a brief and a multidisciplinary intervention. From 2004 to 2008, 535 patients were referred to the Spine Centre at the Regional Hospital in Silkeborg if they had been on sick leave for 3-16 weeks due to low back pain. All patients underwent a clinical examination by a rehabilitation physician and a physiotherapist, and were randomised to either the brief intervention or the multidisciplinary intervention. The outcome was employment status from randomisation to five years of follow-up, measured as the mean number of weeks spent in four employment-status groups (sequence analysis) and as the fraction of weeks spent working (work participation score) accumulated over the follow-up period. A total of 231 patients were randomised to the brief intervention and 233 patients to the multidisciplinary intervention. No statistically significant differences in the mean weeks spent within the different employment statuses were found between the two intervention groups. After five years of follow-up, participants in the multidisciplinary intervention had a 19% higher risk of not having a work participation score above 75% compared with participants in the brief intervention. After five years of follow-up, no differences in employment status were found between participants in the brief and the multidisciplinary intervention.
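    A minimal sketch of how a work participation score of this kind could be computed from weekly employment-status records is shown below; the data, column names, and status labels are invented for illustration, and only the 75% cut-off mirrors the abstract.

      # Hypothetical sketch: a work participation score computed as the fraction
      # of follow-up weeks spent working, per participant. Data are invented.
      import pandas as pd

      weekly = pd.DataFrame({
          "participant": [1, 1, 1, 1, 2, 2, 2, 2],
          "week":        [1, 2, 3, 4, 1, 2, 3, 4],
          # one employment status per participant-week
          "status":      ["working", "working", "working", "sick_leave",
                          "working", "sick_leave", "unemployed", "unemployed"],
      })

      wps = (weekly.assign(working=weekly["status"].eq("working"))
                   .groupby("participant")["working"]
                   .mean()
                   .rename("work_participation_score"))

      print(wps)
      print("Score above 75%:", (wps > 0.75).to_dict())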

  6. SQC: secure quality control for meta-analysis of genome-wide association studies.

    PubMed

    Huang, Zhicong; Lin, Huang; Fellay, Jacques; Kutalik, Zoltán; Hubaux, Jean-Pierre

    2017-08-01

    Due to the limited power of small-scale genome-wide association studies (GWAS), researchers tend to collaborate and establish a larger consortium in order to perform large-scale GWAS. Genome-wide association meta-analysis (GWAMA) is a statistical tool that aims to synthesize results from multiple independent studies to increase the statistical power and reduce false-positive findings of GWAS. However, it has been demonstrated that the aggregate data of individual studies are subject to inference attacks; hence privacy concerns arise when researchers share study data in GWAMA. In this article, we propose a secure quality control (SQC) protocol, which enables checking the quality of data in a privacy-preserving way without revealing sensitive information to a potential adversary. SQC employs state-of-the-art cryptographic and statistical techniques for privacy protection. We implement the solution in a meta-analysis pipeline with real data to demonstrate its efficiency and scalability on commodity machines. The distributed execution of SQC on a cluster of 128 cores for one million genetic variants takes less than one hour, a modest cost considering the 10-month time span, including logistics, usually observed for completing the QC procedure. SQC is implemented in Java and is publicly available at https://github.com/acs6610987/secureqc. jean-pierre.hubaux@epfl.ch. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
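    The statistical core that such a pipeline protects is ordinary meta-analysis arithmetic. Below is a minimal, non-secure sketch of a fixed-effect inverse-variance meta-analysis for a single variant; the effect sizes and standard errors are invented, and none of SQC's cryptographic machinery is reproduced.

      # Plain (non-encrypted) fixed-effect, inverse-variance meta-analysis for
      # one genetic variant across three studies. All numbers are invented.
      import numpy as np
      from scipy import stats

      beta = np.array([0.12, 0.08, 0.15])   # per-study effect estimates
      se   = np.array([0.05, 0.04, 0.06])   # per-study standard errors

      w = 1.0 / se**2                        # inverse-variance weights
      beta_meta = np.sum(w * beta) / np.sum(w)
      se_meta = np.sqrt(1.0 / np.sum(w))
      z = beta_meta / se_meta
      p = 2.0 * stats.norm.sf(abs(z))        # two-sided p-value

      print(f"pooled beta = {beta_meta:.3f}, se = {se_meta:.3f}, p = {p:.2e}")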

  7. The Impact of Arts Activity on Nursing Staff Well-Being: An Intervention in the Workplace

    PubMed Central

    Karpavičiūtė, Simona; Macijauskienė, Jūratė

    2016-01-01

    Over 59 million workers are employed in the healthcare sector globally, with a daily risk of being exposed to a complex variety of health and safety hazards. The purpose of this study was to investigate the impact of arts activity on the well-being of nursing staff. During October–December 2014, 115 nursing staff working in a hospital took part in this study, which lasted for 10 weeks. The intervention group (n = 56) took part in silk painting activities once a week. Data were collected using socio-demographic questions, the Warwick-Edinburgh Mental Well-Being Scale, the Short Form-36 Health Survey questionnaire, the Reeder stress scale, and the Multidimensional Fatigue Inventory (before and after the arts activities in both groups). Statistical analysis included descriptive statistics (frequency, percentage, mean, standard deviation), non-parametric tests (Mann-Whitney U test; Wilcoxon signed-rank test), Fisher's exact test, and reliability analysis (Cronbach's alpha). The level of significance was set at p ≤ 0.05. In the intervention group, arts activity tended to have a positive impact on general health and mental well-being, reducing stress and fatigue, awakening creativity, and increasing a sense of community at work. The control group did not show any improvements. Of the intervention group, 93% reported enjoyment, with 75% aspiring to continue arts activity in the future. This research suggests that arts activity, as a workplace intervention, can be used to promote nursing staff well-being at work. PMID:27104550
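    A minimal sketch of the two non-parametric tests named above, applied to invented well-being scores (the scipy implementations are standard; the data are not from the study and group sizes only loosely follow it).

      # Sketch of the non-parametric comparisons named in the abstract,
      # using invented well-being scores.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      pre_intervention  = rng.normal(50, 8, size=56)               # baseline
      post_intervention = pre_intervention + rng.normal(3, 5, 56)  # after 10 weeks
      post_control      = rng.normal(50, 8, size=59)               # control group

      # Between-group comparison (independent samples): Mann-Whitney U test
      u_stat, p_between = stats.mannwhitneyu(post_intervention, post_control,
                                             alternative="two-sided")

      # Within-group comparison (paired samples): Wilcoxon signed-rank test
      w_stat, p_within = stats.wilcoxon(pre_intervention, post_intervention)

      print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_between:.3f}")
      print(f"Wilcoxon W = {w_stat:.1f}, p = {p_within:.3f}")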

  8. Current Status of Endovascular Treatment for Vasospasm following Subarachnoid Hemorrhage: Analysis of JR-NET2

    PubMed Central

    HAYASHI, Kentaro; HIRAO, Tomohito; SAKAI, Nobuyuki; NAGATA, Izumi

    2014-01-01

    Endovascular treatment is employed for cerebral vasospasm following subarachnoid hemorrhage that does not respond to medical treatment; however, its effects and complications are not well known. Here, we analyzed data from the Japanese Registry of Neuroendovascular Therapy 2 (JR-NET2) to describe the current status of endovascular treatment for cerebral vasospasm. JR-NET2 was conducted from January 1, 2007 to December 31, 2009. Information on clinical status, imaging studies, treatment methods, treatment results, and status 30 days later was recorded. In total, 645 treatments in 480 patients (mean age, 59.4 years; 72.7% women) were included. Factors related to neurological improvement and treatment-related complications were statistically analyzed. The ruptured cerebral aneurysms had been treated by direct surgery in 366 cases and by endovascular treatment in 253 cases. The timing of endovascular treatment for cerebral vasospasm was within 3 hours in 209 cases, 3–6 hours in 158 cases, and more than 6 hours in 158 cases. Intra-arterial vasodilators were employed in 495 cases and percutaneous transluminal angioplasty in 140 cases. Neurological improvement was observed in 372 cases and radiological improvement in 623 cases. Treatment-related complications occurred in 20 cases (3.1%), including 6 intracranial hemorrhages, 5 cases of cerebral ischemia, 1 puncture-site complication, and 8 other events. Statistical analysis showed that early treatment was associated with neurological improvement. These data clarify the current status of endovascular treatment for cerebral vasospasm; treatment was effective, particularly when performed early. PMID:24257541
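    In its simplest form, relating treatment timing to neurological improvement could be a contingency-table test of the kind sketched below. The row totals follow the timing counts quoted above, but the improved/not-improved split within each row is invented, and the abstract does not specify the actual JR-NET2 modelling.

      # Hypothetical sketch: chi-square test of neurological improvement by
      # timing of endovascular treatment, on an invented 3x2 table.
      import numpy as np
      from scipy.stats import chi2_contingency

      #                 improved  not_improved
      table = np.array([[140,      69],    # treated within 3 hours
                        [ 95,      63],    # treated at 3-6 hours
                        [ 80,      78]])   # treated after 6 hours

      chi2, p, dof, _ = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")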

  9. Current status of endovascular treatment for vasospasm following subarachnoid hemorrhage: analysis of JR-NET2.

    PubMed

    Hayashi, Kentaro; Hirao, Tomohito; Sakai, Nobuyuki; Nagata, Izumi

    2014-01-01

    Endovascular treatment is employed for cerebral vasospasm following subarachnoid hemorrhage that does not respond to medical treatment; however, its effects and complications are not well known. Here, we analyzed data from the Japanese Registry of Neuroendovascular Therapy 2 (JR-NET2) to describe the current status of endovascular treatment for cerebral vasospasm. JR-NET2 was conducted from January 1, 2007 to December 31, 2009. Information on clinical status, imaging studies, treatment methods, treatment results, and status 30 days later was recorded. In total, 645 treatments in 480 patients (mean age, 59.4 years; 72.7% women) were included. Factors related to neurological improvement and treatment-related complications were statistically analyzed. The ruptured cerebral aneurysms had been treated by direct surgery in 366 cases and by endovascular treatment in 253 cases. The timing of endovascular treatment for cerebral vasospasm was within 3 hours in 209 cases, 3–6 hours in 158 cases, and more than 6 hours in 158 cases. Intra-arterial vasodilators were employed in 495 cases and percutaneous transluminal angioplasty in 140 cases. Neurological improvement was observed in 372 cases and radiological improvement in 623 cases. Treatment-related complications occurred in 20 cases (3.1%), including 6 intracranial hemorrhages, 5 cases of cerebral ischemia, 1 puncture-site complication, and 8 other events. Statistical analysis showed that early treatment was associated with neurological improvement. These data clarify the current status of endovascular treatment for cerebral vasospasm; treatment was effective, particularly when performed early.

  10. Optimization of Multilocus Sequence Analysis for Identification of Species in the Genus Vibrio

    PubMed Central

    Gabriel, Michael W.; Matsui, George Y.; Friedman, Robert

    2014-01-01

    Multilocus sequence analysis (MLSA) is an important method for identification of taxa that are not well differentiated by 16S rRNA gene sequences alone. In this procedure, concatenated sequences of selected genes are constructed and then analyzed. The effects that the number and the order of genes used in MLSA have on reconstruction of phylogenetic relationships were examined. The recA, rpoA, gapA, 16S rRNA gene, gyrB, and ftsZ sequences from 56 species of the genus Vibrio were used to construct molecular phylogenies, and these were evaluated individually and using various gene combinations. Phylogenies from two-gene sequences employing recA and rpoA in both possible gene orders were different. The addition of the gapA gene sequence, producing all six possible concatenated sequences, reduced the differences in phylogenies to degrees of statistical (bootstrap) support for some nodes. The overall statistical support for the phylogenetic tree, assayed on the basis of a reliability score (calculated from the number of nodes having bootstrap values of ≥80 divided by the total number of nodes) increased with increasing numbers of genes used, up to a maximum of four. No further improvement was observed from addition of the fifth gene sequence (ftsZ), and addition of the sixth gene (gyrB) resulted in lower proportions of strongly supported nodes. Reductions in the numbers of strongly supported nodes were also observed when maximum parsimony was employed for tree construction. Use of a small number of gene sequences in MLSA resulted in accurate identification of Vibrio species. PMID:24951781
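    The reliability score described above has a direct implementation: count the nodes whose bootstrap support is at least 80 and divide by the total number of nodes. The sketch below applies it to invented support values for trees built from different gene combinations; none of the numbers are the study's results.

      # Reliability score as described in the abstract: proportion of nodes
      # with bootstrap support >= 80. Support values below are invented.
      def reliability_score(bootstrap_values, threshold=80):
          supported = sum(1 for b in bootstrap_values if b >= threshold)
          return supported / len(bootstrap_values)

      # Hypothetical bootstrap supports for trees from different concatenations
      trees = {
          "recA+rpoA":          [95, 88, 72, 61, 90, 55, 83, 79],
          "recA+rpoA+gapA+16S": [97, 91, 85, 80, 92, 78, 88, 84],
          "all six genes":      [96, 89, 74, 70, 91, 66, 82, 77],
      }

      for label, supports in trees.items():
          print(f"{label}: reliability = {reliability_score(supports):.2f}")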

  11. Investment, managerial capacity, and bias in public health preparedness.

    PubMed

    Langabeer, James R; DelliFraine, Jami L; Tyson, Sandra; Emert, Jamie M; Herbold, John

    2009-01-01

    Nearly $7 billion has been invested through national cooperative funding since 2002 to strengthen state and local response capacity, yet very little outcome evidence exists with which to analyze funding effectiveness. The objective of this research is to analyze the relationship between investment (funding) and capacity (readiness) for public health preparedness (PHP). The authors aim to use a management framework to evaluate capacity and to explore the impact of the "immediacy bias" on investment stability. The study employs a longitudinal design incorporating survey research of the entire population of 68 health departments in the state of Texas. The authors assessed the investment-capacity relationship through several statistical methods: they created a structural measure of managerial capacity through principal components analysis, factorizing 10 independent variables, and augmented this with a perceived readiness level reported by PHP managers. They then employed analysis of variance, correlation analyses, and other descriptive statistics. Using paired sample data, there was a 539 percent coefficient of variation in funding at the local level between 2004 and 2008, and a 63 percent reduction in total resources since the peak of funding. Results suggest that investment is positively associated with readiness and managerial capacity in local health departments. The authors also find that investment was related to greater community collaboration, higher adoption of the Incident Command System (ICS) structure, and more frequent operational drills and exercises. Greater investment is associated with higher levels of capacity and readiness. The authors conclude that investment should be stabilized and continued, and not be influenced by historical cognitive biases.
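    A minimal sketch of how a composite managerial-capacity measure could be derived as the first principal component of several standardized indicators; the indicator values are invented, and only the general approach (factorizing multiple variables into one structural measure) follows the abstract.

      # Hypothetical sketch: a single managerial-capacity index from the first
      # principal component of 10 standardized indicators. Data are invented.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      n_departments = 68
      indicators = rng.normal(size=(n_departments, 10))   # 10 capacity indicators

      X = StandardScaler().fit_transform(indicators)
      pca = PCA(n_components=1)
      capacity_index = pca.fit_transform(X).ravel()        # first principal component

      print("Variance explained by PC1:", round(pca.explained_variance_ratio_[0], 3))
      print("First five capacity scores:", np.round(capacity_index[:5], 2))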

  12. Lewis x is highly expressed in normal tissues: a comparative immunohistochemical study and literature revision.

    PubMed

    Croce, María V; Isla-Larrain, Marina; Rabassa, Martín E; Demichelis, Sandra; Colussi, Andrea G; Crespo, Marina; Lacunza, Ezequiel; Segal-Eiras, Amada

    2007-01-01

    An immunohistochemical analysis was employed to determine the expression of carbohydrate antigens associated with mucins in normal epithelia. Tissue samples were obtained as biopsies from normal breast (18), colon (35), and oral cavity mucosa (8). The following carbohydrate epitopes were studied: sialyl-Lewis x, Lewis x, Lewis y, Tn hapten, sialyl-Tn, and the Thomsen-Friedenreich (TF) antigen. Mucins were also studied employing antibodies against MUC1, MUC2, MUC4, MUC5AC, and MUC6, as well as normal colonic glycolipid. Statistical analysis was performed and Kendall correlations were obtained. Lewis x showed an apical pattern mainly at the plasma membrane, although cytoplasmic staining was also found in most samples. The TF, Tn, and sTn haptens were detected in a few specimens, while sialyl-Lewis x was found in oral mucosa and breast tissue. Normal breast expressed MUC1 at a high percentage, whereas MUC4 was observed in a small number of samples. Colon specimens mainly expressed MUC2 and MUC1, while most oral mucosa samples expressed MUC4 and MUC1. A positive correlation between MUC1 VNTR and the TF epitope (r=0.396) was found in breast samples, while in colon specimens MUC2 and colonic glycolipid were statistically significantly correlated with Lewis x (r=0.28 and r=0.29, respectively). In conclusion, expression of a defined carbohydrate epitope is not exclusive to normal tissue or to a particular localization, and different glycoproteins and glycolipids may carry carbohydrate antigens depending on the tissue localization considered.
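    The Kendall correlations reported above are rank correlations between paired staining scores; a minimal scipy sketch with invented scores is shown below.

      # Sketch of a Kendall rank correlation between two invented sets of
      # immunohistochemical staining scores (e.g., MUC1 VNTR vs. TF, scored 0-3).
      from scipy.stats import kendalltau

      muc1_vntr  = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1, 2, 3, 0, 1, 2, 3, 2, 1]
      tf_epitope = [0, 1, 1, 2, 3, 0, 0, 2, 2, 1, 2, 3, 1, 0, 2, 3, 1, 1]

      tau, p_value = kendalltau(muc1_vntr, tf_epitope)
      print(f"Kendall tau = {tau:.2f}, p = {p_value:.4f}")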

  13. 20 CFR 668.340 - What are INA grantee allowable activities?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; (7) Career counseling; (8) Provision of employment statistics information and local, regional, and... and specialized testing and assessment; (2) Development of an individual employment plan; (3) Group counseling; (4) Individual counseling and career planning; (5) Case Management for seeking training services...

  14. Zu Problemen statistischer Methoden in der Sprachwissenschaft (Problems of Statistical Methods in Linguistics)

    ERIC Educational Resources Information Center

    Zorn, Klaus

    1973-01-01

    Discussion of the statistical apparatus employed in L. Doncheva-Mareva's article on the widespread use of present and future tense forms with future meaning in German letters, Deutsch als Fremdsprache, n1 1971. (RS)

  15. Methodology to assess clinical liver safety data.

    PubMed

    Merz, Michael; Lee, Kwan R; Kullak-Ublick, Gerd A; Brueckner, Andreas; Watkins, Paul B

    2014-11-01

    Analysis of liver safety data has to be multivariate by nature and needs to take into account time dependency of observations. Current standard tools for liver safety assessment such as summary tables, individual data listings, and narratives address these requirements to a limited extent only. Using graphics in the context of a systematic workflow including predefined graph templates is a valuable addition to standard instruments, helping to ensure completeness of evaluation, and supporting both hypothesis generation and testing. Employing graphical workflows interactively allows analysis in a team-based setting and facilitates identification of the most suitable graphics for publishing and regulatory reporting. Another important tool is statistical outlier detection, accounting for the fact that for assessment of Drug-Induced Liver Injury, identification and thorough evaluation of extreme values has much more relevance than measures of central tendency in the data. Taken together, systematical graphical data exploration and statistical outlier detection may have the potential to significantly improve assessment and interpretation of clinical liver safety data. A workshop was convened to discuss best practices for the assessment of drug-induced liver injury (DILI) in clinical trials.
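    One simple form of the statistical outlier detection mentioned above (not necessarily the workshop's recommended method) flags subjects whose peak liver-test values, expressed as multiples of the upper limit of normal, are extreme relative to a robust centre and spread. All data and thresholds below are illustrative.

      # Illustrative sketch: robust (median/MAD) z-score outlier detection on
      # peak ALT values expressed as multiples of the upper limit of normal.
      import numpy as np

      peak_alt_xuln = np.array([0.6, 0.8, 1.1, 0.9, 1.3, 0.7, 1.0, 0.8, 6.2, 0.9, 1.2])

      median = np.median(peak_alt_xuln)
      mad = np.median(np.abs(peak_alt_xuln - median))
      robust_z = 0.6745 * (peak_alt_xuln - median) / mad   # ~N(0,1) scale under normality

      outliers = np.where(np.abs(robust_z) > 3.5)[0]        # common heuristic cut-off
      print("Flagged subject indices:", outliers.tolist())
      print("Their peak ALT (xULN):", peak_alt_xuln[outliers].tolist())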

  16. Genetic programming based models in plant tissue culture: An addendum to traditional statistical approach.

    PubMed

    Mridula, Meenu R; Nair, Ashalatha S; Kumar, K Satheesh

    2018-02-01

    In this paper, we compared the efficacy of an observation-based modelling approach using a genetic algorithm with regular statistical analysis as an alternative methodology in plant research. Preliminary experimental data on in vitro rooting were used for this study, with the aim of understanding the effect of charcoal and naphthalene acetic acid (NAA) on successful rooting and of optimising the two variables for maximum result. Both observation-based modelling and the traditional approach could identify NAA as a critical factor in rooting of the plantlets under the experimental conditions employed. Symbolic regression analysis using the software deployed here optimised the treatments studied and was successful in identifying the complex non-linear interaction among the variables from minimal preliminary data. The presence of charcoal in the culture medium had a significant impact on root generation by reducing basal callus mass formation. Such an approach is advantageous for establishing in vitro culture protocols, as these models have significant potential for saving time and expenditure in plant tissue culture laboratories, and further reduce the need for a specialised background.
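    A sketch of symbolic regression by genetic programming for a two-factor rooting experiment is shown below. The abstract does not name the software used; gplearn is one open-source implementation assumed here, and the data and response function are invented.

      # Sketch: symbolic regression on invented in vitro rooting data using
      # gplearn (an assumption; not the software used in the paper).
      import numpy as np
      from gplearn.genetic import SymbolicRegressor

      rng = np.random.default_rng(0)
      naa = rng.uniform(0.0, 2.0, 60)          # NAA concentration (mg/L), invented
      charcoal = rng.choice([0.0, 1.0], 60)    # activated charcoal absent/present
      # Invented response: rooting percentage with a non-linear NAA effect
      rooting = 40 + 30 * naa - 10 * naa**2 + 8 * charcoal + rng.normal(0, 3, 60)

      X = np.column_stack([naa, charcoal])
      model = SymbolicRegressor(population_size=500, generations=20,
                                function_set=("add", "sub", "mul", "div"),
                                parsimony_coefficient=0.01, random_state=0)
      model.fit(X, rooting)
      print(model._program)   # evolved expression in X0 (NAA) and X1 (charcoal)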

  17. SCARE: A post-processor program to MSC/NASTRAN for the reliability analysis of structural ceramic components

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, J. P.

    1985-01-01

    A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
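    For the uniaxial case, the two-parameter Weibull strength model gives a failure probability P_f = 1 - exp[-(sigma/sigma_0)^m]; the modulus m and characteristic strength sigma_0 can be estimated from modulus-of-rupture bar tests by a linearized least-squares fit, as sketched below with invented fracture strengths. The program's polyaxial treatments (principle of independent action, Batdorf theory) are not reproduced here.

      # Sketch: least-squares estimation of two-parameter Weibull strength
      # parameters from bar-test fracture strengths, via the linearization
      #   ln(-ln(1 - F_i)) = m * ln(sigma_i) - m * ln(sigma_0).
      # Strength values are invented; median-rank plotting positions are used.
      import numpy as np

      strengths = np.sort(np.array([312., 335., 348., 360., 371., 383.,
                                    395., 404., 418., 433.]))      # MPa, invented
      n = strengths.size
      F = (np.arange(1, n + 1) - 0.5) / n                          # median ranks

      x = np.log(strengths)
      y = np.log(-np.log(1.0 - F))
      m, intercept = np.polyfit(x, y, 1)                           # slope = Weibull modulus
      sigma_0 = np.exp(-intercept / m)                             # characteristic strength

      print(f"Weibull modulus m = {m:.2f}")
      print(f"Characteristic strength sigma_0 = {sigma_0:.1f} MPa")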

  18. Updated Intensity - Duration - Frequency Curves Under Different Future Climate Scenarios

    NASA Astrophysics Data System (ADS)

    Ragno, E.; AghaKouchak, A.

    2016-12-01

    Current infrastructure design procedures rely on the use of Intensity - Duration - Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of the existing and future infrastructures. Here we employ historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on climatic model projections. This presentation summarizes the projected changes in statistics of extremes. We show that, based on CMIP5 simulations, extreme precipitation events in some urban areas can be 20% more severe in the future, even when projected annual mean precipitation is expected to remain similar to the ground-based climatology.
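    A minimal sketch of the stationary building block of such an analysis: fitting a generalized extreme value (GEV) distribution to annual maximum precipitation and reading off a return level. The data are synthetic, and the study's Bayesian non-stationary machinery is not reproduced.

      # Sketch: stationary GEV fit to synthetic annual-maximum precipitation
      # and the corresponding 25-year return level.
      from scipy.stats import genextreme

      annual_max_mm = genextreme.rvs(-0.1, loc=60, scale=15, size=40, random_state=7)

      c, loc, scale = genextreme.fit(annual_max_mm)           # shape, location, scale
      return_period = 25                                       # years
      level_25yr = genextreme.ppf(1 - 1 / return_period, c, loc=loc, scale=scale)

      print(f"Fitted GEV: shape={c:.2f}, loc={loc:.1f}, scale={scale:.1f}")
      print(f"Estimated 25-year return level: {level_25yr:.1f} mm")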

  19. Analysis of Acoustic Emission Parameters from Corrosion of AST Bottom Plate in Field Testing

    NASA Astrophysics Data System (ADS)

    Jomdecha, C.; Jirarungsatian, C.; Suwansin, W.

    Field testing of aboveground storage tanks (ASTs) to monitor corrosion of the bottom plate is presented in this chapter. Acoustic emission (AE) data from ten ASTs of different sizes, materials, and stored products were used to monitor bottom-plate condition. AE sensors of 30 and 150 kHz were used to monitor corrosion activity on up to 24 channels, including guard sensors. AE parameters were analyzed to explore the parameter patterns of active corrosion and compare them with laboratory results; amplitude, count, duration, and energy were the main parameters analyzed. A pattern recognition technique combined with statistical analysis was implemented to eliminate electrical and environmental noise. The results showed specific AE patterns of corrosion activity consistent with the empirical results. In addition, a planar location algorithm was used to locate significant AE events attributable to corrosion. Both the parameter patterns and the AE event locations can be used to interpret and locate corrosion activity. Finally, a basic statistical grading technique was used to evaluate the bottom-plate condition of each AST.
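    One way to implement a pattern-recognition step of this kind is unsupervised clustering of per-hit AE features, with clusters then screened against known noise signatures. The sketch below uses k-means on invented hit data and stands in for, rather than reproduces, the chapter's actual method.

      # Hypothetical sketch: clustering AE hits by amplitude, counts, duration,
      # and energy to separate candidate corrosion activity from noise.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      # Invented AE hits: [amplitude (dB), counts, duration (us), energy (a.u.)]
      corrosion_like = rng.normal([45, 30, 800, 12], [4, 8, 150, 3], size=(120, 4))
      noise_like     = rng.normal([70, 5, 100, 2],  [6, 2, 40, 1],  size=(40, 4))
      hits = np.vstack([corrosion_like, noise_like])

      X = StandardScaler().fit_transform(hits)
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

      for k in range(2):
          print(f"cluster {k}: {np.sum(labels == k)} hits, "
                f"mean amplitude = {hits[labels == k, 0].mean():.1f} dB")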
