[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 were 62% and 49%, 32% and 41%, and 6% and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasing use of statistical experts' advice and of statistical programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. At present, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
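As an illustration of the ROC analysis named above as the most common advanced technique, here is a minimal sketch with hypothetical reader confidence scores (not data from the study):

```python
# Illustrative ROC analysis: sweep a threshold over diagnostic confidence
# scores against a gold standard and compute the area under the curve (AUC).
import numpy as np

scores = np.array([0.9, 0.8, 0.75, 0.6, 0.55, 0.5, 0.4, 0.3, 0.2, 0.1])  # reader confidence
truth  = np.array([1,   1,   0,    1,   0,    1,   0,   0,   1,   0])    # 1 = disease present

thresholds = np.sort(np.unique(scores))[::-1]
tpr = [np.mean(scores[truth == 1] >= t) for t in thresholds]  # sensitivity
fpr = [np.mean(scores[truth == 0] >= t) for t in thresholds]  # 1 - specificity

# AUC by the trapezoidal rule over the (FPR, TPR) points.
fpr_all = np.concatenate(([0.0], fpr, [1.0]))
tpr_all = np.concatenate(([0.0], tpr, [1.0]))
auc = np.sum(np.diff(fpr_all) * (tpr_all[1:] + tpr_all[:-1]) / 2)
print(f"AUC = {auc:.3f}")
```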
Online Statistical Modeling (Regression Analysis) for Independent Responses
NASA Astrophysics Data System (ADS)
Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus
2017-06-01
Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to handle various types of data and complex relationships. A rich variety of advanced and recent statistical modelling approaches is available in open source software (one of them being R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny) so that the most recent and advanced statistical modelling becomes readily available, accessible, and applicable on the web. We previously made interfaces in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM, and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including modelling using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes these statistical models easier to apply and easier to compare, in order to find the most appropriate model for the data.
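A minimal sketch of the kind of LM-versus-GLM comparison such an interface exposes, on simulated data; Python's statsmodels stands in here for the paper's R back end:

```python
# Fit a linear model and a Poisson GLM to the same count response and
# compare fits by AIC. All data are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = rng.poisson(np.exp(0.3 + 0.2 * x))   # count response
X = sm.add_constant(x)

lm  = sm.OLS(y, X).fit()                                 # linear model
glm = sm.GLM(y, X, family=sm.families.Poisson()).fit()   # Poisson GLM

print(f"OLS AIC: {lm.aic:.1f}, Poisson GLM AIC: {glm.aic:.1f}")
```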
Statistical methods in personality assessment research.
Schinka, J A; LaLone, L; Broeckel, J A
1997-06-01
Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.
NAUSEA and the Principle of Supplementarity of Damping and Isolation in Noise Control.
1980-02-01
New approaches and uses of the statistical energy analysis (NAUSEA) have been considered and developed in recent months. The advances were made...possible in that the requirement, in the old statistical energy analysis, that the dynamic systems be highly reverberant and the couplings between the...analytical consideration in terms of the statistical energy analysis (SEA). A brief discussion and simple examples that relate to these recent advances
Advances in Statistical Methods for Substance Abuse Prevention Research
MacKinnon, David P.; Lockwood, Chondra M.
2010-01-01
The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018. The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and Chi-Squared distribution.
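To make the tutorial topics concrete, here is a toy run of two of the named techniques, a two-sample t-test of means and a simple linear regression; all numbers are invented:

```python
# Two-sample t-test and simple linear regression on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control   = rng.normal(10.0, 2.0, 30)
treatment = rng.normal(11.5, 2.0, 30)

t, p = stats.ttest_ind(treatment, control)
print(f"t = {t:.2f}, p = {p:.4f}")

dose = np.linspace(0, 5, 30)
response = 2.0 + 0.8 * dose + rng.normal(0, 0.5, 30)
fit = stats.linregress(dose, response)
print(f"slope = {fit.slope:.2f}, r^2 = {fit.rvalue**2:.2f}")
```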
Statistical Analysis of Research Data | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.
Advanced Categorical Statistics: Issues and Applications in Communication Research.
ERIC Educational Resources Information Center
Denham, Bryan E.
2002-01-01
Discusses not only the procedures, assumptions, and applications of advanced categorical statistics, but also some common misapplications from which a great deal can be learned. Addresses the use and limitations of cross-tabulation and chi-square analysis, as well as issues such as observation independence and artificial inflation of a…
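As a concrete illustration of cross-tabulation and chi-square analysis (with a hypothetical table, not data from the article):

```python
# Chi-square test of independence on a hypothetical 2x3 contingency table
# (e.g., media-use category by audience group).
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[30, 45, 25],
                  [50, 30, 20]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# Companion rule of thumb for validity: expected cell counts should
# generally be >= 5.
print(expected.round(1))
```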
Generalized Majority Logic Criterion to Analyze the Statistical Strength of S-Boxes
NASA Astrophysics Data System (ADS)
Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan
2012-05-01
The majority logic criterion is applicable in the evaluation process of substitution boxes used in the advanced encryption standard (AES). The performance of modified or advanced substitution boxes is predicted by processing the results of statistical analysis by the majority logic criteria. In this paper, we use the majority logic criteria to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, the majority logic criterion is applied to AES, affine power affine (APA), Gray, Lui J, residue prime, S8 AES, Skipjack, and Xyi substitution boxes. The majority logic criterion is further extended into a generalized majority logic criterion which has a broader spectrum of analyzing the effectiveness of substitution boxes in image encryption applications. The integral components of the statistical analyses used for the generalized majority logic criterion are derived from results of entropy analysis, contrast analysis, correlation analysis, homogeneity analysis, energy analysis, and mean of absolute deviation (MAD) analysis.
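A hedged sketch of two of the component statistics named above, entropy and MAD, computed with plain NumPy on a random stand-in array rather than an actual cipher image:

```python
# Shannon entropy of the gray-level histogram and mean of absolute
# deviation (MAD) for a hypothetical 8-bit image.
import numpy as np

rng = np.random.default_rng(42)
img = rng.integers(0, 256, size=(256, 256))   # stand-in for a cipher image

hist = np.bincount(img.ravel(), minlength=256)
p = hist / hist.sum()
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))   # ideal cipher image -> ~8 bits

mad = np.abs(img - img.mean()).mean()
print(f"entropy = {entropy:.3f} bits, MAD = {mad:.2f}")
```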
Han, Seong Kyu; Lee, Dongyeop; Lee, Heetak; Kim, Donghyo; Son, Heehwa G; Yang, Jae-Seong; Lee, Seung-Jae V; Kim, Sanguk
2016-08-30
Online application for survival analysis (OASIS) has served as a popular and convenient platform for the statistical analysis of various survival data, particularly in the field of aging research. With the recent advances in the fields of aging research that deal with complex survival data, we noticed a need for updates to the current version of OASIS. Here, we report OASIS 2 (http://sbi.postech.ac.kr/oasis2), which provides extended statistical tools for survival data and an enhanced user interface. In particular, OASIS 2 enables the statistical comparison of maximal lifespans, which is potentially useful for determining key factors that limit the lifespan of a population. Furthermore, OASIS 2 provides statistical and graphical tools that compare values in different conditions and times. That feature is useful for comparing age-associated changes in physiological activities, which can be used as indicators of "healthspan." We believe that OASIS 2 will serve as a standard platform for survival analysis with advanced and user-friendly statistical tools for experimental biologists in the field of aging research.
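OASIS 2 itself is a web service; as an offline illustration of the same kind of lifespan comparison, here is a sketch using the Python lifelines package (the package choice and all lifespans are assumptions, not part of OASIS):

```python
# Kaplan-Meier estimate and log-rank comparison of two hypothetical cohorts.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(7)
wild_type = rng.exponential(20, 100)   # hypothetical lifespans in days
mutant    = rng.exponential(26, 100)
observed  = np.ones(100)               # 1 = death observed (no censoring)

kmf = KaplanMeierFitter()
kmf.fit(wild_type, event_observed=observed, label="wild type")
print(f"median lifespan: {kmf.median_survival_time_:.1f} d")

res = logrank_test(wild_type, mutant,
                   event_observed_A=observed, event_observed_B=observed)
print(f"log-rank p = {res.p_value:.4f}")
```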
Wu, Yazhou; Zhou, Liang; Li, Gaoming; Yi, Dali; Wu, Xiaojiao; Liu, Xiaoyu; Zhang, Yanqi; Liu, Ling; Yi, Dong
2015-01-01
Although a substantial number of studies focus on the teaching and application of medical statistics in China, few studies comprehensively evaluate the recognition of and demand for medical statistics. In addition, the results of these various studies differ and are insufficiently comprehensive and systematic. This investigation aimed to evaluate the general cognition of and demand for medical statistics by undergraduates, graduates, and medical staff in China. We performed a comprehensive database search related to the cognition of and demand for medical statistics from January 2007 to July 2014 and conducted a meta-analysis of non-controlled studies with sub-group analysis for undergraduates, graduates, and medical staff. There are substantial differences with respect to the cognition of theory in medical statistics among undergraduates (73.5%), graduates (60.7%), and medical staff (39.6%). The demand for theory in medical statistics is high among graduates (94.6%), undergraduates (86.1%), and medical staff (88.3%). Regarding specific statistical methods, the cognition of basic statistical methods is higher than that of advanced statistical methods. The demand for certain advanced statistical methods, including (but not limited to) multiple analysis of variance (ANOVA), multiple linear regression, and logistic regression, is higher than that for basic statistical methods. The use rates of the Statistical Package for the Social Sciences (SPSS) software and statistical analysis software (SAS) are only 55% and 15%, respectively. The overall statistical competence of undergraduates, graduates, and medical staff is insufficient, and their ability to practically apply their statistical knowledge is limited, which constitutes an unsatisfactory state of affairs for medical statistics education. Because the demand for skills in this area is increasing, the need to reform medical statistics education in China has become urgent.
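A hedged sketch of the random-effects pooling such a meta-analysis performs, using DerSimonian-Laird weights on invented study proportions (the paper's data and code are not reproduced here):

```python
# DerSimonian-Laird random-effects pooling of study proportions, with
# Cochran's Q and I^2 heterogeneity statistics. Numbers are illustrative.
import numpy as np

events = np.array([147, 302, 119, 260])   # hypothetical respondents aware of a method
totals = np.array([200, 400, 180, 350])

p = events / totals
var = p * (1 - p) / totals                # within-study variance of a proportion
w = 1 / var

p_fixed = np.sum(w * p) / np.sum(w)       # fixed-effect estimate
Q = np.sum(w * (p - p_fixed) ** 2)        # Cochran's Q
df = len(p) - 1
I2 = max(0.0, (Q - df) / Q) * 100

# Between-study variance tau^2, then random-effects weights and pooling.
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (var + tau2)
p_re = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))

print(f"pooled proportion = {p_re:.3f} (95% CI {p_re - 1.96*se:.3f}-{p_re + 1.96*se:.3f})")
print(f"Q = {Q:.2f}, I^2 = {I2:.0f}%")
```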
Research Education in Undergraduate Occupational Therapy Programs.
ERIC Educational Resources Information Center
Petersen, Paul; And Others
1992-01-01
Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)
Atmospheric statistics for aerospace vehicle operations
NASA Technical Reports Server (NTRS)
Smith, O. E.; Batts, G. W.
1993-01-01
Statistical analysis of atmospheric variables was performed for the Shuttle Transportation System (STS) design trade studies and the establishment of launch commit criteria. Atmospheric constraint statistics have been developed for the NASP test flight, the Advanced Launch System, and the National Launch System. The concepts and analysis techniques discussed in the paper are applicable to the design and operations of any future aerospace vehicle.
ISSUES IN THE STATISTICAL ANALYSIS OF SMALL-AREA HEALTH DATA. (R825173)
The availability of geographically indexed health and population data, together with advances in computing, geographical information systems and statistical methodology, has opened the way for serious exploration of small area health statistics based on routine data. Such analyses may be...
ERIC Educational Resources Information Center
Yeager, Joseph; Sommer, Linda
2007-01-01
Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…
Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shear, Trevor Allan
Statistical analysis in science is an extremely powerful tool that is often underutilized, and data are frequently misinterpreted or not used to their fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview details the features of JMP® and how they were used to advance a project, resulting in time and cost savings as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold-coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.
Advanced statistical energy analysis
NASA Astrophysics Data System (ADS)
Heron, K. H.
1994-09-01
A high-frequency theory (advanced statistical energy analysis (ASEA)) is developed which takes account of the mechanism of tunnelling, uses a ray theory approach to track the power flowing around a plate or a beam network, and then uses statistical energy analysis (SEA) to take care of any residual power. ASEA divides the energy of each sub-system into energy that is freely available for transfer to other sub-systems and energy that is fixed within the sub-system; this permits power transfer between sub-systems that are physically separate. ASEA can be interpreted as a series of mathematical models, the first of which is identical to standard SEA, while subsequent higher-order models converge on an accurate prediction. Using a structural assembly of six rods as an example, ASEA is shown to converge onto the exact results while SEA is shown to overpredict by up to 60 dB.
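For orientation, a minimal power-balance solve for the standard SEA model that ASEA extends (sub-system count and loss factors are invented; this is not the ASEA tunnelling machinery):

```python
# Steady-state SEA power balance for three coupled sub-systems:
# omega * A @ E = P_in, where E holds the sub-system energies.
import numpy as np

omega = 2 * np.pi * 1000.0            # analysis band centre frequency, rad/s
eta = np.array([0.01, 0.015, 0.02])   # damping loss factors
# Coupling loss factors clf[i, j], i -> j (zeros on the diagonal).
clf = np.array([[0.0,   0.002, 0.0  ],
                [0.001, 0.0,   0.003],
                [0.0,   0.002, 0.0  ]])

P_in = np.array([1.0, 0.0, 0.0])      # watts injected into sub-system 1

A = np.diag(eta + clf.sum(axis=1)) - clf.T
E = np.linalg.solve(omega * A, P_in)
print("sub-system energies (J):", E)
```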
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2012-01-01
The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the…
Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M
2016-01-01
Symptom clusters in advanced cancer can influence patient outcomes. There is large heterogeneity in the methods used to identify symptom clusters. To investigate the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies for all patients across five primary cancer sites, and to examine which clusters predict functional status, a global assessment of health and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for the Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters. Fatigue-pain was a stronger predictor of overall health than the other clusters. The cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life. Biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
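An illustrative, reduced version of one of the methods compared above, hierarchical clustering on correlation-based distances, with simulated symptom scores standing in for the EORTC QLQ-C30 data:

```python
# Average-linkage hierarchical clustering of symptoms on correlation distance.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(3)
n = 300
emotion = rng.normal(size=n)            # latent emotional factor
somatic = rng.normal(size=n)            # latent somatic factor
symptoms = {
    "tense":   emotion + rng.normal(0, 0.6, n),
    "worry":   emotion + rng.normal(0, 0.6, n),
    "fatigue": somatic + rng.normal(0, 0.6, n),
    "pain":    somatic + rng.normal(0, 0.6, n),
}
X = np.column_stack(list(symptoms.values()))
corr = np.corrcoef(X, rowvar=False)

dist = 1 - corr                          # correlation distance between symptoms
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(dict(zip(symptoms.keys(), labels)))
```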
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
Writing to Learn Statistics in an Advanced Placement Statistics Course
ERIC Educational Resources Information Center
Northrup, Christian Glenn
2012-01-01
This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…
Response to Comments on "Evidence for mesothermy in dinosaurs".
Grady, John M; Enquist, Brian J; Dettweiler-Robinson, Eva; Wright, Natalie A; Smith, Felisa A
2015-05-29
D'Emic and Myhrvold raise a number of statistical and methodological issues with our recent analysis of dinosaur growth and energetics. However, their critiques and suggested improvements lack biological and statistical justification. Copyright © 2015, American Association for the Advancement of Science.
Advanced building energy management system demonstration for Department of Defense buildings.
O'Neill, Zheng; Bailey, Trevor; Dong, Bing; Shashanka, Madhusudana; Luo, Dong
2013-08-01
This paper presents an advanced building energy management system (aBEMS) that employs advanced methods of whole-building performance monitoring combined with statistical methods of learning and data analysis to enable identification of both gradual and discrete performance erosion and faults. This system assimilated data collected from multiple sources, including blueprints, reduced-order models (ROM) and measurements, and employed advanced statistical learning algorithms to identify patterns of anomalies. The results were presented graphically in a manner understandable to facilities managers. A demonstration of aBEMS was conducted in buildings at Naval Station Great Lakes. The facility building management systems were extended to incorporate the energy diagnostics and analysis algorithms, producing systematic identification of more efficient operation strategies. At Naval Station Great Lakes, greater than 20% savings were demonstrated for building energy consumption by improving facility manager decision support to diagnose energy faults and prioritize alternative, energy-efficient operation strategies. The paper concludes with recommendations for widespread aBEMS success. © 2013 New York Academy of Sciences.
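A toy sketch of the anomaly-flagging idea, not the aBEMS implementation: learn a weekday baseline and flag days whose residual exceeds three standard deviations. All names and numbers are illustrative:

```python
# Flag both a discrete fault and gradual erosion in daily energy use.
import numpy as np

rng = np.random.default_rng(11)
days = 120
baseline = 500 + 50 * (np.arange(days) % 7 < 5)   # weekdays run higher, kWh
use = baseline + rng.normal(0, 15, days)
use[60] += 120                                     # inject a discrete fault
use[90:] += np.linspace(0, 60, 30)                 # inject gradual erosion

weekday = np.arange(days) % 7
day_means = np.array([use[weekday == d].mean() for d in weekday])
resid = use - day_means                            # deviation from learned baseline
z = resid / resid.std()
print("anomalous days:", np.flatnonzero(np.abs(z) > 3))
```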
ERIC Educational Resources Information Center
Osler, James Edward, II
2015-01-01
This monograph provides an epistemological rationale for the Accumulative Manifold Validation Analysis [also referred to by the acronym "AMOVA"] statistical methodology designed to test psychometric instruments. This inquiry is a form of mathematical optimization in the discipline of linear stochastic modelling. AMOVA is an in-depth…
Crawford, Charles G.; Wangsness, David J.
1993-01-01
The City of Indianapolis has constructed state-of-the-art advanced municipal wastewater-treatment systems to enlarge and upgrade the existing secondary-treatment processes at its Belmont and Southport treatment plants. These new advanced-wastewater-treatment plants became operational in 1983. A nonparametric statistical procedure--a modified form of the Wilcoxon-Mann-Whitney rank-sum test--was used to test for trends in time-series water-quality data from four sites on the White River and from the Belmont and Southport wastewater-treatment plants. Time-series data representative of pre-advanced- (1978-80) and post-advanced- (1983-86) wastewater-treatment conditions were tested for trends, and the results indicate substantial changes in water quality of treated effluent and of the White River downstream from Indianapolis after implementation of advanced wastewater treatment. Water quality from 1981 through 1982 was highly variable due to plant construction. Therefore, this time period was excluded from the analysis. Water quality at sample sites located upstream from the wastewater-treatment plants was relatively constant during the period of study (1978-86). Analysis of data from the two plants and downstream from the plants indicates statistically significant decreasing trends in effluent concentrations of total ammonia, 5-day biochemical-oxygen demand, fecal-coliform bacteria, total phosphate, and total solids at all sites where sufficient data were available for testing. Because of in-plant nitrification, increases in nitrate concentration were statistically significant in the two plants and in the White River. The decrease in ammonia concentrations and 5-day biochemical-oxygen demand in the White River resulted in a statistically significant increasing trend in dissolved-oxygen concentration in the river because of reduced oxygen demand for nitrification and biochemical oxidation processes. Following implementation of advanced wastewater treatment, the number of river-quality samples that failed to meet the water-quality standards for ammonia and dissolved oxygen that apply to the White River decreased substantially.
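The trend test used in the study is a modified rank-sum procedure; a plain Wilcoxon-Mann-Whitney comparison of hypothetical pre- versus post-treatment ammonia concentrations gives the flavor:

```python
# One-sided Wilcoxon-Mann-Whitney test: were pre-treatment ammonia
# concentrations (mg/L) higher than post-treatment? Values are invented.
import numpy as np
from scipy.stats import mannwhitneyu

pre  = np.array([8.2, 6.9, 9.5, 7.4, 8.8, 10.1, 7.9, 9.0])   # 1978-80
post = np.array([1.1, 0.8, 1.6, 2.0, 0.9, 1.3, 1.8, 1.0])    # 1983-86

u, p = mannwhitneyu(pre, post, alternative="greater")
print(f"U = {u}, one-sided p = {p:.5f}")
```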
Fu, Wenjiang J.; Stromberg, Arnold J.; Viele, Kert; Carroll, Raymond J.; Wu, Guoyao
2009-01-01
Over the past two decades, there have been revolutionary developments in life science technologies characterized by high throughput, high efficiency, and rapid computation. Nutritionists now have the advanced methodologies for the analysis of DNA, RNA, protein, low-molecular-weight metabolites, as well as access to bioinformatics databases. Statistics, which can be defined as the process of making scientific inferences from data that contain variability, has historically played an integral role in advancing nutritional sciences. Currently, in the era of systems biology, statistics has become an increasingly important tool to quantitatively analyze information about biological macromolecules. This article describes general terms used in statistical analysis of large, complex experimental data. These terms include experimental design, power analysis, sample size calculation, and experimental errors (type I and II errors) for nutritional studies at population, tissue, cellular, and molecular levels. In addition, we highlighted various sources of experimental variations in studies involving microarray gene expression, real-time polymerase chain reaction, proteomics, and other bioinformatics technologies. Moreover, we provided guidelines for nutritionists and other biomedical scientists to plan and conduct studies and to analyze the complex data. Appropriate statistical analyses are expected to make an important contribution to solving major nutrition-associated problems in humans and animals (including obesity, diabetes, cardiovascular disease, cancer, ageing, and intrauterine fetal retardation). PMID:20233650
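A worked instance of the power and sample-size planning described above, assuming statsmodels and a conventional 0.5-SD effect size:

```python
# Subjects per group needed to detect a 0.5-SD effect (e.g., a diet
# treatment) with 80% power at two-sided alpha = 0.05.
from statsmodels.stats.power import TTestIndPower

n = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"~{n:.0f} subjects per group")   # roughly 64 per group
```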
78 FR 65426 - Technical Report: Evaluation of the Certified-Advanced Air Bags
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-31
... INFORMATION CONTACT: Nathan K. Greenwell, Mathematical Statistician, Evaluation Division, NVS-431, National Center for Statistics and Analysis, National Highway Traffic Safety Administration, Room W53-438, 1200... bags at all for children, using occupant detection sensors to suppress the air bags. Statistical...
Comfort and convenience analysis of advanced restraint systems
DOT National Transportation Integrated Search
1975-08-25
Five restraint systems were evaluated in terms of comfort and convenience by ten subjects. Statistical analysis of particular questions and system comparisons uncovered potential problems. The standard lap and shoulder belt system (1974 Chevrolet Imp...
Shrout, Patrick E; Rodgers, Joseph L
2018-01-04
Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.
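One questionable research practice can be made concrete with a short simulation (illustrative, not from the article): optional stopping, testing repeatedly as data accumulate and stopping at the first p < .05, inflates the false-positive rate well past the nominal 5%.

```python
# Simulate optional stopping: test after every 10 new cases per group,
# stop as soon as p < .05. The true effect is zero throughout.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2024)
false_pos = 0
n_sims = 2000
for _ in range(n_sims):
    a, b = rng.normal(size=100), rng.normal(size=100)
    for n in range(10, 101, 10):
        if ttest_ind(a[:n], b[:n]).pvalue < 0.05:
            false_pos += 1
            break
print(f"false-positive rate: {false_pos / n_sims:.3f}")  # well above 0.05
```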
2012-09-30
recognition. Algorithm design and statistical analysis and feature analysis. Post-Doctoral Associate, Cornell University, Bioacoustics Research...short. The HPC-ADA was designed based on fielded systems [1-4, 6] that offer a variety of desirable attributes, specifically dynamic resource...The software package was designed to utilize parallel and distributed processing for running recognition and other advanced algorithms. DeLMA
Modified Bayesian Kriging for Noisy Response Problems for Reliability Analysis
2015-01-01
The Statistical Consulting Center for Astronomy (SCCA)
NASA Technical Reports Server (NTRS)
Akritas, Michael
2001-01-01
The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-square minimization; and simple nonparametric procedures such as Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and matched these tools with specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks at meetings including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.
Advance Report of Final Mortality Statistics, 1985.
ERIC Educational Resources Information Center
Monthly Vital Statistics Report, 1987
1987-01-01
This document presents mortality statistics for 1985 for the entire United States. Data analysis and discussion of these factors are included: death and death rates; death rates by age, sex, and race; expectation of life at birth and at specified ages; causes of death; infant mortality; and maternal mortality. Highlights reported include: (1) the…
Investigation of Weibull statistics in fracture analysis of cast aluminum
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.; Zaretsky, Erwin V.
1989-01-01
The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
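A sketch of a two-parameter Weibull fit in SciPy, on invented coupon strengths (the study's data are not reproduced here); the location parameter is fixed at zero, matching the two-parameter form:

```python
# Fit a two-parameter Weibull distribution to fracture strengths (MPa)
# and evaluate survival probability at a design stress.
import numpy as np
from scipy.stats import weibull_min

strengths = np.array([266, 284, 290, 301, 305, 312, 318, 323, 331, 345.0])
shape, loc, scale = weibull_min.fit(strengths, floc=0)   # floc=0 -> two-parameter form
print(f"Weibull modulus m = {shape:.1f}, characteristic strength = {scale:.0f} MPa")

# Probability a specimen survives a 250 MPa design stress under this fit:
print(f"P(survive 250 MPa) = {weibull_min.sf(250, shape, scale=scale):.3f}")
```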
Probabilistic structural analysis methods and applications
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.
1988-01-01
An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
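A brute-force Monte Carlo stand-in for the fast-probability-integration idea, with a closed-form stress model replacing the finite-element code; all distributions are invented:

```python
# Propagate random load, geometry, and material scatter through a simple
# axial-stress model and estimate a failure probability.
import numpy as np

rng = np.random.default_rng(5)
N = 100_000
load  = rng.normal(110_000, 8_000, N)        # N
width = rng.normal(0.05, 0.001, N)           # m
thick = rng.normal(0.01, 0.0005, N)          # m
yield_strength = rng.normal(270e6, 15e6, N)  # Pa

stress = load / (width * thick)              # stand-in structural response
p_fail = np.mean(stress > yield_strength)
print(f"P(stress > yield) ~ {p_fail:.2e}")
```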
Evaluating Cellular Polyfunctionality with a Novel Polyfunctionality Index
Larsen, Martin; Sauce, Delphine; Arnaud, Laurent; Fastenackels, Solène; Appay, Victor; Gorochov, Guy
2012-01-01
Functional evaluation of naturally occurring or vaccination-induced T cell responses in mice, men and monkeys has in recent years advanced from single-parameter (e.g. IFN-γ secretion) to much more complex multidimensional measurements. Co-secretion of multiple functional molecules (such as cytokines and chemokines) at the single-cell level is now measurable due primarily to major advances in multiparametric flow cytometry. The very extensive and complex datasets generated by this technology raise the demand for proper analytical tools that enable the analysis of combinatorial functional properties of T cells, hence polyfunctionality. Presently, multidimensional functional measures are analysed either by evaluating all combinations of parameters individually or by summing frequencies of combinations that include the same number of simultaneous functions. Often these evaluations are visualized as pie charts. Whereas pie charts effectively represent and compare average polyfunctionality profiles of particular T cell subsets or patient groups, they do not document the degree or variation of polyfunctionality within a group, nor do they allow more sophisticated statistical analysis. Here we propose a novel polyfunctionality index that numerically evaluates the degree and variation of polyfunctionality, and enables comparative and correlative parametric and non-parametric statistical tests. Moreover, it allows the usage of more advanced statistical approaches, such as cluster analysis. We believe that the polyfunctionality index will render polyfunctionality an appropriate end-point measure in future studies of T cell responsiveness. PMID:22860124
Statistical and Economic Techniques for Site-specific Nematode Management.
Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L
2014-03-01
Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
Clark, Robin A; Shoaib, Mohammed; Hewitt, Katherine N; Stanford, S Clare; Bate, Simon T
2012-08-01
InVivoStat is a free-to-use statistical software package for the analysis of data generated from animal experiments. The package is designed specifically for researchers in the behavioural sciences, where exploiting the experimental design is crucial for reliable statistical analyses. This paper compares the analyses of three experiments conducted using InVivoStat with those from other widely used statistical packages: SPSS (V19), PRISM (V5), UniStat (V5.6) and Statistica (V9). We show that InVivoStat provides results that are similar to those from the other packages and, in some cases, are more advanced. This investigation provides further validation of InVivoStat and should strengthen users' confidence in this new software package.
MEXICAN-AMERICAN STUDY PROJECT. ADVANCE REPORT 4, RESIDENTIAL SEGREGATION IN THE URBAN SOUTHWEST.
ERIC Educational Resources Information Center
MOORE, JOAN W.; AND OTHERS
This advance report presents a statistical analysis of the degree of residential segregation of the Mexican-American and Negro subpopulations from the Anglo subpopulations in urban areas. All of the data were drawn from the 1950 and 1960 Censuses of Population and Housing. Factors studied include urbanization patterns and origins of…
Winer, E Samuel; Cervone, Daniel; Bryant, Jessica; McKinney, Cliff; Liu, Richard T; Nadorff, Michael R
2016-09-01
A popular way to attempt to discern causality in clinical psychology is through mediation analysis. However, mediation analysis is sometimes applied to research questions in clinical psychology when inferring causality is impossible. This practice may soon increase with new, readily available, and easy-to-use statistical advances. Thus, we here provide a heuristic to remind clinical psychological scientists of the assumptions of mediation analyses. We describe recent statistical advances and unpack assumptions of causality in mediation, underscoring the importance of time in understanding mediational hypotheses and analyses in clinical psychology. Example analyses demonstrate that statistical mediation can occur despite theoretical mediation being improbable. We propose a delineation of mediational effects derived from cross-sectional designs into the terms temporal and atemporal associations to emphasize time in conceptualizing process models in clinical psychology. The general implications for mediational hypotheses and the temporal frameworks from within which they may be drawn are discussed. © 2016 Wiley Periodicals, Inc.
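The article's example analyses are not reproduced here, but a minimal product-of-coefficients sketch on simulated cross-sectional data shows how a statistical "indirect effect" can be computed regardless of whether any causal process justifies it:

```python
# Baron-Kenny-style mediation: a = effect of x on m; b = effect of m on y
# controlling for x; the "indirect effect" is a*b. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 500
x = rng.normal(size=n)                      # predictor
m = 0.5 * x + rng.normal(size=n)            # putative mediator
y = 0.4 * m + 0.2 * x + rng.normal(size=n)  # outcome

a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                         # x -> m
b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[2]   # m -> y | x
print(f"indirect effect a*b = {a * b:.3f}")
```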
USDA-ARS?s Scientific Manuscript database
The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...
Pan, Larry; Baek, Seunghee; Edmonds, Pamela R; Roach, Mack; Wolkov, Harvey; Shah, Satish; Pollack, Alan; Hammond, M Elizabeth; Dicker, Adam P
2013-04-25
Angiogenesis is a key element in solid-tumor growth, invasion, and metastasis. VEGF is among the most potent angiogenic factors thus far detected. The aim of the present study is to explore the potential of VEGF (also known as VEGF-A) as a prognostic and predictive biomarker among men with locally advanced prostate cancer. The analysis was performed using patients enrolled on RTOG 8610, a phase III randomized control trial of radiation therapy alone (Arm 1) versus short-term neoadjuvant and concurrent androgen deprivation and radiation therapy (Arm 2) in men with locally advanced prostate carcinoma. Tissue samples were obtained from the RTOG tissue repository. Hematoxylin and eosin slides were reviewed, and paraffin blocks were immunohistochemically stained for VEGF expression and graded by intensity score (0-3). Cox or Fine and Gray's proportional hazards models were used. Sufficient pathologic material was available from 103 (23%) of the 456 analyzable patients enrolled in the RTOG 8610 study. There were no statistically significant differences in the pre-treatment characteristics between the patient groups with and without VEGF intensity data. Median follow-up for all surviving patients with VEGF intensity data is 12.2 years. Univariate and multivariate analyses demonstrated no statistically significant correlation between the intensity of VEGF expression and overall survival, distant metastasis, local progression, disease-free survival, or biochemical failure. VEGF expression was also not statistically significantly associated with any of the endpoints when analyzed by treatment arm. This study revealed no statistically significant prognostic or predictive value of VEGF expression for locally advanced prostate cancer. This analysis is among the largest sample bases with long-term follow-up in a well-characterized patient population. There is an urgent need to establish multidisciplinary initiatives for coordinating further research in the area of human prostate cancer biomarkers.
Wavelet Statistical Analysis of Low-Latitude Geomagnetic Measurements
NASA Astrophysics Data System (ADS)
Papa, A. R.; Akel, A. F.
2009-05-01
Following previous works by our group (Papa et al., JASTP, 2006), where we analyzed a series of records acquired at the Vassouras National Geomagnetic Observatory in Brazil for the month of October 2000, we introduced a wavelet analysis for the same type of data and for other periods. It is well known that wavelets allow a more detailed study in several senses: the time window for analysis can be drastically reduced compared to other traditional methods (Fourier, for example), and at the same time they allow an almost continuous accompaniment of both amplitude and frequency of signals as time goes by. This advantage brings some possibilities for potentially useful forecasting methods of the type also advanced by our group in previous works (see, for example, Papa and Sosman, JASTP, 2008). However, the simultaneous statistical analysis of both time series (in our case amplitude and frequency) is a challenging matter, and it is in this sense that we have found what we consider our main goal. Some possible trends for future works are advanced.
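A sketch of a discrete wavelet decomposition of a geomagnetic-like record, assuming the PyWavelets package (the abstract does not name its software); band-wise energies give the kind of simultaneous amplitude-frequency summary discussed:

```python
# Decompose a synthetic record into wavelet bands and report band energies.
import numpy as np
import pywt

rng = np.random.default_rng(9)
t = np.arange(4096)
signal = (np.sin(2 * np.pi * t / 512)          # slow, daily-type variation
          + 0.3 * np.sin(2 * np.pi * t / 32)   # faster disturbance
          + 0.2 * rng.normal(size=t.size))     # noise

coeffs = pywt.wavedec(signal, "db4", level=6)  # [a6, d6, d5, ..., d1]
for name, c in zip(["a6"] + [f"d{j}" for j in range(6, 0, -1)], coeffs):
    print(f"{name}: energy = {float(np.sum(c**2)):.1f}")
```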
Tocolysis in women with advanced preterm labor: a secondary analysis of a randomized clinical trial.
Klauser, Chad K; Briery, Christian M; Tucker, Ann R; Martin, Rick W; Magann, Everett F; Chauhan, Suneet P; Morrison, John C
2016-03-01
To compare the efficacy of tocolytic treatment with indomethacin (I), magnesium sulfate (M) and nifedipine (N) for acute tocolysis in women with advanced cervical dilation (4-6 cm). A single center, randomized trial was carried out involving patients in preterm labor (cervix 1-6 cm). Secondary analysis of women with advanced cervical dilation (cervix 4-6 cm) at 24-32 weeks' gestation who received intravenous M, oral N or I suppositories comprised this study population. Over 38 months, 92 women with advanced cervical dilation were randomized to one tocolytic type. Days gained in utero (11.7) and percent remaining undelivered at 48 h (60.8%), 72 h (53.1%) and >7 days (38.3%) were similar regardless of the tocolytic employed (p = 0.923, 0.968, 0.791, 0.802, respectively). Likewise, gestational age at delivery (30.7 ± 3.2) was similar between groups (p = 0.771). Finally, neonatal outcomes did not differ when stratified by tocolytic treatment. There were no statistical differences between tocolytics in treating women with advanced cervical dilation. All offered significant days gained in utero after therapy, and a high percentage remained undelivered after 48 or 72 h and after 7 days. It would appear from these data that there may be advantages to tocolytic treatment even in women with advanced cervical dilation.
Statistics used in current nursing research.
Zellner, Kathleen; Boerst, Connie J; Tabb, Wil
2007-02-01
Undergraduate nursing research courses should emphasize the statistics most commonly used in the nursing literature to strengthen students' and beginning researchers' understanding of them. To determine the most commonly used statistics, we reviewed all quantitative research articles published in 13 nursing journals in 2000. The findings supported Beitz's categorization of kinds of statistics. Ten primary statistics used in 80% of nursing research published in 2000 were identified. We recommend that the appropriate use of those top 10 statistics be emphasized in undergraduate nursing education and that the nursing profession continue to advocate for the use of methods (e.g., power analysis, odds ratio) that may contribute to the advancement of nursing research.
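As an example of one of the advocated methods, here is an odds ratio with a 95% confidence interval computed from a hypothetical 2x2 table (not data from the review):

```python
# Odds ratio and Woolf (log) 95% confidence interval from a 2x2 table.
import numpy as np

#            outcome+  outcome-
# exposed       40        60
# unexposed     25        75
a, b, c, d = 40, 60, 25, 75

or_ = (a * d) / (b * c)
se_log = np.sqrt(1/a + 1/b + 1/c + 1/d)
lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se_log)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```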
Advanced microwave soil moisture studies. [Big Sioux River Basin, Iowa
NASA Technical Reports Server (NTRS)
Dalsted, K. J.; Harlan, J. C.
1983-01-01
Low-level L-band brightness temperature (TB) and thermal infrared (TIR) data, together with the following data sets: a soil map and land cover data, direct soil moisture measurements, and a computer-generated contour map, were statistically evaluated using regression analysis and linear discriminant analysis. Regression analysis of footprint data shows that statistical groupings of ground variables (soil features and land cover) hold promise for qualitative assessment of soil moisture and for reducing variance within the sampling space. Dry conditions appear to be more conducive to producing meaningful statistics than wet conditions. Regression analysis using field-averaged TB and TIR data did not approach the higher R² values obtained using within-field variations. The linear discriminant analysis indicates some capacity to distinguish categories, with the results being somewhat better on a field basis than on a footprint basis.
Scaling up to address data science challenges
Wendelberger, Joanne R.
2017-04-27
Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. This article explores various challenges in Data Science and highlights statistical approaches that can facilitate the analysis of large-scale data, including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
ERIC Educational Resources Information Center
Osler, James Edward
2015-01-01
This monograph provides a neuroscience-based systemological, epistemological, and methodological rationale for the design of an advanced and novel parametric statistical analytic for the biological sciences, referred to as "Biotrichotomy". The aim of this new arena of statistics is to provide dual metrics designed to analyze the…
Advanced functional network analysis in the geosciences: The pyunicorn package
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen
2013-04-01
Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the language Python. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn allows one to study the complex dynamics of geoscientific systems, as recorded by time series, by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined, drawing on several examples from climatology.
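A generic NumPy illustration of the functional-network construction that pyunicorn automates (this is not the pyunicorn API; all names and thresholds are invented):

```python
# Build a correlation-threshold network from synthetic "grid cell" series
# and read off simple network statistics.
import numpy as np

rng = np.random.default_rng(12)
n_nodes, n_time = 40, 500
common = rng.normal(size=n_time)                           # shared climate signal
data = 0.4 * common + rng.normal(size=(n_nodes, n_time))   # weakly coupled cells

corr = np.corrcoef(data)
adj = (np.abs(corr) > 0.3) & ~np.eye(n_nodes, dtype=bool)  # threshold -> adjacency

degree = adj.sum(axis=1)
density = adj.sum() / (n_nodes * (n_nodes - 1))
print(f"link density = {density:.2f}, mean degree = {degree.mean():.1f}")
```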
DOT National Transportation Integrated Search
2014-03-01
Recent research in highway safety has focused on the more advanced and statistically proven techniques of highway : safety analysis. This project focuses on the two most recent safety analysis tools, the Highway Safety Manual (HSM) : and SafetyAnalys...
Incipient fault detection study for advanced spacecraft systems
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Black, Michael C.; Hovenga, J. Mike; Mcclure, Paul F.
1986-01-01
A feasibility study to investigate the application of vibration monitoring to the rotating machinery of planned NASA advanced spacecraft components is described. Factors investigated include: (1) special problems associated with small, high-RPM machines; (2) application across multiple component types; (3) microgravity; (4) multiple fault types; (5) comparison of eight different analysis techniques, including signature analysis, high-frequency demodulation, cepstrum, clustering, amplitude analysis, and pattern recognition; and (6) small-sample statistical analysis used to compare performance by computation of probability of detection and false alarm for an ensemble of repeated baseline and faulted tests. Both detection and classification performance are quantified. Vibration monitoring is shown to be an effective means of detecting the most important problem types for small, high-RPM fans and pumps typical of those planned for the advanced spacecraft. A preliminary monitoring system design and implementation plan is presented.
NASA Astrophysics Data System (ADS)
Boning, Duane S.; Chung, James E.
1998-11-01
Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
Qumseya, Bashar J; Wang, Haibo; Badie, Nicole; Uzomba, Rosemary N; Parasa, Sravanthi; White, Donna L; Wolfsen, Herbert; Sharma, Prateek; Wallace, Michael B
2013-12-01
US guidelines recommend surveillance of patients with Barrett's esophagus (BE) to detect dysplasia. BE conventionally is monitored via white-light endoscopy (WLE) and a collection of random biopsy specimens. However, this approach does not definitively or consistently detect areas of dysplasia. Advanced imaging technologies can increase the detection of dysplasia and cancer. We investigated whether these imaging technologies can increase the diagnostic yield for the detection of neoplasia in patients with BE, compared with WLE and analysis of random biopsy specimens. We performed a systematic review, using Medline and Embase, to identify relevant peer-reviewed studies. Fourteen studies were included in the final analysis, with a total of 843 patients. Our metameter (estimate) of interest was the paired-risk difference (RD), defined as the difference in yield of the detection of dysplasia or cancer using advanced imaging vs WLE. The estimated paired-RD and 95% confidence interval (CI) were obtained using random-effects models. Heterogeneity was assessed by means of the Q statistic and the I² statistic. An exploratory meta-regression was performed to look for associations between the metameter and potential confounders or modifiers. Overall, advanced imaging techniques increased the diagnostic yield for detection of dysplasia or cancer by 34% (95% CI, 20%-56%; P < .0001). A subgroup analysis showed that virtual chromoendoscopy significantly increased the diagnostic yield (RD, 0.34; 95% CI, 0.14-0.56; P < .0001). The RD for chromoendoscopy was 0.35 (95% CI, 0.13-0.56; P = .0001). There was no significant difference between virtual chromoendoscopy and chromoendoscopy, based on Student t test analysis (P = .45). Based on a meta-analysis, advanced imaging techniques such as chromoendoscopy or virtual chromoendoscopy significantly increase the diagnostic yield for identification of dysplasia or cancer in patients with BE. Copyright © 2013 AGA Institute. Published by Elsevier Inc. All rights reserved.
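The pooling step this abstract describes can be sketched in R with the metafor package (an assumption; the abstract names no software). The per-study counts below are hypothetical, and the sketch ignores the within-patient pairing that the authors' paired-RD metameter accounts for:

    library(metafor)
    # Hypothetical 2x2 counts for three studies: events/non-events per arm
    dat <- data.frame(ai = c(12, 9, 15), bi = c(38, 41, 35),  # advanced imaging
                      ci = c(5, 4, 8),   di = c(45, 46, 42))  # WLE + random biopsies
    es  <- escalc(measure = "RD", ai = ai, bi = bi, ci = ci, di = di, data = dat)
    fit <- rma(yi, vi, data = es, method = "REML")  # random-effects pooling
    summary(fit)  # pooled risk difference with 95% CI, Q statistic, I^2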
Single-case research design in pediatric psychology: considerations regarding data analysis.
Cohen, Lindsey L; Feinstein, Amanda; Masuda, Akihiko; Vowles, Kevin E
2014-03-01
Single-case research allows for an examination of behavior and can demonstrate the functional relation between intervention and outcome in pediatric psychology. This review highlights key assumptions, methodological and design considerations, and options for data analysis. Single-case methodology and guidelines are reviewed with an in-depth focus on visual and statistical analyses. Guidelines allow for the careful evaluation of design quality and visual analysis. A number of statistical techniques have been introduced to supplement visual analysis, but to date, there is no consensus on their recommended use in single-case research design. Single-case methodology is invaluable for advancing pediatric psychology science and practice, and guidelines have been introduced to enhance the consistency, validity, and reliability of these studies. Experts generally agree that visual inspection is the optimal method of analysis in single-case design; however, statistical approaches are increasingly being evaluated and used to augment data interpretation.
NASA Technical Reports Server (NTRS)
1993-01-01
The Marshall Space Flight Center is responsible for the development and management of advanced launch vehicle propulsion systems, including the Space Shuttle Main Engine (SSME), which is presently operational, and the Space Transportation Main Engine (STME) under development. The SSMEs provide high performance within stringent constraints on size, weight, and reliability. Based on operational experience, continuous design improvement is in progress to enhance system durability and reliability. Specialized data analysis and interpretation is required in support of SSME and advanced propulsion system diagnostic evaluations. Comprehensive evaluation of the dynamic measurements obtained from test and flight operations is necessary to provide timely assessment of the vibrational characteristics indicating the operational status of turbomachinery and other critical engine components. Efficient performance of this effort is critical due to the significant impact of dynamic evaluation results on ground test and launch schedules, and requires direct familiarity with SSME and derivative systems, test data acquisition, and diagnostic software. Detailed analysis and evaluation of dynamic measurements obtained during SSME and advanced system ground test and flight operations was performed, including analytical/statistical assessment of component dynamic behavior, and the development and implementation of analytical/statistical models to efficiently define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational condition. In addition, the SSME and J-2 data will be applied to develop vibroacoustic environments for advanced propulsion system components, as required. This study will provide timely assessment of engine component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. This contract will be performed through accomplishment of negotiated task orders.
Reproducible research in vadose zone sciences
USDA-ARS?s Scientific Manuscript database
A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...
The US EPA’s ToxCastTM program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
NASA Astrophysics Data System (ADS)
Poccia, Nicola; Campi, Gaetano; Ricci, Alessandro; Caporale, Alessandra S.; di Cola, Emanuela; Hawkins, Thomas A.; Bianconi, Antonio
2014-06-01
Degradation of the myelin sheath is a common pathology underlying demyelinating neurological diseases from Multiple Sclerosis to leukodystrophies. Although large malformations of myelin ultrastructure in the advanced stages of Wallerian degeneration are known, subtle structural variations at early stages of demyelination remain poorly characterized. This is partly due to the lack of suitable and non-invasive experimental probes possessing sufficient resolution to detect the degradation. Here we report the feasibility of an innovative non-invasive local-structure experimental approach for imaging the changes of statistical structural fluctuations in the first stage of myelin degeneration. Scanning micro X-ray diffraction, exploiting advances in synchrotron X-ray beam focusing and fast data collection, paired with spatial statistical analysis, has been used to unveil temporal changes in the myelin structure of dissected nerves following extraction of the Xenopus laevis sciatic nerve. Early myelin degeneration is a specific ordered compacted phase preceding the swollen myelin phase of Wallerian degeneration. Our demonstration of the feasibility of the statistical analysis of SµXRD measurements using biological tissue paves the way for further structural investigations of degradation and death of neurons and other cells and tissues in diverse pathological states where nanoscale structural changes may be uncovered.
SimHap GUI: an intuitive graphical user interface for genetic association analysis.
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-12-25
Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.
New software for statistical analysis of Cambridge Structural Database data
Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.
2011-01-01
A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused on structural analysis, such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784
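Not the Mercury implementation itself, but a generic R sketch of the kind of principal components analysis the new tools offer, applied to hypothetical geometric parameters of search fragments:

    set.seed(1)
    geom <- matrix(rnorm(300), ncol = 3,
                   dimnames = list(NULL, c("bond_len", "angle", "torsion")))
    pca <- prcomp(geom, center = TRUE, scale. = TRUE)  # standardised PCA
    summary(pca)   # variance explained by each component
    head(pca$x)    # fragment coordinates in principal-component space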
Computational methods to extract meaning from text and advance theories of human cognition.
McNamara, Danielle S
2011-01-01
Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA. Copyright © 2010 Cognitive Science Society, Inc.
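The statistical core of LSA can be conveyed with a short R sketch: a truncated singular value decomposition of a term-document matrix, with cosine similarity between the resulting word vectors. The toy matrix is hypothetical; real LSA applies term weighting and far larger corpora:

    tdm <- matrix(c(2, 0, 1,  0, 3, 1,  1, 1, 0,  0, 2, 2), nrow = 4, byrow = TRUE,
                  dimnames = list(c("star", "galaxy", "planet", "orbit"),
                                  c("doc1", "doc2", "doc3")))
    k   <- 2                                         # retained dimensions
    dec <- svd(tdm)
    term_space <- dec$u[, 1:k] %*% diag(dec$d[1:k])  # words as k-dim meaning vectors
    cosine <- function(a, b) sum(a * b) / sqrt(sum(a^2) * sum(b^2))
    cosine(term_space[1, ], term_space[2, ])         # semantic similarity of two terms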
Ramón, M; Martínez-Pastor, F
2018-04-23
Computer-aided sperm analysis (CASA) produces a wealth of data that is frequently ignored. The use of multiparametric statistical methods can help explore these datasets, unveiling the subpopulation structure of sperm samples. In this review we analyse the significance of the internal heterogeneity of sperm samples and its relevance. We also provide a brief description of the statistical tools used for extracting sperm subpopulations from the datasets, namely unsupervised clustering (with non-hierarchical, hierarchical and two-step methods) and the most advanced supervised methods, based on machine learning. The former have allowed exploration of subpopulation patterns in many species, whereas the latter offer further possibilities, especially considering functional studies and the practical use of subpopulation analysis. We also consider novel approaches, such as the use of geometric morphometrics or imaging flow cytometry. Finally, although applying clustering analyses to the data provided by CASA systems yields valuable information on sperm samples, there are several caveats. Protocols for capturing and analysing motility or morphometry should be standardised and adapted to each experiment, and the algorithms should be open in order to allow comparison of results between laboratories. Moreover, we must be aware of new technology that could change the paradigm for studying sperm motility and morphology.
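A minimal R sketch of the unsupervised clustering step the review describes, on hypothetical CASA motility variables (VCL, VSL, ALH); the standardisation and the number of clusters are illustrative choices, not a recommended protocol:

    set.seed(1)
    casa <- data.frame(VCL = c(rnorm(50, 120, 15), rnorm(50, 60, 10)),
                       VSL = c(rnorm(50,  90, 10), rnorm(50, 30,  8)),
                       ALH = c(rnorm(50,   4,  1), rnorm(50,  2, 0.5)))
    z  <- scale(casa)                          # standardise before clustering
    km <- kmeans(z, centers = 2, nstart = 25)  # non-hierarchical clustering
    hc <- hclust(dist(z), method = "ward.D2")  # hierarchical clustering (Ward)
    table(km$cluster, cutree(hc, k = 2))       # compare the two subpopulation solutions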
Validating an Air Traffic Management Concept of Operation Using Statistical Modeling
NASA Technical Reports Server (NTRS)
He, Yuning; Davies, Misty Dawn
2013-01-01
Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They also can be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical value of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and ultimately reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as radar track data, into the analysis.
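A hedged sketch of the design-of-computer-experiments idea: draw a space-filling sample of the parameter space, run the simulator, and fit a smooth classifier whose 0.5-probability contour approximates the safety boundary. The lhs and mgcv packages and the stand-in simulator are assumptions; the actual T-TSAFE analysis uses a sequential design informed by domain experts:

    library(lhs)
    library(mgcv)
    set.seed(2)
    X <- randomLHS(200, 2)                    # space-filling design in [0, 1]^2
    sim <- function(x) as.numeric(x[1]^2 + x[2] < 0.8)  # stand-in for simulator runs
    d <- data.frame(safe = apply(X, 1, sim), p1 = X[, 1], p2 = X[, 2])
    fit <- gam(safe ~ s(p1, p2), family = binomial, data = d)  # smooth classifier
    # Predicting fit over a grid and tracing the 0.5 contour estimates the envelope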
ERIC Educational Resources Information Center
Hsieh, Chueh-An; Maier, Kimberly S.
2009-01-01
The capacity of Bayesian methods in estimating complex statistical models is undeniable. Bayesian data analysis is seen as having a range of advantages, such as an intuitive probabilistic interpretation of the parameters of interest, the efficient incorporation of prior information to empirical data analysis, model averaging and model selection.…
Collaboration and Synergy among Government, Industry and Academia in M&S Domain: Turkey’s Approach
2009-10-01
Analysis, Decision Support System Design and Implementation, Simulation Output Analysis, Statistical Data Analysis, Virtual Reality, Artificial... virtual and constructive visual simulation systems as well as integrated advanced analytical models. Collaboration and Synergy among Government...simulation systems that are ready to use, credible, integrated with C4ISR systems. Creating synthetic environments and/or virtual prototypes of concepts
Peicius, Eimantas; Blazeviciene, Aurelija; Kaminskas, Raimondas
2017-06-05
This paper joins the debate over changes in the role of health professionals when applying advance directives to manage the decision-making process in end-of-life care. Issues in relation to advance directives occur in clinical units in Lithuania; however, Lithuania remains one of the few countries in the European Union (EU) where the discussion on advance directives is not included in the health-care policy-making agenda. To encourage the discussion of advance directives, a study was designed to examine health professionals' understanding and preferences related to advance directives. In addition, the study sought to explore the views of health care professionals on the application of Advance Directives (AD) in clinical practice in Lithuania. A cross-sectional survey was conducted by interviewing 478 health professionals based at major health care centers in Kaunas district, Lithuania. The design of the study included the use of a questionnaire developed for this study and validated by a pilot study. The collected data were analyzed using standard descriptive statistical methods. The analysis of knowledge about AD revealed some statistically significant differences when comparing the respondents' profession and gender. The analysis also indicated key emerging themes among respondents, including tranquility of mind, the longest possible life expectancy and freedom of choice. Further, the study findings revealed that more than half of the study participants preferred to express their will while alive by using advance directives. The study findings revealed a low level of knowledge on advance directives among health professionals. Most health professionals agreed that ADs improved end-of-life decision making, while the majority of physicians appreciated AD as the best tool for sharing responsibilities in clinical practice in Lithuania. More physicians than nurses preferred the presence of advance directives to support their decision making in end-of-life situations.
An exploratory investigation of weight estimation techniques for hypersonic flight vehicles
NASA Technical Reports Server (NTRS)
Cook, E. L.
1981-01-01
The three basic methods of weight prediction (fixed-fraction, statistical correlation, and point stress analysis) and some of the computer programs that have been developed to implement them are discussed. A modified version of the WAATS (Weights Analysis of Advanced Transportation Systems) program is presented, along with input data forms and an example problem.
Bill Block
2012-01-01
I have been Editor-in-Chief for about 10 months now. Over that period of time, I have processed hundreds of manuscripts and considered hundreds of reviews. In doing so, I have noticed an emphasis on analysis at the expense of a better understanding of the ecological system under study. I mention this not to belittle statistical advances made within various disciplines...
NASA Technical Reports Server (NTRS)
Rana, D. S.
1980-01-01
The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.
Statistical strategy for anisotropic adventitia modelling in IVUS.
Gil, Debora; Hernández, Aura; Rodriguez, Oriol; Mauri, Josepa; Radeva, Petia
2006-06-01
Vessel plaque assessment by analysis of intravascular ultrasound sequences is a useful tool for cardiac disease diagnosis and intervention. Manual detection of luminal (inner) and media-adventitia (external) vessel borders is the main activity of physicians in the process of lumen narrowing (plaque) quantification. Automated adventitia segmentation is hampered by the difficulty of defining vessel border descriptors, as well as by shadows, artifacts, and blurred signal response due to ultrasound physical properties. In order to efficiently approach such a complex problem, we propose blending advanced anisotropic filtering operators and statistical classification techniques into a vessel border modelling strategy. Our systematic statistical analysis shows that the reported adventitia detection achieves an accuracy in the range of interobserver variability regardless of plaque nature, vessel geometry, and incomplete vessel borders.
An Analysis Methodology for the Gamma-ray Large Area Space Telescope
NASA Technical Reports Server (NTRS)
Morris, Robin D.; Cohen-Tanugi, Johann
2004-01-01
The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, an alternative to the standard event analysis inherited from high-energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.
An advanced probabilistic structural analysis method for implicit performance functions
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.
1989-01-01
In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method, which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
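For context, the mean-based, second-moment baseline that the AMV method improves upon can be sketched in base R: propagate input means and standard deviations through a performance function g via a first-order Taylor expansion with numerical gradients. The function and numbers are illustrative, and the AMV correction itself is not reproduced here:

    g     <- function(x) x[1]^2 / x[2]      # illustrative performance function
    mu    <- c(10, 2)                       # input means
    sigma <- c(0.5, 0.2)                    # input standard deviations
    eps   <- 1e-6
    grad  <- sapply(1:2, function(i) {
      h <- replace(numeric(2), i, eps)
      (g(mu + h) - g(mu - h)) / (2 * eps)   # central-difference derivative
    })
    m1 <- g(mu)                             # first moment: mean response
    v  <- sum((grad * sigma)^2)             # second moment: response variance
    c(mean = m1, sd = sqrt(v))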
Statistical contact angle analyses; "slow moving" drops on a horizontal silicon-oxide surface.
Schmitt, M; Grub, J; Heib, F
2015-06-01
Sessile drop experiments on horizontal surfaces are commonly used to characterise surface properties in science and in industry. The advancing angle and the receding angle are measurable on every solid. Especially on horizontal surfaces, even the notions themselves are critically questioned by some authors. Building a standard, reproducible and valid method of measuring and defining specific (advancing/receding) contact angles is an important challenge of surface science. We recently developed three approaches, sigmoid fitting, independent statistical analysis and dependent statistical analysis, which are practicable for determining specific angles/slopes when the sample surface is inclined. These approaches yield contact angle data that are independent of user skill and operator subjectivity, which is also urgently needed for evaluating dynamic contact angle measurements. We show in this contribution that the slightly modified procedures are also applicable to finding specific angles in experiments on horizontal surfaces. As an example, droplets on a flat, freshly cleaned silicon-oxide surface (wafer) are measured dynamically by the sessile drop technique while the volume of the liquid is increased/decreased. The triple points, the time and the contact angles during the advancing and the receding of the drop, obtained by high-precision drop shape analysis, are statistically analysed. As stated in the previous contribution, the procedure is called "slow movement" analysis because of the small distance covered and the dominance of data points with low velocity. Even the smallest variations in velocity, such as the minimal advancing motion during the withdrawing of the liquid, are identifiable, which confirms the flatness and the chemical homogeneity of the sample surface and the high sensitivity of the presented approaches. Copyright © 2014 Elsevier Inc. All rights reserved.
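A sketch of the sigmoid-fitting idea on hypothetical drop-shape data: fit a logistic curve to contact angle versus time with R's nls and read off the upper plateau as the specific advancing angle. This illustrates the concept only, not the authors' exact fitting procedure:

    set.seed(7)
    t     <- seq(0, 60, by = 0.5)            # time during the volume increase
    theta <- 65 + 10 / (1 + exp(-(t - 30) / 5)) + rnorm(length(t), 0, 0.3)
    fit   <- nls(theta ~ a + b / (1 + exp(-(t - t0) / s)),
                 start = list(a = 60, b = 12, t0 = 25, s = 4))
    unname(coef(fit)["a"] + coef(fit)["b"])  # plateau: the specific advancing angle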
Modeling Longitudinal Data Containing Non-Normal Within Subject Errors
NASA Technical Reports Server (NTRS)
Feiveson, Alan; Glenn, Nancy L.
2013-01-01
The mission of the National Aeronautics and Space Administration's (NASA) human research program is to advance safe human spaceflight. This involves conducting experiments, collecting data, and analyzing data. The data are longitudinal and result from a relatively small number of subjects, typically 10-20. A longitudinal study refers to an investigation where participant outcomes and possibly treatments are collected at multiple follow-up times. Standard statistical designs such as mean regression with random effects and mixed-effects regression are inadequate for such data because the population is typically not approximately normally distributed. Hence, more advanced data analysis methods are necessary. This research focuses on four such methods for longitudinal data analysis: the recently proposed linear quantile mixed models (lqmm) by Geraci and Bottai (2013), quantile regression, multilevel mixed-effects linear regression, and robust regression. This research also provides computational algorithms for longitudinal data that scientists can directly use for human spaceflight and other longitudinal data applications, then presents statistical evidence that verifies which method is best for specific situations. This advances the study of longitudinal data in a broad range of applications, including the science, technology, engineering and mathematics fields.
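A minimal sketch of the first of the four methods, a linear quantile mixed model fitted with the lqmm package named above, on hypothetical longitudinal data (10 subjects, 6 visits; all variable names are illustrative):

    library(lqmm)
    set.seed(3)
    d <- data.frame(id = rep(1:10, each = 6), time = rep(0:5, times = 10))
    d$y <- 5 + 0.8 * d$time + rep(rnorm(10), each = 6) + rexp(60)  # skewed errors
    fit <- lqmm(fixed = y ~ time, random = ~ 1, group = id,
                tau = 0.5, data = d)  # median regression with subject-level effects
    summary(fit)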
Quantile regression for the statistical analysis of immunological data with many non-detects.
Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth
2012-07-07
Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an application to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
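A hedged R illustration of the approach: non-detects are set to the detection limit and a percentile above the non-detect fraction is modelled. The quantreg package is an assumption (the abstract names only the method), and the data are simulated:

    library(quantreg)
    set.seed(2)
    conc  <- exp(rnorm(200, 1, 1))       # simulated concentrations
    group <- gl(2, 100)
    lod   <- 3
    conc[conc < lod] <- lod              # over half become non-detects
    fit <- rq(conc ~ group, tau = 0.75)  # the 75th percentile is still identified
    summary(fit, se = "boot")            # group comparison at the 75th percentile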
Intermediate/Advanced Research Design and Statistics
NASA Technical Reports Server (NTRS)
Ploutz-Snyder, Robert
2009-01-01
The purpose of this module is to provide institutional researchers (IRs) with an understanding of the principles of advanced research design and of the intermediate/advanced statistical procedures consistent with such designs.
Specialized data analysis of SSME and advanced propulsion system vibration measurements
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi
1993-01-01
The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.
Yuan, Meiqin; Wang, Zeng; Hu, Guinv; Yang, Yunshan; Lv, Wangxia; Lu, Fangxiao; Zhong, Haijun
2016-01-01
Peritoneal metastasis (PM) is a poor prognostic factor in patients with gastric cancer. The aim of this study was to evaluate the efficacy and safety of hyperthermic intraperitoneal chemotherapy (HIPEC) in patients with advanced gastric cancer with PM by retrospective analysis. A total of 54 gastric cancer patients with positive ascitic fluid cytology were included in this study: 23 patients were treated with systemic chemotherapy combined with HIPEC (HIPEC+ group) and 31 received systemic chemotherapy alone (HIPEC- group). The patients were divided into 4 categories according to the changes of ascites, namely disappear, decrease, stable and increase. The disappear + decrease rate in the HIPEC+ group was 82.60%, which was statistically significantly superior to that of the HIPEC- group (54.80%). The disappear + decrease + stable rate was 95.70% in the HIPEC+ group and 74.20% in the HIPEC- group, but the difference was not statistically significant. In 33 patients with complete survival data, including 12 from the HIPEC+ and 21 from the HIPEC- group, the median progression-free survival was 164 and 129 days, respectively, and the median overall survival (OS) was 494 and 223 days, respectively. In patients with ascites disappear/decrease/stable, the OS appeared to be better compared with that in patients with ascites increase, but the difference was not statistically significant. Further analysis revealed that patients with controlled disease (complete response + partial response + stable disease) may have a better OS compared with patients with progressive disease, with a statistically significant difference. The toxicities were well tolerated in both groups. Therefore, HIPEC was found to improve survival in advanced gastric cancer patients with PM, but the difference was not statistically significant, which may be attributed to the small number of cases. Further studies with larger samples are required to confirm our data. PMID:27446587
DOT National Transportation Integrated Search
2010-12-01
Recent research suggests that traditional safety evaluation methods may be inadequate in accurately determining the effectiveness of roadway safety measures. In recent years, advanced statistical methods are being utilized in traffic safety studies t...
Advanced LIGO low-latency searches
NASA Astrophysics Data System (ADS)
Kanner, Jonah; LIGO Scientific Collaboration, Virgo Collaboration
2016-06-01
Advanced LIGO recently made the first detection of gravitational waves from merging binary black holes. The signal was first identified by a low-latency analysis, which identifies gravitational-wave transients within a few minutes of data collection. More generally, Advanced LIGO transients are sought with a suite of automated tools, which collectively identify events, evaluate statistical significance, estimate source position, and attempt to characterize source properties. This low-latency effort is enabling a broad multi-messenger approach to the science of compact object mergers and other transients. This talk will give an overview of the low-latency methodology and recent results.
Zhu, Yun; Fan, Ruzong; Xiong, Momiao
2017-01-01
Investigating the pleiotropic effects of genetic variants can increase statistical power, provide important information to achieve deep understanding of the complex genetic structures of disease, and offer powerful tools for designing effective treatments with fewer side effects. However, the current multiple phenotype association analysis paradigm lacks breadth (number of phenotypes and genetic variants jointly analyzed at the same time) and depth (hierarchical structure of phenotypes and genotypes). A key issue for high dimensional pleiotropic analysis is to effectively extract informative internal representations and features from high dimensional genotype and phenotype data. To explore correlation information of genetic variants, effectively reduce data dimensions, and overcome critical barriers in advancing the development of novel statistical methods and computational algorithms for genetic pleiotropic analysis, we proposed a new statistical method, referred to as quadratically regularized functional CCA (QRFCCA), for association analysis, which combines three approaches: (1) quadratically regularized matrix factorization, (2) functional data analysis and (3) canonical correlation analysis (CCA). Large-scale simulations show that QRFCCA has a much higher power than the ten competing statistics while retaining appropriate type 1 error rates. To further evaluate performance, QRFCCA and the ten other statistics are applied to the whole-genome sequencing dataset from the TwinsUK study. We identify a total of 79 genes with rare variants and 67 genes with common variants significantly associated with the 46 traits using QRFCCA. The results show that QRFCCA substantially outperforms the ten other statistics. PMID:29040274
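QRFCCA itself adds quadratically regularized matrix factorization and functional smoothing, which are beyond a short sketch, but its CCA core can be illustrated with base R's cancor on hypothetical genotype dosages and phenotypes:

    set.seed(4)
    X  <- matrix(rbinom(100 * 5, 2, 0.3), ncol = 5)  # dosages for 5 variants
    Y  <- matrix(rnorm(100 * 3), ncol = 3)           # 3 phenotypes
    cc <- cancor(X, Y)
    cc$cor   # canonical correlations between genotype and phenotype spaces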
Araújo, Marcelo Marotta; Lauria, Andrezza; Mendes, Marcelo Breno Meneses; Claro, Ana Paula Rosifini Alves; Claro, Cristiane Aparecida de Assis; Moreira, Roger William Fernandes
2015-12-01
The aim of this study was to analyze, through Vickers hardness testing and photoelasticity analysis, pre-bent areas, manually bent areas, and areas without bends of 10-mm advancement pre-bent titanium plates (Leibinger system). The work was divided into three groups: group I (region without bends), group II (region with a 90° manual bend), and group III (region with a 90° pre-fabricated bend). All the materials were evaluated through hardness analysis by the Vickers hardness test, stress analysis by residual images obtained in a polariscope, and photoelastic analysis by reflection during the manual bending. The data obtained from the hardness tests were statistically analyzed using ANOVA and Tukey's tests at a significance level of 5%. The pre-bent plate (group III) showed hardness means statistically significantly higher (P < 0.05) than those of the other groups (I, region without bends; II, 90° manually bent region). Through the study of photoelastic reflection, it was possible to identify that the stress gradually increased, reaching a pink color (1.81 δ/λ), as the bending was performed. A general analysis of the results showed that the pre-bent region of the titanium plates presented the best results.
Guisan, Antoine; Edwards, T.C.; Hastie, T.
2002-01-01
An important statistical development of the last 30 years has been the advance in regression analysis provided by generalized linear models (GLMs) and generalized additive models (GAMs). Here we introduce a series of papers prepared within the framework of an international workshop entitled: Advances in GLMs/GAMs modeling: from species distribution to environmental management, held in Riederalp, Switzerland, 6-11 August 2001. We first discuss some general uses of statistical models in ecology, as well as provide a short review of several key examples of the use of GLMs and GAMs in ecological modeling efforts. We next present an overview of GLMs and GAMs, and discuss some of their related statistics used for predictor selection, model diagnostics, and evaluation. Included is a discussion of several new approaches applicable to GLMs and GAMs, such as ridge regression, an alternative to stepwise selection of predictors, and methods for the identification of interactions by a combined use of regression trees and several other approaches. We close with an overview of the papers and how we feel they advance our understanding of their application to ecological modeling. © 2002 Elsevier Science B.V. All rights reserved.
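A minimal sketch contrasting the two model classes the workshop addressed, for a simulated species presence/absence response; the mgcv package and the predictor names are illustrative assumptions:

    library(mgcv)
    set.seed(5)
    d <- data.frame(elev = runif(300, 0, 2500), temp = rnorm(300, 10, 4))
    d$pres <- rbinom(300, 1, plogis(-4 + 0.002 * d$elev + 0.15 * d$temp))
    glm_fit <- glm(pres ~ elev + temp, family = binomial, data = d)        # GLM: linear terms
    gam_fit <- gam(pres ~ s(elev) + s(temp), family = binomial, data = d)  # GAM: smooth terms
    AIC(glm_fit, gam_fit)   # one simple way to compare the two fits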
Response to Comments on "Ducklings imprint on the relational concept of 'same or different'".
Martinho, Antone; Kacelnik, Alex
2017-02-24
Two Comments by Hupé and by Langbein and Puppe address our choice of statistical analysis in assigning preference between sets of stimuli to individual ducklings in our paper. We believe that our analysis remains the most appropriate approach for our data and experimental design. Copyright © 2017, American Association for the Advancement of Science.
Fiteni, Frédéric; Anota, Amélie; Westeel, Virginie; Bonnetain, Franck
2016-02-18
Health-related quality of life (HRQoL) is recognized as a component endpoint for cancer therapy approvals. The aim of this review was to evaluate the methodology of HRQoL analysis and reporting in phase III clinical trials of first-line chemotherapy in advanced non-small cell lung cancer (NSCLC). A search in MEDLINE databases identified phase III clinical trials of first-line chemotherapy for advanced NSCLC published from January 2008 to December 2014. Two authors independently extracted information using predefined data abstraction forms. A total of 55 phase III advanced NSCLC trials were identified. HRQoL was declared as an endpoint in 27 studies (49%). Among these 27 studies, the EORTC Quality of Life Questionnaire C30 was used in 13 (48%) of the studies and the Functional Assessment of Cancer Therapy-General was used in 12 (44%) trials. The targeted dimensions of HRQoL, the minimal clinically important difference and the statistical approaches for dealing with missing data were clearly specified in 13 (48.1%), 9 (33.3%) and 5 (18.5%) studies, respectively. The most frequent statistical methods for HRQoL analysis were: the mean change from baseline (33.3%), the linear mixed model for repeated measures (22.2%) and time to HRQoL score deterioration (18.5%). For each targeted dimension, the results for each group, the estimated effect size and its precision were clearly reported in 4 studies (14.8%), not clearly reported in 11 studies (40.7%) and not reported at all in 12 studies (44.4%). This review demonstrated the weakness and heterogeneity of the measurement, analysis, and reporting of HRQoL in phase III advanced NSCLC trials. Precise and uniform recommendations are needed to compare HRQoL results across publications and to provide understandable messages for patients and clinicians.
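One of the methods the review found most frequent, a linear mixed model for repeated HRQoL measures, might look like the following R sketch (the lme4 package and all variable names are assumptions, not taken from any reviewed trial):

    library(lme4)
    set.seed(8)
    d <- data.frame(id    = rep(1:40, each = 4),
                    visit = rep(0:3, times = 40),
                    arm   = rep(c("A", "B"), each = 80))
    d$hrqol <- 60 - 2 * d$visit + 3 * (d$arm == "B") * d$visit +
               rep(rnorm(40, 0, 5), each = 4) + rnorm(160, 0, 4)
    fit <- lmer(hrqol ~ visit * arm + (1 | id), data = d)  # random intercept per patient
    summary(fit)  # the visit:arm term estimates the between-arm difference in slopes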
A Comparison of Atmospheric Quantities Determined from Advanced WVR and Weather Analysis Data
NASA Astrophysics Data System (ADS)
Morabito, D.; Wu, L.; Slobin, S.
2017-05-01
Lower frequency bands used for deep space communications (e.g., 2.3 GHz and 8.4 GHz) are oversubscribed. Thus, NASA has become interested in using higher frequency bands (e.g., 26 GHz and 32 GHz) for telemetry, making use of the available wider bandwidth. However, these bands are more susceptible to atmospheric degradation. Currently, flight projects tend to be conservative in preparing their communications links by using worst-case or conservative assumptions, which result in nonoptimum data return. We previously explored the use of weather forecasting over different weather condition scenarios to determine more optimal values of atmospheric attenuation and atmospheric noise temperature for use in telecommunications link design. In this article, we present the results of a comparison of meteorological parameters (columnar water vapor and liquid water content) estimated from multifrequency Advanced Water Vapor Radiometer (AWVR) data with those estimated from weather analysis tools (FNL). We find that for the Deep Space Network's Goldstone and Madrid tracking sites, the statistics are in reasonable agreement between the two methods. We can then use the statistics of these quantities based on FNL runs to estimate statistics of atmospheric signal degradation for tracking sites that do not have the benefit of possessing multiyear WVR data sets, such as those of the NASA Near-Earth Network (NEN). The resulting statistics of atmospheric attenuation and atmospheric noise temperature increase can then be used in link budget calculations.
Foulquier, Nathan; Redou, Pascal; Le Gal, Christophe; Rouvière, Bénédicte; Pers, Jacques-Olivier; Saraux, Alain
2018-05-17
Big data analysis has become a common way to extract information from complex and large datasets in most scientific domains. This approach is now used to study large cohorts of patients in medicine. This work is a review of publications that have used artificial intelligence and advanced machine learning techniques to study physiopathogenesis-based treatments in pSS. A systematic literature review retrieved all articles reporting on the use of advanced statistical analysis applied to the study of systemic autoimmune diseases (SADs) over the last decade. An automatic bibliography screening method was developed to perform this task. The program, called BIBOT, was designed to fetch and analyze articles from the PubMed database using a list of keywords and natural language processing approaches. The evolution of trends in statistical approaches, sizes of cohorts and number of publications over this period was also computed in the process. In all, 44077 abstracts were screened and 1017 publications were analyzed. The mean number of selected articles was 101.0 (S.D. 19.16) per year, but it increased significantly over time (from 74 articles in 2008 to 138 in 2017). Among them only 12 focused on pSS, but none of them emphasized the aspect of pathogenesis-based treatments. To conclude, medicine is progressively entering the era of big data analysis and artificial intelligence, but these approaches are not yet used to describe pSS-specific pathogenesis-based treatment. Nevertheless, large multicentre studies are investigating this aspect with advanced algorithmic tools on large cohorts of SAD patients.
Estimating population diversity with CatchAll
USDA-ARS?s Scientific Manuscript database
The massive quantity of data produced by next-generation sequencing has created a pressing need for advanced statistical tools, in particular for analysis of bacterial and phage communities. Here we address estimating the total diversity in a population – the species richness. This is an important s...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zauderer, B.; Fleming, E.S.
1991-08-30
This work pertains to the final report on the demonstration of an advanced cyclone coal combustor. Titles include: "Chronological Description of the Clean Coal Project Tests," "Statistical Analysis of Operating Data for the Coal Tech Combustor," "Photographic History of the Project," "Results of Slag Analysis by PA DER Module 1 Procedure," "Properties of the Coals and Limestone Used in the Test Effort," and "Results of the Solid Waste Sampling Performed on the Coal Tech Combustor by an Independent Contractor During the February 1990 Tests."
Rodrigues-Pinto, E; Pereira, P; Coelho, R; Andrade, P; Ribeiro, A; Lopes, S; Moutinho-Ribeiro, P; Macedo, G
2017-02-01
Self-expanding metal stents (SEMS) are the treatment of choice for advanced esophageal cancers. Literature is scarce on risk factors predicting adverse events after SEMS placement. Assess risk factors for adverse events after SEMS placement in advanced esophageal cancer and evaluate survival after SEMS placement. Cross-sectional study of patients with advanced esophageal cancer referred for SEMS placement during a period of 3 years. Ninety-seven patients with advanced esophageal cancer received SEMS. Adverse events were more common when tumors were located at the level of the distal esophagus/cardia (47% vs 23%, P = 0.011, OR 3.1), with statistical significance maintained in the multivariate analysis (OR 3.1, P = 0.018). Time until adverse events was lower for tumors located at the level of the distal esophagus/cardia (P = 0.036). Survival was higher in patients who received SEMS with curative intent (327 days [126-528] vs. 119 days [91-147], P = 0.002) and in patients who subsequently underwent surgery compared with those who received only chemo/radiotherapy or no further treatment (563 days [378-748] vs. 154 days [133-175] vs. 46 days [20-72], P < 0.001). Subsequent treatment retained statistical significance in the multivariate analysis (HR 3.4, P < 0.001). SEMS allow palliation of dysphagia in advanced esophageal cancer and are associated with increased out-of-hospital survival, as long as there are conditions for further treatments. Tumors located at the level of the distal esophagus/cardia are associated with a greater number of adverse events, which also occur earlier. © 2016 International Society for Diseases of the Esophagus.
Advanced Artificial Intelligence Technology Testbed
NASA Technical Reports Server (NTRS)
Anken, Craig S.
1993-01-01
The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.
Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs
2011-01-01
This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates reuse in other software packages.
Willis, Brian H; Riley, Richard D
2017-09-20
An important question for clinicians appraising a meta-analysis is: are the findings likely to be valid in their own practice, that is, does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity, where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple ('leave-one-out') cross-validation technique, we demonstrate how we may test meta-analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta-analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random-effects meta-analysis and a tailored meta-regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within-study variance, between-study variance, study sample size, and the number of studies in the meta-analysis. Finally, we apply Vn to two published meta-analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta-analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
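The exact form of Vn and its distribution are derived in the paper; the following R function conveys only the leave-one-out idea behind it, checking how well the pooled estimate from the remaining studies predicts each held-out study under a random-effects model (the metafor package is an assumed tool here):

    library(metafor)
    loo_check <- function(yi, vi) {
      k <- length(yi)
      z <- numeric(k)
      for (i in seq_len(k)) {
        fit  <- rma(yi[-i], vi[-i], method = "REML")  # refit without study i
        pred <- predict(fit)
        # standardise the held-out estimate against its predictive variance
        z[i] <- (yi[i] - pred$pred) / sqrt(vi[i] + fit$tau2 + pred$se^2)
      }
      z  # roughly standard normal when the summary estimate transfers to new studies
    }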
NASA Marshall Space Flight Center Controls Systems Design and Analysis Branch
NASA Technical Reports Server (NTRS)
Gilligan, Eric
2014-01-01
Marshall Space Flight Center maintains a critical national capability in the analysis of launch vehicle flight dynamics and flight certification of GN&C algorithms. MSFC analysts are domain experts in the areas of flexible-body dynamics and control-structure interaction, thrust vector control, sloshing propellant dynamics, and advanced statistical methods. Marshall's modeling and simulation expertise has supported manned spaceflight for over 50 years. Marshall's unparalleled capability in launch vehicle guidance, navigation, and control technology stems from its rich heritage in developing, integrating, and testing launch vehicle GN&C systems dating to the early Mercury-Redstone and Saturn vehicles. The Marshall team is continuously developing novel methods for design, including advanced techniques for large-scale optimization and analysis.
Thinking big: linking rivers to landscapes
Joan O’Callaghan; Ashley E. Steel; Kelly M. Burnett
2012-01-01
Exploring relationships between landscape characteristics and rivers is an emerging field, enabled by the proliferation of satellite data, advances in statistical analysis, and increased emphasis on large-scale monitoring. Landscape features such as road networks, underlying geology, and human development determine the characteristics of the rivers flowing through...
Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.
ERIC Educational Resources Information Center
Dunlap, Dale
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…
ADP of multispectral scanner data for land use mapping
NASA Technical Reports Server (NTRS)
Hoffer, R. M.
1971-01-01
The advantages and disadvantages of various remote sensing instrumentation and analysis techniques are reviewed. The use of multispectral scanner data and the automatic data processing techniques are considered. A computer-aided analysis system for remote sensor data is described with emphasis on the image display, statistics processor, wavelength band selection, classification processor, and results display. Advanced techniques in using spectral and temporal data are also considered.
Performance analysis of Integrated Communication and Control System networks
NASA Technical Reports Server (NTRS)
Halevi, Y.; Ray, A.
1990-01-01
This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.
NASA Astrophysics Data System (ADS)
Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan
2015-09-01
The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars, by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We present a statistical analysis conducted to evaluate the confidence level on these detections and the completeness of our candidate search.
Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge
ERIC Educational Resources Information Center
Haines, Brenna
2015-01-01
The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…
Otwombe, Kennedy N.; Petzold, Max; Martinson, Neil; Chirwa, Tobias
2014-01-01
Background Predictors of all-cause mortality in HIV-infected people have been widely reported in the literature. Making an informed decision requires understanding the methods used. Objectives We present a review of study designs, statistical methods and their appropriateness in original articles reporting on predictors of all-cause mortality in HIV-infected people between January 2002 and December 2011. Statistical methods were compared between 2002–2006 and 2007–2011. Time-to-event analysis techniques were considered appropriate. Data Sources PubMed/MEDLINE. Study Eligibility Criteria Original English-language articles were abstracted. Letters to the editor, editorials, reviews, systematic reviews, meta-analyses, case reports and any other ineligible articles were excluded. Results A total of 189 studies were identified (n = 91 in 2002–2006 and n = 98 in 2007–2011), of which 130 (69%) were prospective and 56 (30%) were retrospective. One hundred and eighty-two (96%) studies described their sample using descriptive statistics while 32 (17%) made comparisons using t-tests. Kaplan-Meier methods for time-to-event analysis were more commonly used in the earlier period (n = 69, 76% vs. n = 53, 54%, p = 0.002). Predictors of mortality in the two periods were commonly determined using Cox regression analysis (n = 67, 75% vs. n = 63, 64%, p = 0.12). Only 7 (4%) used advanced survival analysis methods of Cox regression with frailty, of which 6 (3%) were in the later period. Thirty-two (17%) used logistic regression while 8 (4%) used other methods. There were significantly more articles from the first period using appropriate methods compared with the second (n = 80, 88% vs. n = 69, 70%, p-value = 0.003). Conclusion Descriptive statistics and survival analysis techniques remain the most common methods of analysis in publications on predictors of all-cause mortality in HIV-infected cohorts, while prospective research designs are favoured. Sophisticated techniques of time-dependent Cox regression and Cox regression with frailty are scarce. This motivates more training in the use of advanced time-to-event methods. PMID:24498313
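The 'advanced' technique the review finds scarce, Cox regression with a frailty term, can be sketched with R's survival package on hypothetical cohort data (all variables are illustrative):

    library(survival)
    set.seed(9)
    d <- data.frame(time   = rexp(200, 0.1),       # follow-up time
                    status = rbinom(200, 1, 0.7),  # 1 = died, 0 = censored
                    cd4    = rnorm(200, 350, 100), # baseline CD4 count
                    clinic = gl(10, 20))           # clustering by clinic
    fit <- coxph(Surv(time, status) ~ cd4 + frailty(clinic), data = d)
    summary(fit)  # hazard ratio for CD4 with a clinic-level random effect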
Wang, Xiao-Yun; Zhao, Yu-Liang
2010-12-21
To observe the clinical efficacy and adverse effects of taxol plus carboplatin (TP) or gemcitabine plus carboplatin (GP) in patients with advanced non-small-cell lung carcinoma. A total of 86 patients with advanced non-small-cell lung carcinoma with a histologically confirmed diagnosis at our department were treated with at least two cycles of drug therapy according to the WHO standard, with 43 cases in the TP group and 43 in the GP group. TP group: taxol 150 mg/m(2) on d1 plus carboplatin 300 mg/m(2) on d1; GP group: gemcitabine 1000 mg/m(2) over 30 min on d1 and d8 plus carboplatin 300 mg/m(2) on d1; each cycle lasted 3 weeks. Efficacy and side effects were analyzed after two cycles of chemotherapy. Comparing the TP and GP groups, the effective rate was 44.2% vs 39.5%; disease control rate (CR + PR + SD), 81.4% vs 74.4%; median time to progression (TTP), 4.6 vs 4.5 months; median survival, 8.6 vs 8.8 months; 1-year survival rate, 17.2% vs 18.1%; and 2-year survival rate, 8% vs 10%. Statistical analysis showed no significant difference between the two groups. The main cytotoxicities of the GP and TP groups were predominantly thrombocytopenia and leucopenia, respectively, with no significant statistical difference between the groups. The incidences of allergic reactions, alopecia and peripheral neurotoxicity were higher in the TP group, a statistically significant difference. Tolerance was excellent in both groups. Both regimens show excellent therapeutic effect and tolerance for advanced non-small-cell lung carcinoma, with no statistical difference in efficacy or survival between the two groups.
A Fishy Problem for Advanced Students
ERIC Educational Resources Information Center
Patterson, Richard A.
1977-01-01
While developing a research course for gifted high school students, improvements were made in a local pond. Students worked for a semester learning research techniques, statistical analysis, and limnology. At the end of the course, the three students produced a joint scientific paper detailing their study of the pond. (MA)
Joint QTL linkage mapping for multiple-cross mating design sharing one common parent
USDA-ARS?s Scientific Manuscript database
Nested association mapping (NAM) is a novel genetic mating design that combines the advantages of linkage analysis and association mapping. This design provides opportunities to study the inheritance of complex traits, but also requires more advanced statistical methods. In this paper, we present th...
USDA-ARS?s Scientific Manuscript database
Recent advances in technology have led to the collection of high-dimensional data not previously encountered in many scientific environments. As a result, scientists are often faced with the challenging task of including these high-dimensional data into statistical models. For example, data from sen...
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis are identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
Recent advances in quantitative high throughput and high content data analysis.
Moutsatsos, Ioannis K; Parker, Christian N
2016-01-01
High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry applications. Screening multiple genomic reagents targeting any given gene creates additional challenges, and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results, will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
The changing landscape of astrostatistics and astroinformatics
NASA Astrophysics Data System (ADS)
Feigelson, Eric D.
2017-06-01
The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.
Mediation analysis in nursing research: a methodological review.
Liu, Jianghong; Ulrich, Connie
2016-12-01
Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
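For readers new to the technique, the core of a single-mediator analysis fits two regressions and multiplies the relevant paths. A minimal sketch in base R follows, with hypothetical data and effect sizes; a real analysis would typically use dedicated tools such as the mediation package.

```r
# Minimal single-mediator sketch in base R (hypothetical data).
# Indirect effect = a*b, where a: X -> M, and b: M -> Y adjusting for X.
set.seed(1)
n <- 200
x <- rnorm(n)                      # predictor (e.g., an intervention score)
m <- 0.5 * x + rnorm(n)            # mediator
y <- 0.4 * m + 0.2 * x + rnorm(n)  # outcome

a <- coef(lm(m ~ x))["x"]
b <- coef(lm(y ~ m + x))["m"]
indirect <- unname(a * b)

# Percentile bootstrap confidence interval for the indirect effect
boot_ab <- replicate(2000, {
  i <- sample(n, replace = TRUE)
  coef(lm(m[i] ~ x[i]))[2] * coef(lm(y[i] ~ m[i] + x[i]))[2]
})
c(indirect = indirect, quantile(boot_ab, c(0.025, 0.975)))
```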
Interpretation of statistical results.
García Garmendia, J L; Maroto Monserrat, F
2018-02-21
The appropriate interpretation of statistical results is crucial to understanding advances in medical science. Statistical tools allow us to transform the uncertainty and apparent chaos of nature into measurable parameters that are applicable to our clinical practice. Understanding the meaning and actual scope of these instruments is essential for researchers, for the funders of research, and for professionals who require continuous updating based on good evidence and support for decision making. Various aspects of study designs, results and statistical analysis are reviewed, attempting to facilitate their comprehension from the basics to what is most common but not well understood, and offering a constructive, non-exhaustive but realistic view. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.
LV software support for supersonic flow analysis
NASA Technical Reports Server (NTRS)
Bell, W. A.; Lepicovsky, J.
1992-01-01
The software for configuring an LV counter processor system has been developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system has been developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.
Riley, Richard D.
2017-01-01
An important question for clinicians appraising a meta‐analysis is: are the findings likely to be valid in their own practice—does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity—where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple (‘leave‐one‐out’) cross‐validation technique, we demonstrate how we may test meta‐analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta‐analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta‐analysis and a tailored meta‐regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within‐study variance, between‐study variance, study sample size, and the number of studies in the meta‐analysis. Finally, we apply Vn to two published meta‐analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta‐analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28620945
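To illustrate the leave-one-out idea (this is a generic common-effect sketch with hypothetical study estimates, not the authors' exact Vn statistic or its distribution), each study is compared with the pooled estimate computed from the remaining studies:

```r
# Generic leave-one-out check for a meta-analysis: hypothetical effect
# estimates yi and within-study variances vi; common-effect pooling.
yi <- c(0.30, 0.12, 0.25, 0.41, 0.18)   # study effect estimates
vi <- c(0.02, 0.05, 0.03, 0.04, 0.02)   # within-study variances

z_loo <- sapply(seq_along(yi), function(i) {
  w <- 1 / vi[-i]
  pooled <- sum(w * yi[-i]) / sum(w)    # pooled estimate without study i
  se_pred <- sqrt(vi[i] + 1 / sum(w))   # predictive SE for study i
  (yi[i] - pooled) / se_pred            # standardized discrepancy
})
z_loo  # values far outside +/-2 flag studies the pooled estimate would not predict
```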
Oostenveld, Robert; Fries, Pascal; Maris, Eric; Schoffelen, Jan-Mathijs
2011-01-01
This paper describes FieldTrip, an open source software package that we developed for the analysis of MEG, EEG, and other electrophysiological data. The software is implemented as a MATLAB toolbox and includes a complete set of consistent and user-friendly high-level functions that allow experimental neuroscientists to analyze experimental data. It includes algorithms for simple and advanced analysis, such as time-frequency analysis using multitapers, source reconstruction using dipoles, distributed sources and beamformers, connectivity analysis, and nonparametric statistical permutation tests at the channel and source level. The implementation as a toolbox allows the user to perform elaborate and structured analyses of large data sets using the MATLAB command line and batch scripting. Furthermore, users and developers can easily extend the functionality and implement new algorithms. The modular design facilitates the reuse in other software packages. PMID:21253357
Empuku, Shinichiro; Nakajima, Kentaro; Akagi, Tomonori; Kaneko, Kunihiko; Hijiya, Naoki; Etoh, Tsuyoshi; Shiraishi, Norio; Moriyama, Masatsugu; Inomata, Masafumi
2016-05-01
Preoperative chemoradiotherapy (CRT) for locally advanced rectal cancer not only improves the postoperative local control rate, but also induces downstaging. However, it has not been established how to individually select the patients for whom preoperative CRT will be effective. The aim of this study was to identify a predictor of response to preoperative CRT for locally advanced rectal cancer. This study supplements our multicenter phase II study evaluating the safety and efficacy of preoperative CRT using oral fluorouracil (UMIN ID: 03396). From April, 2009 to August, 2011, 26 biopsy specimens obtained prior to CRT were analyzed by cyclopedic microarray analysis. Response to CRT was evaluated according to a histological grading system using surgically resected specimens. To decide on the number of genes for dividing patients into responder and non-responder groups, we statistically analyzed the data using a dimension reduction method, principal component analysis. Of the 26 cases, 11 were responders and 15 non-responders. No significant difference was found in clinical background data between the two groups. We determined that the optimal number of genes for the prediction of response was 80 of 40,000, and the functions of these genes were analyzed. When comparing non-responders with responders, genes expressed at a high level functioned in alternative splicing, whereas those expressed at a low level functioned in the septin complex. Thus, an 80-gene expression set that predicts response to preoperative CRT for locally advanced rectal cancer was identified using a novel statistical method.
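As a hedged sketch of the dimension-reduction step in R, with a hypothetical expression matrix standing in for the microarray data and sample sizes matching the abstract:

```r
# Hypothetical sketch: PCA on a samples-by-genes expression matrix to see
# whether responders and non-responders separate in component space.
set.seed(1)
expr <- matrix(rnorm(26 * 80), nrow = 26)   # 26 samples x 80 selected genes
response <- rep(c("responder", "non-responder"), c(11, 15))

pca <- prcomp(expr, scale. = TRUE)          # principal component analysis
summary(pca)$importance[, 1:3]              # variance explained by PC1-PC3
plot(pca$x[, 1], pca$x[, 2],
     col = ifelse(response == "responder", "blue", "red"),
     xlab = "PC1", ylab = "PC2")
```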
Cui, Jing; Fang, Fang; Shen, Fengping; Song, Lijuan; Zhou, Lingjun; Ma, Xiuqiang; Zhao, Jijun
2014-11-01
Quality of life (QOL) is the main outcome measure for patients with advanced cancer at the end of life. The McGill Quality of Life Questionnaire (MQOL) is designed specifically for palliative care patients and has been translated and validated in Hong Kong and Taiwan. This study aimed to investigate the QOL of patients with advanced cancer using the MQOL-Taiwan version after cultural adaptation to the Chinese mainland. A cross-sectional survey design was used. QOL data from patients with advanced cancer were gathered from 13 hospitals, including five tertiary hospitals, six secondary hospitals, and community health care service centers in Shanghai, and analyzed. QOL was assessed using the MQOL-Chinese version. Statistical analyses were performed using descriptive statistics, multiple regression analysis, and Spearman rank correlation analysis. A total of 531 cancer patients (297 male and 234 female) in 13 hospitals were recruited into the study and administered the MQOL-Chinese. The score of the support subscale was the highest (6.82), and the score of the existential well-being subscale was the lowest (4.65). The five physical symptoms most frequently listed on the MQOL-Chinese were pain, loss of appetite, fatigue, powerlessness, and dyspnea. Participants' sex, educational level, number of children, disclosure of the disease, and hospital size were associated with their overall QOL. The Spearman rank correlation analysis found that Karnofsky Performance Status scores correlated with the MQOL-Chinese single-item score, physical well-being, psychological well-being, existential well-being, and support domains (P < 0.05). Our results revealed the aspects of QOL that need more attention for Chinese palliative care patients with advanced cancer. The association between the characteristics of patients, Karnofsky Performance Status, and their QOL was also identified. Copyright © 2014 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
Howard Stauffer; Nadav Nur
2005-01-01
The papers included in the Advances in Statistics section of the Partners in Flight (PIF) 2002 Proceedings represent a small sample of statistical topics of current importance to Partners In Flight research scientists: hierarchical modeling, estimation of detection probabilities, and Bayesian applications. Sauer et al. (this volume) examines a hierarchical model...
Statistical Methodologies to Integrate Experimental and Computational Research
NASA Technical Reports Server (NTRS)
Parker, P. A.; Johnson, R. T.; Montgomery, D. C.
2008-01-01
Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.
Anima: Modular Workflow System for Comprehensive Image Data Analysis
Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa
2014-01-01
Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps starting from data import and pre-processing to segmentation and statistical analysis; and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is a fully open source and available with documentation at www.anduril.org/anima. PMID:25126541
Conducting Simulation Studies in the R Programming Environment.
Hallgren, Kevin A
2013-10-12
Simulation studies allow researchers to answer specific questions about data analysis, statistical power, and best-practices for obtaining accurate results in empirical research. Despite the benefits that simulation research can provide, many researchers are unfamiliar with available tools for conducting their own simulation studies. The use of simulation studies need not be restricted to researchers with advanced skills in statistics and computer programming, and such methods can be implemented by researchers with a variety of abilities and interests. The present paper provides an introduction to methods used for running simulation studies using the R statistical programming environment and is written for individuals with minimal experience running simulation studies or using R. The paper describes the rationale and benefits of using simulations and introduces R functions relevant for many simulation studies. Three examples illustrate different applications for simulation studies, including (a) the use of simulations to answer a novel question about statistical analysis, (b) the use of simulations to estimate statistical power, and (c) the use of simulations to obtain confidence intervals of parameter estimates through bootstrapping. Results and fully annotated syntax from these examples are provided.
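For example, the power-estimation use case can be written in a few lines of base R (the effect size and sample size here are illustrative choices, not taken from the paper):

```r
# Power of a two-sample t-test by simulation: effect size d = 0.5,
# n = 50 per group, alpha = 0.05.
set.seed(42)
power_sim <- mean(replicate(5000, {
  g1 <- rnorm(50, mean = 0)
  g2 <- rnorm(50, mean = 0.5)
  t.test(g1, g2)$p.value < 0.05
}))
power_sim   # ~0.70; compare the analytic power.t.test(n = 50, delta = 0.5)
```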
Recent advances in statistical energy analysis
NASA Technical Reports Server (NTRS)
Heron, K. H.
1992-01-01
Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a consequence of the modal formulation than a necessary element of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.
It has been fifty years since Kirkham and Bartholmew (1954) presented the conceptual framework and derived the mathematical equations that formed the basis of the now commonly employed method of 15N isotope dilution. Although many advances in methodology and analysis have been ma...
Power Analysis for Complex Mediational Designs Using Monte Carlo Methods
ERIC Educational Resources Information Center
Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.
2010-01-01
Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been addressed in the literature. We describe a general framework for power analyses for complex…
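A minimal Monte Carlo sketch in R of mediation power via the joint-significance test (the path coefficients and sample size are assumed values; the paper's framework extends to latent variable and growth curve models):

```r
# Monte Carlo power to detect mediation: proportion of simulated data sets
# in which both path a (X -> M) and path b (M -> Y | X) are significant.
set.seed(7)
power_med <- mean(replicate(2000, {
  n <- 100
  x <- rnorm(n)
  m <- 0.3 * x + rnorm(n)            # assumed a-path of 0.3
  y <- 0.3 * m + rnorm(n)            # assumed b-path of 0.3
  p_a <- summary(lm(m ~ x))$coefficients["x", 4]
  p_b <- summary(lm(y ~ m + x))$coefficients["m", 4]
  p_a < 0.05 && p_b < 0.05
}))
power_med
```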
Guidelines for collecting and maintaining archives for genetic monitoring
Jennifer A. Jackson; Linda Laikre; C. Scott Baker; Katherine C. Kendall; F. W. Allendorf; M. K. Schwartz
2011-01-01
Rapid advances in molecular genetic techniques and the statistical analysis of genetic data have revolutionized the way that populations of animals, plants and microorganisms can be monitored. Genetic monitoring is the practice of using molecular genetic markers to track changes in the abundance, diversity or distribution of populations, species or ecosystems over time...
Effect of sexual steroids on boar kinematic sperm subpopulations.
Ayala, E M E; Aragón, M A
2017-11-01
Here, we show the effects of the sexual steroids progesterone, testosterone and estradiol on motility parameters of boar sperm. Sixteen commercial seminal doses, four from each of four adult boars, were analyzed using computer-assisted sperm analysis (CASA). Mean values of motility parameters were analyzed by bivariate and multivariate statistics. Principal component analysis (PCA), followed by hierarchical clustering, was applied to the motility parameter data provided automatically as intervals by the CASA system. Effects of sexual steroids were described in the kinematic subpopulations identified by the multivariate statistics. Mean values of motility parameters were not significantly changed after addition of sexual steroids. Multivariate graphics showed that sperm subpopulations were not sensitive to the addition of either testosterone or estradiol, but sperm subpopulations responsive to progesterone were found. Distributions of motility parameters were wide in controls but sharpened at distinct concentrations of progesterone. We conclude that kinematic sperm subpopulations responsive to progesterone are present in boar semen, and that these subpopulations are masked in evaluations of mean values of motility parameters. © 2017 International Society for Advancement of Cytometry.
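A hedged sketch of this subpopulation workflow in R, using synthetic data; VCL, VSL, ALH and BCF are typical CASA kinematic variables standing in for the study's measured parameters:

```r
# Subpopulation workflow: PCA on CASA motility parameters, then hierarchical
# clustering on the component scores (synthetic data for illustration).
set.seed(1)
casa <- data.frame(VCL = rnorm(300, 90, 20), VSL = rnorm(300, 45, 10),
                   ALH = rnorm(300, 3, 1),  BCF = rnorm(300, 25, 5))
scores <- prcomp(scale(casa))$x[, 1:2]       # first two principal components
hc <- hclust(dist(scores), method = "ward.D2")
casa$subpop <- cutree(hc, k = 3)             # assign sperm to 3 subpopulations
aggregate(. ~ subpop, data = casa, FUN = mean)  # kinematic profile per subpopulation
```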
Busse, Harald; Schmitgen, Arno; Trantakis, Christos; Schober, Ralf; Kahn, Thomas; Moche, Michael
2006-07-01
To present an advanced approach for intraoperative image guidance in an open 0.5 T MRI and to evaluate its effectiveness for neurosurgical interventions by comparison with a dynamic scan-guided localization technique. The built-in scan guidance mode relied on successive interactive MRI scans. The additional advanced mode provided real-time navigation based on reformatted high-quality, intraoperatively acquired MR reference data, allowed multimodal image fusion, and used the successive scans of the built-in mode for quick verification of the position only. Analysis involved tumor resections and biopsies in either scan guidance (N = 36) or advanced mode (N = 59) by the same three neurosurgeons. Technical, surgical, and workflow aspects were compared. The image quality and hand-eye coordination of the advanced approach were improved. While the average extent of resection, neurologic outcome after functional MRI (fMRI) integration, and diagnostic yield appeared to be slightly better under advanced guidance, particularly for the main surgeon, statistical analysis revealed no significant differences. Resection times were comparable, while biopsies took around 30 minutes longer. The presented approach is safe and provides more detailed images and higher navigation speed at the expense of actuality. The surgical outcome achieved with advanced guidance is (at least) as good as that obtained with dynamic scan guidance. (c) 2006 Wiley-Liss, Inc.
Chamberlain, R S; Quinones, R; Dinndorf, P; Movassaghi, N; Goodstein, M; Newman, K
1995-03-01
A multi-modality approach combining surgery with aggressive chemotherapy and radiation is used to treat advanced neuroblastoma. Despite this treatment, children with advanced disease have a 20% 2-year survival rate. Controversy has developed regarding the efficacy of combining aggressive chemotherapy with repeated surgical intervention aimed at providing a complete surgical resection (CSR) of the primary tumor and metastatic sites. Several prospective and retrospective studies have provided conflicting reports regarding the benefit of this approach on overall survival. We therefore evaluated the efficacy of CSR versus partial surgical resection (PSR) using a strategy combining surgery with aggressive chemotherapy, radiation, and bone marrow transplantation (BMT) for stage IV neuroblastoma. A retrospective review was performed of the medical records of 52 consecutive children with neuroblastoma treated between 1985 and 1993. Twenty-eight of these 52 children presented with advanced disease, 24 of whom had sufficient data to allow for analysis. All children were managed with protocols designed by the Children's Cancer Group (CCG). Statistical analysis was performed using Student's t test, the chi-squared test, and Kaplan-Meier survival curves. Mean survival (35.1 months) and progression-free survival (29.1 months) for the CSR children were statistically superior to those of the PSR children (20.36 and 16.5 months, p = 0.04 and 0.04, respectively). Similar significance was demonstrated using life table analysis of mean and progression-free survival of these two groups (p = 0.05 and < 0.01, respectively). One-, 2-, and 3-year survival rates for the CSR versus the PSR group were 100%, 80%, and 40% versus 77%, 38%, and 15%, respectively. An analysis of the BMT group compared with those children treated with aggressive conventional therapy showed improvement in mean and progression-free survival. Aggressive surgical resection aimed at removing all gross disease is warranted for stage IV neuroblastoma. CSR is associated with prolonged mean and progression-free survival. BMT prolongs mean and progression-free survival in children with stage IV disease. These results suggest that CSR and BMT offer increased potential for long-term remission in children with advanced neuroblastoma.
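For illustration, a Kaplan-Meier and log-rank comparison of two resection groups using the R survival package; the times below are simulated, not the study's records:

```r
# Kaplan-Meier curves and log-rank test for two groups (simulated data).
library(survival)
set.seed(1)
months <- c(rexp(12, 1/35), rexp(12, 1/20))   # survival times in months
status <- rbinom(24, 1, 0.8)                  # 1 = death observed, 0 = censored
group  <- rep(c("CSR", "PSR"), each = 12)

fit <- survfit(Surv(months, status) ~ group)
summary(fit)$table                            # median survival by group
survdiff(Surv(months, status) ~ group)        # log-rank test
plot(fit, lty = 1:2, xlab = "Months", ylab = "Survival probability")
```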
Shitara, Kohei; Matsuo, Keitaro; Oze, Isao; Mizota, Ayako; Kondo, Chihiro; Nomura, Motoo; Yokota, Tomoya; Takahari, Daisuke; Ura, Takashi; Muro, Kei
2011-08-01
We performed a systematic review and meta-analysis to determine the impact of neutropenia or leukopenia experienced during chemotherapy on survival. Eligible studies included prospective or retrospective analyses that evaluated neutropenia or leukopenia as a prognostic factor for overall survival or disease-free survival. Statistical analyses were conducted to calculate a summary hazard ratio and 95% confidence interval (CI) using random-effects or fixed-effects models based on the heterogeneity of the included studies. Thirteen trials were selected for the meta-analysis, with a total of 9,528 patients. The hazard ratio of death was 0.69 (95% CI, 0.64-0.75) for patients with higher-grade neutropenia or leukopenia compared to patients with lower-grade or lack of cytopenia. Our analysis was also stratified by statistical method (any statistical method to decrease lead-time bias; time-varying analysis or landmark analysis), but no differences were observed. Our results indicate that neutropenia or leukopenia experienced during chemotherapy is associated with improved survival in patients with advanced cancer or hematological malignancies undergoing chemotherapy. Future prospective analyses designed to investigate the potential impact of chemotherapy dose adjustment coupled with monitoring of neutropenia or leukopenia on survival are warranted.
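A hedged sketch of such a pooling in R using the metafor package; the log hazard ratios and standard errors below are hypothetical, not the paper's 13 trials:

```r
# Random-effects pooling of hazard ratios with metafor (hypothetical inputs).
library(metafor)
logHR <- log(c(0.72, 0.65, 0.80, 0.60, 0.75))  # per-study log hazard ratios
seHR  <- c(0.10, 0.15, 0.12, 0.20, 0.09)       # their standard errors

res <- rma(yi = logHR, sei = seHR, method = "REML")  # random-effects model
predict(res, transf = exp)                           # summary HR with 95% CI
res$I2                                               # heterogeneity (I-squared)
```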
Public and patient involvement in quantitative health research: A statistical perspective.
Hannigan, Ailish
2018-06-19
The majority of studies included in recent reviews of impact for public and patient involvement (PPI) in health research had a qualitative design. PPI in solely quantitative designs is underexplored, particularly its impact on statistical analysis. Statisticians in practice have a long history of working in both consultative (indirect) and collaborative (direct) roles in health research, yet their perspective on PPI in quantitative health research has never been explicitly examined. To explore the potential and challenges of PPI from a statistical perspective at distinct stages of quantitative research, that is sampling, measurement and statistical analysis, distinguishing between indirect and direct PPI. Statistical analysis is underpinned by having a representative sample, and a collaborative or direct approach to PPI may help achieve that by supporting access to and increasing participation of under-represented groups in the population. Acknowledging and valuing the role of lay knowledge of the context in statistical analysis and in deciding what variables to measure may support collective learning and advance scientific understanding, as evidenced by the use of participatory modelling in other disciplines. A recurring issue for quantitative researchers, which reflects quantitative sampling methods, is the selection and required number of PPI contributors, and this requires further methodological development. Direct approaches to PPI in quantitative health research may potentially increase its impact, but the facilitation and partnership skills required may require further training for all stakeholders, including statisticians. © 2018 The Authors Health Expectations published by John Wiley & Sons Ltd.
Towards Precision Spectroscopy of Baryonic Resonances
NASA Astrophysics Data System (ADS)
Döring, Michael; Mai, Maxim; Rönchen, Deborah
2017-01-01
Recent progress in baryon spectroscopy is reviewed. In a common effort, various groups have analyzed a set of new high-precision polarization observables from ELSA. The Jülich-Bonn group has finalized the analysis of pion-induced meson-baryon production, the photoproduction of pions and eta mesons, and (almost) the KΛ final state. As data become more precise, statistical aspects in the analysis of excited baryons become increasingly relevant, and several advances in this direction are proposed.
Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A
2017-12-01
Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.
Gunn, Andrew J; Sheth, Rahul A; Luber, Brandon; Huynh, Minh-Huy; Rachamreddy, Niranjan R; Kalva, Sanjeeva P
2017-01-01
The purpose of this study was to evaluate the ability of various radiologic response criteria to predict patient outcomes after trans-arterial chemo-embolization with drug-eluting beads (DEB-TACE) in patients with advanced-stage (BCLC C) hepatocellular carcinoma (HCC). Hospital records from 2005 to 2011 were retrospectively reviewed. Non-infiltrative lesions were measured at baseline and on follow-up scans after DEB-TACE according to various common radiologic response criteria, including guidelines of the World Health Organization (WHO), Response Evaluation Criteria in Solid Tumors (RECIST), the European Association for the Study of the Liver (EASL), and modified RECIST (mRECIST). Statistical analysis was performed to see which, if any, of the response criteria could be used as a predictor of overall survival (OS) or time-to-progression (TTP). 75 patients met inclusion criteria. Median OS and TTP were 22.6 months (95 % CI 11.6-24.8) and 9.8 months (95 % CI 7.1-21.6), respectively. Univariate and multivariate Cox analyses revealed that none of the evaluated criteria had the ability to be used as a predictor for OS or TTP. Analysis of the C index in both univariate and multivariate models showed that the evaluated criteria were not accurate predictors of either OS (C-statistic range: 0.51-0.58 in the univariate model; range: 0.54-0.58 in the multivariate model) or TTP (C-statistic range: 0.55-0.59 in the univariate model; range: 0.57-0.61 in the multivariate model). Current response criteria are not accurate predictors of OS or TTP in patients with advanced-stage HCC after DEB-TACE.
Hiza, Elise A; Gottschalk, Michael B; Umpierrez, Erica; Bush, Patricia; Reisman, William M
2015-07-01
The objective of this study was to analyze the effect of an orthopaedic trauma advanced practice provider on length of stay (LOS) and cost at a level I trauma center. The hypothesis was that the addition of a single full-time nurse practitioner (NP) to the orthopaedic trauma team at a level I trauma center would decrease overall LOS and hospital cost. A retrospective chart review was performed of all patients discharged from the orthopaedic surgery service 1 year before (pre-NP) and 1 year after (post-NP) the hiring of a NP. Chart review included age, gender, LOS, discharge destination, intravenous antibiotic use, wound VAC therapy, admission location, and length of time to surgery. Statistical analysis was performed using the Wilcoxon/Kruskal-Wallis test. The hiring of a NP yielded a statistically significant decrease in LOS across the following patient subgroups: patients transferred from the trauma service (13.56 compared with 7.02 days, P < 0.001), patients aged 60 years and older (7.34 compared with 5.04 days, P = 0.037), patients discharged to a rehabilitation facility (10.84 compared with 8.31 days, P = 0.002), and patients discharged on antibiotics/wound VAC therapy (15.16 compared with 11.24 days, P = 0.017). Length of time to surgery also decreased, although not significantly (1.48 compared with 1.31 days, P = 0.37). The addition of a dedicated orthopaedic trauma advanced practice provider at a county level I trauma center resulted in a statistically significant decrease in LOS and thus reduced indirect costs to the hospital. Economic Level IV. See Instructions for Authors for a complete description of levels of evidence.
Hosseinian, Banafsheh; Rubin, Marcie S; Clouston, Sean A P; Almaidhan, Asma; Shetye, Pradip R; Cutting, Court B; Grayson, Barry H
2018-01-01
To compare 3-dimensional nasal symmetry in patients with UCLP who had either rotation advancement alone or nasoalveolar molding (NAM) followed by rotation advancement in conjunction with primary nasal repair. Pilot retrospective cohort study. Nasal casts of 23 patients with UCLP from 2 institutions were analyzed; 12 in the rotation advancement only group (Iowa) and 11 in the NAM, rotation advancement with primary nasal repair group (New York). Casts from patients aged 6 to 18 years were scanned using the 3Shape scanner, and 3-dimensional analysis of nasal symmetry was performed using 3dMD Vultus software, Version 2507, 3dMD, Atlanta, GA. Cleft and noncleft side columellar height, nasal dome height, alar base width, and nasal projection were linearly measured. Inter- and intragroup analyses were performed using t tests and paired t tests as appropriate. A statistically significant difference in mean-scaled 3-dimensional asymmetry index was found between groups, with group 1 having a larger measure of asymmetry (4.69 cm³) than group 2 (2.56 cm³; P = .02). Intergroup analysis performed on the most sensitive linear measure, alar base width, revealed significantly less asymmetry on average in group 2 than in group 1 (P = .013). This study suggests that the NAM followed by rotation advancement in conjunction with primary nasal repair approach may result in less nasal asymmetry compared to rotation advancement alone.
ERIC Educational Resources Information Center
Owens, Susan T.
2017-01-01
Technology is becoming an integral tool in the classroom and can make a positive impact on how the students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistic students comparing Educational Testing Service (ETS) College Board AP Statistic examination scores…
Hutchinson, Marie; East, Leah; Stasa, Helen; Jackson, Debra
2014-01-01
Over recent decades, there has been considerable research and debate about essential features of advanced nursing practice and differences among various categories of advanced practice nurses. This study aimed to derive an integrative description of the defining characteristics of advanced practice nursing through a meta-summary of the existing literature. A three-phase approach involved (a) systematic review of the literature to identify the specific activities characterized as advanced practice nursing, (b) qualitative meta-summary of practice characteristics extracted from manuscripts meeting inclusion criteria; and (c) statistical analysis of domains across advanced practice categories and country in which the study was completed. A descriptive framework was distilled using qualitative and quantitative results. Fifty manuscripts met inclusion criteria and were retained for analysis. Seven domains of advanced nursing practice were identified: (a) autonomous or nurse-led extended clinical practice; (b) improving systems of care; (c) developing the practice of others; (d) developing/delivering educational programs/activities; (e) nursing research/scholarship; (f) leadership external to the organization; and (g) administering programs, budgets, and personnel. Domains were similar across categories of advanced nursing practice; the domain of developing/delivering educational programs/activities was more common in Australia than in the United States or United Kingdom. Similarity at the domain level was sufficient to suggest that advanced practice role categories are less distinct than often argued. There is merit in adopting a more integrated and consistent interpretation of advanced practice nursing.
Guo, Qiaojuan; Ren, Hui; Hu, Yanping; Xie, Tao
2016-01-01
Several studies have assessed the clinicopathological and prognostic value of cyclooxygenase-2 (COX-2) expression in patients with head and neck cancer (HNC), but their results remain controversial. To address this issue, a meta-analysis was carried out. A total of 29 studies involving 2430 patients were subjected to final analysis. Our results indicated that COX-2 expression was not statistically associated with advanced tumor stage (OR, 1.23; 95% CI, 0.98–1.55) but correlated with a high risk of lymph node metastasis (OR, 1.28; 95% CI, 1.03–1.60) and advanced TNM stage (OR, 1.33; 95% CI, 1.06–1.66). Moreover, COX-2 expression had a significant effect on poor OS (HR, 1.93; 95% CI, 1.29–2.90), RFS (HR, 2.02; 95% CI, 1.00–4.08) and DFS (HR, 5.14; 95% CI, 2.84–9.31). The results of subgroup analyses revealed that COX-2 expression was related to a high possibility of lymph node metastasis in oral cancer (OR, 1.49; 95% CI, 1.01–2.20), and to advanced TNM stage in oral cancer (OR, 1.58; 95% CI, 1.05–2.37) and in HNC studies without a specified site (OR, 1.64; 95% CI, 1.02–2.62). However, subgroup analyses only showed a tendency, without a statistically significant association, between COX-2 expression and survival. Significant heterogeneity was not found when analyzing clinicopathological data, but it appeared when considering survival data. No publication bias was detected in this study. This meta-analysis suggested that COX-2 expression could act as a prognostic factor for patients with HNC. PMID:27323811
Preliminary analysis of hot spot factors in an advanced reactor for space electric power systems
NASA Technical Reports Server (NTRS)
Lustig, P. H.; Holms, A. G.; Davison, H. W.
1973-01-01
The maximum fuel pin temperature for nominal operation in an advanced power reactor is 1370 K. Because of possible nitrogen embrittlement of the clad, the fuel temperature was limited to 1622 K. Assuming simultaneous occurrence of the most adverse conditions, a deterministic analysis gave a maximum fuel temperature of 1610 K. A statistical analysis, using a synthesized estimate of the standard deviation of the highest fuel pin temperature, showed a probability of 0.015 that this pin exceeds the temperature limit according to the distribution-free Chebyshev inequality, and a virtually nil probability assuming a normal distribution. The latter assumption gives a 1463 K maximum temperature at 3 standard deviations, the usually assumed cutoff. Further, the distribution and standard deviation of the fuel-clad gap are the most significant contributors to the uncertainty in the fuel temperature.
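The quoted probabilities are mutually consistent and can be reproduced from the stated temperatures, assuming the 3-standard-deviation figure pins down the spread; a short R check:

```r
# Reproducing the reported tail probabilities from the quoted temperatures:
# nominal maximum 1370 K, limit 1622 K, and 1463 K at 3 standard deviations
# together imply sigma = (1463 - 1370) / 3 = 31 K (an inferred value).
sigma <- (1463 - 1370) / 3
k <- (1622 - 1370) / sigma    # the limit sits ~8.1 sigma above nominal
1 / k^2                       # Chebyshev bound: ~0.015, as reported
pnorm(1622, mean = 1370, sd = sigma, lower.tail = FALSE)  # normal tail: ~2e-16, "virtually nil"
```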
Re-analysis of survival data of cancer patients utilizing additive homeopathy.
Gleiss, Andreas; Frass, Michael; Gaertner, Katharina
2016-08-01
In this short communication we present a re-analysis of homeopathic patient data in comparison to control patient data from the same Outpatient's Unit "Homeopathy in malignant diseases" of the Medical University of Vienna. In this analysis we took account of a probable immortal time bias. For patients suffering from advanced stages of cancer and surviving the first 6 or 12 months after diagnosis, respectively, the results show that utilizing homeopathy gives a statistically significant (p<0.001) advantage over control patients regarding survival time. In conclusion, bearing in mind all limitations, the results of this retrospective study suggest that patients with advanced stages of cancer might benefit from additional homeopathic treatment until a survival time of up to 12 months after diagnosis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Advance directives in intensive care: Health professional competences.
Velasco-Sanz, T R; Rayón-Valpuesta, E
2016-04-01
To identify knowledge, skills and attitudes among physicians and nurses of adult intensive care units (ICUs) with regard to advance directives or living wills. A cross-sectional descriptive study was carried out among physicians and nurses of adult intensive care in nine hospitals of the Community of Madrid (Spain). A survey combining a qualitative Likert-type scale and multiple-response questions was administered, covering knowledge, skills and attitudes about advance directives. A descriptive statistical analysis based on percentages was made, with the chi-squared test applied for comparisons and p < 0.05 accepted as statistically significant. A total of 331 surveys were collected (51%). It was seen that 90.3% did not know all the measures envisaged by the advance directives. In turn, 50.2% claimed that living wills are not respected, and 82.8% believed advance directives to be a useful tool for health professionals in the decision-making process. A total of 85.3% of the physicians stated that they would respect a living will in case of emergency, compared to 66.2% of the nursing staff (p = 0.007). Lastly, only 19.1% of the physicians and 2.3% of the nursing staff knew whether their patients had advance directives (p < 0.001). Although health professionals displayed poor knowledge of advance directives, they had a favorable attitude toward their usefulness. However, most did not know whether their patients had a living will, and some professionals even failed to respect such instructions despite knowing of the existence of advance directives. Improvements in health professional education in this field are needed. Copyright © 2015 Elsevier España, S.L.U. and SEMICYUC. All rights reserved.
Statistical ecology comes of age.
Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric
2014-12-01
The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.
Quantification of Operational Risk Using A Data Mining
NASA Technical Reports Server (NTRS)
Perera, J. Sebastian
1999-01-01
What is Data Mining? - Data Mining is the process of finding actionable information hidden in raw data. - Data Mining helps find hidden patterns, trends, and important relationships often buried in a sea of data - Typically, automated software tools based on advanced statistical analysis and data modeling technology can be utilized to automate the data mining process
ERIC Educational Resources Information Center
Preacher, Kristopher J.; Kelley, Ken
2011-01-01
The statistical analysis of mediation effects has become an indispensable tool for helping scientists investigate processes thought to be causal. Yet, in spite of many recent advances in the estimation and testing of mediation effects, little attention has been given to methods for communicating effect size and the practical importance of those…
Attitudes toward Advanced and Multivariate Statistics When Using Computers.
ERIC Educational Resources Information Center
Kennedy, Robert L.; McCallister, Corliss Jean
This study investigated the attitudes toward statistics of graduate students who studied advanced statistics in a course in which the focus of instruction was the use of a computer program in class. The use of the program made it possible to provide an individualized, self-paced, student-centered, and activity-based course. The three sections…
ERIC Educational Resources Information Center
Perrett, Jamis J.
2012-01-01
This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…
Computerized system for assessing heart rate variability.
Frigy, A; Incze, A; Brânzaniuc, E; Cotoi, S
1996-01-01
The principal theoretical, methodological and clinical aspects of heart rate variability (HRV) analysis are reviewed. This method has been developed over the last 10 years as a useful noninvasive method of measuring the activity of the autonomic nervous system. The main components and the functioning of the computerized rhythm-analyzer system developed by our team are presented. The system is able to perform short-term (maximum 20 minutes) time domain HRV analysis and statistical analysis of the ventricular rate in any rhythm, particularly in atrial fibrillation. The performance of our system is demonstrated using the graphics (RR histograms, delta RR histograms, RR scattergrams) and the statistical parameters resulting from the processing of three ECG recordings. These recordings were obtained from a normal subject, from a patient with advanced heart failure, and from a patient with atrial fibrillation.
NASA Astrophysics Data System (ADS)
Zhang, Ying; Moges, Semu; Block, Paul
2018-01-01
Prediction of seasonal precipitation can provide actionable information to guide management of various sectoral activities. For instance, it is often translated into hydrological forecasts for better water resources management. However, many studies assume homogeneity in precipitation across an entire study region, which may prove ineffective for operational and local-level decisions, particularly for locations with high spatial variability. This study proposes advancing local-level seasonal precipitation predictions by first conditioning on regional-level predictions, as defined through objective cluster analysis, for western Ethiopia. To our knowledge, this is the first study predicting seasonal precipitation at high resolution in this region, where lives and livelihoods are vulnerable to precipitation variability given the high reliance on rain-fed agriculture and limited water resources infrastructure. The combination of objective cluster analysis, spatially high-resolution prediction of seasonal precipitation, and a modeling structure spanning statistical and dynamical approaches makes clear advances in prediction skill and resolution, as compared with previous studies. The statistical model improves versus the non-clustered case or dynamical models for a number of specific clusters in northwestern Ethiopia, with clusters having regional average correlation and ranked probability skill score (RPSS) values of up to 0.5 and 33 %, respectively. The general skill (after bias correction) of the two best-performing dynamical models over the entire study region is superior to that of the statistical models, although the dynamical models issue predictions at a lower resolution and the raw predictions require bias correction to guarantee comparable skills.
Analytic programming with FMRI data: a quick-start guide for statisticians using R.
Eloyan, Ani; Li, Shanshan; Muschelli, John; Pekar, Jim J; Mostofsky, Stewart H; Caffo, Brian S
2014-01-01
Functional magnetic resonance imaging (fMRI) is a thriving field that plays an important role in medical imaging analysis, biological and neuroscience research and practice. This manuscript gives a didactic introduction to the statistical analysis of fMRI data using the R project, along with the relevant R code. The goal is to give statisticians who would like to pursue research in this area a quick tutorial for programming with fMRI data. References of relevant packages and papers are provided for those interested in more advanced analysis.
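The guide's own examples are written in R; purely as a minimal sketch of the kind of voxel-wise analysis such tutorials introduce — not code from the paper — a massively univariate linear model can be fit in a few lines. Here it is in Python; the array shapes, the boxcar task regressor, and all data are invented.

```python
import numpy as np

# Hypothetical fMRI data: 1000 voxels x 120 time points (made-up dimensions).
rng = np.random.default_rng(0)
Y = rng.standard_normal((1000, 120))

# Design matrix: intercept plus a toy on/off boxcar task regressor.
task = np.tile(np.repeat([0.0, 1.0], 10), 6)       # 120 time points
X = np.column_stack([np.ones_like(task), task])    # shape (120, 2)

# Voxel-wise ordinary least squares: solve X b = y for every voxel at once.
beta, *_ = np.linalg.lstsq(X, Y.T, rcond=None)     # shape (2, 1000)
task_effect = beta[1]                              # task coefficient per voxel
print("largest task effect:", task_effect.max())
```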
The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bihn T. Pham; Jeffrey J. Einerson
2010-06-01
This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
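As a generic illustration of the first of the three techniques named above (control charting) — a textbook Shewhart-style sketch, not NDMAS code — limits are set from an in-control baseline and new readings are flagged against them. The readings and the 3-sigma rule here are invented assumptions.

```python
import numpy as np

# Made-up baseline of in-control thermocouple readings (deg C).
baseline = np.array([1015.0, 1018.2, 1016.5, 1014.8, 1017.1, 1016.0, 1015.5])
mu, sigma = baseline.mean(), baseline.std(ddof=1)
lo, hi = mu - 3 * sigma, mu + 3 * sigma   # Shewhart 3-sigma control limits

# New readings; the third mimics a hypothetical failing sensor.
new = np.array([1016.2, 1014.9, 901.3])
for t in new:
    status = "OK" if lo <= t <= hi else "OUT OF CONTROL"
    print(f"{t:7.1f}  {status}")
```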
[Perimetric changes in advanced glaucoma].
Feraru, Crenguta Ioana; Pantalon, Anca
2011-01-01
The evaluation of various perimetric aspects in advanced glaucoma stages correlated to morpho-functional changes. MATERIAL AND METHOD: Retrospective clinical trial over a 10-month period that included patients with advanced glaucoma stages, for whom several computerised visual field tests were recorded (central 24-2 strategy, 10-2 strategy with either III or V Goldmann stimulus spot size) along with other morpho-functional ocular parameters: VA, IOP, optic disk analysis. We included in our study 56 eyes from 45 patients. In most cases (89%) it was an open angle glaucoma (either primary or secondary). Mean visual acuity was 0.45 +/- 0.28. Regarding the perimetric deficit, 83% had advanced deficit, 9% moderate and 8% early visual changes. As perimetric type of defect, we found a majority with general reduction of sensitivity (33 eyes) plus ring-shaped scotoma. In 6 eyes (10.7%) left with only a central island of vision, we performed the central 10-2 strategy with III or V Goldmann stimulus spot size. Statistical analysis showed poor correlation between visual acuity and the quantitative perimetric parameters (MD and PSD), and analysis of variance found a multiple correlation parameter of p = 0.07, showing that there is no linear correspondence between the morpho-functional parameters VA-MD(PSD) and the C/D ratio. In advanced glaucoma stages, the perimetric changes are mostly severe. Perimetric evaluation is essential in these stages and needs to be individualised.
Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S
2016-11-01
There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
NASA Astrophysics Data System (ADS)
Skorobogatiy, Maksim; Sadasivan, Jayesh; Guerboukha, Hichem
2018-05-01
In this paper, we first discuss the main types of noise in a typical pump-probe system, and then focus specifically on terahertz time domain spectroscopy (THz-TDS) setups. We then introduce four statistical models for the noisy pulses obtained in such systems, and detail rigorous mathematical algorithms to de-noise such traces, find the proper averages and characterise various types of experimental noise. Finally, we perform a comparative analysis of the performance, advantages and limitations of the algorithms by testing them on experimental data collected using a particular THz-TDS system available in our laboratories. We conclude that using advanced statistical models for trace averaging results in fitting errors that are significantly smaller than those obtained when only a simple statistical average is used.
Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review
Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie
2015-01-01
Background: Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods: Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results: Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions: With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
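A minimal sketch of one regression approach the review lists (Poisson regression of outlet counts on a disadvantage score), using statsmodels with invented data and variable names. Note the sketch ignores spatially correlated residuals, which is exactly the limitation the review flags.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: fast-food outlet counts per neighbourhood and an
# invented neighbourhood disadvantage score in [0, 1].
rng = np.random.default_rng(1)
disadvantage = rng.uniform(0, 1, 200)
outlets = rng.poisson(np.exp(0.2 + 1.1 * disadvantage))

X = sm.add_constant(disadvantage)
model = sm.GLM(outlets, X, family=sm.families.Poisson()).fit()
print(model.params)   # coefficient on disadvantage recovers ~1.1
```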
Advanced Statistics for Exotic Animal Practitioners.
Hodsoll, John; Hellier, Jennifer M; Ryan, Elizabeth G
2017-09-01
Correlation and regression assess the association between 2 or more variables. This article reviews the core knowledge needed to understand these analyses, moving from visual analysis in scatter plots through correlation, simple and multiple linear regression, and logistic regression. Correlation estimates the strength and direction of a relationship between 2 variables. Regression can be considered more general and quantifies the numerical relationships between an outcome and 1 or multiple variables in terms of a best-fit line, allowing predictions to be made. Each technique is discussed with examples and the statistical assumptions underlying their correct application. Copyright © 2017 Elsevier Inc. All rights reserved.
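A brief worked illustration of the two techniques the article reviews — correlation and simple linear regression — using SciPy on invented paired data (not an example from the article itself):

```python
import numpy as np
from scipy import stats

# Invented paired measurements for two variables.
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 50)
y = 2.0 * x + rng.normal(0, 1.5, 50)

r, p = stats.pearsonr(x, y)    # strength and direction of the association
fit = stats.linregress(x, y)   # best-fit line, usable for prediction
print(f"r = {r:.2f} (p = {p:.3g})")
print(f"y = {fit.slope:.2f}*x + {fit.intercept:.2f}")
```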
Penco, Silvana; Buscema, Massimo; Patrosso, Maria Cristina; Marocchi, Alessandro; Grossi, Enzo
2008-05-30
Few genetic factors predisposing to the sporadic form of amyotrophic lateral sclerosis (ALS) have been identified, but the pathology itself seems to be a true multifactorial disease in which complex interactions between environmental and genetic susceptibility factors take place. The purpose of this study was to approach genetic data with an innovative statistical method such as artificial neural networks to identify a possible genetic background predisposing to the disease. A DNA multiarray panel was applied to genotype more than 60 polymorphisms within 35 genes selected from pathways of lipid and homocysteine metabolism, regulation of blood pressure, coagulation, inflammation, cellular adhesion and matrix integrity, in 54 sporadic ALS patients and 208 controls. Advanced intelligent systems based on a novel coupling of artificial neural networks and evolutionary algorithms were applied, and the results obtained were compared with those derived from the use of standard neural networks and classical statistical analysis. This analysis led to the unexpected discovery of a strong genetic background in sporadic ALS. The predictive accuracy obtained with Linear Discriminant Analysis and Standard Artificial Neural Networks ranged from 70% to 79% (average 75.31%) and from 69.1% to 86.2% (average 76.6%), respectively. The corresponding value obtained with Advanced Intelligent Systems reached an average of 96.0% (range 94.4% to 97.6%). This latter approach allowed the identification of seven genetic variants essential to differentiate cases from controls: apolipoprotein E arg158cys; hepatic lipase -480 C/T; endothelial nitric oxide synthase 690 C/T and glu298asp; vitamin K-dependent coagulation factor VII arg353glu; glycoprotein Ia/IIa 873 G/A and E-selectin ser128arg. This study provides an alternative and reliable method to approach complex diseases. Indeed, the application of a novel artificial intelligence-based method offers a new insight into genetic markers of sporadic ALS, pointing out the existence of a strong genetic background.
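The study's "advanced intelligent systems" couple neural networks with evolutionary algorithms and are not reproduced here; the sketch below only mirrors the baseline comparison (Linear Discriminant Analysis versus a standard feed-forward network) with scikit-learn, on a made-up stand-in for the genotype matrix.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Invented stand-in genotype matrix: 262 subjects x 60 binary markers.
rng = np.random.default_rng(3)
X = rng.integers(0, 2, size=(262, 60)).astype(float)
y = rng.integers(0, 2, size=262)   # 1 = sporadic ALS case, 0 = control

models = [("Linear Discriminant Analysis", LinearDiscriminantAnalysis()),
          ("Standard ANN", MLPClassifier(hidden_layer_sizes=(20,),
                                         max_iter=2000, random_state=0))]
for name, clf in models:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean cross-validated accuracy = {acc:.2f}")  # ~0.5 on noise
```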
Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu
2017-02-01
The aim of this paper is to systematically evaluate a biased sampling issue associated with genome-wide association analysis (GWAS) of imaging phenotypes for most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these imaging genetic studies is primarily the retrospective case-control design, whereas most existing statistical analyses of these studies ignore such sampling scheme by directly correlating imaging phenotypes (called the secondary traits) with genotype. Although it has been well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and suspicious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic data analysis of the Alzheimer's Disease Neuroimaging Initiative (ADNI) data to evaluate the effects of the case-control sampling scheme on GWAS results based on some standard statistical methods, such as linear regression methods, while comparing it with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.
Imaging flow cytometry for phytoplankton analysis.
Dashkova, Veronika; Malashenkov, Dmitry; Poulton, Nicole; Vorobjev, Ivan; Barteneva, Natasha S
2017-01-01
This review highlights the concepts and instrumentation of imaging flow cytometry technology and in particular its use for phytoplankton analysis. Imaging flow cytometry, a hybrid technology combining the speed and statistical capabilities of flow cytometry with the imaging features of microscopy, is rapidly advancing as a cell imaging platform that overcomes many of the limitations of current techniques and has contributed significantly to the advancement of phytoplankton analysis in recent years. This review presents the various instrumentation relevant to the field and currently used for assessment of complex phytoplankton communities' composition and abundance, size structure determination, biovolume estimation, detection of harmful algal bloom species, evaluation of viability and metabolic activity, and other applications. We also present our data on viability and metabolic assessment of Aphanizomenon sp. cyanobacteria using the ImageStream X Mark II imaging cytometer. Herein, we highlight the immense potential of imaging flow cytometry for microalgal research, but also discuss limitations and future developments. Copyright © 2016 Elsevier Inc. All rights reserved.
Saadat, Mostafa; Khalili, Maryam; Omidvari, Shahpour; Ansari-Lari, Maryam
2011-03-28
The main aim of the present study was to investigate the association between parental consanguinity and clinical response to chemotherapy in females affected with locally advanced breast cancer. A consecutive series of 92 patients were prospectively included in this study. Clinical assessment of treatment was accomplished by comparing initial tumor size with preoperative tumor size using the revised RECIST guideline (version 1.1). Clinical response was defined as complete response, partial response or no response. Kaplan-Meier survival analysis was used to evaluate the association between parental marriage (first cousin vs unrelated marriages) and clinical response to chemotherapy (complete and partial response vs no response). The number of courses of chemotherapy was considered as time in the analysis. Kaplan-Meier analysis revealed that offspring of unrelated marriages had a poorer response to chemotherapy (log rank statistic=5.10, df=1, P=0.023). Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
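A minimal sketch of the kind of two-group Kaplan-Meier comparison reported above, using the log-rank test from the lifelines library. The durations and event flags are invented; in the study, "time" was the number of chemotherapy courses.

```python
from lifelines.statistics import logrank_test

# Hypothetical courses-to-response data for the two parental-marriage groups.
related_time    = [2, 3, 3, 4, 5, 6]
related_event   = [1, 1, 1, 1, 1, 0]   # 1 = response observed, 0 = censored
unrelated_time  = [3, 4, 5, 6, 6, 6]
unrelated_event = [1, 1, 0, 0, 1, 0]

res = logrank_test(related_time, unrelated_time,
                   event_observed_A=related_event,
                   event_observed_B=unrelated_event)
print(res.test_statistic, res.p_value)
```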
[The role of meta-analysis in assessing the treatment of advanced non-small cell lung cancer].
Pérol, M; Pérol, D
2004-02-01
Meta-analysis is a statistical method allowing an evaluation of the direction and quantitative importance of a treatment effect observed in randomized trials which have tested the treatment but have not provided a definitive conclusion. In the present review, we discuss the methodology and the contribution of meta-analyses to the treatment of advanced-stage or metastatic non-small-cell lung cancer. In this area of oncology, meta-analyses have provided decisive information demonstrating the impact of chemotherapy on patient survival. They have also helped define a two-drug regimen based on cisplatin as the gold standard treatment for patients with a satisfactory general status. Recently, the meta-analysis method was used to measure the influence of gemcitabine in combination with platinum salts and demonstrated a small but significant benefit in survival, confirming that gemcitabine remains the gold standard treatment in combination with cisplatin.
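For readers unfamiliar with the mechanics, here is a generic fixed-effect (inverse-variance) pooling sketch on the log hazard-ratio scale, with invented trial results; this is textbook meta-analysis arithmetic, not the specific analyses reviewed above.

```python
import numpy as np

# Hypothetical per-trial log hazard ratios and standard errors.
log_hr = np.array([-0.25, -0.10, -0.30, -0.05])
se     = np.array([ 0.12,  0.15,  0.20,  0.10])

w = 1.0 / se**2                              # inverse-variance weights
pooled = np.sum(w * log_hr) / np.sum(w)      # fixed-effect pooled estimate
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = pooled + np.array([-1.96, 1.96]) * pooled_se
print(f"pooled HR = {np.exp(pooled):.2f}, 95% CI {np.exp(ci).round(2)}")
```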
Integration of Advanced Statistical Analysis Tools and Geophysical Modeling
2012-08-01
Carin (Duke University); Douglas Oldenburg (University of British Columbia); Stephen Billings, Leonard Pasion, Laurens Beran (Sky Research). ...data processing for UXO discrimination is the time (or frequency) dependent dipole model (Bell and Barrow (2001), Pasion and Oldenburg (2001), Zhang...described by a bimodal distribution (i.e. two Gaussians, see Pasion (2007)). Data features are nonetheless useful when data quality is not sufficient.
Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skalski, John
2003-11-01
The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable or not. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.
Research in Computational Astrobiology
NASA Technical Reports Server (NTRS)
Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.
2003-01-01
We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods, and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.
Comparative study of smile analysis by subjective and computerized methods.
Basting, Roberta Tarkany; da Trindade, Rita de Cássia Silva; Flório, Flávia Martão
2006-01-01
This study compared: 1) the subjective analyses of a smile done by specialists with advanced training and by general dentists; 2) the subjective analysis of a smile, or that associated with the face, by specialists with advanced training and general dentists; 3) subjective analysis against a computerized analysis of the smile by specialists with advanced training, verifying the midline, labial line, smile line, the line between commissures and the golden proportion. The sample consisted of 100 adults with natural dentition; 200 photographs were taken (100 of the smile and 100 of the entire face). Computerized analysis using AutoCAD software was performed, together with the subjective analyses of 2 groups of professionals (3 general dentists and 3 specialists with advanced training), using the following assessment factors: the midline, labial line, smile line, line between the commissures and the golden proportion. The smile itself and the smile associated with the entire face were recorded as being agreeable or not agreeable by the professionals. The McNemar test showed a highly significant difference (p=0.0000) between the subjective analyses performed by specialists and those performed by general dentists. Between the 2 groups of dental professionals, highly significant differences (p=0.0000) were found between the subjective analyses of the smile and those of the face. The McNemar test showed statistical differences in all factors assessed, with the exception of the midline (p=0.1951), when the computerized analysis and the subjective analysis of the specialists were compared. It was not possible to assign greater or lesser relevance to any of the factors analyzed in establishing the harmony of the smile.
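A short sketch of the McNemar test used above, via statsmodels, on an invented 2x2 agreement table (the study's actual counts are not given in the abstract):

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired verdicts: rows = specialists (agreeable / not),
# columns = general dentists (agreeable / not).
table = np.array([[55, 25],
                  [ 5, 15]])
result = mcnemar(table, exact=True)   # tests the off-diagonal disagreements
print(result.statistic, result.pvalue)
```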
A Primer on Observational Measurement.
Girard, Jeffrey M; Cohn, Jeffrey F
2016-08-01
Observational measurement plays an integral role in a variety of scientific endeavors within biology, psychology, sociology, education, medicine, and marketing. The current article provides an interdisciplinary primer on observational measurement; in particular, it highlights recent advances in observational methodology and the challenges that accompany such growth. First, we detail the various types of instrument that can be used to standardize measurements across observers. Second, we argue for the importance of validity in observational measurement and provide several approaches to validation based on contemporary validity theory. Third, we outline the challenges currently faced by observational researchers pertaining to measurement drift, observer reactivity, reliability analysis, and time/expense. Fourth, we describe recent advances in computer-assisted measurement, fully automated measurement, and statistical data analysis. Finally, we identify several key directions for future observational research to explore.
Risk assessment model for development of advanced age-related macular degeneration.
Klein, Michael L; Francis, Peter J; Ferris, Frederick L; Hamon, Sara C; Clemons, Traci E
2011-12-01
To design a risk assessment model for development of advanced age-related macular degeneration (AMD) incorporating phenotypic, demographic, environmental, and genetic risk factors. We evaluated longitudinal data from 2846 participants in the Age-Related Eye Disease Study. At baseline, these individuals had all levels of AMD, ranging from none to unilateral advanced AMD (neovascular or geographic atrophy). Follow-up averaged 9.3 years. We performed a Cox proportional hazards analysis with demographic, environmental, phenotypic, and genetic covariates and constructed a risk assessment model for development of advanced AMD. Performance of the model was evaluated using the C statistic and the Brier score and externally validated in participants in the Complications of Age-Related Macular Degeneration Prevention Trial. The final model included the following independent variables: age, smoking history, family history of AMD (first-degree member), phenotype based on a modified Age-Related Eye Disease Study simple scale score, and genetic variants CFH Y402H and ARMS2 A69S. The model did well on performance measures, with very good discrimination (C statistic = 0.872) and excellent calibration and overall performance (Brier score at 5 years = 0.08). Successful external validation was performed, and a risk assessment tool was designed for use with or without the genetic component. We constructed a risk assessment model for development of advanced AMD. The model performed well on measures of discrimination, calibration, and overall performance and was successfully externally validated. This risk assessment tool is available for online use.
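The model above is a Cox proportional hazards regression; a toy sketch with the lifelines library follows. The data frame, column names, and values are all invented stand-ins for the AREDS-style covariates, not the study's data.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented miniature data set; column names are assumptions.
df = pd.DataFrame({
    "years":        [9.1, 4.2, 7.5, 2.8, 9.3, 5.0, 6.1, 3.9],
    "advanced_amd": [0, 1, 0, 1, 0, 1, 1, 0],   # event indicator
    "age":          [62, 71, 68, 75, 60, 73, 66, 64],
    "smoker":       [0, 1, 1, 1, 0, 0, 1, 0],
})

# A small ridge penalty keeps this tiny toy fit numerically stable.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="years", event_col="advanced_amd")
cph.print_summary()   # hazard ratios per covariate
```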
Haas, Magali; Stephenson, Diane; Romero, Klaus; Gordon, Mark Forrest; Zach, Neta; Geerts, Hugo
2016-09-01
Many disease-modifying clinical development programs in Alzheimer's disease (AD) have failed to date, and development of new and advanced preclinical models that generate actionable knowledge is desperately needed. This review reports on computer-based modeling and simulation approach as a powerful tool in AD research. Statistical data-analysis techniques can identify associations between certain data and phenotypes, such as diagnosis or disease progression. Other approaches integrate domain expertise in a formalized mathematical way to understand how specific components of pathology integrate into complex brain networks. Private-public partnerships focused on data sharing, causal inference and pathway-based analysis, crowdsourcing, and mechanism-based quantitative systems modeling represent successful real-world modeling examples with substantial impact on CNS diseases. Similar to other disease indications, successful real-world examples of advanced simulation can generate actionable support of drug discovery and development in AD, illustrating the value that can be generated for different stakeholders. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
[Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].
Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta
2014-01-01
Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a unique comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons allow combining estimates from direct and indirect comparisons, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons, and these can easily be conducted using a Microsoft Office Excel spreadsheet. We developed a user-friendly spreadsheet for indirect and mixed-effects comparisons, aimed at clinical researchers interested in systematic reviews but not familiar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology, extending the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
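The Bucher method the spreadsheet implements reduces to two lines of arithmetic: the indirect estimate is the difference of the two direct log effect estimates through the common comparator, and their variances add. A sketch with invented numbers:

```python
import numpy as np

# Bucher-style adjusted indirect comparison of A vs B through common
# comparator C, on the log odds-ratio scale (numbers invented).
log_or_ac, se_ac = -0.40, 0.18    # A vs C from one set of trials
log_or_bc, se_bc = -0.15, 0.20    # B vs C from another set

log_or_ab = log_or_ac - log_or_bc            # indirect A vs B estimate
se_ab = np.sqrt(se_ac**2 + se_bc**2)         # variances add
ci = log_or_ab + np.array([-1.96, 1.96]) * se_ab
print(f"OR(A vs B) = {np.exp(log_or_ab):.2f}, 95% CI {np.exp(ci).round(2)}")
```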
NASA Astrophysics Data System (ADS)
Johnson, A. C.; Yeakley, A.
2009-12-01
Timberline forest advance associated with global climate change is occurring worldwide and is often associated with microsites. Microsites, controlled by topography, substrates, and plant cover, are localized regions dictating temperature, moisture, and solar radiation. These abiotic factors are integral to seedling survival. From a compilation of world-wide information on seedling regeneration on microsites at timberline, including our on-going research in the Pacific Northwest, we classified available literature into four microsite categories, related microsite category to annual precipitation, and used analysis of variance to detect statistical differences in microsite type and associated precipitation. We found statistical differences (p = 0.022) indicating the usefulness of understanding microsite/precipitation associations in detecting world-wide trends in timberline expansion. For example, wetter timberlines with downed wood, had regeneration associated with nurse logs, whereas on windy, drier landscapes, regeneration was typically associated with either leeward sides of tree clumps or on microsites protected from frost by overstory canopy. In our study of timberline expansion in the Pacific Northwest, we expect that such knowledge of microsite types associated with forest expansion will reveal a better understanding of mechanisms and rates of timberline forest advance during global warming.
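A minimal sketch of the one-way ANOVA used in the study, with SciPy; the precipitation values for the four microsite categories are invented for illustration.

```python
from scipy.stats import f_oneway

# Invented annual precipitation (mm) at timberline sites, grouped by
# microsite category (values hypothetical).
nurse_log   = [2200, 2450, 2100, 2600]
tree_clump  = [900, 1100, 1000, 950]
canopy      = [1300, 1250, 1400, 1350]
open_ground = [1600, 1500, 1700, 1650]

stat, p = f_oneway(nurse_log, tree_clump, canopy, open_ground)
print(f"F = {stat:.2f}, p = {p:.3f}")
```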
Advances in Machine Learning and Data Mining for Astronomy
NASA Astrophysics Data System (ADS)
Way, Michael J.; Scargle, Jeffrey D.; Ali, Kamal M.; Srivastava, Ashok N.
2012-03-01
Advances in Machine Learning and Data Mining for Astronomy documents numerous successful collaborations among computer scientists, statisticians, and astronomers who illustrate the application of state-of-the-art machine learning and data mining techniques in astronomy. Due to the massive amount and complexity of data in most scientific disciplines, the material discussed in this text transcends traditional boundaries between various areas in the sciences and computer science. The book's introductory part provides context to issues in the astronomical sciences that are also important to health, social, and physical sciences, particularly probabilistic and statistical aspects of classification and cluster analysis. The next part describes a number of astrophysics case studies that leverage a range of machine learning and data mining technologies. In the last part, developers of algorithms and practitioners of machine learning and data mining show how these tools and techniques are used in astronomical applications. With contributions from leading astronomers and computer scientists, this book is a practical guide to many of the most important developments in machine learning, data mining, and statistics. It explores how these advances can solve current and future problems in astronomy and looks at how they could lead to the creation of entirely new algorithms within the data mining community.
Martin, Lisa; Watanabe, Sharon; Fainsinger, Robin; Lau, Francis; Ghosh, Sunita; Quan, Hue; Atkins, Marlis; Fassbender, Konrad; Downing, G Michael; Baracos, Vickie
2010-10-01
To determine whether elements of a standard nutritional screening assessment are independently prognostic of survival in patients with advanced cancer. A prospective nested cohort of patients with metastatic cancer were accrued from different units of a Regional Palliative Care Program. Patients completed a nutritional screen on admission. Data included age, sex, cancer site, height, weight history, dietary intake, 13 nutrition impact symptoms, and patient- and physician-reported performance status (PS). Univariate and multivariate survival analyses were conducted. Concordance statistics (c-statistics) were used to test the predictive accuracy of models based on training and validation sets; a c-statistic of 0.5 indicates the model predicts the outcome as well as chance; perfect prediction has a c-statistic of 1.0. A training set of patients in palliative home care (n = 1,164) was used to identify prognostic variables. Primary disease site, PS, short-term weight change (either gain or loss), dietary intake, and dysphagia predicted survival in multivariate analysis (P < .05). A model including only disease site and PS yielded high c-statistics between predicted and observed survival in the training set (0.90) and validation set (0.88; n = 603). The addition of weight change, dietary intake, and dysphagia did not further improve the c-statistic of the model. The c-statistic was also not altered by substituting physician-rated palliative PS for patient-reported PS. We demonstrate a high probability of concordance between predicted and observed survival for patients in distinct palliative care settings (home care, tertiary inpatient, ambulatory outpatient) based on patient-reported information.
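A compact illustration of the concordance statistic defined above (0.5 = chance, 1.0 = perfect), computed with a lifelines utility function on invented survival data:

```python
import numpy as np
from lifelines.utils import concordance_index

# Invented survival times (months), event flags, and model risk scores.
t     = np.array([3, 8, 12, 24, 30, 36])
event = np.array([1, 1, 1, 0, 1, 0])
risk  = np.array([2.9, 2.1, 1.8, 0.9, 1.0, 0.4])  # higher = worse prognosis

# concordance_index expects scores where larger means longer survival,
# so pass the negated risk score.
c = concordance_index(t, -risk, event)
print(f"c-statistic = {c:.2f}")
```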
Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents
NASA Astrophysics Data System (ADS)
Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.
2016-12-01
Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
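A minimal sketch of the Shannon entropy calculation described above, treating entropy as a function of a signal's binned amplitude distribution. The signals are synthetic and the binning choice is an assumption, not the authors' processing chain.

```python
import numpy as np

def shannon_entropy(series, bins=16):
    """Entropy (in bits) of a signal's empirical amplitude distribution."""
    counts, _ = np.histogram(series, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins (0 log 0 -> 0)
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(4)
quiet  = rng.normal(0, 0.1, 5000)                              # narrow signal
active = rng.normal(0, 0.1, 5000) + rng.choice([0, 3], 5000)   # bursty signal
print(shannon_entropy(quiet), shannon_entropy(active))         # active is higher
```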
Using Bayes' theorem for free energy calculations
NASA Astrophysics Data System (ADS)
Rogers, David M.
Statistical mechanics is fundamentally based on calculating the probabilities of molecular-scale events. Although Bayes' theorem has generally been recognized as providing key guiding principles for setup and analysis of statistical experiments [83], classical frequentist models still predominate in the world of computational experimentation. As a starting point for widespread application of Bayesian methods in statistical mechanics, we investigate the central quantity of free energies from this perspective. This dissertation thus reviews the basics of Bayes' view of probability theory, and the maximum entropy formulation of statistical mechanics before providing examples of its application to several advanced research areas. We first apply Bayes' theorem to a multinomial counting problem in order to determine inner shell and hard sphere solvation free energy components of Quasi-Chemical Theory [140]. We proceed to consider the general problem of free energy calculations from samples of interaction energy distributions. From there, we turn to spline-based estimation of the potential of mean force [142], and empirical modeling of observed dynamics using integrator matching. The results of this research are expected to advance the state of the art in coarse-graining methods, as they allow a systematic connection from high-resolution (atomic) to low-resolution (coarse) structure and dynamics. In total, our work on these problems constitutes a critical starting point for further application of Bayes' theorem in all areas of statistical mechanics. It is hoped that the understanding so gained will allow for improvements in comparisons between theory and experiment.
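For orientation, the two textbook identities this work builds on — Bayes' theorem for inference, and the exponential-average (Zwanzig) relation for a free energy difference estimated from sampled interaction energies — written here in standard notation as background, not quoted from the dissertation:

```latex
P(\theta \mid D) = \frac{P(D \mid \theta)\,P(\theta)}{P(D)},
\qquad
\Delta F = -k_{B} T \,\ln\!\left\langle e^{-\Delta U / k_{B} T} \right\rangle_{0}
```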
Narayanan, Sarath Kumar; Cohen, Ralph Clinton; Shun, Albert
2014-06-01
Minimal access techniques have transformed the way pediatric surgery is practiced. Due to various constraints, surgical residency programs have not been able to provide adequate skills training in the routine setting. The advent of new technology and methods in minimally invasive surgery (MIS) has similarly contributed to the need for systematic skills training in a safe, simulated environment. To enable the training of proper technique among pediatric surgery trainees, we have advanced a porcine non-survival model for endoscopic surgery. The technical advancements over the past 3 years and a subjective validation of the porcine model from 114 participating trainees, using a standard questionnaire and a 5-point Likert scale, are described here. Mean attitude scores and analysis of variance (ANOVA) were used for statistical analysis of the data. Almost all trainees agreed or strongly agreed that the animal-based model was appropriate (98.35%) and also acknowledged that such workshops provided adequate practical experience before attempting procedures on human subjects (96.6%). Mean attitude score for respondents was 19.08 (SD 3.4, range 4-20). Attitude scores showed no statistical association with years of experience or level of seniority, indicating a positive attitude among all groups of respondents. Structured porcine-based MIS training should be an integral part of skill acquisition for pediatric surgery trainees, and the experience gained can be transferred into clinical practice. We advocate that laparoscopic training should begin in a controlled workshop setting before procedures are attempted on human patients.
MetaGenyo: a web tool for meta-analysis of genetic association studies.
Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro
2017-12-16
Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, to increase statistical power and to resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool to conduct comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
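One step of the workflow, the Hardy-Weinberg equilibrium test, is easy to sketch by hand; the genotype counts below are invented, and this is the generic chi-square arithmetic rather than MetaGenyo's implementation.

```python
import numpy as np
from scipy.stats import chisquare

# Hypothetical genotype counts (AA, Aa, aa) from one study's control arm.
obs = np.array([84, 92, 24])
n = obs.sum()
p = (2 * obs[0] + obs[1]) / (2 * n)   # allele A frequency

# Expected counts under Hardy-Weinberg proportions p^2 : 2pq : q^2.
expected = n * np.array([p**2, 2 * p * (1 - p), (1 - p)**2])
stat, pval = chisquare(obs, expected, ddof=1)   # 1 estimated parameter
print(f"chi2 = {stat:.2f}, p = {pval:.3f}")
```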
Validating Coherence Measurements Using Aligned and Unaligned Coherence Functions
NASA Technical Reports Server (NTRS)
Miles, Jeffrey Hilton
2006-01-01
This paper describes a novel approach based on the use of coherence functions and statistical theory for sensor validation in a harsh environment. By the use of aligned and unaligned coherence functions and statistical theory, one can test for sensor degradation, total sensor failure or changes in the signal. This advanced diagnostic approach and the novel data processing methodology discussed provide a single number that conveys this information. This number, as calculated with standard statistical procedures for comparing the means of two distributions, is compared with results obtained using Yuen's robust statistical method to create confidence intervals. Examination of experimental data from Kulite pressure transducers mounted in a Pratt & Whitney PW4098 combustor, using spectrum analysis methods on aligned and unaligned time histories, has verified the effectiveness of the proposed method. All the procedures produce good results, which demonstrates how robust the technique is.
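The aligned/unaligned construction is specific to this paper, but the underlying coherence estimate is standard; here is a sketch of the basic magnitude-squared coherence with SciPy, on synthetic channels that share a 60 Hz component (all signals invented).

```python
import numpy as np
from scipy import signal

# Two synthetic sensor channels sharing a 60 Hz component plus noise.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(5)
shared = np.sin(2 * np.pi * 60 * t)
x = shared + rng.standard_normal(t.size)
y = 0.8 * shared + rng.standard_normal(t.size)

f, Cxy = signal.coherence(x, y, fs=fs, nperseg=1024)
print(f"coherence near 60 Hz: {Cxy[np.argmin(np.abs(f - 60))]:.2f}")
```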
Integration of Advanced Statistical Analysis Tools and Geophysical Modeling
2010-12-01
Carin (Duke University); Douglas Oldenburg (University of British Columbia); Stephen Billings, Leonard Pasion, Laurens Beran (Sky Research). ...means and covariances estimated for each class [5]. For this study, dipole polarizabilities were fit with a Pasion-Oldenburg parameterization of 8 −1...model for unexploded ordnance classification with EMI data," IEEE Geosci. Remote Sensing Letters, vol. 4, pp. 629–633, 2007. [4] L. R. Pasion
Methods for estimating drought streamflow probabilities for Virginia streams
Austin, Samuel H.
2014-01-01
Maximum likelihood logistic regression model equations used to estimate drought flow probabilities for Virginia streams are presented for 259 hydrologic basins in Virginia. Winter streamflows were used to estimate the likelihood of streamflows during the subsequent drought-prone summer months. The maximum likelihood logistic regression models identify probable streamflows from 5 to 8 months in advance. More than 5 million daily streamflow values collected over the period of record (January 1, 1900 through May 16, 2012) were compiled and analyzed over a minimum 10-year (maximum 112-year) period of record. The analysis yielded the 46,704 equations with statistically significant fit statistics and parameter ranges published in two tables in this report. These model equations produce summer month (July, August, and September) drought flow threshold probabilities as a function of streamflows during the previous winter months (November, December, January, and February). Example calculations are provided, demonstrating how to use the equations to estimate probable streamflows as much as 8 months in advance.
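A toy sketch of the model family used in the report — logistic regression of a binary summer-flow outcome on winter streamflow — using statsmodels. The flows, coefficients, and threshold are invented; the report's 46,704 fitted equations are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

# Invented data: winter mean flow (cfs) vs. whether summer flow stayed
# above a drought threshold (1) or fell below it (0).
rng = np.random.default_rng(6)
winter = rng.uniform(50, 500, 120)
p_true = 1 / (1 + np.exp(-(0.02 * winter - 3.0)))
above = rng.binomial(1, p_true)

X = sm.add_constant(winter)
fit = sm.Logit(above, X).fit(disp=0)       # maximum likelihood estimation
print(fit.params)                          # intercept and winter-flow slope
print(fit.predict([[1.0, 300.0]]))         # P(above threshold | winter = 300)
```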
Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; Lednev, Igor K
2014-01-01
Body fluids are a common and important type of forensic evidence. In particular, the identification of menstrual blood stains is often a key step during the investigation of rape cases. Here, we report on the application of near-infrared Raman microspectroscopy for differentiating menstrual blood from peripheral blood. We observed that the menstrual and peripheral blood samples have similar but distinct Raman spectra. Advanced statistical analysis of the multiple Raman spectra that were automatically acquired (Raman mapping) from the 40 dried blood stains (20 donors for each group) allowed us to build a classification model with maximum (100%) sensitivity and specificity. We also demonstrated that despite certain common constituents, menstrual blood can be readily distinguished from vaginal fluid. All of the classification models were verified using cross-validation methods. The proposed method overcomes the problems associated with currently used biochemical methods, which are destructive, time consuming and expensive. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Bookstein, Fred L.
1995-08-01
Recent advances in computational geometry have greatly extended the range of neuroanatomical questions that can be approached by rigorous quantitative methods. One of the major current challenges in this area is to describe the variability of human cortical surface form and its implications for individual differences in neurophysiological functioning. Existing techniques for representation of stochastically invaginated surfaces do not conduce to the necessary parametric statistical summaries. In this paper, following a hint from David Van Essen and Heather Drury, I sketch a statistical method customized for the constraints of this complex data type. Cortical surface form is represented by its Riemannian metric tensor and averaged according to parameters of a smooth averaged surface. Sulci are represented by integral trajectories of the smaller principal strains of this metric, and their statistics follow the statistics of that relative metric. The diagrams visualizing this tensor analysis look like alligator leather but summarize all aspects of cortical surface form in between the principal sulci, the reliable ones; no flattening is required.
Colonic diverticulosis is not a risk factor for colonic adenoma.
Hong, Wandong; Dong, Lemei; Zippi, Maddalena; Stock, Simon; Geng, Wujun; Xu, Chunfang; Zhou, Mengtao
2018-01-01
Colonic diverticulosis may represent a risk factor for colonic adenomas because evolving data suggest that these 2 conditions may share common risk factors such as a Western dietary pattern and physical inactivity. This study aimed to investigate the association between colonic diverticulosis and colonic adenomas in mainland China. We conducted a cross-sectional study on patients who underwent colonoscopic examination between October 2013 and December 2014 in a university hospital in mainland China. Age, gender, colonic adenomas, advanced adenomas, and distribution of diverticulosis were recorded during the procedures. Multivariate logistic regression and stratified analysis were used to evaluate the associations between the prevalence of diverticulosis and age, sex, and presence of colonic adenomas and advanced adenomas. A total of 17,456 subjects were enrolled. The prevalence of colonic diverticulosis and adenoma was 2.4% and 13.2%, respectively. With regard to the distribution of diverticula, most (365/424, 86.1%) were right-sided. Multiple logistic regression analysis suggested that age and male gender were independent risk factors for adenoma and advanced adenoma. There was no relationship between diverticulosis, or its location, and the presence of adenoma or advanced adenoma after adjusting for age and gender. In a stratified analysis according to age and gender, similar results were also noted. There was no statistical relationship between diverticulosis and the risk of adenoma or advanced adenoma. Our results may not generalize to Western populations because the number of left-sided diverticular cases in our study was very small.
ERIC Educational Resources Information Center
Potter, James Thomson, III
2012-01-01
Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks combine the two and build on the need for more investigation into online teaching and learning in specific content (Ferdig et al, 2009; DiPietro,…
Using data warehousing and OLAP in public health care.
Hristovski, D; Rogac, M; Markota, M
2000-01-01
The paper describes the possibilities of using data warehousing and OLAP technologies in public health care in general and then our own experience with these technologies gained during the implementation of a data warehouse of outpatient data at the national level. Such a data warehouse serves as a basis for advanced decision support systems based on statistical, OLAP or data mining methods. We used OLAP to enable interactive exploration and analysis of the data. We found out that data warehousing and OLAP are suitable for the domain of public health and that they enable new analytical possibilities in addition to the traditional statistical approaches.
The clinical value of large neuroimaging data sets in Alzheimer's disease.
Toga, Arthur W
2012-02-01
Rapid advances in neuroimaging and cyberinfrastructure technologies have brought explosive growth in the Web-based warehousing, availability, and accessibility of imaging data on a variety of neurodegenerative and neuropsychiatric disorders and conditions. There has been a prolific development and emergence of complex computational infrastructures that serve as repositories of databases and provide critical functionalities such as sophisticated image analysis algorithm pipelines and powerful three-dimensional visualization and statistical tools. The statistical and operational advantages of collaborative, distributed team science in the form of multisite consortia push this approach in a diverse range of population-based investigations. Copyright © 2012 Elsevier Inc. All rights reserved.
Su, Hengchuan; Wang, Hongkai; Shi, Guohai; Zhang, Hailiang; Sun, Fukang; Ye, Dingwei
2018-06-01
In order to identify potential novel biomarkers of advanced clear cell renal cell carcinoma (ccRCC), we re-evaluated published long non-coding RNA (lncRNA) expression profiling data. The lncRNA expression profiles in the ccRCC microarray dataset GSE47352 were analyzed, and an independent cohort of 61 clinical samples, including 21 advanced and 40 localized ccRCC patients, was used to confirm the most statistically significant lncRNAs by real-time PCR. Next, the relationships between the selected lncRNAs and the patients' clinicopathological features were investigated, as were the effects of these lncRNAs on the invasion and proliferation of renal carcinoma cells. The PCR results showed that the expression of three lncRNAs (ENSG00000241684, ENSG00000231721 and NEAT1) was significantly downregulated in advanced ccRCC. Kaplan-Meier analysis revealed that reduced expression of lncRNAs ENSG00000241684 and NEAT1 was significantly associated with poor overall survival. Univariate and multivariate Cox regression indicated that lncRNA ENSG00000241684 had significant hazard ratios for predicting clinical outcome. LncRNA ENSG00000241684 expression was negatively correlated with pTNM stage. Overexpression of ENSG00000241684 significantly impaired cell proliferation and reduced invasion ability in 786-O and ACHN cells. LncRNAs are involved in renal carcinogenesis, and decreased lncRNA ENSG00000241684 expression may be an independent adverse prognostic factor in advanced ccRCC patients. Copyright © 2018 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
Power transfer systems for future navy helicopters. Final report 25 Jun 70--28 Jun 72
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bossler, R.B. Jr.
1972-11-01
The purpose of this program was to conduct an analysis of helicopter power transfer systems (pts), both conventional and advanced concept type, with the objective of reducing specific weights and improving reliability beyond present values. The analysis satisfied requirements specified for a 200,000 pound cargo transport helicopter (CTH), a 70,000 pound heavy assault helicopter, and a 15,000 pound non-combat search and rescue helicopter. Four selected gearing systems (out of seven studied), optimized for lightest weight and equal reliability for the CTH, using component proportioning via stress and stiffness equations, had no significant difference between their aircraft payloads. All optimized pts were approximately 70% of statistically predicted weight. Reliability increase is predicted via gearbox derating using Weibull relationships. Among advanced concepts, the Turbine Integrated Geared Rotor was competitive for weight, technology availability and reliability increase but handicapped by a special engine requirement. The warm cycle system was found not competitive. Helicopter parametric weight analysis is shown. Advanced development plans are presented for the pts for the CTH, including the total pts system, selected pts components, and scale model flight testing in a Kaman HH2 helicopter.
Development of new on-line statistical program for the Korean Society for Radiation Oncology
Song, Si Yeol; Ahn, Seung Do; Chung, Weon Kuu; Choi, Eun Kyung; Cho, Kwan Ho
2015-01-01
Purpose: To develop a new on-line statistical program for the Korean Society for Radiation Oncology (KOSRO) to collect and extract medical data in radiation oncology more efficiently. Materials and Methods: The statistical program is web-based. The directory was placed in a sub-folder of the KOSRO homepage, at the web address http://www.kosro.or.kr/asda. The server's operating system is Linux and the web server is the Apache HTTP server. MySQL is adopted as the database (DB) server and PHP as the scripting language. Each ID and password is controlled independently, and all screen pages for data input or analysis are designed to be user-friendly. Drop-down menus are used extensively for the convenience of the user and the consistency of data analysis. Results: Year of data is one of the top categories, and the main topics include human resources, equipment, clinical statistics, specialized treatment and research achievement. Each topic or category has several subcategorized topics. A real-time on-line report of the analysis is produced immediately after each data entry, and the administrator is able to monitor the status of data input for each hospital. Backups of the data as spreadsheets can be accessed by the administrator and used for academic work by any member of the KOSRO. Conclusion: The new on-line statistical program was developed to collect data from departments of radiation oncology nationwide. Its intuitive screens and consistent input structure are expected to encourage member hospitals to enter data, and the resulting annual statistics should be a cornerstone of advances in radiation oncology. PMID:26157684
Sonpavde, Guru; Pond, Gregory R.; Fougeray, Ronan; Choueiri, Toni K.; Qu, Angela Q.; Vaughn, David J.; Niegisch, Guenter; Albers, Peter; James, Nicholas D.; Wong, Yu-Ning; Ko, Yoo-Joung; Sridhar, Srikala S.; Galsky, Matthew D.; Petrylak, Daniel P.; Vaishampayan, Ulka N.; Khan, Awais; Vogelzang, Nicholas J.; Beer, Tomasz M.; Stadler, Walter M.; O’Donnell, Peter H.; Sternberg, Cora N.; Rosenberg, Jonathan E.; Bellmunt, Joaquim
2014-01-01
Background: Outcomes for patients in the second-line setting of advanced urothelial carcinoma (UC) are dismal. The recognized prognostic factors in this context are Eastern Cooperative Oncology Group (ECOG) performance status (PS) >0, hemoglobin level (Hb) <10 g/dl, and liver metastasis (LM). Objectives: The purpose of this retrospective study of prospective trials was to investigate the prognostic value of time from prior chemotherapy (TFPC) independent of known prognostic factors. Design, setting, and participants: Data from patients from seven prospective trials with available baseline TFPC, Hb, PS, and LM values were used for retrospective analysis (n = 570). External validation was conducted in a second-line phase 3 trial comparing best supportive care (BSC) versus vinflunine plus BSC (n = 352). Outcome measurements and statistical analysis: Cox proportional hazards regression was used to evaluate the association of factors, with overall survival (OS) and progression-free survival (PFS) being the respective primary and secondary outcome measures. Results and limitations: ECOG-PS >0, LM, Hb <10 g/dl, and shorter TFPC were significant prognostic factors for OS and PFS on multivariable analysis. Patients with zero, one, two, and three to four factors demonstrated median OS of 12.2, 6.7, 5.1, and 3.0 mo, respectively (concordance statistic = 0.638). Setting of prior chemotherapy (metastatic disease vs perioperative) and prior platinum agent (cisplatin or carboplatin) were not prognostic factors. External validation demonstrated a significant association of TFPC with PFS on univariable and most multivariable analyses, and with OS on univariable analyses. The usual limitations of retrospective analyses apply. Conclusions: Shorter TFPC enhances prognostic classification independent of ECOG-PS >0, Hb <10 g/dl, and LM in the setting of second-line therapy for advanced UC. These data may facilitate drug development and interpretation of trials. PMID:23206856
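As an illustration of the multivariable modelling described above, the hedged sketch below fits a Cox proportional hazards model with the lifelines Python package. The data frame, column names, coefficients and censoring rate are hypothetical stand-ins, not the trial data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
# Hypothetical patient-level data mirroring the prognostic factors above
df = pd.DataFrame({
    "ecog_gt0":    rng.integers(0, 2, n),
    "hb_lt10":     rng.integers(0, 2, n),
    "liver_mets":  rng.integers(0, 2, n),
    "tfpc_months": rng.uniform(1, 24, n),
})
# Simulated survival: worse factors shorten the exponential survival time
risk = (0.5 * df["ecog_gt0"] + 0.4 * df["hb_lt10"]
        + 0.5 * df["liver_mets"] - 0.04 * df["tfpc_months"])
df["os_months"] = rng.exponential(8.0 * np.exp(-risk))
df["event"] = rng.random(n) < 0.85   # roughly 15% censored

cph = CoxPHFitter()
cph.fit(df, duration_col="os_months", event_col="event")
cph.print_summary()   # hazard ratios with 95% CIs for each factor
```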
NASA Technical Reports Server (NTRS)
Simon, William E.; Li, Ku-Yen; Yaws, Carl L.; Mei, Harry T.; Nguyen, Vinh D.; Chu, Hsing-Wei
1994-01-01
A methyl acetate reactor was developed to perform a subscale kinetic investigation in the design and optimization of a full-scale metabolic simulator for long term testing of life support systems. Other tasks in support of the closed ecological life support system test program included: (1) heating, ventilation and air conditioning analysis of a variable pressure growth chamber, (2) experimental design for statistical analysis of plant crops, (3) resource recovery for closed life support systems, and (4) development of data acquisition software for automating an environmental growth chamber.
Ground-Based Telescope Parametric Cost Model
NASA Technical Reports Server (NTRS)
Stahl, H. Philip; Rowell, Ginger Holmes
2004-01-01
A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction-limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single-variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
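A single-variable model of the kind described, cost scaling as a power law in aperture diameter, can be fitted by ordinary least squares in log-log space. The sketch below uses made-up telescope data, not the study's cost records.

```python
import numpy as np

# Hypothetical (diameter [m], cost [$M]) pairs, for illustration only
d = np.array([1.0, 2.3, 3.5, 6.5, 8.1, 10.0])
cost = np.array([2.1, 14.0, 38.0, 180.0, 310.0, 520.0])

# Fit cost = a * d**b  <=>  log(cost) = log(a) + b * log(d)
b, log_a = np.polyfit(np.log(d), np.log(cost), 1)
a = np.exp(log_a)
print(f"cost ~ {a:.2f} * D^{b:.2f}")   # the exponent b is the scaling law
```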
Zhang, L L; Cao, F F; Wang, Y; Meng, F L; Zhang, Y; Zhong, D S; Zhou, Q H
2015-05-01
The application of newer signaling pathway-targeted agents has become an important addition to chemotherapy in the treatment of advanced non-small cell lung cancer (NSCLC). In this study, we systematically evaluated the efficacy and toxicities of PKC inhibitors combined with chemotherapy versus chemotherapy alone for patients with advanced NSCLC. Literature retrieval, trial selection and assessment, data collection, and statistical analysis were performed according to the Cochrane Handbook 5.1.0. The outcome measures were tumor response rate, disease control rate, progression-free survival (PFS), overall survival (OS), and adverse effects. Five randomized controlled trials, comprising a total of 1,005 patients, were included in this study. Meta-analysis showed a significantly decreased response rate (RR 0.79; 95 % CI 0.64-0.99) and disease control rate (RR 0.90; 95 % CI 0.82-0.99) in the PKC inhibitor-chemotherapy groups versus the chemotherapy groups. There was no significant difference between the two treatment groups regarding progression-free survival (HR 1.05; 95 % CI 0.91-1.22) or overall survival (HR 1.00; 95 % CI 0.86-1.16). The risk of grade 3/4 neutropenia, leucopenia, and thrombosis/embolism increased significantly in the PKC inhibitor combination groups as compared with the chemotherapy-alone groups. The addition of PKC inhibitors to chemotherapy is therefore not a valid alternative for patients with advanced NSCLC.
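Pooled relative risks of this kind typically come from a random-effects model. A minimal DerSimonian-Laird implementation is sketched below, with made-up per-trial counts rather than the five trials actually analysed.

```python
import numpy as np

def random_effects_rr(events_t, n_t, events_c, n_c):
    """DerSimonian-Laird pooled relative risk with a 95% CI."""
    rr = (events_t / n_t) / (events_c / n_c)
    y = np.log(rr)
    v = 1/events_t - 1/n_t + 1/events_c - 1/n_c      # variance of log RR
    w = 1 / v
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1 / (v + tau2)                          # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = 1 / np.sqrt(np.sum(w_star))
    return np.exp(pooled), np.exp(pooled - 1.96*se), np.exp(pooled + 1.96*se)

# Hypothetical responder counts (treatment vs control) in five trials
print(random_effects_rr(np.array([30, 25, 40, 22, 18]), np.array([100]*5),
                        np.array([38, 30, 45, 28, 25]), np.array([100]*5)))
```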
Statistical Tests of Reliability of NDE
NASA Technical Reports Server (NTRS)
Baaklini, George Y.; Klima, Stanley J.; Roth, Don J.; Kiser, James D.
1987-01-01
Capabilities of advanced material-testing techniques analyzed. Collection of four reports illustrates statistical method for characterizing flaw-detecting capabilities of sophisticated nondestructive evaluation (NDE). Method used to determine reliability of several state-of-the-art NDE techniques for detecting failure-causing flaws in advanced ceramic materials considered for use in automobiles, airplanes, and space vehicles.
The Advanced Statistical Trajectory Regional Air Pollution (ASTRAP) model simulates long-term transport and deposition of oxides of sulfur and nitrogen. It is a potential screening tool for assessing long-term effects on regional visibility from sulfur emission sources. However, a rigorou...
Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan
2016-01-01
The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
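To make the ANOVA/RSM step concrete, the sketch below fits a quadratic response surface to simulated two-factor test data with statsmodels. The factor names, response, and noise level are invented, and the D-optimal point selection itself is not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
# Simulated distributed test: one point at each of many (x1, x2) conditions
df = pd.DataFrame({"x1": rng.uniform(-1, 1, 40), "x2": rng.uniform(-1, 1, 40)})
df["y"] = (5 + 2*df.x1 - 3*df.x2 + 1.5*df.x1*df.x2 + 0.8*df.x2**2
           + rng.normal(0, 0.2, 40))

# Full quadratic response surface model (RSM)
rsm = smf.ols("y ~ x1 + x2 + I(x1**2) + I(x2**2) + x1:x2", data=df).fit()
print(rsm.summary())   # ANOVA-style t/F statistics identify the active terms
```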
NASA Astrophysics Data System (ADS)
Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen
2015-11-01
We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.
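Among the nonlinear time-series tools mentioned, the natural visibility graph is simple enough to sketch from scratch. This minimal version (following Lacasa et al.'s construction) uses networkx and is an illustration, not pyunicorn's own implementation.

```python
import networkx as nx
import numpy as np

def visibility_graph(y):
    """Natural visibility graph: nodes are time points; i and j are linked
    if the straight line between (i, y[i]) and (j, y[j]) passes above
    every intermediate sample."""
    g = nx.Graph()
    g.add_nodes_from(range(len(y)))
    for i in range(len(y)):
        for j in range(i + 1, len(y)):
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                g.add_edge(i, j)
    return g

g = visibility_graph(np.sin(np.linspace(0, 8 * np.pi, 200)))
print(nx.density(g))   # degree statistics help distinguish dynamical regimes
```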
Analysis of S-box in Image Encryption Using Root Mean Square Error Method
NASA Astrophysics Data System (ADS)
Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan
2012-07-01
The use of substitution boxes (S-boxes) in encryption applications has proven to be an effective nonlinear component in creating confusion and randomness. The S-box is evolving and many variants appear in the literature, including the advanced encryption standard (AES) S-box, affine power affine (APA) S-box, Skipjack S-box, Gray S-box, Lui J S-box, residue prime number S-box, Xyi S-box, and S8 S-box. These S-boxes have algebraic and statistical properties which distinguish them from each other in terms of encryption strength. In some circumstances, the parameters from algebraic and statistical analysis yield results which do not provide clear evidence for distinguishing an S-box for application to a particular set of data. In image encryption applications, the use of S-boxes needs special care because the visual analysis and perception of a viewer can sometimes identify artifacts embedded in the image. In addition to the existing algebraic and statistical analyses already used for image encryption applications, we propose an application of the root mean square error technique, which further elaborates the results and enables the analyst to vividly distinguish between the performances of various S-boxes. While the use of root mean square error analysis in statistics has proven effective in determining the difference between original and processed data, its use in image encryption has shown promising results in estimating the strength of the encryption method. In this paper, we show the application of root mean square error analysis to S-box image encryption. The parameters from this analysis are used in determining the strength of S-boxes.
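The RMSE criterion itself is elementary. A sketch for comparing a plaintext image with its S-box-encrypted version follows; the image arrays are random placeholders, not data from the paper.

```python
import numpy as np

def rmse(original, encrypted):
    """Root mean square error between two equally sized 8-bit images;
    larger values indicate the ciphertext deviates more from the plaintext."""
    a = np.asarray(original, dtype=np.float64)
    b = np.asarray(encrypted, dtype=np.float64)
    return np.sqrt(np.mean((a - b) ** 2))

rng = np.random.default_rng(4)
plain = rng.integers(0, 256, (64, 64))    # stand-in plaintext image
cipher = rng.integers(0, 256, (64, 64))   # stand-in encrypted image
print(rmse(plain, cipher))
```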
NASA Astrophysics Data System (ADS)
Noda, Isao
2014-07-01
A comprehensive survey review of new and noteworthy developments, which are advancing forward the frontiers in the field of 2D correlation spectroscopy during the last four years, is compiled. This review covers books, proceedings, and review articles published on 2D correlation spectroscopy, a number of significant conceptual developments in the field, data pretreatment methods and other pertinent topics, as well as patent and publication trends and citation activities. Developments discussed include projection 2D correlation analysis, concatenated 2D correlation, and correlation under multiple perturbation effects, as well as orthogonal sample design, predicting 2D correlation spectra, manipulating and comparing 2D spectra, correlation strategy based on segmented data blocks, such as moving-window analysis, features like determination of sequential order and enhanced spectral resolution, statistical 2D spectroscopy using covariance and other statistical metrics, hetero-correlation analysis, and sample-sample correlation technique. Data pretreatment operations prior to 2D correlation analysis are discussed, including the correction for physical effects, background and baseline subtraction, selection of reference spectrum, normalization and scaling of data, derivatives spectra and deconvolution technique, and smoothing and noise reduction. Other pertinent topics include chemometrics and statistical considerations, peak position shift phenomena, variable sampling increments, computation and software, display schemes, such as color coded format, slice and power spectra, tabulation, and other schemes.
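For readers unfamiliar with the formalism, synchronous and asynchronous 2D correlation spectra can be computed in a few lines. The sketch below implements the standard Noda equations, including the Hilbert-Noda transformation matrix, on a generic perturbation-by-wavenumber matrix; the input data are random placeholders.

```python
import numpy as np

def two_d_correlation(spectra):
    """spectra: (m perturbations, n wavenumbers). Returns the synchronous
    and asynchronous 2D correlation maps in Noda's formalism."""
    dyn = spectra - spectra.mean(axis=0)            # dynamic spectra
    m = dyn.shape[0]
    sync = dyn.T @ dyn / (m - 1)
    # Hilbert-Noda transformation matrix: N[j, k] = 1 / (pi * (k - j)), j != k
    idx = np.arange(m)
    denom = idx[None, :] - idx[:, None]
    with np.errstate(divide="ignore"):
        noda = np.where(denom == 0, 0.0, 1.0 / (np.pi * denom))
    asyn = dyn.T @ (noda @ dyn) / (m - 1)
    return sync, asyn

rng = np.random.default_rng(1)
sync, asyn = two_d_correlation(rng.normal(size=(15, 200)))
print(sync.shape, asyn.shape)   # both (n, n) correlation maps
```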
Role of metabolomics in TBI research
Wolahan, Stephanie M.; Hirt, Daniel; Braas, Daniel; Glenn, Thomas C.
2016-01-01
Synopsis Metabolomics is an important member of the omics community in that it defines which small molecules may be responsible for disease states. This article reviews the essential principles of metabolomics from specimen preparation, chemical analysis, and advanced statistical methods. Metabolomics in TBI has so far been underutilized. Future metabolomics based studies focused on the diagnoses, prognoses, and treatment effects, need to be conducted across all types of TBI. PMID:27637396
ERIC Educational Resources Information Center
Current Population Reports, 1986
1986-01-01
Analysis of information gained from the March 1986 Current Population Survey (CPS) conducted by the Bureau of the Census shows the following results for the year 1985: (1) median family money income continued to move ahead of inflation; (2) the median earnings of men showed no statistically significant change from 1984, but the earnings of women…
NASA Astrophysics Data System (ADS)
Salman, Ahmad; Lapidot, Itshak; Pomerantz, Ami; Tsror, Leah; Shufan, Elad; Moreh, Raymond; Mordechai, Shaul; Huleihel, Mahmoud
2012-01-01
The early diagnosis of phytopathogens is of great importance; it could avert large economic losses due to crops damaged by fungal diseases, and prevent unnecessary soil fumigation or the use of fungicides and bactericides, thereby avoiding considerable environmental pollution. In this study, 18 isolates of three different fungal genera were investigated: six isolates of Colletotrichum coccodes, six isolates of Verticillium dahliae and six isolates of Fusarium oxysporum. Our main goal was to differentiate these fungal samples at the level of isolates, based on their infrared absorption spectra obtained using the Fourier transform infrared-attenuated total reflection (FTIR-ATR) sampling technique. Advanced statistical and mathematical methods, namely principal component analysis (PCA), linear discriminant analysis (LDA), and k-means, were applied to the spectra after preprocessing. Our results showed significant spectral differences between the various fungal genera examined. The use of k-means enabled classification between the genera with 94.5% accuracy, whereas the use of PCA [3 principal components (PCs)] and LDA achieved a 99.7% success rate. However, at the level of isolates, the best differentiation results were obtained using PCA (9 PCs) and LDA for the lower wavenumber region (800-1775 cm-1), with identification success rates of 87%, 85.5%, and 94.5% for Colletotrichum, Fusarium, and Verticillium strains, respectively.
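The PCA-then-LDA classification pipeline reported above is straightforward to reproduce with scikit-learn. The sketch below runs on random placeholder spectra; the 9-PC setting is taken from the abstract, while the sample counts and labels are invented.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(90, 500))   # placeholder: 90 spectra x 500 wavenumbers
y = np.repeat([0, 1, 2], 30)     # three genera (labels are illustrative)

# Dimensionality reduction followed by a linear discriminant classifier
clf = make_pipeline(PCA(n_components=9), LinearDiscriminantAnalysis())
print(cross_val_score(clf, X, y, cv=5).mean())   # chance level on random data
```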
Do alterations in follicular fluid proteases contribute to human infertility?
Cookingham, Lisa Marii; Van Voorhis, Bradley J; Ascoli, Mario
2015-05-01
Cathepsin L and ADAMTS-1 are known to play critical roles in follicular rupture, ovulation, and fertility in mice. Similar studies in humans are limited; however, both are known to increase during the periovulatory period. No studies have examined either protease in the follicular fluid of women with unexplained infertility or infertility related to advanced maternal age (AMA). We sought to determine if alterations in cathepsin L and/or ADAMTS-1 existed in these infertile populations. Patients undergoing in vitro fertilization (IVF) for unexplained infertility or AMA-related infertility were prospectively recruited for the study; patients with tubal or male factor infertility were recruited as controls. Follicular fluid was collected to determine gene expression (via quantitative polymerase chain reaction), enzyme concentrations (via enzyme-linked immunosorbent assays), and enzymatic activities (via fluorogenic enzyme cleavage assay or Western blot analysis) of cathepsin L and ADAMTS-1. The analysis included a total of 42 patients (14 per group). We found no statistically significant difference in gene expression, enzyme concentration, or enzymatic activity of cathepsin L or ADAMTS-1 in unexplained infertility or AMA-related infertility as compared to controls. We also found no statistically significant difference in expression or concentration with advancing age. Cathepsin L and ADAMTS-1 are not altered in women with unexplained infertility or AMA-related infertility undergoing IVF, and they do not decline with advancing age. It is possible that differences exist in natural cycles, contributing to infertility; however, our findings do not support a role for protease alterations as a common cause of infertility.
Prevalence and features of colorectal lesions among Hispanics: A hospital-based study.
Ashktorab, Hassan; Laiyemo, Adeyinka O; Lee, Edward; Cruz-Correa, Marcia; Ghuman, Amita; Nouraie, Mehdi; Brim, Hassan
2015-12-14
To evaluate the prevalence and characteristics of colorectal adenoma and carcinoma in an inner-city Hispanic population, we reviewed the reports of 1628 Hispanic patients who underwent colonoscopy at Howard University from 2000 to 2010. Advanced adenoma was defined as an adenoma ≥ 1 cm in size, or an adenoma with villous histology, high-grade dysplasia and/or invasive cancer. Statistical analysis was performed using χ² tests and t-tests. The median age of the patients was 54 years, and 64.2% were female. Polyps were observed in 489 (30.0%) of patients. Adenoma prevalence was 16.8% (n = 273), advanced adenoma 2.4% (n = 39), and colorectal cancer 0.4% (n = 7). Hyperplastic polyps were seen in 6.6% of the cohort (n = 107). Adenomas predominantly exhibited a proximal colonic distribution (53.7%, n = 144), while hyperplastic polyps were mostly located in the distal colon (70%, n = 75). Among the 11.7% of patients (n = 191) who underwent screening colonoscopy, the prevalence of colorectal lesions was 21.4% adenoma, 2.6% advanced adenoma, and 8.3% hyperplastic polyps. Our data showed a low colorectal cancer prevalence among Hispanics in the Washington DC area. However, the pre-neoplastic pattern of colonic lesions in Hispanics likely points toward a shift in this population that needs to be monitored closely through large epidemiological studies.
Hogstrom, L. J.; Guo, S. M.; Murugadoss, K.; Bathe, M.
2016-01-01
Brain function emerges from hierarchical neuronal structure that spans orders of magnitude in length scale, from the nanometre-scale organization of synaptic proteins to the macroscopic wiring of neuronal circuits. Because the synaptic electrochemical signal transmission that drives brain function ultimately relies on the organization of neuronal circuits, understanding brain function requires an understanding of the principles that determine hierarchical neuronal structure in living or intact organisms. Recent advances in fluorescence imaging now enable quantitative characterization of neuronal structure across length scales, ranging from single-molecule localization using super-resolution imaging to whole-brain imaging using light-sheet microscopy on cleared samples. These tools, together with correlative electron microscopy and magnetic resonance imaging at the nanoscopic and macroscopic scales, respectively, now facilitate our ability to probe brain structure across its full range of length scales with cellular and molecular specificity. As these imaging datasets become increasingly accessible to researchers, novel statistical and computational frameworks will play an increasing role in efforts to relate hierarchical brain structure to its function. In this perspective, we discuss several prominent experimental advances that are ushering in a new era of quantitative fluorescence-based imaging in neuroscience along with novel computational and statistical strategies that are helping to distil our understanding of complex brain structure. PMID:26855758
Statistical tools for transgene copy number estimation based on real-time PCR.
Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal
2007-11-01
As compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable estimate of copy number with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR based transgene copy number determination. Three experimental designs and four data quality control integrated statistical models are presented. In the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. In the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data, based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods allow real-time PCR-based transgene copy number estimation to be more reliable and precise, and proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
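The first design described, an external calibration curve with simple linear regression, can be sketched as follows. The Ct values and dilution series below are hypothetical numbers chosen for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical calibration: Ct values measured on serial template dilutions
log10_copies = np.array([7.0, 6.0, 5.0, 4.0, 3.0])
ct = np.array([14.8, 18.2, 21.5, 24.9, 28.3])

# Inverse calibration: regress log10(copies) on Ct
fit = stats.linregress(ct, log10_copies)

def copies_from_ct(ct_obs):
    """Estimate template copies from an observed Ct via the standard curve."""
    return 10 ** (fit.intercept + fit.slope * ct_obs)

# A prediction interval could be added from the residual standard error
print(copies_from_ct(20.0))
```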
Szczesniak, Rhonda; Heltshe, Sonya L.; Stanojevic, Sanja; Mayer-Hamblett, Nicole
2017-01-01
Background: Forced expiratory volume in 1 second (FEV1) is an established marker of cystic fibrosis (CF) disease progression that is used to capture clinical course and evaluate therapeutic efficacy. The research community has established FEV1 surveillance data through a variety of observational data sources such as patient registries, and there is a growing pipeline of new CF therapies demonstrated to be efficacious in clinical trials by establishing improvements in FEV1. Results: In this review, we summarize from a statistical perspective the clinical relevance of FEV1 based on its association with morbidity and mortality in CF, its role in epidemiologic studies of disease progression and comparative effectiveness, and its utility in clinical trials. In addition, we identify opportunities to advance epidemiologic research and the clinical development pipeline through further statistical considerations. Conclusions: Our understanding of CF disease course, therapeutics, and clinical care has evolved immensely in the past decades, in large part due to the thoughtful application of rigorous research methods and meaningful clinical endpoints such as FEV1. A continued commitment to conduct research that minimizes the potential for bias, maximizes the limited patient population, and harmonizes approaches to FEV1 analysis while maintaining clinical relevance will facilitate further opportunities to advance CF care. PMID:28117136
Canadian Eskimo permanent tooth emergence timing.
Mayhall, J T; Belier, P L; Mayhall, M F
1978-08-01
To identify the times of emergence of the permanent teeth of Canadian Eskimos (Inuit), 368 children and adolescents were examined. The presence or absence of all permanent teeth except the third molars was recorded, and these data were subjected to probit analysis. Female emergence times were advanced over males. Generally, the Inuit of both sexes showed statistically significantly earlier emergence times than Montreal children, except for the incisors. The present results do not support hypotheses indicating that premature extraction of the deciduous teeth advances the emergence of their succedaneous counterparts. There is some indication that the controls of deciduous tooth emergence continue to play a part in the emergence of the permanent dentition, especially for the first permanent teeth to emerge.
Computer-assisted detection of epileptiform focuses on SPECT images
NASA Astrophysics Data System (ADS)
Grzegorczyk, Dawid; Dunin-Wąsowicz, Dorota; Mulawka, Jan J.
2010-09-01
Epilepsy is a common nervous system disease, often related to consciousness disturbances and muscular spasms, which affects about 1% of the human population. Despite major technological advances in medicine in recent years, there has been no sufficient progress towards overcoming it. The application of advanced statistical methods and computer image analysis offers hope for accurate detection, and later removal, of the epileptiform focuses that are the cause of some types of epilepsy. The aim of this work was to create a computer system that would help to find and diagnose disorders of blood circulation in the brain. This may be helpful for the diagnosis of epileptic seizure onset in the brain.
Lipid membranes and single ion channel recording for the advanced physics laboratory
NASA Astrophysics Data System (ADS)
Klapper, Yvonne; Nienhaus, Karin; Röcker, Carlheinz; Ulrich Nienhaus, G.
2014-05-01
We present an easy-to-handle, low-cost, and reliable setup to study various physical phenomena on a nanometer-thin lipid bilayer using the so-called black lipid membrane technique. The apparatus allows us to precisely measure optical and electrical properties of free-standing lipid membranes, to study the formation of single ion channels, and to gain detailed information on the ion conduction properties of these channels using statistical physics and autocorrelation analysis. The experiments are well suited as part of an advanced physics or biophysics laboratory course; they interconnect physics, chemistry, and biology and will be appealing to students of the natural sciences who are interested in quantitative experimentation.
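The autocorrelation analysis mentioned for single-channel currents reduces to a short computation. Below is a hedged sketch, with a synthetic two-state telegraph signal standing in for real channel recordings.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Normalized autocorrelation; its decay time reflects the mean
    open/closed dwell times of the channel."""
    x = np.asarray(x, float) - np.mean(x)
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                     for k in range(max_lag + 1)])

rng = np.random.default_rng(6)
# Synthetic telegraph signal: rare random open/closed switching plus noise
state = np.cumsum(rng.random(5000) < 0.01) % 2
trace = state * 2.0 + rng.normal(0, 0.2, 5000)   # illustrative current scale
print(autocorrelation(trace, 5)[:3])
```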
Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory
NASA Astrophysics Data System (ADS)
Veal, William R.; Taylor, Dawne; Rogers, Amy L.
2009-03-01
Self-reflection is a tool of instruction that has been used in the science classroom, and research has shown great promise in using video as a learning tool. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills had not previously been attempted. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data showed that self-reflection significantly helped students develop basic and advanced process skills, yet did not seem to influence their general understanding of the science content.
ERIC Educational Resources Information Center
Hassan, Mahamood M.; Schwartz, Bill N.
2014-01-01
This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…
ERIC Educational Resources Information Center
McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.
2010-01-01
This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…
Magalhães, Eunice; Calheiros, María M
2015-01-01
Despite significant scientific advances in the place attachment literature, no instruments exist that were specifically developed for, or adapted to, residential care. In this study, 410 adolescents (11-18 years old) participated. The place attachment scale evaluates five dimensions: place identity, place dependence, institutional bonding, caregiver bonding and friend bonding. Data analysis included descriptive statistics, content validity, construct validity (confirmatory factor analysis), concurrent validity through correlations with satisfaction with life and with the institution, and reliability evidence. The relationship with individual characteristics and placement length was also examined. Content validity analysis revealed that more than half of the panellists perceived all the items as relevant for assessing the construct in residential care. The five-dimension structure showed good fit statistics, and concurrent validity evidence was found, with significant correlations with satisfaction with life and with the institution. Acceptable internal consistency values and specific gender differences were found. The preliminary psychometric properties of this scale suggest its potential for use with youth in care.
Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework
NASA Astrophysics Data System (ADS)
Gannon, C.
2017-12-01
As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
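A minimal sketch of the workflow described, reading chunked netCDF files with xarray and letting Dask distribute the computation, is shown below. The file pattern, variable name, coordinates, and threshold are assumptions for illustration, not the consultancy's actual pipeline.

```python
import xarray as xr

# Hypothetical paths to downscaled daily temperature projections (netCDF)
ds = xr.open_mfdataset("tasmax_*.nc", chunks={"time": 365},
                       combine="by_coords")

# Example custom risk metric: annual count of days above 35 degC at one site
site = ds["tasmax"].sel(lat=37.77, lon=-122.42, method="nearest")
hot_days = (site > 35.0).groupby("time.year").sum()
hot_days = hot_days.compute()   # triggers the lazy, distributed Dask work
print(hot_days)
```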
A shift from significance test to hypothesis test through power analysis in medical research.
Singh, G
2006-01-01
Until recently, the medical research literature exhibited a substantial dominance of Fisher's significance-test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis test, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant with a P value, whereas the Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches address the same objective and reach conclusions in their own ways. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in the light of the power of the test as well. The significance-test approach, when it incorporates power analysis, contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis test procedure.
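As a concrete example of the power calculations now routine in medical research, statsmodels can solve for the per-group sample size of a two-sample t-test. The effect size and error rates below are conventional illustrative choices, not values from the article.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# Sample size per group to detect a medium effect (Cohen's d = 0.5)
# at alpha = 0.05 with 80% power
n = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(round(n))   # roughly 64 per group
```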
Rodríguez-Entrena, Macario; Schuberth, Florian; Gelhard, Carsten
2018-01-01
Structural equation modeling using partial least squares (PLS-SEM) has become a mainstream modeling approach in various disciplines. Nevertheless, the prior literature still lacks practical guidance on how to properly test for differences between parameter estimates. Whereas existing techniques such as the parametric and non-parametric approaches in PLS multi-group analysis only allow assessing differences between parameters estimated for different subpopulations, the study at hand introduces a technique that also allows assessing whether two parameter estimates derived from the same sample are statistically different. To illustrate this advancement to PLS-SEM, we refer to a reduced version of the well-established technology acceptance model.
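The flavour of such a same-sample comparison can be conveyed with a generic percentile-bootstrap difference test. This sketch is a simplified stand-in, not the authors' PLS-SEM-specific procedure; the data and the two statistics being compared are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

def bootstrap_diff_ci(data, estimator_a, estimator_b, n_boot=5000, alpha=0.05):
    """Percentile bootstrap CI for the difference between two statistics
    computed on the *same* sample; a CI excluding zero suggests a difference."""
    n = len(data)
    diffs = [estimator_a(s) - estimator_b(s)
             for s in (data[rng.integers(0, n, n)] for _ in range(n_boot))]
    return tuple(np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)]))

x = rng.normal(1.0, 1.0, 300)
print(bootstrap_diff_ci(x, np.mean, np.median))
```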
NASA Technical Reports Server (NTRS)
Slutz, R. J.; Gray, T. B.; West, M. L.; Stewart, F. G.; Leftin, M.
1971-01-01
A statistical study of formulas for predicting the sunspot number several years in advance is reported. By using a data lineup with cycle maxima coinciding, and by using multiple and nonlinear predictors, a new formula is obtained which gives better error estimates than former formulas derived from the work of McNish and Lincoln. A statistical analysis is conducted to determine which of several mathematical expressions best describes the relationship between 10.7 cm solar flux and Zurich sunspot numbers. Attention is given to the autocorrelation of the observations, and confidence intervals for the derived relationships are presented. The accuracy of predicting a value of 10.7 cm solar flux from a predicted sunspot number is discussed.
NASA DOE POD NDE Capabilities Data Book
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
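Wald's sequential test, the foundation on which DOEPOD builds, can be sketched for a stream of binary hit/miss inspection outcomes. The values of p0, p1, and the error rates below are illustrative, not DOEPOD's actual settings.

```python
import numpy as np

def sprt_binomial(hits, p0=0.90, p1=0.98, alpha=0.05, beta=0.05):
    """Wald sequential probability ratio test on hit/miss outcomes;
    H0: POD = p0 vs H1: POD = p1 (illustrative hypotheses)."""
    upper = np.log((1 - beta) / alpha)     # crossing this accepts H1
    lower = np.log(beta / (1 - alpha))     # crossing this accepts H0
    llr = 0.0
    for n, hit in enumerate(hits, start=1):
        llr += np.log(p1 / p0) if hit else np.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return n, "accept H1 (POD demonstrated)"
        if llr <= lower:
            return n, "accept H0 (POD not demonstrated)"
    return len(hits), "continue testing"

rng = np.random.default_rng(8)
print(sprt_binomial(rng.random(200) < 0.98))   # sample size is data-driven
```

Note how the stopping point is not fixed in advance, which is exactly the property of sequential analysis the abstract highlights.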
Yang, Hua; Xia, Bing-Qing; Jiang, Bo; Wang, Guozhen; Yang, Yi-Peng; Chen, Hao; Li, Bing-Sheng; Xu, An-Gao; Huang, Yun-Bo; Wang, Xin-Ying
2013-08-01
The diagnostic value of stool DNA (sDNA) testing for colorectal neoplasms remains controversial. To compensate for the lack of large-scale unbiased population studies, a meta-analysis was performed to evaluate the diagnostic value of sDNA testing for multiple markers of colorectal cancer (CRC) and advanced adenoma. The PubMed, Science Direct, Biosis Review, Cochrane Library and Embase databases were systematically searched in January 2012 without time restriction. Meta-analysis was performed using a random-effects model using sensitivity, specificity, diagnostic OR (DOR), summary ROC curves, area under the curve (AUC), and 95% CIs as effect measures. Heterogeneity was measured using the χ² test and Q statistic; subgroup analysis was also conducted. A total of 20 studies comprising 5876 individuals were eligible. There was no heterogeneity for CRC, but adenoma and advanced adenoma harboured considerable heterogeneity influenced by risk classification and various detection markers. Stratification analysis according to risk classification showed that multiple markers had a high DOR for the high-risk subgroups of both CRC (sensitivity 0.759 [95% CI 0.711 to 0.804]; specificity 0.883 [95% CI 0.846 to 0.913]; AUC 0.906) and advanced adenoma (sensitivity 0.683 [95% CI 0.584 to 0.771]; specificity 0.918 [95% CI 0.866 to 0.954]; AUC 0.946) but not for the average-risk subgroups of either. In the methylation subgroup, sDNA testing had a significantly higher DOR for CRC (sensitivity 0.753 [95% CI 0.685 to 0.812]; specificity 0.913 [95% CI 0.860 to 0.950]; AUC 0.918) and advanced adenoma (sensitivity 0.623 [95% CI 0.527 to 0.712]; specificity 0.926 [95% CI 0.882 to 0.958]; AUC 0.910) compared with the mutation subgroup. There was no significant heterogeneity among studies in the subgroup analysis. sDNA testing for multiple markers had strong diagnostic significance for CRC and advanced adenoma in high-risk subjects. Methylation markers had more diagnostic value than mutation markers.
Chemical information obtained from Auger depth profiles by means of advanced factor analysis (MLCFA)
NASA Astrophysics Data System (ADS)
De Volder, P.; Hoogewijs, R.; De Gryse, R.; Fiermans, L.; Vennik, J.
1993-01-01
The advanced multivariate statistical technique of maximum likelihood common factor analysis (MLCFA) is shown to be superior to principal component analysis (PCA) for decomposing overlapping peaks into their individual component spectra when neither the number of components nor the peak shapes of the component spectra are known. An examination of the maximum resolving power of both techniques by means of artificially created series of multicomponent spectra confirms this finding unambiguously. Substantial progress in the use of AES as a chemical-analysis technique is accomplished through the implementation of MLCFA. Chemical information from Auger depth profiles is extracted by investigating the variation of the line shape of the Auger signal as a function of the changing chemical state of the element. In particular, MLCFA combined with Auger depth profiling has been applied to problems related to steelcord-rubber tyre adhesion. MLCFA allows one to elucidate the precise nature of the interfacial layer of reaction products between natural rubber vulcanized on a thin brass layer. This study reveals many interesting chemical aspects of the oxi-sulfidation of brass undetectable with classical AES.
NASA Astrophysics Data System (ADS)
Mukherjee, S.; Salazar, L.; Mittelstaedt, J.; Valdez, O.
2017-11-01
Supernovae in our universe are potential sources of gravitational waves (GW) that could be detected in a network of GW detectors like LIGO and Virgo. Core-collapse supernovae are rare, but the associated gravitational radiation is likely to carry profuse information about the underlying processes driving the supernovae. Calculations based on analytic models predict GW energies within the detection range of the Advanced LIGO detectors, out to tens of Mpc for certain types of signals, e.g., coalescing binary neutron stars. For supernovae, however, the corresponding distances are much smaller. Thus, methods that can improve the sensitivity of searches for GW signals from supernovae are desirable, especially in the advanced detector era. Several methods have been proposed, based on various likelihood-based regulators, that work on data from a network of detectors to detect burst-like signals (as is the case for signals from supernovae) from potential GW sources. To address this problem, we have developed an analysis pipeline based on a method of noise reduction known as the harmonic regeneration noise reduction (HRNR) algorithm. To demonstrate the method, sixteen supernova waveforms from the Murphy et al. 2009 catalog have been used in the presence of LIGO science data. A comparative analysis is presented to show detection statistics for a standard network analysis as commonly used in GW pipelines and for the same analysis implementing the new method in conjunction with the network. The results show significant improvement in detection statistics.
A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime
Fitterer, Jessica L.; Nelson, Trisalyn A.
2015-01-01
Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data of which 48 used cross-sectional datasets. Regression was the prominent modelling choice (n = 78) though dependent on data many variations existed. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016
Waltman, Ludo; van Raan, Anthony F. J.; Smart, Sue
2014-01-01
We investigate the extent to which advances in the health and life sciences (HLS) are dependent on research in the engineering and physical sciences (EPS), particularly physics, chemistry, mathematics, and engineering. The analysis combines two different bibliometric approaches. The first approach to analyze the ‘EPS-HLS interface’ is based on term map visualizations of HLS research fields. We consider 16 clinical fields and five life science fields. On the basis of expert judgment, EPS research in these fields is studied by identifying EPS-related terms in the term maps. In the second approach, a large-scale citation-based network analysis is applied to publications from all fields of science. We work with about 22,000 clusters of publications, each representing a topic in the scientific literature. Citation relations are used to identify topics at the EPS-HLS interface. The two approaches complement each other. The advantages of working with textual data compensate for the limitations of working with citation relations and the other way around. An important advantage of working with textual data is in the in-depth qualitative insights it provides. Working with citation relations, on the other hand, yields many relevant quantitative statistics. We find that EPS research contributes to HLS developments mainly in the following five ways: new materials and their properties; chemical methods for analysis and molecular synthesis; imaging of parts of the body as well as of biomaterial surfaces; medical engineering mainly related to imaging, radiation therapy, signal processing technology, and other medical instrumentation; mathematical and statistical methods for data analysis. In our analysis, about 10% of all EPS and HLS publications are classified as being at the EPS-HLS interface. This percentage has remained more or less constant during the past decade. PMID:25360616
Statistical analysis of target acquisition sensor modeling experiments
NASA Astrophysics Data System (ADS)
Deaver, Dawne M.; Moyer, Steve
2015-05-01
The U.S. Army RDECOM CERDEC NVESD Modeling and Simulation Division is charged with the development and advancement of military target acquisition models to estimate expected soldier performance when using all types of imaging sensors. Two elements of sensor modeling are (1) laboratory-based psychophysical experiments used to measure task performance and calibrate the various models and (2) field-based experiments used to verify the model estimates for specific sensors. In both types of experiments, it is common practice to control or measure environmental, sensor, and target physical parameters in order to minimize uncertainty of the physics based modeling. Predicting the minimum number of test subjects required to calibrate or validate the model should be, but is not always, done during test planning. The objective of this analysis is to develop guidelines for test planners which recommend the number and types of test samples required to yield a statistically significant result.
New insights into old methods for identifying causal rare variants.
Wang, Haitian; Huang, Chien-Hsun; Lo, Shaw-Hwa; Zheng, Tian; Hu, Inchi
2011-11-29
The advance of high-throughput next-generation sequencing technology makes possible the analysis of rare variants. However, the investigation of rare variants in unrelated-individuals data sets faces the challenge of low power, and most methods circumvent the difficulty by using various collapsing procedures based on genes, pathways, or gene clusters. We suggest a new way to identify causal rare variants using the F-statistic and sliced inverse regression. The procedure is tested on the data set provided by the Genetic Analysis Workshop 17 (GAW17). After preliminary data reduction, we ranked markers according to their F-statistic values. Top-ranked markers were then subjected to sliced inverse regression, and those with higher absolute coefficients in the most significant sliced inverse regression direction were selected. The procedure yields good false discovery rates for the GAW17 data and thus is a promising method for future study on rare variants.
24 CFR 266.420 - Closing and endorsement by the Commissioner.
Code of Federal Regulations, 2010 CFR
2010-04-01
.... (a) Closing. Before disbursement of loan advances in periodic advances cases, and in all cases after... market occupancy percentages, value/replacement cost, interest rate, and similar statistical information... certification for periodic advances cases, if submitted for final endorsement, that advances were made...
Schmitt, M; Groß, K; Grub, J; Heib, F
2015-06-01
Contact angle determination by the sessile drop technique is essential for characterizing surface properties in science and industry. Different specific angles, correlated with the advancing or receding of the triple line, can be observed on every solid. Various procedures and definitions for determining these specific angles exist, but they are often neither comprehensible nor reproducible. One of the most important tasks in this area is therefore to establish standard, reproducible and valid methods for determining advancing/receding contact angles. This contribution introduces novel techniques for the detailed analysis of dynamic contact angle measurements (sessile drop) that are applicable to axisymmetric and non-axisymmetric drops. We explain in detail not only the recently presented fit solution based on a sigmoid function and the independent analysis of the different parameters (inclination, contact angle, velocity of the triple point), but also, for the first time, the dependent analysis. These approaches yield contact angle data, and different routes to specific contact angles, that are independent of the skills and subjectivity of the operator. As an example, the motion behaviour of droplets on flat silicon-oxide surfaces after different surface treatments is measured dynamically by the sessile drop technique while inclining the sample plate. The triple points, inclination angles, downhill (advancing motion) and uphill (receding motion) angles obtained by high-precision drop shape analysis are statistically analysed both independently and dependently. Because of the small distance covered in the dependent analysis (<0.4 mm) and the dominance of counted events with small velocity, the measurements are little influenced by motion dynamics, and the procedure can be called a "slow moving" analysis. The presented procedures are especially sensitive to the range from static to "slow moving" dynamic contact angle determination and are characterized by small deviations of the computed values. In addition to the detailed introduction of these novel analytical approaches and the fit solution, we present specific motion relations for drops on inclined surfaces and detailed observations on the reactivity of freshly cleaned silicon wafer surfaces, which results in acceleration behaviour (reactive de-wetting). Copyright © 2014 Elsevier Inc. All rights reserved.
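A minimal sketch of the sigmoid-fit idea, assuming synthetic angle-versus-inclination data; it is not the authors' implementation, and the plateau values, transition point, and noise level are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, theta_min, theta_max, x0, k):
    """Logistic transition between two plateau contact angles."""
    return theta_min + (theta_max - theta_min) / (1.0 + np.exp(-k * (x - x0)))

# Synthetic downhill (advancing) contact angle vs. plate inclination, degrees.
alpha = np.linspace(0, 30, 120)
theta = sigmoid(alpha, 62, 78, 12, 0.6) \
        + np.random.default_rng(1).normal(0, 0.4, alpha.size)

popt, pcov = curve_fit(sigmoid, alpha, theta, p0=[60, 80, 10, 0.5])
theta_adv = popt[1]  # plateau of the advancing branch
print(f"advancing contact angle ~ {theta_adv:.1f} deg, "
      f"1-sigma ~ {np.sqrt(pcov[1, 1]):.2f} deg")
```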
Comment on "Ducklings imprint on the relational concept of 'same or different'".
Hupé, Jean-Michel
2017-02-24
Martinho and Kacelnik's (Reports, 15 July 2016, p. 286) finding that mallard ducklings can deal with abstract concepts is important for understanding the evolution of cognition. However, a statistically more robust analysis of the data calls their conclusions into question. This example brings to light the risk of drawing too strong an inference by relying solely on P values. Copyright © 2017, American Association for the Advancement of Science.
Computer-assisted qualitative data analysis software.
Cope, Diane G
2014-05-01
Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software, which eases the management of large data sets and speeds the analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Chun-Chieh; Department of Medical Imaging and Radiological Science, Chang Gung University, School of Medicine, Taoyuan, Taiwan; Lai, Chyong-Huey
Purpose: To study the prognostic value of human papillomavirus (HPV) genotypes in patients with advanced cervical cancer treated with radiation therapy (RT) alone or concurrent chemoradiation therapy (CCRT). Methods and Materials: Between August 1993 and May 2000, 327 patients with advanced squamous cell carcinoma of the cervix (International Federation of Gynecology and Obstetrics stage III/IVA or stage IIB with positive lymph nodes) were eligible for this study. HPV genotypes were determined using the Easychip® HPV genechip. Outcomes were analyzed using Kaplan-Meier survival analysis and the Cox proportional hazards model. Results: We detected 22 HPV genotypes in 323 (98.8%) patients. The leading 4 types were HPV16, 58, 18, and 33. The 5-year overall and disease-specific survival estimates for the entire cohort were 41.9% and 51.4%, respectively. CCRT improved the 5-year disease-specific survival by an absolute 9.8%, but this was not statistically significant (P=.089). There was a significant improvement in disease-specific survival in the CCRT group for HPV18-positive (60.9% vs 30.4%, P=.019) and HPV58-positive (69.3% vs 48.9%, P=.026) patients compared with the RT alone group. In contrast, the differences in survival with CCRT compared with RT alone in the HPV16-positive and HPV33-positive subgroups were not statistically significant (P=.86 and P=.53, respectively). An improved disease-specific survival was observed for CCRT-treated patients infected with both HPV16 and HPV18, but these differences also were not statistically significant. Conclusions: The HPV genotype may be a useful predictive factor for the effect of CCRT in patients with advanced squamous cell carcinoma of the cervix. Verifying these results in prospective trials could have an impact on tailoring future treatment based on HPV genotype.
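The survival workflow named above (Kaplan-Meier estimation plus a Cox proportional hazards model) can be sketched as follows; the toy data frame and its column names are invented, not the study cohort.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Invented toy data: months to event/censoring, event indicator, treatment
# arm (1 = CCRT), and an HPV18-positivity flag.
df = pd.DataFrame({
    "time":  [14, 60, 22, 75, 40, 66, 8, 52, 31, 70],
    "event": [1, 0, 1, 1, 1, 1, 1, 1, 1, 0],
    "ccrt":  [0, 1, 0, 1, 0, 1, 0, 1, 0, 1],
    "hpv18": [1, 1, 0, 0, 1, 1, 0, 0, 1, 0],
})

kmf = KaplanMeierFitter()
for arm, grp in df.groupby("ccrt"):
    kmf.fit(grp["time"], grp["event"], label=f"CCRT={arm}")
    print(f"CCRT={arm}: median survival {kmf.median_survival_time_} months")

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")  # ccrt, hpv18 as covariates
cph.print_summary()
```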
The ImageJ ecosystem: an open platform for biomedical image analysis
Schindelin, Johannes; Rueden, Curtis T.; Hiner, Mark C.; Eliceiri, Kevin W.
2015-01-01
Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available, from commercial to academic, special-purpose to Swiss army knife, small to large, but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368
Differential proteome analysis of diabetes mellitus type 2 and its pathophysiological complications.
Sohail, Waleed; Majeed, Fatimah; Afroz, Amber
2018-06-11
The prevalence of Diabetes Mellitus Type 2 (DM 2) is increasing every year owing to global changes in lifestyle. The exact mechanisms underlying the progression of this disease are not yet known. However, recent advances in combined omics, particularly proteomics and genomics, have opened a gateway towards understanding the predetermined genetic factors, progression, complications and treatment of this disease. Here we review the recent advances in proteomics that have led to earlier and better diagnostic approaches for controlling DM 2, most importantly the comparison of structural and functional protein biomarkers that are modified in the diseased state. By applying these advanced and promising proteomic strategies together with bioinformatics applications and biostatistical tools, the prevalence of DM 2 and its associated disorders, i.e. nephropathy and retinopathy, is expected to be brought under control. Copyright © 2018 Diabetes India. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…
ERIC Educational Resources Information Center
Billings, Paul H.
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…
ERIC Educational Resources Information Center
McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley
2015-01-01
In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…
Keough, N; L'Abbé, E N; Steyn, M; Pretorius, S
2015-01-01
Forensic anthropologists are tasked with interpreting the sequence of events from death to the discovery of a body. Burned bone often evokes questions as to the timing of burning events. The purpose of this study was to assess the progression of thermal damage on bones with advancing decomposition. Twenty-five pigs in various stages of decomposition (fresh, early, advanced, early and late skeletonisation) were exposed to fire for 30 min. The scored heat-related features on bone included colour change (unaltered, charred, calcined), brown and heat borders, heat lines, delineation, greasy bone, joint shielding, predictable and minimal cracking, delamination and heat-induced fractures. Colour changes were scored according to a ranked percentage scale (0-3) and the remaining traits as absent or present (0/1). Kappa statistics were used to evaluate intra- and inter-observer error. Transition analysis was used to formulate probability mass functions [P(X=j|i)] to predict decomposition stage from the scored features of thermal destruction. Nine traits displayed potential to predict decomposition stage from burned remains. An increase in calcined and charred bone occurred synchronously with the advancement of decomposition, with a subsequent decrease in unaltered surfaces. Greasy bone appeared more often in the early/fresh stages (fleshed bone). Heat borders, heat lines, delineation, joint shielding, and predictable and minimal cracking are associated with advanced decomposition, when bone remains wet but lacks extensive soft tissue protection. Brown burn/borders, delamination and other heat-induced fractures are associated with early and late skeletonisation, showing that the organic composition of bone and the percentage of flesh present affect the manner in which it burns. No statistically significant difference was noted among observers for the majority of the traits, indicating that they can be scored reliably. Based on the data analysis, the pattern of heat-induced changes may assist in estimating decomposition stage from unknown, burned remains. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Experimental analysis of computer system dependability
NASA Technical Reports Server (NTRS)
Iyer, Ravishankar K.; Tang, Dong
1993-01-01
This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
48 CFR 31.109 - Advance agreements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Advance agreements. 31.109... REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Applicability 31.109 Advance agreements. (a) The extent of... contractors should seek advance agreement on the treatment of special or unusual costs and on statistical...
48 CFR 31.109 - Advance agreements.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Advance agreements. 31.109... REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Applicability 31.109 Advance agreements. (a) The extent of... contractors should seek advance agreement on the treatment of special or unusual costs and on statistical...
48 CFR 31.109 - Advance agreements.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advance agreements. 31.109... REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Applicability 31.109 Advance agreements. (a) The extent of... contractors should seek advance agreement on the treatment of special or unusual costs and on statistical...
Big heart data: advancing health informatics through data sharing in cardiovascular imaging.
Suinesiaputra, Avan; Medrano-Gracia, Pau; Cowan, Brett R; Young, Alistair A
2015-07-01
The burden of heart disease is rapidly worsening due to the increasing prevalence of obesity and diabetes. Data sharing and open database resources for heart health informatics are important for advancing our understanding of cardiovascular function, disease progression and therapeutics. Data sharing enables valuable information, often obtained at considerable expense and effort, to be reused beyond the specific objectives of the original study. Many government funding agencies and journal publishers are requiring data reuse, and are providing mechanisms for data curation and archival. Tools and infrastructure are available to archive anonymous data from a wide range of studies, from descriptive epidemiological data to gigabytes of imaging data. Meta-analyses can be performed to combine raw data from disparate studies to obtain unique comparisons or to enhance statistical power. Open benchmark datasets are invaluable for validating data analysis algorithms and objectively comparing results. This review provides a rationale for increased data sharing and surveys recent progress in the cardiovascular domain. We also highlight the potential of recent large cardiovascular epidemiological studies enabling collaborative efforts to facilitate data sharing, algorithms benchmarking, disease modeling and statistical atlases.
Developing Statistical Literacy with Year 9 Students: A Collaborative Research Project
ERIC Educational Resources Information Center
Sharma, Sashi
2013-01-01
Advances in technology and communication have increased the amount of statistical information delivered through everyday media. The importance of statistics in everyday life has led to calls for increased attention to statistical literacy in the mathematics curriculum (Watson 2006). Gal (2004) sees statistical literacy as the need for students to…
Teaching Statistics Online: A Decade's Review of the Literature about What Works
ERIC Educational Resources Information Center
Mills, Jamie D.; Raju, Dheeraj
2011-01-01
A statistics course can be a very challenging subject to teach. To enhance learning, today's modern course in statistics might incorporate many different aspects of technology. Due to advances in technology, teaching statistics online has also become a popular course option. Although researchers are studying how to deliver statistics courses in…
LES, DNS and RANS for the analysis of high-speed turbulent reacting flows
NASA Technical Reports Server (NTRS)
Adumitroaie, V.; Colucci, P. J.; Taulbee, D. B.; Givi, P.
1995-01-01
The purpose of this research is to continue our efforts in advancing the state of knowledge in large eddy simulation (LES), direct numerical simulation (DNS), and Reynolds averaged Navier Stokes (RANS) methods for the computational analysis of high-speed reacting turbulent flows. In the second phase of this work, covering the period 1 Aug. 1994 - 31 Jul. 1995, we have focused our efforts on two programs: (1) developments of explicit algebraic moment closures for statistical descriptions of compressible reacting flows and (2) development of Monte Carlo numerical methods for LES of chemically reacting flows.
Discriminant analysis of Raman spectra for body fluid identification for forensic purposes.
Sikirzhytski, Vitali; Virkler, Kelly; Lednev, Igor K
2010-01-01
Detection and identification of blood, semen and saliva stains, the most common body fluids encountered at a crime scene, are very important aspects of forensic science today. This study targets the development of a nondestructive, confirmatory method for body fluid identification based on Raman spectroscopy coupled with advanced statistical analysis. Dry traces of blood, semen and saliva obtained from multiple donors were probed using a confocal Raman microscope with a 785-nm excitation wavelength under controlled laboratory conditions. Results demonstrated the capability of Raman spectroscopy to identify an unknown substance to be semen, blood or saliva with high confidence.
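A minimal sketch of spectral discriminant analysis of this kind, assuming synthetic Gaussian peaks in place of preprocessed Raman traces; it is not the authors' pipeline, and the peak positions and class sizes are invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
wn = np.linspace(400, 1800, 300)                          # wavenumber axis, cm^-1
centers = {"blood": 1540, "semen": 1000, "saliva": 1250}  # invented peak positions

X, y = [], []
for fluid, c in centers.items():
    for _ in range(30):                                   # 30 spectra per fluid
        X.append(np.exp(-0.5 * ((wn - c) / 40) ** 2)
                 + rng.normal(0, 0.05, wn.size))
        y.append(fluid)
X, y = np.array(X), np.array(y)

# PCA first keeps the LDA well-conditioned when wavenumbers >> samples.
clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print(f"cross-validated identification accuracy: "
      f"{cross_val_score(clf, X, y, cv=5).mean():.2f}")
```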
SpecViz: Interactive Spectral Data Analysis
NASA Astrophysics Data System (ADS)
Earl, Nicholas Michael; STScI
2016-06-01
The astronomical community is about to enter a new generation of scientific enterprise. With next-generation instrumentation and advanced capabilities, the need has arisen to equip astronomers with the necessary tools to deal with large, multi-faceted data. The Space Telescope Science Institute has initiated a data analysis forum for the creation, development, and maintenance of software tools for the interpretation of these new data sets. SpecViz is a spectral 1-D interactive visualization and analysis application built with Python in an open source development environment. A user-friendly GUI allows for a fast, interactive approach to spectral analysis. SpecViz supports handling of unique and instrument-specific data, incorporation of advanced spectral unit handling and conversions in a flexible, high-performance interactive plotting environment. Active spectral feature analysis is possible through interactive measurement and statistical tools. It can be used to build wide-band SEDs, with the capability of combining or overplotting data products from various instruments. SpecViz sports advanced toolsets for filtering and detrending spectral lines; identifying, isolating, and manipulating spectral features; as well as utilizing spectral templates for renormalizing data in an interactive way. SpecViz also includes a flexible model fitting toolset that allows for multi-component models, as well as custom models, to be used with various fitting and decomposition routines. SpecViz also features robust extension via custom data loaders and connection to the central communication system underneath the interface for more advanced control. Incorporation with Jupyter notebooks via connection with the active IPython kernel allows for SpecViz to be used in addition to a user’s normal workflow without demanding the user drastically alter their method of data analysis. In addition, SpecViz allows the interactive analysis of multi-object spectroscopy in the same straightforward, consistent way. Through the development of such tools, STScI hopes to unify astronomical data analysis software for JWST and other instruments, allowing for efficient, reliable, and consistent scientific results.
Directions for new developments on statistical design and analysis of small population group trials.
Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel
2016-06-14
Most statistical design and analysis methods for clinical trials have been developed and evaluated where at least several hundreds of patients could be recruited. These methods may not be suitable to evaluate therapies if the sample size is unavoidably small, which is usually termed small populations. The specific sample size cutoff, where the standard methods fail, needs to be investigated. In this paper, the authors present their view on new developments for design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in the area of statistical methodology for design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, Asterix, IDeAl, and InSPiRe, which use various approaches to develop new statistical methodology for design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for improved design and analysis of small population clinical trials. They address various challenges presented by the EMA/CHMP guideline as well as recent discussions about extrapolation. There is a need for involvement of the patients' perspective in the planning and conduct of small population clinical trials for a successful therapy evaluation.
A weighted U-statistic for genetic association analyses of sequencing data.
Wei, Changshuai; Li, Ming; He, Zihuai; Vsevolozhskaya, Olga; Schaid, Daniel J; Lu, Qing
2014-12-01
With advancements in next-generation sequencing technology, a massive amount of sequencing data is generated, which offers a great opportunity to comprehensively investigate the role of rare variants in the genetic etiology of complex diseases. Nevertheless, the high-dimensional sequencing data poses a great challenge for statistical analysis. Association analyses based on traditional statistical methods suffer substantial power loss because of the low frequency of genetic variants and the extremely high dimensionality of the data. We developed a Weighted U Sequencing test, referred to as WU-SEQ, for the high-dimensional association analysis of sequencing data. Based on a nonparametric U-statistic, WU-SEQ makes no assumption about the underlying disease model and phenotype distribution, and can be applied to a variety of phenotypes. Through simulation studies and an empirical study, we showed that WU-SEQ outperformed a commonly used sequence kernel association test (SKAT) method when the underlying assumptions were violated (e.g., the phenotype followed a heavy-tailed distribution). Even when the assumptions were satisfied, WU-SEQ still attained comparable performance to SKAT. Finally, we applied WU-SEQ to sequencing data from the Dallas Heart Study (DHS), and detected an association between ANGPTL4 and very-low-density lipoprotein cholesterol. © 2014 WILEY PERIODICALS, INC.
Hayat, Matthew J.; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L.
2017-01-01
Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers, and a standardized data collection form was completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference was reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as making informed decisions about continuing education for public health professionals. PMID:28591190
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in U.S. undergraduate and graduate systems engineering and computing science programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
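As a hedged illustration of the kind of Monte Carlo baseline-and-prediction exercise described, the sketch below propagates assumed driver-score distributions through a hypothetical linear index model; none of the distributions or weights come from the ACSI.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                  # Monte Carlo draws

# Assumed driver-score distributions on a 0-10 scale (invented values).
quality = rng.normal(8.0, 0.5, n)
expectations = rng.normal(7.2, 0.6, n)
value = rng.normal(7.6, 0.4, n)

# Hypothetical linear index model with noise, rescaled to 0-100.
acsi = 10 * (0.5 * quality + 0.3 * expectations + 0.2 * value) \
       + rng.normal(0, 2, n)

print(f"predicted index: mean = {acsi.mean():.1f}, "
      f"90% interval = ({np.percentile(acsi, 5):.1f}, "
      f"{np.percentile(acsi, 95):.1f})")
```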
[Applications of meta-analysis in multi-omics].
Han, Mingfei; Zhu, Yunping
2014-07-01
As a statistical method integrating multi-features and multi-data, meta-analysis was introduced to the field of life science in the 1990s. With the rapid advances in high-throughput technologies, life omics, the core of which are genomics, transcriptomics and proteomics, is becoming the new hot spot of life science. Although the fast output of massive data has promoted the development of omics study, it results in excessive data that are difficult to integrate systematically. In this case, meta-analysis is frequently applied to analyze different types of data and is improved continuously. Here, we first summarize the representative meta-analysis methods systematically, and then study the current applications of meta-analysis in various omics fields, finally we discuss the still-existing problems and the future development of meta-analysis.
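One common meta-analytic operation in omics is combining per-study p-values for the same gene. A minimal sketch using Fisher's method follows; the p-values are invented for the example.

```python
from scipy.stats import combine_pvalues

# p-values for one gene from three independent omics studies (invented)
p_per_study = [0.04, 0.11, 0.02]

stat, p_combined = combine_pvalues(p_per_study, method="fisher")
print(f"Fisher chi-square = {stat:.2f}, combined p = {p_combined:.4f}")
```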
MASPECTRAS: a platform for management and analysis of proteomics LC-MS/MS data
Hartler, Jürgen; Thallinger, Gerhard G; Stocker, Gernot; Sturn, Alexander; Burkard, Thomas R; Körner, Erik; Rader, Robert; Schmidt, Andreas; Mechtler, Karl; Trajanoski, Zlatko
2007-01-01
Background: The advancements of proteomics technologies have led to a rapid increase in the number, size and rate at which datasets are generated. Managing and extracting valuable information from such datasets requires the use of data management platforms and computational approaches. Results: We have developed the MAss SPECTRometry Analysis System (MASPECTRAS), a platform for management and analysis of proteomics LC-MS/MS data. MASPECTRAS is based on the Proteome Experimental Data Repository (PEDRo) relational database schema and follows the guidelines of the Proteomics Standards Initiative (PSI). Analysis modules include: 1) import and parsing of the results from the search engines SEQUEST, Mascot, Spectrum Mill, X! Tandem, and OMSSA; 2) peptide validation; 3) clustering of proteins based on Markov Clustering and multiple alignments; and 4) quantification using the Automated Statistical Analysis of Protein Abundance Ratios algorithm (ASAPRatio). The system provides customizable data retrieval and visualization tools, as well as export to the PRoteomics IDEntifications public repository (PRIDE). MASPECTRAS is freely available. Conclusion: Given its unique features and the flexibility due to the use of standard software technology, our platform represents a significant advance and could be of great interest to the proteomics community. PMID:17567892
Yao, Yanwen; Yuan, Dongmei; Liu, Hongbing; Gu, Xiaoling; Song, Yong
2013-03-01
Neutrophil to lymphocyte ratio (NLR) has been shown to be a prognostic indicator in different types of cancer. We aimed to investigate the association between NLR and therapy response, progression-free survival (PFS) and overall survival (OS) in advanced non-small cell lung cancer (NSCLC) patients treated with first-line platinum-based chemotherapy. Patients hospitalized between January 2007 and December 2010 were enrolled or excluded according to the inclusion and exclusion criteria. The NLR was defined as the absolute neutrophil count divided by the absolute lymphocyte count. Logistic regression analysis was applied for response rate, and Cox regression analysis was adopted for PFS and OS. A P value of ≤0.05 was considered statistically significant. A total of 182 patients were enrolled in the current study. The median PFS was 164.5 days and the median OS was 439.5 days. The statistical analysis indicated that a low pretreatment NLR (≤2.63) (OR = 2.043, P = 0.043), decreased posttreatment NLR (OR = 2.368, P = 0.013), well or moderate differentiation (OR = 2.773, P = 0.021) and a normal CEA level (≤9.6 ng/ml) (OR = 2.090, P = 0.046) were associated with response to first-line platinum-based chemotherapy. A high pretreatment NLR (HR = 1.807, P = 0.018 for PFS; HR = 1.761, P = 0.020 for OS) and distant metastasis (HR = 2.118, P = 0.008 for PFS; HR = 2.753, P < 0.001 for OS) were independent prognostic factors for PFS and OS. An elevated pretreatment NLR might be a potential biomarker of worse response to first-line platinum-based chemotherapy and shorter PFS and OS for advanced NSCLC patients. To confirm these findings, larger, prospective and randomized studies are needed.
Liu, Fenghua; Tang, Yong; Sun, Junwei; Yuan, Zhanna; Li, Shasha; Sheng, Jun; Ren, He; Hao, Jihui
2012-01-01
To investigate the efficacy and safety of regional intra-arterial chemotherapy (RIAC) versus systemic chemotherapy for stage III/IV pancreatic cancer. Randomized controlled trials of patients with advanced pancreatic cancer treated by regional intra-arterial or systemic chemotherapy were identified using the PubMed, ISI, EMBASE, Cochrane Library, Google, Chinese Scientific Journals Database (VIP), and China National Knowledge Infrastructure (CNKI) electronic databases, for all publications dated between 1960 and December 31, 2010. Data were independently extracted by two reviewers. Odds ratios and relative risks were pooled using either fixed- or random-effects models, depending on I² statistic and Q test assessments of heterogeneity. Statistical analysis was performed using RevMan 5.0. Six randomized controlled trials comprising 298 patients, among 492 articles identified, met the standards for inclusion in the meta-analysis. Eight patients achieved complete remission (CR) with regional intra-arterial chemotherapy (RIAC), whereas no patients achieved CR with systemic chemotherapy. Compared with systemic chemotherapy, patients receiving RIAC had superior partial remission rates (RR = 1.99, 95% CI: 1.50, 2.65; 58.06% with RIAC and 29.37% with systemic treatment) and clinical benefits (RR = 2.34, 95% CI: 1.84, 2.97; 78.06% with RIAC and 29.37% with systemic treatment), as well as lower total complication rates (RR = 0.72, 95% CI: 0.60, 0.87; 49.03% with RIAC and 71.33% with systemic treatment) and fewer hematological side effects (RR = 0.76, 95% CI: 0.63, 0.91; 60.87% with RIAC and 85.71% with systemic treatment). The median survival time with RIAC (5-21 months) was longer than with systemic chemotherapy (2.7-14 months). Similarly, one-year survival rates with RIAC (28.6%-41.2%) were higher than with systemic chemotherapy (0%-12.9%). Regional intra-arterial chemotherapy is more effective and has fewer complications than systemic chemotherapy for treating advanced pancreatic cancer.
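The pooling logic described (inverse-variance weighting, with the Q test and I² guiding the fixed- versus random-effects choice) can be sketched directly; the per-trial log relative risks and standard errors below are illustrative, not the extracted trial data.

```python
import numpy as np

log_rr = np.array([0.62, 0.75, 0.58, 0.81, 0.66, 0.70])  # per-trial log RR (invented)
se = np.array([0.21, 0.25, 0.30, 0.22, 0.28, 0.24])      # standard errors (invented)

w = 1 / se**2                                  # fixed-effect (inverse-variance) weights
pooled = np.sum(w * log_rr) / np.sum(w)
Q = np.sum(w * (log_rr - pooled) ** 2)         # Cochran's Q
df = len(log_rr) - 1
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

if I2 > 50:                                    # common heterogeneity cutoff
    # DerSimonian-Laird between-trial variance, then random-effects weights.
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w = 1 / (se**2 + tau2)
    pooled = np.sum(w * log_rr) / np.sum(w)

print(f"I2 = {I2:.0f}%, pooled RR = {np.exp(pooled):.2f}")
```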
GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis.
Zheng, Qi; Wang, Xiu-Jie
2008-07-01
Gene Ontology (GO) analysis has become a commonly used approach for functional studies of large-scale genomic or transcriptomic data. Although many software tools with GO-related analysis functions exist, new tools are still needed to meet the requirements of data generated by newly developed technologies or for advanced analysis purposes. Here, we present a Gene Ontology Enrichment Analysis Software Toolkit (GOEAST), an easy-to-use web-based toolkit that identifies statistically overrepresented GO terms within given gene sets. Compared with available GO analysis tools, GOEAST has the following improved features: (i) GOEAST displays enriched GO terms in graphical format according to their relationships in the hierarchical tree of each GO category (biological process, molecular function and cellular component), and therefore provides a better understanding of the correlations among enriched GO terms; (ii) GOEAST supports analysis of data from various sources (probe or probe set IDs of Affymetrix, Illumina, Agilent or customized microarrays, as well as different gene identifiers) and multiple species (about 60 prokaryote and eukaryote species); (iii) one unique feature of GOEAST is to allow cross-comparison of the GO enrichment status of multiple experiments to identify functional correlations among them. GOEAST also provides rigorous statistical tests to enhance the reliability of analysis results. GOEAST is freely accessible at http://omicslab.genetics.ac.cn/GOEAST/
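The statistical core of such enrichment testing is a hypergeometric tail probability. A minimal sketch follows; the background, annotation, and gene-list sizes are invented, and this is not GOEAST's code.

```python
from scipy.stats import hypergeom

N = 20000   # genes in the background (e.g., on the array)
K = 150     # background genes annotated with the GO term
n = 400     # genes in the submitted list
k = 12      # submitted genes carrying the annotation

# P(X >= k): chance of an overlap at least this large under random sampling
p = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value = {p:.3g}")
```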
Østergaard, Mia L; Nielsen, Kristina R; Albrecht-Beste, Elisabeth; Konge, Lars; Nielsen, Michael B
2018-01-01
This study aimed to develop a test with validity evidence for abdominal diagnostic ultrasound, with a pass/fail standard to facilitate mastery learning. The simulator had 150 real-life patient abdominal scans, of which 15 cases with 44 findings were selected, representing level 1 of The European Federation of Societies for Ultrasound in Medicine and Biology. Four groups of experience levels were constructed: novices (medical students), trainees (first-year radiology residents), intermediates (third- to fourth-year radiology residents) and advanced (physicians with an ultrasound fellowship). Participants were tested in a standardized setup and scored by two blinded reviewers prior to an item analysis. The item analysis excluded 14 diagnoses. Both internal consistency (Cronbach's alpha 0.96) and inter-rater reliability (0.99) were good, and there were statistically significant differences (p < 0.001) between all four groups except between the intermediate and advanced groups (p = 1.0). There was a statistically significant correlation between experience and test scores (Pearson's r = 0.82, p < 0.001). The pass/fail standard failed all novices (no false positives) and passed all advanced participants (no false negatives). All intermediate participants and six of 14 trainees passed. We developed a test for diagnostic abdominal ultrasound with solid validity evidence and a pass/fail standard without any false-positive or false-negative scores. • Ultrasound training can benefit from competency-based education based on reliable tests. • This simulation-based test can differentiate between competency levels of ultrasound examiners. • This test is suitable for competency-based education, e.g. mastery learning. • We provide a pass/fail standard without false-negative or false-positive scores.
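The internal-consistency figure reported above can be reproduced on synthetic data with a few lines; the score matrix below is simulated, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/variance(total))."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(7)
ability = rng.normal(size=(40, 1))                       # latent skill, 40 participants
scores = ability + rng.normal(scale=0.6, size=(40, 15))  # 15 scored cases each
print(f"alpha = {cronbach_alpha(scores):.2f}")
```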
Ribas, Antoni; Kefford, Richard; Marshall, Margaret A; Punt, Cornelis J A; Haanen, John B; Marmol, Maribel; Garbe, Claus; Gogas, Helen; Schachter, Jacob; Linette, Gerald; Lorigan, Paul; Kendra, Kari L; Maio, Michele; Trefzer, Uwe; Smylie, Michael; McArthur, Grant A; Dreno, Brigitte; Nathan, Paul D; Mackiewicz, Jacek; Kirkwood, John M; Gomez-Navarro, Jesus; Huang, Bo; Pavlov, Dmitri; Hauschild, Axel
2013-02-10
In phase I/II trials, the cytotoxic T lymphocyte-associated antigen-4-blocking monoclonal antibody tremelimumab induced durable responses in a subset of patients with advanced melanoma. This phase III study evaluated overall survival (OS) and other safety and efficacy end points in patients with advanced melanoma treated with tremelimumab or standard-of-care chemotherapy. Patients with treatment-naive, unresectable stage IIIc or IV melanoma were randomly assigned at a ratio of one to one to tremelimumab (15 mg/kg once every 90 days) or physician's choice of standard-of-care chemotherapy (temozolomide or dacarbazine). In all, 655 patients were enrolled and randomly assigned. The test statistic crossed the prespecified futility boundary at second interim analysis after 340 deaths, but survival follow-up continued. At final analysis with 534 events, median OS by intent to treat was 12.6 months (95% CI, 10.8 to 14.3) for tremelimumab and 10.7 months (95% CI, 9.36 to 11.96) for chemotherapy (hazard ratio, 0.88; P = .127). Objective response rates were similar in the two arms: 10.7% in the tremelimumab arm and 9.8% in the chemotherapy arm. However, response duration (measured from date of random assignment) was significantly longer after tremelimumab (35.8 v 13.7 months; P = .0011). Diarrhea, pruritus, and rash were the most common treatment-related adverse events in the tremelimumab arm; 7.4% had endocrine toxicities. Seven deaths in the tremelimumab arm and one in the chemotherapy arm were considered treatment related by either investigators or sponsor. This study failed to demonstrate a statistically significant survival advantage of treatment with tremelimumab over standard-of-care chemotherapy in first-line treatment of patients with metastatic melanoma.
Application of the Statistical ICA Technique in the DANCE Data Analysis
NASA Astrophysics Data System (ADS)
Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration
2015-10-01
The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the sum energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to identify the contributions to the Esum spectra from different isotopes with similar Q-values. Recently we have tested the applicability of modern statistical methods, such as Independent Component Analysis (ICA), to identify and separate the (n, γ) reaction yields on the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present results of the application of ICA algorithms and their modifications to the analysis of DANCE experimental data. This research is supported by the U.S. Department of Energy, Office of Science, Nuclear Physics under Early Career Award No. LANL20135009.
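A toy demonstration of the separation idea, assuming two overlapping Gaussian-shaped isotope responses mixed into three measured spectra; it is not the DANCE analysis code, and all shapes and mixing weights are invented.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
e = np.linspace(0, 10, 400)                     # Esum axis, MeV
src1 = np.exp(-0.5 * ((e - 6.2) / 0.4) ** 2)    # isotope A response (invented)
src2 = np.exp(-0.5 * ((e - 6.5) / 0.5) ** 2)    # isotope B response (invented)
S = np.c_[src1, src2]                           # true sources, shape (400, 2)

A = np.array([[0.7, 0.3],                       # three measured mixtures
              [0.4, 0.6],
              [0.55, 0.45]])
X = S @ A.T + rng.normal(scale=0.01, size=(400, 3))

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)                # statistically independent parts
print(recovered.shape)                          # (400, 2): separated components
```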
Recent developments in measurement and evaluation of FAC damage in power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garud, Y.S.; Besuner, P.; Cohn, M.J.
1999-11-01
This paper describes some recent developments in the measurement and evaluation of flow-accelerated corrosion (FAC) damage in power plants. The evaluation focuses on data checking and smoothing to account for gross errors, noise, and uncertainty in the wall thickness measurements from ultrasonic or pulsed eddy-current data. The evaluation method also utilizes advanced regression analysis of the spatial and temporal evolution of the wall loss, providing statistically robust predictions of wear rates and the associated uncertainty. Results of the application of these new tools are presented for several components in actual service. More importantly, the practical implications of using these advances are discussed in relation to the likely impact on the scope and effectiveness of FAC-related inspection programs.
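A hedged sketch of the temporal part of such an evaluation: regressing wall-thickness readings on inspection time to obtain a wear rate with an uncertainty estimate. The readings below are invented, and a production analysis would add the spatial smoothing and error screening the paper describes.

```python
import numpy as np
from scipy.stats import linregress

years = np.array([0.0, 2.1, 4.0, 6.2, 8.1, 10.0])            # inspection times
thickness_mm = np.array([8.90, 8.71, 8.55, 8.32, 8.18, 7.96])  # invented readings

fit = linregress(years, thickness_mm)
wear_rate = -fit.slope                          # mm of wall loss per year
print(f"wear rate = {wear_rate:.3f} mm/yr, 1-sigma = {fit.stderr:.3f}")
print(f"projected thickness at 15 yr: {fit.intercept + fit.slope * 15:.2f} mm")
```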
Gender disparities in prosthodontics: authorship and leadership, 13 years of observation.
Kongkiatkamon, Suchada; Yuan, Judy Chia-Chun; Lee, Damian J; Knoernschild, Kent L; Campbell, Stephen D; Sukotjo, Cortino
2010-10-01
The purpose of this study was to examine gender disparities in prosthodontics by reviewing the trend of female authorship in prosthodontic journals and exploring the role of female leadership in prosthodontic organizations and Advanced Education in Prosthodontic (AEP) programs. Three journals representing the prosthodontic specialty were selected to analyze the percentage of female dentist first and last (senior) authors for the years 1995, 2000, 2005, and 2008. Article inclusion criteria were restricted to the first or last authors who held at least a DMD/DDS/BDS degree and were from U.S. institutions. Data on female leadership in prosthodontic organizations and advanced education programs were collected, and the trends were studied. Descriptive statistics were used to analyze the data. A linear regression analysis was performed to investigate the proportion of female authorship compared to male in the dental literature. A Fisher's Exact Test was performed to contrast differences of female first and last authorship in the selected journals between years 1995 and 2008. Overall, there was no statistically significant linear increase in the proportion of either first or last female authorship compared to male authorship over time. With respect to each journal, the linear regression analysis showed that the increase of first female authorship was statistically significant (p= 0.016) compared to male authorship only in the Journal of Prosthetic Dentistry. The percentage of female presidents of prosthodontic organizations has been very limited. A similar trend was also observed in AEP program director positions. Over the past 13 years, female dentists' participation in prosthodontics literature authorship has not increased significantly in the United States. Furthermore, female involvement in prosthodontics leadership has been limited over the past decades. © 2010 by The American College of Prosthodontists.
Understanding Statistics and Statistics Education: A Chinese Perspective
ERIC Educational Resources Information Center
Shi, Ning-Zhong; He, Xuming; Tao, Jian
2009-01-01
In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…
Blattmann, Peter; Heusel, Moritz; Aebersold, Ruedi
2016-01-01
SWATH-MS is an acquisition and analysis technique of targeted proteomics that enables measuring several thousand proteins with high reproducibility and accuracy across many samples. OpenSWATH is popular open-source software for peptide identification and quantification from SWATH-MS data. For downstream statistical and quantitative analysis there exist different tools such as MSstats, mapDIA and aLFQ. However, the transfer of data from OpenSWATH to the downstream statistical tools is currently technically challenging. Here we introduce the R/Bioconductor package SWATH2stats, which allows convenient processing of the data into a format directly readable by the downstream analysis tools. In addition, SWATH2stats allows annotation, analyzing the variation and the reproducibility of the measurements, FDR estimation, and advanced filtering before submitting the processed data to downstream tools. These functionalities are important to quickly analyze the quality of the SWATH-MS data. Hence, SWATH2stats is a new open-source tool that summarizes several practical functionalities for analyzing, processing, and converting SWATH-MS data and thus facilitates the efficient analysis of large-scale SWATH/DIA datasets.
Infant Statistical-Learning Ability Is Related to Real-Time Language Processing
ERIC Educational Resources Information Center
Lany, Jill; Shoaib, Amber; Thompson, Abbie; Estes, Katharine Graf
2018-01-01
Infants are adept at learning statistical regularities in artificial language materials, suggesting that the ability to learn statistical structure may support language development. Indeed, infants who perform better on statistical learning tasks tend to be more advanced in parental reports of infants' language skills. Work with adults suggests…
Cross-cultural examination of measurement invariance of the Beck Depression Inventory-II.
Dere, Jessica; Watters, Carolyn A; Yu, Stephanie Chee-Min; Bagby, R Michael; Ryder, Andrew G; Harkness, Kate L
2015-03-01
Given substantial rates of major depressive disorder among college and university students, as well as the growing cultural diversity on many campuses, establishing the cross-cultural validity of relevant assessment tools is important. In the current investigation, we examined the Beck Depression Inventory-Second Edition (BDI-II; Beck, Steer, & Brown, 1996) among Chinese-heritage (n = 933) and European-heritage (n = 933) undergraduates in North America. The investigation integrated 3 distinct lines of inquiry: (a) the literature on cultural variation in depressive symptom reporting between people of Chinese and Western heritage; (b) recent developments regarding the factor structure of the BDI-II; and (c) the application of advanced statistical techniques to the issue of cross-cultural measurement invariance. A bifactor model was found to represent the optimal factor structure of the BDI-II. Multigroup confirmatory factor analysis showed that the BDI-II had strong measurement invariance across both culture and gender. In group comparisons with latent and observed variables, Chinese-heritage students scored higher than European-heritage students on cognitive symptoms of depression. This finding deviates from the commonly held view that those of Chinese heritage somatize depression. These findings hold implications for the study and use of the BDI-II, highlight the value of advanced statistical techniques such as multigroup confirmatory factor analysis, and offer methodological lessons for cross-cultural psychopathology research more broadly. © 2015 APA, all rights reserved
van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter
2015-08-07
Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher's tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis. Analysis of additional datasets is needed in order to validate and refine the application for general use.
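A minimal sketch of the kind of model search AutoVAR automates, assuming two simulated EMA diary series; lag selection by information criterion and the Granger test here use statsmodels' VAR, and none of this is AutoVAR's own code.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 120                                     # ~4 months of daily diary entries
mood = np.zeros(n)
activity = np.zeros(n)
for t in range(1, n):                       # simulated: activity drives next-day mood
    activity[t] = 0.5 * activity[t - 1] + rng.normal()
    mood[t] = 0.3 * mood[t - 1] + 0.4 * activity[t - 1] + rng.normal()

df = pd.DataFrame({"mood": mood, "activity": activity})
fit = VAR(df).fit(maxlags=7, ic="aic")      # evaluate lags 1..7, pick by AIC
print(f"selected lag order: {fit.k_ar}")
print(fit.test_causality("mood", ["activity"]).summary())  # Granger test
```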
NASA Astrophysics Data System (ADS)
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Suffredini, Anthony F.; Sacks, David B.; Yu, Yi-Kuo
2016-02-01
Correct and rapid identification of microorganisms is the key to the success of many important applications in health and safety, including, but not limited to, infection treatment, food safety, and biodefense. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is challenging correct microbial identification because of the large number of choices present. To properly disentangle candidate microbes, one needs to go beyond apparent morphology or simple 'fingerprinting'; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptidome profiles of microbes to better separate them and by designing an analysis method that yields accurate statistical significance. Here, we present an analysis pipeline that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using MS/MS data of 81 samples, each composed of a single known microorganism, that the proposed pipeline can correctly identify microorganisms at least at the genus and species levels. We have also shown that the proposed pipeline computes accurate statistical significances, i.e., E-values for identified peptides and unified E-values for identified microorganisms. The proposed analysis pipeline has been implemented in MiCId, a freely available software for Microorganism Classification and Identification. MiCId is available for download at http://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
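As a hedged illustration of the generic relation between a p-value and an E-value in database searching (not MiCId's actual computation): the E-value is the expected number of random candidates scoring at least as well, E = p x N.

```python
def e_value(p_value: float, n_candidates: int) -> float:
    """Expected count of equally good random matches among N candidates."""
    return p_value * n_candidates

# e.g. a peptide-spectrum match with p = 1e-6 searched against 5e5 candidates:
print(e_value(1e-6, 500_000))  # 0.5 -> unremarkable despite the tiny p-value
```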
FDA Approval Summary: Temsirolimus as Treatment for Advanced Renal Cell Carcinoma
Prowell, Tatiana M.; Ibrahim, Amna; Farrell, Ann T.; Justice, Robert; Mitchell, Shan Sun; Sridhara, Rajeshwari; Pazdur, Richard
2010-01-01
This report summarizes the U.S. Food and Drug Administration (FDA)'s approval of temsirolimus (Torisel®), on May 30, 2007, for the treatment of advanced renal cell carcinoma (RCC). Information provided includes regulatory history, study design, study results, and literature review. A multicenter, three-arm, randomized, open-label study was conducted in previously untreated patients with poor-prognosis, advanced RCC. The study objectives were to compare overall survival (OS), progression-free survival (PFS), objective response rate, and safety in patients receiving interferon (IFN)-α versus those receiving temsirolimus alone or in combination with IFN-α. In the second planned interim analysis of the intent-to-treat population (n = 626), there was a statistically significant longer OS time in the temsirolimus (25 mg) arm than in the IFN-α arm (median, 10.9 months versus 7.3 months; hazard ratio [HR], 0.73; p = .0078). The combination of temsirolimus (15 mg) and IFN-α did not lead to a significant difference in OS compared with IFN-α alone. There was also a statistically significant longer PFS time for the temsirolimus (25 mg) arm than for the IFN-α arm (median, 5.5 months versus 3.1 months; HR, 0.66; p = .0001). Common adverse reactions reported in patients receiving temsirolimus were rash, asthenia, and mucositis. Common laboratory abnormalities were anemia, hyperglycemia, hyperlipidemia, and hypertriglyceridemia. Serious but rare cases of interstitial lung disease, bowel perforation, and acute renal failure were observed. Temsirolimus has demonstrated superiority in terms of OS and PFS over IFN-α and provides an additional treatment option for patients with advanced RCC. PMID:20332142
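An illustrative sketch of the type of survival comparison reported above (median OS and a hazard ratio), using the lifelines package on synthetic data; it is not the trial's analysis, and all numbers below are made up.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(1)
n = 200
arm = rng.integers(0, 2, n)                       # 0 = comparator, 1 = treatment
time = rng.exponential(scale=np.where(arm == 1, 11.0, 7.5))
event = (rng.random(n) < 0.8).astype(int)         # ~20% censored
df = pd.DataFrame({"arm": arm, "time": time, "event": event})

kmf = KaplanMeierFitter()
kmf.fit(df.loc[df.arm == 1, "time"], df.loc[df.arm == 1, "event"])
print("median OS, treated arm:", kmf.median_survival_time_)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "p"]])            # hazard ratio and p-value
```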
NASA Astrophysics Data System (ADS)
Pollard, D.; Chang, W.; Haran, M.; Applegate, P.; DeConto, R.
2015-11-01
A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~ 20 000 years. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree quite well with the more advanced techniques, but only for a large ensemble with full factorial parameter sampling. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds. Each run is extended 5000 years into the "future" with idealized ramped climate warming. In the majority of runs with reasonable scores, this produces grounding-line retreat deep into the West Antarctic interior, and the analysis provides sea-level-rise envelopes with well defined parametric uncertainty bounds.
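A minimal sketch of the "simple averaging" step described above: weight each ensemble member by its aggregate model-data misfit score and form a weighted estimate with a crude spread. The score-to-weight mapping and all numbers are illustrative assumptions, not the study's calibration.

```python
import numpy as np

rng = np.random.default_rng(2)
sea_level_rise = rng.normal(3.0, 1.0, 625)   # per-run equivalent SLR (m), synthetic
misfit = rng.gamma(2.0, 1.0, 625)            # aggregate misfit score, lower = better

weights = np.exp(-misfit)                    # one common score-to-weight choice
weights /= weights.sum()
mean = np.sum(weights * sea_level_rise)
var = np.sum(weights * (sea_level_rise - mean) ** 2)
print(f"weighted SLR estimate: {mean:.2f} +/- {np.sqrt(var):.2f} m")
```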
Prevalence and features of colorectal lesions among Hispanics: A hospital-based study
Ashktorab, Hassan; Laiyemo, Adeyinka O; Lee, Edward; Cruz-Correa, Marcia; Ghuman, Amita; Nouraie, Mehdi; Brim, Hassan
2015-01-01
AIM: To evaluate the prevalence and characteristics of colorectal adenoma and carcinoma in an inner city Hispanic population. METHODS: We reviewed the reports of 1628 Hispanic patients who underwent colonoscopy at Howard University from 2000 to 2010. Advanced adenoma was defined as adenoma ≥ 1 cm in size, adenomas with villous histology, high grade dysplasia and/or invasive cancer. Statistical analysis was performed using χ2 statistics and t-test. RESULTS: The median age of the patients was 54 years, 64.2% were females. Polyps were observed in 489 (30.0%) of patients. Adenoma prevalence was 16.8% (n = 273), advanced adenoma 2.4% (n = 39), and colorectal cancer 0.4% (n = 7). Hyperplastic polyps were seen in 6.6% of the cohort (n = 107). Adenomas predominantly exhibited a proximal colonic distribution (53.7%, n = 144); while hyperplastic polyps were mostly located in the distal colon (70%, n = 75). Among the 11.7% (n = 191) of patients who underwent screening colonoscopy, the prevalence of colorectal lesions was 21.4% adenoma, 2.6% advanced adenoma, and 8.3% hyperplastic polyps. CONCLUSION: Our data showed low colorectal cancer prevalence among Hispanics in the Washington DC area. However, the pre-neoplastic pattern of colonic lesions in Hispanics likely points toward a shift in this population that needs to be monitored closely through large epidemiological studies. PMID:26673447
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaffney, David K., E-mail: david.gaffney@hci.utah.edu; King, Bronwyn; Viswanathan, Akila N.
Purpose: The purpose of this study was to develop a radiation therapy (RT) contouring atlas and recommendations for women with postoperative and locally advanced vulvar carcinoma. Methods and Materials: An international committee of 35 expert gynecologic radiation oncologists completed a survey of the treatment of vulvar carcinoma. An initial set of recommendations for contouring was discussed and generated by consensus. Two cases, 1 locally advanced and 1 postoperative, were contoured by 14 physicians. Contours were compared and analyzed using an expectation-maximization algorithm for simultaneous truth and performance level estimation (STAPLE), and a 95% confidence interval contour was developed. The level of agreement among contours was assessed using a kappa statistic. STAPLE contours underwent full committee editing to generate the final atlas consensus contours. Results: Analysis of the 14 contours showed substantial agreement, with kappa statistics of 0.69 and 0.64 for cases 1 and 2, respectively. There was high specificity for both cases (≥99%) and only moderate sensitivity of 71.3% and 64.9% for cases 1 and 2, respectively. Expert review and discussion generated consensus recommendations for contouring target volumes and treatment for postoperative and locally advanced vulvar cancer. Conclusions: These consensus recommendations for contouring and treatment of vulvar cancer identified areas of complexity and controversy. Given the lack of clinical research evidence in vulvar cancer radiation therapy, the committee advocates a conservative and consistent approach using standardized recommendations.
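A sketch of the agreement measure used above, Cohen's kappa, computed on paired voxel labels with scikit-learn; the labels are synthetic, and the study's STAPLE step is a separate EM algorithm not reproduced here.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(3)
truth = rng.integers(0, 2, 10_000)                  # reference contour mask
rater = np.where(rng.random(10_000) < 0.9, truth,   # rater agrees on ~90% of voxels
                 1 - truth)
print("kappa:", round(cohen_kappa_score(truth, rater), 3))
```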
Lee, Sang Ho; Hayano, Koichi; Zhu, Andrew X.; Sahani, Dushyant V.; Yoshida, Hiroyuki
2015-01-01
Background To find prognostic biomarkers in pretreatment dynamic contrast-enhanced MRI (DCE-MRI) water-exchange-modified (WX) kinetic parameters for advanced hepatocellular carcinoma (HCC) treated with antiangiogenic monotherapy. Methods Twenty patients with advanced HCC underwent DCE-MRI and were subsequently treated with sunitinib. Pretreatment DCE-MRI data on advanced HCC were analyzed using five different WX kinetic models: the Tofts-Kety (WX-TK), extended TK (WX-ETK), two compartment exchange, adiabatic approximation to tissue homogeneity (WX-AATH), and distributed parameter (WX-DP) models. The total hepatic blood flow, arterial flow fraction (γ), arterial blood flow (BF A), portal blood flow, blood volume, mean transit time, permeability-surface area product, fractional interstitial volume (v I), extraction fraction, mean intracellular water molecule lifetime (τ C), and fractional intracellular volume (v C) were calculated. After receiver operating characteristic analysis with leave-one-out cross-validation, individual parameters for each model were assessed in terms of 1-year-survival (1YS) discrimination using Kaplan-Meier analysis, and association with overall survival (OS) using univariate Cox regression analysis with permutation testing. Results The WX-TK-model-derived γ (P = 0.022) and v I (P = 0.010), and WX-ETK-model-derived τ C (P = 0.023) and v C (P = 0.042) were statistically significant prognostic biomarkers for 1YS. Increase in the WX-DP-model-derived BF A (P = 0.025) and decrease in the WX-TK, WX-ETK, WX-AATH, and WX-DP-model-derived v C (P = 0.034, P = 0.038, P = 0.028, P = 0.041, respectively) were significantly associated with an increase in OS. Conclusions The WX-ETK-model-derived v C was an effective prognostic biomarker for advanced HCC treated with sunitinib. PMID:26366997
Qumseya, Bashar J; Brown, Jessica; Abraham, Merna; White, Donna; Wolfsen, Herbert; Gupta, Neil; Vennalaganti, Prashanth; Sharma, Prateek; Wallace, Michael B
2015-04-01
The role of EUS among patients with Barrett's esophagus (BE) with high-grade dysplasia (HGD) or suspected mucosal carcinoma is controversial. To define the role of EUS in detecting advanced disease among patients with BE. Systematic review and meta-analysis. MEDLINE, Embase, Web of Science, and Cochrane Central databases. Patients with BE and HGD or esophageal adenocarcinoma (EAC) who were referred for endoscopic evaluation and underwent EUS. EUS. Pooled proportion of patients with advanced EAC identified by EUS among patients with BE who are referred for HGD or EAC (with or without visible lesions). Forest plots were used to contrast effect sizes in each of the studies and random effect models when tests of heterogeneity were significant (I(2) > 50% or P < .1 for the Q statistic). Of 1278 articles, 47 were reviewed in full text, and 11 articles met the inclusion criteria, including a total of 656 patients. Based on a random-effects model, the proportion of patients with advanced disease detected on EUS was 14% (95% confidence interval, 8%-22%; P < .0001). In a subanalysis, the pooled proportion of patients with advanced disease on EUS in the absence of nodules was 4% (95% confidence interval, 2%-6%, P < .0001). Significant heterogeneity among studies. EUS will result in a change in the therapeutic approach in a significant minority of patients with BE who are referred for HGD or EAC. Copyright © 2015. Published by Elsevier Inc.
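A hedged sketch of a DerSimonian-Laird random-effects pooled proportion on the logit scale, with Cochran's Q and I²; the study counts below are invented, and the paper's exact pooling choices may differ.

```python
import numpy as np

events = np.array([5, 12, 3, 9, 7])       # advanced disease found on EUS (synthetic)
totals = np.array([60, 90, 45, 70, 55])

p = (events + 0.5) / (totals + 1.0)       # continuity-corrected proportions
y = np.log(p / (1 - p))                   # study logits
v = 1 / (events + 0.5) + 1 / (totals - events + 0.5)   # approximate variances

w = 1 / v
q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)     # Cochran's Q
df = len(y) - 1
i2 = max(0.0, (q - df) / q) * 100                      # I^2 heterogeneity
tau2 = max(0.0, (q - df) / (w.sum() - np.sum(w**2) / w.sum()))

w_star = 1 / (v + tau2)                   # random-effects weights
pooled_logit = np.sum(w_star * y) / w_star.sum()
pooled = 1 / (1 + np.exp(-pooled_logit))
print(f"pooled proportion: {pooled:.3f}, I^2 = {i2:.0f}%")
```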
Extracting and Converting Quantitative Data into Human Error Probabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuan Q. Tran; Ronald L. Boring; Jeffrey C. Joe
2007-08-01
This paper discusses a proposed method using a combination of advanced statistical approaches (e.g., meta-analysis, regression, structural equation modeling) that will not only convert different empirical results into a common metric for scaling individual PSF effects, but will also examine the complex interrelationships among PSFs. Furthermore, the paper discusses how the derived statistical estimates (i.e., effect sizes) can be mapped onto an HRA method (e.g., SPAR-H) to generate HEPs that can then be used in probabilistic risk assessment (PRA). The paper concludes with a discussion of the benefits of using academic literature in assisting HRA analysts in generating sound HEPs and HRA developers in validating current HRA models and formulating new HRA models.
Results of a joint NOAA/NASA sounder simulation study
NASA Technical Reports Server (NTRS)
Phillips, N.; Susskind, Joel; Mcmillin, L.
1988-01-01
This paper presents the results of a joint NOAA and NASA sounder simulation study in which the accuracies of atmospheric temperature profiles and surface skin temperature measurements retrieved from two sounders were compared: (1) the currently used IR temperature sounder HIRS2 (High-resolution Infrared Radiation Sounder 2); and (2) the recently proposed high-spectral-resolution IR sounder AMTS (Advanced Moisture and Temperature Sounder). Simulations were conducted for both clear and partial cloud conditions. Data were analyzed at NASA using a physical inversion technique and at NOAA using a statistical technique. Results show significant improvement of AMTS compared to HIRS2 for both clear and cloudy conditions. The improvements are indicated by both methods of data analysis, but the physical retrievals outperform the statistical retrievals.
Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.
Li, Zitong; Sillanpää, Mikko J
2015-12-01
Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.
Computational pathology: Exploring the spatial dimension of tumor ecology.
Nawaz, Sidra; Yuan, Yinyin
2016-09-28
Tumors are evolving ecosystems where cancer subclones and the microenvironment interact. This is analogous to interaction dynamics between species in their natural habitats, which is a prime area of study in ecology. Spatial statistics are frequently used in ecological studies to infer complex relations including predator-prey, resource dependency and co-evolution. Recently, the emerging field of computational pathology has enabled high-throughput spatial analysis by using image processing to identify different cell types and their locations within histological tumor samples. We discuss how these data may be analyzed with spatial statistics used in ecology to reveal patterns and advance our understanding of ecological interactions occurring among cancer cells and their microenvironment. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
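A sketch of one simple spatial statistic borrowed from ecology, the Clark-Evans nearest-neighbour index, applied to synthetic cell coordinates: values below 1 suggest clustering, above 1 regularity. This is illustrative only; the review covers a much broader family of spatial methods.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(4)
cells = rng.random((500, 2)) * 1000           # cell centroids in a 1000x1000 image

tree = cKDTree(cells)
d, _ = tree.query(cells, k=2)                 # k=1 is the point itself
mean_nn = d[:, 1].mean()                      # observed mean nearest-neighbour distance
density = len(cells) / (1000.0 * 1000.0)
expected = 0.5 / np.sqrt(density)             # expectation under complete spatial randomness
print("Clark-Evans index:", round(mean_nn / expected, 3))
```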
Desensitized Optimal Filtering and Sensor Fusion Toolkit
NASA Technical Reports Server (NTRS)
Karlgaard, Christopher D.
2015-01-01
Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.
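For orientation, a minimal linear Kalman filter predict/update cycle in numpy, the baseline on which desensitized and sigma-point variants build; the toolkit's own algorithms are more involved and are not reproduced here.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle: state x, covariance P, measurement z."""
    x_pred = F @ x                            # state prediction
    P_pred = F @ P @ F.T + Q                  # covariance prediction
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)     # measurement update
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy 1-D constant-position example with noisy measurements.
x, P = np.array([0.0]), np.array([[1.0]])
for z in [0.9, 1.1, 1.0]:
    x, P = kalman_step(x, P, np.array([z]), np.eye(1), np.eye(1),
                       0.01 * np.eye(1), 0.1 * np.eye(1))
print(x, P)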
Omoumi, Patrick; Babel, Hugo; Jolles, Brigitte M; Favre, Julien
2018-04-16
To test, through tridimensional analysis, whether (1) cartilage thickness at the posterior aspect of femoral condyles differs in knees with medial femorotibial osteoarthritis (OA) compared to non-OA knees; (2) the location of the thickest cartilage at the posterior aspect of femoral condyles differs between OA and non-OA knees. CT arthrograms of knees without radiographic OA (n = 30) and with severe medial femorotibial OA (n = 30) were selected retrospectively from patients over 50 years of age. The groups did not differ in gender, age and femoral size. CT arthrograms were segmented to measure the mean cartilage thickness, the maximal cartilage thickness and its location in a region of interest at the posterior aspect of condyles. For the medial condyle, mean and maximum cartilage thicknesses were statistically significantly higher in OA knees compared to non-OA knees [1.66 vs 1.46 mm (p = 0.03) and 2.56 vs 2.14 mm (p = 0.003), respectively]. The thickest cartilage was located in the half most medial aspect of the posterior medial condyle for both groups, without significant difference between groups. For the lateral condyle, no statistically significant difference between non-OA and OA knees was found (p ≥ 0.17). Cartilage at the posterior aspect of the medial condyle, but not the lateral condyle, is statistically significantly thicker in advanced medial femorotibial OA knees compared to non-OA knees. The thickest cartilage was located in the half most medial aspect of the posterior medial condyle. These results will serve as the basis for future research to determine the histobiological processes involved in this thicker cartilage. Advances in knowledge: This study, through a quantitative tridimensional approach, shows that cartilage at the posterior aspect of the medial condyles is thicker in severe femorotibial osteoarthritic knees compared to non-OA knees. In the posterior aspect of the medial condyle, the thickest cartilage is located in the vicinity of the center of the half most medial aspect of the posterior medial condyle. These results will serve as the basis for future research to determine the histobiological processes involved in this thicker cartilage.
McElreath, Richard; Bell, Adrian V; Efferson, Charles; Lubell, Mark; Richerson, Peter J; Waring, Timothy
2008-11-12
The existence of social learning has been confirmed in diverse taxa, from apes to guppies. In order to advance our understanding of the consequences of social transmission and evolution of behaviour, however, we require statistical tools that can distinguish among diverse social learning strategies. In this paper, we advance two main ideas. First, social learning is diverse, in the sense that individuals can take advantage of different kinds of information and combine them in different ways. Examining learning strategies for different information conditions illuminates the more detailed design of social learning. We construct and analyse an evolutionary model of diverse social learning heuristics, in order to generate predictions and illustrate the impact of design differences on an organism's fitness. Second, in order to eventually escape the laboratory and apply social learning models to natural behaviour, we require statistical methods that do not depend upon tight experimental control. Therefore, we examine strategic social learning in an experimental setting in which the social information itself is endogenous to the experimental group, as it is in natural settings. We develop statistical models for distinguishing among different strategic uses of social information. The experimental data strongly suggest that most participants employ a hierarchical strategy that uses both average observed pay-offs of options as well as frequency information, the same model predicted by our evolutionary analysis to dominate a wide range of conditions.
The taxonomy statistic uncovers novel clinical patterns in a population of ischemic stroke patients.
Tukiendorf, Andrzej; Kaźmierski, Radosław; Michalak, Sławomir
2013-01-01
In this paper, we describe a simple taxonomic approach for clinical data mining elaborated by Marczewski and Steinhaus (M-S), whose performance equals the advanced statistical methodology known as the expectation-maximization (E-M) algorithm. We tested these two methods on a cohort of ischemic stroke patients. The comparison of both methods revealed strong agreement. Direct agreement between M-S and E-M classifications reached 83%, while Cohen's coefficient of agreement was κ = 0.766 (P < 0.0001). The statistical analysis conducted and the outcomes obtained in this paper revealed novel clinical patterns in ischemic stroke patients. The aim of the study was to evaluate the clinical usefulness of Marczewski-Steinhaus' taxonomic approach as a tool for the detection of novel patterns of data in ischemic stroke patients and the prediction of disease outcome. In terms of the identification of fairly frequent types of stroke patients using their age, National Institutes of Health Stroke Scale (NIHSS), and diabetes mellitus (DM) status, when dealing with rough characteristics of patients, four particular types of patients are recognized, which cannot be identified by means of routine clinical methods. Following the obtained taxonomical outcomes, the strong correlation between the health status at moment of admission to emergency department (ED) and the subsequent recovery of patients is established. Moreover, popularization and simplification of the ideas of advanced mathematicians may provide an unconventional explorative platform for clinical problems.
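A sketch of the E-M side of such a comparison: Gaussian-mixture clustering fitted by EM in scikit-learn, with agreement against a second partition measured by the adjusted Rand index (kappa on cluster labels would first require a label-alignment step). Data and the stand-in second partition are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(4, 1, (100, 3))])

em_labels = GaussianMixture(n_components=2, random_state=0).fit_predict(X)
other_labels = (X[:, 0] > 2).astype(int)      # stand-in for an M-S taxonomy
print("agreement (ARI):", round(adjusted_rand_score(em_labels, other_labels), 3))
```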
NASA Astrophysics Data System (ADS)
Hoell, Simon; Omenzetter, Piotr
2017-07-01
Considering jointly damage sensitive features (DSFs) of signals recorded by multiple sensors, applying advanced transformations to these DSFs and assessing systematically their contribution to damage detectability and localisation can significantly enhance the performance of structural health monitoring systems. This philosophy is explored here for partial autocorrelation coefficients (PACCs) of acceleration responses. They are interrogated with the help of the linear discriminant analysis based on the Fukunaga-Koontz transformation using datasets of the healthy and selected reference damage states. Then, a simple but efficient fast forward selection procedure is applied to rank the DSF components with respect to statistical distance measures specialised for either damage detection or localisation. For the damage detection task, the optimal feature subsets are identified based on the statistical hypothesis testing. For damage localisation, a hierarchical neuro-fuzzy tool is developed that uses the DSF ranking to establish its own optimal architecture. The proposed approaches are evaluated experimentally on data from non-destructively simulated damage in a laboratory scale wind turbine blade. The results support our claim of being able to enhance damage detectability and localisation performance by transforming and optimally selecting DSFs. It is demonstrated that the optimally selected PACCs from multiple sensors or their Fukunaga-Koontz transformed versions can not only improve the detectability of damage via statistical hypothesis testing but also increase the accuracy of damage localisation when used as inputs into a hierarchical neuro-fuzzy network. Furthermore, the computational effort of employing these advanced soft computing models for damage localisation can be significantly reduced by using transformed DSFs.
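A sketch of the feature-extraction step named above, computing partial autocorrelation coefficients (PACCs) from an acceleration record with statsmodels; the sensor signal is simulated, and the paper's transformation and selection stages are not reproduced.

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(6)
accel = rng.normal(size=2048)
for k in range(1, 2048):                      # crude AR(1)-like structural response
    accel[k] += 0.6 * accel[k - 1]

paccs = pacf(accel, nlags=10)                 # damage-sensitive feature vector
print(np.round(paccs[1:], 3))                 # the lag-0 entry is always 1
```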
Study of advanced fuel system concepts for commercial aircraft
NASA Technical Reports Server (NTRS)
Coffinberry, G. A.
1985-01-01
An analytical study was performed in order to assess relative performance and economic factors involved with alternative advanced fuel systems for future commercial aircraft operating with broadened property fuels. The DC-10-30 wide-body tri-jet aircraft and the CF6-80X engine were used as a baseline design for the study. Three advanced systems were considered and were specifically aimed at addressing freezing point, thermal stability and lubricity fuel properties. Actual DC-10-30 routes and flight profiles were simulated by computer modeling and resulted in prediction of aircraft and engine fuel system temperatures during a nominal flight and during statistical one-day-per-year cold and hot flights. Emergency conditions were also evaluated. Fuel consumption and weight and power extraction results were obtained. An economic analysis was performed for new aircraft and systems. Advanced system means for fuel tank heating included fuel recirculation loops using engine lube heat and generator heat. Environmental control system bleed air heat was used for tank heating in a water recirculation loop. The results showed that fundamentally all of the three advanced systems are feasible but vary in their degree of compatibility with broadened-property fuel.
Effects of bite-jumping appliances on mandibular advancement in growing rats: A radiographic study
Oksayan, Ridvan; Sokucu, Oral; Ucuncu, Neslihan
2014-01-01
Objective: The aim was to evaluate the effects of the use of mandibular advancement appliances on mandibular growth in growing rats. Materials and Methods: Twenty-four 8-week-old male Wistar albino rats were randomly divided into two experimental groups (12 rats each): Group I was a control group, and Group II was the mandibular advancement appliance group. A functional bite-jumping appliance was used in Group II to promote mandibular advancement. Anatomical changes in the condyle and mandible were evaluated by comparing radiographic results from before and after the study, with angular and linear measurements. Friedman and Mann-Whitney U-tests were used in statistical analysis. Results: According to the radiographic results, the growth of the mandible and condyle was significantly greater in Group II than in Group I for the length of the condylar process (A-B) and the distance from condyle to menton (A-D) (P < 0.05). In addition, Group I showed greater mandibular base growth than did Group II (P < 0.05). Conclusions: We conclude that the use of an intraoral bite-jumping appliance can stimulate condylar growth and increase sagittal mandibular advancement in growing rats. PMID:25202205
Murchie, Brent; Tandon, Kanwarpreet; Hakim, Seifeldin; Shah, Kinchit; O'Rourke, Colin; Castro, Fernando J
2017-04-01
Colorectal cancer (CRC) screening guidelines likely over-generalize CRC risk; 35% of Americans are not up to date with screening, and the incidence of CRC in younger patients is growing. We developed a practical prediction model for high-risk colon adenomas in an average-risk population, including an expanded definition of high-risk polyps (≥3 nonadvanced adenomas), exposing higher than average-risk patients. We also compared results with previously created calculators. Patients aged 40 to 59 years undergoing first-time average-risk screening or diagnostic colonoscopies were evaluated. Risk calculators for advanced adenomas and high-risk adenomas were created based on age, body mass index, sex, race, and smoking history. Previously established calculators with similar risk factors were selected for comparison of concordance statistic (c-statistic) and external validation. A total of 5063 patients were included. Advanced adenomas and high-risk adenomas were seen in 5.7% and 7.4% of the patient population, respectively. The c-statistic for our calculator was 0.639 for the prediction of advanced adenomas, and 0.650 for high-risk adenomas. When applied to our population, all previous models had lower c-statistic results although one performed similarly. Our model compares favorably to previously established prediction models. Age and body mass index were used as continuous variables, likely improving the c-statistic. It also reports absolute predictive probabilities of advanced and high-risk polyps, allowing for more individualized risk assessment of CRC.
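A hedged sketch of building a risk calculator and reporting its c-statistic (equivalently, the ROC AUC) with scikit-learn; the predictors below are synthetic stand-ins for age, BMI, and smoking history, and the coefficients are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 5000
X = np.column_stack([
    rng.uniform(40, 60, n),           # age (continuous)
    rng.normal(28, 5, n),             # BMI (continuous)
    rng.integers(0, 2, n),            # sex
    rng.integers(0, 2, n),            # smoking history
])
logit = -6 + 0.05 * X[:, 0] + 0.05 * X[:, 1] + 0.4 * X[:, 3]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]
print("c-statistic:", round(roc_auc_score(y, risk), 3))
```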
ERIC Educational Resources Information Center
McCulloch, Ryan Sterling
2017-01-01
The role of any statistics course is to increase the understanding and comprehension of statistical concepts and those goals can be achieved via both theoretical instruction and statistical software training. However, many introductory courses either forego advanced software usage, or leave its use to the student as a peripheral activity. The…
Advanced Placement® Statistics Students' Education Choices after High School. Research Notes. RN-38
ERIC Educational Resources Information Center
Patterson, Brian F.
2009-01-01
Taking the AP Statistics course and exam does not appear to be related to greater interest in the statistical sciences. Despite this finding, with respect to deciding whether to take further statistics course work and majoring in statistics, students appear to feel prepared for, but not interested in, further study. There is certainly more…
Computer aided manual validation of mass spectrometry-based proteomic data.
Curran, Timothy G; Bryson, Bryan D; Reigelhaupt, Michael; Johnson, Hannah; White, Forest M
2013-06-15
Advances in mass spectrometry-based proteomic technologies have increased the speed of analysis and the depth provided by a single analysis. Computational tools to evaluate the accuracy of peptide identifications from these high-throughput analyses have not kept pace with technological advances; currently the most common quality evaluation methods are based on statistical analysis of the likelihood of false positive identifications in large-scale data sets. While helpful, these calculations do not consider the accuracy of each identification, thus creating a precarious situation for biologists relying on the data to inform experimental design. Manual validation is the gold standard approach to confirm accuracy of database identifications, but is extremely time-intensive. To palliate the increasing time required to manually validate large proteomic datasets, we provide computer aided manual validation software (CAMV) to expedite the process. Relevant spectra are collected, catalogued, and pre-labeled, allowing users to efficiently judge the quality of each identification and summarize applicable quantitative information. CAMV significantly reduces the burden associated with manual validation and will hopefully encourage broader adoption of manual validation in mass spectrometry-based proteomics. Copyright © 2013 Elsevier Inc. All rights reserved.
Use of Statistical Analyses in the Ophthalmic Literature
Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.
2014-01-01
Purpose To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design Cross-sectional study Methods All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in retina and glaucoma subspecialties showed a tendency for using more complex analysis when compared to cornea. Conclusions Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature. The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977
Smith, Ashlee L.; Sun, Mai; Bhargava, Rohit; Stewart, Nicolas A.; Flint, Melanie S.; Bigbee, William L.; Krivak, Thomas C.; Strange, Mary A.; Cooper, Kristine L.; Zorn, Kristin K.
2013-01-01
Objective: The biology of high grade serous ovarian carcinoma (HGSOC) is poorly understood. Little has been reported on intratumoral homogeneity or heterogeneity of primary HGSOC tumors and their metastases. We evaluated the global protein expression profiles of paired primary and metastatic HGSOC from formalin-fixed, paraffin-embedded (FFPE) tissue samples. Methods: After IRB approval, six patients with advanced HGSOC were identified with tumor in both ovaries at initial surgery. Laser capture microdissection (LCM) was used to extract tumor for protein digestion. Peptides were extracted and analyzed by reversed-phase liquid chromatography coupled to a linear ion trap mass spectrometer. Tandem mass spectra were searched against the UniProt human protein database. Differences in protein abundance between samples were assessed and analyzed by Ingenuity Pathway Analysis software. Immunohistochemistry (IHC) for select proteins from the original and an additional validation set of five patients was performed. Results: Unsupervised clustering of the abundance profiles placed the paired specimens adjacent to each other. IHC H-score analysis of the validation set revealed a strong correlation between paired samples for all proteins. For the similarly expressed proteins, the estimated correlation coefficients in two of three experimental samples and all validation samples were statistically significant (p < 0.05). The estimated correlation coefficients in the experimental sample proteins classified as differentially expressed were not statistically significant. Conclusion: A global proteomic screen of primary HGSOC tumors and their metastatic lesions identifies tumoral homogeneity and heterogeneity and provides preliminary insight into these protein profiles and the cellular pathways they constitute. PMID:28250404
Universal Recurrence Time Statistics of Characteristic Earthquakes
NASA Astrophysics Data System (ADS)
Goltz, C.; Turcotte, D. L.; Abaimov, S.; Nadeau, R. M.
2006-12-01
Characteristic earthquakes are defined to occur quasi-periodically on major faults. Do recurrence time statistics of such earthquakes follow a particular statistical distribution? If so, which one? The answer is fundamental and has important implications for hazard assessment. The problem cannot be solved by comparing the goodness of statistical fits as the available sequences are too short. The Parkfield sequence of M ≈ 6 earthquakes, one of the most extensive reliable data sets available, has grown to merely seven events with the last earthquake in 2004, for example. Recently, however, advances in seismological monitoring and improved processing methods have unveiled so-called micro-repeaters, micro-earthquakes which recur exactly in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Micro-repeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. Due to their recent discovery, however, available sequences contain less than 20 events at present. In this paper we present results for the analysis of recurrence times for several micro-repeater sequences from Parkfield and adjacent regions. To improve the statistical significance of our findings, we combine several sequences into one by rescaling the individual sets by their respective mean recurrence intervals and Weibull exponents. This novel approach of rescaled combination yields the most extensive data set possible. We find that the resulting statistics can be fitted well by an exponential distribution, confirming the universal applicability of the Weibull distribution to characteristic earthquakes. A similar result is obtained from rescaled combination, however, with regard to the lognormal distribution.
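A sketch of the rescaled-combination idea under simplifying assumptions: normalize each sequence's recurrence intervals by its own mean, pool them, and fit a Weibull distribution with scipy. The sequences below are simulated, not the Parkfield micro-repeater catalogues.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(8)
sequences = [weibull_min.rvs(1.8, scale=s, size=15, random_state=rng)
             for s in (0.5, 2.0, 7.0)]        # three faults, different mean intervals

pooled = np.concatenate([seq / seq.mean() for seq in sequences])
shape, loc, scale = weibull_min.fit(pooled, floc=0)
print(f"fitted Weibull shape: {shape:.2f}, scale: {scale:.2f}")
```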
ERIC Educational Resources Information Center
DeMark, Sarah F.; Behrens, John T.
2004-01-01
Whereas great advances have been made in the statistical sophistication of assessments in terms of evidence accumulation and task selection, relatively little statistical work has explored the possibility of applying statistical techniques to data for the purposes of determining appropriate domain understanding and to generate task-level scoring…
Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques
NASA Astrophysics Data System (ADS)
Mishra, D.; Goyal, P.
2014-12-01
Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase in harmful air pollutants in the ambient atmosphere. In this study, different statistical as well as artificial intelligence techniques are used for forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods suffer from disadvantages: they provide limited accuracy because they cannot predict the extreme points, i.e., the pollution maximum and minimum cut-offs cannot be determined using such an approach. With the advancement in technology and research, an alternative to these traditional methods has been proposed: the coupling of statistical techniques with artificial intelligence (AI) can be used for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting of air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for the forecasting of air pollutants over an urban area.
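A sketch of the four evaluation measures named above (R, NMSE, FB, IOA), computed with numpy on paired observed/predicted concentrations; the sample values are illustrative.

```python
import numpy as np

def evaluate(obs, pred):
    r = np.corrcoef(obs, pred)[0, 1]                       # correlation coefficient
    nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
    fb = 2 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())
    ioa = 1 - np.sum((obs - pred) ** 2) / np.sum(          # Willmott's index
        (np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return {"R": r, "NMSE": nmse, "FB": fb, "IOA": ioa}

obs = np.array([80, 95, 110, 60, 75.0])                    # e.g. daily PM10, ug/m3
pred = np.array([85, 90, 100, 70, 72.0])
print({k: round(v, 3) for k, v in evaluate(obs, pred).items()})
```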
Quality of statistical reporting in developmental disability journals.
Namasivayam, Aravind K; Yan, Tina; Wong, Wing Yiu Stephanie; van Lieshout, Pascal
2015-12-01
Null hypothesis significance testing (NHST) dominates quantitative data analysis, but its use is controversial and has been heavily criticized. The American Psychological Association has advocated the reporting of effect sizes (ES), confidence intervals (CIs), and statistical power analysis to complement NHST results to provide a more comprehensive understanding of research findings. The aim of this paper is to carry out a sample survey of statistical reporting practices in two journals with the highest h5-index scores in the areas of developmental disability and rehabilitation. Using a checklist that includes critical recommendations by American Psychological Association, we examined 100 randomly selected articles out of 456 articles reporting inferential statistics in the year 2013 in the Journal of Autism and Developmental Disorders (JADD) and Research in Developmental Disabilities (RDD). The results showed that for both journals, ES were reported only half the time (JADD 59.3%; RDD 55.87%). These findings are similar to psychology journals, but are in stark contrast to ES reporting in educational journals (73%). Furthermore, a priori power and sample size determination (JADD 10%; RDD 6%), along with reporting and interpreting precision measures (CI: JADD 13.33%; RDD 16.67%), were the least reported metrics in these journals, but not dissimilar to journals in other disciplines. To advance the science in developmental disability and rehabilitation and to bridge the research-to-practice divide, reforms in statistical reporting, such as providing supplemental measures to NHST, are clearly needed.
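As a hedged illustration of the complements to NHST recommended above: Cohen's d with a normal-approximation confidence interval, and an a priori sample-size calculation via statsmodels. All numbers are synthetic.

```python
import numpy as np
from statsmodels.stats.power import TTestIndPower

def cohens_d(a, b):
    """Standardized mean difference with a pooled SD."""
    sp = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                 / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / sp

rng = np.random.default_rng(9)
a, b = rng.normal(0.5, 1, 40), rng.normal(0.0, 1, 40)
d = cohens_d(a, b)
# Standard large-sample approximation for the SE of d.
se = np.sqrt((len(a) + len(b)) / (len(a) * len(b)) + d**2 / (2 * (len(a) + len(b))))
print(f"d = {d:.2f}, 95% CI = [{d - 1.96*se:.2f}, {d + 1.96*se:.2f}]")

n = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"n per group for d = 0.5 at 80% power: {np.ceil(n):.0f}")
```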
Nour-Eldein, Hebatallah
2016-01-01
Given the limited statistical knowledge of most physicians, it is not uncommon to find statistical errors in research articles. To determine the statistical methods and to assess the statistical errors in family medicine (FM) research articles that were published between 2010 and 2014. This was a cross-sectional study. All 66 FM research articles that were published over 5 years by FM authors with affiliation to Suez Canal University were screened by the researcher between May and August 2015. Types and frequencies of statistical methods were reviewed in all 66 FM articles. All 60 articles with identified inferential statistics were examined for statistical errors and deficiencies. A comprehensive 58-item checklist based on statistical guidelines was used to evaluate the statistical quality of FM articles. Inferential methods were recorded in 62/66 (93.9%) of FM articles. Advanced analyses were used in 29/66 (43.9%). Contingency tables 38/66 (57.6%), regression (logistic, linear) 26/66 (39.4%), and t-test 17/66 (25.8%) were the most commonly used inferential tests. Within 60 FM articles with identified inferential statistics, no prior sample size 19/60 (31.7%), application of wrong statistical tests 17/60 (28.3%), incomplete documentation of statistics 59/60 (98.3%), reporting P value without test statistics 32/60 (53.3%), no reporting confidence interval with effect size measures 12/60 (20.0%), use of mean (standard deviation) to describe ordinal/nonnormal data 8/60 (13.3%), and errors related to interpretation were mainly for conclusions without support by the study data 5/60 (8.3%). Inferential statistics were used in the majority of FM articles. Data analysis and reporting statistics are areas for improvement in FM research articles.
Tsai, Chu-Lin; Camargo, Carlos A
2009-09-01
Acute exacerbations of chronic disease are ubiquitous in clinical medicine, and thus far, there has been a paucity of integrated methodological discussion on this phenomenon. We use acute exacerbations of chronic obstructive pulmonary disease as an example to emphasize key epidemiological and statistical issues for this understudied field in clinical epidemiology. Directed acyclic graphs are a useful epidemiological tool to explain the differential effects of risk factor on health outcomes in studies of acute and chronic phases of disease. To study the pathogenesis of acute exacerbations of chronic disease, case-crossover design and time-series analysis are well-suited study designs to differentiate acute from chronic effects. Modeling changes over time and setting appropriate thresholds are important steps to separate acute from chronic phases of disease in serial measurements. In statistical analysis, acute exacerbations are recurrent events, and some individuals are more prone to recurrences than others. Therefore, appropriate statistical modeling should take into account intraindividual dependence. Finally, we recommend the use of "event-based" number needed to treat (NNT) to prevent a single exacerbation instead of traditional patient-based NNT. Addressing these methodological challenges will advance research quality in acute on chronic disease epidemiology.
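A minimal sketch of the "event-based" NNT suggested above: instead of the proportion of patients with any event, use exacerbation rates per person-year, so the result is person-time of treatment needed to prevent one exacerbation. The rates below are hypothetical.

```python
def event_based_nnt(rate_control: float, rate_treated: float) -> float:
    """Rates are exacerbations per person-year; returns person-years of
    treatment needed to avert one exacerbation."""
    return 1.0 / (rate_control - rate_treated)

print(event_based_nnt(1.4, 1.1))  # ~3.3 person-years per exacerbation averted
```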
DnaSAM: Software to perform neutrality testing for large datasets with complex null models.
Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B
2010-05-01
Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments along with the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model that are stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
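A sketch of the general Monte Carlo testing scheme involved: compare an observed neutrality statistic against a null distribution of simulated values (in DnaSAM's case, generated by ms). The null values here are placeholders, not coalescent simulations.

```python
import numpy as np

rng = np.random.default_rng(10)
observed_d = -1.9                              # e.g. an observed Tajima's D
null_d = rng.normal(0.0, 1.0, 10_000)          # stand-in for simulated null statistics

# Two-sided Monte Carlo p-value with the standard +1 correction.
p = (np.sum(np.abs(null_d) >= abs(observed_d)) + 1) / (null_d.size + 1)
print(f"Monte Carlo p = {p:.4f}")
```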
P-Value Club: Teaching Significance Level on the Dance Floor
ERIC Educational Resources Information Center
Gray, Jennifer
2010-01-01
Courses: Beginning research methods and statistics courses, as well as advanced communication courses that require reading research articles and completing research projects involving statistics. Objective: Students will understand the difference between significant and nonsignificant statistical results based on p-value.
Hayes, Andrew F; Rockwood, Nicholas J
2017-11-01
There have been numerous treatments in the clinical research literature about various design, analysis, and interpretation considerations when testing hypotheses about mechanisms and contingencies of effects, popularly known as mediation and moderation analysis. In this paper we address the practice of mediation and moderation analysis using linear regression in the pages of Behaviour Research and Therapy and offer some observations and recommendations, debunk some popular myths, describe some new advances, and provide an example of mediation, moderation, and their integration as conditional process analysis using the PROCESS macro for SPSS and SAS. Our goal is to nudge clinical researchers away from historically significant but increasingly old school approaches toward modifications, revisions, and extensions that characterize more modern thinking about the analysis of the mechanisms and contingencies of effects. Copyright © 2016 Elsevier Ltd. All rights reserved.
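A generic sketch of a simple mediation analysis with a percentile bootstrap of the indirect effect a*b, in the spirit of (but not reproducing) the PROCESS macro; data and effect sizes are synthetic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 300
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)               # mediator model: a path
y = 0.4 * m + 0.1 * x + rng.normal(size=n)     # outcome model: b and c' paths

def indirect(idx):
    """Indirect effect a*b estimated on a (resampled) index set."""
    a = sm.OLS(m[idx], sm.add_constant(x[idx])).fit().params[1]
    b = sm.OLS(y[idx], sm.add_constant(
        np.column_stack([m[idx], x[idx]]))).fit().params[1]
    return a * b

boots = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"indirect effect ab: {indirect(np.arange(n)):.3f}, "
      f"95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```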
Fatigue criterion to system design, life and reliability
NASA Technical Reports Server (NTRS)
Zaretsky, E. V.
1985-01-01
A generalized methodology for structural life prediction, design, and reliability based upon a fatigue criterion is advanced. The life prediction methodology is based in part on work of W. Weibull and G. Lundberg and A. Palmgren. The approach incorporates the computed life of elemental stress volumes of a complex machine element to predict system life. The results of coupon fatigue testing can be incorporated into the analysis allowing for life prediction and component or structural renewal rates with reasonable statistical certainty.
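A sketch of the Weibull-based combination idea under simplifying assumptions: for a series system of elements with two-parameter Weibull lives and a common slope e, the system life at 90% reliability follows from summing the elemental lives raised to the power -e. The element lives and slope below are invented.

```python
import numpy as np

element_L10 = np.array([900.0, 1200.0, 700.0])   # element lives (hours) at 90% survival
e = 1.5                                          # assumed Weibull slope

# Series system: survival probabilities multiply, so the (L/L10_i)^e terms add;
# setting the system survival to 0.9 gives L_sys = (sum L10_i^-e)^(-1/e).
L_sys = np.sum(element_L10 ** (-e)) ** (-1.0 / e)
print(f"system L10 life: {L_sys:.0f} hours")
```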
Analysis of Emergency Department Nurse Attitudes Toward Caring for Ethnically Diverse Patients
1997-12-31
Even with recent advances in cultural care, unethical practices such as sexism and racism remain a problem in the U.S. health care system. The study is based on King's Theory of Goal Attainment (Rooda, 1992).
Morgan, Martin; Anders, Simon; Lawrence, Michael; Aboyoun, Patrick; Pagès, Hervé; Gentleman, Robert
2009-01-01
Summary: ShortRead is a package for input, quality assessment, manipulation and output of high-throughput sequencing data. ShortRead is provided in the R and Bioconductor environments, allowing ready access to additional facilities for advanced statistical analysis, data transformation, visualization and integration with diverse genomic resources. Availability and Implementation: This package is implemented in R and available at the Bioconductor web site; the package contains a ‘vignette’ outlining typical work flows. Contact: mtmorgan@fhcrc.org PMID:19654119
ERIC Educational Resources Information Center
Earl, Lorna L.
This series of manuals describing and illustrating the Statistical Package for the Social Sciences (SPSS) was planned as a self-teaching instrument, beginning with the basics and progressing to an advanced level. Information on what the searcher must know to define the data and write a program for preliminary analysis is contained in manual 1,…
Breath Analysis as a Potential and Non-Invasive Frontier in Disease Diagnosis: An Overview
Pereira, Jorge; Porto-Figueira, Priscilla; Cavaco, Carina; Taunk, Khushman; Rapole, Srikanth; Dhakne, Rahul; Nagarajaram, Hampapathalu; Câmara, José S.
2015-01-01
Currently, a small number of diseases, particularly cardiovascular (CVDs), oncologic (ODs), neurodegenerative (NDDs), chronic respiratory diseases, as well as diabetes, form a severe burden to most of the countries worldwide. Hence, there is an urgent need for development of efficient diagnostic tools, particularly those enabling reliable detection of diseases, at their early stages, preferably using non-invasive approaches. Breath analysis is a non-invasive approach relying only on the characterisation of volatile composition of the exhaled breath (EB) that in turn reflects the volatile composition of the bloodstream and airways and therefore the status and condition of the whole organism metabolism. Advanced sampling procedures (solid-phase and needle traps microextraction) coupled with modern analytical technologies (proton transfer reaction mass spectrometry, selected ion flow tube mass spectrometry, ion mobility spectrometry, e-noses, etc.) allow the characterisation of EB composition to an unprecedented level. However, a key challenge in EB analysis is the proper statistical analysis and interpretation of the large and heterogeneous datasets obtained from EB research. There is no standard statistical framework/protocol yet available in literature that can be used for EB data analysis towards discovery of biomarkers for use in a typical clinical setup. Nevertheless, EB analysis has immense potential towards development of biomarkers for the early disease diagnosis of diseases. PMID:25584743
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
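A sketch of the PCA-summary idea described above, with a stand-in matrix whose rows play the role of plausible effective-area curves (all values hypothetical):

    set.seed(2)
    n <- 1000; m <- 200                            # 1000 sampled curves, 200 energy bins
    base <- sin(seq(0, pi, length.out = m))        # stand-in mean effective area
    curves <- t(replicate(n, base * (1 + 0.05 * rnorm(1)) + 0.01 * rnorm(m)))
    pc <- prcomp(curves, center = TRUE)
    k <- 5                                         # a few leading components suffice
    draw <- pc$center + pc$rotation[, 1:k] %*% (pc$sdev[1:k] * rnorm(k))
    # 'draw' is a cheap, plausible calibration realization, e.g. for use inside MCMC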
Adaptive and Optimal Control of Stochastic Dynamical Systems
2015-09-14
Advances in Statistics, Probability and Actuarial Sciences, Vol. 1, World Scientific, 2012, 451-463. [4] T. E. Duncan and B. Pasik-Duncan, A...S. N. Cohen, T. K. Siu and H. Yang) Advances in Statistics, Probability and Actuarial Sciences, Vol. 1, World Scientific, 2012, 451-463. 4. T. E...games with general noise processes, Models and Methods in Economics and Management Science: Essays in Honor of Charles S. Tapiero, (eds. F. El
Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia
2016-01-01
Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet that focuses exclusively on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which can be categorized into three classes: (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as a web-based calculator with a user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes convenient and straightforward calculation of statistics in research. Additionally, we provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.
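The class of statistics PopSc targets can be illustrated directly from allele frequencies; a hedged sketch in base R of expected heterozygosity and Wright's FST for two populations at one biallelic locus (frequencies hypothetical):

    he <- function(p) 1 - sum(p^2)     # expected heterozygosity from allele frequencies
    p1 <- c(0.7, 0.3)                  # population 1
    p2 <- c(0.4, 0.6)                  # population 2
    Hs <- mean(c(he(p1), he(p2)))      # mean within-population diversity
    Ht <- he((p1 + p2) / 2)            # total diversity from pooled frequencies
    Fst <- (Ht - Hs) / Ht              # genetic differentiation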
Yin, X-X; Zhang, Y; Cao, J; Wu, J-L; Hadjiloucas, S
2016-12-01
We provide a comprehensive account of recent advances in biomedical image analysis and classification from two complementary imaging modalities: terahertz (THz) pulse imaging and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). The work aims to highlight underlying commonalities in both data structures so that a common multi-channel data fusion framework can be developed. Signal pre-processing in both datasets is discussed briefly, taking into consideration advances in multi-resolution analysis and model-based fractional order calculus system identification. Developments in statistical signal processing using principal component and independent component analysis are also considered. These algorithms have been developed independently by the THz-pulse imaging and DCE-MRI communities, and there is scope to place them in a common multi-channel framework to provide better software standardization at the pre-processing de-noising stage. A comprehensive discussion of feature selection strategies is also provided and the importance of preserving textural information is highlighted. Feature extraction and classification methods taking into consideration recent advances in support vector machine (SVM) and extreme learning machine (ELM) classifiers and their complex extensions are presented. An outlook on Clifford algebra classifiers and deep learning techniques suitable to both types of datasets is also provided. The work points toward the direction of developing a new unified multi-channel signal processing framework for biomedical image analysis that will explore synergies from both sensing modalities for inferring disease proliferation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Marchetti, C; De Felice, F; Di Pinto, A; D'Oria, O; Aleksa, N; Musella, A; Palaia, I; Muzii, L; Tombolini, V; Benedetti Panici, P
2018-05-01
The use of dose-dense weekly chemotherapy in the management of advanced ovarian cancer (OC) remains controversial. The aim of this meta-analysis was to evaluate the efficacy of the dose-dense regimen in improving clinical outcomes in OC patients, with the inclusion of new trials. For this updated meta-analysis, the PubMed Medline and Scopus databases and meeting proceedings were searched for eligible studies, with the limitation of randomized controlled trials, comparing dose-dense chemotherapy versus standard treatment. Trials were grouped into two types of dose-dense chemotherapy: weekly dose-dense (both paclitaxel and carboplatin administered weekly) and semi-weekly dose-dense (weekly paclitaxel and every-3-weeks carboplatin). Data were extracted independently and were analyzed using RevMan statistical software version 5.3 (http://www.cochrane.org). The primary end-point was progression-free survival (PFS). Four randomized controlled trials comprising 3698 patients were identified as eligible. Dose-dense chemotherapy did not have a significant benefit on PFS (HR 0.92, 95% CI 0.81-1.04, p = 0.20). When the analysis was restricted to weekly or semi-weekly dose-dense data, the absence of a significant difference between the dose-dense and standard regimens was confirmed (HR 1.01, 95% CI 0.93-1.10 and HR 0.82, 95% CI 0.63-1.08, respectively). In the absence of PFS superiority of the dose-dense schedule, the every-3-weeks schedule should remain the standard of care for advanced OC. Copyright © 2018 Elsevier B.V. All rights reserved.
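The pooling itself is not RevMan-specific; a sketch of an equivalent DerSimonian-Laird random-effects analysis with the R package metafor, using hypothetical trial-level hazard ratios:

    library(metafor)
    hr <- c(0.92, 1.01, 0.82, 0.99)                   # hypothetical per-trial HRs for PFS
    se <- c(0.07, 0.05, 0.14, 0.08)                   # hypothetical SEs of log(HR)
    res <- rma(yi = log(hr), sei = se, method = "DL") # DerSimonian-Laird random effects
    exp(c(res$b, res$ci.lb, res$ci.ub))               # pooled HR with 95% CI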
Buciński, Adam; Marszałł, Michał Piotr; Krysiński, Jerzy; Lemieszek, Andrzej; Załuski, Jerzy
2010-07-01
Hodgkin's lymphoma is one of the most curable malignancies, and most patients achieve a lasting complete remission. In this study, artificial neural network (ANN) analysis was shown to identify significant prognostic factors with regard to 5-year recurrence after lymphoma treatment. Data from 114 patients treated for Hodgkin's disease were available for evaluation and comparison. A total of 31 variables were subjected to ANN analysis. The ANN approach, as an advanced multivariate data processing method, was shown to provide objective prognostic data. Some of these prognostic factors are consistent or even identical to the factors evaluated earlier by other statistical methods.
Discriminant Analysis of Raman Spectra for Body Fluid Identification for Forensic Purposes
Sikirzhytski, Vitali; Virkler, Kelly; Lednev, Igor K.
2010-01-01
Detection and identification of blood, semen and saliva stains, the most common body fluids encountered at a crime scene, are very important aspects of forensic science today. This study targets the development of a nondestructive, confirmatory method for body fluid identification based on Raman spectroscopy coupled with advanced statistical analysis. Dry traces of blood, semen and saliva obtained from multiple donors were probed using a confocal Raman microscope with a 785-nm excitation wavelength under controlled laboratory conditions. Results demonstrated the capability of Raman spectroscopy to identify an unknown substance to be semen, blood or saliva with high confidence. PMID:22319277
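A minimal sketch of the discriminant step, assuming the spectra have already been reduced to a few principal-component scores (all data hypothetical):

    library(MASS)
    set.seed(3)
    scores <- rbind(matrix(rnorm(60, 0), 30), matrix(rnorm(60, 2), 30),
                    matrix(rnorm(60, 4), 30))            # 90 spectra x 2 PC scores
    fluid <- factor(rep(c("blood", "semen", "saliva"), each = 30))
    fit <- lda(scores, grouping = fluid)                 # linear discriminant analysis
    table(predicted = predict(fit)$class, truth = fluid) # resubstitution confusion matrix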
The relationship between apical root resorption and orthodontic tooth movement in growing subjects.
Xu, Tianmin; Baumrind, S
2002-07-01
To investigate the relationship between apical root resorption and orthodontic tooth movement in growing subjects. 58 growing subjects were randomly selected into the study sample, and another 40 untreated cases were used as controls. The apical resorption of the upper central incisors was measured on periapical films, and incisor displacement was measured on lateral cephalograms. Multiple linear regression analysis was used to examine the relationship between root resorption and the displacement of the upper incisor apex in each of four directions (retraction, advancement, intrusion and extrusion). Statistically significant negative associations were found between resorption and both intrusion (P < 0.001) and extrusion (P < 0.05), but no significant association was found between resorption and either retraction or advancement. The regression analysis implied an average of 2.29 mm resorption in the absence of apical displacement. The likelihood that the magnitude of displacement of the incisor root is positively associated with root resorption in the population of treated growing subjects is very small.
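A sketch of this regression in R on hypothetical per-subject measurements; the fitted intercept plays the role of the average resorption expected with no apical displacement:

    set.seed(4)
    n <- 58
    apex <- data.frame(retraction = runif(n, 0, 6), advancement = runif(n, 0, 2),
                       intrusion = runif(n, 0, 3), extrusion = runif(n, 0, 3))
    apex$resorption <- 2.3 - 0.3 * apex$intrusion - 0.2 * apex$extrusion + rnorm(n, sd = 0.5)
    fit <- lm(resorption ~ retraction + advancement + intrusion + extrusion, data = apex)
    summary(fit)    # per-direction coefficients and p-values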
Single-Case Experimental Designs to Evaluate Novel Technology-Based Health Interventions
Cassidy, Rachel N; Raiff, Bethany R
2013-01-01
Technology-based interventions to promote health are expanding rapidly. Assessing the preliminary efficacy of these interventions can be achieved by employing single-case experiments (sometimes referred to as n-of-1 studies). Although single-case experiments are often misunderstood, they offer excellent solutions to address the challenges associated with testing new technology-based interventions. This paper provides an introduction to single-case techniques and highlights advances in developing and evaluating single-case experiments, which help ensure that treatment outcomes are reliable, replicable, and generalizable. These advances include quality control standards, heuristics to guide visual analysis of time-series data, effect size calculations, and statistical analyses. They also include experimental designs to isolate the active elements in a treatment package and to assess the mechanisms of behavior change. The paper concludes with a discussion of issues related to the generality of findings derived from single-case research and how generality can be established through replication and through analysis of behavioral mechanisms. PMID:23399668
Hosseinzadeh, Majid; Bidhendi, Gholamreza Nabi; Torabian, Ali; Mehrdadi, Naser; Pourabdullah, Mehdi
2015-09-01
This paper introduces a new hybrid electro membrane bioreactor (HEMBR) for reverse osmosis (RO) pretreatment and advanced treatment of effluent, created by integrating electrical coagulation (EC) with a membrane bioreactor (MBR); its performance was compared with a conventional MBR. Experimental results and their statistical analysis showed removal efficiency for suspended solids (SS) of almost 100% for both reactors. HEMBR removal of chemical oxygen demand (COD) improved by 4%, and membrane fouling was alleviated according to transmembrane pressure (TMP). The average silt density index (SDI) of HEMBR permeate samples was slightly better, indicating less RO membrane fouling. Moreover, based on the SVI comparison of the two reactors' biomass samples, HEMBR showed better settling characteristics, which improved the dewaterability and filterability of the sludge. Analysis of the changes in the membrane surfaces and the cake layer formed over them, through field emission scanning electron microscopy (FESEM) and X-ray fluorescence spectrometry (XRF), is also discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.
Fast gene ontology based clustering for microarray experiments.
Ovaska, Kristian; Laakso, Marko; Hautaniemi, Sampsa
2008-11-21
Analysis of a microarray experiment often results in a list of hundreds of disease-associated genes. In order to suggest common biological processes and functions for these genes, Gene Ontology annotations with statistical testing are widely used. However, these analyses can produce a very large number of significantly altered biological processes. Thus, it is often challenging to interpret GO results and identify novel testable biological hypotheses. We present fast software for advanced gene annotation using semantic similarity of Gene Ontology terms combined with clustering and heat map visualisation. The methodology allows rapid identification of genes sharing the same Gene Ontology cluster. Our R-based open-source semantic similarity package has a speed advantage of over 2000-fold compared to existing implementations. From the resulting hierarchical clustering dendrogram, genes sharing a GO term can be identified, and their differences in gene expression patterns can be seen from the heat map. These methods facilitate advanced annotation of genes resulting from data analysis.
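The clustering-plus-heat-map idea in miniature, assuming a precomputed pairwise semantic-similarity matrix for GO terms (values hypothetical):

    set.seed(5)
    k <- 12
    sim <- matrix(runif(k * k), k); sim <- (sim + t(sim)) / 2; diag(sim) <- 1
    rownames(sim) <- colnames(sim) <- paste0("GO:", seq_len(k))
    hc <- hclust(as.dist(1 - sim), method = "average")   # cluster on dissimilarity
    heatmap(sim, Rowv = as.dendrogram(hc), Colv = "Rowv", symm = TRUE)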
Maurovich-Horvat, Pál; Schlett, Christopher L; Alkadhi, Hatem; Nakano, Masataka; Stolzmann, Paul; Vorpahl, Marc; Scheffel, Hans; Tanaka, Atsushi; Warger, William C; Maehara, Akiko; Ma, Shixin; Kriegel, Matthias F; Kaple, Ryan K; Seifarth, Harald; Bamberg, Fabian; Mintz, Gary S; Tearney, Guillermo J; Virmani, Renu; Hoffmann, Udo
2012-11-01
To establish an ex vivo experimental setup for imaging coronary atherosclerosis with coronary computed tomographic (CT) angiography, intravascular ultrasonography (US), and optical frequency domain imaging (OFDI) and to investigate their ability to help differentiate early from advanced coronary plaques. All procedures were performed in accordance with local and federal regulations and the Declaration of Helsinki. Approval of the local Ethics Committee was obtained. Overall, 379 histologic cuts from nine coronary arteries from three donor hearts were acquired, coregistered among modalities, and assessed for the presence and composition of atherosclerotic plaque. To assess the discriminatory capacity of the different modalities in the detection of advanced lesions, c statistic analysis was used. Interobserver agreement was assessed with the Cohen κ statistic. Cross sections without plaque at coronary CT angiography and with fibrous plaque at OFDI almost never showed advanced lesions at histopathologic examination (odds ratio [OR]: 0.02 and 0.06, respectively; both P<.0001), while mixed plaque at coronary CT angiography, calcified plaque at intravascular US, and lipid-rich plaque at OFDI were associated with advanced lesions (OR: 2.49, P=.0003; OR: 2.60, P=.002; and OR: 31.2, P<.0001, respectively). OFDI had higher accuracy for discriminating early from advanced lesions than intravascular US and coronary CT angiography (area under the receiver operating characteristic curve: 0.858 [95% confidence interval {CI}: 0.802, 0.913], 0.631 [95% CI: 0.554, 0.709], and 0.679 [95% CI: 0.618, 0.740]; respectively, P<.0001). Interobserver agreement was excellent for OFDI and coronary CT angiography (κ=0.87 and 0.85, respectively) and was good for intravascular US (κ=0.66). Systematic and standardized comparison between invasive and noninvasive modalities for coronary plaque characterization in ex vivo specimens demonstrated that coronary CT angiography and intravascular US are reasonably associated with plaque composition and lesion grading according to histopathologic findings, while OFDI was strongly associated. These data may help to develop initial concepts of sequential imaging strategies to identify patients with advanced coronary plaques. © RSNA, 2012
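The c statistic used here is the area under the ROC curve for discriminating advanced from early lesions; a sketch with the R package pROC on hypothetical data:

    library(pROC)
    set.seed(6)
    advanced <- rbinom(379, 1, 0.3)                         # hypothetical lesion grading
    marker <- advanced * rnorm(379, mean = 1) + rnorm(379)  # hypothetical imaging score
    r <- roc(advanced, marker)
    auc(r); ci.auc(r)                                       # c statistic with 95% CI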
NASA Astrophysics Data System (ADS)
Wu, Di; Torres, Elizabeth B.; Jose, Jorge V.
2015-03-01
ASD is a spectrum of neurodevelopmental disorders. The high heterogeneity of the symptoms associated with the disorder impedes efficient diagnoses based on human observations. Recent advances with high-resolution MEM wearable sensors enable accurate movement measurements that may escape the naked eye. This calls for objective metrics to extract physiologically relevant information from the rapidly accumulating data. In this talk we'll discuss the statistical analysis of movement data continuously collected with high-resolution sensors at 240 Hz. We calculated statistical properties of speed fluctuations within the millisecond time range that closely correlate with the subjects' cognitive abilities. We computed the periodicity and synchronicity of the speed fluctuations from their power spectrum and ensemble-averaged two-point cross-correlation function. We built a two-parameter phase space from the temporal statistical analyses of the nearest-neighbor fluctuations that provided a quantitative biomarker for ASD and normal adult subjects and further classified ASD severity. We also found age-related developmental statistical signatures and potential ASD parental links in our movement dynamics studies. Our results may have direct clinical applications.
Utilizing Wavelet Analysis to assess hydrograph change in northwestern North America
NASA Astrophysics Data System (ADS)
Tang, W.; Carey, S. K.
2017-12-01
Historical streamflow data in the mountainous regions of northwestern North America suggest that changes in flows are driven by warming temperatures, declining snowpack and glacier extent, and large-scale teleconnections. However, few sites exist that have robust long-term records for statistical analysis, and previous research has focused on high- and low-flow indices along with trend analysis using the Mann-Kendall test and similar approaches. Furthermore, there has been less emphasis on ascertaining the drivers of changes in the shape of the streamflow hydrograph compared with traditional flow metrics. In this work, we utilize wavelet analysis to evaluate changes in hydrograph characteristics for snowmelt-driven rivers in northwestern North America across a range of scales. Results suggest that wavelets can be used to detect a lengthening and advancement of freshet with a corresponding decline in peak flows. Furthermore, the gradual transition of flows from nival to pluvial regimes in more southerly catchments is evident in the wavelet spectral power through time. This method of change detection is challenged by evaluating the statistical significance of changes in wavelet spectra as related to hydrograph form, yet ongoing work seeks to link these patterns to driving weather and climate along with larger-scale teleconnections.
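A sketch of the kind of continuous wavelet analysis described, using the R package WaveletComp (an assumption; other wavelet packages would serve equally) on a synthetic snowmelt-like series:

    library(WaveletComp)                      # assumed installed from CRAN
    t <- 1:(365 * 10)                         # ten hypothetical years of daily flow
    flow <- data.frame(x = sin(2 * pi * t / 365) + 0.3 * rnorm(length(t)))
    w <- analyze.wavelet(flow, my.series = "x", dt = 1, verbose = FALSE)
    wt.image(w)                               # spectral power through time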
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apte, A; Veeraraghavan, H; Oh, J
Purpose: To present an open source and free platform to facilitate radiomics research — the “Radiomics toolbox” in CERR. Method: There is a scarcity of open-source tools that support end-to-end modeling of image features to predict patient outcomes. The “Radiomics toolbox” strives to fill the need for such a software platform. The platform supports (1) import of various kinds of image modalities like CT, PET, MR, SPECT, US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features like 1st order statistics, gray-scale co-occurrence and zone-size matrix based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and the data management are implemented in Matlab for ease of development and readability of the code for a wide audience. Open-source software developed with other programming languages is integrated to enhance various components of this toolbox, for example Java-based DCM4CHE for import of DICOM and R for statistical analysis. Results: The Radiomics toolbox will be distributed as an open source, GNU copyrighted software. The toolbox was prototyped for modeling an Oropharyngeal PET dataset at MSKCC. The analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the “Computational Environment for Radiotherapy Research” to the “Computational Environment for Radiological Research”.
NASA Technical Reports Server (NTRS)
Gayda, John
2003-01-01
As part of NASA's Advanced Subsonic Technology Program, a study of stabilization heat treatment options for an advanced nickel-base disk alloy, ME 209, was performed. Using a simple, physically based approach, the effect of stabilization heat treatments on tensile and creep properties was analyzed in this paper. Solution temperature, solution cooling rate, and stabilization temperature/time were found to have a significant impact on tensile and creep properties. These effects were readily quantified using the following methodology. First, the effect of solution cooling rate was assessed to determine its impact on a given property. The as-cooled property was then modified by using two multiplicative factors which assess the impact of solution temperature and stabilization parameters. Comparison of experimental data with predicted values showed this physically based analysis produced good results that rivaled the statistical analysis employed, which required numerous changes in the form of the regression equation depending on the property and temperature in question. As this physically based analysis uses the data for input, it should be noted that predictions which attempt to extrapolate beyond the bounds of the data must be viewed with skepticism. Future work aimed at expanding the range of the stabilization/aging parameters explored in this study would be highly desirable, especially at the higher solution cooling rates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wurtz, R.; Kaplan, A.
Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing. PSD advances rely on improvements to the implemented algorithm and can draw on conventional statistical-classifier and machine-learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output
NASA Astrophysics Data System (ADS)
Milroy, D.; Hammerling, D.; Baker, A. H.
2017-12-01
Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical difference caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprised of millions of lines of code which is developed and maintained by a large community of software engineers and scientists. Any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify CESM components that define or compute the variables found in the first step. Finally, we employ the application Kernel GENerator (KGEN) created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.
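A hedged sketch of the variable-discovery step, in the stability-selection style of randomized logistic regression with glmnet (data and tuning values hypothetical):

    library(glmnet)
    set.seed(7)
    X <- matrix(rnorm(200 * 50), 200)               # 200 runs x 50 output variables
    y <- rbinom(200, 1, plogis(X[, 1] - X[, 2]))    # distinct vs consistent label
    sel <- replicate(100, {
      i <- sample(nrow(X), replace = TRUE)          # resample runs
      pf <- 1 / runif(ncol(X), 0.5, 1)              # randomized penalty weights
      fit <- glmnet(X[i, ], y[i], family = "binomial", penalty.factor = pf, lambda = 0.05)
      as.numeric(coef(fit))[-1] != 0                # which variables survive
    })
    head(sort(rowMeans(sel), decreasing = TRUE))    # selection frequencies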
EpiModel: An R Package for Mathematical Modeling of Infectious Disease over Networks.
Jenness, Samuel M; Goodreau, Steven M; Morris, Martina
2018-04-01
Package EpiModel provides tools for building, simulating, and analyzing mathematical models for the population dynamics of infectious disease transmission in R. Several classes of models are included, but the unique contribution of this software package is a general stochastic framework for modeling the spread of epidemics on networks. EpiModel integrates recent advances in statistical methods for network analysis (temporal exponential random graph models) that allow the epidemic modeling to be grounded in empirical data on contacts that can spread infection. This article provides an overview of both the modeling tools built into EpiModel, designed to facilitate learning for students new to modeling, and the application programming interface for extending package EpiModel, designed to facilitate the exploration of novel research questions for advanced modelers.
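A minimal end-to-end sketch following the package's documented network-model workflow (all parameter values hypothetical; function names per EpiModel 2.x):

    library(EpiModel)
    nw <- network_initialize(n = 500)                        # empty contact network
    est <- netest(nw, formation = ~edges, target.stats = 150,
                  coef.diss = dissolution_coefs(~offset(edges), duration = 10))
    param <- param.net(inf.prob = 0.3, act.rate = 1)         # transmission parameters
    init <- init.net(i.num = 10)                             # 10 initial infections
    control <- control.net(type = "SI", nsteps = 100, nsims = 1)
    sim <- netsim(est, param, init, control)                 # simulate an SI epidemic
    plot(sim)                                                # prevalence over time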
Advancements in RNASeqGUI towards a Reproducible Analysis of RNA-Seq Experiments
Russo, Francesco; Righelli, Dario
2016-01-01
We present the advancements and novelties recently introduced in RNASeqGUI, a graphical user interface that helps biologists to handle and analyse large data collected in RNA-Seq experiments. This work focuses on the concept of reproducible research and shows how it has been incorporated in RNASeqGUI to provide reproducible (computational) results. The novel version of RNASeqGUI combines graphical interfaces with tools for reproducible research, such as literate statistical programming, human readable report, parallel executions, caching, and interactive and web-explorable tables of results. These features allow the user to analyse big datasets in a fast, efficient, and reproducible way. Moreover, this paper represents a proof of concept, showing a simple way to develop computational tools for Life Science in the spirit of reproducible research. PMID:26977414
Uncertainty Management in Remote Sensing of Climate Data. Summary of A Workshop
NASA Technical Reports Server (NTRS)
McConnell, M.; Weidman, S.
2009-01-01
Great advances have been made in our understanding of the climate system over the past few decades, and remotely sensed data have played a key role in supporting many of these advances. Improvements in satellites and in computational and data-handling techniques have yielded high quality, readily accessible data. However, rapid increases in data volume have also led to large and complex datasets that pose significant challenges in data analysis (NRC, 2007). Uncertainty characterization is needed for every satellite mission and scientists continue to be challenged by the need to reduce the uncertainty in remotely sensed climate records and projections. The approaches currently used to quantify the uncertainty in remotely sensed data, including statistical methods used to calibrate and validate satellite instruments, lack an overall mathematically based framework.
Robertson, David S; Prevost, A Toby; Bowden, Jack
2016-09-30
Seamless phase II/III clinical trials offer an efficient way to select an experimental treatment and perform confirmatory analysis within a single trial. However, combining the data from both stages in the final analysis can induce bias into the estimates of treatment effects. Methods for bias adjustment developed thus far have made restrictive assumptions about the design and selection rules followed. In order to address these shortcomings, we apply recent methodological advances to derive the uniformly minimum variance conditionally unbiased estimator for two-stage seamless phase II/III trials. Our framework allows for the precision of the treatment arm estimates to take arbitrary values, can be utilised for all treatments that are taken forward to phase III and is applicable when the decision to select or drop treatment arms is driven by a multiplicity-adjusted hypothesis testing procedure. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
He, Yuning
2015-01-01
The behavior of complex aerospace systems is governed by numerous parameters. For safety analysis it is important to understand how the system behaves with respect to these parameter values. In particular, understanding the boundaries between safe and unsafe regions is of major importance. In this paper, we describe a hierarchical Bayesian statistical modeling approach for the online detection and characterization of such boundaries. Our method for classification with active learning uses a particle filter-based model and a boundary-aware metric for best performance. From a library of candidate shapes incorporated with domain expert knowledge, the location and parameters of the boundaries are estimated using advanced Bayesian modeling techniques. The results of our boundary analysis are then provided in a form understandable by the domain expert. We illustrate our approach using a simulation model of a NASA neuro-adaptive flight control system, as well as a system for the detection of separation violations in the terminal airspace.
Vibroacoustic Response of the NASA ACTS Spacecraft Antenna to Launch Acoustic Excitation
NASA Technical Reports Server (NTRS)
Larko, Jeffrey M.; Cotoni, Vincent
2008-01-01
The Advanced Communications Technology Satellite was an experimental NASA satellite launched from the Space Shuttle Discovery. As part of the ground test program, the satellite's large, parabolic reflector antennas were exposed to reverberant acoustic loading to simulate the launch acoustics in the Shuttle payload bay. This paper describes the modelling and analysis of the dynamic response of these large, composite spacecraft antenna structures subjected to a diffuse acoustic field excitation. Due to the broad frequency range of the excitation, different models were created to make predictions in the various frequency regimes of interest: a statistical energy analysis (SEA) model to capture the high-frequency response and a hybrid finite element-statistical energy analysis (hybrid FE-SEA) model for the low- to mid-frequency responses. The strengths and limitations of each of the analytical techniques are discussed. The predictions are then compared to the measured acoustic test data and to a boundary element (BEM) model to evaluate the performance of the hybrid techniques.
Egorova, K.S.; Kondakova, A.N.; Toukach, Ph.V.
2015-01-01
Carbohydrates are biological blocks participating in diverse and crucial processes both at the cellular and organism levels. They protect individual cells, establish intracellular interactions, take part in the immune reaction and participate in many other processes. Glycosylation is considered one of the most important modifications of proteins and other biologically active molecules. Still, the data on the enzymatic machinery involved in carbohydrate synthesis and processing are scattered, and progress in its study is hindered by the vast bulk of accumulated genetic information not supported by any experimental evidence for the functions of the proteins that are encoded by these genes. In this article, we present novel instruments for statistical analysis of glycomes in taxa. These tools may be helpful for investigating carbohydrate-related enzymatic activities in various groups of organisms and for comparison of their carbohydrate content. The instruments are developed on the Carbohydrate Structure Database (CSDB) platform and are available freely on the CSDB web-site at http://csdb.glycoscience.ru. Database URL: http://csdb.glycoscience.ru PMID:26337239
Statistical process management: An essential element of quality improvement
NASA Astrophysics Data System (ADS)
Buckner, M. R.
Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
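The SPC core of SPM reduces to charting a process against its control limits; a sketch with the R package qcc (an assumption; any SPC tool would do) on hypothetical subgrouped measurements:

    library(qcc)                                          # assumed installed from CRAN
    set.seed(8)
    x <- matrix(rnorm(100, mean = 10, sd = 1), ncol = 5)  # 20 subgroups of size 5
    qcc(x, type = "xbar")                                 # X-bar chart vs control limits
    qcc(x, type = "R")                                    # range chart for within-subgroup spread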
Getting There: Despite Their Prevalence in Advancement, Women Still Trail Men in Pay and Titles
ERIC Educational Resources Information Center
Scully, Maura King
2011-01-01
Advancement is a women-dominated profession. The numbers say so: Approximately two-thirds of Council for Advancement and Support of Education (CASE) members are women, and one-third are men. What does this mean for women and the advancement profession as a whole? As anyone who has ever analyzed statistics can tell you, it depends. The numbers…
Emerging technologies for pediatric and adult trauma care.
Moulton, Steven L; Haley-Andrews, Stephanie; Mulligan, Jane
2010-06-01
Current Emergency Medical Service protocols rely on provider-directed care for evaluation, management and triage of injured patients from the field to a trauma center. New methods to quickly diagnose, support and coordinate the movement of trauma patients from the field to the most appropriate trauma center are in development. These methods will enhance trauma care and promote trauma system development. Recent advances in machine learning, statistical methods, device integration and wireless communication are giving rise to new methods for vital sign data analysis and a new generation of transport monitors. These monitors will collect and synchronize exponentially growing amounts of vital sign data with electronic patient care information. The application of advanced statistical methods to these complex clinical data sets has the potential to reveal many important physiological relationships and treatment effects. Several emerging technologies are converging to yield a new generation of smart sensors and tightly integrated transport monitors. These technologies will assist prehospital providers in quickly identifying and triaging the most severely injured children and adults to the most appropriate trauma centers. They will enable the development of real-time clinical support systems of increasing complexity, able to provide timelier, more cost-effective, autonomous care.
Improved silicon nitride for advanced heat engines
NASA Technical Reports Server (NTRS)
Yeh, Hun C.; Fang, Ho T.
1987-01-01
The technology base required to fabricate silicon nitride components with the strength, reliability, and reproducibility necessary for actual heat engine applications is presented. Task 2 was set up to develop test bars with high Weibull slope and greater high temperature strength, and to conduct an initial net shape component fabrication evaluation. Screening experiments were performed in Task 7 on advanced materials and processing for input to Task 2. The technical efforts performed in the second year of a 5-yr program are covered. The first iteration of Task 2 was completed as planned. Two half-replicated, fractional factorial (2^5), statistically designed matrix experiments were conducted. These experiments have identified Denka 9FW Si3N4 as an alternate raw material to GTE SN502 Si3N4 for subsequent process evaluation. A detailed statistical analysis was conducted to correlate processing conditions with as-processed test bar properties. One processing condition produced a material with a 97 ksi average room temperature MOR (100 percent of goal) with 13.2 Weibull slope (83 percent of goal); another condition produced 86 ksi (6 percent over baseline) room temperature strength with a Weibull slope of 20 (125 percent of goal).
Wang, Cheng; Peng, Jingjin; Kuang, Yanling; Zhang, Jiaqiang; Dai, Luming
2017-01-01
Pleural effusion is a common clinical manifestation with various causes. Current diagnostic and therapeutic methods have exhibited numerous limitations. By involving the analysis of dynamic changes in low molecular weight catabolites, metabolomics has been widely applied in various types of disease and has provided platforms to distinguish many novel biomarkers. However, to the best of our knowledge, there are few studies regarding metabolic profiling for pleural effusion. In the current study, 58 pleural effusion samples were collected, among which 20 were malignant pleural effusions, 20 were tuberculous pleural effusions and 18 were transudative pleural effusions. The small molecule metabolite spectra were obtained by adopting 1H nuclear magnetic resonance technology, and pattern-recognition multi-variable statistical analysis was used to screen for differential metabolites. One-way analysis of variance, and the Student-Newman-Keuls and Kruskal-Wallis tests were adopted for statistical analysis. Over 400 metabolites were identified in the untargeted metabolomic analysis and 26 metabolites were identified as significantly different among tuberculous, malignant and transudative pleural effusions. These metabolites were predominantly involved in the metabolic pathways of amino acid metabolism, glycometabolism and lipid metabolism. Statistical analysis revealed that eight metabolites contributed to the distinction between the three groups: tuberculous, malignant and transudative pleural effusion. In the current study, the feasibility of identifying small molecule biochemical profiles in different types of pleural effusion was investigated to reveal novel biological insights into the underlying mechanisms. The results provide specific insights into the biology of tubercular, malignant and transudative pleural effusion and may offer novel strategies for the diagnosis and therapy of associated diseases, including tuberculosis, advanced lung cancer and congestive heart failure. PMID:28627685
Pouch, Alison M; Vergnat, Mathieu; McGarvey, Jeremy R; Ferrari, Giovanni; Jackson, Benjamin M; Sehgal, Chandra M; Yushkevich, Paul A; Gorman, Robert C; Gorman, Joseph H
2014-01-01
The basis of mitral annuloplasty ring design has progressed from qualitative surgical intuition to experimental and theoretical analysis of annular geometry with quantitative imaging techniques. In this work, we present an automated three-dimensional (3D) echocardiographic image analysis method that can be used to statistically assess variability in normal mitral annular geometry to support advancement in annuloplasty ring design. Three-dimensional patient-specific models of the mitral annulus were automatically generated from 3D echocardiographic images acquired from subjects with normal mitral valve structure and function. Geometric annular measurements including annular circumference, annular height, septolateral diameter, intercommissural width, and the annular height to intercommissural width ratio were automatically calculated. A mean 3D annular contour was computed, and principal component analysis was used to evaluate variability in normal annular shape. The following mean ± standard deviations were obtained from 3D echocardiographic image analysis: annular circumference, 107.0 ± 14.6 mm; annular height, 7.6 ± 2.8 mm; septolateral diameter, 28.5 ± 3.7 mm; intercommissural width, 33.0 ± 5.3 mm; and annular height to intercommissural width ratio, 22.7% ± 6.9%. Principal component analysis indicated that shape variability was primarily related to overall annular size, with more subtle variation in the skewness and height of the anterior annular peak, independent of annular diameter. Patient-specific 3D echocardiographic-based modeling of the human mitral valve enables statistical analysis of physiologically normal mitral annular geometry. The tool can potentially lead to the development of a new generation of annuloplasty rings that restore the diseased mitral valve annulus back to a truly normal geometry. Copyright © 2014 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
A Data Assimilation System For Operational Weather Forecast In Galicia Region (nw Spain)
NASA Astrophysics Data System (ADS)
Balseiro, C. F.; Souto, M. J.; Pérez-Muñuzuri, V.; Brewster, K.; Xue, M.
Regional weather forecast models, such as the Advanced Regional Prediction System (ARPS), over complex environments with varying local influences require an accurate meteorological analysis that should include all local meteorological measurements available. In this work, the ARPS Data Analysis System (ADAS) (Xue et al. 2001) is applied as a three-dimensional weather analysis tool to include surface station and rawinsonde data with the NCEP AVN forecasts as the analysis background. Currently in ADAS, a set of five meteorological variables are considered during the analysis: horizontal grid-relative wind components, pressure, potential temperature and specific humidity. The analysis is used for high-resolution numerical weather prediction for the Galicia region. The analysis method used in ADAS is based on the successive correction scheme of Bratseth (1986), which asymptotically approaches the result of a statistical (optimal) interpolation, but at lower computational cost. As in the optimal interpolation scheme, the Bratseth interpolation method can take into account the relative error between background and observational data; it is therefore relatively insensitive to large variations in data density and can integrate data of mixed accuracy. This method can be applied economically in an operational setting, providing significant improvement over the background model forecast as well as over any analysis without high-resolution local observations. A one-way nesting is applied for weather forecasting in the Galicia region, and the use of this assimilation system in both domains shows better results not only in initial conditions but also in all forecast periods. Bratseth, A.M. (1986): "Statistical interpolation by means of successive corrections." Tellus, 38A, 439-447. Souto, M. J., Balseiro, C. F., Pérez-Muñuzuri, V., Xue, M., Brewster, K. (2001): "Impact of cloud analysis on numerical weather prediction in the Galician region of Spain." Submitted to Journal of Applied Meteorology. Xue, M., Wang, D., Gao, J., Brewster, K., Droegemeier, K. K. (2001): "The Advanced Regional Prediction System (ARPS), storm-scale numerical weather prediction and data assimilation." Meteor. Atmos. Physics. Accepted.
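A one-dimensional sketch of the successive-correction idea (with simple Gaussian weights for brevity; Bratseth's scheme chooses the weights so that the iteration converges to the optimal-interpolation result):

    grid <- seq(0, 100, by = 1)                      # analysis grid (hypothetical)
    xobs <- c(20, 45, 80); yobs <- c(1.5, 2.2, 0.7)  # observation locations and values
    a <- rep(1, length(grid))                        # background field
    R <- 15                                          # influence radius
    w <- outer(grid, xobs, function(g, o) exp(-(g - o)^2 / (2 * R^2)))
    for (k in 1:5) {                                 # successive corrections
      aobs <- approx(grid, a, xout = xobs)$y         # analysis at observation points
      a <- a + as.vector(w %*% (yobs - aobs)) / (rowSums(w) + 1e-6)
    }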
Nakagawa, Tateo; Shimada, Mitsuo; Kurita, Nobuhiro; Iwata, Takashi; Nishioka, Masanori; Yoshikawa, Kozo; Higashijima, Jun; Utsunomiya, Tohru
2012-06-01
The role of intratumoral thymidylate synthase (TS) mRNA or protein expression is still controversial, and little has been reported regarding the relation between them in colorectal cancer. Forty-six patients with advanced colorectal cancer who underwent surgical resection were included. TS mRNA expression was determined by the Danenberg tumor profile method based on laser-captured micro-dissection of the tumor cells. TS protein expression was evaluated using immunohistochemical staining. TS mRNA expression tended to correlate with TS protein expression. No statistically significant difference in overall survival was found between the TS mRNA high and low groups, regardless of adjuvant chemotherapy. Overall survival in the TS protein-negative group was significantly higher than in the positive group, both in all patients and in those without adjuvant chemotherapy. Multivariate analysis showed TS protein expression to be an independent prognostic factor. TS protein expression tends to be related to TS mRNA expression and is an independent prognostic factor in advanced colorectal cancer.
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The six most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), chi-squared test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
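For reference, the six most common tests reported are one-liners in base R; a sketch on hypothetical data:

    set.seed(9)
    a <- rnorm(20, 5); b <- rnorm(20, 6)
    t.test(a, b)                                  # Student's/Welch t-test
    wilcox.test(a, b)                             # Wilcoxon / Mann-Whitney
    tab <- matrix(c(12, 8, 5, 15), nrow = 2)      # hypothetical 2x2 counts
    chisq.test(tab)                               # chi-squared test
    fisher.test(tab)                              # Fisher's exact test
    summary(aov(c(a, b) ~ rep(c("A", "B"), each = 20)))  # one-way ANOVA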
The response of numerical weather prediction analysis systems to FGGE 2b data
NASA Technical Reports Server (NTRS)
Hollingsworth, A.; Lorenc, A.; Tracton, S.; Arpe, K.; Cats, G.; Uppala, S.; Kallberg, P.
1985-01-01
An intercomparison of analyses of the main FGGE Level IIb data set is presented with three advanced analysis systems. The aims of the work are to estimate the extent and magnitude of the differences between the analyses, to identify the reasons for the differences, and finally to estimate the significance of the differences. Extratropical analyses only are considered. Objective evaluations of analysis quality, such as fit to observations, statistics of analysis differences, and mean fields are discussed. In addition, substantial emphasis is placed on subjective evaluation of a series of case studies that were selected to illustrate the importance of different aspects of the analysis procedures, such as quality control, data selection, resolution, dynamical balance, and the role of the assimilating forecast model. In some cases, the forecast models are used as selective amplifiers of analysis differences to assist in deciding which analysis was more nearly correct in the treatment of particular data.
van Rhee, Henk; Hak, Tony
2017-01-01
We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
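The tool's default inference can be mirrored in R with metafor, assuming hypothetical per-study effect sizes and variances:

    library(metafor)
    yi <- c(0.2, 0.5, 0.1, 0.4, 0.3)          # hypothetical study effect sizes
    vi <- c(0.04, 0.09, 0.02, 0.05, 0.03)     # their sampling variances
    rma(yi, vi, method = "DL", test = "knha") # DerSimonian-Laird + Knapp-Hartung interval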
Tularosa Basin Play Fairway Analysis: Weights of Evidence; Mineralogy, and Temperature Anomaly Maps
Adam Brandt
2015-11-15
This submission has two shapefiles and a tiff image. The weights of evidence analysis was applied to data representing heat of the earth and fracture permeability using training sites around the Southwest; this is shown in the tiff image. A shapefile of surface temperature anomalies was derived from statistical analysis of Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) thermal infrared data that had been converted to surface temperatures; these anomalies have not been field checked. The second shapefile shows outcrop mineralogy, which was originally mapped by the New Mexico Bureau of Geology and Mineral Resources and supplemented with mineralogic information related to rock fracability risk for EGS. Further metadata can be found within each file.
Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas
2008-01-01
Technology-based instruction represents a new recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimental, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances overall the students' understanding and suggests better long-term knowledge retention.
Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas
2009-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment for individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology, like SOCR, in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention. PMID:19750185
ERIC Educational Resources Information Center
Rossman, Allan; Pearl, Dennis
2017-01-01
Dennis Pearl is Professor of Statistics at Pennsylvania State University and Director of the Consortium for the Advancement of Undergraduate Statistics Education (CAUSE). He is a Fellow of the American Statistical Association. This interview took place via email on November 18-29, 2016, and provides Dennis Pearl's background story, which describes…
5 CFR 297.401 - Conditions of disclosure.
Code of Federal Regulations, 2010 CFR
2010-01-01
... with advance adequate written assurance that the record will be used solely as a statistical research... records; and (ii) Certification that the records will be used only for statistical purposes. (2) These... information from records released for statistical purposes, the system manager will reasonably ensure that the...
Write-Skewed: Writing in an Introductory Statistics Course
ERIC Educational Resources Information Center
Delcham, Hendrick; Sezer, Renan
2010-01-01
Statistics is used in almost every facet of our daily lives: crime reports, election results, environmental/climate change, advances in business, financial planning, and progress in multifarious research. Although understanding statistics is essential for efficient functioning in the modern world (Cerrito 1996), students often do not grasp…
28 CFR 0.154 - Advance and evacuation payments and special allowances.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Advance and evacuation payments and... Advance and evacuation payments and special allowances. The Director of the Federal Bureau of... Marshals Service, and the Director of the Office of Justice Assistance, Research and Statistics, as to...
SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios
NREL Grid Modernization. A statistical summary of U.S. distribution systems with world-class, high spatial/temporal resolution of solar data.
New advances in the statistical parton distributions approach
NASA Astrophysics Data System (ADS)
Soffer, Jacques; Bourrely, Claude
2016-03-01
The quantum statistical parton distributions approach proposed more than one decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Many serious challenges remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also forthcoming experimental results. Presented by J. Soffer at POETIC 2015.
A guide to missing data for the pediatric nephrologist.
Larkins, Nicholas G; Craig, Jonathan C; Teixeira-Pinto, Armando
2018-03-13
Missing data is an important and common source of bias in clinical research. Readers should be alert to and consider the impact of missing data when reading studies. Beyond preventing missing data in the first place, through good study design and conduct, there are different strategies available to handle data containing missing observations. Complete case analysis is often biased unless data are missing completely at random. Better methods of handling missing data include multiple imputation and models using likelihood-based estimation. With advancing computing power and modern statistical software, these methods are within the reach of clinician-researchers under guidance of a biostatistician. As clinicians reading papers, we need to continue to update our understanding of statistical methods, so that we understand the limitations of these techniques and can critically interpret literature.
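Of the methods recommended above, multiple imputation is the most mechanical to demonstrate. The sketch below is a minimal illustration, assuming scikit-learn's (still experimental) IterativeImputer for the chained-equations step and pooling the per-imputation estimates with Rubin's rules; the data are invented, and a real analysis would be guided by a biostatistician, as the authors advise.

import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)
x_obs = x.copy()
x_obs[rng.random(n) < 0.3] = np.nan           # ~30% missing at random

m = 20                                        # number of imputations
betas, variances = [], []
for i in range(m):
    imp = IterativeImputer(sample_posterior=True, random_state=i)
    filled = imp.fit_transform(np.column_stack([x_obs, y]))
    X = np.column_stack([np.ones(n), filled[:, 0]])
    beta, res, _, _ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = res[0] / (n - 2)                 # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)
    betas.append(beta[1])                     # slope estimate
    variances.append(cov[1, 1])               # its sampling variance

# Rubin's rules: total variance = within + (1 + 1/m) * between
pooled = np.mean(betas)
total_var = np.mean(variances) + (1 + 1 / m) * np.var(betas, ddof=1)
print(pooled, np.sqrt(total_var))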
Memon, Aftab Hameed; Rahman, Ismail Abdul
2014-01-01
This study uncovered inhibiting factors to cost performance in large construction projects of Malaysia. A questionnaire survey was conducted among clients and consultants involved in large construction projects. In the questionnaire, a total of 35 inhibiting factors grouped in 7 categories were presented to the respondents for rating the significance level of each factor. A total of 300 questionnaire forms were distributed. Only 144 completed sets were received and analysed using the advanced multivariate statistical technique of Structural Equation Modelling (SmartPLS v2 software). The analysis involved three iteration processes in which several of the factors were deleted in order to make the model acceptable. The result of the analysis found that the R² value of the model is 0.422, which indicates that the developed model has a substantial impact on cost performance. Based on the final form of the model, the contractor's site management category is the most prominent in exhibiting an effect on the cost performance of large construction projects. This finding is validated using advanced techniques of power analysis. This rigorous multivariate analysis has explicitly found the significant category which consists of several causative factors of poor cost performance in large construction projects. This will benefit all parties involved in construction projects for controlling cost overrun. PMID:24693227
NASA Astrophysics Data System (ADS)
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y.; Drake, Steven K.; Gucek, Marjan; Sacks, David B.; Yu, Yi-Kuo
2018-06-01
Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of microbes sequenced is complicating correct microbial identification, even in a simple sample, due to the large number of candidates present. To properly untwine candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs to have accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 publicly available MS/MS data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus and often the species level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html.
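The E values reported by the workflow guard against the multiple-candidate problem: a score that looks unlikely for one peptide may be expected by chance when millions of candidates are scored. MiCId's own derivation is in the cited work; the toy Python function below only illustrates the generic Bonferroni-style relation E = p x N.

def e_value(p_value: float, n_candidates: int) -> float:
    """Expected number of chance hits at this threshold when
    n_candidates hypotheses are scored (generic E = p * N)."""
    return p_value * n_candidates

# the same p-value, judged against different database sizes
print(e_value(1e-4, 10))         # 0.001: unlikely to be a chance hit
print(e_value(1e-4, 1_000_000))  # 100: expected by chance alone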
Neurological Outcomes Following Suicidal Hanging: A Prospective Study of 101 Patients
Jawaid, Mohammed Turab; Amalnath, S. Deepak; Subrahmanyam, D. K. S.
2017-01-01
Context: Survivors of suicidal hanging can have variable neurological outcomes – from complete recovery to irreversible brain damage. Literature on the neurological outcomes in these patients is confined to retrospective studies and case series. Hence, this prospective study was carried out. Aims: The aim was to study the neurological outcomes in suicidal hanging. Settings and Design: This was a prospective observational study carried out from July 2014 to July 2016. Subjects and Methods: Consecutive patients admitted to the emergency and medicine wards were included in the study. Details of the clinical and radiological findings, course in hospital, and status at 1 month postdischarge were analyzed. Statistical Analysis Used: Statistical analysis was performed using IBM SPSS Advanced Statistics 20.0 (SPSS Inc., Chicago, USA). Univariate analysis was performed using the Chi-square test for significance, and the odds ratio was calculated. Results: Of the 101 patients, 6 died and 4 had residual neurological deficits. Cervical spine injury was seen in 3 patients. Interestingly, 39 patients could not remember the act of hanging (retrograde amnesia). Hypotension, pulmonary edema, Glasgow coma scale (GCS) score <8 at admission, need for mechanical ventilation, and cerebral edema on plain computed tomography were more common in those with amnesia than in those with normal memory, and these findings were statistically significant. Conclusions: The majority of patients recovered without any sequelae. Routine imaging of the cervical spine may not be warranted in all patients, even in those with poor GCS. Retrograde amnesia might be more common than previously believed, and further studies are needed to analyze this peculiar feature. PMID:28584409
Chahal, Gurparkash Singh; Chhina, Kamalpreet; Chhabra, Vipin; Bhatnagar, Rakhi; Chahal, Amna
2014-01-01
Background: A surface smear layer consisting of organic and inorganic material is formed on the root surface following mechanical instrumentation and may inhibit the formation of new connective tissue attachment to the root surface. Modification of the tooth surface by root conditioning has resulted in improved connective tissue attachment and has advanced the goal of reconstructive periodontal treatment. Aim: The aim of this study was to compare the effects of citric acid, tetracycline, and doxycycline on instrumented, periodontally involved root surfaces in vitro using a scanning electron microscope. Settings and Design: A total of 45 dentin samples obtained from 15 extracted, scaled, and root-planed teeth were divided into three groups. Materials and Methods: The root conditioning agents were applied with cotton pellets using the passive burnishing technique for 5 minutes. The samples were then examined by the scanning electron microscope. Statistical Analysis Used: The statistical analysis was carried out using the Statistical Package for Social Sciences (SPSS Inc., Chicago, IL, version 15.0 for Windows). For all quantitative variables, means and standard deviations were calculated and compared. For more than two groups, ANOVA was applied. For multiple comparisons, post hoc tests with Bonferroni correction were used. Results: Upon statistical analysis, the root conditioning agents used in this study were found to be effective in removing the smear layer, uncovering and widening the dentin tubules, and unmasking the dentin collagen matrix. Conclusion: Tetracycline HCl was found to be the best root conditioner among the three agents used. PMID:24744541
NASA Astrophysics Data System (ADS)
Santiago-Lona, Cynthia V.; Hernández-Montes, María del Socorro; Mendoza-Santoyo, Fernando; Esquivel-Tejeda, Jesús
2018-02-01
The study and quantification of the tympanic membrane (TM) displacements add important information to advance the knowledge about the hearing process. A comparative statistical analysis between two commonly used demodulation methods employed to recover the optical phase in digital holographic interferometry, namely the fast Fourier transform and phase-shifting interferometry, is presented as applied to study thin tissues such as the TM. The resulting experimental TM surface displacement data are used to contrast both methods through the analysis of variance and F tests. Data are gathered when the TMs are excited with continuous sound stimuli at levels 86, 89 and 93 dB SPL for the frequencies of 800, 1300 and 2500 Hz under the same experimental conditions. The statistical analysis shows repeatability in z-direction displacements with a standard deviation of 0.086, 0.098 and 0.080 μm using the Fourier method, and 0.080, 0.104 and 0.055 μm with the phase-shifting method at a 95% confidence level for all frequencies. The precision and accuracy are evaluated by means of the coefficient of variation; the results with the Fourier method are 0.06143, 0.06125, 0.06154 and 0.06154, 0.06118, 0.06111 with phase-shifting. The relative error between both methods is 7.143, 6.250 and 30.769%. On comparing the measured displacements, the results indicate that there is no statistically significant difference between both methods for frequencies at 800 and 1300 Hz; however, errors and other statistics increase at 2500 Hz.
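The summary statistics used in this comparison are standard and easy to reproduce. The Python sketch below computes the coefficient of variation for each demodulation method and a one-way F test between them, using invented displacement values in place of the measured data.

import numpy as np
from scipy import stats

# invented repeated z-displacement amplitudes (micrometres)
fourier = np.array([1.40, 1.31, 1.52, 1.44, 1.38])
phase_shifting = np.array([1.36, 1.42, 1.29, 1.41, 1.35])

# coefficient of variation: standard deviation over mean
cv_fourier = fourier.std(ddof=1) / fourier.mean()
cv_phase = phase_shifting.std(ddof=1) / phase_shifting.mean()

# one-way ANOVA F test for a method effect
f_stat, p_value = stats.f_oneway(fourier, phase_shifting)
print(cv_fourier, cv_phase, f_stat, p_value)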
The effects of hands-on-science instruction on the science achievement of middle school students
NASA Astrophysics Data System (ADS)
Wiggins, Felita
Student achievement in the twenty-first century demands a new rigor in student science knowledge, since advances in science and technology require students to think and act like scientists. As a result, students must acquire proficient levels of knowledge and skills to support a knowledge base that is expanding exponentially with new scientific advances. This study examined the effects of hands-on science instruction on the science achievement of middle school students. More specifically, this study was concerned with the influence of hands-on science instruction versus traditional science instruction on the science test scores of middle school students. The subjects in this study were one hundred and twenty sixth-grade students in six classes. Instruction involved lecture/discussion and hands-on activities carried out over a three-week period. Specifically, the study ascertained the influence of the variables gender, ethnicity, and socioeconomic status on the science test scores of middle school students. Additionally, this study assessed the effect of the variables gender, ethnicity, and socioeconomic status on the attitudes of sixth-grade students toward science. The two instruments used to collect data for this study were the Prentice Hall unit ecosystem test and the Scientific Work Experience Programs for Teachers Study (SWEPT) student attitude survey. Moreover, the data for the study were treated using the One-Way Analysis of Covariance and the One-Way Analysis of Variance. The following findings were made based on the results: (1) A statistically significant difference existed in the science performance of middle school students exposed to hands-on science instruction. These students had significantly higher scores than middle school students exposed to traditional instruction. (2) A statistically significant difference did not exist between the science scores of male and female middle school students. (3) A statistically significant difference did not exist between the science scores of African American and non-African American middle school students. (4) A statistically significant difference existed by socioeconomic status: students with unassisted lunches had significantly higher science scores than middle school students who were provided with assisted lunches. (5) A statistically significant difference was not found in the attitude scores of middle school students who were exposed to hands-on or traditional science instruction. (6) A statistically significant difference was not found in the observed attitude scores of middle school students who were exposed to either hands-on or traditional science instruction by their socioeconomic status. (7) A statistically significant difference was not found in the observed attitude scores of male and female students. (8) A statistically significant difference was not found in the observed attitude scores of African American and non-African American students.
Ankle plantarflexion strength in rearfoot and forefoot runners: a novel cluster-analytic approach.
Liebl, Dominik; Willwacher, Steffen; Hamill, Joseph; Brüggemann, Gert-Peter
2014-06-01
The purpose of the present study was to test for differences in the ankle plantarflexion strength of habitually rearfoot and forefoot runners. In order to approach this issue, we revisit the problem of classifying different footfall patterns in human runners. A dataset of 119 subjects running shod and barefoot (speed 3.5 m/s) was analyzed. The footfall patterns were clustered by a novel statistical approach, which is motivated by advances in the statistical literature on functional data analysis. We explain the novel statistical approach in detail and compare it to the classically used strike index of Cavanagh and Lafortune (1980). The two groups found by the new cluster approach are well interpretable as forefoot and rearfoot footfall groups. The subsequent comparison study of the clustered subjects reveals that runners with a forefoot footfall pattern are capable of producing significantly higher joint moments in a maximum voluntary contraction (MVC) of their ankle plantarflexor muscle-tendon units; difference in means: 0.28 Nm/kg. This effect remains significant after controlling for an additional gender effect and for differences in training levels. Our analysis confirms the hypothesis that forefoot runners have a higher mean MVC plantarflexion strength than rearfoot runners. Furthermore, we demonstrate that our proposed stochastic cluster analysis provides a robust and useful framework for clustering foot strikes. Copyright © 2014 Elsevier B.V. All rights reserved.
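The paper's functional-data clustering is developed in the text itself; as a generic illustration of the underlying idea, separating footfall groups from the data rather than from a hand-set strike-index cutoff, the sketch below applies an ordinary two-cluster k-means to invented per-trial features (not the authors' method).

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# invented features per trial: strike index and foot angle at touchdown
rearfoot = rng.normal([0.15, -8.0], [0.08, 3.0], size=(60, 2))
forefoot = rng.normal([0.75, 5.0], [0.10, 3.0], size=(59, 2))
trials = np.vstack([rearfoot, forefoot])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(trials)
print(km.cluster_centers_)   # one centre per footfall group
print(km.labels_[:10])       # 0/1 group assignment per trial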
Bautista, Josef; Bella, Archie; Chaudhari, Ashok; Pekler, Gerald; Sapra, Katherine J; Carbajal, Roger; Baumstein, Donald
2015-04-01
The R2CHADS2 is a new prediction rule for stroke risk in atrial fibrillation (AF) patients wherein R stands for renal risk. However, it was created from a cohort that excluded patients with advanced renal failure (defined as a glomerular filtration rate of <30 mL/min). Our study extends the use of R2CHADS2 to patients with advanced renal failure and aims to compare its predictive power against the currently used CHADS2 and CHA2DS2-VASc. This retrospective cohort study analyzed the 1-year risk for stroke of the 524 patients with AF at Metropolitan Hospital Center. AUC and C statistics were calculated using three groups: (i) the entire cohort including patients with advanced renal failure, (ii) a cohort excluding patients with advanced renal failure and (iii) patients with GFR < 30 mL/min only. R2CHADS2, as a predictor for stroke risk, consistently performs better than CHADS2 and CHA2DS2-VASc in groups 1 and 2. The C statistic was highest for R2CHADS2 compared with CHADS2 or CHA2DS2-VASc in group 1 (0.718 versus 0.605 versus 0.602) and in group 2 (0.724 versus 0.584 versus 0.579). However, there was no statistically significant difference in group 3 (0.631 versus 0.629 versus 0.623). Our study supports the utility of R2CHADS2 as a clinical prediction rule for stroke risk in patients with advanced renal failure.
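The C statistic used throughout this comparison is the area under the ROC curve of the risk score against the observed outcome; a minimal Python sketch with invented outcomes and scores (not the study data):

import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
stroke = rng.integers(0, 2, size=500)     # invented 0/1 outcomes
# invented risk scores, higher on average for events
score = np.where(stroke == 1,
                 rng.normal(3.2, 1.0, 500),
                 rng.normal(2.4, 1.0, 500))
print(roc_auc_score(stroke, score))       # the C statistic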
Key statistical and analytical issues for evaluating treatment effects in periodontal research.
Tu, Yu-Kang; Gilthorpe, Mark S
2012-06-01
Statistics is an indispensable tool for evaluating treatment effects in clinical research. Due to the complexities of periodontal disease progression and data collection, statistical analyses for periodontal research have been a great challenge for both clinicians and statisticians. The aim of this article is to provide an overview of several basic, but important, statistical issues related to the evaluation of treatment effects and to clarify some common statistical misconceptions. Some of these issues are general, concerning many disciplines, and some are unique to periodontal research. We first discuss several statistical concepts that have sometimes been overlooked or misunderstood by periodontal researchers. For instance, decisions about whether to use the t-test or analysis of covariance, or whether to use parametric tests such as the t-test or its non-parametric counterpart, the Mann-Whitney U-test, have perplexed many periodontal researchers. We also describe more advanced methodological issues that have sometimes been overlooked by researchers. For instance, the phenomenon of regression to the mean is a fundamental issue to be considered when evaluating treatment effects, and collinearity amongst covariates is a conundrum that must be resolved when explaining and predicting treatment effects. Quick and easy solutions to these methodological and analytical issues are not always available in the literature, and careful statistical thinking is paramount when conducting useful and meaningful research. © 2012 John Wiley & Sons A/S.
Factors related to student performance in statistics courses in Lebanon
NASA Astrophysics Data System (ADS)
Naccache, Hiba Salim
The purpose of the present study was to identify factors that may contribute to business students in Lebanese universities having difficulty in introductory and advanced statistics courses. Two statistics courses are required for business majors at Lebanese universities, and students are not required to enroll in any math courses prior to taking them. Drawing on recent educational research, this dissertation attempted to identify the relationship between (1) students' scores on Lebanese university math admissions tests; (2) students' scores on a test of very basic mathematical concepts; (3) students' scores on the Survey of Attitudes Toward Statistics (SATS); (4) course performance as measured by students' final scores in the course; and (5) their scores on the final exam. Data were collected from 561 students enrolled in multiple sections of two courses: 307 students in the introductory statistics course and 260 in the advanced statistics course, across seven campuses in Lebanon over one semester. The multiple regression results revealed four significant relationships at the introductory level: between students' scores on the math quiz and (1) their final exam scores and (2) their final averages, and between the Cognitive subscale of the SATS and (3) their final exam scores and (4) their final averages. These four significant relationships were also found at the advanced level. In addition, two more significant relationships were found between students' final averages and the two subscales of Effort (5) and Affect (6). No relationship was found between students' scores on the admissions math tests and either their final exam scores or their final averages in both the introductory and advanced level courses. Although these results were consistent across course formats and instructors, they may encourage Lebanese universities to assess the effectiveness of prerequisite math courses. Moreover, these findings may lead the Lebanese Ministry of Education to make changes to the admissions exams, course prerequisites, and course content. Finally, to enhance the attitude of students, new learning techniques, such as group work during class meetings, can be helpful, and future research should aim to test the effectiveness of these pedagogical techniques on students' attitudes toward statistics.
Advanced Land Imager Assessment System
NASA Technical Reports Server (NTRS)
Chander, Gyanesh; Choate, Mike; Christopherson, Jon; Hollaren, Doug; Morfitt, Ron; Nelson, Jim; Nelson, Shar; Storey, James; Helder, Dennis; Ruggles, Tim;
2008-01-01
The Advanced Land Imager Assessment System (ALIAS) supports radiometric and geometric image processing for the Advanced Land Imager (ALI) instrument onboard NASA's Earth Observing-1 (EO-1) satellite. ALIAS consists of two processing subsystems for radiometric and geometric processing of the ALI's multispectral imagery. The radiometric processing subsystem characterizes and corrects, where possible, radiometric qualities including coherent, impulse, and random noise; signal-to-noise ratios (SNRs); detector operability; gain; bias; saturation levels; striping and banding; and the stability of detector performance. The geometric processing subsystem and analysis capabilities support sensor alignment calibrations, sensor chip assembly (SCA)-to-SCA alignments, and band-to-band alignment, and perform geodetic accuracy assessments, modulation transfer function (MTF) characterizations, and image-to-image characterizations. ALIAS also characterizes and corrects band-to-band registration, and performs systematic precision and terrain correction of ALI images. This system can geometrically correct, and automatically mosaic, the SCA image strips into a seamless, map-projected image. This system provides a large database, which enables bulk trending for all ALI image data and significant instrument telemetry. Bulk trending consists of two functions: Housekeeping Processing and Bulk Radiometric Processing. The Housekeeping function pulls telemetry and temperature information from the instrument housekeeping files and writes this information to a database for trending. The Bulk Radiometric Processing function writes statistical information from the dark data acquired before and after the Earth imagery, and from the lamp data, to the database for trending. This allows for multi-scene statistical analyses.
Abramson, Zachary; Susarla, Srinivas M; Lawler, Matthew; Bouchard, Carl; Troulis, Maria; Kaban, Leonard B
2011-03-01
To evaluate changes in airway size and shape in patients with obstructive sleep apnea (OSA) after maxillomandibular advancement (MMA) and genial tubercle advancement (GTA). This was a retrospective cohort study, enrolling a sample of adults with polysomnography-confirmed OSA who underwent MMA + GTA. All subjects who had preoperative and postoperative 3-dimensional computed tomography (CT) scans to evaluate changes in airway size and shape after MMA + GTA were included. Preoperative and postoperative sleep- and breathing-related symptoms were recorded. Descriptive and bivariate statistics were computed. For all analyses, P < .05 was considered statistically significant. During the study period, 13 patients underwent MMA + GTA, of whom 11 (84.6%) met the inclusion criteria. There were 9 men and 2 women with a mean age of 39 years. The mean body mass index was 26.3; mean respiratory disturbance index (RDI), 48.8; and mean lowest oxygen saturation, 80.5%. After MMA + GTA, there were significant increases in lateral and anteroposterior airway diameters (P < .01), volume (P = .02), surface area (P < .01), and cross-sectional areas at multiple sites (P < .04). Airway length decreased (P < .01) and airway shape (P = .04) became more uniform. The mean change in RDI was -60%. Results of this preliminary study indicate that MMA + GTA appears to produce significant changes in airway size and shape that correlate with a decrease in RDI. Copyright © 2011 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Bellmunt, Joaquim; Théodore, Christine; Demkov, Tomasz; Komyakov, Boris; Sengelov, Lisa; Daugaard, Gedske; Caty, Armelle; Carles, Joan; Jagiello-Gruszfeld, Agnieszka; Karyakin, Oleg; Delgado, François-Michel; Hurteloup, Patrick; Winquist, Eric; Morsli, Nassim; Salhi, Yacine; Culine, Stéphane; von der Maase, Hans
2009-09-20
Vinflunine (VFL) is a new microtubule inhibitor that has activity against transitional cell carcinoma of urothelial tract (TCCU). We conducted a randomized phase III study of VFL and best supportive care (BSC) versus BSC alone in the treatment of patients with advanced TCCU who had experienced progression after a first-line platinum-containing regimen. The study was designed to compare overall survival (OS) between patients receiving VFL + BSC (performance status [PS] = 0: 320 mg/m(2), every 3 weeks; PS = 0 with previous pelvic radiation and PS = 1: 280 mg/m(2) subsequently escalated to 320 mg/m(2)) or BSC. Three hundred seventy patients were randomly assigned (VFL + BSC, n =253; BSC, n = 117). Both arms were well balanced except there were more patients with PS more than 1 (10% difference) in the BSC arm. Main grade 3 or 4 toxicities for VFL + BSC were neutropenia (50%), febrile neutropenia (6%), anemia (19%), fatigue (19%), and constipation (16%). In the intent-to-treat population, the objective of a median 2-month survival advantage (6.9 months for VFL + BSC v 4.6 months for BSC) was achieved (hazard ratio [HR] = 0.88; 95% CI, 0.69 to 1.12) but was not statistically significant (P = .287). Multivariate Cox analysis adjusting for prognostic factors showed statistically significant effect of VFL on OS (P = .036), reducing the death risk by 23% (HR = 0.77; 95% CI, 0.61 to 0.98). In the eligible population (n = 357), the median OS was significantly longer for VFL + BSC than BSC (6.9 v 4.3 months, respectively), with the difference being statistically significant (P = .040). Overall response rate, disease control, and progression-free survival were all statistically significant favoring VFL + BSC (P = .006, P = .002, and P = .001, respectively). VFL demonstrates a survival advantage in second-line treatment for advanced TCCU. Consistency of results exists with significant and meaningful benefit over all efficacy parameters. Safety profile is acceptable, and therefore, VFL seems to be a reasonable option for TCCU progressing after first-line platinum-based therapy.
Ciani, Oriana; Davis, Sarah; Tappenden, Paul; Garside, Ruth; Stein, Ken; Cantrell, Anna; Saad, Everardo D; Buyse, Marc; Taylor, Rod S
2014-07-01
Licensing of, and coverage decisions on, new therapies should rely on evidence from patient-relevant endpoints such as overall survival (OS). Nevertheless, evidence from surrogate endpoints may also be useful, as it may not only expedite the regulatory approval of new therapies but also inform coverage decisions. It is, therefore, essential that candidate surrogate endpoints be properly validated. However, there is no consensus on statistical methods for such validation and on how the evidence thus derived should be applied by policy makers. We review current statistical approaches to surrogate-endpoint validation based on meta-analysis in various advanced-tumor settings. We assessed the suitability of two surrogates (progression-free survival [PFS] and time-to-progression [TTP]) using three current validation frameworks: Elston and Taylor's framework, the German Institute of Quality and Efficiency in Health Care's (IQWiG) framework, and the Biomarker-Surrogacy Evaluation Schema (BSES3). A wide variety of statistical methods have been used to assess surrogacy. The strength of the association between the two surrogates and OS was generally low. The level of evidence (observation-level versus treatment-level) available varied considerably by cancer type and by evaluation tool, and was not always consistent even within one specific cancer type. The treatment-level association between PFS or TTP and OS has not been investigated in all solid tumors. According to IQWiG's framework, only PFS achieved acceptable evidence of surrogacy in metastatic colorectal and ovarian cancer treated with cytotoxic agents. Our study emphasizes the challenges of surrogate-endpoint validation and the importance of building consensus on the development of evaluation frameworks.
Resampling: A Marriage of Computers and Statistics. ERIC/TM Digest.
ERIC Educational Resources Information Center
Rudner, Lawrence M.; Shafer, Mary Morello
Advances in computer technology are making it possible for educational researchers to use simpler statistical methods to address a wide range of questions with smaller data sets and fewer, and less restrictive, assumptions. This digest introduces computationally intensive statistics, collectively called resampling techniques. Resampling is a…
Testing the Difference of Correlated Agreement Coefficients for Statistical Significance
ERIC Educational Resources Information Center
Gwet, Kilem L.
2016-01-01
This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…
Rinaldi, Antonio
2011-04-01
Traditional fiber bundle models (FBMs) have been an effective tool for understanding brittle heterogeneous systems. However, fiber bundles in modern nano- and bioapplications demand a new generation of FBMs capturing more complex deformation processes in addition to damage. In the context of loose bundle systems, and with reference to time-independent plasticity and soft biomaterials, we formulate a generalized statistical model for ductile fracture and nonlinear elastic problems capable of handling more simultaneous deformation mechanisms by means of two order parameters (as opposed to one). As the first rational FBM for coupled damage problems, it may be the cornerstone for advanced statistical models of heterogeneous systems in nanoscience and materials design, especially for exploring hierarchical and bio-inspired concepts in the arena of nanobiotechnology. Illustrative examples are provided at the end, discussing issues in inverse analysis (i.e., nonlinear elastic polymer fibers and ductile Cu submicron bar arrays) and direct design (i.e., strength prediction).
Application of multivariate statistical techniques in microbial ecology
Paliy, O.; Shankar, V.
2016-01-01
Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological datasets. An especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
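Of the exploratory procedures the review covers, unconstrained ordination is the usual starting point. The Python sketch below runs a principal component analysis on invented relative-abundance data, with a centred log-ratio transform as one common compositional pre-treatment; the review discusses many alternatives, and nothing here is specific to it.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
counts = rng.poisson(20, size=(30, 200)).astype(float)  # 30 samples x 200 taxa
rel = counts / counts.sum(axis=1, keepdims=True)        # relative abundances
logs = np.log(rel + 1e-6)
clr = logs - logs.mean(axis=1, keepdims=True)           # centred log-ratio

pca = PCA(n_components=2)
coords = pca.fit_transform(clr)          # sample ordination coordinates
print(pca.explained_variance_ratio_)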
NASA Technical Reports Server (NTRS)
Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.
1984-01-01
A technique is demonstrated for accelerated stress corrosion testing of high strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen's load-carrying ability due to the environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress-corroded specimen by an effective flaw size calculated from the breaking stress and the material's strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
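The extreme value analysis of breaking loads can be illustrated with a Weibull fit, a common choice for strength data, although the specific distribution family used in the original work is not stated here. A minimal Python sketch on invented breaking strengths:

import numpy as np
from scipy import stats

# invented breaking strengths (MPa) after one exposure period
strength = np.array([412, 398, 441, 385, 420, 405, 430, 391, 415, 402],
                    dtype=float)

# two-parameter Weibull fit (location fixed at zero)
shape, loc, scale = stats.weibull_min.fit(strength, floc=0)

# probability of surviving an applied stress of 380 MPa
print(stats.weibull_min.sf(380.0, shape, loc=loc, scale=scale))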
On entropy, financial markets and minority games
NASA Astrophysics Data System (ADS)
Zapart, Christopher A.
2009-04-01
The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.
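The entropy at the core of the approach is easy to reproduce once the series is symbolized; the Python sketch below uses binary up/down coding over sliding words, one common choice rather than necessarily the paper's exact scheme.

import numpy as np
from collections import Counter

def word_entropy(returns, word_len=4):
    """Shannon entropy (bits) of binary up/down words of a series."""
    symbols = (np.asarray(returns) > 0).astype(int)
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = np.array(list(Counter(words).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(4)
# uncorrelated noise sits near the 4-bit maximum; structure lowers it
print(word_entropy(rng.normal(size=5000)))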
Advances and Best Practices in Airborne Gravimetry from the U.S. GRAV-D Project
NASA Astrophysics Data System (ADS)
Diehl, Theresa; Childers, Vicki; Preaux, Sandra; Holmes, Simon; Weil, Carly
2013-04-01
The Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project, an official policy of the U.S. National Geodetic Survey as of 2007, is working to survey the entire U.S. and its holdings with high-altitude airborne gravimetry. The goal of the project is to provide a consistent, high-quality gravity dataset that will become the cornerstone of a new gravimetric geoid and national vertical datum in 2022. Over the last five years, the GRAV-D project has surveyed more than 25% of the country, accomplishing almost 500 flights on six different aircraft platforms and producing more than 3.7 million square km of data thus far. This wealth of experience has led to advances in the collection, processing, and evaluation of high-altitude (20,000 - 35,000 ft) airborne gravity data. This presentation will highlight the most important practical and theoretical advances of the GRAV-D project, giving an introduction to each. Examples of innovation include: 1. Use of navigation-grade inertial measurement unit data and precise lever arm measurements for positioning; 2. New quality control tests and software for near real-time analysis of data in the field; 3. Increased accuracy of gravity post-processing by reexamining assumptions and simplifications that were inconsistent with a goal of 1 mGal precision; and 4. Better final data evaluation through crossovers, additional statistics, and inclusion of airborne data into harmonic models that use EGM08 as a base model. The increases in data quality that resulted from implementation of the above advances (and others) will be shown with a case study of the GRAV-D 2008 southern Alaska survey near Anchorage, over Cook Inlet. The case study's statistics and comparisons to global models illustrate the impact that these advances have had on the final airborne gravity data quality. Finally, the presentation will summarize the best practices identified by the project from its last five years of experience.
Angstman, Nicholas B; Frank, Hans-Georg; Schmitz, Christoph
2016-01-01
As a widely used and studied model organism, Caenorhabditis elegans worms offer the ability to investigate the implications of behavioral change. Although investigation of C. elegans behavioral traits is well established, analysis is often narrowed down to measurements based on a single point, and thus cannot pick up on subtle behavioral and morphological changes. In the present study, videos were captured of four different C. elegans strains grown in liquid cultures and transferred to NGM-agar plates with an E. coli lawn or with no lawn. Using advanced software (WormLab), the full skeleton and outline of the worms were tracked to determine whether the presence of food affects behavioral traits. In all seven investigated parameters, statistically significant differences were found in worm behavior between those moving on NGM-agar plates with an E. coli lawn and NGM-agar plates with no lawn. Furthermore, multiple test groups showed differences in interaction between variables, as the parameters that correlated statistically significantly with speed of locomotion varied. In the present study, we demonstrate the validity of a model to analyze C. elegans behavior beyond simple speed of locomotion. The need to account for a nested design while performing statistical analyses in similar studies is also demonstrated. With extended analyses, C. elegans behavioral change can be investigated with greater sensitivity, which could have wide utility in fields such as, but not limited to, toxicology, drug discovery, and RNAi screening.
Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M.; Grün, Sonja
2017-01-01
Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis. PMID:28596729
Seddon, Johanna M; Reynolds, Robyn; Maller, Julian; Fagerness, Jesen A; Daly, Mark J; Rosner, Bernard
2009-05-01
The joint effects of genetic, ocular, and environmental variables were evaluated, and predictive models for the prevalence and incidence of AMD were assessed. Participants in the multicenter Age-Related Eye Disease Study (AREDS) were included in a prospective evaluation of 1446 individuals, of whom 279 progressed to advanced AMD (geographic atrophy or neovascular disease) and 1167 did not progress during 6.3 years of follow-up. For prevalent AMD, 509 advanced cases were compared with 222 controls. Covariates for the incidence analysis included age, sex, education, smoking, body mass index (BMI), baseline AMD grade, and the AREDS vitamin-mineral treatment assignment. DNA specimens were evaluated for six variants in five genes related to AMD. Unconditional logistic regression analyses were performed for prevalent and incident advanced AMD. An algorithm was developed, and receiver operating characteristic curves and C statistics were calculated to assess the predictive ability of risk scores to discriminate progressors from nonprogressors. All genetic polymorphisms were independently related to prevalence of advanced AMD, controlling for genetic factors, smoking, BMI, and AREDS treatment. Multivariate odds ratios (ORs) were 3.5 (95% confidence interval [CI], 1.7-7.1) for CFH Y402H; 3.7 (95% CI, 1.6-8.4) for CFH rs1410996; 25.4 (95% CI, 8.6-75.1) for LOC387715 A69S (ARMS2); 0.3 (95% CI, 0.1-0.7) for C2 E318D; 0.3 (95% CI, 0.1-0.5) for CFB; and 3.6 (95% CI, 1.4-9.4) for C3 R102G, comparing the homozygous risk/protective genotypes to the referent genotypes. For incident AMD, all these variants except CFB were significantly related to progression to advanced AMD, after controlling for baseline AMD grade and other factors, with ORs from 1.8 to 4.0 for the presence of two risk alleles and 0.4 for the protective allele. An interaction was seen between CFH Y402H and treatment, after controlling for all genotypes. Smoking was independently related to AMD, with a multiplicative joint effect with genotype on AMD risk. The C statistic for the full model with all variables was 0.831 for progression to advanced AMD. Factors reflective of nature and nurture are independently related to prevalence and incidence of advanced AMD, with excellent predictive power.
75 FR 78063 - Passenger Weight and Inspected Vessel Stability Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-14
... Health Statistics NEPA--National Environmental Policy Act of 1969 NHANES--National Health and Nutrition..., Advance Data From Vital Health Statistics Mean Body Weight, Height, and Body Mass Index, United States...
Ardin, Maude; Cahais, Vincent; Castells, Xavier; Bouaoun, Liacine; Byrnes, Graham; Herceg, Zdenko; Zavadil, Jiri; Olivier, Magali
2016-04-18
The nature of somatic mutations observed in human tumors at single gene or genome-wide levels can reveal information on past carcinogenic exposures and mutational processes contributing to tumor development. While large amounts of sequencing data are being generated, the associated analysis and interpretation of mutation patterns that may reveal clues about the natural history of cancer present complex and challenging tasks that require advanced bioinformatics skills. To make such analyses accessible to a wider community of researchers with no programming expertise, we have developed within the web-based user-friendly platform Galaxy a first-of-its-kind package called MutSpec. MutSpec includes a set of tools that perform variant annotation and use advanced statistics for the identification of mutation signatures present in cancer genomes and for comparing the obtained signatures with those published in the COSMIC database and other sources. MutSpec offers an accessible framework for building reproducible analysis pipelines, integrating existing methods and scripts developed in-house with publicly available R packages. MutSpec may be used to analyse data from whole-exome, whole-genome or targeted sequencing experiments performed on human or mouse genomes. Results are provided in various formats including rich graphical outputs. An example is presented to illustrate the package functionalities, the straightforward workflow analysis and the richness of the statistics and publication-grade graphics produced by the tool. MutSpec offers an easy-to-use graphical interface embedded in the popular Galaxy platform that can be used by researchers with limited programming or bioinformatics expertise to analyse mutation signatures present in cancer genomes. MutSpec can thus effectively assist in the discovery of complex mutational processes resulting from exogenous and endogenous carcinogenic insults.
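Mutational signatures of the kind catalogued in COSMIC are conventionally extracted by nonnegative matrix factorization of a tumor-by-mutation-class count matrix; whether MutSpec uses exactly this decomposition is not stated above, so the Python sketch below is a generic illustration on invented counts, not the package's internal pipeline.

import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)
# invented catalog: 50 tumors x 96 trinucleotide mutation classes
catalog = rng.poisson(5.0, size=(50, 96)).astype(float)

nmf = NMF(n_components=3, init='nndsvda', max_iter=1000, random_state=0)
exposures = nmf.fit_transform(catalog)   # per-tumor signature activities
signatures = nmf.components_             # 3 x 96 mutation-class profiles
print(exposures.shape, signatures.shape)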
Santos, Hellen-Bandeira-de-Pontes; dos Santos, Thayana-Karla-Guerra; Paz, Alexandre-Rolim; Cavalcanti, Yuri-Wanderley; Nonaka, Cassiano-Francisco-Weege; Godoy, Gustavo-Pina; Alves, Pollianna-Muniz
2016-03-01
In recent years, an increased incidence of OSCC in young individuals has been observed. Based on this, the aim of this study was to describe the clinical characteristics of all cases of OSCC in younger patients diagnosed in two oncology referral hospitals in the northeast region of Brazil within a 12-year period. Data regarding general characteristics of the patients (age, gender, and tobacco and/or alcohol habits) and information about the lesions (tumor location, size, regional lymph node metastasis, distant metastasis, and clinical stage) were submitted to descriptive and inferential analysis. Statistical analysis included Chi-square and Fisher's exact tests (P<0.05). Out of 2311 registered cases of OSCC, 76 (3.3%) corresponded to OSCC in patients under 45 years old. Most of them were male (n=62, 81.6%) and tobacco and/or alcohol users (n=40, 52.8%). The most frequent site was the tongue (n=31, 40.8%), with a predominance of cases classified at an advanced clinical stage (III and IV, n=46, 60.5%). The advanced stage of OSCC (III and IV) was statistically associated with male gender (P=0.035), lower education level (P=0.007), intraoral sites (P<0.001), presence of pain symptomatology (P=0.006), and consumption of tobacco and/or alcohol (P=0.001). The profile of OSCC in young patients resembles the characteristics commonly reported for the overall population. Late diagnosis in young patients usually results in poor prognosis, associated with gender, harmful habits, and tumor location. Although prevalence is low, stimulus to prevention and early diagnosis should be addressed to young individuals exposed to risk factors.
NASA Astrophysics Data System (ADS)
Wilson, D.; Hopkins, C.
2015-04-01
For bending wave transmission across periodic box-like arrangements of plates, the effects of spatial filtering can be significant and this needs to be considered in the choice of prediction model. This paper investigates the errors that can occur with Statistical Energy Analysis (SEA) and the potential of using Advanced SEA (ASEA) to improve predictions. The focus is on the low- and mid-frequency range where plates only support local modes with low mode counts and the in situ modal overlap is relatively high. To increase the computational efficiency when using ASEA on large systems, a beam tracing method is introduced which groups together all rays with the same heading into a single beam. Based on a diffuse field on the source plate, numerical experiments are used to determine the angular distribution of incident power on receiver plate edges on linear and cuboid box-like structures. These show that on receiver plates which do not share a boundary with the source plate, the angular distribution on the receiver plate boundaries differs significantly from a diffuse field. SEA and ASEA predictions are assessed through comparison with finite element models. With rain-on-the-roof excitation on the source plate, the results show that compared to SEA, ASEA provides significantly better estimates of the receiver plate energy, but only where there are at least one or two bending modes in each one-third octave band. Whilst ASEA provides better accuracy than SEA, discrepancies still exist which become more apparent when the direct propagation path crosses more than three nominally identical structural junctions.
14 CFR 151.111 - Advance planning proposals: General.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Engineering Proposals § 151.111 Advance planning proposals: General. (a) Each advance planning and engineering... application, under §§ 151.21(c) and 151.27, or both. (c) Each proposal must relate to planning and engineering... “Airport Activity Statistics of Certificated Route Air Carriers” (published jointly by FAA and the Civil...
Simultaneous Analysis and Quality Assurance for Diffusion Tensor Imaging
Lauzon, Carolyn B.; Asman, Andrew J.; Esparza, Michael L.; Burns, Scott S.; Fan, Qiuyun; Gao, Yurui; Anderson, Adam W.; Davis, Nicole; Cutting, Laurie E.; Landman, Bennett A.
2013-01-01
Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA metrics to a low-dimensional manifold reveals qualitative, but clear, QA-study associations and suggests that automated outlier/anomaly detection would be feasible. PMID:23637895
Stochastic Analysis and Design of Heterogeneous Microstructural Materials System
NASA Astrophysics Data System (ADS)
Xu, Hongyi
Advanced materials systems refer to new materials composed of multiple traditional constituents but with complex microstructure morphologies, which lead to properties superior to those of conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of the microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.
Advanced Methodologies for NASA Science Missions
NASA Astrophysics Data System (ADS)
Hurlburt, N. E.; Feigelson, E.; Mentzel, C.
2017-12-01
Most of NASA's commitment to computational space science involves the organization and processing of Big Data from space-based satellites, and the calculation of advanced physical models based on these datasets. But considerable thought is also needed about which computations are required. The science questions addressed by space data are so diverse and complex that traditional analysis procedures are often inadequate. The knowledge and skills of the statistician, applied mathematician, and algorithmic computer scientist must be incorporated into programs that currently emphasize engineering and physical science. NASA's culture and administrative mechanisms take full cognizance of the fact that major advances in space science are driven by improvements in instrumentation. But it is less well recognized that new instruments and science questions give rise to new challenges in the treatment of satellite data after they are telemetered to the ground. These issues might be divided into two stages: data reduction through software pipelines developed within NASA mission centers, and science analysis performed by hundreds of space scientists dispersed through NASA, U.S. universities, and abroad. Both stages benefit from the latest statistical and computational methods; in some cases, the science result is completely inaccessible using traditional procedures. This paper will review the current state of NASA and present example applications using modern methodologies.
Spatial Differentiation of Landscape Values in the Murray River Region of Victoria, Australia
NASA Astrophysics Data System (ADS)
Zhu, Xuan; Pfueller, Sharron; Whitelaw, Paul; Winter, Caroline
2010-05-01
This research advances the understanding of the location of perceived landscape values through a statistically based approach to spatial analysis of value densities. Survey data were obtained from a sample of people living in and using the Murray River region, Australia, where declining environmental quality prompted a reevaluation of its conservation status. When densities of 12 perceived landscape values were mapped using geographic information systems (GIS), valued places clustered along the entire river bank and in associated National/State Parks and reserves. While simple density mapping revealed high value densities in various locations, it did not indicate what density of a landscape value could be regarded as a statistically significant hotspot or distinguish whether overlapping areas of high density for different values indicate identical or adjacent locations. A spatial statistic Getis-Ord Gi* was used to indicate statistically significant spatial clusters of high value densities or “hotspots”. Of 251 hotspots, 40% were for single non-use values, primarily spiritual, therapeutic or intrinsic. Four hotspots had 11 landscape values. Two, lacking economic value, were located in ecologically important river red gum forests and two, lacking wilderness value, were near the major towns of Echuca-Moama and Albury-Wodonga. Hotspots for eight values showed statistically significant associations with another value. There were high associations between learning and heritage values while economic and biological diversity values showed moderate associations with several other direct and indirect use values. This approach may improve confidence in the interpretation of spatial analysis of landscape values by enhancing understanding of value relationships.
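A minimal sketch of the hotspot step in R, assuming the spdep package; the coordinates, distance band, binary weighting style and 1.96 cut-off below are illustrative assumptions, not the study's GIS parameters:

library(spdep)
set.seed(1)
# Stand-in for mapped value densities: coordinates plus a density per location
pts <- data.frame(x = runif(100), y = runif(100), density = rpois(100, 3))
nb <- dnearneigh(as.matrix(pts[, c("x", "y")]), 0, 0.15)  # distance-band neighbours
lw <- nb2listw(include.self(nb), style = "B", zero.policy = TRUE)  # self-inclusive weights -> Gi*
gi <- localG(pts$density, lw, zero.policy = TRUE)  # Getis-Ord Gi* z-scores
hotspots <- which(as.numeric(gi) > 1.96)  # statistically significant high-density clusters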
Rear-End Crashes: Problem Size Assessment And Statistical Description
DOT National Transportation Integrated Search
1993-05-01
KEYWORDS : RESEARCH AND DEVELOPMENT OR R&D, ADVANCED VEHICLE CONTROL & SAFETY SYSTEMS OR AVCSS, INTELLIGENT VEHICLE INITIATIVE OR IVI : THIS DOCUMENT PRESENTS PROBLEM SIZE ASSESSMENTS AND STATISTICAL CRASH DESCRIPTION FOR REAR-END CRASHES, INC...
75 FR 55333 - Board of Scientific Counselors, National Center for Health Statistics, (BSC, NCHS)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-10
... Scientific Counselors, National Center for Health Statistics, (BSC, NCHS) In accordance with section 10(a)(2... Prevention (CDC), National Center for Health Statistics (NCHS) announces the following meeting of [email protected] or Virginia Cain, [email protected] at least 10 days in advance for requirements). All visitors...
Using R in Introductory Statistics Courses with the pmg Graphical User Interface
ERIC Educational Resources Information Center
Verzani, John
2008-01-01
The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)
A Selective Overview of Variable Selection in High Dimensional Feature Space
Fan, Jianqing
2010-01-01
High dimensional statistical problems arise from diverse fields of scientific research and technological development. Variable selection plays a pivotal role in contemporary statistical learning and scientific discoveries. The traditional idea of best subset selection methods, which can be regarded as a specific form of penalized likelihood, is computationally too expensive for many modern statistical applications. Other forms of penalized likelihood methods have been successfully developed over the last decade to cope with high dimensionality. They have been widely applied for simultaneously selecting important variables and estimating their effects in high dimensional statistical inference. In this article, we present a brief account of the recent developments of theory, methods, and implementations for high dimensional variable selection. Questions of what limits of dimensionality such methods can handle, what the role of penalty functions is, and what their statistical properties are rapidly drive the advances of the field. The properties of non-concave penalized likelihood and its roles in high dimensional statistical modeling are emphasized. We also review some recent advances in ultra-high dimensional variable selection, with emphasis on independence screening and two-scale methods. PMID:21572976
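A concrete, hedged example of penalized likelihood for variable selection: an L1-penalized (lasso) regression fitted with the R package glmnet on simulated high-dimensional data; all dimensions and coefficients are invented for demonstration:

library(glmnet)
set.seed(42)
n <- 100; p <- 1000                      # many more features than observations
x <- matrix(rnorm(n * p), n, p)
beta <- c(3, -2, 1.5, rep(0, p - 3))     # only three truly active variables
y <- as.numeric(x %*% beta + rnorm(n))
fit <- cv.glmnet(x, y, alpha = 1)        # cross-validated lasso (penalized likelihood)
coef_sel <- coef(fit, s = "lambda.min")  # coefficients at the selected penalty
sum(coef_sel != 0)                       # nonzero terms (count includes the intercept)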
Subgroup effects of occupational therapy-based intervention for people with advanced cancer.
Sampedro Pilegaard, Marc; Oestergaard, Lisa Gregersen; la Cour, Karen; Thit Johnsen, Anna; Brandt, Åse
2018-03-23
Many people with advanced cancer have decreased ability to perform activities of daily living (ADL). We recently performed a randomized, controlled trial (RCT) assessing the efficacy of an occupational therapy-based program, the 'Cancer Home-Life Intervention', in people with advanced cancer (N = 242) and found no overall effects on ADL ability. However, heterogeneity of treatment effect may disguise subgroup differences. To investigate whether subgroups of people with advanced cancer gain positive effects from the 'Cancer Home-Life Intervention' on ADL ability, we conducted an exploratory subgroup analysis including 191 participants from an RCT. The outcome was ADL motor ability measured by the Assessment of Motor and Process Skills (AMPS). Subgroups were defined by age, gender, years of education, type of primary tumor, functional level, and activity problems. The 'Cancer Home-Life Intervention' had no statistically significant effect in the six subgroups. Modifying effects of age (0.30 [95% CI: -0.05 to 0.64]) and gender (0.23 [95% CI: -0.11 to 0.57]) were not found. There were no subgroup effects of the 'Cancer Home-Life Intervention' on ADL motor ability. Some indications suggest greater effects for those aged below 69 years; however, this result should be interpreted with caution.
Laparoscopic versus open-component separation: a comparative analysis in a porcine model.
Rosen, Michael J; Williams, Christina; Jin, Judy; McGee, Michael F; Schomisch, Steve; Marks, Jeffrey; Ponsky, Jeffrey
2007-09-01
The ideal surgical treatment for complicated ventral hernias remains elusive. Traditional component separation provides local advancement of native tissue for tension-free closure without prosthetic materials. This technique requires an extensive subcutaneous dissection with division of perforating vessels, predisposing to skin-flap necrosis and complicated wound infections. A minimally invasive component separation may decrease wound complication rates; however, the adequacy of the myofascial advancement has not been studied. Five 25-kg pigs underwent bilateral laparoscopic component separation. A 10-mm incision was made lateral to the rectus abdominis muscle. The external oblique fascia was incised, and a dissecting balloon was inflated between the internal and external oblique muscles. Two additional ports were placed in the intermuscular space. The external oblique was incised from the costal margin to the inguinal ligament. The maximal abdominal wall advancement was recorded. A formal open-component separation was then performed and maximal advancement 5 cm superior and 5 cm inferior to the umbilicus was recorded for comparison. Groups were compared using standard statistical analysis. The laparoscopic component separation was completed successfully in all animals, with a mean of 22 min/side. Laparoscopic component separation yielded 3.9 cm (SD 1.1) of fascial advancement above the umbilicus, whereas 4.4 cm (1.2) was obtained after open release (P = .24). Below the umbilicus, laparoscopic release achieved 5.0 cm (1.0) of advancement, whereas 5.8 cm (1.2) was gained after open release (P = .13). The minimally invasive component separation achieved an average of 86% of the myofascial advancement compared with a formal open release. The laparoscopic approach does not require extensive subcutaneous dissection and might theoretically result in a decreased incidence or decreased complexity of postoperative wound infections or skin-flap necrosis. Based on our preliminary data in this porcine model, further comparative studies of laparoscopic versus open component separation in complex ventral hernia repair are warranted to evaluate postoperative morbidity and long-term hernia recurrence rates.
Turvey, Timothy A.; Bell, R. Bryan; Phillips, Ceib; Proffit, William R.
2013-01-01
Purpose This report compares the skeletal stability and treatment outcomes of 2 similar cohorts undergoing bilateral sagittal osteotomies of the mandible for advancement. The study groups included patients stabilized with 2-mm self-reinforced polylactate (PLLDL 70/30) biodegradable screws (group B) and patients stabilized with 2-mm titanium screws placed in a positional fashion (group T). Materials and Methods Sixty-nine patients underwent bilateral sagittal osteotomies of the mandibular ramus for advancement utilizing an identical technique. There were 34 patients in group B and 35 patients in group T. Each patient had preoperative, immediate postoperative, splint-out, and 1-year postoperative cephalometric radiographs available for analysis. The method of analysis and the treatment-outcome parameters are identical to those previously used. Repeated measures analysis of variance was performed with method of fixation as the between-subject factor and time as the within-subject factor. The level of significance was set at .01. Results There were no clinical failures in group T and a single failure in group B. The average difference in stability between the groups is small and subtly different at the mandibular angle. The data documented similarity of the postsurgical changes in the 2 groups, with the only statistically significant differences being the vertical position of the gonion (P < .001) and the mandibular plane angle (P < .01), with greater upward remodeling at gonion in group T. Conclusions Two-mm self-reinforced PLLDL (70/30) screws can be used as effectively as 2-mm titanium screws to stabilize the mandible after bilateral sagittal osteotomies for mandibular advancement. The difference in 1-year stability and outcome is minimal. PMID:16360855
Tsukiyama, Ikuto; Ejiri, Masayuki; Yamamoto, Yoshihiro; Nakao, Haruhisa; Yoneda, Masashi; Matsuura, Katsuhiko; Arakawa, Ichiro; Saito, Hiroko; Inoue, Tadao
2017-12-01
This study assessed the cost-effectiveness of combination treatment with gemcitabine and cisplatin compared to treatment with gemcitabine alone for advanced biliary tract cancer (BTC) in Japan. A three-state Markov model with monthly cycles was constructed based on the Japan BT-22 trial. Transition probabilities among the health states were derived from a trial conducted in Japan and converted to appropriate parameters for our model. The associated cost components, obtained from a receipt-based survey undertaken at the Aichi Medical University Hospital, were those related to inpatient care, outpatient care, and treatment for BTC. Costs for palliative care and treatment of adverse events were obtained from the National Health Insurance price list. We estimated cost-effectiveness per quality-adjusted life year (QALY) over a time horizon of 36 months. An annual discount of 3% for both cost and outcome was applied. The base-case outcomes indicated that combination therapy was less cost-effective than monotherapy, with an incremental cost-effectiveness ratio (ICER) of approximately 14 million yen per QALY gained. The deterministic sensitivity analysis revealed that the base-case ICER was robust. A probabilistic analysis using 10,000 Monte Carlo simulations showed that combination therapy was cost-effective at a willingness-to-pay threshold of 6 million yen per QALY gained in approximately 33% of simulations. In Japan, combination therapy is less cost-effective than monotherapy for treating advanced BTC, regardless of the statistical difference between the two therapies. Such information on the cost-effectiveness of chemotherapy is much needed for the treatment of advanced BTC in Japan.
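The ICER arithmetic underlying such a model can be sketched as follows; a minimal R example assuming the stated 36-month horizon and 3% annual discount, with purely hypothetical monthly costs and utilities in place of the study's inputs:

horizon <- 36                                        # months
disc <- (1 + 0.03)^(-(seq_len(horizon) - 1) / 12)    # 3% annual discount factors
cost_combo <- sum(rep(250000, horizon) * disc)       # hypothetical monthly costs (yen)
cost_mono  <- sum(rep(120000, horizon) * disc)
qaly_combo <- sum(rep(0.70 / 12, horizon) * disc)    # hypothetical monthly utility accrual
qaly_mono  <- sum(rep(0.62 / 12, horizon) * disc)
icer <- (cost_combo - cost_mono) / (qaly_combo - qaly_mono)
icer  # yen per QALY gained, to compare against a willingness-to-pay threshold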
Subscale Test Methods for Combustion Devices
NASA Technical Reports Server (NTRS)
Anderson, W. E.; Sisco, J. C.; Long, M. R.; Sung, I.-K.
2005-01-01
Stated goals for long-life LREs (liquid rocket engines) have been between 100 and 500 cycles. Key challenges are: 1) the inherent technical difficulty of accurately defining the transient and steady-state thermochemical environments and structural response (strain); 2) a limited statistical basis on failure mechanisms and the effects of design and operational variability; and 3) very high test costs and a budget-driven need to protect test hardware (aversion to test-to-failure). Ambitious goals will require development of new databases: a) advanced materials, e.g., tailored composites with virtually unlimited property variations; b) innovative functional designs to exploit the full capabilities of advanced materials; and c) different cycles/operations. Subscale testing is one way to address technical and budget challenges: 1) prototype subscale combustors exposed to controlled simulated conditions; 2) complementary to conventional laboratory specimen database development; 3) instrumented with sensors to measure thermostructural response; and 4) coupled with analysis
Shi, Yulan; Ying, Xiao; Hu, Xiaoye; Zhao, Jing; Fang, Xuefeng; Wu, Minghui; Chen, Tian Zhou; Shen, Hong
2015-05-01
The present study was designed to investigate pancreatic endocrine and exocrine function damage after High Intensity Focused Ultrasound (HIFU) therapy in patients with advanced pancreatic cancer. It was a retrospective analysis of blood glucose and amylase changes in 59 advanced pancreatic cancer patients treated with HIFU from February 2010 to January 2014. The mean glucose and amylase levels before HIFU treatment were 6.02 mmol/L and 59.17 U/L, respectively. After HIFU treatment, the mean glucose and amylase levels were 5.66 mmol/L and 57.86 U/L, respectively. The differences were not statistically significant, and no acute pancreatitis was observed. The endocrine and exocrine function of pancreatic cancer patients was not damaged by HIFU treatment; HIFU treatment for pancreatic cancer patients therefore appears to be safe.
Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E
2014-09-23
Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper further advances the approach beyond the application of five descriptors in the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis on the obtained adsorption data was performed based on three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling showed a promising development in the application of quantitative predictive modeling in biological applications, nanomedicine, and environmental safety assessment of nanomaterials.
Barteneva, Natasha S; Vorobjev, Ivan A
2018-01-01
In this paper, we review some of the recent advances in cellular heterogeneity and single-cell analysis methods. In modern research on cellular heterogeneity, there are four major approaches: analysis of pooled samples, single-cell analysis, high-throughput single-cell analysis, and, lately, integrated analysis of a cellular population at the single-cell level. Recently developed high-throughput single-cell genetic analysis methods such as RNA-Seq require a purification step and destruction of the analyzed cell, and often provide only a snapshot of the investigated cell without spatiotemporal context. Correlative analysis of multiparameter morphological, functional, and molecular information is important for differentiation of more uniform groups in the spectrum of different cell types. Simplified distributions (histograms and 2D plots) can underrepresent biologically significant subpopulations. Future directions may include the development of nondestructive methods for dissecting molecular events in intact cells, simultaneous correlative cellular analysis of phenotypic and molecular features by hybrid technologies such as imaging flow cytometry, and further progress in supervised and unsupervised statistical analysis algorithms.
NASA Astrophysics Data System (ADS)
Kwon, O.; Kim, W.; Kim, J.
2017-12-01
Recently, construction of subsea tunnels has increased globally. For safe construction of a subsea tunnel, identifying the geological structure, including faults, at the design and construction stages is critically important. Unlike tunnels on land, however, it is very difficult to obtain data on geological structure because of the limits of geological surveys at sea. This study addresses such difficulties by developing a technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, boreholes and geophysical investigations face technical and economic limits; by contrast, echo sounding data are easily obtainable and their reliability is high compared to the above approaches. This study is aimed at developing an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach, based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is as follows: (1) convert the seabed topography to grid data using echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data in the window area, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on the map, and (6) visualize the geological structure on the map. The important elements in this study include the optimal size of the moving window, the choice of optimal spatial statistics and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were implemented. Eventually, a user program based on R was developed using the optimal analysis algorithm. The user program was designed to identify the variations of various spatial statistics; by making it easy to designate the type of spatial statistic and the percentile standard, it allows straightforward analysis of geological structure as these settings vary. This research was supported by the Korea Agency for Infrastructure Technology Advancement under the Ministry of Land, Infrastructure and Transport of the Korean government. (Project Number: 13 Construction Research T01)
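A minimal R sketch of steps (2)-(5); the window size, the choice of local standard deviation as the spatial statistic, and the 95th-percentile standard are assumptions for illustration, not the optimal elements determined in the study:

grid <- matrix(rnorm(100 * 100), 100, 100)  # stand-in for gridded echo-sounding depths
w <- 2                                       # half-width of a 5 x 5 moving window
stat <- matrix(NA_real_, nrow(grid), ncol(grid))
for (i in (1 + w):(nrow(grid) - w)) {
  for (j in (1 + w):(ncol(grid) - w)) {
    stat[i, j] <- sd(grid[(i - w):(i + w), (j - w):(j + w)])  # local spatial statistic
  }
}
threshold <- quantile(stat, 0.95, na.rm = TRUE)          # percentile standard
candidates <- which(stat >= threshold, arr.ind = TRUE)   # cells to display on the map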
Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.
Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M
2011-10-01
Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks.
1996-01-01
failure as due to an adhesive layer between the foil and inner polypropylene layers. Under subcontract, NFPA provided HACCP draft manuals for the...parameters of the production process and to ensure that they are within their target values. In addition, a HACCP program was used to assure product...played an important part in implementing Hazard Analysis Critical Control Points (HACCP) as part of the Process and Quality Control manual. The National
NASA Astrophysics Data System (ADS)
Haseltine, Jessica
2006-10-01
A statistical analysis of enrollment in AP math and science courses in the Abilene Independent School District (AISD) between 2000 and 2005 studied the relationship between gender, enrollment, and performance. Data suggested that mid-scoring females were less likely than their male counterparts to enroll in AP-level courses. AISD showed higher female-to-male score ratios than national and state averages but no improvement in enrollment comparisons. Several programs are suggested to improve both participation and performance of females in upper-level math and science courses.
Report on the ''ESO Python Boot Camp — Pilot Version''
NASA Astrophysics Data System (ADS)
Dias, B.; Milli, J.
2017-03-01
The Python programming language is becoming very popular within the astronomical community. Python is a high-level language with multiple applications including database management, handling FITS images and tables, statistical analysis, and more advanced topics. Python is a very powerful tool both for astronomical publications and for observatory operations. Since the best way to learn a new programming language is through practice, we therefore organised a two-day hands-on workshop to share expertise among ESO colleagues. We report here the outcome and feedback from this pilot event.
Web-TCGA: an online platform for integrated analysis of molecular cancer data sets.
Deng, Mario; Brägelmann, Johannes; Schultze, Joachim L; Perner, Sven
2016-02-06
The Cancer Genome Atlas (TCGA) is a pool of molecular data sets publicly accessible and freely available to cancer researchers anywhere around the world. However, widespread use is limited since advanced knowledge of statistics and statistical software is required. In order to improve accessibility we created Web-TCGA, a web-based, freely accessible online tool, which can also be run in a private instance, for integrated analysis of molecular cancer data sets provided by TCGA. In contrast to already available tools, Web-TCGA utilizes different methods for analysis and visualization of TCGA data, allowing users to generate global molecular profiles across different cancer entities simultaneously. In addition to global molecular profiles, Web-TCGA offers highly detailed gene- and tumor-entity-centric analysis by providing interactive tables and views. As a supplement to other already available tools, such as cBioPortal (Sci Signal 6:pl1, 2013, Cancer Discov 2:401-4, 2012), Web-TCGA offers an analysis service, which does not require any installation or configuration, for molecular data sets available at the TCGA. Individual processing requests (queries) are generated by the user for mutation, methylation, expression and copy number variation (CNV) analyses. The user can focus analyses on results from single genes and cancer entities or perform a global analysis (multiple cancer entities and genes simultaneously).
Gustafsson, O; Norming, U; Gustafsson, S; Eneroth, P; Aström, G; Nyman, C R
1996-03-01
To investigate the possible relationship between serum levels of prostate specific antigen (PSA), dihydrotestosterone (DHT), testosterone, sex hormone-binding globulin (SHBG) and tumour stage, grade and ploidy in 65 cases of prostate cancer diagnosed in a screening study compared to 130 controls from the same population. From a population of 26,602 men between the ages of 55 and 70 years, 2400 were selected randomly and invited to undergo screening for prostate cancer using a digital rectal examination, transrectal ultrasonography and PSA analysis. Among the 1782 attendees, 65 cases of prostate cancer were diagnosed. Each case was matched with two control subjects of similar age and prostate volume from the screening population. Frozen serum samples were analysed for PSA, DHT, testosterone and SHBG, and compared to the diagnosis and tumour stage, grade and ploidy. Comparisons between these variables, and multivariate and regression analyses, were performed. There were significant differences in PSA level with all variables except tumour ploidy. DHT levels were slightly lower in patients with prostate cancer but the difference was not statistically significant. There was a trend towards lower DHT values in more advanced tumours, and the difference across T-stages was close to statistical significance (P = 0.059). Testosterone levels were lower in patients with cancer than in the control group, but the differences were not significant. There was no correlation between testosterone levels, tumour stage and ploidy, but the difference in testosterone level between tumours with a low grade of differentiation and those with intermediate and high grade was nearly significant (P = 0.058). The testosterone/DHT ratio tended to be higher in patients with more advanced tumours. SHBG levels were lower in patients with cancer than in controls, but the differences were not statistically significant and there were no systematic variations with tumour stage, grade or ploidy. Multivariate analysis showed that if the PSA level was known, then DHT, testosterone or SHBG added no further information concerning diagnosis, stage, grade or ploidy. Regression analysis on T-stage, PSA level and DHT showed an inverse linear relationship between PSA and DHT for stage T-3 (P = 0.035), but there was no relationship between PSA and testosterone. PSA was of value in discriminating between cases and controls and between various tumour stages and grades, but no statistically significant correlation was found for ploidy. If the PSA level was known, no other variable added information in individual cases. Within a group, DHT levels tended to be lower among cases and in those with more advanced tumours. There was an inverse relationship between tumour volume, as defined by PSA level, and 5 alpha-reductase activity, as defined by DHT level, and the testosterone/DHT ratio; this trend was most obvious with T-stage. No systematic variations were found in the levels of testosterone or SHBG.
Mohsin, Abdul Habeeb Bin; Reddy, Varalakshmi; Kumar, Praveen; Raj, Jeevan; Babu, Siva Santosh
2017-01-01
Introduction The aim of this study was to evaluate and compare the wetting ability of five saliva substitutes and distilled water on heat-polymerized acrylic resin. The contact angle of a saliva substitute on a denture base can be taken as an indicator of wettability, and good wetting of heat-polymerized acrylic resin is critical for optimum retention of complete dentures. Methods Two hundred and forty samples of heat-polymerized acrylic resin were fabricated using the conventional method and divided into six groups of 40 samples each. Advancing and receding contact angles were measured using a contact angle goniometer and DSA4 software analysis. Results An ANOVA test was carried out to test the significance of differences in contact angle values among the six groups. The mean advancing angle values and the mean receding angle values showed statistically significant differences between the groups, whereas the mean angle-of-hysteresis values did not. A multiple comparison using Bonferroni's test was carried out to verify the significance of differences between the contact angles in each pair of groups. Statistically significant differences were seen when Aqwet (Group II) was compared to distilled water (Group I), Wet Mouth (Group III), E-Saliva (Group IV), Biotene (Group V), and Moi-Stir (Group VI). Conclusion The contact angles of five saliva substitutes and distilled water were measured and compared. Group II (Aqwet) had the lowest advancing and receding contact angle values and the highest angle of hysteresis on heat-polymerized acrylic resin; based on contact angle values, Aqwet therefore has the best wetting ability on heat-polymerized acrylic resins. The ability of saliva to wet the denture surface is one of the most important properties for complete denture retention in dry mouth cases. PMID:29187918
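The reported pipeline (one-way ANOVA followed by Bonferroni-adjusted pairwise comparisons) can be sketched in R as follows; the group means and samples are simulated stand-ins, not the measured contact angles:

set.seed(7)
group <- factor(rep(c("Water", "Aqwet", "WetMouth", "ESaliva", "Biotene", "MoiStir"),
                    each = 40))
angle <- rnorm(240, mean = rep(c(65, 50, 62, 60, 63, 64), each = 40), sd = 4)
summary(aov(angle ~ group))                                    # overall one-way ANOVA
pairwise.t.test(angle, group, p.adjust.method = "bonferroni")  # Bonferroni-adjusted pairs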
Kamran, Sophia C.; Manuel, Matthias M.; Cho, Linda P.; Damato, Antonio L.; Schmidt, Ehud J.; Tempany, Clare; Cormack, Robert A.; Viswanathan, Akila N.
2017-01-01
Objective The purpose was to compare local control (LC), overall survival (OS) and dose to the organs at risk (OAR) in women with locally advanced cervical cancer treated with MR-guided versus CT-guided interstitial brachytherapy (BT). Methods 56 patients (29 MR, 27 CT) were treated with high-dose-rate (HDR) interstitial BT between 2005–2015. The MR patients had been prospectively enrolled on a Phase II clinical trial. Data were analyzed using Kaplan-Meier (K-M) and Cox proportional hazards statistical modeling in JMP® & R®. Results Median follow-up time was 19.7 months (MR group) and 18.4 months (CT group). There were no statistically significant differences in patient age at diagnosis, histology, percent with tumor size >4 cm, grade, FIGO stage or lymph node involvement between the groups. Patients in the MR group had more lymphovascular involvement compared to patients in the CT group (p<0.01). When evaluating plans generated, there were no statistically significant differences in median cumulative dose to the high-risk clinical target volume or the OAR. 2-year K-M LC rates for MR-based and CT-based treatments were 96% and 87%, respectively (log-rank p=0.65). At 2 years, OS was significantly better in the MR-guided cohort (84% vs. 56%, p=0.036). On multivariate analysis, squamous histology was associated with longer OS (HR 0.23, 95% CI 0.07–0.72) in a model with MR BT (HR 0.35, 95% CI 0.08–1.18). Conclusion In this population of locally advanced cervical-cancer patients, MR-guided HDR BT resulted in estimated 96% 2-year local control and excellent early survival rates. Squamous cell histology was the most significant predictor for survival. PMID:28318644
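A hedged sketch of the survival analyses reported above (Kaplan-Meier estimation, a log-rank comparison and Cox modeling), here using R's survival package rather than JMP, with simulated follow-up data standing in for the patient records:

library(survival)
set.seed(5)
d <- data.frame(time  = rexp(56, 1 / 24),                      # follow-up in months
                event = rbinom(56, 1, 0.4),                    # 1 = event observed
                modality = factor(rep(c("MR", "CT"), c(29, 27))))
fit <- survfit(Surv(time, event) ~ modality, data = d)  # Kaplan-Meier curves
summary(fit, times = 24)                                # e.g. 2-year estimates
survdiff(Surv(time, event) ~ modality, data = d)        # log-rank test
coxph(Surv(time, event) ~ modality, data = d)           # Cox proportional hazards model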
Soy Consumption and the Risk of Prostate Cancer: An Updated Systematic Review and Meta-Analysis
Ranard, Katherine M.; Jeon, Sookyoung; Erdman, John W.
2018-01-01
Prostate cancer (PCa) is the second most commonly diagnosed cancer in men, accounting for 15% of all cancers in men worldwide. Asian populations consume soy foods as part of a regular diet, which may contribute to the lower PCa incidence observed in these countries. This meta-analysis provides a comprehensive updated analysis that builds on previously published meta-analyses, demonstrating that soy foods and their isoflavones (genistein and daidzein) are associated with a lower risk of prostate carcinogenesis. Thirty articles were included for analysis of the potential impacts of soy food intake, isoflavone intake, and circulating isoflavone levels, on both primary and advanced PCa. Total soy food (p < 0.001), genistein (p = 0.008), daidzein (p = 0.018), and unfermented soy food (p < 0.001) intakes were significantly associated with a reduced risk of PCa. Fermented soy food intake, total isoflavone intake, and circulating isoflavones were not associated with PCa risk. Neither soy food intake nor circulating isoflavones were associated with advanced PCa risk, although very few studies currently exist to examine potential associations. Combined, this evidence from observational studies shows a statistically significant association between soy consumption and decreased PCa risk. Further studies are required to support soy consumption as a prophylactic dietary approach to reduce PCa carcinogenesis. PMID:29300347
Yamanouchi, Masayuki; Hoshino, Junichi; Ubara, Yoshifumi; Takaichi, Kenmei; Kinowaki, Keiichi; Fujii, Takeshi; Ohashi, Kenichi; Mise, Koki; Toyama, Tadashi; Hara, Akinori; Kitagawa, Kiyoki; Shimizu, Miho; Furuichi, Kengo; Wada, Takashi
2018-01-01
There have been a limited number of biopsy-based studies on diabetic nephropathy, and therefore the clinical importance of renal biopsy in patients with diabetes in late-stage chronic kidney disease (CKD) is still debated. We aimed to clarify the renal prognostic value of adding pathological information to clinical information in patients with diabetes and advanced CKD. We retrospectively assessed 493 type 2 diabetics with biopsy-proven diabetic nephropathy in four centers in Japan. 296 patients with stage 3-5 CKD at the time of biopsy were identified and assigned two risk prediction scores for end-stage renal disease (ESRD): the Kidney Failure Risk Equation (KFRE, a score composed of clinical parameters) and the Diabetic Nephropathy Score (D-score, a score integrating pathological parameters of the Diabetic Nephropathy Classification by the Renal Pathology Society (RPS DN Classification)). They were randomized 2:1 to development and validation cohorts. Hazard ratios (HR) of incident ESRD were reported with 95% confidence intervals (CI) for the KFRE, D-score and KFRE+D-score in Cox regression models. Improvement of risk prediction with the addition of the D-score to the KFRE was assessed using c-statistics, continuous net reclassification improvement (NRI), and integrated discrimination improvement (IDI). During a median follow-up of 1.9 years, 194 patients developed ESRD. The Cox regression analysis showed that the KFRE, D-score and KFRE+D-score were significant predictors of ESRD both in the development cohort and in the validation cohort. The c-statistic of the D-score was 0.67. The c-statistic of the KFRE was good, but its predictive value was weaker than in the miscellaneous CKD cohort originally reported (c-statistic, 0.78 vs. 0.90) and was not significantly improved by adding the D-score (0.78 vs. 0.79, p = 0.83). Only continuous NRI was positive after adding the D-score to the KFRE (0.4%; CI: 0.0-0.8%). We found that the predictive values of the KFRE and the D-score were not as good as originally reported, and combining the D-score with the KFRE did not significantly improve prediction of the risk of ESRD in advanced diabetic nephropathy. Improving prediction of renal prognosis in advanced diabetic nephropathy may require different approaches, combining clinical and pathological parameters not captured in the KFRE and the RPS DN Classification.
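A simplified illustration of comparing c-statistics between two risk scores; this binary-outcome sketch with the R package pROC and invented scores only approximates the survival-based c-statistics used in the study:

library(pROC)
set.seed(3)
event <- rbinom(200, 1, 0.4)              # hypothetical ESRD indicator
kfre_score  <- event + rnorm(200)         # stand-in for the clinical score
combo_score <- 1.2 * event + rnorm(200)   # stand-in for clinical + pathological score
roc1 <- roc(event, kfre_score)
roc2 <- roc(event, combo_score)
roc.test(roc1, roc2)  # DeLong test for a difference in AUC (c-statistic)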
Perception of Curability Among Advanced Cancer Patients: An International Collaborative Study.
Yennurajalingam, Sriram; Rodrigues, Luis Fernando; Shamieh, Omar; Tricou, Colombe; Filbet, Marilène; Naing, Kyaw; Ramaswamy, Akhileshwaran; Perez-Cruz, Pedro Emilio; Bautista, Mary Jocelyn S; Bunge, Sofia; Muckaden, Mary Ann; Sewram, Vikash; Fakrooden, Sarah; Noguera-Tejedor, Antonio; Rao, Shobha S; Liu, Diane; Park, Minjeong; Williams, Janet L; Lu, Zhanni; Cantu, Hilda; Hui, David; Reddy, Suresh K; Bruera, Eduardo
2018-04-01
There are limited data on illness understanding and perception of cure among advanced cancer patients around the world. The aim of the study was to determine the frequency and factors associated with inaccurate perception of curability among advanced cancer patients receiving palliative care across the globe. Secondary analysis of a study to understand the core concepts in end-of-life care among advanced cancer patients receiving palliative care from 11 countries across the world. Advanced cancer patients were surveyed using a Patient Illness Understanding survey and Control Preference Scale. Descriptive statistics and multicovariate logistic regression analysis were performed. Fifty-five percent (763/1,390) of patients receiving palliative care inaccurately reported that their cancer is curable. The median age was 58, 55% were female, 59% were married or had a partner, 48% were Catholic, and 35% were college educated. Sixty-eight percent perceived that the goal of therapy was "to get rid of their cancer," and 47% perceived themselves as "seriously ill." Multicovariate logistic regression analysis shows that accurate perception of curability was associated with female gender (odds ratio [OR] 0.73, p = .027), higher education (OR 0.37, p < .0001), unemployment status (OR 0.69, p = .02), and being from France (OR 0.26, p < .0001) and South Africa (OR 0.52, p = .034); inaccurate perception of curability was associated with better Karnofsky performance status (OR 1.02 per point, p = .0005), and being from Philippines (OR 15.49, p < .0001), Jordan (OR 8.43, p < .0001), Brazil (OR 2.17, p = .0037), and India (OR 2.47, p = .039). Inaccurate perception of curability in advanced cancer patients is 55% and significantly differs by gender, education, performance status, employment status, and country of origin. Further studies are needed to develop strategies to reduce this misperception of curability in advanced cancer patients. The findings of this study indicate that inaccurate perception of curability among advanced cancer patients is 55%. Inaccurate perception of curability significantly differs by gender, education, performance status, employment status, and country of origin. There is great need to facilitate improved patient-physician communication so as to improve health care outcomes and patient satisfaction.
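The multicovariate logistic regression reported above can be sketched in R as follows; the variables and data are hypothetical, and Wald intervals stand in for whatever CI method the authors used:

set.seed(11)
d <- data.frame(accurate = rbinom(500, 1, 0.45),   # accurate perception (1 = yes)
                female   = rbinom(500, 1, 0.55),
                college  = rbinom(500, 1, 0.35),
                kps      = sample(40:100, 500, replace = TRUE))  # performance status
m <- glm(accurate ~ female + college + kps, data = d, family = binomial)
exp(cbind(OR = coef(m), confint.default(m)))  # odds ratios with Wald 95% CIs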
MNE software for processing MEG and EEG data
Gramfort, A.; Luessi, M.; Larson, E.; Engemann, D.; Strohmeier, D.; Brodbeck, C.; Parkkonen, L.; Hämäläinen, M.
2013-01-01
Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals originating from neural currents in the brain. Using these signals to characterize and locate brain activity is a challenging task, as evidenced by several decades of methodological contributions. MNE, whose name stems from its capability to compute cortically-constrained minimum-norm current estimates from M/EEG data, is a software package that provides comprehensive analysis tools and workflows including preprocessing, source estimation, time–frequency analysis, statistical analysis, and several methods to estimate functional connectivity between distributed brain regions. The present paper gives detailed information about the MNE package and describes typical use cases while also warning about potential caveats in analysis. The MNE package is a collaborative effort of multiple institutes striving to implement and share best methods and to facilitate distribution of analysis pipelines to advance reproducibility of research. Full documentation is available at http://martinos.org/mne. PMID:24161808
Immunotherapy in advanced melanoma: a network meta-analysis.
Pyo, Jung-Soo; Kang, Guhyun
2017-05-01
The aim of this study was to compare the effects of various immunotherapeutic agents and chemotherapy for unresected or metastatic melanomas. We performed a network meta-analysis using a Bayesian statistical model to compare the objective response rates (ORR) of various immunotherapies from 12 randomized controlled studies. The estimated ORRs of immunotherapy and chemotherapy were 0.224 and 0.108, respectively. The ORRs of immunotherapy in untreated and pretreated patients were 0.279 and 0.176, respectively. In the network meta-analysis, the odds ratios for ORR of nivolumab (1 mg/kg)/ipilimumab (3 mg/kg), pembrolizumab 10 mg/kg and nivolumab 3 mg/kg were 8.54, 5.39 and 4.35, respectively, compared with chemotherapy alone. Our data showed that various immunotherapies achieved higher ORRs than chemotherapy alone.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, J.; Jones, G.L.
1996-01-01
Recent advances in hardware and software have given the interpreter and engineer new ways to view 3D seismic data and well bore information. Recent papers have also highlighted the use of various statistics and seismic attributes. By combining new 3D rendering technologies with recent trends in seismic analysis, the interpreter can improve the structural and stratigraphic resolution of hydrocarbon reservoirs. This paper gives several examples using 3D visualization to better define both the structural and stratigraphic aspects of several different structural types from around the world. Statistics, 3D visualization techniques and rapid animation are used to show complex faulting and detailed channel systems. These systems would be difficult to map using either 2D or 3D data with conventional interpretation techniques.
Inchauspe, Adrián Angel
2016-01-01
AIM: To present an inclusion criterion for patients who have suffered bilateral amputation, so that they may be treated with the supplementary resuscitation treatment proposed here by the author. METHODS: This work is based on a retrospective cohort model, so that exposing a control group to an almost certainly lethal risk is avoided. RESULTS: This paper presents a hypothesis on the acupunctural PC-9 Zhong chong point, further supported by previous statistical work recorded for the K-1 Yong quan resuscitation point. CONCLUSION: Through application of the resuscitation maneuver proposed here to the aforementioned point, patients with bilateral amputation would have another alternative treatment available should basic and advanced CPR fail. PMID:27152257
Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B
2015-10-06
Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.
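A deliberately simplified R sketch of the SFT idea on a synthetic image; the segment size, the background-selection rule and the mean-plus-3-SD threshold are illustrative assumptions, not the optimized settings described in the paper:

set.seed(9)
img <- matrix(rnorm(128 * 128, 100, 5), 128, 128)  # synthetic background
img[40:60, 40:60] <- img[40:60, 40:60] + 50        # synthetic signal region
seg <- 16                                          # segment edge length in pixels
idx <- as.matrix(expand.grid(r = seq(1, 128, seg), c = seq(1, 128, seg)))
stats <- t(apply(idx, 1, function(k) {
  s <- img[k[1]:(k[1] + seg - 1), k[2]:(k[2] + seg - 1)]
  c(mu = mean(s), sdev = sd(s))                    # per-segment statistics
}))
bg <- stats[, "mu"] < quantile(stats[, "mu"], 0.75)         # assumed background segments
thr <- mean(stats[bg, "mu"]) + 3 * mean(stats[bg, "sdev"])  # derived signal threshold
signal <- img > thr                                         # signal pixel mask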
Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M.; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E.; Allen, Peter J.; Sempere, Lorenzo F.; Haab, Brian B.
2016-01-01
Certain experiments involve the high-throughput quantification of image data, thus requiring algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multi-color, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu’s method for selected images. SFT promises to advance the goal of full automation in image analysis. PMID:26339978
Using structural equation modeling for network meta-analysis.
Tu, Yu-Kang; Wu, Yun-Chun
2017-07-14
Network meta-analysis overcomes the limitations of traditional pair-wise meta-analysis by incorporating all available evidence into a general statistical framework for simultaneous comparisons of several treatments. Currently, network meta-analyses are undertaken either within Bayesian hierarchical linear models or within frequentist generalized linear mixed models. Structural equation modeling (SEM) is a statistical method originally developed for modeling causal relations among observed and latent variables. As the random effect is explicitly modeled as a latent variable in SEM, it is very flexible for analysts to specify complex random effect structures and to place linear and nonlinear constraints on parameters. The aim of this article is to show how to undertake a network meta-analysis within the statistical framework of SEM. We used an example dataset to demonstrate that the standard fixed and random effect network meta-analysis models can be easily implemented in SEM. It contains results of 26 studies that directly compared three treatment groups A, B and C for prevention of first bleeding in patients with liver cirrhosis. We also showed that a new approach to network meta-analysis based on the technique of the unrestricted weighted least squares (UWLS) method can be undertaken using SEM. For both the fixed and random effect network meta-analysis, SEM yielded coefficients and confidence intervals similar to those reported in the previous literature. The point estimates of the two UWLS models were identical to those in the fixed effect model, but the confidence intervals were wider. This is consistent with results from the traditional pairwise meta-analyses. Compared with the UWLS model with a common variance adjustment factor, the UWLS model with a unique variance adjustment factor has wider confidence intervals when the heterogeneity is larger in the pairwise comparison; it thereby reflects the difference in heterogeneity within each comparison. SEM provides a very flexible framework for univariate and multivariate meta-analysis, and its potential as a powerful tool for advanced meta-analysis remains to be explored.
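As a small illustration of the fixed-effect machinery that the SEM and UWLS formulations reproduce, the Python sketch below runs a three-treatment network meta-analysis as weighted least squares on a contrast design matrix. The three-comparison dataset is synthetic, not the 26-study cirrhosis data, and this is the generic WLS formulation rather than the authors' SEM implementation.

    # Fixed-effect network meta-analysis as weighted least squares.
    import numpy as np

    # Each row: one pairwise comparison (log odds ratio and its variance).
    # Columns of X encode the basic parameters d_AB and d_AC (A = reference).
    y = np.array([0.50, 0.80, 0.25])            # observed effects: AB, AC, BC
    v = np.array([0.10, 0.15, 0.12])            # within-study variances
    X = np.array([[1, 0],                       # AB estimates d_AB
                  [0, 1],                       # AC estimates d_AC
                  [-1, 1]])                     # BC estimates d_AC - d_AB
    W = np.diag(1.0 / v)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    se = np.sqrt(np.diag(np.linalg.inv(X.T @ W @ X)))
    for name, b, s in zip(["d_AB", "d_AC"], beta, se):
        print(f"{name} = {b:.3f} (95% CI {b - 1.96*s:.3f} to {b + 1.96*s:.3f})")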
Geostatistical applications in environmental remediation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, R.N.; Purucker, S.T.; Lyon, B.F.
1995-02-01
Geostatistical analysis refers to a collection of statistical methods for addressing data that vary in space. By incorporating spatial information into the analysis, geostatistics has advantages over traditional statistical analysis for problems with a spatial context. Geostatistics has a history of success in earth science applications, and its popularity is increasing in other areas, including environmental remediation. Due to recent advances in computer technology, geostatistical algorithms can be executed at a speed comparable to many standard statistical software packages. When used responsibly, geostatistics is a systematic and defensible tool that can be used in various decision frameworks, such as the Data Quality Objectives (DQO) process. At every point in the site, geostatistics can estimate both the concentration level and the probability or risk of exceeding a given value. These probability maps can assist in identifying clean-up zones. Given any decision threshold and an acceptable level of risk, the probability maps identify those areas that are estimated to be above or below the acceptable risk. Those areas that are above the threshold are of the most concern with regard to remediation. In addition to estimating clean-up zones, geostatistics can assist in designing cost-effective secondary sampling schemes. Those areas of the probability map with high levels of estimated uncertainty are areas where more secondary sampling should occur. In addition, geostatistics has the ability to incorporate soft data directly into the analysis, such as historical records, a highly correlated secondary contaminant, or expert judgment. Geostatistics is thus a tool that, in conjunction with other methods, can provide a common forum for building consensus in environmental remediation.
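A hedged sketch of the probability-map idea follows: a Gaussian process stands in for a full geostatistical (kriging) model, and the exceedance probability at each grid cell follows from the predictive mean and standard deviation. The sample locations, concentrations, and cleanup threshold are all synthetic.

    # Exceedance-probability map from a Gaussian-process surrogate for kriging.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(1)
    X_obs = rng.uniform(0, 100, (40, 2))                       # sample sites (m)
    z_obs = 50 + 30 * np.exp(-((X_obs - 70)**2).sum(1) / 800)  # ppm, one hot spot

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=20.0),
                                  alpha=1.0, normalize_y=True)
    gp.fit(X_obs, z_obs)

    grid = np.stack(np.meshgrid(np.linspace(0, 100, 50),
                                np.linspace(0, 100, 50)), -1).reshape(-1, 2)
    mean, sd = gp.predict(grid, return_std=True)
    threshold = 65.0                                           # cleanup level (ppm)
    p_exceed = 1.0 - norm.cdf(threshold, loc=mean, scale=sd)
    print(f"cells above 50% exceedance risk: {(p_exceed > 0.5).sum()}")

Cells with high predictive standard deviation are exactly the candidates for the secondary sampling the abstract describes.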
Bremer, Peer-Timo; Weber, Gunther; Tierny, Julien; Pascucci, Valerio; Day, Marcus S; Bell, John B
2011-09-01
Large-scale simulations are increasingly being used to study complex scientific and engineering phenomena. As a result, advanced visualization and data analysis are also becoming an integral part of the scientific process. Often, a key step in extracting insight from these large simulations involves the definition, extraction, and evaluation of features in the space and time coordinates of the solution. However, in many applications, these features involve a range of parameters and decisions that will affect the quality and direction of the analysis. Examples include particular level sets of a specific scalar field, or local inequalities between derived quantities. A critical step in the analysis is to understand how these arbitrary parameters/decisions impact the statistical properties of the features, since such a characterization will help to evaluate the conclusions of the analysis as a whole. We present a new topological framework that in a single pass extracts and encodes entire families of possible feature definitions as well as their statistical properties. For each time step we construct a hierarchical merge tree, a highly compact yet flexible feature representation. While this data structure is more than two orders of magnitude smaller than the raw simulation data, it allows us to extract a set of features for any given parameter selection in a postprocessing step. Furthermore, we augment the trees with additional attributes, making it possible to gather a large number of useful global, local, and conditional statistics that would otherwise be extremely difficult to compile. We also use this representation to create tracking graphs that describe the temporal evolution of the features over time. Our system provides a linked-view interface to explore the time-evolution of the graph interactively alongside the segmentation, thus making it possible to perform extensive data analysis in a very efficient manner. We demonstrate our framework by extracting and analyzing burning cells from a large-scale turbulent combustion simulation. In particular, we show how the statistical analysis enabled by our techniques provides new insight into the combustion process.
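The merge-tree construction can be sketched with a union-find sweep from high to low scalar value: superlevel-set components are born at local maxima and joined at saddles. This toy Python version (synthetic 2D grid, 4-connectivity) only records the events; the framework described above additionally stores per-branch attributes for feature extraction and statistics.

    # Union-find sweep that records merge-tree events on a 2D scalar field.
    import numpy as np

    def merge_tree_events(field):
        h, w = field.shape
        vals = field.ravel()
        order = np.argsort(vals)[::-1]                   # sweep high -> low
        parent = {}
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]            # path compression
                i = parent[i]
            return i
        births, merges = [], []
        for idx in order:
            r, c = divmod(idx, w)
            roots = set()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < h and 0 <= cc < w and rr * w + cc in parent:
                    roots.add(find(rr * w + cc))
            parent[idx] = idx
            if not roots:
                births.append(vals[idx])                 # local max: branch born
            for root in roots:
                parent[root] = idx                       # attach components
            if len(roots) > 1:
                merges.append(vals[idx])                 # saddle: branches join
        return births, merges

    rng = np.random.default_rng(2)
    births, merges = merge_tree_events(rng.random((32, 32)))
    print(f"{len(births)} maxima, {len(merges)} saddle merges")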
Leming, Matthew; Steiner, Rachel; Styner, Martin
2016-02-27
Tract-based spatial statistics (TBSS) is a software pipeline widely employed in comparative analysis of white matter integrity from diffusion tensor imaging (DTI) datasets. In this study, we seek to evaluate the relationship between different methods of atlas registration for use with TBSS and different measurements of DTI (fractional anisotropy, FA; axial diffusivity, AD; radial diffusivity, RD; and mean diffusivity, MD). To do so, we have developed a novel tool that builds on existing diffusion atlas building software, integrating it into an adapted version of TBSS called DAB-TBSS (DTI Atlas Builder-Tract-Based Spatial Statistics) by using the advanced registration offered in DTI Atlas Builder. To compare the effectiveness of these two versions of TBSS, we also propose a framework for simulating population differences for diffusion tensor imaging data, providing a more substantive means of empirically comparing DTI group analysis programs such as TBSS. In this study, we used 33 diffusion tensor imaging datasets and simulated group-wise changes in these data by increasing, in three different simulations, the principal eigenvalue (directly altering AD), the second and third eigenvalues (RD), and all three eigenvalues (MD) in the genu, the right uncinate fasciculus, and the left IFO. Additionally, we assessed the benefits of comparing the tensors directly using a functional analysis of diffusion tensor tract statistics (FADTTS). Our results indicate comparable levels of FA-based detection between DAB-TBSS and TBSS, with standard TBSS registration reporting a higher rate of false positives in other measurements of DTI. Within the simulated changes investigated here, this study suggests that the use of DTI Atlas Builder's registration enhances TBSS group-based studies.
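The eigenvalue manipulation used to simulate group differences can be sketched directly: scale selected tensor eigenvalues and recompute the standard DTI scalars. The voxel values below are illustrative (in units of 10^-3 mm^2/s), not the study's data.

    # DTI scalars from tensor eigenvalues, plus a simulated RD increase.
    import numpy as np

    def dti_scalars(ev):
        """FA, MD, AD, RD from eigenvalues sorted descending, shape (..., 3)."""
        l1, l2, l3 = ev[..., 0], ev[..., 1], ev[..., 2]
        md = ev.mean(-1)
        fa = np.sqrt(1.5 * ((l1 - md)**2 + (l2 - md)**2 + (l3 - md)**2)
                     / (l1**2 + l2**2 + l3**2))
        return fa, md, l1, (l2 + l3) / 2.0

    ev = np.array([1.4, 0.35, 0.30])        # plausible white-matter voxel
    fa0, md0, ad0, rd0 = dti_scalars(ev)
    ev_sim = ev * np.array([1.0, 1.1, 1.1]) # inflate 2nd/3rd eigenvalues -> RD up
    fa1, md1, ad1, rd1 = dti_scalars(ev_sim)
    print(f"FA {fa0:.3f} -> {fa1:.3f}, RD {rd0:.3f} -> {rd1:.3f}")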
NASA Astrophysics Data System (ADS)
Singh, Jitendra; Sekharan, Sheeba; Karmakar, Subhankar; Ghosh, Subimal; Zope, P. E.; Eldho, T. I.
2017-04-01
Mumbai, the commercial and financial capital of India, experiences incessant annual rain episodes, mainly attributable to erratic rainfall patterns during the monsoon and to the urban heat-island effect of escalating urbanization, leading to increasing vulnerability to frequent flooding. After the infamous episode of the 2005 Mumbai torrential rains, when only two rain gauging stations existed, the governing civic body, the Municipal Corporation of Greater Mumbai (MCGM), came forward with an initiative to install 26 automatic weather stations (AWS) in June 2006 (MCGM 2007), a network later increased to 60 AWS. A comprehensive statistical analysis to understand the spatio-temporal pattern of rainfall over Mumbai or any other coastal city in India has never been attempted earlier. In the current study, a thorough analysis of available rainfall data for 2006-2014 from these stations was performed; the 2013-2014 sub-hourly data from 26 AWS were found useful for further analyses due to their consistency and continuity. The correlogram cloud indicated no pattern of significant correlation when we considered the closest to the farthest gauging station from the base station; this impression was also supported by the semivariogram plots. Gini index values, a statistical measure of temporal non-uniformity, were found above 0.8 in a visible majority of stations and showed an increasing trend at most gauging stations; this led us to conclude that inconsistency in daily rainfall gradually increases as the monsoon progresses. Interestingly, night rainfall was lower than daytime rainfall. The pattern-less, high spatio-temporal variation observed in the Mumbai rainfall data signifies the futility of independently applying advanced statistical techniques, and thus calls for the simultaneous inclusion of physics-centred models such as meso-scale numerical weather prediction systems, particularly the Weather Research and Forecasting (WRF) model.
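For reference, the Gini index quoted above can be computed from a daily rainfall series in a few lines of Python; the gamma-distributed series below is a synthetic monsoon stand-in, not the MCGM gauge records.

    # Gini index of daily rainfall as a measure of temporal non-uniformity.
    import numpy as np

    def gini(x):
        x = np.sort(np.asarray(x, dtype=float))
        n = x.size
        # G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n, with 1-based ranks i
        return 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1) / n

    rng = np.random.default_rng(3)
    rain = rng.gamma(shape=0.15, scale=60.0, size=122)   # mm/day, June-September
    print(f"Gini index = {gini(rain):.2f}")              # near 1 = highly episodic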
NASA Astrophysics Data System (ADS)
Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik
2016-04-01
Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed samples. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation of the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modelling bivariate extreme values that are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour-intensity extreme precipitation event (or vice versa) can be twice as great as would be estimated assuming independent events. Presuming independence between these two variables would therefore result in severe underestimation of the flooding risk in the study area.
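The marginal step described above — fitting a Generalized Pareto distribution to peaks-over-threshold exceedances and reading off a return level — can be sketched as follows; the surge series, threshold choice, and exceedance rate are synthetic stand-ins.

    # Generalized Pareto fit to threshold exceedances and a return level.
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(4)
    surge = rng.gumbel(0.3, 0.15, 20 * 365)        # daily sea-surge proxy (m)
    u = np.quantile(surge, 0.98)                   # POT threshold
    exceed = surge[surge > u] - u
    shape, loc, scale = genpareto.fit(exceed, floc=0.0)
    # ~100-year level: quantile 1 - 1/(100 yr * 365 d/yr * 2% exceedance rate)
    ret_100 = u + genpareto.ppf(1 - 1.0 / (100 * 365 * 0.02), shape, scale=scale)
    print(f"xi = {shape:.2f}, sigma = {scale:.3f}, 100-yr surge ~ {ret_100:.2f} m")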
ERIC Educational Resources Information Center
Ministerio de Educacion, Guatemala City (Guatemala). Oficina de Planeamiento Integral de la Educacion.
This booklet presents statistics concerning primary education in Guatemala. The first section covers enrollment, considering such factors as type of school and location. Other sections provide statistics on teachers, their locations, the number of schools, enrollment in terms of students repeating grades or leaving school, students advancing out…
Explorations in Statistics: Standard Deviations and Standard Errors
ERIC Educational Resources Information Center
Curran-Everett, Douglas
2008-01-01
Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This series in "Advances in Physiology Education" provides an opportunity to do just that: we will investigate basic concepts in statistics using the free software package R. Because this series uses R solely as a vehicle…
Advances in Testing the Statistical Significance of Mediation Effects
ERIC Educational Resources Information Center
Mallinckrodt, Brent; Abraham, W. Todd; Wei, Meifen; Russell, Daniel W.
2006-01-01
P. A. Frazier, A. P. Tix, and K. E. Barron (2004) highlighted a normal theory method popularized by R. M. Baron and D. A. Kenny (1986) for testing the statistical significance of indirect effects (i.e., mediator variables) in multiple regression contexts. However, simulation studies suggest that this method lacks statistical power relative to some…
Lefebvre, Alexandre; Rochefort, Gael Y.; Santos, Frédéric; Le Denmat, Dominique; Salmon, Benjamin; Pétillon, Jean-Marc
2016-01-01
Over the last decade, biomedical 3D-imaging tools have gained widespread use in the analysis of prehistoric bone artefacts. While initial attempts to characterise the major categories used in osseous industry (i.e. bone, antler, and dentine/ivory) have been successful, the taxonomic determination of prehistoric artefacts remains to be investigated. The distinction between reindeer and red deer antler can be challenging, particularly in cases of anthropic and/or taphonomic modifications. In addition to the range of destructive physicochemical identification methods available (mass spectrometry, isotopic ratio, and DNA analysis), X-ray micro-tomography (micro-CT) provides convincing non-destructive 3D images and analyses. This paper presents the experimental protocol (sample scans, image processing, and statistical analysis) we have developed in order to identify modern and archaeological antler collections (from Isturitz, France). This original method is based on bone microstructure analysis combined with advanced statistical support vector machine (SVM) classifiers. A combination of six microarchitecture biomarkers (bone volume fraction, trabecular number, trabecular separation, trabecular thickness, trabecular bone pattern factor, and structure model index) were screened using micro-CT in order to characterise internal alveolar structure. Overall, reindeer alveoli presented a tighter mesh than red deer alveoli, and statistical analysis allowed us to distinguish archaeological antler by species with an accuracy of 96%, regardless of anatomical location on the antler. In conclusion, micro-CT combined with SVM classifiers proves to be a promising additional non-destructive method for antler identification, suitable for archaeological artefacts whose degree of human modification and cultural heritage or scientific value has previously made it impossible (tools, ornaments, etc.). PMID:26901355
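A minimal sketch of the classification step: an RBF-kernel SVM on six standardized microarchitecture biomarkers, scored with cross-validation. The two-class data are simulated to mimic the reported "tighter mesh" separation, not the Isturitz micro-CT measurements.

    # SVM species classification from six micro-CT biomarkers (simulated data).
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(5)
    n = 60                                   # 30 reindeer, 30 red deer samples
    X = np.vstack([rng.normal(0.0, 1.0, (n // 2, 6)),   # reindeer: tighter mesh
                   rng.normal(1.2, 1.0, (n // 2, 6))])  # red deer
    y = np.repeat([0, 1], n // 2)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"cross-validated accuracy ~ {acc:.2f}")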
Advanced Gear Alloys for Ultra High Strength Applications
NASA Technical Reports Server (NTRS)
Shen, Tony; Krantz, Timothy; Sebastian, Jason
2011-01-01
Single tooth bending fatigue (STBF) test data of UHS Ferrium C61 and C64 alloys are presented in comparison with historical test data of conventional gear steels (9310 and Pyrowear 53), using comparable statistical analysis methods. Pitting and scoring tests of C61 and C64 are works in progress. Boeing statistical analysis of STBF test data for the four gear steels (C61, C64, 9310 and Pyrowear 53) indicates that the UHS grades exhibit increases in fatigue strength in the low cycle fatigue (LCF) regime. In the high cycle fatigue (HCF) regime, the UHS steels exhibit better mean fatigue strength endurance limit behavior (particularly as compared to Pyrowear 53). However, due to considerable scatter in the UHS test data, the anticipated overall benefits of the UHS grades in bending fatigue have not been fully demonstrated. Based on all the test data and on Boeing's analysis, C61 has been selected by Boeing as the gear steel for the final ERDS demonstrator test gearboxes. In terms of potential follow-up work, detailed physics-based, micromechanical analysis and modeling of the fatigue data would allow for a better understanding of the causes of the experimental scatter, and of the transition from high-stress LCF (surface-dominated) to low-stress HCF (subsurface-dominated) fatigue failure. Additional STBF test data and failure analysis work, particularly in the HCF regime and around the endurance limit stress, could allow for better statistical confidence and could reduce the observed effects of experimental test scatter. Finally, the need for further optimization of the residual compressive stress profiles of the UHS steels (resulting from carburization and peening) is noted, particularly for the case of the higher hardness C64 material.
Advances in primate stable isotope ecology-Achievements and future prospects.
Crowley, Brooke E; Reitsema, Laurie J; Oelze, Vicky M; Sponheimer, Matt
2016-10-01
Stable isotope biogeochemistry has been used to investigate foraging ecology in non-human primates for nearly 30 years. Whereas early studies focused on diet, more recently, isotopic analysis has been used to address a diversity of ecological questions ranging from niche partitioning to nutritional status to variability in life history traits. With this increasing array of applications, stable isotope analysis stands to make major contributions to our understanding of primate behavior and biology. Most notably, isotopic data provide novel insights into primate feeding behaviors that may not otherwise be detectable. This special issue brings together some of the recent advances in this relatively new field. In this introduction to the special issue, we review the state of isotopic applications in primatology and its origins and describe some developing methodological issues, including techniques for analyzing different tissue types, statistical approaches, and isotopic baselines. We then discuss the future directions we envision for the field of primate isotope ecology. Am. J. Primatol. 78:995-1003, 2016. © 2015 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Dutton, Gregory
Forensic science is a collection of applied disciplines that draws from all branches of science. A key question in forensic analysis is: to what degree do a piece of evidence and a known reference sample share characteristics? Quantification of similarity, estimation of uncertainty, and determination of relevant population statistics are of current concern. A 2016 PCAST report questioned the foundational validity and the validity in practice of several forensic disciplines, including latent fingerprints, firearms comparisons and DNA mixture interpretation. One recommendation was the advancement of objective, automated comparison methods based on image analysis and machine learning. These concerns parallel the National Institute of Justice's ongoing R&D investments in applied chemistry, biology and physics. NIJ maintains a funding program spanning fundamental research with potential for forensic application to the validation of novel instruments and methods. Since 2009, NIJ has funded over $179M in external research to support the advancement of accuracy, validity and efficiency in the forensic sciences. An overview of NIJ's programs will be presented, with examples of relevant projects from fluid dynamics, 3D imaging, acoustics, and materials science.
Wu, Kana; Spiegelman, Donna; Hou, Tao; Albanes, Demetrius; Allen, Naomi E; Berndt, Sonja I; van den Brandt, Piet A; Giles, Graham G; Giovannucci, Edward; Alexandra Goldbohm, R; Goodman, Gary G; Goodman, Phyllis J; Håkansson, Niclas; Inoue, Manami; Key, Timothy J; Kolonel, Laurence N; Männistö, Satu; McCullough, Marjorie L; Neuhouser, Marian L; Park, Yikyung; Platz, Elizabeth A; Schenk, Jeannette M; Sinha, Rashmi; Stampfer, Meir J; Stevens, Victoria L; Tsugane, Shoichiro; Visvanathan, Kala; Wilkens, Lynne R; Wolk, Alicja; Ziegler, Regina G; Smith-Warner, Stephanie A
2016-05-15
Reports relating meat intake to prostate cancer risk are inconsistent. Associations between these dietary factors and prostate cancer were examined in a consortium of 15 cohort studies. During follow-up, 52,683 incident prostate cancer cases, including 4,924 advanced cases, were identified among 842,149 men. Cox proportional hazard models were used to calculate study-specific relative risks (RR) and then pooled using random effects models. Results do not support a substantial effect of total red, unprocessed red and processed meat for all prostate cancer outcomes, except for a modest positive association for tumors identified as advanced stage at diagnosis (advanced(r)). For seafood, no substantial effect was observed for prostate cancer regardless of stage or grade. Poultry intake was inversely associated with risk of advanced and fatal cancers (pooled multivariable RR [MVRR], 95% confidence interval, comparing ≥45 vs. <5 g/day: advanced 0.83, 0.70-0.99, trend test p value 0.29; fatal 0.69, 0.59-0.82, trend test p value 0.16). Participants who ate ≥25 versus <5 g/day of eggs (1 egg ∼ 50 g) had a significant 14% increased risk of advanced and fatal cancers (advanced 1.14, 1.01-1.28, trend test p value 0.01; fatal 1.14, 1.00-1.30, trend test p value 0.01). When associations were analyzed separately by geographical region (North America vs. other continents), positive associations between unprocessed red meat and egg intake, and inverse associations between poultry intake and advanced, advanced(r) and fatal cancers were limited to North American studies. However, differences were only statistically significant for eggs. Observed differences in associations by geographical region warrant further investigation. © 2015 UICC.
A score to estimate the likelihood of detecting advanced colorectal neoplasia at colonoscopy
Kaminski, Michal F; Polkowski, Marcin; Kraszewska, Ewa; Rupinski, Maciej; Butruk, Eugeniusz; Regula, Jaroslaw
2014-01-01
Objective: This study aimed to develop and validate a model to estimate the likelihood of detecting advanced colorectal neoplasia in Caucasian patients. Design: We performed a cross-sectional analysis of database records for 40-year-old to 66-year-old patients who entered a national primary colonoscopy-based screening programme for colorectal cancer in 73 centres in Poland in the year 2007. We used multivariate logistic regression to investigate the associations between clinical variables and the presence of advanced neoplasia in a randomly selected test set, and confirmed the associations in a validation set. We used model coefficients to develop a risk score for detection of advanced colorectal neoplasia. Results: Advanced colorectal neoplasia was detected in 2544 of the 35,918 included participants (7.1%). In the test set, a logistic-regression model showed that independent risk factors for advanced colorectal neoplasia were: age, sex, family history of colorectal cancer, cigarette smoking (p<0.001 for these four factors), and Body Mass Index (p=0.033). In the validation set, the model was well calibrated (ratio of expected to observed risk of advanced neoplasia: 1.00 (95% CI 0.95 to 1.06)) and had moderate discriminatory power (c-statistic 0.62). We developed a score that estimated the likelihood of detecting advanced neoplasia in the validation set, from 1.32% for patients scoring 0 to 19.12% for patients scoring 7–8. Conclusions: The developed and internally validated score, consisting of simple clinical factors, successfully estimates the likelihood of detecting advanced colorectal neoplasia in asymptomatic Caucasian patients. Once externally validated, it may be useful for counselling or designing primary prevention studies. PMID:24385598
A score to estimate the likelihood of detecting advanced colorectal neoplasia at colonoscopy.
Kaminski, Michal F; Polkowski, Marcin; Kraszewska, Ewa; Rupinski, Maciej; Butruk, Eugeniusz; Regula, Jaroslaw
2014-07-01
This study aimed to develop and validate a model to estimate the likelihood of detecting advanced colorectal neoplasia in Caucasian patients. We performed a cross-sectional analysis of database records for 40-year-old to 66-year-old patients who entered a national primary colonoscopy-based screening programme for colorectal cancer in 73 centres in Poland in the year 2007. We used multivariate logistic regression to investigate the associations between clinical variables and the presence of advanced neoplasia in a randomly selected test set, and confirmed the associations in a validation set. We used model coefficients to develop a risk score for detection of advanced colorectal neoplasia. Advanced colorectal neoplasia was detected in 2544 of the 35,918 included participants (7.1%). In the test set, a logistic-regression model showed that independent risk factors for advanced colorectal neoplasia were: age, sex, family history of colorectal cancer, cigarette smoking (p<0.001 for these four factors), and Body Mass Index (p=0.033). In the validation set, the model was well calibrated (ratio of expected to observed risk of advanced neoplasia: 1.00 (95% CI 0.95 to 1.06)) and had moderate discriminatory power (c-statistic 0.62). We developed a score that estimated the likelihood of detecting advanced neoplasia in the validation set, from 1.32% for patients scoring 0 to 19.12% for patients scoring 7-8. The developed and internally validated score, consisting of simple clinical factors, successfully estimates the likelihood of detecting advanced colorectal neoplasia in asymptomatic Caucasian patients. Once externally validated, it may be useful for counselling or designing primary prevention studies. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
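The modelling steps above — logistic regression on simple clinical factors, conversion of coefficients to an integer score, and discrimination summarized by the c-statistic — can be sketched as follows on a simulated cohort; the coefficients and prevalence are invented, not the Polish screening data.

    # Points-based risk score from logistic regression, with c-statistic check.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(6)
    n = 5000
    X = np.column_stack([rng.integers(40, 67, n),          # age (years)
                         rng.integers(0, 2, n),            # male sex
                         rng.integers(0, 2, n),            # family history
                         rng.integers(0, 2, n)])           # smoking
    logit = -7.0 + 0.06*X[:, 0] + 0.6*X[:, 1] + 0.5*X[:, 2] + 0.4*X[:, 3]
    y = rng.random(n) < 1 / (1 + np.exp(-logit))           # simulated outcome
    model = LogisticRegression().fit(X, y)
    # Round scaled coefficients to integer points, as clinical scores do.
    points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)
    score = X @ points
    print("points per factor:", points)
    print(f"c-statistic = {roc_auc_score(y, score):.2f}")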
Computer Managed Instruction: An Application in Teaching Introductory Statistics.
ERIC Educational Resources Information Center
Hudson, Walter W.
1985-01-01
This paper describes a computer managed instruction package for teaching introductory or advanced statistics. The instructional package is described and anecdotal information concerning its performance and student responses to its use over two semesters are given. (Author/BL)
Putting Cognitive Science behind a Statistics Teacher's Intuition
ERIC Educational Resources Information Center
Jones, Karrie A.; Jones, Jennifer L.; Vermette, Paul J.
2011-01-01
Recent advances in cognitive science have led to an enriched understanding of how people learn. Using a framework presented by Willingham, this article examines instructional best practice from the perspective of conceptual understanding and its implications on statistics education.
SECIMTools: a suite of metabolomics data analysis tools.
Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M
2018-04-20
Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares-discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator, LASSO, and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
Casciano, Roman; Chulikavit, Maruit; Perrin, Allison; Liu, Zhimei; Wang, Xufang; Garrison, Louis P
2012-01-01
Everolimus (Afinitor) and sunitinib (Sutent) were recently approved to treat patients with advanced, progressive pancreatic neuroendocrine tumors (pNETs). (Afinitor is a registered trademark of Novartis Pharmaceuticals Corporation, East Hanover, NJ, USA; Sutent is a registered trademark of Pfizer Inc., New York, NY, USA.) This analysis examined the projected cost-effectiveness of everolimus vs sunitinib in this setting from a US payer perspective. A semi-Markov model was developed to simulate a cohort of patients with advanced, progressive pNET and to estimate the cost per life-year gained (LYG) and per quality-adjusted life-year (QALY) gained when treating with everolimus vs sunitinib. Efficacy data were based on a weight-adjusted indirect comparison of the agents using phase 3 trial data. Model health states included: stable disease with no adverse events, stable disease with adverse events, disease progression, and death. Therapy costs were based on wholesale acquisition cost. Other costs such as physician visits, tests, hospitalizations, and adverse event costs were obtained from literature and/or primary research. Utility inputs were based on primary research. Sensitivity analyses were conducted to test the model's robustness. In the base-case analysis, everolimus was associated with an incremental 0.448 LYG (0.304 QALYs) at an incremental cost of $12,673, resulting in an incremental cost-effectiveness ratio (ICER) of $28,281/LYG ($41,702/QALY gained). The ICER fell within the cost per QALY range for many widely used oncology drugs. Sensitivity analyses demonstrated that, overall, there is a trend that everolimus is cost-effective compared to sunitinib in this setting. Results of the indirect analysis were not statistically significant (p > 0.05). Assumptions that treatment patterns are the same across therapies may not represent real-world practice. While the analysis is limited by its reliance on an indirect comparison of two phase 3 studies, everolimus is expected to be cost-effective relative to sunitinib in advanced, progressive pNET.
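The base-case ICER arithmetic is simple enough to verify directly from the figures quoted in the abstract; small differences from the reported values reflect rounding in the published inputs.

    # ICER = incremental cost / incremental effectiveness, per the abstract.
    inc_cost = 12673.0      # incremental cost of everolimus vs sunitinib ($)
    inc_lys = 0.448         # incremental life-years gained
    inc_qalys = 0.304       # incremental QALYs gained

    print(f"ICER = ${inc_cost / inc_lys:,.0f}/LYG, "
          f"${inc_cost / inc_qalys:,.0f}/QALY gained")
    # -> roughly $28,289/LYG and $41,688/QALY vs the reported $28,281 and $41,702.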
Analysis of delay reducing and fuel saving sequencing and spacing algorithms for arrival traffic
NASA Technical Reports Server (NTRS)
Neuman, Frank; Erzberger, Heinz
1991-01-01
The air traffic control subsystem that performs sequencing and spacing is discussed. The function of the sequencing and spacing algorithms is to automatically plan the most efficient landing order and to assign optimally spaced landing times to all arrivals. Several algorithms are described and their statistical performance is examined. Sequencing brings order to an arrival sequence for aircraft. First-come-first-served sequencing (FCFS) establishes a fair order, based on estimated times of arrival, and determines proper separations. Because of the randomness of the arriving traffic, gaps will remain in the sequence of aircraft. Delays are reduced by time-advancing the leading aircraft of each group while still preserving the FCFS order. Tightly spaced groups of aircraft remain with a mix of heavy and large aircraft. Spacing requirements differ for different types of aircraft trailing each other. Traffic is reordered slightly to take advantage of this spacing criterion, thus shortening the groups and reducing average delays. For heavy traffic, delays for different traffic samples vary widely, even when the same set of statistical parameters is used to produce each sample. This report supersedes NASA TM-102795 on the same subject. It includes a new method of time-advance as well as an efficient method of sequencing and spacing for two dependent runways.
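A toy version of the FCFS sequencing-and-spacing step: order aircraft by estimated time of arrival, then delay each landing time as needed to satisfy a wake-vortex separation matrix. The ETAs and separation values are illustrative, not those of the report.

    # First-come-first-served landing schedule with pairwise separation minima.
    HEAVY, LARGE = 0, 1
    SEP = [[90.0, 120.0],       # seconds required behind a heavy: heavy, large
           [60.0, 60.0]]        # behind a large

    def fcfs_schedule(arrivals):
        """arrivals: list of (eta_seconds, weight_class), in any order."""
        slots = []
        last_t = last_cls = None
        for eta, cls in sorted(arrivals):                  # FCFS by ETA
            t = eta
            if last_t is not None:
                t = max(t, last_t + SEP[last_cls][cls])    # enforce separation
            slots.append((t, cls))
            last_t, last_cls = t, cls
        return slots

    traffic = [(0, HEAVY), (30, LARGE), (45, LARGE), (200, HEAVY), (210, LARGE)]
    for t, cls in fcfs_schedule(traffic):
        print(f"landing at t = {t:5.0f} s ({'heavy' if cls == HEAVY else 'large'})")

Time-advance, as described above, would pull the leader of each tightly spaced group earlier to close the gaps this schedule leaves.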
Czyzewska, Jolanta; Guzińska-Ustymowicz, Katarzyna; Pryczynicz, Anna; Kemona, Andrzej; Bandurski, Roman
2009-01-01
Fhit protein is known to play a role in the process of neoplastic transformation. It has been demonstrated that FHIT gene inactivation is manifested by a lack or very low concentration of Fhit protein in tissues collected from tumours in many organs, including head, neck, breast, lungs, stomach or large intestine. The study included a group of 80 patients with advanced gastric carcinomas. The expression of Fhit protein was assessed by means of the immunohistochemical method (avidin-biotin-streptavidin) in sections fixed in formalin and embedded in paraffin, using a rabbit polyclonal anti-Fhit antibody (Abcam, UK) at 1:200. Statistical analysis did not show any correlation of Fhit protein expression in the main mass of the tumour or in lymph node metastases with gender, depth of wall invasion, histological differentiation, Lauren's classification, Bormann's classification, metastases to local lymph nodes or Helicobacter pylori infection. However, a strong statistical correlation was revealed between Fhit protein expression in the main mass of the tumour and both patients' age (p=0.04) and tumour location in the stomach (p=0.02). No relationship was found between Fhit expression in the main mass of the tumour and survival time (p=0.26).
Retrospective analysis of dental implants placed and restored by advanced prosthodontic residents.
Barias, Pamela A; Lee, Damian J; Yuan, Judy Chia-Chun; Sukotjo, Cortino; Campbell, Stephen D; Knoernschild, Kent L
2013-02-01
The purposes of this retrospective clinical review were to: (1) describe the demographics of implant patients, types of implant treatment and implant-supported prostheses in an Advanced Education in Prosthodontics Program, (2) evaluate the survival rate of dental implants placed by prosthodontic residents from 2006 to 2008, and (3) analyze the relationship between resident year of training and implant survival rate. All patients who received dental implants placed by prosthodontic residents from January 2006 to October 2008 in the Advanced Prosthodontics Program at the University of Illinois at Chicago College of Dentistry were selected for this study. Age, gender, implant diameter, length, implant locations, surgical and restorative detail, and year of prosthodontic residency training were collected and analyzed. Life-table and Kaplan-Meier survival analyses were performed based on implants overall, locations, year of training, and use of a computer-generated surgical guide. A log-rank test was performed for implant survival as a function of year of prosthodontic residency training, location, and use of a computer-generated surgical guide (α=0.05). Three hundred and six implants were placed, and of these, seven failed. Life-table and Kaplan-Meier analyses computed a cumulative survival rate (CSR) of 97% for implants overall and for implants placed with a computer-generated surgical guide. No statistical difference was found in implant survival rates as a function of year of training (P=0.85). Dental implants placed by prosthodontic residents had a CSR comparable to previously published studies by other specialties. The year of prosthodontic residency training and implant failure rate did not have any significant relationship. © 2012 by the American College of Prosthodontists.
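For readers unfamiliar with the survival machinery used here, this is a hand-rolled Kaplan-Meier estimator in Python; the follow-up times and failure flags are invented for illustration, not the residents' 306-implant dataset.

    # Kaplan-Meier product-limit estimator of cumulative survival.
    import numpy as np

    def kaplan_meier(time, event):
        """Return (event time, survival) pairs; event=1 is failure, 0 censored."""
        order = np.argsort(time)
        time, event = np.asarray(time)[order], np.asarray(event)[order]
        s, curve = 1.0, []
        for t in np.unique(time[event == 1]):
            at_risk = np.sum(time >= t)
            failures = np.sum((time == t) & (event == 1))
            s *= 1.0 - failures / at_risk
            curve.append((t, s))
        return curve

    months = [3, 8, 12, 12, 18, 24, 24, 30, 36, 36]
    failed = [1, 0, 1, 0, 0, 0, 1, 0, 0, 0]
    for t, s in kaplan_meier(months, failed):
        print(f"t = {t:2d} mo, S(t) = {s:.3f}")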
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Y; Shirato, H; Song, J
2015-06-15
Purpose: This study aims to identify novel prognostic imaging biomarkers in locally advanced pancreatic cancer (LAPC) using quantitative, high-throughput image analysis. Methods: 86 patients with LAPC receiving chemotherapy followed by SBRT were retrospectively studied. All patients had a baseline FDG-PET scan prior to SBRT. For each patient, we extracted 435 PET imaging features of five types: statistical, morphological, textural, histogram, and wavelet. These features went through redundancy checks, robustness analysis, as well as a prescreening process based on their concordance indices with respect to the relevant outcomes. We then performed principal component analysis on the remaining features (number ranged from 10 to 16), and fitted a Cox proportional hazard regression model using the first 3 principal components. Kaplan-Meier analysis was used to assess the ability to distinguish high- versus low-risk patients separated by median predicted survival. To avoid overfitting, all evaluations were based on leave-one-out cross validation (LOOCV), in which each holdout patient was assigned to a risk group according to the model obtained from a separate training set. Results: For predicting overall survival (OS), the most dominant imaging features were wavelet coefficients. There was a statistically significant difference in OS between patients with predicted high and low risk based on LOOCV (hazard ratio: 2.26, p<0.001). Similar imaging features were also strongly associated with local progression-free survival (LPFS) (hazard ratio: 1.53, p=0.026) on LOOCV. In comparison, neither SUVmax nor TLG was associated with LPFS (p=0.103, p=0.433) (Table 1). Results for progression-free survival and distant progression-free survival showed similar trends. Conclusion: Radiomic analysis identified novel imaging features that showed improved prognostic value over conventional methods. These features characterize the degree of intra-tumor heterogeneity reflected on FDG-PET images, and their biological underpinnings warrant further investigation. If validated in large, prospective cohorts, this method could be used to stratify patients based on individualized risk.
NASA Astrophysics Data System (ADS)
Ruggles, Adam J.
2015-11-01
This paper presents improved statistical insight regarding the self-similar scalar mixing process of atmospheric hydrogen jets and the downstream region of under-expanded hydrogen jets. Quantitative planar laser Rayleigh scattering imaging is used to probe both jets. The self-similarity of statistical moments up to the sixth order (beyond the literature-established second order) is documented in both cases. This is achieved using a novel self-similar normalization method that facilitated a degree of statistical convergence typically limited to continuous, point-based measurements. This demonstrates that image-based measurements of a limited number of samples can be used for self-similar scalar mixing studies. Both jets exhibit the same radial trends of these moments, demonstrating that advanced atmospheric self-similarity can be applied in the analysis of under-expanded jets. Self-similar histograms away from the centerline are shown to be the combination of two distributions. The first is attributed to turbulent mixing. The second, a symmetric Poisson-type distribution centered on zero mass fraction, progressively becomes the dominant and eventually sole distribution at the edge of the jet. This distribution is attributed to shot-noise-affected pure air measurements, rather than a diffusive superlayer at the jet boundary. This conclusion is reached after a rigorous measurement uncertainty analysis and inspection of pure air data collected with each hydrogen data set. A threshold based upon the measurement noise analysis is used to separate the turbulent and pure air data, and thus estimate intermittency. Beta-distributions (four parameters) are used to accurately represent the turbulent distribution moments. This combination of measured intermittency and four-parameter beta-distributions constitutes a new, simple approach to model scalar mixing. Comparisons between global moments from the data and moments calculated using the proposed model show excellent agreement. This was attributed to the high quality of the measurements, which reduced the width of the correctly identified, noise-affected pure air distribution relative to the turbulent mixing distribution. The ignitability of the atmospheric jet is determined using the flammability factor calculated from both kernel density estimated (KDE) PDFs and PDFs generated using the newly proposed model. Agreement between contours from both approaches is excellent. Ignitability of the under-expanded jet is also calculated using KDE PDFs. Contours are compared with those calculated by applying the atmospheric model to the under-expanded jet. Once again, agreement is excellent. This work demonstrates that self-similar scalar mixing statistics and ignitability of atmospheric jets can be accurately described by the proposed model. This description can be applied with confidence to under-expanded jets, which are more representative of leak and fuel injection scenarios.
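Under a model of this kind, the flammability factor reduces to the intermittency times the mass of a four-parameter beta distribution between the flammability limits. The Python sketch below shows that arithmetic with illustrative numbers; the shape parameters, support, and limits are assumptions, not values from the paper.

    # Flammability factor = intermittency * beta-distribution mass in limits.
    from scipy.stats import beta

    gamma = 0.85                  # intermittency: fraction of turbulent samples
    a, b = 2.0, 5.0               # beta shape parameters (fit from moments)
    lo, hi = 0.0, 0.4             # four-parameter beta: mass-fraction support
    lfl, ufl = 0.04, 0.30         # assumed flammability limits (mass fraction)

    dist = beta(a, b, loc=lo, scale=hi - lo)
    flammability_factor = gamma * (dist.cdf(ufl) - dist.cdf(lfl))
    print(f"flammability factor = {flammability_factor:.3f}")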
Rocha Dos Santos, Clarissa; da Rocha Filgueiras, Richard; Furtado Malard, Patrícia; Rodrigues da Cunha Barreto-Vianna, Andre; Nogueira, Kaique; da Silva Leite, Carolina; Maurício Mendes de Lima, Eduardo
2018-06-14
Cranial cruciate ligament rupture (CCLR) is the most commonly encountered orthopedic condition in dogs. Among the various techniques to treat this condition, tibial tuberosity advancement (TTA) has been used to obtain rapid recovery of the affected knee. The objective of this study was to evaluate the viability of the use of mesenchymal stem cells (MSC) implanted in the osteotomy site obtained by TTA in nine dogs diagnosed with CCLR. The MSC were isolated from the adipose tissue of the dogs and cultured for eight days, after which the animals were divided into two groups. Animals from the treated group (GT) received cell transport medium containing about 1.5 million MSC, and the animals from the control group (GC) received only the cell transport medium. The study was performed in a double-blind manner using radiographs acquired on days 15, 30, 60 and 120 after the procedure. Evaluations of the density of the trabecular bone were performed using image analysis software. The results were subjected to descriptive statistical analysis, followed by the normality test, Chi-square test, Mann-Whitney test and Tukey's multiple comparison test at p ≤ 0.05. Thirty days after the procedure, the animals of the GT presented a mean ossification 36.45% greater (p ≤ 0.033) than the GC; there were no statistical differences for the other periods. Despite total bone ossification within the expected period, the application of MSC did not shorten the estimated recovery time, and inflammatory factors should be considered when reassessing the timing of the therapeutic intervention.
Analysis of AVHRR, CZCS and historical in situ data off the Oregon Coast
NASA Technical Reports Server (NTRS)
Strub, P. Ted; Chelton, Dudley B.
1990-01-01
The original scientific objectives of this grant were to: (1) characterize the seasonal cycles and interannual variability for phytoplankton concentrations and sea surface temperature (SST) in the California Current using satellite data; and (2) to explore the spatial and temporal relationship between these variables and surface wind forcing. An additional methodological objective was to develop statistical methods for forming mean fields, which minimize the effects of random data gaps and errors in the irregularly sampled CZCS (Coastal Zone Color Scanner) and AVHRR (Advanced Very High Resolution Radiometer) satellite data. A final task was to evaluate the level of uncertainty in the wind fields used for the statistical analysis. Funding in the first year included part of the cost of an image processing system to enable this and other projects to process and analyze satellite data. This report consists of summaries of the major projects carried out with all or partial support from this grant. The appendices include a list of papers and professional presentations supported by the grant, as well as reprints of the major papers and reports.
Collaborative classification of hyperspectral and visible images with convolutional neural network
NASA Astrophysics Data System (ADS)
Zhang, Mengmeng; Li, Wei; Du, Qian
2017-10-01
Recent advances in remote sensing technology have made multisensor data available for the same area, and it is well known that remote sensing data processing and analysis often benefit from multisource data fusion. Specifically, the low spatial resolution of hyperspectral imagery (HSI) degrades the quality of the subsequent classification task, while using visible (VIS) images with high spatial resolution enables high-fidelity spatial analysis. A collaborative classification framework is proposed to fuse HSI and VIS images for finer classification. First, a convolutional neural network model is employed to extract deep spectral features for HSI classification. Second, effective binarized statistical image features are learned as contextual basis vectors for the high-resolution VIS image, followed by a classifier. The proposed approach employs diversified data in a decision fusion, leading to an integration of the rich spectral information, spatial information, and statistical representation information. In particular, the proposed approach eliminates the potential problems of the curse of dimensionality and excessive computation time. The experiments evaluated on two standard data sets demonstrate the better classification performance offered by this framework.
Nuclear magnetic resonance (NMR)-based metabolomics for cancer research.
Ranjan, Renuka; Sinha, Neeraj
2018-05-07
Nuclear magnetic resonance (NMR) has emerged as an effective tool in various spheres of biomedical research, amongst which metabolomics is an important method for the study of various types of disease. Metabolomics has secured a stronghold in cancer research through the development of different NMR methods over time for the study of metabolites, thus identifying key players in the aetiology of cancer. A plethora of one-dimensional and two-dimensional NMR experiments (in solid, semi-solid and solution phases) are utilized to obtain metabolic profiles of biofluids, cell extracts and tissue biopsy samples, which can further be subjected to statistical analysis. Any alteration in the assigned metabolite peaks gives an indication of changes in metabolic pathways. These defined changes demonstrate the utility of NMR in the early diagnosis of cancer and provide further measures to combat malignancy and its progression. This review provides a snapshot of the trending NMR techniques and the statistical analysis involved in the metabolomics of diseases, with emphasis on advances in NMR methodology developed for cancer research. Copyright © 2018 John Wiley & Sons, Ltd.
Effect of mandibular advancement device on sleep bruxism score and sleep quality.
Solanki, Nehal; Singh, Balendra Pratap; Chand, Pooran; Siddharth, Ramashankar; Arya, Deeksha; Kumar, Lakshya; Tripathi, Suryakant; Jivanani, Hemant; Dubey, Abhishek
2017-01-01
The use of mandibular advancement devices (MADs) in the treatment of sleep bruxism is gaining widespread importance. However, the effects of MADs on sleep bruxism scores, sleep quality, and occlusal force are not clear. The purpose of this clinical study was to analyze the effect of MADs on sleep bruxism scores, sleep quality, and occlusal force. This uncontrolled before and after study enrolled 30 participants with sleep bruxism. Outcomes assessed were sleep quality, sleep bruxism scores (sleep bruxism bursts and sleep bruxism episodes/hour), and occlusal force before and after 15 and 30 days of using a MAD. Sleep bruxism scores were assessed by ambulatory polysomnography and sleep quality by using the Pittsburgh sleep quality index (PSQI). Occlusal force was recorded by using a digital gnathodynamometer in the first molar region on both sides. Statistical analysis was done by 1-factor repeated measures ANOVA (α=.05). Statistically significant reductions in sleep bruxism bursts/h, sleep bruxism episodes/h, and PSQI scores were found after 15 and 30 days of using a MAD (P<.001). Statistically significant reduction in occlusal force on both sides was found only after 15 days (P<.001) but not after 30 days of using a MAD (P=.292 on left side, and P=.575 on the right side). The study showed a short-term improvement in sleep bruxism scores, sleep quality, and reduction in occlusal force in sleep bruxism participants after using MADs. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Factors affecting low back pain in adolescents.
Korovessis, Panagiotis; Repantis, Thomas; Baikousis, Andreas
2010-12-01
Prospective multifactorial study on low back pain (LBP) in adolescents. Most studies on LBP have focused on adults, although many investigations have shown that the roots of LBP lie in adolescence. Several mechanical, physical, and behavioral factors have been associated with nonspecific LBP in adolescents. To investigate the effect of all previously reported parameters, together with psychological and psychosocial factors, using advanced statistics, on LBP in adolescents aged 15 to 19 years. Six hundred and eighty-eight students aged 16±1 years from 5 randomly selected high schools participated in this study and completed a questionnaire containing questions on daily activity, backpack carrying, and psychological and psychosocial behavior. Anthropometric data as well as biplane spinal curvatures, together with the questionnaire results, were included in the analysis using advanced statistics. LBP was reported by 41% of the participants. Generally, statistically significant correlations were found between LBP (P=0.002), physical activity (P<0.001), physician consultation (P=0.024), and depression (P<0.001). Sex-related differences were shown regarding LBP intensity (P=0.005) and frequency (P=0.013), stress (P<0.03), depression (P=0.005), and nervous mood (P=0.036) in favor of male students. Male adolescents had continuous energy (P=0.0258) and were calm (P=0.029) in contrast with their female counterparts. LBP was sex-related and was less common in adolescents with frequent activity. Adolescent girls with stress, depressive mood, and low energy have more LBP than boys, which makes physician consultation for LBP more common in female adolescents. Systematic physical activity and control of the psychological profile should decrease LBP frequency and intensity.
A score-statistic approach for determining threshold values in QTL mapping.
Kao, Chen-Hung; Ho, Hsiang-An
2012-06-01
Issues in determining the threshold values in QTL mapping have so far been investigated mostly for backcross and F2 populations with relatively simple genome structures. Investigations of these issues in the progeny populations after F2 (advanced populations), which have relatively more complicated genomes, are generally inadequate. As these advanced populations have been well implemented in QTL mapping, it is important to address these issues for them in more detail. Due to the increasing number of meiosis cycles, the genomes of the advanced populations can be very different from the backcross and F2 genomes. Therefore, special devices that consider the specific genome structures present in the advanced populations are required to resolve these issues. By considering the differences in genome structure between populations, we formulate more general score test statistics and Gaussian processes to evaluate their threshold values. In general, we found that, given a significance level and a genome size, threshold values for QTL detection are higher in denser marker maps and in more advanced populations. Simulations were performed to validate our approach.
Suurmond, Robert; van Rhee, Henk; Hak, Tony
2017-12-01
We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
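The random-effects defaults described above can be reproduced in a few lines of Python: the DerSimonian-Laird tau^2 estimator followed by the Knapp-Hartung adjustment for the confidence interval of the overall effect. The five study effects and variances are synthetic.

    # DerSimonian-Laird random effects with the Knapp-Hartung adjustment.
    import numpy as np
    from scipy.stats import t as t_dist

    yi = np.array([0.30, 0.10, 0.45, 0.25, 0.05])   # study effect sizes
    vi = np.array([0.02, 0.03, 0.015, 0.025, 0.04]) # within-study variances
    k = yi.size

    w = 1.0 / vi
    q = np.sum(w * (yi - np.sum(w * yi) / w.sum())**2)   # Cochran's Q
    c = w.sum() - np.sum(w**2) / w.sum()
    tau2 = max(0.0, (q - (k - 1)) / c)                   # DerSimonian-Laird

    w_star = 1.0 / (vi + tau2)
    mu = np.sum(w_star * yi) / w_star.sum()
    # Knapp-Hartung: t-based CI with an adjusted variance estimate.
    var_kh = np.sum(w_star * (yi - mu)**2) / ((k - 1) * w_star.sum())
    half = t_dist.ppf(0.975, k - 1) * np.sqrt(var_kh)
    print(f"mu = {mu:.3f}, 95% CI [{mu - half:.3f}, {mu + half:.3f}], "
          f"tau^2 = {tau2:.4f}")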
Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM
ERIC Educational Resources Information Center
Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.
2007-01-01
This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…
Drew, Mark S.
2016-01-01
Cutaneous melanoma is the most life-threatening form of skin cancer. Although advanced melanoma is often considered incurable, if detected and excised early, the prognosis is promising. Today, clinicians use computer vision in an increasing number of applications to aid early detection of melanoma through dermatological image analysis (dermoscopy images, in particular). Colour assessment is essential for the clinical diagnosis of skin cancers. Due to this diagnostic importance, many studies have either focused on or employed colour features as a constituent part of their skin lesion analysis systems. These studies range from using low-level colour features, such as simple statistical measures of colours occurring in the lesion, to availing themselves of high-level semantic features such as the presence of blue-white veil, globules, or colour variegation in the lesion. This paper provides a retrospective survey and critical analysis of contributions in this research direction. PMID:28096807
Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung
2014-10-01
Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping, which produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent of any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly used compartment models and major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.
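As one concrete example of the reversible plasma-input case reviewed here, the sketch below implements a Logan plot on a synthetic one-tissue-compartment curve; the rate constants, input function, and the t* = 20 min linearity cutoff are all assumptions made for illustration. The fitted slope estimates the total distribution volume VT.

```python
# Logan graphical analysis on a synthetic 1-tissue-compartment tracer curve.
import numpy as np

t = np.linspace(0.1, 60.0, 240)                   # minutes
cp = 10.0 * t * np.exp(-t / 4.0)                  # synthetic plasma input
K1, k2 = 0.5, 0.1                                 # assumed rates; VT = K1/k2 = 5

# Tissue curve from ct' = K1*cp - k2*ct (simple Euler integration)
ct = np.zeros_like(t)
dt = t[1] - t[0]
for i in range(1, len(t)):
    ct[i] = ct[i - 1] + dt * (K1 * cp[i - 1] - k2 * ct[i - 1])

# Logan transform: y = int(ct)/ct vs x = int(cp)/ct; late-time slope -> VT
int_cp = np.cumsum(cp) * dt
int_ct = np.cumsum(ct) * dt
late = t > 20                                     # t* chosen by eye here
x, y = int_cp[late] / ct[late], int_ct[late] / ct[late]
slope, intercept = np.polyfit(x, y, 1)
print(f"Logan VT estimate: {slope:.2f} (true 5.0)")
```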
NASA Astrophysics Data System (ADS)
The subjects discussed are related to LSI/VLSI based subscriber transmission and customer access for the Integrated Services Digital Network (ISDN), special applications of fiber optics, ISDN and competitive telecommunication services, technical preparations for the Geostationary-Satellite Orbit Conference, high-capacity statistical switching fabrics, networking and distributed systems software, adaptive arrays and cancelers, synchronization and tracking, speech processing, advances in communication terminals, full-color videotex, and a performance analysis of protocols. Advances in data communications are considered along with transmission network plans and progress, direct broadcast satellite systems, packet radio system aspects, new and developing radio technologies and applications, the management of software quality, and Open Systems Interconnection (OSI) aspects of telematic services. Attention is given to personal computers and OSI, the role of software reliability measurement in information systems, and an active array antenna for the next-generation direct broadcast satellite.
Lee, Kian Mun; Hamid, Sharifah Bee Abd
2015-01-19
The performance of advanced photocatalytic degradation of 4-chlorophenoxyacetic acid (4-CPA) depends strongly on photocatalyst dosage, initial concentration, and initial pH. In the present study, a simple response surface methodology (RSM) was applied to investigate the interaction between these three independent factors. To this end, the photocatalytic degradation of 4-CPA in aqueous medium assisted by an ultraviolet-active ZnO photocatalyst was systematically investigated. This study aims to determine the optimum processing parameters that maximize 4-CPA degradation. Based on the results obtained, a maximum of 91% of 4-CPA was successfully degraded under optimal conditions (0.02 g ZnO dosage, 20.00 mg/L of 4-CPA, and pH 7.71). All the experimental data showed good agreement with the predicted results obtained from the statistical analysis.
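RSM studies of this kind typically fit a full second-order polynomial to a set of designed runs and then locate the stationary point of the fitted surface. A minimal sketch of that workflow follows, using invented design points and responses in coded units rather than the study's actual data.

```python
# Fitting a second-order response-surface model and finding its optimum.
import numpy as np

rng = np.random.default_rng(1)
# Coded factors: x1 = ZnO dosage, x2 = initial 4-CPA conc., x3 = initial pH
X = rng.uniform(-1, 1, size=(20, 3))
true = lambda x: 85 - 5*x[:, 0]**2 - 8*x[:, 1]**2 - 6*x[:, 2]**2 + 3*x[:, 0]*x[:, 2]
y = true(X) + rng.normal(0, 1.0, size=20)        # % degradation (synthetic)

def quad_terms(X):
    """Full quadratic design matrix: intercept, linear, interaction, square."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)

# Stationary point of y = b0 + b'x + x'Bx is x = -B^{-1} b / 2
b = beta[1:4]
B = np.array([[beta[7],    beta[4]/2, beta[5]/2],
              [beta[4]/2,  beta[8],   beta[6]/2],
              [beta[5]/2,  beta[6]/2, beta[9]]])
x_opt = -0.5 * np.linalg.solve(B, b)
print("stationary point (coded units):", np.round(x_opt, 2))
```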
Gender differences in research grant applications for pediatric residents.
Gordon, Mary Beth; Osganian, Stavroula K; Emans, S Jean; Lovejoy, Frederick H
2009-08-01
Recent studies have reported gender differences in research grant applications and funding outcomes for medical school faculty. Our goal was to determine whether similar patterns exist at the resident level and, if so, to explore possible explanations. We conducted a retrospective review of all applications to an internal, mentored research grant fund at a large academic pediatric residency program from 2003 to 2008. We determined whether gender differences existed for application characteristics and outcomes and defined significant predictors of success. During the 5-year period, the fund supported 42 (66%) of 64 applications. Among all applicants, men were more likely than women to hold an advanced research degree. Men requested more money than women and obtained more favorable application scores. Funding success rates were not statistically different between male and female applicants. Among funded applicants, men received higher awards than women, although the percentage of requests funded was the same. In a multiple regression analysis, advanced degree was the significant independent predictor of successful funding outcome. Controlling for advanced degree attenuated the association between gender and timing of application, type of project, dollars requested, and dollars awarded; however, even after controlling for advanced degree, women had inferior grant scores compared with men. Gender differences existed in research grant applications and funding among pediatric residents that mirrored faculty patterns. Among residents, these differences were explained in part by the correlation of male gender with holding an advanced research degree.
Teleradiology Via The Naval Remote Medical Diagnosis System (RMDS)
NASA Astrophysics Data System (ADS)
Rasmussen, Will; Stevens, Ilya; Gerber, F. H.; Kuhlman, Jayne A.
1982-01-01
Testing was conducted to obtain qualitative and quantitative (statistical) data on radiology performance using the Remote Medical Diagnosis System (RMDS) Advanced Development Models (ADMs). Based upon data collected during testing with professional radiologists, this analysis addresses the clinical utility of radiographic images transferred through six possible RMDS transmission modes. These radiographs were also viewed under closed-circuit television (CCTV) and lightbox conditions to provide a basis for comparison. The analysis indicates that the RMDS ADM terminals (with a system video resolution of 525 x 256 x 6) would provide satisfactory radiographic images for radiology consultations in emergency cases with gross pathological disorders. However, in cases involving more subtle findings, a system video resolution of 525 x 512 x 8 would be preferable.
Large-scale quantitative analysis of painting arts.
Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong
2014-12-11
Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting art has made rapid progress, researchers have reached a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings and so build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
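Of the three measures, the variety of colors is the easiest to make concrete. The sketch below computes it as the Shannon entropy of a quantised RGB histogram on a stand-in image; the 4-bit-per-channel quantisation is an assumption, and the paper's roughness-exponent estimation is not reproduced here.

```python
# Colour variety as Shannon entropy of a quantised RGB histogram (sketch).
import numpy as np

rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(64, 64, 3))     # stand-in for a painting scan

# Quantise each channel to 4 bits -> 16*16*16 = 4096 possible colours
codes = (img // 16).reshape(-1, 3)
flat = codes[:, 0] * 256 + codes[:, 1] * 16 + codes[:, 2]
counts = np.bincount(flat, minlength=16 * 16 * 16)
p = counts[counts > 0] / counts.sum()

entropy_bits = -(p * np.log2(p)).sum()
print(f"colour variety (entropy): {entropy_bits:.2f} bits")
```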
Jin, Zhaohui; Hartgers, Mindy L; Sanhueza, Cristobal T; Shubert, Christopher R; Alberts, Steven R; Truty, Mark J; Muppa, Prasuna; Nagorney, David M; Smyrk, Thomas C; Hassan, Mohamed; Mahipal, Amit
2018-05-01
Ampullary adenocarcinoma is a rare entity with limited data on prognostic factors. The aim of this study is to identify prognostic factors and assess the benefit of adjuvant therapy in patients with ampullary adenocarcinoma who underwent pancreatoduodenectomy. A cohort of 121 consecutive patients underwent pancreatoduodenectomy for ampullary adenocarcinoma from 2006 to 2016 at Mayo Clinic in Rochester, MN. All patients were confirmed by independent pathologic review to have ampullary carcinoma. Patient survival and its correlation with patient and tumor variables were evaluated by univariate and multivariate analysis. Fifty-three patients (45%) received adjuvant therapy (34 patients had chemotherapy alone, while 19 patients received both chemotherapy and radiation therapy). Fifty-seven percent of the patients were diagnosed with advanced stage disease (stage IIB or higher). Nearly all patients (98.3%) had negative surgical margins. Median overall survival (OS) was 91.8 months (95% CI: 52.6 months to not reached). In multivariate analysis, excellent performance status (ECOG 0), adjuvant therapy, and advanced stage remained statistically significant. Adjuvant therapy was independently associated with improved disease-free survival (hazard ratio [HR]: 0.52, P = 0.04) and overall survival (HR: 0.45, P = 0.03) in patients with advanced disease. Adjuvant therapy was associated with improved survival in patients with resected ampullary cancer, especially with advanced stage disease. A multi-institutional randomized trial is needed to further assess the role of adjuvant therapy in ampullary adenocarcinoma. Copyright © 2018 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
Campbell, J Q; Petrella, A J
2016-09-06
Population-based modeling of the lumbar spine has the potential to be a powerful clinical tool. However, developing a fully parameterized model of the lumbar spine with accurate geometry has remained a challenge. The current study used automated methods for landmark identification to create a statistical shape model of the lumbar spine. The shape model was evaluated using compactness, generalization ability, and specificity. The primary shape modes were analyzed visually, quantitatively, and biomechanically. The biomechanical analysis was performed by using the statistical shape model with an automated method for finite element model generation to create a fully parameterized finite element model of the lumbar spine. Functional finite element models of the mean shape and the extreme shapes (±3 standard deviations) of all 17 shape modes were created demonstrating the robust nature of the methods. This study represents an advancement in finite element modeling of the lumbar spine and will allow population-based modeling in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.
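A minimal sketch of the underlying machinery follows: PCA on aligned landmark coordinates yields shape modes, the cumulative explained variance gives the compactness measure named above, and mode instances at ±3 standard deviations correspond to the extreme shapes evaluated in the paper. The landmark data here are synthetic, not vertebral geometry.

```python
# Statistical shape model via PCA on landmark coordinates (sketch).
import numpy as np

rng = np.random.default_rng(3)
n_shapes, n_landmarks = 40, 100
mean_shape = rng.normal(size=n_landmarks * 3)
# Synthetic training set: mean shape plus two latent modes of variation
modes_true = rng.normal(size=(2, n_landmarks * 3))
coeffs = rng.normal(size=(n_shapes, 2)) * np.array([3.0, 1.5])
X = mean_shape + coeffs @ modes_true + rng.normal(0, 0.05, (n_shapes, n_landmarks * 3))

# PCA via SVD of the centred data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
var = s**2 / (n_shapes - 1)
compactness = np.cumsum(var) / var.sum()          # variance captured per mode
print("modes needed for 95% variance:", np.searchsorted(compactness, 0.95) + 1)

# Instance of mode 1 at +3 standard deviations (an "extreme shape")
shape_plus3 = X.mean(axis=0) + 3 * np.sqrt(var[0]) * Vt[0]
```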
Application of multivariate statistical techniques in microbial ecology.
Paliy, O; Shankar, V
2016-03-01
Recent advances in high-throughput methods of molecular analysis have led to an explosion of studies generating large-scale ecological data sets. Progress has been particularly noticeable in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces a large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure. © 2016 John Wiley & Sons Ltd.
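As a concrete instance of the exploratory procedures such reviews cover, the sketch below runs a principal coordinates analysis (PCoA) on Bray-Curtis dissimilarities between synthetic community profiles; the sample and taxon counts are arbitrary assumptions.

```python
# PCoA (classical MDS) on Bray-Curtis dissimilarities between samples.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(4)
counts = rng.poisson(5, size=(12, 30)).astype(float)   # 12 samples x 30 taxa
rel = counts / counts.sum(axis=1, keepdims=True)       # relative abundances

D = squareform(pdist(rel, metric="braycurtis"))

# Classical MDS: double-centre the squared distance matrix, then eigendecompose
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J
vals, vecs = np.linalg.eigh(B)
order = np.argsort(vals)[::-1]
coords = vecs[:, order[:2]] * np.sqrt(np.maximum(vals[order[:2]], 0))
print("sample ordination (first two axes):\n", np.round(coords, 3))
```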
Karl Pearson and eugenics: personal opinions and scientific rigor.
Delzell, Darcie A P; Poliak, Cathy D
2013-09-01
The influence of personal opinions and biases on scientific conclusions is a threat to the advancement of knowledge. Expertise and experience do not render one immune to this temptation. In this work, one of the founding fathers of statistics, Karl Pearson, is used as an illustration of how even the most talented among us can produce misleading results when inferences are made without caution or reference to potential bias and other analysis limitations. A study performed by Pearson on British Jewish schoolchildren is examined in light of ethical and professional statistical practice. The methodology used and inferences made by Pearson and his coauthor are sometimes questionable and offer insight into how Pearson's support of eugenics and his own British nationalism could have influenced his often careless and far-fetched inferences. A short background on Pearson's work and beliefs is provided, along with an in-depth examination of the authors' overall experimental design and statistical practices. In addition, portions of the study regarding intelligence and tuberculosis are discussed in more detail, along with historical reactions to the work.
Statistics teaching in medical school: opinions of practising doctors.
Miles, Susan; Price, Gill M; Swift, Louise; Shepstone, Lee; Leinster, Sam J
2010-11-04
The General Medical Council expects UK medical graduates to gain some statistical knowledge during their undergraduate education, but provides no specific guidance as to amount, content or teaching method. Published work on statistics teaching for medical undergraduates has been dominated by medical statisticians, with little input from the doctors who will actually be using this knowledge and these skills after graduation. Furthermore, doctors' statistical training needs may have changed due to advances in information technology and the increasing importance of evidence-based medicine. There thus exists a need to investigate the views of practising medical doctors as to the statistical training required for undergraduate medical students, based on their own use of these skills in daily practice. A questionnaire was designed to investigate doctors' views about undergraduate training in statistics and the need for these skills in daily practice, with a view to informing future teaching. The questionnaire was emailed to all clinicians with a link to the University of East Anglia Medical School. Open-ended questions were included to elicit doctors' opinions about both their own undergraduate training in statistics and recommendations for the training of current medical students. Content analysis was performed by two of the authors to systematically categorize and describe all the responses provided by participants. 130 doctors responded, including both hospital consultants and general practitioners. The findings indicated that most had not recognised the value of their undergraduate teaching in statistics and probability at the time, but had subsequently found the skills relevant to their career. Suggestions for improving undergraduate teaching in these areas included referring to actual research and ensuring relevance to, and integration with, clinical practice. Grounding the teaching of statistics in the context of real research studies and including examples of typical clinical work may better prepare medical students for their subsequent careers.
DIANA-microT web server v5.0: service integration into miRNA functional analysis workflows.
Paraskevopoulou, Maria D; Georgakilas, Georgios; Kostoulas, Nikos; Vlachos, Ioannis S; Vergoulis, Thanasis; Reczko, Martin; Filippidis, Christos; Dalamagas, Theodore; Hatzigeorgiou, A G
2013-07-01
MicroRNAs (miRNAs) are small endogenous RNA molecules that regulate gene expression through mRNA degradation and/or translation repression, affecting many biological processes. The DIANA-microT web server (http://www.microrna.gr/webServer) is dedicated to miRNA target prediction/functional analysis and has been widely used by the scientific community since its initial launch in 2009. DIANA-microT v5.0, the new version of the microT server, has been significantly enhanced with an improved target prediction algorithm, DIANA-microT-CDS. It has been updated to incorporate miRBase version 18 and Ensembl version 69. The in silico-predicted miRNA-gene interactions in Homo sapiens, Mus musculus, Drosophila melanogaster and Caenorhabditis elegans exceed 11 million in total. The web server was completely redesigned to host a series of sophisticated workflows, which can be used directly from the online web interface, enabling users without the necessary bioinformatics infrastructure to perform advanced multi-step functional miRNA analyses. For instance, one available pipeline performs miRNA target prediction using different thresholds and meta-analysis statistics, followed by pathway enrichment analysis. DIANA-microT web server v5.0 also supports complete integration with the Taverna Workflow Management System (WMS), using the in-house developed DIANA-Taverna Plug-in. This plug-in provides ready-to-use modules for miRNA target prediction and functional analysis, which can be used to form advanced high-throughput analysis pipelines.
GAPIT version 2: an enhanced integrated tool for genomic association and prediction
USDA-ARS?s Scientific Manuscript database
Most human diseases and agriculturally important traits are complex. Dissecting their genetic architecture requires continued development of innovative and powerful statistical methods. Corresponding advances in computing tools are critical to efficiently use these statistical innovations and to enh...
Advances in Bayesian Modeling in Educational Research
ERIC Educational Resources Information Center
Levy, Roy
2016-01-01
In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…
Replication of long-bone length QTL in the F9-F10 LG,SM advanced intercross.
Norgard, Elizabeth A; Jarvis, Joseph P; Roseman, Charles C; Maxwell, Taylor J; Kenney-Hunt, Jane P; Samocha, Kaitlin E; Pletscher, L Susan; Wang, Bing; Fawcett, Gloria L; Leatherwood, Christopher J; Wolf, Jason B; Cheverud, James M
2009-04-01
Quantitative trait locus (QTL) mapping techniques are frequently used to identify genomic regions associated with variation in phenotypes of interest. However, the F2 intercross and congenic strain populations usually employed have limited genetic resolution, resulting in relatively large confidence intervals that greatly inhibit functional confirmation of statistical results. Here we use the increased resolution of the combined F9 and F10 generations (n = 1455) of the LG,SM advanced intercross to fine-map previously identified QTL associated with the lengths of the humerus, ulna, femur, and tibia. We detected 81 QTL affecting long-bone lengths. Of these, 49 were previously identified in the combined F2-F3 population of this intercross, while 32 represent novel contributors to trait variance. Pleiotropy analysis suggests that most QTL affect three to four long bones or serially homologous limb segments. We also identified 72 epistatic interactions involving 38 QTL and 88 novel regions. This analysis shows that using later generations of an advanced intercross greatly facilitates fine-mapping of confidence intervals, resolving three F2-F3 QTL into multiple linked loci and narrowing confidence intervals of other loci, as well as allowing identification of additional QTL. Further characterization of the biological bases of these QTL will help provide a better understanding of the genetics of small variations in long-bone length.
NASA Astrophysics Data System (ADS)
Lotfy, Hayam M.; Mohamed, Dalia; Elshahed, Mona S.
2018-01-01
In the presented work, several spectrophotometric methods were developed for the simultaneous quantification of canagliflozin (CGZ) and metformin hydrochloride (MTF) in their binary mixture. Two of these methods, response correlation (RC) and advanced balance point-spectrum subtraction (ABP-SS), are introduced for the first time in this work; the latter (ABP-SS) was applied to both the zero-order and the first-derivative spectra of the drugs. In addition, two recently established methods, advanced amplitude modulation (AAM) and advanced absorbance subtraction (AAS), were also applied. All the proposed methods were validated in accordance with the ICH guidelines and proved to be accurate and precise. Additionally, the linearity range, limit of detection and limit of quantification were determined, and selectivity was examined through the analysis of laboratory-prepared mixtures and the combined dosage form of the drugs. The proposed methods were capable of determining the two drugs in the ratio present in the pharmaceutical formulation, CGZ:MTF (1:17), without the need for any preliminary separation, further dilution or standard spiking. The results obtained by the proposed methods were in compliance with those of a reported chromatographic method when compared statistically, proving the absence of any significant difference in accuracy and precision between the proposed and reported methods.
Standardized data collection to build prediction models in oncology: a prototype for rectal cancer.
Meldolesi, Elisa; van Soest, Johan; Damiani, Andrea; Dekker, Andre; Alitto, Anna Rita; Campitelli, Maura; Dinapoli, Nicola; Gatta, Roberto; Gambacorta, Maria Antonietta; Lanzotti, Vito; Lambin, Philippe; Valentini, Vincenzo
2016-01-01
The advances in diagnostic and treatment technology are responsible for a remarkable transformation of the concept of internal medicine, with the establishment of a new idea of personalized medicine. Inter- and intra-patient tumor heterogeneity and the complexity of clinical outcomes and/or treatment toxicity justify the effort to develop predictive models for decision support systems. However, the number of evaluated variables, coming from multiple disciplines (oncology, computer science, bioinformatics, statistics, genomics, imaging, among others), can be very large, making traditional statistical analysis difficult to exploit. Automated data-mining processes and machine learning approaches can be a solution for organizing the massive amount of data and trying to unravel important interactions. The purpose of this paper is to describe a strategy for collecting and analyzing data properly for decision support and to introduce the concept of an 'umbrella protocol' within the framework of 'rapid learning healthcare'.
Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review.
Groppe, David M; Urbach, Thomas P; Kutas, Marta
2011-12-01
Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative: mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation. Copyright © 2011 Society for Psychophysiological Research.
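The software the authors introduce is in MATLAB; purely for illustration, here is a minimal Python sketch of the first correction method, strong FWER control via a permutation "t-max" test, using sign-flipping on synthetic single-subject condition differences. The data dimensions and the injected effect window are invented.

```python
# Strong FWER control across time points via a sign-flip permutation t-max test.
import numpy as np

rng = np.random.default_rng(5)
n_subj, n_times = 20, 300
diff = rng.normal(0, 1, size=(n_subj, n_times))    # condition differences
diff[:, 150:170] += 0.9                            # a genuine effect window

def tvals(x):
    """One-sample t statistics at every time point."""
    return x.mean(0) / (x.std(0, ddof=1) / np.sqrt(x.shape[0]))

t_obs = tvals(diff)

# Null distribution of max |t|: randomly flip each subject's sign
max_null = np.empty(2000)
for i in range(2000):
    signs = rng.choice([-1.0, 1.0], size=(n_subj, 1))
    max_null[i] = np.abs(tvals(diff * signs)).max()

t_crit = np.quantile(max_null, 0.95)
sig = np.abs(t_obs) > t_crit                       # FWER-corrected significance
print(f"critical |t| = {t_crit:.2f}; significant time points: {sig.sum()}")
```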
Challenges of assessing critical thinking and clinical judgment in nurse practitioner students.
Gorton, Karen L; Hayes, Janice
2014-03-01
The purpose of this study was to determine whether there was a relationship between critical thinking skills and clinical judgment in nurse practitioner students. The study used a convenience, nonprobability sampling technique, engaging participants from across the United States. Correlational analysis demonstrated no statistically significant relationship between critical thinking skills and examination-style questions, critical thinking skills and scores on the evaluation and reevaluation of consequences subscale of the Clinical Decision Making in Nursing Scale, and critical thinking skills and the preceptor evaluation tool. The study found no statistically significant relationships between critical thinking skills and clinical judgment. Educators and practitioners could consider further research in these areas to gain insight into how critical thinking is and could be measured, to gain insight into the clinical decision making skills of nurse practitioner students, and to gain insight into the development and measurement of critical thinking skills in advanced practice educational programs. Copyright 2014, SLACK Incorporated.
New efficient optimizing techniques for Kalman filters and numerical weather prediction models
NASA Astrophysics Data System (ADS)
Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis
2016-06-01
The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has been increasing in recent years due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, early warning systems for natural hazards, and questions on global warming and climate change can be listed among them. Within this framework, the utilization of numerical weather and wave prediction systems, in conjunction with advanced statistical techniques that support the elimination of model bias and the reduction of error variability, may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work stems from the solid mathematical background adopted, making use of Information Geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
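The paper's filters are more elaborate, but the core idea of Kalman-filter post-processing can be sketched with a scalar filter that tracks a slowly varying forecast bias; the noise variances and the synthetic forecasts below are assumptions.

```python
# Scalar Kalman filter tracking and removing systematic forecast bias (sketch).
import numpy as np

rng = np.random.default_rng(6)
truth = 8 + 2 * np.sin(np.linspace(0, 6 * np.pi, 200))     # e.g. wind speed
forecast = truth + 1.5 + rng.normal(0, 0.8, size=200)      # biased NWP output

# State: the (slowly varying) forecast bias; observation: verified error
x, P = 0.0, 1.0          # bias estimate and its variance
Q, R = 0.01, 0.64        # assumed process and observation noise variances
corrected = np.empty_like(forecast)
for t in range(200):
    P += Q                                   # predict (bias assumed persistent)
    corrected[t] = forecast[t] - x           # debias using the prior estimate
    obs = forecast[t] - truth[t]             # verified error (known afterwards)
    K = P / (P + R)                          # Kalman gain
    x += K * (obs - x)                       # update bias estimate
    P *= (1 - K)

print(f"raw bias {np.mean(forecast - truth):+.2f}, "
      f"corrected bias {np.mean(corrected - truth):+.2f}")
```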
NASA Astrophysics Data System (ADS)
Liao, Yuqi
The utility patent, as a legal record of invention, is widely believed to be a close proxy for innovation among firms, industries, and economies as a whole. One of the critical drivers of patenting, and ultimately of innovation, is education. The science, technology, engineering and math (STEM) fields in education are of special importance. There is, however, little empirical research to substantiate a connection between STEM education and innovation outcomes. Seeking to fill this gap, this paper finds that, in general, there is no evidence of a meaningful relationship between STEM educational attainment and utility patent conferrals. The relationship of interest, though generally not statistically significant, is stronger for temporary US visa holders than for US citizens or permanent US residents. However, I find a large and statistically significant association between STEM educational attainment and utility patent conferrals for states that have above-average college educational attainment or above-average advanced-industries workforce concentration.
NASA Astrophysics Data System (ADS)
Campbell, B. D.; Higgins, S. R.
2008-12-01
Developing a method for bridging the gap between macroscopic and microscopic measurements of reaction kinetics at the mineral-water interface has important implications in geological and chemical fields. Investigating these reactions on the nanometer scale with SPM is often limited by image analysis and data extraction due to the large quantity of data usually obtained in SPM experiments. Here we present a computer algorithm for automated analysis of mineral-water interface reactions. This algorithm automates the analysis of sequential SPM images by identifying the kinetically active surface sites (i.e., step edges), and by tracking the displacement of these sites from image to image. The step edge positions in each image are readily identified and tracked through time by a standard edge detection algorithm followed by statistical analysis on the Hough Transform of the edge-mapped image. By quantifying this displacement as a function of time, the rate of step edge displacement is determined. Furthermore, the total edge length, also determined from analysis of the Hough Transform, combined with the computed step speed, yields the surface area normalized rate of the reaction. The algorithm was applied to a study of the spiral growth of the calcite(104) surface from supersaturated solutions, yielding results almost 20 times faster than performing this analysis by hand, with results being statistically similar for both analysis methods. This advance in analysis of kinetic data from SPM images will facilitate the building of experimental databases on the microscopic kinetics of mineral-water interface reactions.
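A minimal sketch of the central image-processing step, edge detection followed by a Hough transform whose peak locates the step edge, is given below using scikit-image on synthetic frames; the frame interval and the single-edge geometry are illustrative assumptions, and the statistical screening of Hough peaks described above is omitted.

```python
# Locating a straight step edge via Canny + Hough and tracking its shift.
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_line, hough_line_peaks

def step_position(frame):
    """Return the Hough distance (pixels) of the dominant straight edge."""
    edges = canny(frame, sigma=2.0)
    h, angles, dists = hough_line(edges)
    _, best_angles, best_dists = hough_line_peaks(h, angles, dists, num_peaks=1)
    return best_dists[0]

# Two synthetic frames with a vertical step edge that advances 5 pixels
frame1 = np.zeros((128, 128)); frame1[:, 60:] = 1.0
frame2 = np.zeros((128, 128)); frame2[:, 65:] = 1.0

dt_minutes = 2.0                             # assumed frame interval
speed = (step_position(frame2) - step_position(frame1)) / dt_minutes
print(f"step speed: {speed:.1f} pixels/min")
```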
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Andrew; Haass, Michael; Rintoul, Mark Daniel
GazeAppraise advances the state of the art of gaze pattern analysis using methods that simultaneously analyze spatial and temporal characteristics of gaze patterns. GazeAppraise enables novel research in visual perception and cognition; for example, using shape features as distinguishing elements to assess individual differences in visual search strategy. Given a set of point-to-point gaze sequences, hereafter referred to as scanpaths, the method constructs multiple descriptive features for each scanpath. Once the scanpath features have been calculated, they are used to form a multidimensional vector representing each scanpath, and cluster analysis is performed on the set of vectors from all scanpaths. An additional benefit of this method is the identification of causal or correlated characteristics of the stimuli, subjects, and visual task through statistical analysis of descriptive metadata distributions within and across clusters.
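A minimal sketch of the scanpath-to-cluster pipeline follows; the three shape features used here (path length, bounding-box aspect ratio, net direction) are illustrative stand-ins rather than GazeAppraise's actual feature set, and k-means stands in for whatever cluster analysis the tool performs.

```python
# Scanpaths -> descriptive feature vectors -> cluster analysis (sketch).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)

def features(path):
    """Descriptive shape features for one scanpath (N x 2 gaze points)."""
    steps = np.diff(path, axis=0)
    length = np.linalg.norm(steps, axis=1).sum()
    span = path.max(0) - path.min(0)
    aspect = span[0] / max(span[1], 1e-9)
    net = path[-1] - path[0]
    return [length, aspect, np.arctan2(net[1], net[0])]

# Synthetic scanpaths: half sweep left-to-right, half wander randomly
paths = [np.column_stack([np.linspace(0, 100, 20),
                          rng.normal(50, 2, 20)]) for _ in range(10)]
paths += [rng.uniform(0, 100, size=(20, 2)) for _ in range(10)]

X = np.array([features(p) for p in paths])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster labels:", labels)
```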
Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method
NASA Astrophysics Data System (ADS)
De Waal, Sybrand A.
1996-07-01
A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in either one of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959-summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.
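The slope-ratio algebra can be illustrated directly: if component c occurs only in end-member A, then across mixtures of A and B the ratio X/c plotted against Y/c is linear with a slope equal to X/Y in B, which is how the composition of the c-free substance is recovered. The sketch below checks this with invented compositions; it demonstrates the algebra only, not the paper's statistical testing step.

```python
# Slope-ratio recovery of the end-member lacking the conserved component.
import numpy as np

rng = np.random.default_rng(8)
A = {"c": 10.0, "X": 2.0, "Y": 4.0}      # substance A, contains conserved c
B = {"c": 0.0,  "X": 6.0, "Y": 3.0}      # substance B, free of c

f = rng.uniform(0.2, 0.8, size=30)       # mixing fractions of A per sample
mix = lambda k: f * A[k] + (1 - f) * B[k] + rng.normal(0, 0.02, 30)

x = mix("Y") / mix("c")
y = mix("X") / mix("c")
slope = np.polyfit(x, y, 1)[0]
print(f"estimated X/Y in B: {slope:.2f} (true {B['X'] / B['Y']:.2f})")
```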
NASA Technical Reports Server (NTRS)
Landmann, A. E.; Tillema, H. F.; Macgregor, G. R.
1992-01-01
Finite element analysis (FEA), statistical energy analysis (SEA), and a power flow method (the computer program PAIN) were used to assess low-frequency interior noise associated with advanced propeller installations. FEA and SEA models were used to predict cabin noise and vibration and to evaluate suppression concepts for structure-borne noise associated with the shaft rotational frequency and its harmonics (less than 100 Hz). SEA and PAIN models were used to predict cabin noise and vibration and to evaluate suppression concepts for airborne noise associated with engine-radiated propeller tones. Both aft-mounted and wing-mounted propeller configurations were evaluated. Ground vibration test data from a 727 airplane modified to accept a propeller engine were used for comparison with predictions for the aft-mounted propeller. Similar data from the 767 airplane were used for the wing-mounted comparisons.