The Math Problem: Advertising Students' Attitudes toward Statistics
ERIC Educational Resources Information Center
Fullerton, Jami A.; Kendrick, Alice
2013-01-01
This study used the Students' Attitudes toward Statistics Scale (STATS) to measure attitude toward statistics among a national sample of advertising students. A factor analysis revealed that four underlying factors make up the attitude toward statistics construct--"Interest & Future Applicability," "Confidence," "Statistical Tools," and "Initiative."…
DOT National Transportation Integrated Search
2000-06-01
This report uses statistical analysis of community-level characteristics and qualitatively focused case studies to explore what determines the success of local transportation-related tax measures. The report contains both a statistical analysis of lo...
Mathematical background and attitudes toward statistics in a sample of Spanish college students.
Carmona, José; Martínez, Rafael J; Sánchez, Manuel
2005-08-01
To examine the relation between mathematical background and initial attitudes toward statistics among Spanish college students in the social sciences, the Survey of Attitudes Toward Statistics was given to 827 students. Multivariate analyses tested the effects of two indicators of mathematical background (amount of exposure and achievement in previous courses) on the four subscales. The analysis suggested that grades in previous courses are more related to initial attitudes toward statistics than the number of mathematics courses taken. Mathematical background was related to students' affective responses to statistics but not to their valuing of statistics. Implications for further research are discussed.
Ignjatović, Aleksandra; Stojanović, Miodrag; Milošević, Zoran; Anđelković Apostolović, Marija
2017-12-02
Developing risk models in medicine is not only appealing but also associated with many obstacles at different stages of predictive model development. Initially, the association of one or more biomarkers with a specific outcome was established by statistical significance, but novel and demanding questions required the development of new and more complex statistical techniques. The progress of statistical analysis in biomedical research is best observed through the history of the Framingham study and the development of the Framingham score. Evaluation of predictive models rests on a combination of several metrics. Logistic regression and Cox proportional hazards regression analysis, the calibration test, and ROC curve analysis should be mandatory and eliminatory, while some new statistical techniques should take a central place. To obtain complete information about a new marker in a model, it has recently been recommended to use reclassification tables, calculating the net reclassification index and the integrated discrimination improvement. Decision curve analysis is a novel method for evaluating the clinical usefulness of a predictive model. Notably, the customizing and fine-tuning of the Framingham risk score initiated the development of this statistical analysis. A clinically applicable predictive model should be a trade-off among all the abovementioned statistical metrics: between calibration and discrimination, accuracy and decision-making, costs and benefits, and quality and quantity of the patient's life.
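To make the metrics named above concrete, here is a minimal sketch comparing a baseline and an extended logistic model by ROC AUC and by the categorical net reclassification index. The data are synthetic and the 10%/20% risk cut-offs are hypothetical; this is not the Framingham model.

```python
# Illustrative sketch (synthetic data, hypothetical risk strata):
# discrimination (AUC) and categorical NRI for a new marker.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
x_old = rng.normal(size=n)                      # established marker
x_new = rng.normal(size=n)                      # candidate marker
logit = -2.0 + 0.8 * x_old + 0.6 * x_new
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # simulated outcomes

base = LogisticRegression().fit(x_old[:, None], y)
full = LogisticRegression().fit(np.c_[x_old, x_new], y)
p_base = base.predict_proba(x_old[:, None])[:, 1]
p_full = full.predict_proba(np.c_[x_old, x_new])[:, 1]

print("AUC base:", roc_auc_score(y, p_base))
print("AUC full:", roc_auc_score(y, p_full))

def risk_category(p):
    # Hypothetical strata: <10%, 10-20%, >=20%
    return np.digitize(p, [0.10, 0.20])

up = risk_category(p_full) > risk_category(p_base)
down = risk_category(p_full) < risk_category(p_base)
ev, ne = y == 1, y == 0
nri = (up[ev].mean() - down[ev].mean()) + (down[ne].mean() - up[ne].mean())
print("categorical NRI:", nri)
```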
Zhu, Wensheng; Yuan, Ying; Zhang, Jingwen; Zhou, Fan; Knickmeyer, Rebecca C; Zhu, Hongtu
2017-02-01
The aim of this paper is to systematically evaluate a biased sampling issue associated with genome-wide association analysis (GWAS) of imaging phenotypes for most imaging genetic studies, including the Alzheimer's Disease Neuroimaging Initiative (ADNI). Specifically, the original sampling scheme of these imaging genetic studies is primarily the retrospective case-control design, whereas most existing statistical analyses of these studies ignore such sampling scheme by directly correlating imaging phenotypes (called the secondary traits) with genotype. Although it has been well documented in genetic epidemiology that ignoring the case-control sampling scheme can produce highly biased estimates, and subsequently lead to misleading results and suspicious associations, such findings are not well documented in imaging genetics. We use extensive simulations and a large-scale imaging genetic data analysis of the Alzheimer's Disease Neuroimaging Initiative (ADNI) data to evaluate the effects of the case-control sampling scheme on GWAS results based on some standard statistical methods, such as linear regression methods, while comparing it with several advanced statistical methods that appropriately adjust for the case-control sampling scheme. Copyright © 2016 Elsevier Inc. All rights reserved.
STRengthening analytical thinking for observational studies: the STRATOS initiative.
Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James
2014-12-30
The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even 'standard' analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd.
Bortoletto, Carolina Carvalho; Cordeiro da Silva, Fernanda; Silva, Paula Fernanda da Costa; Leal de Godoy, Camila Haddad; Albertini, Regiane; Motta, Lara J; Mesquita-Ferrari, Raquel Agnelli; Fernandes, Kristianne Porta Santos; Romano, Renata; Bussadori, Sandra Kalil
2014-07-01
[Purpose] The aim of the present study was to evaluate the effect of a biteplate on the cranio-cervical posture of children with bruxism. [Subjects and Methods] Twelve male and female children aged six to 10 years with a diagnosis of bruxism participated in this study. The children used a biteplate during sleep for 30 days and were submitted to three postural evaluations: initial, immediately following placement of the biteplate, and at the end of treatment. Posture analysis was performed with the aid of the Alcimagem(®) 2.1 program. Data analysis (IBM SPSS Statistics 2.0) involved descriptive statistics and the Student's t-test. [Results] A statistically significant difference was found between the initial cranio-cervical angle and the angle immediately following placement of the biteplate. However, no statistically significant difference was found between the initial angle and the angle after one month of biteplate usage. [Conclusion] No significant change in the cranio-cervical posture of the children was found after one month of biteplate usage. However, a reduction occurred in the cranio-cervical angle when the biteplate was in position.
Against Inferential Statistics: How and Why Current Statistics Teaching Gets It Wrong
ERIC Educational Resources Information Center
White, Patrick; Gorard, Stephen
2017-01-01
Recent concerns about a shortage of capacity for statistical and numerical analysis skills among social science students and researchers have prompted a range of initiatives aiming to improve teaching in this area. However, these projects have rarely re-evaluated the content of what is taught to students and have instead focussed primarily on…
Rojas-Rejón, Oscar A; Sánchez, Arturo
2014-07-01
This work studies the effect of initial solid load (4-32 %; w/v, DS) and particle size (0.41-50 mm) on monosaccharide yield of wheat straw subjected to dilute H2SO4 (0.75 %, v/v) pretreatment and enzymatic saccharification. Response surface methodology (RSM) based on a full factorial design (FFD) was used for the statistical analysis of pretreatment and enzymatic hydrolysis. The highest xylose yield obtained during pretreatment (ca. 86 % of theoretical) was achieved at 4 % (w/v, DS) and 25 mm. The solid fraction obtained from the first set of experiments was subjected to enzymatic hydrolysis at constant enzyme dosage (17 FPU/g); statistical analysis revealed that glucose yield was favored with solids pretreated at low initial solid loads and small particle sizes. Dynamic experiments showed that glucose yield did not increase after 48 h of enzymatic hydrolysis. Once pretreatment conditions were established, experiments were carried out with several initial solid loadings (4-24 %; w/v, DS) and enzyme dosages (5-50 FPU/g). Two straw sizes (0.41 and 50 mm) were used for verification purposes. The highest glucose yield (ca. 55 % of theoretical) was achieved at 4 % (w/v, DS), 0.41 mm, and 50 FPU/g. Statistical analysis of the experiments showed that at low enzyme dosage, particle size had a remarkable effect on glucose yield, and initial solid load was the main factor for glucose yield.
Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)
NASA Astrophysics Data System (ADS)
Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee
2010-12-01
Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review available statistical seismology software packages.
An Analysis of the Crash Experience of Vehicles Equipped with Antilock Braking System
DOT National Transportation Integrated Search
1995-06-01
National Center for Statistics and Analysis has recently completed an initial analysis of the crash experience of passenger cars (PCs) and light trucks and vans (LTVs) equipped with antilock braking systems (ABS). Four types of crashes were ide...
NASA Technical Reports Server (NTRS)
Baker, K. B.; Sturrock, P. A.
1975-01-01
The question of whether pulsars form a single group or whether pulsars come in two or more different groups is discussed. It is proposed that such groups might be related to several factors, such as the initial creation of the neutron star or the orientation of the magnetic field axis relative to the spin axis. Various statistical models are examined.
NASA Astrophysics Data System (ADS)
Yang, Haoyu; Hattori, Ken
2018-03-01
We studied the initial stage of iron deposition on an ethanol-saturated Si(111)7 × 7 surface at room temperature using scanning tunneling microscopy (STM). The statistical analysis of the Si adatom height at empty states for Si(111)-C2H5OH before and after the Fe deposition showed different types of adatoms: type B (before the deposition) and type B' (after the deposition) assigned to bare adatoms, type D and type D' to C2H5O-terminated adatoms, and type E' to adatoms with Fe. The analysis of the height distribution revealed that the molecule termination protects adatoms against Fe capture at the initial stage. The analysis also indicated the preferential capture of a single Fe atom at a bare center-adatom rather than a bare corner-adatom among those remaining after the C2H5OH saturation, but no selectivity was observed between faulted and unfaulted half unit-cells. This is the first STM-based report proving that a remaining bare adatom, but not a molecule-terminated adatom, captures a metal atom.
Properties of some statistics for AR-ARCH model with application to technical analysis
NASA Astrophysics Data System (ADS)
Huang, Xudong; Liu, Wei
2009-03-01
In this paper, we investigate some popular technical analysis indexes for the AR-ARCH model as a model of a real stock market. Under the given conditions, we show that the corresponding statistics are asymptotically stationary and that the law of large numbers holds for the frequencies of stock prices falling outside the normal scope of these technical analysis indexes under AR-ARCH, and we give the rate of convergence in the case of nonstationary initial values, which provides a mathematical rationale for these methods of technical analysis in supervising security trends.
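As an illustration of the kind of quantity the paper studies, the following sketch simulates a toy AR(1)-ARCH(1) price series and estimates the long-run frequency with which prices fall outside Bollinger-style bands; all parameter values and the band construction are hypothetical, not taken from the paper.

```python
# Toy AR(1)-ARCH(1) simulation: empirical frequency of prices falling
# outside a moving-average +/- 2-sigma band (hypothetical parameters).
import numpy as np

rng = np.random.default_rng(1)
T, a, w, alpha = 100_000, 0.5, 0.1, 0.3
y = np.zeros(T)
e_prev = 0.0
for t in range(1, T):
    sigma = np.sqrt(w + alpha * e_prev**2)    # ARCH(1) conditional volatility
    e_prev = sigma * rng.standard_normal()
    y[t] = a * y[t - 1] + e_prev              # AR(1) mean dynamics

k = 20                                        # band window length
means = np.convolve(y, np.ones(k) / k, mode="valid")
sq = np.convolve(y**2, np.ones(k) / k, mode="valid")
stds = np.sqrt(np.maximum(sq - means**2, 0.0))
x = y[k - 1:]                                 # align prices with their bands
outside = (x > means + 2 * stds) | (x < means - 2 * stds)
print("empirical frequency outside bands:", outside.mean())
```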
Statistical Methods of Latent Structure Discovery in Child-Directed Speech
ERIC Educational Resources Information Center
Panteleyeva, Natalya B.
2010-01-01
This dissertation investigates how distributional information in the speech stream can assist infants in the initial stages of acquisition of their native language phonology. An exploratory statistical analysis derives this information from the adult speech data in the corpus of conversations between adults and young children in Russian. Because…
Quantitative trait nucleotide analysis using Bayesian model selection.
Blangero, John; Goring, Harald H H; Kent, Jack W; Williams, Jeff T; Peterson, Charles P; Almasy, Laura; Dyer, Thomas D
2005-10-01
Although much attention has been given to statistical genetic methods for the initial localization and fine mapping of quantitative trait loci (QTLs), little methodological work has been done to date on the problem of statistically identifying the most likely functional polymorphisms using sequence data. In this paper we provide a general statistical genetic framework, called Bayesian quantitative trait nucleotide (BQTN) analysis, for assessing the likely functional status of genetic variants. The approach requires the initial enumeration of all genetic variants in a set of resequenced individuals. These polymorphisms are then typed in a large number of individuals (potentially in families), and marker variation is related to quantitative phenotypic variation using Bayesian model selection and averaging. For each sequence variant a posterior probability of effect is obtained and can be used to prioritize additional molecular functional experiments. An example of this quantitative nucleotide analysis is provided using the GAW12 simulated data. The results show that the BQTN method may be useful for choosing the most likely functional variants within a gene (or set of genes). We also include instructions on how to use our computer program, SOLAR, for association analysis and BQTN analysis.
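The following is a conceptual sketch of Bayesian model selection and averaging over candidate variants, in the spirit of the BQTN idea but not the SOLAR implementation: posterior model probabilities are approximated from BIC, and each variant's posterior inclusion probability is read off from the model weights. Genotypes and effects are simulated, with variant 0 constructed to be the functional one.

```python
# Conceptual sketch (not SOLAR): BIC-weighted model averaging over
# variant subsets, yielding a posterior inclusion probability per variant.
import itertools
import numpy as np

rng = np.random.default_rng(2)
n, m = 500, 4                                  # individuals, candidate variants
G = rng.binomial(2, 0.3, size=(n, m)).astype(float)
y = 0.5 * G[:, 0] + rng.normal(size=n)         # only variant 0 has an effect

def bic(X, y):
    # Ordinary least squares fit with intercept; X=None means intercept-only.
    X1 = np.c_[np.ones(len(y)), X] if X is not None else np.ones((len(y), 1))
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    rss = np.sum((y - X1 @ beta) ** 2)
    return len(y) * np.log(rss / len(y)) + X1.shape[1] * np.log(len(y))

models = [s for r in range(m + 1) for s in itertools.combinations(range(m), r)]
bics = np.array([bic(G[:, list(s)] if s else None, y) for s in models])
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()                                   # approximate posterior model probs

for j in range(m):
    pip = sum(wi for wi, s in zip(w, models) if j in s)
    print(f"variant {j}: posterior inclusion probability = {pip:.3f}")
```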
Statistical representation of a spray as a point process
NASA Astrophysics Data System (ADS)
Subramaniam, S.
2000-10-01
The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf) relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed.
Analysis of reference transactions using packaged computer programs.
Calabretta, N; Ross, R
1984-01-01
Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.
Small sample estimation of the reliability function for technical products
NASA Astrophysics Data System (ADS)
Lyamets, L. L.; Yakimenko, I. V.; Kanishchev, O. A.; Bliznyuk, O. A.
2017-12-01
It is demonstrated that, in the absence of large statistical samples obtained from testing complex technical products for failure, statistical estimation of the reliability function of initial elements can be performed by the moments method. A formal description of the moments method is given and its advantages in the analysis of small censored samples are discussed. A modified algorithm is proposed for the implementation of the moments method with the use of only the moments at which the failures of initial elements occur.
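A minimal sketch of the underlying idea, assuming an exponential lifetime model and ignoring the censoring the paper addresses: the first sample moment (the mean failure time) determines the reliability-function estimate R(t) = exp(-t/mean). The failure times below are hypothetical.

```python
# Method-of-moments reliability estimate from a small sample,
# assuming exponential lifetimes (a simplification; no censoring handled).
import numpy as np

failures = np.array([120.0, 340.0, 95.0, 410.0, 230.0])  # hours, small sample
mtbf = failures.mean()                                    # first sample moment

def reliability(t):
    """Estimated probability of surviving past time t."""
    return np.exp(-t / mtbf)

print(f"MTBF estimate = {mtbf:.1f} h, R(100 h) = {reliability(100.0):.3f}")
```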
Kratochwill, Thomas R; Levin, Joel R
2014-04-01
In this commentary, we add to the spirit of the articles appearing in the special series devoted to meta- and statistical analysis of single-case intervention-design data. Following a brief discussion of historical factors leading to our initial involvement in statistical analysis of such data, we discuss: (a) the value added by including statistical-analysis recommendations in the What Works Clearinghouse Standards for single-case intervention designs; (b) the importance of visual analysis in single-case intervention research, along with the distinctive role that could be played by single-case effect-size measures; and (c) the elevated internal validity and statistical-conclusion validity afforded by the incorporation of various forms of randomization into basic single-case design structures. For the future, we envision more widespread application of quantitative analyses, as critical adjuncts to visual analysis, in both primary single-case intervention research studies and literature reviews in the behavioral, educational, and health sciences. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Golik, V. V.; Zemenkova, M. Yu; Seroshtanov, I. V.; Begalko, Z. V.
2018-05-01
The paper presents the results of an analysis of statistical indicators of energy and resource consumption in oil and gas transportation, using one of the regions of Russia as an example. The article analyzes the engineering characteristics of compressor station drives. Official statistical bulletins on the fuel and energy resources of the region in the pipeline oil and gas transportation system were used as the initial data.
Paranjpe, Madhav G; Denton, Melissa D; Vidmar, Tom; Elbekai, Reem H
2014-01-01
Carcinogenicity testing has been performed in conventional 2-year rodent studies for at least 3 decades, whereas short-term carcinogenicity studies in transgenic mice, such as Tg.rasH2, have only been performed over the last decade. In the 2-year conventional rodent studies, interlinked problems that complicate the interpretation of findings, such as increasing trends in initial body weights, increased body weight gains, a high incidence of spontaneous tumors, and low survival, have been well established. However, these end points have not been evaluated in the short-term carcinogenicity studies involving Tg.rasH2 mice. In this article, we present a retrospective analysis of data obtained from control groups in 26-week carcinogenicity studies conducted in Tg.rasH2 mice since 2004. Our analysis showed statistically significant decreasing trends in the initial body weights of both sexes. Although the terminal body weights did not show any significant trends, there was a statistically significant increasing trend in body weight gains, more so in males than in females, which correlated with increasing trends in food consumption. There were no statistically significant alterations in mortality trends. In addition, the incidence of all common spontaneous tumors remained fairly constant, with no statistically significant differences in trends. © The Author(s) 2014.
Students' Initial Knowledge State and Test Design: Towards a Valid and Reliable Test Instrument
ERIC Educational Resources Information Center
CoPo, Antonio Roland I.
2015-01-01
Designing a good test instrument involves specifications, test construction, validation, try-out, analysis, and revision. The initial knowledge state of forty (40) tertiary students enrolled in a Business Statistics course was determined, and the same test instrument underwent validation. The designed test instrument did not only reveal the baseline…
An improved method for determining force balance calibration accuracy
NASA Technical Reports Server (NTRS)
Ferris, Alice T.
1993-01-01
The results of an improved statistical method used at Langley Research Center for determining and stating the accuracy of a force balance calibration are presented. The application of the method for initial loads, initial load determination, auxiliary loads, primary loads, and proof loads is described. The data analysis is briefly addressed.
de Agostino Biella Passos, Vivian; de Carvalho Carrara, Cleide Felício; da Silva Dalben, Gisele; Costa, Beatriz; Gomide, Marcia Ribeiro
2014-03-01
To evaluate the prevalence of fistulas after palate repair and analyze their location and association with possible causal factors. Retrospective analysis of patient records and evaluation of preoperative initial photographs. Tertiary craniofacial center. Five hundred eighty-nine individuals with complete unilateral cleft lip and palate who underwent palate repair at the age of 12 to 36 months by the von Langenbeck technique, in a single stage, by the plastic surgery team of the hospital, from January 2003 to July 2007. The cleft width was visually classified by a single examiner as narrow, regular, or wide. The following regions of the palate were considered for the location: anterior, medium, transition (between hard and soft palate), and soft palate. Descriptive statistics were computed, and associations between the occurrence of fistula and the different parameters were evaluated. Palatal fistulas were observed in 27% of the sample, with a greater proportion at the anterior region (37.11%). The chi-square statistical test revealed statistically significant associations (P ≤ .05) between the fistulas and initial cleft width (P = .0003), intraoperative problems (P = .0037), and postoperative problems (P = .00002). The prevalence of palatal fistula was similar to mean values reported in the literature. Analysis of causal factors showed a positive association of palatal fistulas with wide and regular initial cleft width and with intraoperative and postoperative problems. The anterior region presented the greatest occurrence of fistulas.
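For illustration, a chi-square test of association between fistula occurrence and initial cleft width, of the kind reported above, can be run as follows; the contingency counts here are made up, not the study's data.

```python
# Illustrative only: chi-square test of association on a hypothetical
# cleft-width x fistula contingency table (not the study's data).
from scipy.stats import chi2_contingency

#            fistula  no fistula
table = [[10, 40],    # narrow
         [30, 90],    # regular
         [45, 60]]    # wide
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```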
NASA Technical Reports Server (NTRS)
Aires, Filipe; Rossow, William B.; Chedin, Alain; Hansen, James E. (Technical Monitor)
2001-01-01
The Independent Component Analysis is a recently developed technique for component extraction. This new method requires the statistical independence of the extracted components, a stronger constraint that uses higher-order statistics, instead of the classical decorrelation, a weaker constraint that uses only second-order statistics. This technique has been used recently for the analysis of geophysical time series with the goal of investigating the causes of variability in observed data (i.e., an exploratory approach). We demonstrate with a data simulation experiment that, if initialized with a Principal Component Analysis, the Independent Component Analysis performs a rotation of the classical PCA (or EOF) solution. This rotation uses no localization criterion like other Rotation Techniques (RT); only the global generalization of decorrelation by statistical independence is used. This rotation of the PCA solution seems able to overcome the tendency of PCA to mix several physical phenomena, even when the signal is just their linear sum.
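The rotation interpretation can be demonstrated with a small synthetic experiment: PCA whitening decorrelates two mixed sources but still blends them, while ICA, working from the whitened representation, finds the rotation that separates them. The sketch below uses scikit-learn, which is an assumption; the paper does not specify an implementation.

```python
# Synthetic demo: PCA decorrelates mixed sources, ICA recovers them.
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(3)
t = np.linspace(0, 8, 2000)
s1 = np.sign(np.sin(3 * t))                  # square-wave source
s2 = rng.laplace(size=t.size)                # non-Gaussian noise source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.5], [0.5, 1.0]])       # mixing matrix
X = S @ A.T                                  # observed linear mixtures

Z = PCA(whiten=True).fit_transform(X)        # decorrelated (PCA) components
S_hat = FastICA(random_state=0).fit_transform(X)  # independent components

# PCA components are uncorrelated yet each still mixes s1 and s2;
# each ICA component correlates strongly with a single source.
for name, comp in [("PCA", Z), ("ICA", S_hat)]:
    c = np.abs(np.corrcoef(np.c_[comp, S].T)[:2, 2:])
    print(name, np.round(c, 2))
```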
Orchestrating high-throughput genomic analysis with Bioconductor
Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin
2015-01-01
Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503
ERIC Educational Resources Information Center
Kleiner, Brian; Thomas, Nina; Lewis, Laurie
2007-01-01
This report presents findings from a 2006 national survey of all Title IV degree-granting 4- year postsecondary institutions on how teacher candidates within teacher education programs for initial licensure are being prepared to use educational technology once they enter the field. The "Educational Technology in Teacher Education Programs…
2014-12-01
example of maximizing or minimizing decision variables within a model. Carol Stoker and Stephen Mehay present a comparative analysis of marketing and advertising strategies...strategy development process; documenting various recruiting, marketing, and advertising initiatives in each nation; and examining efforts to
CORSSA: Community Online Resource for Statistical Seismicity Analysis
NASA Astrophysics Data System (ADS)
Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.
2011-12-01
Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology-especially to those aspects with great impact on public policy-statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles with an additional six in draft form along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.
NASA Astrophysics Data System (ADS)
Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha
2015-01-01
Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or difference in fiber structure between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm or SAFIRA [1], which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command-line mode. Here, we introduce a new, intuitive, easy to use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix, and comparison of the full deformation tensors [2]. This software will be freely disseminated to the neuroimaging research community.
Survival analysis and classification methods for forest fire size.
Tremblay, Pier-Olivier; Duchesne, Thierry; Cumming, Steven G
2018-01-01
Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at "being held" (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at "being held" exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances.
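As a sketch of the preferred model, the following assumes the lifelines library and hypothetical column names (not the paper's data): fire size plays the role of the survival "time" scale, the event is the fire reaching "being held," and the covariates are a fire-weather index and a dummy-coded fuel type.

```python
# Minimal Cox proportional hazards sketch with invented fire records.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "final_size_ha": [3.2, 40.0, 1.5, 120.0, 8.8, 60.3, 2.1, 15.4],
    "held": [1, 1, 1, 0, 1, 1, 1, 0],          # 1 = "being held" observed
    "fire_weather_index": [8.0, 25.5, 5.2, 33.0, 15.1, 28.7, 7.4, 19.9],
    "fuel_conifer": [0, 1, 0, 1, 1, 0, 0, 1],  # simplistic fuel-type dummy
})

cph = CoxPHFitter()
cph.fit(df, duration_col="final_size_ha", event_col="held")
cph.print_summary()   # hazard ratios and significance for each covariate
```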
Organization and Visualization for Initial Analysis of Forced-Choice Ipsative Data
ERIC Educational Resources Information Center
Cochran, Jill A.
2015-01-01
Forced-choice ipsative data are common in personality, philosophy and other preference-based studies. However, this type of data inherently contains dependencies that are challenging for usual statistical analysis. In order to utilize the structure of the data as a guide for analysis rather than as a challenge to manage, a visualisation tool was…
Alaska national hydrography dataset positional accuracy assessment study
Arundel, Samantha; Yamamoto, Kristina H.; Constance, Eric; Mantey, Kim; Vinyard-Houx, Jeremy
2013-01-01
Initial visual assessments showed a wide range in the quality of fit between features in NHD and these new image sources. No statistical analysis has been performed to actually quantify accuracy. Determining absolute accuracy is cost prohibitive (independent, well-defined test points must be collected). Quantitative analysis of relative positional error is feasible.
NASA Technical Reports Server (NTRS)
Roth, D. J.; Swickard, S. M.; Stang, D. B.; Deguire, M. R.
1991-01-01
A review and statistical analysis of the ultrasonic velocity method for estimating the porosity fraction in polycrystalline materials is presented. Initially, a semiempirical model is developed showing the origin of the linear relationship between ultrasonic velocity and porosity fraction. Then, from a compilation of data produced by many researchers, scatter plots of velocity versus percent porosity data are shown for Al2O3, MgO, porcelain-based ceramics, PZT, SiC, Si3N4, steel, tungsten, UO2,(U0.30Pu0.70)C, and YBa2Cu3O(7-x). Linear regression analysis produces predicted slope, intercept, correlation coefficient, level of significance, and confidence interval statistics for the data. Velocity values predicted from regression analysis of fully-dense materials are in good agreement with those calculated from elastic properties.
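The regression statistics the review reports (slope, intercept, correlation coefficient, significance, standard error) can be computed for any velocity-versus-porosity scatter with a one-line fit; the data below are invented for illustration, not taken from the compiled datasets.

```python
# Illustrative linear regression of ultrasonic velocity vs. porosity
# (made-up data), yielding the statistics named in the abstract.
import numpy as np
from scipy import stats

porosity = np.array([0.0, 2.5, 5.0, 7.5, 10.0, 15.0, 20.0])   # percent
velocity = np.array([10.9, 10.5, 10.2, 9.8, 9.4, 8.7, 7.9])   # km/s

res = stats.linregress(porosity, velocity)
print(f"slope = {res.slope:.3f} km/s per % porosity")
print(f"intercept (fully dense) = {res.intercept:.3f} km/s")
print(f"r = {res.rvalue:.3f}, p = {res.pvalue:.2e}, stderr = {res.stderr:.3f}")
```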
Validating an Air Traffic Management Concept of Operation Using Statistical Modeling
NASA Technical Reports Server (NTRS)
He, Yuning; Davies, Misty Dawn
2013-01-01
Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They also can be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical value of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and ultimately a reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques on an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis.
ERIC Educational Resources Information Center
Ba, Harouna; Meade, Terri; Pierson, Elizabeth; Ferguson, Camille; Roy, Amanda; Williams, Hakim
2009-01-01
Forrest County Agricultural High School (FCAHS) is located in Brooklyn, a small rural town in southern Mississippi and part of the Hattiesburg Metropolitan Statistical Area. Unlike the other schools that participated in the Cisco 21S initiative, FCAHS is not part of a larger school district. Therefore, the unit of analysis throughout this summary…
Automated Box-Cox Transformations for Improved Visual Encoding.
Maciejewski, Ross; Pattath, Avin; Ko, Sungahn; Hafen, Ryan; Cleveland, William S; Ebert, David S
2013-01-01
The concept of preconditioning data (utilizing a power transformation as an initial step) for analysis and visualization is well established within the statistical community and is employed as part of statistical modeling and analysis. Such transformations condition the data to various inherent assumptions of statistical inference procedures, as well as making the data more symmetric and easier to visualize and interpret. In this paper, we explore the use of the Box-Cox family of power transformations to semiautomatically adjust visual parameters. We focus on time-series scaling, axis transformations, and color binning for choropleth maps. We illustrate the usage of this transformation through various examples, and discuss the value and some issues in semiautomatically using these transformations for more effective data visualization.
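A minimal sketch of the preconditioning step the paper builds on, applying scipy.stats.boxcox to synthetic right-skewed data; the paper's semiautomatic visual-encoding pipeline is not reproduced here.

```python
# Box-Cox preconditioning demo: fit the power transformation and show
# the reduction in skewness (synthetic lognormal data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x = rng.lognormal(mean=0.0, sigma=1.0, size=1000)   # strongly right-skewed

x_t, lam = stats.boxcox(x)          # transformed data and fitted lambda
print(f"fitted lambda = {lam:.2f}")
print(f"skewness before = {stats.skew(x):.2f}, after = {stats.skew(x_t):.2f}")
```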
Code of Federal Regulations, 2010 CFR
2010-07-01
... by the Administrator. (1) Statistical analysis of initial water penetration data performed to support ASTM Designation D2099-00 indicates that poor quantitative precision is associated with this testing...
An Initial Survey of Fractional Graph and Table Area in Behavioral Journals
ERIC Educational Resources Information Center
Kubina, Richard M., Jr.; Kostewicz, Douglas E.; Datchuck, Shawn M.
2008-01-01
This study examined the fractional graph area (FGA), the proportion of page space used to display statistical graphics, in 11 behavioral journals and places behavior analysis on a continuum with other natural, mathematical, and social science disciplines. The composite FGA of all 11 journals puts behavior analysis within the range of the social…
Finite Element Analysis of Reverberation Chambers
NASA Technical Reports Server (NTRS)
Bunting, Charles F.; Nguyen, Duc T.
2000-01-01
The primary motivating factor behind the initiation of this work was to provide a deterministic means of establishing the validity of the statistical methods that are recommended for the determination of fields that interact in an avionics system. The application of finite element analysis to reverberation chambers is the initial step required to establish a reasonable course of inquiry in this particularly data-intensive study. The use of computational electromagnetics provides a high degree of control of the "experimental" parameters that can be utilized in a simulation of reverberating structures. As the work evolved there were four primary focus areas: 1. The eigenvalue problem for the source-free problem. 2. The development of a complex efficient eigensolver. 3. The application of a source for the TE and TM fields for statistical characterization. 4. The examination of shielding effectiveness in a reverberating environment. One early purpose of this work was to establish the utility of finite element techniques in the development of an extended low frequency statistical model for reverberation phenomena. By employing finite element techniques, structures of arbitrary complexity can be analyzed due to the use of triangular shape functions in the spatial discretization. The effects of both frequency stirring and mechanical stirring are presented. It is suggested that for low frequency operation the typical tuner size is inadequate to provide a sufficiently random field and that frequency stirring should be used. The results of the finite element analysis of the reverberation chamber illustrate the potential utility of a 2D representation for enhancing the basic statistical characteristics of the chamber when operating in a low frequency regime. The basic field statistics are verified for frequency stirring over a wide range of frequencies. Mechanical stirring is shown to provide an effective frequency deviation.
Asquith, William H.; Roussel, Meghan C.
2007-01-01
Estimation of representative hydrographs from design storms, which are known as design hydrographs, provides for cost-effective, risk-mitigated design of drainage structures such as bridges, culverts, roadways, and other infrastructure. During 2001-07, the U.S. Geological Survey (USGS), in cooperation with the Texas Department of Transportation, investigated runoff hydrographs, design storms, unit hydrographs, and watershed-loss models to enhance design hydrograph estimation in Texas. Design hydrographs ideally should mimic the general volume, peak, and shape of observed runoff hydrographs. Design hydrographs commonly are estimated in part by unit hydrographs. A unit hydrograph is defined as the runoff hydrograph that results from a unit pulse of excess rainfall uniformly distributed over the watershed at a constant rate for a specific duration. A time-distributed, watershed-loss model is required for modeling by unit hydrographs. This report develops a specific time-distributed, watershed-loss model known as an initial-abstraction, constant-loss model. For this watershed-loss model, a watershed is conceptualized to have the capacity to store or abstract an absolute depth of rainfall at and near the beginning of a storm. Depths of total rainfall less than this initial abstraction do not produce runoff. The watershed also is conceptualized to have the capacity to remove rainfall at a constant rate (loss) after the initial abstraction is satisfied. Additional rainfall inputs after the initial abstraction is satisfied contribute to runoff if the rainfall rate (intensity) is larger than the constant loss. The initial-abstraction, constant-loss model thus is a two-parameter model. The initial-abstraction, constant-loss model is investigated through detailed computational and statistical analysis of observed rainfall and runoff data for 92 USGS streamflow-gaging stations (watersheds) in Texas with contributing drainage areas from 0.26 to 166 square miles. The analysis is limited to a previously described, watershed-specific, gamma distribution model of the unit hydrograph. In particular, the initial-abstraction, constant-loss model is tuned to the gamma distribution model of the unit hydrograph. A complex computational analysis of observed rainfall and runoff for the 92 watersheds was done to determine, by storm, optimal values of initial abstraction and constant loss. Optimal parameter values for a given storm were defined as those values that produced a modeled runoff hydrograph with volume equal to the observed runoff hydrograph and also minimized the residual sum of squares of the two hydrographs. Subsequently, the means of the optimal parameters were computed on a watershed-specific basis. These means for each watershed are considered the most representative, are tabulated, and are used in further statistical analyses. Statistical analyses of watershed-specific initial abstraction and constant loss include documentation of the distribution of each parameter using the generalized lambda distribution. The analyses show that watershed development has substantial influence on initial abstraction and limited influence on constant loss. The means and medians of the 92 watershed-specific parameters are tabulated with respect to watershed development; although they have considerable uncertainty, these parameters can be used for parameter prediction for ungaged watersheds.
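A minimal sketch of the two-parameter model described above, with a hypothetical hyetograph and parameter values (not values from the report): rainfall first fills the initial abstraction, and each subsequent interval loses up to the constant rate, with the remainder becoming excess rainfall.

```python
# Initial-abstraction, constant-loss model on a made-up hyetograph.
import numpy as np

rain = np.array([0.1, 0.4, 0.9, 1.2, 0.6, 0.2])   # inches per interval
ia, c = 0.8, 0.3    # initial abstraction (in), constant loss (in/interval)

excess = np.zeros_like(rain)
remaining_ia = ia
for i, p in enumerate(rain):
    absorbed = min(p, remaining_ia)        # fill the initial abstraction first
    remaining_ia -= absorbed
    p -= absorbed
    excess[i] = max(p - c, 0.0)            # constant-rate loss on the remainder

print("excess rainfall:", excess, "total:", round(excess.sum(), 2))
```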
The statistical analyses of watershed-specific initial abstraction and constant loss also include development of predictive procedures for estimation of each parameter for ungaged watersheds. Both regression equations and regression trees for estimation of initial abstraction and constant loss are provided. The watershed characteristics included in the regression analyses are (1) main-channel length, (2) a binary factor representing watershed development, (3) a binary factor representing watersheds with an abundance of rocky and thin-soiled terrain, and (4) curve numb
eSACP - a new Nordic initiative towards developing statistical climate services
NASA Astrophysics Data System (ADS)
Thorarinsdottir, Thordis; Thejll, Peter; Drews, Martin; Guttorp, Peter; Venälainen, Ari; Uotila, Petteri; Benestad, Rasmus; Mesquita, Michel d. S.; Madsen, Henrik; Fox Maule, Cathrine
2015-04-01
The Nordic research council NordForsk has recently announced its support for a new 3-year research initiative on "statistical analysis of climate projections" (eSACP). eSACP will focus on developing e-science tools and services based on statistical analysis of climate projections for the purpose of helping decision-makers and planners in the face of expected future challenges in regional climate change. The motivation behind the project is the growing recognition in our society that forecasts of future climate change are associated with various sources of uncertainty, and that any long-term planning and decision-making dependent on a changing climate must account for this. At the same time there is an obvious gap between scientists from different fields and between practitioners in terms of understanding how climate information relates to different parts of the "uncertainty cascade". In eSACP we will develop generic e-science tools and statistical climate services to facilitate the use of climate projections by decision-makers and scientists from all fields for climate impact analyses and for the development of robust adaptation strategies, which properly (in a statistical sense) account for the inherent uncertainty. The new tool will be publically available and include functionality to utilize the extensive and dynamically growing repositories of data and use state-of-the-art statistical techniques to quantify the uncertainty and innovative approaches to visualize the results. Such a tool will not only be valuable for future assessments and underpin the development of dedicated climate services, but will also assist the scientific community in making more clearly its case on the consequences of our changing climate to policy makers and the general public. The eSACP project is led by Thordis Thorarinsdottir, Norwegian Computing Center, and also includes the Finnish Meteorological Institute, the Norwegian Meteorological Institute, the Technical University of Denmark and the Bjerknes Centre for Climate Research, Norway. This poster will present details of focus areas in the project and show some examples of the expected analysis tools.
1975-02-03
the anthropometrists, biologists, and psychologists of that era. Such initial contributors to modern statistics as Francis Galton and Karl Pearson...stem from that period. Galton seemed to be perpetually engaged in data analysis. He and his cousin, Darwin, and others revolved in an age of
Appraising the Corporate Sustainability Reports - Text Mining and Multi-Discriminatory Analysis
NASA Astrophysics Data System (ADS)
Modapothala, J. R.; Issac, B.; Jayamani, E.
The voluntary disclosure of sustainability reports by companies attracts wider stakeholder groups. Diversity in these reports poses a challenge to the users of the information and to regulators. This study appraises corporate sustainability reports per the GRI (Global Reporting Initiative) guidelines (the most widely accepted and used) across all industrial sectors. Text mining is adopted to carry out the initial analysis with a large sample size of 2650 reports. Statistical analyses were performed for further investigation. The results indicate that the disclosures made by the companies differ across industrial sectors. Multivariate Discriminant Analysis (MDA) shows that the environmental variable is a significant contributing factor toward explaining the sustainability reports.
Technical Note: The Initial Stages of Statistical Data Analysis
Tandy, Richard D.
1998-01-01
Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
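The note's central point, that the measurement level of the dependent variable steers the choice between parametric and nonparametric techniques, can be expressed as a small decision helper; the data and the two-group scenario below are hypothetical.

```python
# Toy helper: pick a two-sample test from the measurement level.
from scipy import stats

group_a = [12.1, 13.4, 11.8, 14.0, 12.9]    # e.g. interval/ratio scores
group_b = [10.2, 11.1, 10.8, 11.9, 10.5]

def compare_two_groups(a, b, level):
    """Choose an appropriate two-sample test from the measurement level."""
    if level in ("interval", "ratio"):
        return stats.ttest_ind(a, b)         # parametric t-test
    if level == "ordinal":
        return stats.mannwhitneyu(a, b)      # nonparametric rank test
    raise ValueError("nominal data call for a chi-square-type test")

print(compare_two_groups(group_a, group_b, "ratio"))
print(compare_two_groups(group_a, group_b, "ordinal"))
```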
Hagen, Brad; Awosoga, Oluwagbohunmi A; Kellett, Peter; Damgaard, Marie
2013-04-23
This article describes the results of a qualitative research study evaluating nursing students' experiences of a mandatory course in applied statistics, and the perceived effectiveness of teaching methods implemented during the course. Fifteen nursing students in the third year of a four-year baccalaureate program in nursing participated in focus groups before and after taking the mandatory course in statistics. The interviews were transcribed and analyzed using content analysis to reveal four major themes: (i) "one of those courses you throw out?," (ii) "numbers and terrifying equations," (iii) "first aid for statistics casualties," and (iv) "re-thinking curriculum." Overall, the data revealed that although nursing students initially enter statistics courses with considerable skepticism, fear, and anxiety, there are a number of concrete actions statistics instructors can take to reduce student fear and increase the perceived relevance of courses in statistics.
A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale
Pérez Sánchez, Carlos Javier
2014-01-01
Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has mainly been considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but provide more information on the measures of agreement. For the informative case, some guidelines are presented to elicit the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002
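A minimal sketch of the Monte Carlo idea for one common agreement measure, Cohen's kappa, assuming a Dirichlet prior over the cells of a two-rater contingency table; the counts and the non-informative prior below are illustrative, not the paper's data:

```python
# Sketch: Dirichlet posterior over table cells -> posterior sample of kappa.
# The table and the uniform prior are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
table = np.array([[25, 5, 2],
                  [4, 30, 6],
                  [1, 7, 20]])            # rater A rows, rater B columns
alpha = np.ones(table.size)               # non-informative Dirichlet(1,...,1)

draws = rng.dirichlet(alpha + table.ravel(), size=10_000)
kappas = []
for p in draws:
    p = p.reshape(table.shape)
    po = np.trace(p)                      # observed agreement
    pe = p.sum(axis=1) @ p.sum(axis=0)    # chance agreement
    kappas.append((po - pe) / (1 - pe))

print(np.mean(kappas), np.percentile(kappas, [2.5, 97.5]))
```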
ERIC Educational Resources Information Center
Bornovalova, Marina A.; Levy, Roy; Gratz, Kim L.; Lejuez, C. W.
2010-01-01
The current study investigated the heterogeneity of borderline personality disorder (BPD) symptoms in a sample of 382 inner-city, predominantly African American male substance users through the use of latent class analysis. A 4-class model was statistically preferred, with 1 class interpreted to be a baseline class, 1 class interpreted to be a…
ERIC Educational Resources Information Center
Walsh, Kenneth; Green, Andy; Steedman, Hilary
The impact of developments in work organizations on the skilling process in the United Kingdom was studied through a macro analysis of available statistical information about the development of workplace training in the United Kingdom and case studies of three U.K. firms. The macro analysis focused on the following: initial training arrangements;…
Ronald E. McRoberts; William A. Bechtold; Paul L. Patterson; Charles T. Scott; Gregory A. Reams
2005-01-01
The Forest Inventory and Analysis (FIA) program of the USDA Forest Service has initiated a transition from regional, periodic inventories to an enhanced national FIA program featuring annual measurement of a proportion of plots in each state, greater national consistency, and integration with the ground sampling component of the Forest Health Monitoring (FHM) program...
Statistics of initial density perturbations in heavy ion collisions and their fluid dynamic response
NASA Astrophysics Data System (ADS)
Floerchinger, Stefan; Wiedemann, Urs Achim
2014-08-01
An interesting opportunity to determine thermodynamic and transport properties in more detail is to identify generic statistical properties of initial density perturbations. Here we study event-by-event fluctuations in terms of correlation functions for two models that can be solved analytically. The first assumes Gaussian fluctuations around a distribution that is fixed by the collision geometry but leads to non-Gaussian features after averaging over the reaction plane orientation at non-zero impact parameter. In this context, we derive a three-parameter extension of the commonly used Bessel-Gaussian event-by-event distribution of harmonic flow coefficients. Secondly, we study a model of N independent point sources for which connected n-point correlation functions of initial perturbations scale like 1/N^{n-1}. This scaling is violated for non-central collisions in a way that can be characterized by its impact parameter dependence. We discuss to what extent these are generic properties that can be expected to hold for any model of initial conditions, and how this can improve the fluid dynamical analysis of heavy ion collisions.
Nonlinear dynamics of the cellular-automaton "game of Life"
NASA Astrophysics Data System (ADS)
Garcia, J. B. C.; Gomes, M. A. F.; Jyh, T. I.; Ren, T. I.; Sales, T. R. M.
1993-11-01
A statistical analysis of the "game of Life" due to Conway [Berlekamp, Conway, and Guy, Winning Ways for Your Mathematical Plays (Academic, New York, 1982), Vol. 2] is reported. The results are based on extensive computer simulations starting with uncorrelated distributions of live sites at t=0. The number n(s,t) of clusters of s live sites at time t, the mean cluster size s̄(t), and the diversity of sizes, among other statistical functions, are obtained. The dependence of the statistical functions on the initial density of live sites is examined. Several scaling relations as well as static and dynamic critical exponents are found.
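A sketch of this kind of measurement follows; the lattice size, initial density, run length, and the 4-connected cluster definition are arbitrary choices, not the paper's settings:

```python
# Sketch: evolve Life from an uncorrelated random start, then measure
# cluster counts and the mean cluster size. Parameters are arbitrary,
# and clusters are taken as 4-connected (an assumption).
import numpy as np
from scipy.ndimage import label
from scipy.signal import convolve2d

rng = np.random.default_rng(1)
grid = (rng.random((256, 256)) < 0.3).astype(int)    # initial density 0.3
kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])

for t in range(200):
    n = convolve2d(grid, kernel, mode="same", boundary="wrap")
    # Birth with exactly 3 neighbours; survival with 2 or 3.
    grid = ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

labels, num = label(grid)                            # clusters of live sites
sizes = np.bincount(labels.ravel())[1:]              # drop the background
print("clusters:", num, "mean cluster size:", sizes.mean() if num else 0)
```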
Lewis, Kristin Nicole; Heckman, Bernadette Davantes; Himawan, Lina
2011-08-01
Growth mixture modeling (GMM) identified latent groups based on treatment outcome trajectories of headache disability measures in patients in headache subspecialty treatment clinics. Using a longitudinal design, 219 patients in headache subspecialty clinics in 4 large cities throughout Ohio provided data on their headache disability at pretreatment and 3 follow-up assessments. GMM identified 3 treatment outcome trajectory groups: (1) patients who initiated treatment with elevated disability levels and who reported statistically significant reductions in headache disability (high-disability improvers; 11%); (2) patients who initiated treatment with elevated disability but who reported no reductions in disability (high-disability nonimprovers; 34%); and (3) patients who initiated treatment with moderate disability and who reported statistically significant reductions in headache disability (moderate-disability improvers; 55%). Based on the final multinomial logistic regression model, a dichotomized treatment appointment attendance variable was a statistically significant predictor for differentiating high-disability improvers from high-disability nonimprovers. Three-fourths of patients who initiated treatment with elevated disability levels did not report reductions in disability after 5 months of treatment with new preventive pharmacotherapies. Preventive headache agents may be most efficacious for patients with moderate levels of disability and for patients with high disability levels who attend all treatment appointments. Copyright © 2011 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.
Conditional statistics in a turbulent premixed flame derived from direct numerical simulation
NASA Technical Reports Server (NTRS)
Mantel, Thierry; Bilger, Robert W.
1994-01-01
The objective of this paper is to briefly introduce conditional moment closure (CMC) methods for premixed systems and to derive the transport equation for the conditional species mass fraction conditioned on the progress variable based on the enthalpy. Our statistical analysis will be based on the 3-D DNS database of Trouve and Poinsot available at the Center for Turbulence Research. The initial conditions and characteristics (turbulence, thermo-diffusive properties) as well as the numerical method utilized in the DNS of Trouve and Poinsot are presented, and some details concerning our statistical analysis are also given. From the analysis of DNS results, the effects of the position in the flame brush, of the Damkoehler and Lewis numbers on the conditional mean scalar dissipation, and conditional mean velocity are presented and discussed. Information concerning unconditional turbulent fluxes are also presented. The anomaly found in previous studies of counter-gradient diffusion for the turbulent flux of the progress variable is investigated.
Ramagopalan, Sreeram V; Skingsley, Andrew P; Handunnetthi, Lahiru; Magnus, Daniel; Klingel, Michelle; Pakpoor, Julia; Goldacre, Ben
2015-01-01
We and others have shown a significant proportion of interventional trials registered on ClinicalTrials.gov have their primary outcomes altered after the listed study start and completion dates. The objectives of this study were to investigate whether changes made to primary outcomes are associated with the likelihood of reporting a statistically significant primary outcome on ClinicalTrials.gov. A cross-sectional analysis of all interventional clinical trials registered on ClinicalTrials.gov as of 20 November 2014 was performed. The main outcome was any change made to the initially listed primary outcome and the time of the change in relation to the trial start and end date. 13,238 completed interventional trials were registered with ClinicalTrials.gov that also had study results posted on the website. 2555 (19.3%) had one or more statistically significant primary outcomes. Statistical analysis showed that registration year, funding source and primary outcome change after trial completion were associated with reporting a statistically significant primary outcome. Funding source and primary outcome change after trial completion are associated with a statistically significant primary outcome report on ClinicalTrials.gov.
Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup
2010-10-01
We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis techniques. Initially, specific rates of cell growth, glucose/amino acid consumptions and mAb/metabolite productions were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine, (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine, and (iii) lysine, valine and isoleucine. Further analysis using partial least square (PLS) regression identified key amino acids which were positively or negatively correlated with the cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.
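The two-stage multivariate step can be sketched as follows; the data matrix is random and stands in for the preprocessed specific-rate measurements, so the clusters and coefficients are purely illustrative:

```python
# Sketch: PCA to expose cluster structure in amino-acid consumption
# profiles, then PLS regression against a response such as mAb titer.
# The rates matrix is simulated, not the study's data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
rates = rng.normal(size=(12, 15))          # 12 cultures x 15 amino acids
mab = rates[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=12)

scores = PCA(n_components=2).fit_transform(rates)   # for cluster inspection
pls = PLSRegression(n_components=2).fit(rates, mab)
# Large-magnitude PLS coefficients flag amino acids most positively or
# negatively correlated with the response.
print(np.round(pls.coef_.ravel(), 2))
```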
Olavarría, Verónica V; Arima, Hisatomi; Anderson, Craig S; Brunser, Alejandro; Muñoz-Venturelli, Paula; Billot, Laurent; Lavados, Pablo M
2017-02-01
Background The HEADPOST Pilot is a proof-of-concept, open, prospective, multicenter, international, cluster randomized, phase IIb controlled trial, with masked outcome assessment. The trial will test whether a lying-flat head position, initiated in patients within 12 h of onset of acute ischemic stroke involving the anterior circulation, increases cerebral blood flow in the middle cerebral arteries, as measured by transcranial Doppler. The study will also assess the safety and feasibility of patients lying flat for ≥24 h. The trial was conducted in centers in three countries with the ability to perform early transcranial Doppler. A feature of this trial was that patients were randomized to a certain position according to the month of admission to hospital. Objective To outline in detail the predetermined statistical analysis plan for the HEADPOST Pilot study. Methods All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned, with comparisons made between randomized groups. For the outcomes, statistical comparisons to be made between groups are planned and described. Results This statistical analysis plan was developed for the analysis of the results of the HEADPOST Pilot study to be transparent, available, verifiable, and predetermined before data lock. Conclusions We have developed a statistical analysis plan for the HEADPOST Pilot study which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. Trial registration The study is registered under HEADPOST-Pilot, ClinicalTrials.gov Identifier NCT01706094.
Trivedi, Prinal; Edwards, Jode W; Wang, Jelai; Gadbury, Gary L; Srinivasasainagendra, Vinodh; Zakharkin, Stanislav O; Kim, Kyoungmi; Mehta, Tapan; Brand, Jacob P L; Patki, Amit; Page, Grier P; Allison, David B
2005-04-06
Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics) is a software package designed for the analysis of high-dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results generated from data preprocessing methods, quality control analysis and hypothesis testing are output in the form of Excel CSV tables, graphs and an HTML report summarizing the data analysis. HDBStat! is platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.
Using Symbolic-Logic Matrices To Improve Confirmatory Factor Analysis Techniques.
ERIC Educational Resources Information Center
Creighton, Theodore B.; Coleman, Donald G.; Adams, R. C.
A continuing and vexing problem associated with survey instrument development is the creation of items, initially, that correlate favorably a posteriori with constructs being measured. This study tests the use of symbolic-logic matrices developed by D. G. Coleman (1979) in creating factorially "pure" statistically discrete constructs in…
ERIC Educational Resources Information Center
STERNLIEB, GEORGE
This study is an in-depth analysis of the consequences of slum ownership and the impact of the market on the maintenance and rehabilitation of slum tenements. Data are drawn from land parcel statistics and landlord interviews in Newark, N.J. The study describes the measures needed to initiate slum rehabilitation. Particular attention is given to…
Sensory Integration and Ego Development in a Schizophrenic Adolescent Male.
ERIC Educational Resources Information Center
Pettit, Karen A.
1987-01-01
A retrospective study compared hours spent by a schizophrenic adolescent in "time out" before and after initiation of treatment. The study evaluated the effects of sensory integrative treatment on the ability to handle anger and frustration. Results demonstrate the utility of statistical analysis versus visual comparison to validate effectiveness…
NASA Astrophysics Data System (ADS)
Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard
2018-07-01
This work details the analysis of wafer-level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on-wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.
Palosaari, Esa; Punamäki, Raija-Leena; Diab, Marwan; Qouta, Samir
2013-08-01
In a longitudinal study of war-affected children, we tested, first, whether posttraumatic cognitions (PTCs) mediated the relationship between initial and later posttraumatic stress symptoms (PTSSs). Second, we analyzed the relative strength of influences that PTCs and PTSSs have on each other in cross-lagged models of levels and latent change scores. The participants were 240 Palestinian children 10-12 years of age, reporting PTSSs and PTCs measures at 3, 5, and 11 months after a major war. Results show that PTCs did not mediate between initial and later PTSSs. The levels and changes in PTCs statistically significantly predicted later levels and changes in PTSSs, but PTSSs did not statistically significantly predict later PTCs. The results are consistent with the hypothesis that PTCs have a central role in the development and maintenance of PTSSs over time, but they do not support the hypothesis that initial PTSSs develop to chronic PTSSs through negative PTCs. PsycINFO Database Record (c) 2013 APA, all rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad Allen
EDENx is a multivariate data visualization tool that allows interactive, user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks while EDEN is more appropriate for detailed data investigations.
SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michalski, D; Huq, M; Bednarz, G
Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is bigger for scans without the cover. The same is true for the variability of the end of exhalation and inhalation. Other parameters fail to show a difference. For both scans, respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC and LLE measure respiratory signal complexity. Nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature. This contrasts with measures based on harmonic analysis, which are blind to nonlinear features. The dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinearity-based methodology, which reflects respiration characteristics, is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.
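For reference, a compact sketch of one of the complexity measures used here, sample entropy, with the common defaults m=2 and r=0.2·SD; the signal below is a synthetic respiratory-like trace, not patient data:

```python
# Sketch: sample entropy of a synthetic breathing-like signal.
# m=2 and r=0.2*std are common defaults, assumed here.
import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std() if r is None else r
    def count(mm):
        # same template index range for m and m+1, as in the canonical definition
        templ = np.array([x[i:i + mm] for i in range(len(x) - m)])
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        return (np.sum(d <= r) - len(templ)) / 2.0   # matched pairs, no self-matches
    b, a = count(m), count(m + 1)
    return -np.log(a / b)

rng = np.random.default_rng(2)
t = np.linspace(0, 60, 500)
breath = np.sin(2 * np.pi * t / 4) + 0.1 * rng.normal(size=t.size)  # ~4 s period
print("SampEn:", round(sample_entropy(breath), 3))
```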
Back off! The effect of emotion on backward step initiation.
Bouman, Daniëlle; Stins, John F
2018-02-01
The distance regulation (DR) hypothesis states that actors are inclined to increase their distance from an unpleasant stimulus. The current study investigated the relation between emotion and its effect on the control of backward step initiation, which constitutes an avoidance-like behavior. Participants stepped backward on a force plate in response to neutral, high-arousing pleasant and high-arousing unpleasant visual emotional stimuli. Gait initiation parameters and the results of an exploratory analysis of postural sway were compared across the emotion categories using significance testing and Bayesian statistics. Evidence was found that gait initiation parameters were largely unaffected by emotional conditions. In contrast, the exploratory analysis of postural immobility showed a significant effect: highly arousing stimuli (pleasant and unpleasant) resulted in more postural sway immediately preceding gait initiation compared to neutral stimuli. This suggests that arousal, rather than valence, affects pre-step sway. These results contradict the DR hypothesis, since avoidance gait-initiation in response to unpleasant stimuli was no different compared to pleasant stimuli. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
A Framework for Assessing High School Students' Statistical Reasoning.
Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang
2016-01-01
Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.
Population-level interventions in government jurisdictions for dietary sodium reduction.
McLaren, Lindsay; Sumar, Nureen; Barberio, Amanda M; Trieu, Kathy; Lorenzetti, Diane L; Tarasuk, Valerie; Webster, Jacqui; Campbell, Norman Rc
2016-09-16
Excess dietary sodium consumption is a risk factor for high blood pressure, stroke and cardiovascular disease. Currently, dietary sodium consumption in almost every country is too high. Excess sodium intake is associated with high blood pressure, which is common and costly and accounts for a significant burden of disease. A large number of jurisdictions worldwide have implemented population-level dietary sodium reduction initiatives. No systematic review has examined the impact of these initiatives. • To assess the impact of population-level interventions for dietary sodium reduction in government jurisdictions worldwide. • To assess the differential impact of those initiatives by social and economic indicators. We searched the following electronic databases from their start date to 5 January 2015: the Cochrane Central Register of Controlled Trials (CENTRAL); Cochrane Public Health Group Specialised Register; MEDLINE; MEDLINE In Process & Other Non-Indexed Citations; EMBASE; Effective Public Health Practice Project Database; Web of Science; Trials Register of Promoting Health Interventions (TRoPHI) databases; and Latin American Caribbean Health Sciences Literature (LILACS). We also searched grey literature, other national sources and references of included studies. This review was conducted in parallel with a comprehensive review of national sodium reduction efforts under way worldwide (Trieu 2015), through which we gained additional information directly from country contacts. We imposed no restrictions on language or publication status. We included population-level initiatives (i.e. interventions that target whole populations, in this case, government jurisdictions, worldwide) for dietary sodium reduction, with at least one pre-intervention data point and at least one post-intervention data point of comparable jurisdiction. We included populations of all ages and the following types of study designs: cluster-randomised, controlled pre-post, interrupted time series and uncontrolled pre-post. We contacted study authors at different points in the review to ask for missing information. Two review authors extracted data, and two review authors assessed risk of bias for each included initiative. We analysed the impact of initiatives by using estimates of sodium consumption from dietary surveys or urine samples. All estimates were converted to a common metric: salt intake in grams per day. We analysed impact by computing the mean change in salt intake (grams per day) from pre-intervention to post-intervention. We reviewed a total of 881 full-text documents. From these, we identified 15 national initiatives, including more than 260,000 people, that met the inclusion criteria. None of the initiatives were provided in lower-middle-income or low-income countries. All initiatives except one used an uncontrolled pre-post study design. Because of high levels of study heterogeneity (I² > 90%), we focused on individual initiatives rather than on pooled results. Ten initiatives provided sufficient data for quantitative analysis of impact (64,798 participants). As required by the Grades of Recommendation, Assessment, Development and Evaluation (GRADE) method, we graded the evidence as very low due to the risk of bias of the included studies, as well as variation in the direction and size of effect across the studies. Five of these showed mean decreases in average daily salt intake per person from pre-intervention to post-intervention, ranging from 1.15 grams/day less (Finland) to 0.35 grams/day less (Ireland).
Two initiatives showed a mean increase in salt intake from pre-intervention to post-intervention: Canada (1.66 grams/day more per person) and Switzerland (0.80 grams/day more per person). The remaining initiatives did not show a statistically significant mean change. Seven of the 10 initiatives were multi-component and incorporated intervention activities of a structural nature (e.g. food product reformulation, food procurement policy in specific settings). Of those seven initiatives, four showed a statistically significant mean decrease in salt intake from pre-intervention to post-intervention, ranging from Finland to Ireland (see above), and one showed a statistically significant mean increase in salt intake from pre-intervention to post-intervention (Switzerland; see above). Nine initiatives permitted quantitative analysis of differential impact by sex (men and women separately). For women, three initiatives (China, Finland, France) showed a statistically significant mean decrease, four (Austria, Netherlands, Switzerland, United Kingdom) showed no significant change and two (Canada, United States) showed a statistically significant mean increase in salt intake from pre-intervention to post-intervention. For men, five initiatives (Austria, China, Finland, France, United Kingdom) showed a statistically significant mean decrease, three (Netherlands, Switzerland, United States) showed no significant change and one (Canada) showed a statistically significant mean increase in salt intake from pre-intervention to post-intervention. Information was insufficient to indicate whether a differential change in mean salt intake occurred from pre-intervention to post-intervention by other axes of equity included in the PROGRESS framework (e.g. education, place of residence). We identified no adverse effects of these initiatives. The number of initiatives was insufficient to permit other subgroup analyses, including stratification by intervention type, economic status of country and duration (or start year) of the initiative. Many studies had methodological strengths, including large, nationally representative samples of the population and rigorous measurement of dietary sodium intake. However, all studies were scored as having high risk of bias, reflecting the observational nature of the research and the use of an uncontrolled study design. The quality of evidence for the main outcome was low. We could perform a sensitivity analysis only for impact. Population-level interventions in government jurisdictions for dietary sodium reduction have the potential to result in population-wide reductions in salt intake from pre-intervention to post-intervention, particularly if they are multi-component (more than one intervention activity) and incorporate intervention activities of a structural nature (e.g. food product reformulation), and particularly amongst men. Heterogeneity across studies was significant, reflecting different contexts (population and setting) and initiative characteristics. Implementation of future initiatives should embed more effective means of evaluation to help us better understand the variation in the effects.
Quantifying predictability in a model with statistical features of the atmosphere
Kleeman, Richard; Majda, Andrew J.; Timofeyev, Ilya
2002-01-01
The Galerkin truncated inviscid Burgers equation has recently been shown by the authors to be a simple model with many degrees of freedom, with many statistical properties similar to those occurring in dynamical systems relevant to the atmosphere. These properties include long time-correlated, large-scale modes of low frequency variability and short time-correlated “weather modes” at smaller scales. The correlation scaling in the model extends over several decades and may be explained by a simple theory. Here a thorough analysis of the nature of predictability in the idealized system is developed by using a theoretical framework developed by R.K. This analysis is based on a relative entropy functional that has been shown elsewhere by one of the authors to measure the utility of statistical predictions precisely. The analysis is facilitated by the fact that most relevant probability distributions are approximately Gaussian if the initial conditions are assumed to be so. Rather surprisingly this holds for both the equilibrium (climatological) and nonequilibrium (prediction) distributions. We find that in most cases the absolute difference in the first moments of these two distributions (the “signal” component) is the main determinant of predictive utility variations. Contrary to conventional belief in the ensemble prediction area, the dispersion of prediction ensembles is generally of secondary importance in accounting for variations in utility associated with different initial conditions. This conclusion has potentially important implications for practical weather prediction, where traditionally most attention has focused on dispersion and its variability. PMID:12429863
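In the Gaussian case the utility measure and its decomposition can be written down directly. The sketch below computes the relative entropy of a univariate Gaussian prediction p against a Gaussian climatology q, split into the "signal" term (mean shift) and the "dispersion" term (variance change); the numbers are arbitrary, and the univariate reduction is an assumption made for brevity.

```python
# Sketch: relative entropy D(p||q) for Gaussians, split into a signal
# term (mean shift) and a dispersion term (variance change).
import numpy as np

def predictive_utility(mu_p, var_p, mu_q, var_q):
    dispersion = 0.5 * (np.log(var_q / var_p) + var_p / var_q - 1.0)
    signal = 0.5 * (mu_p - mu_q) ** 2 / var_q
    return signal, dispersion, signal + dispersion

# A sharp forecast whose mean departs from climatology: the signal term
# dominates the total, echoing the paper's conclusion.
print(predictive_utility(mu_p=1.2, var_p=0.4, mu_q=0.0, var_q=1.0))
```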
Vasconcelos, Karla Anacleto de; Frota, Silvana Maria Monte Coelho; Ruffino-Netto, Antonio; Kritski, Afrânio Lineu
2018-04-01
To investigate early detection of amikacin-induced ototoxicity in a population treated for multidrug-resistant tuberculosis (MDR-TB), by means of three different tests: pure-tone audiometry (PTA); high-frequency audiometry (HFA); and distortion-product otoacoustic emission (DPOAE) testing. This was a longitudinal prospective cohort study involving patients aged 18-69 years with a diagnosis of MDR-TB who had to receive amikacin for six months as part of their antituberculosis drug regimen for the first time. Hearing was assessed before treatment initiation and at two and six months after treatment initiation. Sequential statistics were used to analyze the results. We included 61 patients, but the final population consisted of 10 patients (7 men and 3 women) because of sequential analysis. Comparison of the test results obtained at two and six months after treatment initiation with those obtained at baseline revealed that HFA at two months and PTA at six months detected hearing threshold shifts consistent with ototoxicity. However, DPOAE testing did not detect such shifts. The statistical method used in this study makes it possible to conclude that, over the six-month period, amikacin-associated hearing threshold shifts were detected by HFA and PTA, and that DPOAE testing was not efficient in detecting such shifts.
Lucas, Nathanael Cc; Hume, Carl G; Al-Chanati, Abdal; Diprose, William; Roberts, Sally; Freeman, Josh; Mogol, Vernon; Hoskins, David; Hamblin, Richard; Frampton, Chris; Bagg, Warwick; Merry, Alan F
2017-01-13
Hand hygiene is important in reducing healthcare-associated infections. The World Health Organization has defined 'five moments' when hand hygiene compliance is required. During 2013, New Zealand national data showed poor compliance with these moments by medical students. To improve medical students' compliance with the five moments. In this prospective student-led quality improvement initiative, student investigators developed, implemented and evaluated a multi-modal intervention comprising a three-month social media campaign, a competition and an entertaining educational video. Data on individual patient-medical student interactions were collected covertly by observers at baseline and at one week, six weeks and three months after initiation of the intervention. During the campaign, compliance improved in moment 2, but not significantly in moments 1, 3, 4 or 5. Statistical analysis of amalgamated data was limited by non-independent data points, a consideration apparently not always addressed in previous studies. The initiative produced improvements in compliance by medical students with one hand hygiene moment. Statistical analysis of amalgamated data for all five moments should allow for the non-independence of each occasion in which clinicians interact with a patient. More work is needed to ensure excellent hand hygiene practices of future doctors.
Statistical analysis of Turbine Engine Diagnostic (TED) field test data
NASA Astrophysics Data System (ADS)
Taylor, Malcolm S.; Monyak, John T.
1994-11-01
During the summer of 1993, a field test of turbine engine diagnostic (TED) software, developed jointly by the U.S. Army Research Laboratory and the U.S. Army Ordnance Center and School, was conducted at Fort Stewart, GA. The data were collected in conformance with a cross-over design, some of whose considerations are detailed. The initial analysis of the field test data was exploratory, followed by a more formal investigation. Technical aspects of the data analysis and the insights that were elicited are reported.
Reaction Event Counting Statistics of Biopolymer Reaction Systems with Dynamic Heterogeneity.
Lim, Yu Rim; Park, Seong Jun; Park, Bo Jung; Cao, Jianshu; Silbey, Robert J; Sung, Jaeyoung
2012-04-10
We investigate the reaction event counting statistics (RECS) of an elementary biopolymer reaction in which the rate coefficient is dependent on the states of the biopolymer and the surrounding environment, and discover a universal kinetic phase transition in the RECS of the reaction system with dynamic heterogeneity. From an exact analysis for a general model of elementary biopolymer reactions, we find that the variance in the number of reaction events is dependent on the square of the mean number of reaction events when the measurement time is small compared to the relaxation time scale of rate coefficient fluctuations, which does not conform to renewal statistics. On the other hand, when the measurement time interval is much greater than the relaxation time of rate coefficient fluctuations, the variance becomes linearly proportional to the mean reaction number, in accordance with renewal statistics. Gillespie's stochastic simulation method is generalized for reaction systems with rate coefficient fluctuations. The simulation results confirm the correctness of the analytic results for the time-dependent mean and variance of the reaction event number distribution. On the basis of the obtained results, we propose a method of quantitative analysis for the reaction event counting statistics of reaction systems with rate coefficient fluctuations, which enables one to extract information about the magnitude and the relaxation times of the fluctuating reaction rate coefficient, without the bias that can be introduced by assuming a particular kinetic model of conformational dynamics and conformation-dependent reactivity. An exact relationship is established between a higher moment of the reaction event number distribution and the multitime correlation of the reaction rate for reaction systems with a nonequilibrium initial state distribution as well as for systems with an equilibrium initial state distribution.
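The short-time limit can be checked with a toy doubly stochastic Poisson model in which the rate coefficient is frozen within each measurement window; this is an illustrative reduction of the paper's general model, with arbitrary parameters:

```python
# Sketch: rate coefficient k frozen per window (measurement time <<
# rate relaxation time), so Var(N) gains a term Var(k)*T^2 that is
# quadratic in the mean and renewal statistics fail.
import numpy as np

rng = np.random.default_rng(3)
T = 5.0                                              # measurement window
k = rng.gamma(shape=4.0, scale=0.5, size=100_000)    # fluctuating rate
N = rng.poisson(k * T)                               # one window per k draw

mean, var = N.mean(), N.var()
# For a plain Poisson (renewal) process var == mean; here the quenched
# disorder adds Var(k) * T^2.
print(mean, var, mean + k.var() * T ** 2)
```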
Colegrave, Nick
2017-01-01
A common approach to the analysis of experimental data across much of the biological sciences is test-qualified pooling. Here non-significant terms are dropped from a statistical model, effectively pooling the variation associated with each removed term with the error term used to test hypotheses (or estimate effect sizes). This pooling is only carried out if statistical testing on the basis of applying that data to a previous more complicated model provides motivation for this model simplification; hence the pooling is test-qualified. In pooling, the researcher increases the degrees of freedom of the error term with the aim of increasing statistical power to test their hypotheses of interest. Despite this approach being widely adopted and explicitly recommended by some of the most widely cited statistical textbooks aimed at biologists, here we argue that (except in highly specialized circumstances that we can identify) the hoped-for improvement in statistical power will be small or non-existent, and there is likely to be much reduced reliability of the statistical procedures through deviation of type I error rates from nominal levels. We thus call for greatly reduced use of test-qualified pooling across experimental biology, more careful justification of any use that continues, and a different philosophy for initial selection of statistical models in the light of this change in procedure. PMID:28330912
NASA Astrophysics Data System (ADS)
Voegel, Phillip D.; Quashnock, Kathryn A.; Heil, Katrina M.
2004-05-01
The Student-to-Student Chemistry Initiative is an outreach program started in the fall of 2001 at Midwestern State University (MSU). The on-campus program trains high school science students to perform a series of chemistry demonstrations and subsequently provides kits containing the necessary supplies and reagents for the high school students to perform demonstration programs at elementary schools. The program focuses on improving student perception of science. The program's impact on high school student perception is evaluated through statistical analysis of paired preparticipation and postparticipation surveys. The surveys focus on four areas of student perception: general attitude toward science, interest in careers in science, science awareness, and interest in attending MSU for postsecondary education. Increased scores were observed in all evaluation areas, including a statistically significant increase in science awareness following participation.
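A paired pre/post comparison of this kind reduces to something like the following sketch; the 5-point ratings are invented, not the survey data:

```python
# Sketch: paired-samples t test on matched pre/post survey ratings.
from scipy import stats

pre  = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]   # pre-participation scores (invented)
post = [4, 3, 4, 4, 3, 3, 5, 4, 3, 4]   # post-participation scores (invented)
t, p = stats.ttest_rel(post, pre)
print(f"t = {t:.2f}, p = {p:.4f}")      # p < .05 -> significant increase
```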
Intervening to promote early initiation of breastfeeding in the LDR.
Komara, Carol; Simpson, Diana; Teasdale, Carla; Whalen, Gaye; Bell, Shay; Giovanetto, Laurie
2007-01-01
To evaluate the effectiveness of an interventional protocol for the early initiation of breastfeeding that would remove barriers in the labor, delivery, recovery (LDR) unit. Descriptive design using 100 postpartum mothers who were interviewed before discharge at a large university hospital in the south-central United States. Descriptive statistics were used for analysis. The protocol was effective for initiating breastfeeding, and breastfeeding increased from 53% to 66%. When barriers to breastfeeding are reduced in the LDR setting, women will breastfeed. It is possible that reducing hospital barriers to breastfeeding in the LDR can also set the stage for sustained breastfeeding during hospitalization and for less supplementation with formula.
Initial blood storage experiment
NASA Technical Reports Server (NTRS)
Surgenor, Douglas MacN.
1988-01-01
The design of the Initial Blood Storage Experiment (IBSE) was based upon a carefully controlled comparison between identical sets of human blood cell suspensions - red cells, white cells, and platelets - one set of which was transported aboard the Columbia on a 6 day 11 hour mission, and the other held on the ground. Both sets were carried inside stainless steel dewars within specially fabricated flight hardware. Individual bags of cell suspensions were randomly assigned with respect to ground vs orbit status, dewar chamber, and specific location within the dewar. To foster optimal preservation, each cell type was held under specific optimal conditions of pH, ionic strength, solute concentration, gas tension, and temperature. An added variable in this initial experiment was provided by the use of three different polymer/plasticizer formulations for the sealed bags which held the blood cells. At termination of the experiment, aliquots of the suspensions, identified only by code, were distributed to be assayed. Assays were selected to constitute a broad survey of cellular properties and thereby maximize the chances of detection of gravitational effects. A total of 74 different outcome measurements were reported for statistical analysis. When the measurements were completed, the results were entered into the IBSE data base, at which time the data were matched with the original blood bag numbers to determine their status with respect to polymer/plasticizer type, orbit status (orbit or ground), and storage position within the experimental hardware. The data were studied by analysis of variance. Initially, type of bag and orbital status were main factors; later, more detailed analyses were made on specific issues such as position in the hardware and specific plastic. If the analysis of variance indicated statistical significance at the 5 percent level, the corresponding p-value was reported.
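The analysis-of-variance layout described can be sketched as follows, with simulated assay values standing in for the real measurements and polymer type and orbit status as the two main factors:

```python
# Sketch: two-factor ANOVA (polymer type x orbit status) on a simulated
# assay outcome; values are invented stand-ins for one of the 74 assays.
import numpy as np, pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(4)
df = pd.DataFrame({
    "polymer": np.repeat(["A", "B", "C"], 8),
    "orbit":   np.tile(["ground", "orbit"], 12),
})
df["assay"] = rng.normal(loc=100, scale=5, size=len(df))

model = ols("assay ~ C(polymer) * C(orbit)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # report p-values at the 5% level
```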
Pitoia, Fabián; Jerkovich, Fernando; Smulever, Anabella; Brenta, Gabriela; Bueno, Fernanda; Cross, Graciela
2017-07-01
To evaluate the influence of age at diagnosis on the frequency of structural incomplete response (SIR) according to the modified risk of recurrence (RR) staging system from the American Thyroid Association guidelines. We performed a retrospective analysis of 268 patients with differentiated thyroid cancer (DTC) followed up for at least 3 years after initial treatment (total thyroidectomy and remnant ablation). The median follow-up in the whole cohort was 74.3 months (range: 36.1-317.9) and the median age at diagnosis was 45.9 years (range: 18-87). The association between age at diagnosis and the initial and final response to treatment was assessed with analysis of variance (ANOVA). Patients were also divided into several groups considering age younger and older than 40, 50, and 60 years. Age at diagnosis was not associated with a statistically significant difference in either initial or final SIR to treatment (p = 0.14 and p = 0.58, respectively). Additionally, we did not find any statistically significant differences when the percentages of SIR considering the classification of RR were compared between different groups of patients by using several age cutoffs. When patients are correctly risk stratified, it seems that age at diagnosis is not involved in the frequency of having a SIR at the initial evaluation or at the final follow-up, so it should not be included as an additional variable to be considered in the RR classifications.
The validity of multiphase DNS initialized on the basis of single--point statistics
NASA Astrophysics Data System (ADS)
Subramaniam, Shankar
1999-11-01
A study of the point-process statistical representation of a spray reveals that single-point statistical information contained in the droplet distribution function (ddf) is related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. The results of this study have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the average number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also, the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets.
Fine-Tuning Dropout Prediction through Discriminant Analysis: The Ethnic Factor.
ERIC Educational Resources Information Center
Wilkinson, L. David; Frazer, Linda H.
In the 1988-89 school year, the Austin (Texas) Independent School District's Office of Research and Evaluation undertook a new dropout research project. Part of this initiative, termed Project GRAD, attempted to develop a statistical equation by which one could predict which students were likely to drop out. If reliable predictive information…
Mediation Analysis in a Latent Growth Curve Modeling Framework
ERIC Educational Resources Information Center
von Soest, Tilmann; Hagtvet, Knut A.
2011-01-01
This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…
Challenge in Enhancing the Teaching and Learning of Variable Measurements in Quantitative Research
ERIC Educational Resources Information Center
Kee, Chang Peng; Osman, Kamisah; Ahmad, Fauziah
2013-01-01
Statistical analysis is one component that cannot be avoided in a quantitative research. Initial observations noted that students in higher education institution faced difficulty analysing quantitative data which were attributed to the confusions of various variable measurements. This paper aims to compare the outcomes of two approaches applied in…
Debating Affirmative Action: Politics, Media, and Equal Opportunity in a "Postracial" America
ERIC Educational Resources Information Center
Paguyo, Christina H.; Moses, Michele S.
2011-01-01
This article explores how race-conscious education policy is interpreted in the political landscape of a "postracial" America. Based on a qualitative media analysis of the press coverage surrounding Amendment 46, an antiaffirmative action initiative, we examine language, statistics, and messages leveraged by advocates and critics of the…
Structure of Student Time Management Scale (STMS)
ERIC Educational Resources Information Center
Balamurugan, M.
2013-01-01
With the aim of constructing a Student Time Management Scale (STMS), the initial version was administered and data were collected from 523 eleventh-standard students (mean age = 15.64). The data obtained were subjected to reliability and factor analysis using PASW Statistics software, version 18. Of the 42 items, 14 were dropped, resulting in the…
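A typical item-analysis step in this kind of scale construction is internal-consistency screening. The sketch below computes Cronbach's alpha and alpha-if-item-deleted on simulated responses; the real STMS data and the exact criteria used for dropping items are not reproduced here.

```python
# Sketch: Cronbach's alpha and alpha-if-item-deleted on simulated data.
import numpy as np

def cronbach_alpha(items):                # rows = respondents, cols = items
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(5)
latent = rng.normal(size=(523, 1))        # shared time-management trait
responses = latent + rng.normal(scale=1.0, size=(523, 28))  # 28 trial items

base = cronbach_alpha(responses)
drop = [j for j in range(responses.shape[1])
        if cronbach_alpha(np.delete(responses, j, axis=1)) > base]
print("alpha:", round(base, 3), "drop candidates:", drop)
```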
Bahrin, E K; Ibrahim, M F; Abd Razak, M N; Abd-Aziz, S; Shah, U K Md; Alitheen, N; Salleh, M Md
2012-01-01
The response surface method was applied in this study to improve cellulase production from oil palm empty fruit bunch (OPEFB) by Botryosphaeria rhodina. An experimental design based on a two-level factorial was employed to screen the significant environmental factors for cellulase production. The locally isolated fungus Botryosphaeria rhodina was cultivated on OPEFB under solid-state fermentation (SSF). From the analysis of variance (ANOVA), the initial moisture content, amount of substrate, and initial pH of nutrient supplied in the SSF system significantly influenced cellulase production. Then the optimization of the variables was done using the response surface method according to central composite design (CCD). Botryosphaeria rhodina exhibited its best performance with a high predicted value of FPase enzyme production (17.95 U/g) when the initial moisture content was at 24.32%, initial pH of nutrient was 5.96, and 3.98 g of substrate was present. The statistical optimization from actual experiment resulted in a significant increment of FPase production from 3.26 to 17.91 U/g (5.49-fold). High cellulase production at low moisture content is a very rare condition for fungi cultured in solid-state fermentation.
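The response-surface step amounts to fitting a full quadratic model in the three significant factors and solving for its stationary point; here is a sketch with simulated design data, not the study's CCD runs:

```python
# Sketch: fit a full quadratic in (moisture, pH, substrate) and solve
# grad = 0 for the stationary point. Design points and responses are
# simulated around an assumed optimum, not the study's data.
import numpy as np

rng = np.random.default_rng(6)
X = rng.uniform([20, 4, 2], [30, 8, 6], size=(20, 3))  # moisture %, pH, substrate g
true_opt = np.array([24.3, 6.0, 4.0])
y = 18 - ((X - true_opt) ** 2 / [10, 1, 1]).sum(axis=1) + rng.normal(0, 0.2, 20)

def quad_terms(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)
b = beta[1:4]
B = np.array([[beta[7],     beta[4] / 2, beta[5] / 2],
              [beta[4] / 2, beta[8],     beta[6] / 2],
              [beta[5] / 2, beta[6] / 2, beta[9]]])
x_star = -0.5 * np.linalg.solve(B, b)    # stationary point of the surface
print("estimated optimum (moisture %, pH, substrate g):", np.round(x_star, 2))
```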
Mercer, Theresa G; Frostick, Lynne E; Walmsley, Anthony D
2011-10-15
This paper presents a statistical technique that can be applied to environmental chemistry data where missing values and limit-of-detection levels prevent the application of standard statistical methods. A working example is taken from an environmental leaching study that was set up to determine whether there were significant differences in levels of leached arsenic (As), chromium (Cr) and copper (Cu) between lysimeters containing preservative-treated wood waste and those containing untreated wood. Fourteen lysimeters were set up and left in natural conditions for 21 weeks. The resultant leachate was analysed by ICP-OES to determine the As, Cr and Cu concentrations. However, due to the variation inherent in each lysimeter combined with the limits of detection offered by ICP-OES, the collected quantitative data were somewhat incomplete. Initial data analysis was hampered by the number of 'missing values' in the data. To recover the dataset, the statistical tool of Statistical Multiple Imputation (SMI) was applied, and the data were re-analysed successfully. It was demonstrated that using SMI did not affect the variance in the data, but facilitated analysis of the complete dataset. Copyright © 2011 Elsevier B.V. All rights reserved.
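The recovery step can be sketched with scikit-learn's IterativeImputer standing in for the SMI routine the authors used; the concentrations are invented, and sample_posterior=True supplies the multiple-imputation behaviour:

```python
# Sketch: multiple imputation of missing/censored leachate values,
# pooling the point estimate across completed datasets. Data invented;
# IterativeImputer is a stand-in for the paper's SMI routine.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(7)
data = rng.lognormal(mean=0.0, sigma=0.5, size=(14, 3))  # As, Cr, Cu (mg/L)
mask = rng.random(data.shape) < 0.2
data[mask] = np.nan                                      # below LOD / missing

means = []
for seed in range(5):                                    # m = 5 imputations
    imp = IterativeImputer(sample_posterior=True, random_state=seed)
    means.append(imp.fit_transform(data).mean(axis=0))
# Rubin's rule for the point estimate (in practice, negative imputed
# concentrations would be truncated at zero).
print("pooled As/Cr/Cu means:", np.round(np.mean(means, axis=0), 3))
```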
Factors influencing initiation and duration of breast feeding in Ireland.
Leahy-Warren, Patricia; Mulcahy, Helen; Phelan, Agnes; Corcoran, Paul
2014-03-01
The aim of this research was to identify factors associated with mothers breast feeding and to identify, for those who breast fed, factors associated with breast feeding for as long as planned. Breast feeding rates in Ireland are amongst the lowest in Europe. Research evidence indicates that in order for mothers to be successful at breast feeding, a multiplicity of supports is necessary for both initiation and duration. The nature of these supports, in tandem with other influencing factors, requires analysis from an Irish perspective. This was a cross-sectional study involving public health nurses and mothers in Ireland; this paper presents the results of the mothers' evaluation. Mothers (n=1715) with children less than three years old were offered a choice of completing the self-report questionnaires online or by mail. Data were analysed and reported using descriptive and inferential statistics. Four in every five participants breast fed their infant, and two thirds of them breast fed as long as planned. The multivariate logistic regression analysis identified that third-level education, being a first-time mother or having previously breast fed, participating online, having more than two public health nurse visits, and having a positive infant feeding attitude were independently and statistically significantly associated with breast feeding. Among mothers who breast fed, being aged at least 35 years, participating online, having a positive infant feeding attitude and high breast feeding self-efficacy were independently and statistically significantly associated with breast feeding for as long as planned. The findings from this study reinforce the existence of health inequalities; there therefore needs to be a renewed commitment to reducing health inequalities in relation to breast feeding. This study has identified factors associated with initiation and duration of breast feeding that are potentially modifiable through public health interventions. Copyright © 2013 Elsevier Ltd. All rights reserved.
Zhu, Yuerong; Zhu, Yuelin; Xu, Wei
2008-01-01
Background Though microarray experiments are very popular in life science research, managing and analyzing microarray data are still challenging tasks for many biologists. Most microarray programs require users to have sophisticated knowledge of mathematics, statistics and computer skills. With accumulating microarray data deposited in public databases, easy-to-use programs to re-analyze previously published microarray data are in high demand. Results EzArray is a web-based Affymetrix expression array data management and analysis system for researchers who need to organize microarray data efficiently and get data analyzed instantly. EzArray organizes microarray data into projects that can be analyzed online with predefined or custom procedures. EzArray performs data preprocessing and detection of differentially expressed genes with statistical methods. All analysis procedures are optimized and highly automated so that even novice users with limited pre-knowledge of microarray data analysis can complete initial analysis quickly. Since all input files, analysis parameters, and executed scripts can be downloaded, EzArray provides maximum reproducibility for each analysis. In addition, EzArray integrates with Gene Expression Omnibus (GEO) and allows instantaneous re-analysis of published array data. Conclusion EzArray is a novel Affymetrix expression array data analysis and sharing system. EzArray provides easy-to-use tools for re-analyzing published microarray data and will help both novice and experienced users perform initial analysis of their microarray data from the location of data storage. We believe EzArray will be a useful system for facilities with microarray services and laboratories with multiple members involved in microarray data analysis. EzArray is freely available. PMID:18218103
Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S
2016-11-01
There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and a statistical standpoint, and we detail several problematic areas, both practical and statistical-theoretical, in the use of quantitative methods (including Rasch-consistent methods) with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
Jayawardene, Wasantha Parakrama; YoussefAgha, Ahmed Hassan
2014-01-01
This study aimed to identify the sequential patterns of drug use initiation, which included prescription drugs misuse (PDM), among 12th-grade students in Indiana. The study also tested the suitability of the data mining method Market Basket Analysis (MBA) to detect common drug use initiation sequences in large-scale surveys. Data from the 2007-2009 Annual Surveys of Alcohol, Tobacco, and Other Drug Use by Indiana Children and Adolescents were used for this study. A close-ended, self-administered questionnaire was used to ask adolescents about the use of 21 substance categories and the age of first use. "Support%" and "confidence%" statistics of Market Basket Analysis detected multiple and substitute addictions, respectively. The lifetime prevalence of using any addictive substance was 73.3%, and it has been decreasing over the past few years. Although the lifetime prevalence of PDM was 19.2%, it has been increasing. Males and whites were more likely to use drugs and engage in multiple addictions. Market Basket Analysis identified common drug use initiation sequences that involved 11 drugs. High levels of support existed for associations among alcohol, cigarettes, and marijuana, whereas associations that included prescription drugs had medium levels of support. Market Basket Analysis is useful for the detection of common substance use initiation sequences in large-scale surveys. Before initiation of prescription drugs, physicians should consider the adolescents' risk of addiction. Prevention programs should address multiple addictions, substitute addictions, common sequences in drug use initiation, sex and racial differences in PDM, and normative beliefs of parents and adolescents in relation to PDM.
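To make the two Market Basket Analysis statistics concrete, here is a minimal sketch of how support% and confidence% are computed, using hypothetical substance-use records rather than the Indiana survey data:

```python
# Each record: the set of substances an adolescent reported ever using
baskets = [
    {"alcohol", "cigarettes", "marijuana"},
    {"alcohol", "cigarettes"},
    {"alcohol", "marijuana", "rx_drugs"},
    {"alcohol"},
    {"cigarettes", "rx_drugs"},
]

def support(itemset):
    """support% = share of records containing every item in the set."""
    return sum(itemset <= b for b in baskets) / len(baskets)

def confidence(antecedent, consequent):
    """confidence% = support(antecedent and consequent) / support(antecedent)."""
    return support(antecedent | consequent) / support(antecedent)

print(support({"alcohol", "cigarettes"}))      # 0.4
print(confidence({"alcohol"}, {"marijuana"}))  # 0.5
```

In MBA terms, high-support itemsets point at multiple addictions, while high-confidence directional rules point at substitution patterns.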
NASA Technical Reports Server (NTRS)
Jackson, L. Neal; Crenshaw, John, Sr.; Davidson, William L.; Herbert, Frank J.; Bilodeau, James W.; Stoval, J. Michael; Sutton, Terry
1989-01-01
The optimum hardware miniaturization level with the lowest cost impact for space biology hardware was determined. Space biology hardware and its components, subassemblies, and assemblies that are the most likely candidates for miniaturization were defined, and the relative cost impacts of such miniaturization were analyzed. A mathematical/statistical analysis method capable of supporting the development of parametric cost-impact estimates for various levels of production design miniaturization is provided.
Britz, Juliane; Pitts, Michael A
2011-11-01
We used an intermittent stimulus presentation to investigate event-related potential (ERP) components associated with perceptual reversals during binocular rivalry. The combination of spatiotemporal ERP analysis with source imaging and statistical parametric mapping of the concomitant source differences yielded differences in three time windows: reversals showed increased activity in early visual (∼120 ms) and in inferior frontal and anterior temporal areas (∼400-600 ms) and decreased activity in the ventral stream (∼250-350 ms). The combination of source imaging and statistical parametric mapping suggests that these differences were due to differences in generator strength and not generator configuration, unlike the initiation of reversals in right inferior parietal areas. These results are discussed within the context of the extensive network of brain areas that has been implicated in the initiation, implementation, and appraisal of bistable perceptual reversals. Copyright © 2011 Society for Psychophysiological Research.
NASA Astrophysics Data System (ADS)
Siokis, Fotios M.
2014-02-01
We analyze the complexity of rare economic events in troubled European economies. The economic crisis that began at the end of 2009 forced a number of European economies to request financial assistance from international organizations. Employing the stock market index as a leading indicator of economic activity, we test whether the financial assistance programs altered the statistical properties of the index. The effects of major financial program agreements on the economies are best illustrated by comparing the multifractal spectra of the time series before and after the agreement. We reveal that the returns of the time series exhibit strong multifractal properties for all periods under investigation. In two of the three investigated economies, financial assistance along with governments' initiatives appears to have altered the statistical properties of the stock market indexes, increasing the width of the multifractal spectra and thus the complexity of the market.
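A minimal sketch of how such a spectrum-width comparison could be run, using multifractal detrended fluctuation analysis (MF-DFA) on synthetic heavy-tailed returns rather than the studied indexes; a full analysis would also scan reversed segments and convert h(q) into the singularity spectrum f(α):

```python
import numpy as np

def mfdfa_hurst(x, scales, qs, order=1):
    """Minimal MF-DFA: generalized Hurst exponents h(q) of series x."""
    profile = np.cumsum(x - x.mean())            # integrated profile
    hq = []
    for q in qs:
        logF = []
        for s in scales:
            segs = len(profile) // s
            F2 = np.empty(segs)
            for v in range(segs):
                seg = profile[v * s:(v + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, order), t)
                F2[v] = np.mean((seg - trend) ** 2)   # local detrended variance
            if q == 0:
                logF.append(0.5 * np.mean(np.log(F2)))
            else:
                logF.append(np.log(np.mean(F2 ** (q / 2)) ** (1 / q)))
        hq.append(np.polyfit(np.log(scales), logF, 1)[0])  # slope = h(q)
    return np.asarray(hq)

rng = np.random.default_rng(0)
returns = rng.standard_t(df=3, size=4096)        # heavy-tailed stand-in data
h = mfdfa_hurst(returns, scales=[16, 32, 64, 128, 256], qs=[-5, -3, -1, 1, 3, 5])
print("spectrum-width proxy:", h[0] - h[-1])     # larger = more multifractal
```

Running this on pre- and post-agreement windows and comparing the width proxies mirrors the comparison the paper describes.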
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, J.F.
Research in the biomedical sciences at PNL is described. Activities reported include: inhaled plutonium in dogs; national radiobiology archives; statistical analysis of data from animal studies; genotoxicity of inhaled energy effluents; molecular events during tumor initiation; biochemistry of free radical induced DNA damage; radon hazards in homes; mechanisms of radon injury; genetics of radon induced lung cancer; and in vivo/in vitro radon induced cellular damage.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... Schmidt, Office of Statistical Analysis, Occupational Safety and Health Administration, U.S. Department of... 1904. These data will allow OSHA to calculate occupational injury and illness rates and to focus its... and dates of birth. Although all submissions are listed in the http://www.regulations.gov index, some...
Changing Times: A Survey of Registered Nurses in 1998. IES Report 351.
ERIC Educational Resources Information Center
Smith, G.; Seccombe, I.
A national survey of registered nurses and analysis of official statistics provided an overview of the dimensions and dynamics of the labor market for nurses in the United Kingdom. Findings indicated the following: enrollment in preregistration nurse training courses decreased by 27 percent over the 1990s; initial entries to the UK Central Council…
Comparison of simulation modeling and satellite techniques for monitoring ecological processes
NASA Technical Reports Server (NTRS)
Box, Elgene O.
1988-01-01
In 1985 improvements were made in the world climatic data base for modeling and predictive mapping; in individual process models and the overall carbon-balance models; and in the interface software for mapping the simulation results. Statistical analysis of the data base was begun. In 1986 mapping was shifted to NASA-Goddard. The initial approach involving pattern comparisons was modified to a more statistical approach. A major accomplishment was the expansion and improvement of a global data base of measurements of biomass and primary production, to complement the simulation data. The main accomplishments during 1987 included: production of a master tape with all environmental and satellite data and model results for the 1600 sites; development of a complete mapping system used for the initial color maps comparing annual and monthly patterns of Normalized Difference Vegetation Index (NDVI), actual evapotranspiration, net primary productivity, gross primary productivity, and net ecosystem production; collection of more biosphere measurements for eventual improvement of the biological models; and development of some initial monthly models for primary productivity, based on satellite data.
von Arx, Thomas; Jensen, Simon Storgård; Hänni, Stefan
2007-02-01
This clinical study prospectively evaluated the influence of various predictors on healing outcome 1 year after periapical surgery. The study cohort included 194 teeth in an equal number of patients. Three teeth were lost to follow-up (1.5% drop-out rate). Clinical and radiographic measures were used to determine the healing outcome. For statistical analysis, results were dichotomized (healed versus nonhealed). The overall success rate was 83.8% (healed cases). The only individual predictors that proved significant for the outcome were pain at initial examination (p=0.030) and other clinical signs or symptoms at initial examination (p=0.042), meaning that such teeth had lower healing rates 1 year after periapical surgery than teeth without these signs or symptoms. Logistic regression revealed that pain at initial examination (odds ratio=2.59, confidence interval=1.2-5.6, p=0.04) was the only predictor reaching significance. Several predictors almost reached statistical significance: lesion size (p=0.06), retrofilling material (p=0.06), and postoperative healing course (p=0.06).
Evaluation of force degradation characteristics of orthodontic latex elastics in vitro and in vivo.
Wang, Tong; Zhou, Gang; Tan, Xianfeng; Dong, Yaojun
2007-07-01
To evaluate the characteristics of force degradation of latex elastics in clinical applications and in vitro studies. Samples of 3/16-inch latex elastics were investigated, and 12 students between the ages of 12 and 15 years were selected for intermaxillary and intramaxillary tractions. The elastics in the control groups were set in artificial saliva and in dry room conditions and were stretched 20 mm. Repeated-measures two-way analysis of variance and nonlinear regression analysis were used to test for statistical significance. Overall, there were statistically significant differences between the different methods and observation intervals. At 24- and 48-hour time intervals, the force decreased during in vivo testing and in artificial saliva (P < .001), whereas there were no significant differences in dry room conditions (P > .05). In intermaxillary traction the percentage of initial force remaining after 48 hours was 61%. In intramaxillary traction and in artificial saliva the percentage of initial force remaining was 71%, and in room conditions 86% of the initial force remained. Force degradation of latex elastics differed according to environmental conditions. There was significantly more force degradation in intermaxillary traction than in intramaxillary traction. The dry room condition caused the least force loss. There were some differences among groups starting elastic wear at different times in intermaxillary traction, but no significant differences in intramaxillary traction.
Applications of modern statistical methods to analysis of data in physical science
NASA Astrophysics Data System (ADS)
Wicker, James Eric
Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970s, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960s, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960s and 1970s respectively, formed the basis of multivariate cluster analysis methodology for many years. However, these methods have several shortcomings, including strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic-algorithm-based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We showcase how these algorithms can be used to process multivariate data from astronomical observations.
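The seed sensitivity of classical K-means that motivates the proposed genetic algorithms is easy to demonstrate. This sketch uses hypothetical blob data and plain restarted K-means (not the author's Genetic K-means) to show the spread of final objective values across random initializations:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Synthetic data with four well-separated clusters
X, _ = make_blobs(n_samples=500, centers=4, cluster_std=1.5, random_state=0)

# One random initialization per run (n_init=1) exposes seed dependence
inertias = [KMeans(n_clusters=4, init="random", n_init=1,
                   random_state=seed).fit(X).inertia_ for seed in range(10)]
print(min(inertias), max(inertias))  # a wide spread means unlucky seeds
                                     # converged to poor local optima
```

A genetic search over seed populations is one way to escape such local optima; the snippet above only exhibits the problem, not the remedy.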
Sun, Peter; Chang, Joanne; Zhang, Jie; Kahler, Kristijan H
2012-01-01
This study examines the evolutionary impact of valsartan initiation on medical costs. A retrospective time series study design was used with a large, US national commercial claims database for the period of 2004-2008. Hypertensive patients who initiated valsartan between the ages of 18 and 63, and had continuous enrollment for the 24-month pre-initiation and 24-month post-initiation periods, were selected. Patients' monthly medical costs were calculated from individual claims. A novel time series model was devised with monthly medical costs as its dependent variables, autoregressive integrated moving average (ARIMA) as its stochastic component, and four indicative variables as its decomposed interventional components. The number of post-initiation months before a cost-offset point was also assessed. Patients (n = 18,269) had a mean age of 53 at the initiation date, and 53% of them were female. The most common co-morbid conditions were dyslipidemia (52%), diabetes (24%), and hypertensive complications (17%). The time series model suggests that medical costs were increasing by approximately $10 per month (p < 0.01) before the initiation, and decreasing by approximately $6 per month (p < 0.01) after the initiation. After the 4th post-initiation month, medical costs for patients with the initiation were statistically significantly lower (p < 0.01) than the forecasted medical costs for the same patients without the initiation. The study has limitations in data representativeness, in its ability to capture unrecorded clinical conditions, treatments, and costs, and in its generalizability to patients with different characteristics. Commercially insured hypertensive patients experienced monthly medical cost increases before valsartan initiation. Based on our model, the evolutionary impact of the initiation on medical costs included a temporary cost surge, a gradual, consistent, and statistically significant cost decrease, and a cost-offset point around the 4th post-initiation month.
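A hedged sketch of the modeling idea, with illustrative numbers rather than the study's claims data: an ARIMA model whose exogenous regressors decompose the intervention into a level shift and a post-initiation slope change (two of the four indicative variables one might use), assuming statsmodels is available:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
t = np.arange(48)                        # 24 pre- and 24 post-initiation months
step = (t >= 24).astype(float)           # temporary level shift at initiation
ramp = np.maximum(t - 24, 0.0)           # gradual slope change after initiation
cost = 500 + 10 * t - 16 * ramp + 40 * step + rng.normal(0, 30, t.size)

# AR(1) noise plus constant-and-trend, with the intervention terms as exog
model = ARIMA(cost, exog=np.column_stack([step, ramp]),
              order=(1, 0, 0), trend="ct")
print(model.fit().params)  # trend coefficient ≈ +10/mo, ramp ≈ -16/mo
```

The cost-offset month then corresponds to the point where the fitted post-initiation trajectory first drops below the extrapolated pre-initiation trend.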
NASA Astrophysics Data System (ADS)
Alekseev, V. A.; Krylova, D. D.
1996-02-01
The analytical investigation of Bloch equations is used to describe the main features of the 1D velocity selective coherent population trapping cooling scheme. For the initial stage of cooling the fraction of cooled atoms is derived in the case of a Gaussian initial velocity distribution. At very long times of interaction the fraction of cooled atoms and the velocity distribution function are described by simple analytical formulae and do not depend on the initial distribution. These results are in good agreement with those of Bardou, Bouchaud, Emile, Aspect and Cohen-Tannoudji based on statistical analysis in terms of Levy flights and with Monte-Carlo simulations of the process.
Van Bockstaele, Femke; Janssens, Ann; Piette, Anne; Callewaert, Filip; Pede, Valerie; Offner, Fritz; Verhasselt, Bruno; Philippé, Jan
2006-07-15
ZAP-70 has been proposed as a surrogate marker for immunoglobulin heavy-chain variable region (IgV(H)) mutation status, which is known as a prognostic marker in B-cell chronic lymphocytic leukemia (CLL). The flow cytometric analysis of ZAP-70 suffers from difficulties in standardization and interpretation. We applied the Kolmogorov-Smirnov (KS) statistical test to make the analysis more straightforward. We examined ZAP-70 expression by flow cytometry in 53 patients with CLL. Analysis was performed as initially described by Crespo et al. (New England J Med 2003; 348:1764-1775) and, alternatively, by application of the KS statistical test comparing T cells with B cells. Receiver-operating-characteristic (ROC) curve analyses were performed to determine the optimal cut-off values for ZAP-70 measured by the two approaches. ZAP-70 protein expression was compared with ZAP-70 mRNA expression measured by quantitative PCR (qPCR) and with the IgV(H) mutation status. Both flow cytometric analyses correlated well with the molecular technique and proved to be of equal value in predicting the IgV(H) mutation status. Applying the KS test is reproducible, simple, and straightforward, and it overcomes a number of difficulties encountered with the Crespo method. The KS statistical test is an essential part of the software delivered with modern routine analytical flow cytometers and is well suited for analysis of ZAP-70 expression in CLL. (c) 2006 International Society for Analytical Cytology.
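A minimal sketch of the KS approach, comparing hypothetical T-cell and B-cell ZAP-70 intensity distributions with SciPy; the ROC-based cut-off calibration described above is omitted:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
# Hypothetical log-scale ZAP-70 intensities for the two gated populations
t_cells = rng.normal(2.0, 0.5, 2000)   # internal positive reference
b_cells = rng.normal(1.4, 0.5, 2000)   # CLL B cells under evaluation

stat, p = ks_2samp(t_cells, b_cells)   # D = max separation between the CDFs
print(f"KS D = {stat:.3f}, p = {p:.2e}")
```

A small D (B cells indistinguishable from T cells) would indicate ZAP-70 positivity of the B-cell population, with the decision threshold on D set by the ROC analysis.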
Statistical distribution of time to crack initiation and initial crack size using service data
NASA Technical Reports Server (NTRS)
Heller, R. A.; Yang, J. N.
1977-01-01
Crack growth inspection data gathered during the service life of the C-130 Hercules airplane were used in conjunction with a crack propagation rule to estimate the distribution of crack initiation times and of initial crack sizes. A Bayesian statistical approach was used to calculate the fraction of undetected initiation times as a function of the inspection time and the reliability of the inspection procedure used.
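As a toy illustration of the Bayesian bookkeeping involved (not the paper's actual model), suppose cracks initiate by inspection time $T$ with probability $F(T)$ and an inspection detects an existing crack with probability $p$ (its reliability). The fraction of airframes still carrying an undetected crack after a clean inspection is then

```latex
P(\text{crack present} \mid \text{nothing detected})
  = \frac{F(T)\,(1 - p)}{1 - F(T)\,p},
```

which grows with inspection time and shrinks with inspection reliability, the two dependencies the study quantifies.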
Kasapinova, K; Kamiloski, V
2016-06-01
Our purpose was to determine the correlation of initial radiographic parameters of distal radius fractures with injury of the triangular fibrocartilage complex. In a prospective study, 85 patients with surgically treated distal radius fractures were included. Wrist arthroscopy was used to identify and classify triangular fibrocartilage complex lesions. The initial radial length and angulation, dorsal angulation, ulnar variance, and distal radioulnar distance were measured. Wrist arthroscopy identified a triangular fibrocartilage complex lesion in 45 patients. Statistical analysis did not identify a correlation between any single radiographic parameter of the distal radius fracture and an associated triangular fibrocartilage complex injury. The initial radiograph of a distal radius fracture does not predict a triangular fibrocartilage complex injury. Level of evidence: III. © The Author(s) 2016.
Kim, Won Hwa; Singh, Vikas; Chung, Moo K.; Hinrichs, Chris; Pachauri, Deepti; Okonkwo, Ozioma C.; Johnson, Sterling C.
2014-01-01
Statistical analysis on arbitrary surface meshes such as the cortical surface is an important approach to understanding brain diseases such as Alzheimer's disease (AD). Surface analysis may be able to identify specific cortical patterns that relate to certain disease characteristics or exhibit differences between groups. Our goal in this paper is to make group analysis of signals on surfaces more sensitive. To do this, we derive multi-scale shape descriptors that characterize the signal around each mesh vertex, i.e., its local context, at varying levels of resolution. In order to define such a shape descriptor, we make use of recent results from harmonic analysis that extend traditional continuous wavelet theory from the Euclidean to a non-Euclidean setting (i.e., a graph, mesh or network). Using this descriptor, we conduct experiments on two different datasets, the Alzheimer's Disease NeuroImaging Initiative (ADNI) data and images acquired at the Wisconsin Alzheimer's Disease Research Center (W-ADRC), focusing on individuals labeled as having Alzheimer's disease (AD), mild cognitive impairment (MCI) and healthy controls. In particular, we contrast traditional univariate methods with our multi-resolution approach, which shows increased sensitivity and improved statistical power to detect group-level effects. We also provide an open source implementation. PMID:24614060
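A hedged sketch of one way such a multi-scale descriptor can be computed, via the eigendecomposition of the mesh graph Laplacian; the band-pass kernel g and the scales are placeholder choices, not the paper's wavelet construction:

```python
import numpy as np

def sgw_descriptor(L, f, scales):
    """Spectral graph wavelet coefficients of signal f, one per vertex per scale.

    L: (n, n) symmetric graph/mesh Laplacian; f: (n,) vertex signal.
    Returns an (n, len(scales)) multi-scale descriptor."""
    lam, U = np.linalg.eigh(L)                   # graph Fourier basis
    f_hat = U.T @ f                              # signal in the spectral domain
    g = lambda x: x * np.exp(-x)                 # placeholder band-pass kernel
    return np.column_stack([U @ (g(s * lam) * f_hat) for s in scales])

# Tiny example: a path graph on 5 vertices standing in for a mesh
A = np.diag(np.ones(4), 1); A = A + A.T
L = np.diag(A.sum(axis=1)) - A
print(sgw_descriptor(L, np.array([0., 1., 0., -1., 0.]), [0.5, 2.0, 8.0]))
```

Group tests are then run per vertex on the descriptor rows instead of the raw signal values, which is where the reported gain in statistical power comes from.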
Pitoia, Fabián; Jerkovich, Fernando; Smulever, Anabella; Brenta, Gabriela; Bueno, Fernanda; Cross, Graciela
2017-01-01
Objective: To evaluate the influence of age at diagnosis on the frequency of structural incomplete response (SIR) according to the modified risk of recurrence (RR) staging system from the American Thyroid Association guidelines. Patients and Methods: We performed a retrospective analysis of 268 patients with differentiated thyroid cancer (DTC) followed up for at least 3 years after initial treatment (total thyroidectomy and remnant ablation). The median follow-up in the whole cohort was 74.3 months (range: 36.1-317.9) and the median age at diagnosis was 45.9 years (range: 18-87). The association between age at diagnosis and the initial and final response to treatment was assessed with analysis of variance (ANOVA). Patients were also divided into several groups considering age younger and older than 40, 50, and 60 years. Results: Age at diagnosis was not associated with either an initial or final statistically significant different SIR to treatment (p = 0.14 and p = 0.58, respectively). Additionally, we did not find any statistically significant differences when the percentages of SIR considering the classification of RR were compared between different groups of patients by using several age cutoffs. Conclusions: When patients are correctly risk stratified, it seems that age at diagnosis is not involved in the frequency of having a SIR at the initial evaluation or at the final follow-up, so it should not be included as an additional variable to be considered in the RR classifications. PMID:28785543
Student perception of initial transition into a nursing program: A mixed methods research study.
McDonald, Meghan; Brown, Janine; Knihnitski, Crystal
2018-05-01
Transition into undergraduate education programs is stressful and impacts students' well-being and academic achievement. Previous research indicates nursing students experience stress, depression, anxiety, and poor lifestyle habits which interfere with learning. However, nursing students' experience of transition into nursing programs has not been well studied. Incongruence exists between this lack of research and the desire to foster student success. This study analyzed students' experiences of initial transition into a nursing program. An embedded mixed methods design was used at a single site: a direct-entry, four-year baccalaureate Canadian nursing program. Participants were all first-year nursing students enrolled in the fall term of 2016. The study combined the Student Adaptation to College Questionnaire (SACQ) with qualitative focus groups conducted with a subset of participants. Quantitative data were analyzed using descriptive statistics to identify statistically significant differences in full-scale and subscale scores. Qualitative data were analyzed utilizing thematic analysis. Significant differences were seen between those who moved to attend university and those who did not, with those who moved scoring lower on the Academic Adjustment subscale. Focus group thematic analysis highlighted how students experienced initial transition into a baccalaureate nursing program. Identified themes included reframing supports, splitting focus/finding focus, negotiating own expectations, negotiating others' expectations, and forming identity. These findings form the Undergraduate Nursing Initial Transition (UNIT) Framework. The significance of this research includes applications in faculty development and program supports to increase student success in the first year of nursing and to provide foundational success for ongoing nursing practice. Copyright © 2018 Elsevier Ltd. All rights reserved.
Relationship between gait initiation and disability in individuals affected by multiple sclerosis.
Galli, Manuela; Coghe, Giancarlo; Sanna, Paola; Cocco, Eleonora; Marrosu, Maria Giovanna; Pau, Massimiliano
2015-11-01
This study analyzes how multiple sclerosis (MS) affects one of the most common voluntary activities in life: gait initiation (GI). The main aim of the work is to characterize the execution of this task by measuring and comparing relevant parameters based on center of pressure (COP) patterns, and to study the relationship between these and the level of the expanded disability status scale (EDSS). To this aim, 95 MS subjects with an average EDSS score of 2.4 and 35 healthy subjects were tested using a force platform during the transition from standing posture to gait. COP time series were acquired and processed to extract a number of parameters related to the trajectory followed by the COP. The statistical analysis revealed that only a few measurements were statistically different between the two groups, and only these were subsequently correlated with the EDSS score. The correlation analysis showed that a progressive alteration of task execution is directly related to the increase of the EDSS score. These findings suggest that most of the impairment found in people with MS comes from the first part of the COP pattern, the anticipatory postural adjustments (APAs). The central nervous system performs APAs before every voluntary movement to minimize the balance perturbation due to the movement itself. GI APAs consist of ankle muscle contractions that induce a backward COP shift toward the swing limb. The analysis performed here highlighted that MS-affected patients have a reduced posterior COP shift, revealing that the anticipatory mechanism is impaired. Copyright © 2015 Elsevier B.V. All rights reserved.
A study of two statistical methods as applied to shuttle solid rocket booster expenditures
NASA Technical Reports Server (NTRS)
Perlmutter, M.; Huang, Y.; Graves, M.
1974-01-01
The state probability technique and the Monte Carlo technique are applied to finding shuttle solid rocket booster expenditure statistics. For a given attrition rate per launch, the probable number of boosters needed for a given mission of 440 launches is calculated. Several cases are considered, including the elimination of the booster after a maximum of 20 consecutive launches. Also considered is the case where the booster is composed of replaceable components with independent attrition rates. A simple cost analysis is carried out to indicate the number of boosters to build initially, depending on booster costs. Two statistical methods were applied in the analysis: (1) the state probability method, which consists of defining an appropriate state space for the outcome of the random trials, and (2) the model simulation method, or Monte Carlo technique. It was found that the model simulation method was easier to formulate, while the state probability method required less computing time and was more accurate.
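A hedged sketch of the second (Monte Carlo) method; the 440 launches and the 20-launch retirement rule come from the abstract, while the 2% attrition rate and trial count are placeholders:

```python
import numpy as np

def boosters_needed(attrition=0.02, launches=440, max_uses=20, trials=10_000):
    """Monte Carlo estimate of boosters expended over a launch campaign."""
    rng = np.random.default_rng(0)
    counts = np.empty(trials, dtype=int)
    for k in range(trials):
        built, uses = 1, 0
        for _ in range(launches):
            uses += 1
            lost = rng.random() < attrition      # attrition on this launch?
            if lost or uses == max_uses:         # replace after loss or 20 uses
                built += 1
                uses = 0
        counts[k] = built
    return counts.mean(), np.percentile(counts, 95)

print(boosters_needed())  # (mean fleet size, 95th-percentile fleet size)
```

The 95th percentile is the kind of figure the cost analysis would use to decide how many boosters to build initially.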
An Adaptive Buddy Check for Observational Quality Control
NASA Technical Reports Server (NTRS)
Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)
2000-01-01
An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check, for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations.
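A simplified sketch of the core idea, with the rejection tolerance adapted to the spread of the surrounding data; the actual algorithm couples this with the background check and maximum-likelihood covariance estimation described above:

```python
import numpy as np

def adaptive_buddy_check(obs, neighbors, base_tol=3.0):
    """Flag observations that disagree with their 'buddies', with the
    tolerance scaled by the local variability of surrounding data."""
    flags = np.zeros(len(obs), dtype=bool)
    for i, idx in enumerate(neighbors):
        mu = obs[idx].mean()
        sigma = max(obs[idx].std(ddof=1), 1e-6)   # local spread of the buddies
        flags[i] = abs(obs[i] - mu) > base_tol * sigma
    return flags

obs = np.array([1.0, 1.2, 0.9, 5.0, 1.1])
neighbors = [[1, 2, 4], [0, 2, 4], [0, 1, 4], [0, 1, 2, 4], [0, 1, 2]]
print(adaptive_buddy_check(obs, neighbors))   # only the 5.0 is flagged
```

Because sigma is estimated from the data rather than prescribed, a genuinely extreme but locally consistent signal (such as the 1999 storm observations) survives the check.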
A statistical study of decaying kink oscillations detected using SDO/AIA
NASA Astrophysics Data System (ADS)
Goddard, C. R.; Nisticò, G.; Nakariakov, V. M.; Zimovets, I. V.
2016-01-01
Context: Despite intensive studies of kink oscillations of coronal loops in the last decade, a large-scale statistically significant investigation of the oscillation parameters has not been made using data from the Solar Dynamics Observatory (SDO). Aims: We carry out a statistical study of kink oscillations using extreme ultraviolet imaging data from a previously compiled catalogue. Methods: We analysed 58 kink oscillation events observed by the Atmospheric Imaging Assembly (AIA) on board SDO during its first four years of operation (2010-2014). Parameters of the oscillations, including the initial apparent amplitude, period, length of the oscillating loop, and damping are studied for 120 individual loop oscillations. Results: Analysis of the initial loop displacement and oscillation amplitude leads to the conclusion that the initial loop displacement prescribes the initial amplitude of oscillation in general. The period is found to scale with the loop length, and a linear fit of the data cloud gives a kink speed of Ck = (1330 ± 50) km s⁻¹. The main body of the data corresponds to kink speeds in the range Ck = (800-3300) km s⁻¹. Measurements of 52 exponential damping times were made, and it was noted that at least 21 of the damping profiles may be better approximated by a combination of non-exponential and exponential profiles rather than a purely exponential damping envelope. There are nine additional cases where the profile appears to be purely non-exponential and no damping time was measured. A scaling of the exponential damping time with the period is found, following the previously established linear scaling between these two parameters.
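Since a fundamental kink mode satisfies P = 2L/Ck, the quoted kink speed follows from the slope of the period-length scaling; a sketch with hypothetical loop measurements rather than the catalogue data:

```python
import numpy as np

# Hypothetical loop lengths (Mm) and oscillation periods (s)
L = np.array([150., 220., 300., 380., 450.])
P = np.array([230., 330., 450., 570., 680.])

slope = np.polyfit(L, P, 1)[0]   # s per Mm; for kink modes P ≈ (2/Ck) L
Ck = 2.0 * 1e3 / slope           # km/s (1 Mm = 1e3 km)
print(f"kink speed ≈ {Ck:.0f} km/s")
```

With these illustrative numbers the fit lands near the paper's Ck ≈ 1330 km s⁻¹; the real data cloud of 120 oscillations scatters around such a line.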
Moffett, Bryan K; Panchabhai, Tanmay S; Nakamatsu, Raul; Arnold, Forest W; Peyrani, Paula; Wiemken, Timothy; Guardiola, Juan; Ramirez, Julio A
2016-12-01
It is unclear whether anteroposterior (AP) or posteroanterior with lateral (PA/Lat) chest radiographs are superior in the early detection of clinically relevant parapneumonic effusions (CR-PPEs). The objective of this study was to identify which technique is preferred for detection of PPEs using chest computed tomography (CCT) as a reference standard. A secondary analysis of a pneumonia database was conducted to identify patients who received a CCT within 24 hours of presentation and also received AP or PA/Lat chest radiographs within 24 hours of CCT. Sensitivity and specificity were then calculated by comparing the radiographic diagnosis of PPEs of both types of radiographs compared with CCT by using the existing attending radiologist interpretation. Clinical relevance of effusions was determined by CCT effusion measurement of >2.5 cm or presence of loculation. There was a statistically significant difference between the sensitivity of AP (67.3%) and PA/Lat (83.9%) chest radiography for the initial detection of CR-PPE. Of 16 CR-PPEs initially missed by AP radiography, 7 either required drainage initially or developed empyema within 30 days, whereas no complicated PPE or empyema was found in those missed by PA/Lat radiography. PA/Lat chest radiography should be the initial imaging of choice in pneumonia patients for detection of PPEs because it appears to be statistically superior to AP chest radiography. Published by Elsevier Inc.
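For reference, the two reported metrics reduce to simple ratios over the 2x2 table against the CCT standard; the counts below are hypothetical, not the study's:

```python
# Hypothetical counts against the CCT reference standard
tp, fn = 47, 9     # CR-PPEs detected / missed on the radiograph
tn, fp = 120, 14   # effusions correctly ruled out / falsely called

sensitivity = tp / (tp + fn)   # share of true CR-PPEs the film catches
specificity = tn / (tn + fp)   # share of effusion-free CCTs the film clears
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```

The study's comparison amounts to computing the first ratio separately for AP and PA/Lat films read against the same CCT reference.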
WASP (Write a Scientific Paper) using Excel - 4: Histograms.
Grech, Victor
2018-02-01
Plotting data into graphs is a crucial step in data analysis as part of an initial descriptive statistics exercise since it gives the researcher an overview of the shape and nature of the data. Outlier values may also be identified, and these may be incorrect data, or true and important outliers. This paper explains how to access Microsoft Excel's Analysis Toolpak and provides some pointers for the utilisation of the histogram tool within the Toolpak. Copyright © 2018. Published by Elsevier B.V.
General aviation air traffic pattern safety analysis
NASA Technical Reports Server (NTRS)
Parker, L. C.
1973-01-01
A concept is described for evaluating the general aviation mid-air collision hazard in uncontrolled terminal airspace. Three-dimensional traffic pattern measurements were conducted at uncontrolled and controlled airports. Computer programs for data reduction, storage retrieval and statistical analysis have been developed. Initial general aviation air traffic pattern characteristics are presented. These preliminary results indicate that patterns are highly divergent from the expected standard pattern, and that pattern procedures observed can affect the ability of pilots to see and avoid each other.
Automatic initialization and quality control of large-scale cardiac MRI segmentations.
Albà, Xènia; Lekadir, Karim; Pereañez, Marco; Medrano-Gracia, Pau; Young, Alistair A; Frangi, Alejandro F
2018-01-01
Continuous advances in imaging technologies enable ever more comprehensive phenotyping of human anatomy and physiology. Concomitant reduction of imaging costs has resulted in widespread use of imaging in large clinical trials and population imaging studies. Magnetic Resonance Imaging (MRI), in particular, offers one-stop-shop multidimensional biomarkers of cardiovascular physiology and pathology. A wide range of analysis methods offer sophisticated cardiac image assessment and quantification for clinical and research studies. However, most methods have only been evaluated on relatively small databases often not accessible for open and fair benchmarking. Consequently, published performance indices are not directly comparable across studies and their translation and scalability to large clinical trials or population imaging cohorts is uncertain. Most existing techniques still rely on considerable manual intervention for the initialization and quality control of the segmentation process, becoming prohibitive when dealing with thousands of images. The contributions of this paper are three-fold. First, we propose a fully automatic method for initializing cardiac MRI segmentation, by using image features and random forests regression to predict an initial position of the heart and key anatomical landmarks in an MRI volume. In processing a full imaging database, the technique predicts the optimal corrective displacements and positions in relation to the initial rough intersections of the long and short axis images. Second, we introduce for the first time a quality control measure capable of identifying incorrect cardiac segmentations with no visual assessment. The method uses statistical, pattern and fractal descriptors in a random forest classifier to detect failures to be corrected or removed from subsequent statistical analysis. Finally, we validate these new techniques within a full pipeline for cardiac segmentation applicable to large-scale cardiac MRI databases. The results obtained based on over 1200 cases from the Cardiac Atlas Project show the promise of fully automatic initialization and quality control for population studies. Copyright © 2017 Elsevier B.V. All rights reserved.
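A hedged sketch of the quality-control step: a random forest classifier over per-segmentation descriptors flags likely failures. The features and labels here are synthetic stand-ins, not the paper's statistical, pattern, and fractal descriptors:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical per-segmentation descriptors, e.g. [intensity statistic,
# texture energy, fractal dimension, boundary smoothness]
X = rng.normal(size=(1200, 4))
# Synthetic ground truth: failures correlate with the third descriptor
y = (X[:, 2] + 0.5 * rng.normal(size=1200) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

Cases the classifier flags are routed to correction or excluded from the downstream statistical analysis, which is what makes the pipeline viable at the scale of thousands of images.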
Kulla, M; Josse, F; Stierholz, M; Hossfeld, B; Lampl, L; Helm, M
2016-05-20
As part of the European Union Naval Force - Mediterranean Operation Sophia (EUNAVFOR Med), the Federal Republic of Germany is contributing to avoiding further loss of lives at sea by supplying two naval vessels. In the study presented here we analyse the medical requirements of such rescue missions, as well as the potential benefits of various additional monitoring devices in identifying sick/injured refugees within the primary onboard medical assessment process. Retrospective analysis of the data collected between May and September 2015 from a German Naval Force frigate. Initial data collection focused on the primary medical assessment and treatment process of refugees rescued from distress at sea. Descriptive statistics and uni- and multivariate analyses were performed. The study has received a positive vote from the Ethics Commission of the University of Ulm, Germany (request no. 284/15) and has been registered in the German Register of Clinical Studies (no. DRKS00009535). A total of 2656 refugees had been rescued. 16.9 % of them were classified as "medical treatment required" within the initial onboard medical assessment process. In addition to the clinical assessment by an emergency physician, pulse rate (PR), core body temperature (CBT) and oxygen saturation (SpO2) were evaluated. Sick/injured refugees displayed a statistically significantly higher PR (114/min vs. 107/min; p < .001) and CBT (37.1 °C vs. 36.7 °C; p < .001). There was no statistically significant difference in SpO2 values. The same results were found for the subgroup of patients classified as "treatment at emergency hospital required". However, a much larger difference in mean PR and CBT (35/min resp. 1.8 °C) was found when examining the subgroups of the corresponding refugee boats. A cut-off value of clinical importance could not be found. Predominant diagnoses were dermatological diseases (55.4 %), followed by internal diseases (27.7 %) and trauma (12.1 %). None of the refugees classified as "healthy" within the primary medical assessment process changed to "medical treatment required" during further observation. The initial medical assessment by an emergency physician has proved successful. PR, CBT and SpO2 did not add clinically useful information for identifying sick/injured refugees within the primary onboard assessment process.
Code of Federal Regulations, 2011 CFR
2011-10-01
... by ACF statistical staff from the Adoption and Foster Care Analysis and Reporting System (AFCARS) data which are transmitted by the State agency to ACF. The sampling frame will consist of cases of... State's most recent AFCARS data submission. For the initial primary review, if these data are not...
Monroe, Scott; Cai, Li
2015-01-01
This research is concerned with two topics in assessing model fit for categorical data analysis. The first topic involves the application of a limited-information overall test, introduced in the item response theory literature, to structural equation modeling (SEM) of categorical outcome variables. Most popular SEM test statistics assess how well the model reproduces estimated polychoric correlations. In contrast, limited-information test statistics assess how well the underlying categorical data are reproduced. Here, the recently introduced C2 statistic of Cai and Monroe (2014) is applied. The second topic concerns how the root mean square error of approximation (RMSEA) fit index can be affected by the number of categories in the outcome variable. This relationship creates challenges for interpreting RMSEA. While the two topics initially appear unrelated, they may conveniently be studied in tandem since RMSEA is based on an overall test statistic, such as C2. The results are illustrated with an empirical application to data from a large-scale educational survey.
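For reference, the sample RMSEA is conventionally computed from an overall test statistic X² with df degrees of freedom and sample size N as

```latex
\widehat{\mathrm{RMSEA}}
  = \sqrt{\max\!\left(\frac{X^{2} - df}{df\,(N-1)},\; 0\right)},
```

so substituting a limited-information statistic such as C2 changes both the numerator and the reference degrees of freedom, which is why the behavior of the underlying statistic carries over to the index.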
MEASURE: An integrated data-analysis and model identification facility
NASA Technical Reports Server (NTRS)
Singh, Jaidip; Iyer, Ravi K.
1990-01-01
The first phase of the development of MEASURE, an integrated data analysis and model identification facility, is described. The facility takes system activity data as input and produces as output representative behavioral models of the system in near real time. In addition, a wide range of statistical characteristics of the measured system are also available. The usage of the system is illustrated on data collected via software instrumentation of a network of SUN workstations at the University of Illinois. Initially, statistical clustering is used to identify high-density regions of resource usage in a given environment. The identified regions form the states for building a state-transition model to evaluate system and program performance in real time. The model is then solved to obtain useful parameters such as the response-time distribution and the mean waiting time in each state. A graphical interface which displays the identified models and their characteristics (with real-time updates) was also developed. The results provide an understanding of the resource usage in the system under various workload conditions. This work is targeted for a testbed of UNIX workstations, with the initial phase ported to SUN workstations on the NASA Ames Research Center Advanced Automation Testbed.
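A minimal sketch of the pipeline described, using hypothetical resource-usage samples: cluster usage into states, estimate the state-transition matrix from consecutive samples, and read off mean waiting times from its diagonal:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
usage = rng.random((5000, 3))            # hypothetical CPU/memory/IO samples

k = 4                                    # number of resource-usage states
states = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(usage)

# Empirical state-transition matrix from consecutive observations
T = np.zeros((k, k))
for a, b in zip(states[:-1], states[1:]):
    T[a, b] += 1
T /= T.sum(axis=1, keepdims=True)

# Mean sojourn (waiting) time per state, in sampling intervals:
# geometric holding time 1 / (1 - p_ii)
print(1.0 / (1.0 - np.diag(T)))
```

Solving the resulting Markov model for quantities like the response-time distribution is then standard chain analysis, done here on states identified from data rather than defined a priori.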
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-01-05
SandiaMCR was developed to identify pure components and their concentrations from spectral data. This software efficiently implements multivariate curve resolution alternating least squares (MCR-ALS), principal component analysis (PCA), and singular value decomposition (SVD). Version 3.37 also includes the PARAFAC-ALS and Tucker-1 (for trilinear analysis) algorithms. The alternating least squares methods can be used to determine the composition without, or with incomplete, prior information on the constituents and their concentrations. The software allows the specification of numerous preprocessing, initialization, data selection, and compression options for the efficient processing of large data sets. These include the definition of equality and non-negativity constraints to realistically restrict the solution set, various normalization or weighting options based on the statistics of the data, several initialization choices, and data compression. The software has been designed to provide a practicing spectroscopist the tools required to routinely analyze data in a reasonable time and without requiring expert intervention.
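A minimal sketch illustrating the alternating-least-squares core with only the non-negativity constraint, on synthetic spectra; SandiaMCR itself supports many more constraints, weighting, and compression options:

```python
import numpy as np

def mcr_als(D, k, n_iter=200, seed=0):
    """Minimal MCR-ALS: factor D (samples x channels) ~ C @ S.T into
    non-negative concentrations C (samples x k) and spectra S (channels x k)."""
    rng = np.random.default_rng(seed)
    S = rng.random((D.shape[1], k))                  # random initial spectra
    for _ in range(n_iter):
        # Solve for C given S, then for S given C, clipping to enforce C, S >= 0
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    return C, S

# Two overlapping Gaussian 'spectra' mixed at random concentrations
x = np.linspace(0, 1, 60)
S_true = np.stack([np.exp(-(x - 0.3)**2 / 0.01),
                   np.exp(-(x - 0.6)**2 / 0.02)], axis=1)
C_true = np.random.default_rng(1).random((40, 2))
C_est, S_est = mcr_als(C_true @ S_true.T, k=2)
```

With informative initial spectra in place of the random start, convergence is faster and less ambiguous, which is why initialization options matter in such software.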
Rigorous Statistical Bounds in Uncertainty Quantification for One-Layer Turbulent Geophysical Flows
NASA Astrophysics Data System (ADS)
Qi, Di; Majda, Andrew J.
2018-04-01
Statistical bounds controlling the total fluctuations in mean and variance about a basic steady-state solution are developed for the truncated barotropic flow over topography. Statistical ensemble prediction is an important topic in weather and climate research. Here, the evolution of an ensemble of trajectories is considered using statistical instability analysis and is compared and contrasted with the classical deterministic instability for the growth of perturbations in one pointwise trajectory. The maximum growth of the total statistics in fluctuations is derived relying on the statistical conservation principle of the pseudo-energy. The saturation bound of the statistical mean fluctuation and variance in the unstable regimes with non-positive-definite pseudo-energy is achieved by linking with a class of stable reference states and minimizing the stable statistical energy. Two cases with dependence on initial statistical uncertainty and on external forcing and dissipation are compared and unified under a consistent statistical stability framework. The flow structures and statistical stability bounds are illustrated and verified by numerical simulations among a wide range of dynamical regimes, where subtle transient statistical instability exists in general with positive short-time exponential growth in the covariance even when the pseudo-energy is positive-definite. Among the various scenarios in this paper, there exist strong forward and backward energy exchanges between different scales which are estimated by the rigorous statistical bounds.
NASA Technical Reports Server (NTRS)
Hirsch, Annette L.; Kala, Jatin; Pitman, Andy J.; Carouge, Claire; Evans, Jason P.; Haverd, Vanessa; Mocko, David
2014-01-01
The authors use a sophisticated coupled land-atmosphere modeling system for a Southern Hemisphere subdomain centered over southeastern Australia to evaluate differences in simulation skill from two different land surface initialization approaches. The first approach uses equilibrated land surface states obtained from offline simulations of the land surface model, and the second uses land surface states obtained from reanalyses. The authors find that land surface initialization using prior offline simulations contribute to relative gains in subseasonal forecast skill. In particular, relative gains in forecast skill for temperature of 10%-20% within the first 30 days of the forecast can be attributed to the land surface initialization method using offline states. For precipitation there is no distinct preference for the land surface initialization method, with limited gains in forecast skill irrespective of the lead time. The authors evaluated the asymmetry between maximum and minimum temperatures and found that maximum temperatures had the largest gains in relative forecast skill, exceeding 20% in some regions. These results were statistically significant at the 98% confidence level at up to 60 days into the forecast period. For minimum temperature, using reanalyses to initialize the land surface contributed to relative gains in forecast skill, reaching 40% in parts of the domain that were statistically significant at the 98% confidence level. The contrasting impact of the land surface initialization method between maximum and minimum temperature was associated with different soil moisture coupling mechanisms. Therefore, land surface initialization from prior offline simulations does improve predictability for temperature, particularly maximum temperature, but with less obvious improvements for precipitation and minimum temperature over southeastern Australia.
Statistical characterization of planar two-dimensional Rayleigh-Taylor mixing layers
NASA Astrophysics Data System (ADS)
Sendersky, Dmitry
2000-10-01
The statistical evolution of a planar, randomly perturbed fluid interface subject to Rayleigh-Taylor instability is explored through numerical simulation in two space dimensions. The data set, generated by the front-tracking code FronTier, is highly resolved and covers a large ensemble of initial perturbations, allowing a more refined analysis of closure issues pertinent to the stochastic modeling of chaotic fluid mixing. We closely approach a two-fold convergence of the mean two-phase flow: convergence of the numerical solution under computational mesh refinement, and statistical convergence under increasing ensemble size. Quantities that appear in the two-phase averaged Euler equations are computed directly and analyzed for numerical and statistical convergence. Bulk averages show a high degree of convergence, while interfacial averages are convergent only in the outer portions of the mixing zone, where there is a coherent array of bubble and spike tips. Comparison with the familiar bubble/spike penetration law h = αAgt² is complicated by the lack of scale invariance, inability to carry the simulations to late time, the increasing Mach numbers of the bubble/spike tips, and sensitivity to the method of data analysis. Finally, we use the simulation data to analyze some constitutive properties of the mixing process.
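For reference, fitting the growth coefficient α from the quoted penetration law is a one-parameter least-squares problem; the heights below are synthetic, not the FronTier data:

```python
import numpy as np

rng = np.random.default_rng(0)
A, g = 0.5, 980.0                 # Atwood number, gravity (cm/s^2), illustrative
t = np.linspace(0.5, 3.0, 20)     # hypothetical late-time samples
h = 0.06 * A * g * t**2 * (1 + 0.05 * rng.normal(size=t.size))

X = (A * g * t**2)[:, None]       # regressor for h = alpha * A * g * t^2
alpha = np.linalg.lstsq(X, h, rcond=None)[0][0]
print(f"alpha ≈ {alpha:.3f}")
```

The complications the abstract lists (finite simulation time, rising Mach number) show up in practice as a fitted α that drifts with the time window chosen for the regression.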
An Asynchronous Many-Task Implementation of In-Situ Statistical Analysis using Legion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille
2015-11-01
In this report, we propose a framework for the design and implementation of in-situ analyses using an asynchronous many-task (AMT) model, using the Legion programming model together with the MiniAero mini-application as a surrogate for full-scale parallel scientific computing applications. The bulk of this work consists of converting the Learn/Derive/Assess model, which we had initially developed for parallel statistical analysis using MPI [PTBM11], from an SPMD to an AMT model. To this end, we propose an original use of the concept of Legion logical regions as a replacement for the parallel communication schemes used for the only operation of the statistics engines that requires explicit communication. We then evaluate this proposed scheme in a shared memory environment, using the Legion port of MiniAero as a proxy for a full-scale scientific application, as a means to provide input data sets of variable size for the in-situ statistical analyses in an AMT context. We demonstrate in particular that the approach has merit and warrants further investigation, in collaboration with ongoing efforts to improve the overall parallel performance of the Legion system.
NASA Astrophysics Data System (ADS)
Gupta, Ankur; Balomajumder, Chandrajit
2017-12-01
In this study, simultaneous removal of Cr(VI) and phenol from a binary solution was carried out using Fe-treated tea waste biomass. Process parameters such as adsorbent dose, pH, initial concentration of Cr(VI) (mg/L), and initial concentration of phenol (mg/L) were optimized. The analysis of variance of the quadratic model demonstrates that the experimental results are in good agreement with the predicted values. Under the experimental design conditions of 55 mg/L initial Cr(VI), 27.50 mg/L phenol, pH 2.0, and a 15 g/L adsorbent dose, 99.99% removal of Cr(VI) and phenol was achieved.
Chua, Felicia H Z; Thien, Ady; Ng, Lee Ping; Seow, Wan Tew; Low, David C Y; Chang, Kenneth T E; Lian, Derrick W Q; Loh, Eva; Low, Sharon Y Y
2017-03-01
Posterior fossa syndrome (PFS) is a serious complication faced by neurosurgeons and their patients, especially in paediatric medulloblastoma patients. The uncertain aetiology of PFS, the myriad of cited risk factors, and the therapeutic challenges make this phenomenon an elusive entity. The primary objective of this study was to identify factors associated with the development of PFS in medulloblastoma patients post-tumour resection. This is a retrospective study based at a single institution. Patient data and all related information were collected from the hospital records, in accordance with a list of possible risk factors associated with PFS. These included pre-operative tumour volume, hydrocephalus, age, gender, extent of resection, metastasis, ventriculoperitoneal shunt insertion, post-operative meningitis, and radiological changes on MRI. Additional variables included the molecular and histological subtypes of each patient's medulloblastoma. Statistical analysis was employed to determine each variable's significance for PFS permanence. A total of 19 patients with appropriately complete data were identified. Initial univariate analysis did not show any statistical significance. However, multivariate analysis of MRI-specific changes showed that bilateral DWI restricted-diffusion changes, involving both the right and left sides of the surgical cavity, were statistically significant for PFS permanence. The authors performed a clinical study that evaluated possible risk factors for permanent PFS in paediatric medulloblastoma patients. Analysis of the collated results found that post-operative DWI restriction in bilateral regions within the surgical cavity was a statistically significant predictor of PFS permanence, a novel finding in the current literature.
Shankar, Ganesh M; Clarke, Michelle J; Ailon, Tamir; Rhines, Laurence D; Patel, Shreyaskumar R; Sahgal, Arjun; Laufer, Ilya; Chou, Dean; Bilsky, Mark H; Sciubba, Daniel M; Fehlings, Michael G; Fisher, Charles G; Gokaslan, Ziya L; Shin, John H
2017-07-01
OBJECTIVE: Primary osteosarcoma of the spine is a rare osseous neoplasm. While previously reported retrospective studies have demonstrated that overall patient survival is impacted mostly by en bloc resection and chemotherapy, the continued management of residual disease remains to be elucidated. This systematic review was designed to address the role of revision surgery and multimodal adjuvant therapy in cases in which en bloc excision is not initially achieved. METHODS: A systematic literature search spanning the years 1966 to 2015 was performed on PubMed, Medline, EMBASE, and Web of Science to identify reports describing outcomes of patients who underwent biopsy alone, neurological decompression, or intralesional resection for osteosarcoma of the spine. Studies were reviewed qualitatively, and the clinical course of individual patients was aggregated for quantitative meta-analysis. RESULTS: A total of 16 studies were identified for inclusion in the systematic review, of which 8 case reports were summarized qualitatively. These studies strongly support the role of chemotherapy for overall survival and moderately support adjuvant radiation therapy for local control. The meta-analysis revealed a statistically significant benefit in overall survival for performing revision tumor debulking (p = 0.01) and also for chemotherapy at relapse (p < 0.01). Adjuvant radiation therapy was associated with longer survival, although this did not reach statistical significance (p = 0.06). CONCLUSIONS: While the initial therapeutic goal in the management of osteosarcoma of the spine is neoadjuvant chemotherapy followed by en bloc marginal resection, this objective is not always achievable given anatomical constraints and other limitations at the time of initial clinical presentation. This systematic review supports the continued aggressive use of revision surgery and multimodal adjuvant therapy when possible to improve outcomes in patients who initially undergo subtotal debulking of osteosarcoma. A limitation of this systematic review is that lesions amenable to subsequent resection or tumors inherently more sensitive to adjuvants would exaggerate a therapeutic effect of these interventions when studied in a retrospective fashion.
Phonetic diversity, statistical learning, and acquisition of phonology.
Pierrehumbert, Janet B
2003-01-01
In learning to perceive and produce speech, children master complex language-specific patterns. Daunting language-specific variation is found both in the segmental domain and in the domain of prosody and intonation. This article reviews the challenges posed by results in phonetic typology and sociolinguistics for the theory of language acquisition. It argues that categories are initiated bottom-up from statistical modes in use of the phonetic space, and sketches how exemplar theory can be used to model the updating of categories once they are initiated. It also argues that bottom-up initiation of categories is successful thanks to the perception-production loop operating in the speech community. The behavior of this loop means that the superficial statistical properties of speech available to the infant indirectly reflect the contrastiveness and discriminability of categories in the adult grammar. The article also argues that the developing system is refined using internal feedback from type statistics over the lexicon, once the lexicon is well-developed. The application of type statistics to a system initiated with surface statistics does not cause a fundamental reorganization of the system. Instead, it exploits confluences across levels of representation which characterize human language and make bootstrapping possible.
Jager, Tjalling
2013-02-05
The individuals of a species are not equal. These differences frustrate experimental biologists and ecotoxicologists who wish to study the response of a species (in general) to a treatment. In the analysis of data, differences between model predictions and observations on individual animals are usually treated as random measurement error around the true response. These deviations, however, are mainly caused by real differences between the individuals (e.g., differences in physiology and in initial conditions). Understanding these intraspecies differences, and accounting for them in the data analysis, will improve our understanding of the response to the treatment we are investigating and allow for a more powerful, less biased statistical analysis. Here, I explore a basic scheme for statistical inference to estimate parameters governing stress that allows individuals to differ in their basic physiology. This scheme is illustrated using a simple toxicokinetic-toxicodynamic model and a data set for growth of the springtail Folsomia candida exposed to cadmium in food. This article should be seen as a proof of concept; a first step in bringing more realism into statistical inference for process-based models in ecotoxicology.
Ji, Qinqin; Salomon, Arthur R.
2015-01-01
The activation of T-lymphocytes through antigen-mediated T-cell receptor (TCR) clustering is vital in regulating the adaptive immune response. Although T cell receptor signaling has been extensively studied, the fundamental mechanisms for signal initiation are not fully understood. Reduced temperature initiates some of the hallmarks of TCR signaling, such as increased phosphorylation and activation of ERK and calcium release from the endoplasmic reticulum, and also coalesces T-cell membrane microdomains. The precise mechanism of TCR signaling initiation due to temperature change remains obscure. One critical question is whether signaling initiated by cold treatment of T cells differs from signaling initiated by crosslinking of the T cell receptor. To address this uncertainty, a wide-scale, quantitative mass spectrometry-based phosphoproteomic analysis was performed on T cells stimulated either by temperature shift or through crosslinking of the TCR. Careful statistical comparison between the two stimulations revealed a striking level of identity between the subset of 339 sites that changed significantly with both stimulations. This study demonstrates for the first time, at unprecedented detail, that T cell cold treatment was sufficient to initiate signaling patterns nearly identical to soluble antibody stimulation, shedding new light on the mechanism of activation of these critically important immune cells. PMID:25839225
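A study like this must screen thousands of phosphosites for significance, which requires multiple-testing control before site subsets can be compared across stimulations. The sketch below shows one standard way to do that (Benjamini-Hochberg FDR); the p-values are invented and the procedure is an assumption, not the paper's documented pipeline.

```python
# Hypothetical site-level significance screen with FDR control.
import numpy as np
from statsmodels.stats.multitest import multipletests

p_cold = np.array([0.001, 0.20, 0.004, 0.03, 0.60])    # temperature shift
p_xlink = np.array([0.002, 0.45, 0.010, 0.02, 0.55])   # TCR crosslinking

sig_cold = multipletests(p_cold, alpha=0.05, method="fdr_bh")[0]
sig_xlink = multipletests(p_xlink, alpha=0.05, method="fdr_bh")[0]
overlap = np.logical_and(sig_cold, sig_xlink)
print("sites significant under both stimulations:", overlap.sum())
```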
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maruyama, Mitsunari, E-mail: mitunari@med-shimane.u.ac.jp; Yoshizako, Takeshi, E-mail: yosizako@med.shimane-u.ac.jp; Nakamura, Tomonori, E-mail: t-naka@med.shimane-u.ac.jp
2016-03-15
Purpose: This study was performed to evaluate the accumulation of lipiodol emulsion (LE) and adverse events during our initial experience of balloon-occluded transcatheter arterial chemoembolization (B-TACE) for hepatocellular carcinoma (HCC) compared with conventional TACE (C-TACE). Methods: The B-TACE group (50 cases) was compared with the C-TACE group (50 cases). The ratio of the LE concentration in the tumor to that in the surrounding embolized liver parenchyma (LE ratio) was calculated after each treatment. Adverse events were evaluated according to the Common Terminology Criteria for Adverse Events (CTCAE) version 4.0. Results: The LE ratio at the subsegmental level showed a statistically significant difference between the groups (t test: P < 0.05). Only elevation of alanine aminotransferase was more frequent in the B-TACE group, showing a statistically significant difference (Mann-Whitney test: P < 0.05). While B-TACE caused severe adverse events (liver abscess and infarction) in patients with bile duct dilatation, there was no statistically significant difference in incidence between the groups. Multivariate logistic regression analysis suggested that the significant risk factor for liver abscess/infarction was bile duct dilatation (P < 0.05). Conclusion: The LE ratio at the subsegmental level showed a statistically significant difference between the groups, and B-TACE caused severe adverse events (liver abscess and infarction) in patients with bile duct dilatation.
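For readers unfamiliar with the two tests quoted above, here is the shape of each comparison with invented per-patient values (the study's raw data are not available here; the Welch variant of the t test is my choice, not necessarily theirs).

```python
# Minimal sketch of the two group comparisons reported above.
from scipy import stats

le_ratio_btace = [2.1, 1.8, 2.5, 2.9, 2.2]   # invented LE ratios
le_ratio_ctace = [1.4, 1.1, 1.7, 1.3, 1.5]
t, p_t = stats.ttest_ind(le_ratio_btace, le_ratio_ctace, equal_var=False)

alt_grade_btace = [2, 3, 1, 2, 3]            # invented CTCAE-style grades
alt_grade_ctace = [1, 1, 2, 1, 2]
u, p_u = stats.mannwhitneyu(alt_grade_btace, alt_grade_ctace)
print(f"t test p = {p_t:.3f}, Mann-Whitney p = {p_u:.3f}")
```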
Characterizing chaotic melodies in automatic music composition
NASA Astrophysics Data System (ADS)
Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang
2010-09-01
In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.
Mehta, H B; Vargas, G M; Adhikari, D; Dimou, F; Riall, T S
2017-06-01
The objectives were to determine trends in the use of chemotherapy as the initial treatment and to evaluate the comparative effectiveness of initial chemotherapy vs resection of the primary tumour on survival (intention-to-treat analysis) in Stage IV colorectal cancer (CRC). This cohort study used 2000-2011 data from the Surveillance, Epidemiology, and End Results (SEER)-Medicare linked database, including patients ≥ 66 years of age presenting with Stage IV CRC. Cox proportional hazards models and instrumental variable analysis were used to compare the effectiveness of chemotherapy as the initial treatment with resection of the primary tumour as the initial treatment, with 2-year survival as the end point. The use of chemotherapy as the first treatment increased over time, from 26.8% in 2001 to 46.9% in 2009 (P < 0.0001). The traditional Cox model showed that chemotherapy as the initial treatment was associated with a higher risk of mortality [hazard ratio (HR) = 1.35; 95% CI: 1.27-1.44]. When accounting for known and unknown confounders in an instrumental variable analysis, chemotherapy as the initial treatment suggested benefit on 2-year survival (HR = 0.68; 95% CI: 0.44-1.04); however, the association did not reach statistical significance. The study findings were similar in six subgroup analyses. The use of chemotherapy as the initial therapy for CRC increased substantially from 2001 to 2009. Instrumental variable analysis found that, compared with resection, chemotherapy as the initial treatment offers similar or better 2-year survival in patients with Stage IV CRC. Given the morbidity and mortality associated with colorectal resection in elderly patients, chemotherapy provides an option to patients who are not good candidates for resection.
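The contrast between the "traditional Cox model" and the instrumental variable estimate is the key methodological point here: an instrument shifts treatment without directly affecting outcome, so it can purge unmeasured confounding. The sketch below shows the two-stage least squares logic on an invented linear toy problem (the study itself applied IV methods to survival data, which is more involved).

```python
# Two-stage least squares in miniature: all data invented, true effect = 0.5.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
z = rng.binomial(1, 0.5, n)            # instrument, e.g. regional practice style
confounder = rng.normal(size=n)        # unobserved severity
treat = (0.8 * z - 0.5 * confounder + rng.normal(size=n) > 0).astype(float)
y = 0.5 * treat + 1.0 * confounder + rng.normal(size=n)

X1 = np.column_stack([np.ones(n), z])
treat_hat = X1 @ np.linalg.lstsq(X1, treat, rcond=None)[0]   # stage 1
X2 = np.column_stack([np.ones(n), treat_hat])
beta = np.linalg.lstsq(X2, y, rcond=None)[0]                 # stage 2
print(f"naive OLS effect vs IV effect: "
      f"{np.polyfit(treat, y, 1)[0]:.2f} vs {beta[1]:.2f}")
```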
Badreldin, Hisham
2018-07-01
This study was conducted to describe the real-world hospital length of stay in patients treated with all of the U.S. Food and Drug Administration approved direct oral anticoagulants (DOACs) versus warfarin for new-onset venous thromboembolism (VTE) at a large, tertiary, academic medical center. A retrospective cohort analysis of all adult patients diagnosed with acute onset VTE was conducted. Of the 441 patients included, 261 (59%) received DOACs and 180 (41%) received warfarin. In the DOAC group, 92 (35%) patients received rivaroxaban, followed by 83 (32%) who received apixaban, 50 (19%) who received dabigatran, and 36 (14%) who received edoxaban. Patients initiated on DOACs had a statistically significant shorter hospital length of stay compared to patients initiated on warfarin (median 3 days [IQR 0-5] vs. 8 days [IQR 5-11], P < 0.05). Despite the shorter hospital length of stay in patients receiving DOACs, the differences between the DOAC group and the warfarin group in terms of recurrent VTE, major bleeding, intracranial bleeding, and gastrointestinal bleeding at 3 and 6 months were not statistically significant.
Entropy Growth in the Early Universe and Confirmation of Initial Big Bang Conditions
NASA Astrophysics Data System (ADS)
Beckwith, Andrew
2009-09-01
This paper shows how increased entropy values from an initially low big bang level can be measured experimentally by counting relic gravitons. Furthermore, the physical mechanism of this entropy increase is explained via analogies with early-universe phase transitions. The role of Jack Ng's (2007, 2008a, 2008b) revised infinite quantum statistics in the physics of gravitational wave detection is acknowledged. Ng's infinite quantum statistics can be used to show that ΔS ~ ΔN_gravitons is a starting point for the increasing net cosmological entropy of the universe. Finally, in a nod to similarities with zero-point energy (ZPE) analysis, it is important to note that the resulting ΔS ~ ΔN_gravitons is not 10^88 but in fact much lower, allowing initial graviton production to be evaluated as an emergent field phenomenon, which may be similar to how ZPE states can be used to extract energy from a vacuum if entropy is not maximized. The rapid increase in entropy so alluded to, without a near-sudden jump to 10^88, may be enough to allow successful modeling of relic graviton production for entropy in a manner similar to ZPE energy extraction from a vacuum state.
Krishnan, Ranjani; Walton, Emily B; Van Vliet, Krystyn J
2009-11-01
As computational resources increase, molecular dynamics simulations of biomolecules are becoming an increasingly informative complement to experimental studies. In particular, it has now become feasible to use multiple initial molecular configurations to generate an ensemble of replicate production-run simulations that allows for more complete characterization of rare events such as ligand-receptor unbinding. However, there are currently no explicit guidelines for selecting an ensemble of initial configurations for replicate simulations. Here, we use clustering analysis and steered molecular dynamics simulations to demonstrate that the configurational changes accessible in molecular dynamics simulations of biomolecules do not necessarily correlate with observed rare-event properties. This informs selection of a representative set of initial configurations. We also employ statistical analysis to identify the minimum number of replicate simulations required to sufficiently sample a given biomolecular property distribution. Together, these results suggest a general procedure for generating an ensemble of replicate simulations that will maximize accurate characterization of rare-event property distributions in biomolecules.
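One practical reading of "minimum number of replicate simulations" is a convergence check: grow the replicate set and test when a subsample's property distribution becomes indistinguishable from the full ensemble. The sketch below shows that idea with a Kolmogorov-Smirnov test; the distribution, the property name, and the procedure are all stand-ins, not the authors' published protocol.

```python
# Invented convergence check for replicate-count sufficiency.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
unbinding_force = rng.gumbel(120.0, 15.0, 500)   # stand-in rare-event property

for n in (5, 10, 20, 50, 100):
    subset = rng.choice(unbinding_force, size=n, replace=False)
    ks = stats.ks_2samp(subset, unbinding_force)
    print(f"n = {n:3d}: KS p = {ks.pvalue:.2f}")
# the smallest n whose subsamples stop differing from the full ensemble is a
# practical lower bound on the number of replicate simulations
```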
Whole vertebral bone segmentation method with a statistical intensity-shape model based approach
NASA Astrophysics Data System (ADS)
Hanaoka, Shouhei; Fritscher, Karl; Schuler, Benedikt; Masutani, Yoshitaka; Hayashi, Naoto; Ohtomo, Kuni; Schubert, Rainer
2011-03-01
An automatic segmentation algorithm for the vertebrae in human body CT images is presented. In particular, we focused on constructing and utilizing four different statistical intensity-shape combined models for the cervical, upper thoracic, lower thoracic, and lumbar vertebrae, respectively. For this purpose, two previously reported methods were combined: a deformable model-based initial segmentation method and a statistical shape-intensity model-based precise segmentation method. The former is used as pre-processing to detect the position and orientation of each vertebra, which determines the initial condition for the latter precise segmentation method. The precise segmentation method needs prior knowledge of both the intensities and the shapes of the objects. After PCA analysis of such shape-intensity expressions obtained from training image sets, vertebrae were parametrically modeled as a linear combination of the principal component vectors. The segmentation of each target vertebra was performed by fitting this parametric model to the target image by maximum a posteriori estimation, combined with the geodesic active contour method. In an experiment using 10 cases, the initial segmentation was successful in 6 cases and only partially failed in 4 cases (2 in the cervical area and 2 in the lumbosacral area). In the precise segmentation, the mean error distances were 2.078, 1.416, 0.777, and 0.939 mm for the cervical, upper thoracic, lower thoracic, and lumbar spine, respectively. In conclusion, our automatic segmentation algorithm for the vertebrae in human body CT images showed fair performance for cervical, thoracic, and lumbar vertebrae.
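The core of the "parametric model as a linear combination of principal component vectors" step is a standard PCA shape model. The skeleton below shows that construction with random stand-in landmark data; the array shapes, the mode count, and the synthetic-shape step are illustrative assumptions, not the paper's implementation.

```python
# Skeleton of a PCA-based statistical shape model.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
train_shapes = rng.normal(size=(30, 600))   # 30 vertebrae x 200 3-D landmarks

pca = PCA(n_components=10).fit(train_shapes)
mean_shape = pca.mean_

# parametric model: shape(b) = mean + sum_k b_k * mode_k
b = np.zeros(10)
b[0] = 2.0                                  # move along the first mode
synthetic = mean_shape + b @ pca.components_
print("explained variance of first 3 modes:", pca.explained_variance_ratio_[:3])
```

Fitting a target image then amounts to searching over the mode weights b (plus pose) to maximize the posterior, which is what the maximum a posteriori step above does.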
Turbulence Statistics of a Buoyant Jet in a Stratified Environment
NASA Astrophysics Data System (ADS)
McCleney, Amy Brooke
Using non-intrusive optical diagnostics, turbulence statistics for a round, incompressible, buoyant, vertical jet discharging freely into a stably, linearly stratified environment are studied and compared to a reference case of a neutrally buoyant jet in a uniform environment. This is part of a validation campaign for computational fluid dynamics (CFD). Buoyancy forces are known to significantly affect the jet evolution in a stratified environment. Despite their ubiquity in numerous natural and man-made flows, available data on these jets are limited, which constrains our understanding of the underlying physical processes. In particular, there is a dearth of velocity field data, which makes it challenging to validate the numerical codes currently used for modeling these important flows. Herein, jet near- and far-field behaviors are obtained with a combination of planar laser induced fluorescence (PLIF) and multi-scale time-resolved particle image velocimetry (TR-PIV) for Reynolds numbers up to 20,000. Deploying non-intrusive optical diagnostics in a variable density environment is challenging in liquids. The refractive index is strongly affected by the density, which introduces optical aberrations and occlusions that prevent the resolution of the flow. One solution consists of using index-matched fluids with different densities. Here a pair of water solutions (isopropanol and NaCl) is identified that satisfies these requirements. In fact, they provide a density difference up to 5%, which is the largest reported for such fluid pairs. Additionally, by design, the kinematic viscosities of the solutions are identical. This greatly simplifies the analysis and subsequent simulations of the data. The spectral and temperature dependence of the solutions is fully characterized. In the near-field, shear layer roll-up is analyzed and characterized as a function of the initial velocity profile. In the far-field, turbulence statistics are reported for two different scales, one capturing the entire jet at near Taylor microscale resolution, and the other, thanks to the careful refractive index matching of the liquids, resolving the Taylor scale at near Kolmogorov scale resolution. This is accomplished using a combination of TR-PIV and long-distance micro-PIV. The turbulence statistics are obtained at various downstream locations and magnifications for density differences of 0%, 1%, and 3%. To validate the experimental methodology and provide a reference case, the effect of the initial velocity profile on the neutrally buoyant jet in the self-preserving regime is studied at two Reynolds numbers of 10,000 and 20,000. For the neutrally buoyant jet, it is found that, independent of initial conditions, the jet follows a self-similar behavior in the far-field; however, the spreading rate is strongly dependent on the initial velocity profile. High magnification analysis at the small turbulent length scales shows a flow field where the mean statistics compare well to the larger field of view case. Investigation of the near-field shows the jet is strongly influenced by buoyancy, with an increase in vortex ring formation frequency and number of pairings. The buoyant jet with a 1% density difference shows an alteration of the centerline velocity decay, but the radial distribution of the mean axial velocity collapses well at all measurement locations.
Jet formation changes dramatically for a buoyant jet with a 3% density difference, where the jet reaches a terminal height and spreads out horizontally at its neutral buoyancy location. Analysis of both the mean axial velocity and strain rates shows the jet is no longer self-similar; for example, the mean centerline velocity does not decay uniformly as the jet develops. The centerline strain rates at this density difference also show trends that are strongly influenced by the altered centerline velocity. The overall centerline analysis shows that turbulence suppression occurs as a result of the stratification for both the 1% and 3% density differences. Analysis of the kinetic energy budget shows that the mean convection, production, transport, and dissipation of energy are altered by stratification. High resolution data of the jet enable flow structures to be captured in the neutrally buoyant region of the flow. Vortices of different sizes are identified. Longer data sets are necessary to perform a statistical analysis of their distribution and to compare them to the homogeneous-environment case. This multi-scale analysis shows potential for studying energy transfer between length scales.
Factors predicting survival in amyotrophic lateral sclerosis patients on non-invasive ventilation.
Gonzalez Calzada, Nuria; Prats Soro, Enric; Mateu Gomez, Lluis; Giro Bulta, Esther; Cordoba Izquierdo, Ana; Povedano Panades, Monica; Dorca Sargatal, Jordi; Farrero Muñoz, Eva
2016-01-01
Non-invasive ventilation (NIV) improves quality of life and extends survival in amyotrophic lateral sclerosis (ALS) patients. However, few data exist on the factors related to survival after NIV initiation. We aimed to assess the predictive factors that influence survival in patients after NIV initiation. Patients who started NIV from 2000 to 2014 and were tolerant of it (compliance ≥ 4 hours) were included; demographic, disease-related, and respiratory variables at NIV initiation were analysed. Statistical analysis was performed using the Kaplan-Meier method and Cox proportional hazards models. A total of 213 patients were included, with median survival from NIV initiation of 13.5 months. In univariate analysis, the identified risk factors for mortality were severity of bulbar involvement (HR 2), forced vital capacity (FVC) % (HR 0.99), and ALSFRS-R (HR 0.97). Multivariate analysis showed that bulbar involvement (HR 1.92) and ALSFRS-R (HR 0.97) were independent predictive factors of survival in patients on NIV. In our study, the two prognostic factors in ALS patients following NIV initiation were the severity of bulbar involvement and ALSFRS-R at the time of NIV initiation. A better assessment of bulbar involvement, including evaluation of the upper airway, and careful titration of NIV are necessary to optimize treatment efficacy.
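The multivariate step above is a standard Cox proportional hazards fit. Here is a hedged sketch with an invented data frame holding the two retained predictors; the column names, values, and sample size are mine, not the study's.

```python
# Hypothetical Cox PH fit mirroring the reported multivariate model.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months":   [13, 24, 6, 30, 9, 18, 4, 27],   # survival from NIV start
    "died":     [1, 0, 1, 1, 0, 1, 1, 0],
    "bulbar":   [1, 0, 1, 0, 1, 0, 1, 0],        # severity of bulbar involvement
    "alsfrs_r": [30, 41, 25, 36, 28, 33, 22, 40],
})
cph = CoxPHFitter().fit(df, duration_col="months", event_col="died")
cph.print_summary()   # exp(coef) column gives hazard ratios like those quoted
```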
The Impact of Recreational Marijuana Legislation in Washington, DC on Marijuana Use Cognitions.
Clarke, Paige; Dodge, Tonya; Stock, Michelle L
2018-04-13
There is little published research that tests the effect of recreational marijuana legislation on risk-related cognitions and how individuals respond immediately after legislative approval. The objective was to test whether learning about the passage of Initiative 71, a voter referendum that legalized recreational use of marijuana in the District of Columbia, would lead individuals to adopt more favorable marijuana cognitions than they had before the Initiative was passed. Undergraduate students (N = 402) completed two web-based questionnaires in 2014. The first questionnaire was completed prior to the referendum vote and the follow-up questionnaire was completed after voters approved Initiative 71. Attitudes, perceived norms, intentions, prototypes, and willingness were measured at time 1 and time 2. Study hypotheses were tested using repeated-measures analysis of covariance. Results showed that attitudes, intentions, perceived norms, and willingness to use marijuana were more favorable after Initiative 71 was passed. However, the increase in attitudes and willingness was moderated by past experience with marijuana, whereby the increases were statistically significant only among those with the least experience. The increase in perceived norms was also moderated by past experience, whereby increases were statistically significant among those who were moderate or heavy users. The passage of Initiative 71 had no effect on favorable prototypes. Legalization may have the unintended outcome of leading to more favorable intentions to use marijuana and might lead abstainers or experimental users to become more frequent users of marijuana via more positive attitudes and willingness towards marijuana use.
Genital Herpes - Initial Visits to Physicians' Offices, United States, 1966-2012
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1977-01-01
Wind vector change with respect to time at Cape Kennedy, Florida, is examined according to the theory of multivariate normality. The joint distribution of the four variables represented by the components of the wind vector at an initial time and after a specified elapsed time is hypothesized to be quadravariate normal; the fourteen statistics of this distribution, calculated from fifteen years of twice-daily Rawinsonde data, are presented by monthly reference periods for each month from 0 to 27 km. The hypotheses that the wind component change with respect to time is univariate normal, that the joint distribution of wind component changes is bivariate normal, and that the modulus of vector wind change is Rayleigh have been tested by comparison with observed distributions. Statistics of the conditional bivariate normal distributions of vector wind at a future time given the vector wind at an initial time are derived. Wind changes over time periods from one to five hours, calculated from Jimsphere data, are presented.
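The conditional distribution derived above follows the standard multivariate-normal conditioning formulas: given a quadravariate normal for (u1, v1, u2, v2), the future wind (u2, v2) conditioned on the observed initial wind (u1, v1) is bivariate normal with mean mu2 + S21 S11^{-1} (x1 - mu1) and covariance S22 - S21 S11^{-1} S12. The numbers below are invented, but the computation is the general one.

```python
# Conditional bivariate normal from a quadravariate normal model.
import numpy as np

mu = np.array([4.0, 1.0, 4.5, 1.2])          # means of (u1, v1, u2, v2), m/s
cov = np.array([[25.0,  3.0, 20.0,  2.0],
                [ 3.0, 16.0,  2.5, 12.0],
                [20.0,  2.5, 26.0,  3.0],
                [ 2.0, 12.0,  3.0, 17.0]])   # 4 means + 10 covariances = 14 stats

S11, S12 = cov[:2, :2], cov[:2, 2:]
S21, S22 = cov[2:, :2], cov[2:, 2:]
x1 = np.array([10.0, -2.0])                  # observed initial wind

mu_cond = mu[2:] + S21 @ np.linalg.solve(S11, x1 - mu[:2])
S_cond = S22 - S21 @ np.linalg.solve(S11, S12)
print("conditional mean:", mu_cond)
print("conditional covariance:\n", S_cond)
```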
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Fangyan; Zhang, Song; Chung Wong, Pak
Effectively visualizing large graphs and capturing the statistical properties are two challenging tasks. To aid in these two tasks, many sampling approaches for graph simplification have been proposed, falling into three categories: node sampling, edge sampling, and traversal-based sampling. It is still unknown which approach is the best. We evaluate commonly used graph sampling methods through a combined visual and statistical comparison of graphs sampled at various rates. We conduct our evaluation on three graph models: random graphs, small-world graphs, and scale-free graphs. Initial results indicate that the effectiveness of a sampling method is dependent on the graph model, the size of the graph, and the desired statistical property. This benchmark study can be used as a guideline in choosing the appropriate method for a particular graph sampling task, and the results presented can be incorporated into graph visualization and analysis tools.
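A small end-to-end version of this comparison: sample a scale-free graph by node sampling and by edge sampling at the same rate, then compare one statistical property (mean degree) against the original. The graph model, rate, and property are arbitrary choices for illustration, not the study's benchmark settings.

```python
# Node vs edge sampling of a scale-free graph, compared on mean degree.
import random
import networkx as nx

random.seed(4)
G = nx.barabasi_albert_graph(2000, 3, seed=4)   # scale-free test graph
rate = 0.2

def mean_degree(g):
    return 2 * g.number_of_edges() / max(g.number_of_nodes(), 1)

nodes = random.sample(list(G.nodes), int(rate * G.number_of_nodes()))
node_sampled = G.subgraph(nodes)                 # node sampling

edges = random.sample(list(G.edges), int(rate * G.number_of_edges()))
edge_sampled = nx.Graph(edges)                   # edge sampling

print(f"original {mean_degree(G):.2f}, "
      f"node-sampled {mean_degree(node_sampled):.2f}, "
      f"edge-sampled {mean_degree(edge_sampled):.2f}")
```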
A shift from significance test to hypothesis test through power analysis in medical research.
Singh, G
2006-01-01
Until recently, medical research literature exhibited substantial dominance of Fisher's significance-test approach to statistical inference, which concentrates on the probability of type I error, over the Neyman-Pearson hypothesis-test approach, which considers the probabilities of both type I and type II errors. Fisher's approach dichotomises results into significant or non-significant with a P value. The Neyman-Pearson approach speaks of acceptance or rejection of the null hypothesis. Based on the same theory, these two approaches address the same objective and reach conclusions in their own ways. Advances in computing techniques and the availability of statistical software have resulted in the increasing application of power calculations in medical research, and thereby in reporting the results of significance tests in light of the power of the test as well. The significance-test approach, when it incorporates power analysis, contains the essence of the hypothesis-test approach. It may be safely argued that the rising application of power analysis in medical research may have initiated a shift from Fisher's significance test to the Neyman-Pearson hypothesis-test procedure.
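The power calculation the article advocates is routine to run. As one concrete instance (the effect size, alpha, and power targets below are conventional defaults, not values from the article): the per-group sample size needed to detect a medium effect (Cohen's d = 0.5) in a two-sample t test at alpha = 0.05 with 80% power.

```python
# Prospective power analysis for a two-sample t test.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05,
                                          power=0.80, alternative="two-sided")
print(f"required n per group: {n_per_group:.1f}")   # about 64
```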
Diagnostic index of 3D osteoarthritic changes in TMJ condylar morphology
NASA Astrophysics Data System (ADS)
Gomes, Liliane R.; Gomes, Marcelo; Jung, Bryan; Paniagua, Beatriz; Ruellas, Antonio C.; Gonçalves, João Roberto; Styner, Martin A.; Wolford, Larry; Cevidanes, Lucia
2015-03-01
The aim of this study was to investigate imaging statistical approaches for classifying 3D osteoarthritic morphological variations among 169 Temporomandibular Joint (TMJ) condyles. Cone beam Computed Tomography (CBCT) scans were acquired from 69 patients with long-term TMJ Osteoarthritis (OA) (39.1 ± 15.7 years), 15 patients at initial diagnosis of OA (44.9 ± 14.8 years), and 7 healthy controls (43 ± 12.4 years). 3D surface models of the condyles were constructed, and Shape Correspondence was used to establish correspondent points on each model. The statistical framework included a multivariate analysis of covariance (MANCOVA) and Direction-Projection-Permutation (DiProPerm) for testing statistical significance of the differences between the healthy control and OA groups determined by clinical and radiographic diagnoses. Unsupervised classification using hierarchical agglomerative clustering (HAC) was then conducted. Condylar morphology in OA and healthy subjects varied widely. Compared with healthy controls, the average OA condyle was statistically significantly smaller in all dimensions except its anterior surface. Significant flattening of the lateral pole was noticed at initial diagnosis (p < 0.05). Areas of 3.88 mm bone resorption at the superior surface and 3.10 mm bone apposition at the anterior aspect were observed in the long-term OA average model. DiProPerm statistics with 1,000 permutations supported a significant difference between the healthy control group and the OA group (t = 6.7, empirical p-value = 0.001). Clinically meaningful unsupervised classification of TMJ condylar morphology produced a preliminary diagnostic index of 3D osteoarthritic changes, which may be the first step towards a more targeted diagnosis of this condition.
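For the unsupervised step, hierarchical agglomerative clustering can be run on any per-condyle feature matrix. The stand-in below uses random features and Ward linkage; the feature dimension, cluster count, and group structure are invented, and DiProPerm itself is a separate permutation procedure not shown here.

```python
# Minimal HAC stand-in for the unsupervised classification step.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)
features = np.vstack([rng.normal(0.0, 1.0, (60, 8)),     # healthy-like
                      rng.normal(1.5, 1.0, (109, 8))])   # OA-like, 169 total

Z = linkage(features, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```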
Improving suicide mortality statistics in Tarragona (Catalonia, Spain) between 2004-2012.
Barbería, Eneko; Gispert, Rosa; Gallo, Belén; Ribas, Gloria; Puigdefàbregas, Anna; Freitas, Adriana; Segú, Elena; Torralba, Pilar; García-Sayago, Francisco; Estarellas, Aina
2016-07-20
Monitoring and preventing suicidal behaviour requires, among other data, knowing suicide deaths precisely. They often appear under-reported or misclassified in official mortality statistics. The aim of this study is to analyse the under-reporting found in the suicide mortality statistics of Tarragona (a province of Catalonia, Spain). The analysis takes into account all suicide deaths that occurred in the Tarragona Area of the Catalan Institute of Legal Medicine and Forensic Sciences (TA-CILMFS) between 2004 and 2012. The sources of information were the death data files of the Catalan Mortality Register and the autopsy files of the TA-CILMFS. Suicide rates and socio-demographic profiles were statistically compared between the initially reported suicides and the final figures. The mean percentage of non-reported cases in the period was 16.2%, with a minimum of 2.2% in 2005 and a maximum of 26.8% in 2009. The crude suicide mortality rate rose from 6.6 to 7.9 per 100,000 inhabitants once forensic data were incorporated. Small differences were detected between the socio-demographic profiles of the initially reported and final suicide cases. Supplementary information was obtained on the suicide method, which revealed a significant increase in poisoning and suicides involving trains. An exhaustive review of suicide death data from forensic sources has led to an improvement in the under-reported statistical information. It also improves knowledge of the method of suicide and personal characteristics.
A comparison of wire- and Kevlar-reinforced provisional restorations.
Powell, D B; Nicholls, J I; Yuodelis, R A; Strygler, H
1994-01-01
Stainless steel wire 0.036 inch in diameter was compared with Kevlar 49 polyaramid fiber as a means of reinforcing a four-unit posterior provisional fixed restoration with 2 pontics. Three reinforcement patterns for wire and two for Kevlar 49 were evaluated and compared with the control, which was an unreinforced provisional restoration. A central tensile load was placed on the cemented provisional restoration and the variables were measured: (1) the initial stiffness; (2) the load at initial fracture; and (3) the unit toughness, or the energy stored in the beam at a point where the load had undergone a 1.0-mm deflection. Statistical analysis showed (1) the bent wire configuration had a significantly higher initial stiffness (P < or = .05), (2) there was no difference between designs for load at initial fracture, and (3) the bent wire had a significantly higher unit toughness value (P < or = .05).
Homogeneous buoyancy-generated turbulence
NASA Technical Reports Server (NTRS)
Batchelor, G. K.; Canuto, V. M.; Chasnov, J. R.
1992-01-01
Using a theoretical analysis of the fundamental equations and a numerical simulation of the flow field, the statistically homogeneous motion that is generated by buoyancy forces after the creation of homogeneous random fluctuations in the density of an infinite fluid at an initial instant is examined. It is shown that the analytical results together with the numerical results provide a comprehensive description of the 'birth, life, and death' of buoyancy-generated turbulence. The numerical simulations yielded the mean-square density and mean-square velocity fluctuations and the associated spectra as functions of time for various initial conditions, and the time required for the mean-square density fluctuation to fall to a specified small value was estimated.
MORAL HAZARD IN HEALTH INSURANCE: DO DYNAMIC INCENTIVES MATTER?
Aron-Dine, Aviva; Einav, Liran; Finkelstein, Amy; Cullen, Mark
2016-01-01
Using data from employer-provided health insurance and Medicare Part D, we investigate whether healthcare utilization responds to the dynamic incentives created by the nonlinear nature of health insurance contracts. We exploit the fact that, because annual coverage usually resets every January, individuals who join a plan later in the year face the same initial (“spot”) price of healthcare but a higher expected end-of-year (“future”) price. We find a statistically significant response of initial utilization to the future price, rejecting the null that individuals respond only to the spot price. We discuss implications for analysis of moral hazard in health insurance. PMID:26769985
Acquisition and Initial Analysis of H+- and H--Beam Centroid Jitter at LANSCE
NASA Astrophysics Data System (ADS)
Gilpatrick, J. D.; Bitteker, L.; Gulley, M. S.; Kerstiens, D.; Oothoudt, M.; Pillai, C.; Power, J.; Shelley, F.
2006-11-01
During the 2005 Los Alamos Neutron Science Center (LANSCE) beam runs, beam current and centroid-jitter data were observed, acquired, analyzed, and documented for both the LANSCE H+ and H- beams. These data were acquired using three beam position monitors (BPMs) from the 100-MeV Isotope Production Facility (IPF) beam line and three BPMs from the Switchyard transport line at the end of the LANSCE 800-MeV linac. The two types of data acquired, intermacropulse and intramacropulse, were analyzed for statistical and frequency characteristics as well as various other correlations, including a comparison of their phase-space-like characteristics in a coordinate system of transverse angle versus transverse position. This paper briefly describes the measurements required to acquire these data, the initial analysis of these jitter data, and some interesting dilemmas these data presented.
Meta-analysis of randomized clinical trials in the era of individual patient data sharing.
Kawahara, Takuya; Fukuda, Musashi; Oba, Koji; Sakamoto, Junichi; Buyse, Marc
2018-06-01
Individual patient data (IPD) meta-analysis is considered to be a gold standard when the results of several randomized trials are combined. Recent initiatives on sharing IPD from clinical trials offer unprecedented opportunities for using such data in IPD meta-analyses. First, we discuss the evidence generated and the benefits obtained by a long-established prospective IPD meta-analysis in early breast cancer. Next, we discuss a data-sharing system that has been adopted by several pharmaceutical sponsors. We review a number of retrospective IPD meta-analyses that have already been proposed using this data-sharing system. Finally, we discuss the role of data sharing in IPD meta-analysis in the future. Treatment effects can be more reliably estimated in both types of IPD meta-analyses than with summary statistics extracted from published papers. Specifically, with rich covariate information available on each patient, prognostic and predictive factors can be identified or confirmed. Also, when several endpoints are available, surrogate endpoints can be assessed statistically. Although there are difficulties in conducting, analyzing, and interpreting retrospective IPD meta-analysis utilizing the currently available data-sharing systems, data sharing will play an important role in IPD meta-analysis in the future.
Graft survival of diabetic versus nondiabetic donor tissue after initial keratoplasty.
Vislisel, Jesse M; Liaboe, Chase A; Wagoner, Michael D; Goins, Kenneth M; Sutphin, John E; Schmidt, Gregory A; Zimmerman, M Bridget; Greiner, Mark A
2015-04-01
To compare corneal graft survival using tissue from diabetic and nondiabetic donors in patients undergoing initial Descemet stripping automated endothelial keratoplasty (DSAEK) or penetrating keratoplasty (PKP). A retrospective chart review of pseudophakic eyes that underwent DSAEK or PKP was performed. The primary outcome measure was graft failure. Cox proportional hazard regression and Kaplan-Meier survival analyses were used to compare diabetic versus nondiabetic donor tissue for all keratoplasty cases. A total of 183 eyes (136 DSAEK, 47 PKP) were included in the statistical analysis. Among 24 procedures performed using diabetic donor tissue, there were 4 cases (16.7%) of graft failure (3 DSAEK, 1 PKP), and among 159 procedures performed using nondiabetic donor tissue, there were 18 cases (11.3%) of graft failure (12 DSAEK, 6 PKP). Cox proportional hazard ratio of graft failure for all cases comparing diabetic with nondiabetic donor tissue was 1.69, but this difference was not statistically significant (95% confidence interval, 0.56-5.06; P = 0.348). There were no significant differences in Kaplan-Meier curves comparing diabetic with nondiabetic donor tissue for all cases (P = 0.380). Statistical analysis of graft failure by donor diabetes status within each procedure type was not possible because of the small number of graft failure events involving diabetic tissue. We found similar rates of graft failure in all keratoplasty cases when comparing tissue from diabetic and nondiabetic donors, but further investigation is needed to determine whether diabetic donor tissue results in different graft failure rates after DSAEK compared with PKP.
1995-05-01
based upon the variables 'service quality' and 'customer satisfaction.' Service quality was operationally defined as a gap score by subtracting... regression analysis, a statistically significant relationship was shown to exist: (1) between customer satisfaction and service quality, t(387) = 13.566... service quality, customer satisfaction and future choice behavior may assist in preparation for the TRICARE initiative.
Neural Correlates of Morphology Acquisition through a Statistical Learning Paradigm.
Sandoval, Michelle; Patterson, Dianne; Dai, Huanping; Vance, Christopher J; Plante, Elena
2017-01-01
The neural basis of statistical learning as it occurs over time was explored with stimuli drawn from a natural language (Russian nouns). The input reflected the "rules" for marking categories of gendered nouns, without making participants explicitly aware of the nature of what they were to learn. Participants were scanned while listening to a series of gender-marked nouns during four sequential scans, and were tested for their learning immediately after each scan. Although participants were not told the nature of the learning task, they exhibited learning after their initial exposure to the stimuli. Independent component analysis of the brain data revealed five task-related sub-networks. Unlike prior statistical learning studies of word segmentation, this morphological learning task robustly activated the inferior frontal gyrus during the learning period. This region was represented in multiple independent components, suggesting it functions as a network hub for this type of learning. Moreover, the results suggest that subnetworks activated by statistical learning are driven by the nature of the input, rather than reflecting a general statistical learning system.
Najafpoor, Ali Asghar; Jonidi Jafari, Ahmad; Hosseinzadeh, Ahmad; Khani Jazani, Reza; Bargozin, Hasan
2018-01-01
Treatment with a non-thermal plasma (NTP) is a new and effective technology recently applied to gas conversion for air pollution control. This research was initiated to optimize the application of the NTP process for benzene, toluene, ethylbenzene, and xylene (BTEX) removal. The effects of four variables, temperature, initial BTEX concentration, voltage, and flow rate, on BTEX elimination efficiency were investigated using response surface methodology (RSM). The constructed model was evaluated by analysis of variance (ANOVA). Model goodness-of-fit and statistical significance were assessed using the coefficients of determination (R² and adjusted R²) and the F-test. The results revealed that R² was greater than 0.96 for BTEX removal efficiency. The statistical analysis demonstrated that BTEX removal efficiency was significantly correlated with temperature, BTEX concentration, voltage, and flow rate. Voltage was the most influential variable, exerting a significant effect (p < 0.0001) on the response. According to these results, NTP can be applied as a progressive, cost-effective, and practical process for treating airstreams polluted with BTEX under conditions of low residence time and high pollutant concentrations.
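A response-surface analysis of this kind typically fits a second-order polynomial in the factors and reads factor significance off the ANOVA table. The sketch below does that on synthetic data; the factor ranges, the response formula, and the chosen quadratic term are all invented stand-ins for the study's design.

```python
# Response-surface sketch: quadratic model of removal efficiency in 4 factors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 60
df = pd.DataFrame({
    "temp": rng.uniform(20, 60, n),     # degrees C
    "conc": rng.uniform(50, 500, n),    # ppm BTEX
    "volt": rng.uniform(6, 14, n),      # kV
    "flow": rng.uniform(0.5, 2.0, n),   # L/min
})
df["removal"] = (40 + 4.5 * df.volt - 0.04 * df.conc - 8 * df.flow
                 + 0.1 * df.temp + rng.normal(0, 3, n))   # synthetic response

model = smf.ols("removal ~ (temp + conc + volt + flow)**2 + I(volt**2)", df).fit()
print(f"R^2 = {model.rsquared:.3f}, p(voltage) = {model.pvalues['volt']:.2e}")
```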
Park, Rachel; O'Brien, Thomas F; Huang, Susan S; Baker, Meghan A; Yokoe, Deborah S; Kulldorff, Martin; Barrett, Craig; Swift, Jamie; Stelling, John
2016-11-01
While antimicrobial resistance threatens the prevention, treatment, and control of infectious diseases, systematic analysis of routine microbiology laboratory test results worldwide can give early warning of new threats and promote timely response. This study explores statistical algorithms for recognizing geographic clustering of multi-resistant microbes within a healthcare network and monitoring the dissemination of new strains over time. Escherichia coli antimicrobial susceptibility data from a three-year period stored in WHONET were analyzed across ten facilities in a healthcare network utilizing SaTScan's spatial multinomial model with two models for defining geographic proximity. We explored geographic clustering of multi-resistance phenotypes within the network and changes in clustering over time. Geographic clusters identified with the two proximity models (latitude/longitude coordinates and non-parametric facility groupings) were similar, but the latter offers greater flexibility and generalizability. Iterative application of the clustering algorithms suggested recognition of the initial appearance of invasive E. coli ST131 in the clinical database of a single hospital and its subsequent dissemination to others. Systematic analysis of routine antimicrobial susceptibility test results with WHONET and SaTScan supports the recognition of geographic clustering of microbial phenotypic subpopulations, and iterative application of these algorithms can detect the initial appearance in and dissemination across a region, prompting early investigation, response, and containment measures.
[Basic concepts for network meta-analysis].
Catalá-López, Ferrán; Tobías, Aurelio; Roqué, Marta
2014-12-01
Systematic reviews and meta-analyses have long been fundamental tools for evidence-based clinical practice. Initially, meta-analyses were proposed as a technique that could improve the accuracy and the statistical power of previous research from individual studies with small sample sizes. However, one of their main limitations has been the ability to compare no more than two treatments in an analysis, even when the clinical research question requires comparing multiple interventions. Network meta-analysis (NMA) uses novel statistical methods that incorporate information from both direct and indirect treatment comparisons in a network of studies examining the effects of various competing treatments, estimating comparisons between many treatments in a single analysis. Despite its potential limitations, NMA applications in clinical epidemiology can be of great value in situations where several treatments have been compared against a common comparator. Also, NMA can be relevant to a research or clinical question when many treatments must be considered or when there is a mix of both direct and indirect information in the body of evidence.
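The simplest building block of an indirect comparison is the Bucher adjusted method: given effects of A vs B and C vs B from separate trials (on a scale such as log odds ratios), the indirect effect of A vs C is their difference, with the variances adding. The numbers below are purely illustrative.

```python
# Bucher indirect comparison through a common comparator B.
import math

def bucher(d_ab, se_ab, d_cb, se_cb):
    """Indirect effect of A vs C given A-vs-B and C-vs-B effects."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab**2 + se_cb**2)   # variances of the two estimates add
    return d_ac, se_ac

d, se = bucher(d_ab=-0.40, se_ab=0.12, d_cb=-0.15, se_cb=0.10)
print(f"A vs C: {d:.2f} (95% CI {d - 1.96*se:.2f} to {d + 1.96*se:.2f})")
```

Full NMA generalizes this to a whole network of comparisons, combining direct and indirect evidence in one model, but the variance-addition logic above is the core intuition.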
Environmental Health Practice: Statistically Based Performance Measurement
Enander, Richard T.; Gagnon, Ronald N.; Hanumara, R. Choudary; Park, Eugene; Armstrong, Thomas; Gute, David M.
2007-01-01
Objectives. State environmental and health protection agencies have traditionally relied on a facility-by-facility inspection-enforcement paradigm to achieve compliance with government regulations. We evaluated the effectiveness of a new approach that uses a self-certification random sampling design. Methods. Comprehensive environmental and occupational health data from a 3-year statewide industry self-certification initiative were collected from representative automotive refinishing facilities located in Rhode Island. Statistical comparisons between baseline and postintervention data facilitated a quantitative evaluation of statewide performance. Results. The analysis of field data collected from 82 randomly selected automotive refinishing facilities showed statistically significant improvements (P<.05, Fisher exact test) in 4 major performance categories: occupational health and safety, air pollution control, hazardous waste management, and wastewater discharge. Statistical significance was also shown when a modified Bonferroni adjustment for multiple comparisons was performed. Conclusions. Our findings suggest that the new self-certification approach to environmental and worker protection is effective and can be used as an adjunct to further enhance state and federal enforcement programs. PMID:17267709
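The study's test structure is easy to reproduce in outline: compare baseline and post-intervention compliance counts in one performance category with Fisher's exact test, then adjust across the four categories. The counts below are invented, and the plain Bonferroni shown is a simplification of the modified adjustment the authors describe.

```python
# Fisher exact test on invented compliance counts, with Bonferroni adjustment.
from scipy import stats

#            compliant  non-compliant
baseline = [38, 44]
post = [65, 17]
odds, p = stats.fisher_exact([baseline, post])

n_categories = 4   # occupational health, air, hazardous waste, wastewater
p_adjusted = min(1.0, p * n_categories)   # simple Bonferroni correction
print(f"raw p = {p:.4f}, Bonferroni-adjusted p = {p_adjusted:.4f}")
```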
NASA Technical Reports Server (NTRS)
Batthauer, Byron E.
1987-01-01
This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire failure RTO's, The Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTO's in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics with emphasis on failed-tire RTO's. This background information could enhance the split-second decision-making process that is required prior to initiating an RTO.
Flares, ejections, proton events
NASA Astrophysics Data System (ADS)
Belov, A. V.
2017-11-01
Statistical analysis is performed of the relationship of coronal mass ejections (CMEs) and X-ray flares to the fluxes of solar protons with energies >10 and >100 MeV observed near the Earth. The basis for this analysis was the events that took place in 1976-2015, for which there are reliable observations of X-ray flares from GOES satellites and CME observations with SOHO/LASCO coronagraphs. A fairly good correlation is found between the magnitude of proton enhancements and the power and duration of flares, as well as the initial CME speed. The statistics give a clear advantage to neither CMEs nor flares in their relation to proton events; the characteristics of flares and ejections complement each other well, and it is reasonable to use them together in forecast models. Numerical dependences are obtained that allow estimation of the proton fluxes near the Earth expected from solar observations; possibilities for improving the model are discussed.
NASA Astrophysics Data System (ADS)
Provo, Judy; Lamar, Carlton; Newby, Timothy
2002-01-01
A cross section was used to enhance three-dimensional knowledge of anatomy of the canine head. All veterinary students in two successive classes (n = 124) dissected the head; experimental groups also identified structures on a cross section of the head. A test assessing spatial knowledge of the head generated 10 dependent variables from two administrations. The test had content validity and statistically significant interrater and test-retest reliability. A live-dog examination generated one additional dependent variable. Analysis of covariance controlling for performance on course examinations and quizzes revealed no treatment effect. Including spatial skill as a third covariate revealed a statistically significant effect of spatial skill on three dependent variables. Men initially had greater spatial skill than women, but spatial skills were equal after 8 months. A qualitative analysis showed the positive impact of this experience on participants. Suggestions for improvement and future research are discussed.
NASA Astrophysics Data System (ADS)
Vigan, A.; Chauvin, G.; Bonavita, M.; Desidera, S.; Bonnefoy, M.; Mesa, D.; Beuzit, J.-L.; Augereau, J.-C.; Biller, B.; Boccaletti, A.; Brugaletta, E.; Buenzli, E.; Carson, J.; Covino, E.; Delorme, P.; Eggenberger, A.; Feldt, M.; Hagelberg, J.; Henning, T.; Lagrange, A.-M.; Lanzafame, A.; Ménard, F.; Messina, S.; Meyer, M.; Montagnier, G.; Mordasini, C.; Mouillet, D.; Moutou, C.; Mugnier, L.; Quanz, S. P.; Reggiani, M.; Ségransan, D.; Thalmann, C.; Waters, R.; Zurlo, A.
2014-01-01
Over the past decade, a growing number of deep imaging surveys have started to provide meaningful constraints on the population of extrasolar giant planets at large orbital separation. Primary targets for these surveys have been carefully selected based on their age, distance and spectral type, and often on their membership to young nearby associations where all stars share common kinematics, photometric and spectroscopic properties. The next step is a wider statistical analysis of the frequency and properties of low mass companions as a function of stellar mass and orbital separation. In late 2009, we initiated a coordinated European Large Program using angular differential imaging in the H band (1.66 μm) with NaCo at the VLT. Our aim is to provide a comprehensive and statistically significant study of the occurrence of extrasolar giant planets and brown dwarfs at large (5-500 AU) orbital separation around ~150 young, nearby stars, a large fraction of which have never been observed at very deep contrast. The survey has now been completed and we present the data analysis and detection limits for the observed sample, for which we reach the planetary-mass domain at separations of >~50 AU on average. We also present the results of the statistical analysis that has been performed over the 75 targets newly observed at high-contrast. We discuss the details of the statistical analysis and the physical constraints that our survey provides for the frequency and formation scenario of planetary mass companions at large separation.
NASA Technical Reports Server (NTRS)
Chameides, William L.
1988-01-01
Spring 1984 GTE CITE-1 flight data from the field exercise were obtained from a GTE Data Archive Tape. Chemical and supporting meteorological data taken over the Pacific Ocean were statistically and diagnostically analyzed to identify the key processes affecting the concentrations of ozone and its chemical precursors in the region. The analysis was completed. The analysis of the GTE CITE-2 data is being performed in collaboration with Dr. D.D. Davis and other GTE scientists. Initial results of the analysis were presented, and work began on the paper describing the results.
Space biology initiative program definition review. Trade study 4: Design modularity and commonality
NASA Technical Reports Server (NTRS)
Jackson, L. Neal; Crenshaw, John, Sr.; Davidson, William L.; Herbert, Frank J.; Bilodeau, James W.; Stoval, J. Michael; Sutton, Terry
1989-01-01
The relative cost impacts (up or down) of developing Space Biology hardware using design modularity and commonality are studied. Recommendations are provided for how hardware development should be accomplished to meet optimum design modularity requirements for Life Science investigation hardware. In addition, the relative cost impacts of implementing commonality of hardware for all Space Biology hardware are defined. Cost analysis and supporting recommendations for levels of modularity and commonality are presented. A mathematical/statistical cost analysis method capable of expressing design modularity and commonality impacts within parametric cost analysis is provided.
Community Rates of Breastfeeding Initiation.
Grubesic, Tony H; Durbin, Kelly M
2016-11-01
Breastfeeding initiation rates vary considerably across racial and ethnic groups, maternal age, and education level, yet there are limited data concerning the influence of geography on community rates of breastfeeding initiation. This study aimed to describe how community rates of breastfeeding initiation vary in geographic space, highlighting "hot spots" and "cool spots" of initiation and exploring the potential connections between race, socioeconomic status, and urbanization levels on these patterns. Birth certificate data from the Kentucky Department of Health for 2004-2010 were combined with county-level geographic base files, Census 2010 demographic and socioeconomic data, and Rural-Urban Continuum Codes to conduct a spatial statistical analysis of community rates of breastfeeding initiation. Between 2004 and 2010, the average rate of breastfeeding initiation for Kentucky increased from 43.84% to 49.22%. Simultaneously, the number of counties identified as breastfeeding initiation hot spots also increased, displaying a systematic geographic pattern in doing so. Cool spots of breastfeeding initiation persisted in rural, Appalachian Kentucky. Spatial regression results suggested that unemployment, income, race, education, location, and the availability of International Board Certified Lactation Consultants are connected to breastfeeding initiation. Not only do spatial analytics facilitate the identification of breastfeeding initiation hot spots and cool spots, but they can be used to better understand the landscape of breastfeeding initiation and help target breastfeeding education and/or support efforts.
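Hot-spot and cool-spot analyses of county rates typically rest on a spatial autocorrelation statistic such as Moran's I. The from-scratch computation below uses an invented five-county adjacency structure and invented initiation rates; it is a generic illustration of the statistic, not the study's Kentucky analysis.

```python
# Moran's I for spatial autocorrelation of county rates (all values invented).
import numpy as np

rates = np.array([0.49, 0.52, 0.31, 0.28, 0.55])   # breastfeeding initiation
W = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)        # county adjacency weights

z = rates - rates.mean()
n, s0 = len(rates), W.sum()
moran_I = (n / s0) * (z @ W @ z) / (z @ z)
print(f"Moran's I = {moran_I:.2f} "
      f"(expectation under no autocorrelation: {-1 / (n - 1):.2f})")
```

Positive values indicate that similar rates cluster in space (hot and cool spots); values near the null expectation indicate no spatial structure.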
Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation
NASA Technical Reports Server (NTRS)
DePriest, Douglas; Morgan, Carolyn
2003-01-01
The cost and safety goals for NASA's next generation of reusable launch vehicles (RLVs) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase is focused on assessing the performance of these models in accurately predicting the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.
Zaki, Rafdzah; Bulgiba, Awang; Nordin, Noorhaire; Azina Ismail, Noor
2013-06-01
Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify the statistical methods used to measure the reliability of equipment measuring continuous variables. This study also aims to highlight inappropriate statistical methods used in reliability analyses and their implications for medical practice. In 2010, five electronic databases were searched for reliability studies published between 2007 and 2009. A total of 5,795 titles were initially identified. Only 282 titles were potentially related, and finally 42 fitted the inclusion criteria. The Intra-class Correlation Coefficient (ICC) is the most popular method, used in 25 (60%) of the studies, followed by comparison of means (8, or 19%). Of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. This study finds that the Intra-class Correlation Coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and to be able to correctly perform analyses in reliability studies.
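Because the review faults studies for not reporting which ICC they used, it helps to see one written out. Below is ICC(2,1) from the Shrout-Fleiss two-way ANOVA decomposition, with invented measurements (rows are subjects, columns are raters or instruments).

```python
# ICC(2,1) computed from the two-way ANOVA mean squares (Shrout & Fleiss).
import numpy as np

x = np.array([[9.1, 9.3, 9.0],
              [6.2, 6.8, 6.4],
              [7.5, 7.6, 7.9],
              [8.8, 8.5, 8.9],
              [5.9, 6.1, 6.0]])   # invented: 5 subjects x 3 raters
n, k = x.shape
grand = x.mean()
ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
ss_err = ((x - x.mean(axis=1, keepdims=True)
             - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
ms_err = ss_err / ((n - 1) * (k - 1))

icc21 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                              + k * (ms_cols - ms_err) / n)
print(f"ICC(2,1) = {icc21:.3f}")
```

Reporting the model and form explicitly, as above, is exactly the detail the review found missing in most of the 25 ICC studies.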
Supaporn, Pansuwan; Yeom, Sung Ho
2018-04-30
This study investigated the biological conversion of crude glycerol, generated as a by-product by a commercial biodiesel production plant, to 1,3-propanediol (1,3-PD). Statistical analysis was employed to derive a statistical model for the individual and interactive effects of glycerol, (NH4)2SO4, trace elements, pH, and cultivation time on four objectives: 1,3-PD concentration, yield, selectivity, and productivity. Optimum conditions for each objective with its maximum value were predicted by statistical optimization, and experiments under the optimum conditions verified the predictions. In addition, by systematic analysis of the values of the four objectives, the optimum conditions for 1,3-PD concentration (49.8 g/L initial glycerol, 4.0 g/L (NH4)2SO4, 2.0 mL/L trace elements, pH 7.5, and 11.2 h of cultivation time) were determined to be the global optimum culture conditions for 1,3-PD production. Under these conditions, we achieved high 1,3-PD yield (47.4%), selectivity (88.8%), and productivity (2.1 g/L/h) as well as a high 1,3-PD concentration (23.6 g/L).
A hierarchical fuzzy rule-based approach to aphasia diagnosis.
Akbarzadeh-T, Mohammad-R; Moshtagh-Khorasani, Majid
2007-10-01
Aphasia diagnosis is a particularly challenging medical diagnostic task due to linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, a large number of imprecise measurements, and natural diversity and subjectivity in test subjects as well as in the opinions of the experts who diagnose the disease. To efficiently address this diagnostic process, a hierarchical fuzzy rule-based structure is proposed here that considers the effect of different features of aphasia, using statistical analysis in its construction. This approach can be efficient for the diagnosis of aphasia, and possibly for other medical diagnostic applications, due to its fuzzy and hierarchical reasoning construction. Initially, the symptoms of the disease, each of which consists of different features, are analyzed statistically. The statistical parameters measured from the training set are then used to define the membership functions and the fuzzy rules. The resulting two-layered fuzzy rule-based system is then compared with a back-propagating feed-forward neural network for the diagnosis of four aphasia types: anomic, Broca, global and Wernicke. In order to reduce the number of required inputs, the technique is applied and compared on both comprehensive and spontaneous speech tests. Statistical t-test analysis confirms that the proposed approach uses fewer aphasia features while also yielding a significant improvement in accuracy.
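To make the two-layer idea concrete, here is a minimal sketch of how statistically derived membership functions feed fuzzy rules. The feature names, score ranges, and membership parameters are invented for illustration; the paper derives its membership functions from training-set statistics rather than the fixed numbers used here.

```python
import numpy as np

def tri_mf(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical first-layer memberships for one aphasia symptom score;
# in the paper these would be centered on training-set statistics.
repetition_score = 4.2
mu_low = tri_mf(repetition_score, 0.0, 2.0, 5.0)
mu_high = tri_mf(repetition_score, 3.0, 7.0, 10.0)

# Rule firing strength combines feature memberships with min (fuzzy AND);
# a second layer would aggregate symptom-level outputs per aphasia type.
naming_score = 6.5
strength_broca = min(mu_low, tri_mf(naming_score, 3.0, 7.0, 10.0))
print(f"low={mu_low:.2f}, high={mu_high:.2f}, Broca rule strength={strength_broca:.2f}")
```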
Clinical significance of quantitative analysis of facial nerve enhancement on MRI in Bell's palsy.
Song, Mee Hyun; Kim, Jinna; Jeon, Ju Hyun; Cho, Chang Il; Yoo, Eun Hye; Lee, Won-Sang; Lee, Ho-Ki
2008-11-01
Quantitative analysis of the facial nerve on the lesion side as well as the normal side, which allowed for more accurate measurement of facial nerve enhancement in patients with facial palsy, showed statistically significant correlation with the initial severity of facial nerve inflammation, although little prognostic significance was shown. This study investigated the clinical significance of quantitative measurement of facial nerve enhancement in patients with Bell's palsy by analyzing the enhancement pattern and correlating MRI findings with initial severity of facial palsy and clinical outcome. Facial nerve enhancement was measured quantitatively by using the region of interest on pre- and postcontrast T1-weighted images in 44 patients diagnosed with Bell's palsy. The signal intensity increase on the lesion side was first compared with that of the contralateral side and then correlated with the initial degree of facial palsy and prognosis. The lesion side showed significantly higher signal intensity increase compared with the normal side in all of the segments except for the mastoid segment. Signal intensity increase at the internal auditory canal and labyrinthine segments showed correlation with the initial degree of facial palsy but no significant difference was found between different prognostic groups.
Three-dimensional images contribute to the diagnosis of mucous retention cyst in maxillary sinus.
Donizeth-Rodrigues, Cleomar; Fonseca-Da Silveira, Márcia; Gonçalves-De Alencar, Ana-Helena; Garcia-Santos-Silva, Maria-Alves; Francisco-De-Mendonça, Elismauro; Estrela, Carlos
2013-01-01
To evaluate the detection of mucous retention cyst of maxillary sinus (MRCMS) using panoramic radiography and cone beam computed tomography (CBCT). A digital database with 6,000 panoramic radiographs was reviewed for MRCMS. Images suggestive of MRCMS were detected on 185 radiographs, and patients were located and invited to return for follow-up. Thirty patients returned, and control panoramic radiographs were obtained 6 to 46 months after the initial radiograph. When MRCMS was found on control radiographs, CBCT scans were obtained. Cysts were measured and compared on radiographs and scans. The Wilcoxon, Spearman and Kolmogorov-Smirnov tests were used for statistical analysis. The level of significance was set at 5%. There were statistically significant differences between the two methods (p<0.05): 23 MRCMS detected on panoramic radiographs were confirmed by CBCT, but 5 MRCMS detected on CBCT images had not been identified by panoramic radiography. Eight MRCMS detected on control radiographs were not confirmed by CBCT. MRCMS size differences from initial to control panoramic radiographs and CBCT scans were not statistically significant (p = 0.617 and p = 0.626). The correlation between time and MRCMS size differences was not significant (r = -0.16, p = 0.381). CBCT scanning detects MRCMS more accurately than panoramic radiography.
Stilp, Christian E.; Kluender, Keith R.
2012-01-01
To the extent that sensorineural systems are efficient, redundancy should be extracted to optimize the transmission of information, but perceptual evidence for this has been limited. Stilp and colleagues recently reported efficient coding of a robust correlation (r = .97) among complex acoustic attributes (attack/decay, spectral shape) in novel sounds. Discrimination of sounds orthogonal to the correlation was initially inferior but later comparable to that of sounds obeying the correlation. These effects were attenuated for less-correlated stimuli (r = .54) for reasons that are unclear. Here, statistical properties of correlation among acoustic attributes essential for perceptual organization are investigated. Overall, the simple strength of the principal correlation is inadequate to predict listener performance. Initial superiority of discrimination for statistically consistent sound pairs was relatively insensitive to a decreased physical acoustic/psychoacoustic range of evidence supporting the correlation, and to more frequent presentations of the same orthogonal test pairs. However, an increased range supporting an orthogonal dimension has substantial effects upon perceptual organization. Connectionist simulations and eigenvalues from closed-form calculations of principal components analysis (PCA) reveal that perceptual organization is near-optimally weighted to shared versus unshared covariance in experienced sound distributions. Implications of reduced perceptual dimensionality for speech perception and plausible neural substrates are discussed.
Pace, M.N.; Rosentreter, J.J.; Bartholomay, R.C.
2001-01-01
Idaho State University and the US Geological Survey, in cooperation with the US Department of Energy, conducted a study to determine and evaluate strontium distribution coefficients (Kds) of subsurface materials at the Idaho National Engineering and Environmental Laboratory (INEEL). The Kds were determined to aid in assessing the variability of strontium Kds and their effects on the chemical transport of strontium-90 in the Snake River Plain aquifer system. Data from batch experiments done to determine the strontium Kds of five sediment-infill samples and six standard reference material samples were analyzed using multiple linear regression analysis and the stepwise variable-selection method in the statistical program Statistical Product and Service Solutions (SPSS), to derive an equation of variables that can be used to predict the strontium Kds of sediment-infill samples. The sediment-infill samples were from basalt vesicles and fractures from a selected core at the INEEL; strontium Kds ranged from approximately 201 to 356 ml/g. The standard material samples consisted of clay minerals and calcite. The statistical analyses of the batch-experiment results showed that the amount of strontium in the initial solution, the amount of manganese oxide in the sample material, and the amount of potassium in the initial solution are the most important variables in predicting the strontium Kds of sediment-infill samples.
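The stepwise variable-selection procedure the authors ran in SPSS can be sketched in a few lines. The following forward-selection routine, with an assumed p-to-enter threshold, is a generic illustration of the technique rather than a reproduction of the SPSS algorithm or the study's variables.

```python
import statsmodels.api as sm

def forward_select(X, y, alpha_in=0.05):
    """Forward stepwise selection by p-value.

    A generic sketch of a stepwise procedure; X is a pandas DataFrame of
    candidate predictors, y the measured Kds, and the entry threshold
    alpha_in is an assumption, not the study's setting.
    """
    chosen, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for cand in remaining:
            fit = sm.OLS(y, sm.add_constant(X[chosen + [cand]])).fit()
            pvals[cand] = fit.pvalues[cand]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha_in:
            break                       # no remaining variable earns entry
        chosen.append(best)
        remaining.remove(best)
    return chosen
```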
NASA Astrophysics Data System (ADS)
Zhang, Mi; Guan, Zhidong; Wang, Xiaodong; Du, Shanyi
2017-10-01
The kink band is a typical failure phenomenon for composites under longitudinal compression. In this paper, theoretical analysis and finite element simulation were conducted to analyze the kink angle as well as the compressive strength of composites. The kink angle was considered to be an important characteristic throughout the longitudinal compression process. Three factors, namely the plastic matrix, initial fiber misalignment and rotation due to loading, were considered in the theoretical analysis. In addition, the relationship between kink angle and fiber volume fraction was improved and optimized by theoretical derivation. Finite element models considering stochastic fiber strength and a Drucker-Prager constitutive model for the matrix were developed in ABAQUS to analyze the kink band formation process, and the results corresponded with experiments. Through simulation, the loading and failure procedure can be evidently divided into three stages: an elastic stage, a softening stage, and a fiber break stage. It also shows that the kink band is a result of fiber misalignment and the plastic matrix. Different values of initial fiber misalignment angle, wavelength and fiber volume fraction were considered to explore their effects on compressive strength and kink angle. Results show that compressive strength increases with decreasing initial fiber misalignment angle, decreasing initial fiber misalignment wavelength and increasing fiber volume fraction, while the kink angle decreases in these situations. A statistical orthogonal array was also constructed to rank the influence of these factors. It indicates that the initial fiber misalignment angle has the largest impact on compressive strength and kink angle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labby, Z.
Physicists are often expected to have a solid grounding in experimental design and statistical analysis, sometimes filling in when biostatisticians or other experts are not available for consultation. Unfortunately, graduate education on these topics is seldom emphasized and few opportunities for continuing education exist. Clinical physicists incorporate new technology and methods into their practice based on published literature. A poor understanding of experimental design and analysis could result in inappropriate use of new techniques. Clinical physicists also improve current practice through quality initiatives that require sound experimental design and analysis. Academic physicists with a poor understanding of design and analysis may produce ambiguous (or misleading) results. This can result in unnecessary rewrites, publication rejection, and experimental redesign (wasting time, money, and effort). This symposium will provide a practical review of error and uncertainty, common study designs, and statistical tests. Instruction will primarily focus on practical implementation through examples and will answer questions such as: where would you typically apply the test/design, and where is the test/design typically misapplied (i.e., common pitfalls)? An analysis of error and uncertainty will also be explored using biological studies and associated modeling as a specific use case. Learning Objectives: Understand common experimental testing and clinical trial designs, what questions they can answer, and how to interpret the results. Determine where specific statistical tests are appropriate and identify common pitfalls. Understand how uncertainty and error are addressed in biological testing and associated biological modeling.
NASA Technical Reports Server (NTRS)
Bauman, William H., III
2010-01-01
The AMU conducted an objective analysis of the MesoNAM forecasts compared to observed values from sensors at specified KSC/CCAFS wind towers by calculating the following statistics to verify the performance of the model: 1) bias (mean difference), 2) standard deviation of the bias, 3) root mean square error (RMSE), and 4) a hypothesis test for bias = 0. The 45th Weather Squadron (45 WS) Launch Weather Officers (LWOs) use the MesoNAM to support launch weather operations. However, the actual performance of the model at KSC and CCAFS had not been measured objectively. The analysis compared the MesoNAM forecast winds, temperature and dew point to the observed values from the sensors on the wind towers. The data were stratified by tower sensor, month and onshore/offshore wind direction based on the orientation of the coastline to each tower's location. The model's performance statistics were then calculated for each wind tower based on sensor height and model initialization time. The period of record for the data was based on the operational start of the current MesoNAM in mid-August 2006, so the task began with the first full month of data, September 2006, and ran through May 2010. The analysis of model performance indicated: a) the accuracy decreased as the forecast valid time from model initialization increased, b) there was a diurnal signal in temperature (T) with a cool bias during the late night and a warm bias during the afternoon, c) there was a diurnal signal in dew point (Td) with a low bias during the afternoon and a high bias during the late night, and d) the model parameters at each vertical level most closely matched the observed parameters at the heights closest to those vertical levels. The AMU developed a GUI that consists of a multi-level drop-down menu written in JavaScript embedded within the HTML code. This tool allows the LWOs to easily and efficiently navigate among the charts and spreadsheet files containing the model performance statistics. The objective statistics give the LWOs knowledge of the model's strengths and weaknesses, and the GUI allows quick access to the data, which will result in improved forecasts for operations.
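The four verification statistics listed above are simple to compute once forecasts are paired with tower observations. A minimal sketch, assuming two equal-length arrays rather than the AMU's stratified data files:

```python
import numpy as np
from scipy import stats

def verify_forecasts(forecast, observed):
    """Bias, standard deviation of the bias, RMSE, and a t-test of bias = 0."""
    diff = np.asarray(forecast, float) - np.asarray(observed, float)
    bias = diff.mean()
    sd_bias = diff.std(ddof=1)
    rmse = np.sqrt((diff ** 2).mean())
    t, p = stats.ttest_1samp(diff, 0.0)    # hypothesis test for bias = 0
    return {"bias": bias, "sd_bias": sd_bias, "rmse": rmse, "t": t, "p": p}

# Hypothetical paired forecast/observation temperatures (deg C)
print(verify_forecasts([21.4, 19.8, 25.1, 23.0], [20.9, 20.3, 24.2, 22.5]))
```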
Los Alamos National Laboratory W76 Pit Tube Lifetime Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abeln, Terri G.
2012-04-25
A metallurgical study was requested as part of the Los Alamos National Laboratory (LANL) W76-1 life-extension program (LEP), involving a lifetime analysis of type 304 stainless steel pit tubes subject to repeated bending loads during assembly and disassembly operations at BWXT/Pantex. This initial test phase was completed during the calendar years 2004-2006, and the report was not issued until additional recommended tests could be performed. These tests have not been funded to date, and therefore this report is considered final. Tubes were reportedly fabricated according to Rocky Flats specification P14548 - Seamless Type 304 VIM/VAR Stainless Steel Tubing. Tube diameter was specified as 0.125 inches and wall thickness as 0.028 inches. A heat treat condition is not specified, and the hardness range specification can be characteristic of both 1/8 and 1/4 hard conditions. Properties of all tubes tested were within specification. Metallographic analysis could not conclusively determine a specified limit to the number of bends allowable. A statistical analysis suggests a range of 5-7 bends with a 99.95% confidence limit. See the 'Statistical Analysis' section of this report. The initial phase of this study involved two separate sets of test specimens. The first group was part of an investigation originating in the ESA-GTS [now Gas Transfer Systems (W-7) Group]. After the bend cycle test parameters were chosen (all three required bends subjected to the same number of bend cycles) and the tubes bent, the investigation was transferred to Terri Abeln (Metallurgical Science and Engineering) for analysis. Subsequently, another limited quantity of tubes became available for testing; these were cycled with the same bending fixture but with different test parameters determined by T. Abeln.
Ximenes, Marcos; Triches, Thaisa C; Beltrame, Ana Paula C A; Hilgert, Leandro A; Cardoso, Mariane
2013-01-01
This study evaluated the efficacy of 2 final irrigation solutions for removal of the smear layer (SL) from root canals of primary teeth, using scanning electron microscope (SEM) analysis. Thirty primary molars were selected and a single operator instrumented the canals. The initial irrigation was done with a 1% sodium hypochlorite (NaOCl) solution. After the preparation, the roots were randomly divided into 3 groups for final irrigation: Group 1, 1% NaOCl (n = 10); Group 2, 17% EDTA + 1% NaOCl (n = 10); and Group 3, 17% EDTA + saline solution (n = 10). The roots were prepared for SEM analysis (magnification 1000X). The photomicrographs were independently analyzed by 2 investigators with SEM experience, who attributed scores to each root third in terms of SL removal. Kruskal-Wallis and Mann-Whitney tests revealed that there was no statistical difference between the groups (P = 0.489). However, a statistical difference was found (P < 0.05) in a comparison of root thirds, with the apical third having the worst results. Comparing the thirds within the same group, all canals showed statistical differences between the cervical and apical thirds (P < 0.05). The authors determined that no substance or association of substances was able to completely remove the SL.
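For ordinal scores like these, the two rank-based tests are one-liners in SciPy. A sketch with invented score data standing in for the study's SEM ratings:

```python
from scipy import stats

# Hypothetical smear-layer removal scores per final-irrigation group
g1 = [2, 3, 2, 3, 4, 3, 2, 3, 4, 3]   # 1% NaOCl
g2 = [1, 2, 1, 2, 2, 1, 2, 3, 2, 1]   # 17% EDTA + 1% NaOCl
g3 = [2, 1, 2, 2, 1, 2, 1, 2, 2, 3]   # 17% EDTA + saline

h, p_kw = stats.kruskal(g1, g2, g3)        # omnibus comparison of the 3 groups
u, p_mw = stats.mannwhitneyu(g1, g2)       # pairwise follow-up
print(f"Kruskal-Wallis H={h:.2f}, p={p_kw:.3f}; Mann-Whitney U={u:.1f}, p={p_mw:.3f}")
```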
Network approach towards understanding the crazing in glassy amorphous polymers
NASA Astrophysics Data System (ADS)
Venkatesan, Sudarkodi; Vivek-Ananth, R. P.; Sreejith, R. P.; Mangalapandi, Pattulingam; Hassanali, Ali A.; Samal, Areejit
2018-04-01
We have used molecular dynamics to simulate an amorphous glassy polymer with long chains to study the deformation mechanism of crazing and the associated void statistics. The van der Waals interactions and the entanglements between the chains constituting the polymer play a crucial role in crazing. Thus, we have reconstructed two underlying weighted networks, namely the van der Waals network and the entanglement network, from polymer configurations extracted from the molecular dynamics simulation. Subsequently, we have performed graph-theoretic analysis of the two reconstructed networks to reveal the role they play in the crazing of polymers. Our analysis captured the various stages of crazing through specific trends in the network measures for the van der Waals and entanglement networks. To further corroborate the effectiveness of network analysis in unraveling the underlying physics of crazing in polymers, we have contrasted the trends in network measures for the two networks in the light of the stress-strain behaviour and void statistics during deformation. We find that the van der Waals network plays a crucial role in craze initiation and growth. Although the entanglement network maintains its structure during the craze initiation stage, it progressively weakens and undergoes dynamic changes during the hardening and failure stages of crazing. Our work demonstrates the utility of network theory in quantifying the underlying physics of polymer crazing and widens the scope of applications of network science to the characterization of deformation mechanisms in diverse polymers.
Statistical analysis and isotherm study of uranium biosorption by Padina sp. algae biomass.
Khani, Mohammad Hassan
2011-06-01
The application of response surface methodology is presented for optimizing the removal of U ions from aqueous solutions using Padina sp., a brown marine algal biomass. A Box-Wilson central composite design was employed to assess the individual and interactive effects of the four main parameters (pH, initial uranium concentration in solution, contact time and temperature) on uranium uptake. Response surface analysis showed that the data were adequately fitted by a second-order polynomial model. Analysis of variance showed a high coefficient of determination (R² = 0.9746), and a satisfactory second-order regression model was derived. The optimum pH, initial uranium concentration in solution, contact time and temperature were found to be 4.07, 778.48 mg/L, 74.31 min, and 37.47°C, respectively. The maximized uranium uptake was predicted and experimentally validated. The equilibrium data for biosorption of U onto Padina sp. were well represented by the Langmuir isotherm, giving a maximum monolayer adsorption capacity as high as 376.73 mg/g.
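Fitting the Langmuir isotherm mentioned in the closing sentence is a small nonlinear least-squares problem. A sketch with hypothetical equilibrium data (the paper's raw data are not reproduced here); q_max plays the role of the maximum monolayer capacity:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_eq, q_max, k_l):
    """Langmuir isotherm: monolayer uptake vs. equilibrium concentration."""
    return q_max * k_l * c_eq / (1.0 + k_l * c_eq)

# Hypothetical equilibrium data: concentration (mg/L) and uptake (mg/g)
c_eq = np.array([25, 50, 100, 200, 400, 800], dtype=float)
q = np.array([60, 110, 180, 260, 330, 370], dtype=float)

(q_max, k_l), _ = curve_fit(langmuir, c_eq, q, p0=(400.0, 0.01))
print(f"q_max = {q_max:.1f} mg/g, K_L = {k_l:.4f} L/mg")
```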
Effects of Negative Emotions and Life Events on Women's Missed Miscarriage.
Xing, Huilin; Luo, Yaping; Wang, Shouying
2018-02-01
To investigate the effects of negative emotions and life events on women's missed miscarriage. Overall, 214 women diagnosed with a missed miscarriage by prenatal examination from 2016-2017 in Xiamen Maternal and Child Health Care Hospital, Xiamen, China, were selected as the observation group and compared with 214 women in a control group. The general data of the patients were investigated with self-programmed questionnaires. The Zung Self-Rating Anxiety Scale, the Center for Epidemiologic Studies Depression Scale, and the Life Events Scale for Pregnant Women were used to conduct the study. General data, anxiety, depression and life events were compared between the two groups of patients, and statistically different factors were included in a multivariate logistic regression analysis. There were statistically significant differences in educational level, pre-pregnancy health status, planned pregnancy, pre-pregnancy or gestational gynecological inflammation, and the initiative to obtain knowledge of prenatal and postnatal care between the two groups of pregnant women (P < 0.01); there were also statistically significant differences in the scores for life events, anxiety and depression (P < 0.01). High educational level, good health status before pregnancy and the initiative to obtain knowledge of prenatal and postnatal care were independent protective factors against missed miscarriage in pregnant women, while life events, anxiety and depression were independent risk factors for it. Negative emotions and life events increase the risk of women's missed miscarriage, while high educational level, good health status before pregnancy and the initiative to obtain knowledge of prenatal and postnatal care reduce the risk.
Sources of Error and the Statistical Formulation of M_S:m_b Seismic Event Screening Analysis
NASA Astrophysics Data System (ADS)
Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.
2014-03-01
The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m_b), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m_b greater than 3.5. The Rayleigh wave magnitude (denoted M_S) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to m_b, earthquakes generally have a larger M_S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M_S and m_b that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H0: explosion characteristics.
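The shape of such a test can be sketched in a few lines. Everything numeric below (the explosion-population offset, the variance components) is a placeholder assumption, not an IDC value; the point is only how a model-inadequacy variance term enters the denominator of the statistic:

```python
import numpy as np
from scipy import stats

def ms_mb_screen(ms, mb, sigma_ms, sigma_mb, sigma_model, offset=1.25):
    """Illustrative M_S:m_b screening test.

    H0 ("explosion characteristics") is rejected when M_S - m_b is
    significantly larger than an assumed explosion-population offset.
    The standard error inflates the magnitude variances with a
    model-inadequacy term, in the spirit of the paper's proposal.
    """
    t = (ms - mb - offset) / np.sqrt(sigma_ms**2 + sigma_mb**2 + sigma_model**2)
    p = 1.0 - stats.norm.cdf(t)   # one-sided: large M_S - m_b favors earthquakes
    return t, p

print(ms_mb_screen(ms=3.6, mb=4.1, sigma_ms=0.2, sigma_mb=0.2, sigma_model=0.15))
```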
Reyes-Garcia, Victoria; Ruiz-Mallen, Isabel; Porter-Bolland, Luciana; Garcia-Frapolli, Eduardo; Ellis, Edward A; Mendez, Maria-Elena; Pritchard, Diana J; Sanchez-Gonzalez, María-Consuelo
2013-08-01
Since the 1990s national and international programs have aimed to legitimize local conservation initiatives that might provide an alternative to the formal systems of state-managed or otherwise externally driven protected areas. We used discourse analysis (130 semistructured interviews with key informants) and descriptive statistics (679 surveys) to compare local perceptions of and experiences with state-driven versus community-driven conservation initiatives. We conducted our research in 6 communities in southeastern Mexico. Formalization of local conservation initiatives did not seem to be based on local knowledge and practices. Although interviewees thought community-based initiatives generated less conflict than state-managed conservation initiatives, the community-based initiatives conformed to the biodiversity conservation paradigm that emphasizes restricted use of and access to resources. This restrictive approach to community-based conservation in Mexico, promoted through state and international conservation organizations, increased the area of protected land and had local support but was not built on locally relevant and multifunctional landscapes, a model that community-based conservation is assumed to advance. © 2013 Society for Conservation Biology.
Batch Statistical Process Monitoring Approach to a Cocrystallization Process.
Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A
2015-12-01
Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature, held together by noncovalent bonds. Their main advantages are increased solubility, bioavailability, permeability and stability, while retaining the bioactivity of the active pharmaceutical ingredient. The cocrystallization of furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to track the process trajectory and define control limits. Batches under normal and non-normal operating conditions were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, such as the amount of solvent or the amount of initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
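Control charts built on PCA scores typically monitor Hotelling's T² against a limit derived from the normal-operating-condition (NOC) batches. A minimal sketch, assuming the spectra have already been reduced to PCA scores; the component count and α are generic choices, not the paper's:

```python
import numpy as np
from scipy import stats

def t2_limit(scores_noc, alpha=0.01):
    """Hotelling T^2 control limit from NOC-batch PCA scores."""
    n, k = scores_noc.shape
    lam = scores_noc.var(axis=0, ddof=1)            # per-component score variance
    f = stats.f.ppf(1 - alpha, k, n - k)
    return lam, k * (n - 1) * (n + 1) / (n * (n - k)) * f

def t2_values(scores, lam):
    """T^2 statistic of batches projected onto the NOC model."""
    return ((scores ** 2) / lam).sum(axis=1)

rng = np.random.default_rng(0)
noc_scores = rng.normal(size=(30, 3))               # stand-in for real PCA scores
lam, limit = t2_limit(noc_scores)
print(t2_values(noc_scores, lam).max(), limit)      # flag batches above the limit
```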
Generalising the logistic map through the q-product
NASA Astrophysics Data System (ADS)
Pessoa, R. W. S.; Borges, E. P.
2011-03-01
We investigate a generalisation of the logistic map, x_{n+1} = 1 - a x_n ⊗_{q_map} x_n (-1 ≤ x_n ≤ 1, 0 < a ≤ 2), where ⊗_q stands for a generalisation of the ordinary product, known as the q-product [Borges, E.P., Physica A 340, 95 (2004)]. The usual product, and consequently the usual logistic map, is recovered in the limit q → 1. The tent map is also a particular case, for q_map → ∞. The generalisation of this (and other) algebraic operators has been widely used within the nonextensive statistical mechanics context (see C. Tsallis, Introduction to Nonextensive Statistical Mechanics, Springer, NY, 2009). We focus the analysis on q_map > 1 at the edge of chaos, particularly at the first critical point a_c, which depends on the value of q_map. Bifurcation diagrams, sensitivity to initial conditions, fractal dimension and rate of entropy growth are evaluated at a_c(q_map), and connections with nonextensive statistical mechanics are explored.
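The q-product has a closed form, x ⊗_q y = [x^(1-q) + y^(1-q) - 1]^(1/(1-q)) for positive arguments (truncated at zero), so the generalised map is easy to iterate. A sketch, assuming the convention of taking the q-product of |x_n| with itself as the generalisation of x_n²; the paper's exact sign handling may differ:

```python
import numpy as np

def q_product(x, y, q):
    """q-product of Borges (2004): [x^(1-q) + y^(1-q) - 1]^(1/(1-q)).

    Defined for positive x, y and truncated at zero; q -> 1 recovers
    the ordinary product x*y.
    """
    if q == 1.0:
        return x * y
    if x <= 0.0 or y <= 0.0:
        return 0.0
    base = x ** (1.0 - q) + y ** (1.0 - q) - 1.0
    if base <= 0.0:
        return 0.0
    return base ** (1.0 / (1.0 - q))

def q_logistic_orbit(a, q_map, x0=0.1, n=1000):
    """Iterate x_{n+1} = 1 - a * (|x_n| (x)_q |x_n|), a sketch of the map."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(1.0 - a * q_product(abs(xs[-1]), abs(xs[-1]), q_map))
    return np.array(xs)

print(q_logistic_orbit(a=1.7, q_map=1.5)[:5])   # first few iterates
```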
A statistical framework for evaluating neural networks to predict recurrent events in breast cancer
NASA Astrophysics Data System (ADS)
Gorunescu, Florin; Gorunescu, Marina; El-Darzi, Elia; Gorunescu, Smaranda
2010-07-01
Breast cancer is the second leading cause of cancer deaths in women today. Sometimes, breast cancer can return after primary treatment. A medical diagnosis of recurrent cancer is often a more challenging task than the initial one. In this paper, we investigate the potential contribution of neural networks (NNs) to support health professionals in diagnosing such events. The NN algorithms are tested and applied to two different datasets. An extensive statistical analysis has been performed to verify our experiments. The results show that a simple network structure for both the multi-layer perceptron and radial basis function can produce equally good results, not all attributes are needed to train these algorithms and, finally, the classification performances of all algorithms are statistically robust. Moreover, we have shown that the best performing algorithm will strongly depend on the features of the datasets, and hence, there is not necessarily a single best classifier.
Asymptotic modal analysis and statistical energy analysis
NASA Technical Reports Server (NTRS)
Dowell, Earl H.
1988-01-01
Statistical Energy Analysis (SEA) is defined by considering the asymptotic limit of Classical Modal Analysis, an approach called Asymptotic Modal Analysis (AMA). The general approach is described for both structural and acoustical systems. The theoretical foundation is presented for structural systems, and experimental verification is presented for a structural plate responding to a random force. Work accomplished subsequent to the grant initiation focuses on the acoustic response of an interior cavity (i.e., an aircraft or spacecraft fuselage) with a portion of the wall vibrating in a large number of structural modes. First results were presented at the ASME Winter Annual Meeting in December 1987, and accepted for publication in the Journal of Vibration, Acoustics, Stress and Reliability in Design. It is shown that, asymptotically, as the number of acoustic modes excited becomes large, the pressure level in the cavity becomes uniform except at the cavity boundaries. However, the mean square pressure at the cavity corner, edge and wall is, respectively, 8, 4, and 2 times the value in the cavity interior. It is also shown that when the portion of the wall which is vibrating is near a cavity corner or edge, the response is significantly higher.
NASA Astrophysics Data System (ADS)
Snyder, Morgan E.; Waldron, John W. F.
2018-03-01
The deformation history of the Upper Paleozoic Maritimes Basin, Atlantic Canada, can be partially unraveled by examining fractures (joints, veins, and faults) that are well exposed on the shorelines of the macrotidal Bay of Fundy, in subsurface core, and on image logs. Data were collected from coastal outcrops and well core across the Windsor-Kennetcook subbasin of the Maritimes Basin, using the circular scan-line and vertical scan-line methods in outcrop and FMI image-log analysis of core. We use cross-cutting and abutting relationships between fractures to understand the relative timing of fracturing, followed by a statistical test (Markov chain analysis) to separate groups of fractures. This analysis, previously used in sedimentology, was modified to statistically test the randomness of fracture timing relationships. The results of the Markov chain analysis suggest that fracture initiation can be attributed to movement along the Minas Fault Zone, an E-W fault system that bounds the Windsor-Kennetcook subbasin to the north. Four sets of fractures are related to dextral strike slip along the Minas Fault Zone in the late Paleozoic, and four sets are related to sinistral reactivation of the same boundary in the Mesozoic.
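A Markov-chain-style randomness test on timing relationships boils down to comparing a matrix of observed transition counts (older set → younger set) against the counts expected under independence. A sketch with invented counts; a rigorous embedded-Markov-chain treatment would also handle the structurally zero diagonal, which this plain contingency-table test ignores:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical transition counts: rows index the older fracture set,
# columns the younger set observed to cross-cut or abut it.
transitions = np.array([
    [0, 12,  3,  5],
    [2,  0, 10,  4],
    [1,  3,  0,  9],
    [4,  2,  6,  0],
])

# Compare observed partner counts with those expected under independent
# (random) timing relationships.
chi2, p, dof, expected = chi2_contingency(transitions)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
```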
ERIC Educational Resources Information Center
Eugene, Michael; Carlson, Robert; Hrowal, Heidi; Fahey, John; Ronnei, Jean; Young, Steve; Gomez, Joseph; Thomas, Michael
2007-01-01
This report describes 50 initial statistical indicators developed by the Council of the Great City Schools and its member districts to measure big-city school performance on a range of operational and business functions, and presents data city-by-city on those indicators. The analysis marks the first time that such indicators have been developed…
Almalik, Osama; Nijhuis, Michiel B; van den Heuvel, Edwin R
2014-01-01
Shelf-life estimation usually requires that at least three registration batches are tested for stability at multiple storage conditions. The shelf-life estimates are often obtained by linear regression analysis per storage condition, an approach implicitly suggested by ICH guideline Q1E. A linear regression analysis combining all data from multiple storage conditions was recently proposed in the literature when variances are homogeneous across storage conditions. The combined analysis is expected to perform better than the separate analysis per storage condition, since pooling data would lead to an improved estimate of the variation and higher numbers of degrees of freedom, but this is not evident for shelf-life estimation. Indeed, the two approaches treat the observed initial batch results, the intercepts in the model, and poolability of batches differently, which may eliminate or reduce the expected advantage of the combined approach with respect to the separate approach. Therefore, a simulation study was performed to compare the distribution of simulated shelf-life estimates on several characteristics between the two approaches and to quantify the difference in shelf-life estimates. In general, the combined statistical analysis does estimate the true shelf life more consistently and precisely than the analysis per storage condition, but it did not outperform the separate analysis in all circumstances.
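The ICH-style shelf-life calculation that both approaches feed into is itself simple: regress the attribute on time and find where the one-sided 95% lower confidence bound crosses the specification. A sketch for a single batch and storage condition (the spec limit, data, and time grid are placeholders); the separate-versus-combined question is about which data are pooled before this step:

```python
import numpy as np
from scipy import stats

def shelf_life(months, assay, spec=95.0, alpha=0.05):
    """First time the one-sided lower confidence bound of the fitted
    line crosses the specification (ICH Q1E style), for one batch and
    one storage condition."""
    months = np.asarray(months, float)
    assay = np.asarray(assay, float)
    n = len(months)
    slope, intercept = np.polyfit(months, assay, 1)
    resid = assay - (intercept + slope * months)
    s = np.sqrt(resid @ resid / (n - 2))
    t = stats.t.ppf(1 - alpha, n - 2)
    grid = np.linspace(0.0, 60.0, 601)
    se = s * np.sqrt(1.0 / n + (grid - months.mean()) ** 2
                     / ((months - months.mean()) ** 2).sum())
    lower = intercept + slope * grid - t * se
    below = grid[lower < spec]
    return below[0] if below.size else grid[-1]

print(shelf_life([0, 3, 6, 9, 12, 18], [100.1, 99.6, 99.2, 98.5, 98.2, 97.1]))
```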
Super-stable Poissonian structures
NASA Astrophysics Data System (ADS)
Eliazar, Iddo
2012-10-01
In this paper we characterize classes of Poisson processes whose statistical structures are super-stable. We consider a flow generated by a one-dimensional ordinary differential equation, and an ensemble of particles ‘surfing’ the flow. The particles start from random initial positions, and are propagated along the flow by stochastic ‘wave processes’ with general statistics and general cross correlations. Setting the initial positions to be Poisson processes, we characterize the classes of Poisson processes that render the particles’ positions—at all times, and invariantly with respect to the wave processes—statistically identical to their initial positions. These Poisson processes are termed ‘super-stable’ and facilitate the generalization of the notion of stationary distributions far beyond the realm of Markov dynamics.
Six Degree-of-Freedom Entry Dispersion Analysis for the METEOR Recovery Module
NASA Technical Reports Server (NTRS)
Desai, Prasun N.; Braun, Robert D.; Powell, Richard W.; Engelund, Walter C.; Tartabini, Paul V.
1996-01-01
The present study performs a six degree-of-freedom entry dispersion analysis for the Multiple Experiment Transporter to Earth Orbit and Return (METEOR) mission. METEOR offered the capability of flying a recoverable science package in a microgravity environment. However, since the Recovery Module has no active control system, an accurate determination of the splashdown position is difficult because no opportunity exists to remove any errors. Hence, uncertainties in the initial conditions prior to deorbit burn initiation, during the deorbit burn and exo-atmospheric coast phases, and during atmospheric flight impact the splashdown location. This investigation was undertaken to quantify the impact of the various exo-atmospheric and atmospheric uncertainties. Additionally, a Monte Carlo analysis was performed to statistically assess the splashdown dispersion footprint caused by the multiple mission uncertainties. The Monte Carlo analysis showed that a 3-sigma splashdown dispersion footprint with axes of 43.3 nm (long), -33.5 nm (short), and 10.0 nm (crossrange) can be constructed.
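Once a trajectory simulator has produced a cloud of landing points, the footprint itself is a covariance calculation. A sketch that builds a symmetric 3-sigma dispersion ellipse from synthetic downrange/crossrange misses; the study's footprint is asymmetric in the downrange direction, which this simplification does not capture:

```python
import numpy as np

def dispersion_footprint(downrange, crossrange, n_sigma=3.0):
    """Symmetric n-sigma splashdown ellipse from Monte Carlo miss distances."""
    pts = np.column_stack([downrange, crossrange])
    mean = pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)       # principal dispersion axes
    return mean, n_sigma * np.sqrt(eigval), eigvec

# Synthetic errors standing in for the 6-DOF trajectory simulation output
rng = np.random.default_rng(1)
dr = rng.normal(5.0, 12.0, 3000)               # downrange miss, nm
cr = rng.normal(0.0, 3.3, 3000)                # crossrange miss, nm
center, semi_axes, _ = dispersion_footprint(dr, cr)
print(center, semi_axes)
```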
Demographic and health attributes of the Nahua, initial contact population of the Peruvian Amazon.
Culqui, Dante R; Ayuso-Alvarez, Ana; Munayco, Cesar V; Quispe-Huaman, Carlos; Mayta-Tristán, Percy; Campos, Juan de Mata Donado
2016-01-01
We present the case of the Nahua population of Santa Rosa de Serjali, in the Peruvian Amazon, considered a population of initial contact. This population consists of human groups that long chose to live in isolation but have lately begun living a more sedentary lifestyle in contact with Western populations. There are two fully identified initial-contact groups in Peru: the Nahua and the Nanti. Health statistics for the Nahua are scarce. This study offers an interpretation of demographic and epidemiological indicators of the Nahua people, seeking to identify whether a certain degree of health vulnerability exists. We performed a cross-sectional study; analysis of their health indicators, supplemented by a qualitative analysis of the population, led us to conclude that in 2006 the Nahua remained in a state of health vulnerability.
Evaluating a policing strategy intended to disrupt an illicit street-level drug market.
Corsaro, Nicholas; Brunson, Rod K; McGarrell, Edmund F
2010-12-01
The authors examined a strategic policing initiative implemented in a high-crime Nashville, Tennessee neighborhood, utilizing a mixed-methodological evaluation approach in order to provide (a) a descriptive process assessment of program fidelity; (b) an interrupted time-series analysis relying upon generalized linear models; and (c) in-depth resident interviews. Results revealed that the initiative corresponded with a statistically significant reduction in drug and narcotics incidents as well as perceived changes in neighborhood disorder within the target community. There was less clear evidence, however, of a significant impact on the other outcomes examined. The finding that an intensive crime prevention strategy corresponded with a reduction in specific forms of neighborhood crime illustrates the complex considerations that law enforcement officials face when deciding to implement this type of crime prevention initiative.
Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata
2012-05-01
The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distributions of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N = 90, n = 30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9 and IPS e.max Ceram, using the layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a universal testing machine. The data were analyzed using classical methods (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to Weibull statistics (s, m). In addition, the fracture load results were analyzed according to complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05), by both classical and Weibull statistics. When the data were censored for total fracture only, IPS e.max Ceram presented the lowest fracture load for chipping with both the classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture with chipping (classical distribution) was considered as failure, IPS e.max Ceram did not show a significantly different fracture load for total fracture (μ=1054, σ=110) compared to the other groups (GC Initial ZR: μ=1039, σ=152; VITA VM9: μ=1170, σ=166). According to the Weibull-distributed data, VITA VM9 showed a significantly higher fracture load (s=1228, m=9.4) than the other groups. Both the classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all-ceramic systems based on failure types is essential and brings additional information regarding susceptibility to chipping or total fracture. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
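Estimating the Weibull pair (s, m) reported above from a set of fracture loads is a one-line fit once the location is pinned at zero. A sketch with invented loads; note that properly handling the censored (chipping-only vs. total-fracture) cases requires a censored-likelihood fit, which this minimal version omits:

```python
import numpy as np
from scipy import stats

# Hypothetical fracture loads (N) for one veneering-ceramic group
loads = np.array([820, 905, 960, 1010, 1040, 1075, 1110, 1150, 880, 990,
                  1020, 930, 1090, 970, 1005], dtype=float)

# Two-parameter Weibull: location fixed at zero; estimate the Weibull
# modulus m (shape) and the characteristic load s (scale).
m, loc, s = stats.weibull_min.fit(loads, floc=0)
print(f"m = {m:.1f}, s = {s:.0f} N")
```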
Computational knowledge integration in biopharmaceutical research.
Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim
2003-09-01
An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.
Low-cost digital image processing at the University of Oklahoma
NASA Technical Reports Server (NTRS)
Harrington, J. A., Jr.
1981-01-01
Computer-assisted instruction in remote sensing at the University of Oklahoma involves two separate approaches and is dependent upon initial preprocessing of a LANDSAT computer compatible tape using software developed for an IBM 370/158 computer. In-house generated preprocessing algorithms permit students or researchers to select a subset of a LANDSAT scene for subsequent analysis using either general purpose statistical packages or color graphic image processing software developed for Apple II microcomputers. Procedures for preprocessing the data and for image analysis using either of the two approaches to low-cost LANDSAT data processing are described.
Park, Ji Eun; Han, Kyunghwa; Sung, Yu Sub; Chung, Mi Sun; Koo, Hyun Jung; Yoon, Hee Mang; Choi, Young Jun; Lee, Seung Soo; Kim, Kyung Won; Shin, Youngbin; An, Suah; Cho, Hyo-Min
2017-01-01
Objective: To evaluate the frequency and adequacy of statistical analyses in a general radiology journal when reporting a reliability analysis for a diagnostic test. Materials and Methods: Sixty-three studies of diagnostic test accuracy (DTA) and 36 studies reporting reliability analyses published in the Korean Journal of Radiology between 2012 and 2016 were analyzed. Studies were judged using the methodological guidelines of the Radiological Society of North America-Quantitative Imaging Biomarkers Alliance (RSNA-QIBA) and the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) initiative. DTA studies were evaluated by nine editorial board members of the journal. Reliability studies were evaluated by study reviewers experienced with reliability analysis. Results: Thirty-one (49.2%) of the 63 DTA studies did not include a reliability analysis when deemed necessary. Among the 36 reliability studies, proper statistical methods were used in all (5/5) studies dealing with dichotomous/nominal data, 46.7% (7/15) of studies dealing with ordinal data, and 95.2% (20/21) of studies dealing with continuous data. Statistical methods were described in sufficient detail regarding weighted kappa in 28.6% (2/7) of studies, and regarding the model and assumptions of the intraclass correlation coefficient in 35.3% (6/17) and 29.4% (5/17) of studies, respectively. Reliability parameters were used as if they were agreement parameters in 23.1% (3/13) of studies. Reproducibility and repeatability were used incorrectly in 20% (3/15) of studies. Conclusion: Greater attention to the importance of reporting reliability, thorough description of the related statistical methods, efforts not to neglect agreement parameters, and better use of relevant terminology are necessary.
Gender differences in the initiation of injection drug use among young adults.
Doherty, M C; Garfein, R S; Monterroso, E; Latkin, C; Vlahov, D
2000-09-01
To characterize the circumstances surrounding initiation of injection drug use, data were collected from 229 young, recently initiated injection drug users (IDUs) enrolled through community-based recruitment in Baltimore, Maryland. Gender differences in the pattern of initiation, the number of persons present at initiation, risky injection and sexual behaviors at initiation, as well as behaviors after initiation, were examined. Overall, men and women were statistically similar with respect to age at initiation (19.5 years) and risk behaviors at initiation. While men were initiated by men (77%), women were more often initiated by women (65%), most of whom were friends (75%) or relatives (23%). The percentage of women infected with human immunodeficiency virus (HIV) was slightly greater than that of men, 17% versus 11% (P < .2), whether initiated by a man or a woman. Persons who self-initiated had a lower HIV prevalence and fewer HIV-related risk behaviors. Analysis of variance assessed differences in the HIV risk profiles of female and male IDUs who were initiated by someone of the same sex, of the opposite sex, or who self-initiated. These results indicated that (1) young women and men had similar patterns of injection initiation; (2) most women were initiated by female friends, running counter to earlier literature claims that women were initiated to injection drug use by male sex partners; and (3) women initiated by men had a marginally greater mean score on the HIV risk profile.
Pevnick, Joshua M.; Fuller, Garth; Duncan, Ray; Spiegel, Brennan M. R.
2016-01-01
Background: Personal fitness trackers (PFT) have substantial potential to improve healthcare. Objective: To quantify and characterize early adopters who shared their PFT data with providers. Methods: We used bivariate statistics and logistic regression to compare patients who shared any PFT data vs. patients who did not. Results: A patient portal was used to invite 79,953 registered portal users to share their data. Of 66,105 users included in our analysis, 499 (0.8%) uploaded data during an initial 37-day study period. Bivariate and regression analysis showed that early adopters were more likely than non-adopters to be younger, male, white, health system employees, and to have higher BMIs. Neither comorbidities nor utilization predicted adoption. Conclusion: Our results demonstrate that patients had little intrinsic desire to share PFT data with their providers, and suggest that patients most at risk for poor health outcomes are least likely to share PFT data. Marketing, incentives, and/or cultural change may be needed to induce such data-sharing.
Brown, James G; Joyce, Kerry E; Stacey, Dawn; Thomson, Richard G
2015-05-01
Efficacy of patient decision aids (PtDAs) may be influenced by trial participants' identity either as patients seeking to benefit personally from involvement or as volunteers supporting the research effort. To determine if study characteristics indicative of participants' trial identity might influence PtDA efficacy. We undertook exploratory subgroup meta-analysis of the 2011 Cochrane review of PtDAs, including trials that compared PtDA with usual care for treatment decisions. We extracted data on whether participants initiated the care pathway, setting, practitioner interactions, and 6 outcome variables (knowledge, risk perception, decisional conflict, feeling informed, feeling clear about values, and participation). The main subgroup analysis categorized trials as "volunteerism" or "patienthood" on the basis of whether participants initiated the care pathway. A supplementary subgroup analysis categorized trials on the basis of whether any volunteerism factors were present (participants had not initiated the care pathway, had attended a research setting, or had a face-to-face interaction with a researcher). Twenty-nine trials were included. Compared with volunteerism trials, pooled effect sizes were higher in patienthood trials (where participants initiated the care pathway) for knowledge, decisional conflict, feeling informed, feeling clear, and participation. The subgroup difference was statistically significant for knowledge only (P = 0.03). When trials were compared on the basis of whether volunteerism factors were present, knowledge was significantly greater in patienthood trials (P < 0.001), but there was otherwise no consistent pattern of differences in effects across outcomes. There is a tendency toward greater PtDA efficacy in trials in which participants initiate the pathway of care. Knowledge acquisition appears to be greater in trials where participants are predominantly patients rather than volunteers. © The Author(s) 2015.
Padula, William V; Mishra, Manish K; Weaver, Christopher D; Yilmaz, Taygan; Splaine, Mark E
2012-06-01
To demonstrate complementary results of regression and statistical process control (SPC) chart analyses for hospital-acquired pressure ulcers (HAPUs), and to identify possible links between changes and opportunities for improvement between hospital microsystems and macrosystems. Ordinary least squares and panel data regression were applied to retrospective hospital billing data, and SPC charts to prospective patient records, for a US tertiary-care facility (2004-2007). A prospective cohort of hospital inpatients at risk for HAPUs was the study population. There were 337 HAPU incidences hospital-wide among 43,844 inpatients. A probit regression model related age, gender and length of stay to HAPU incidence (pseudo-R² = 0.096). Panel data analysis determined that for each additional day in the hospital, there was a 0.28% increase in the likelihood of HAPU incidence. A p-chart of HAPU incidence showed a mean incidence rate of 1.17% remaining in statistical control. A t-chart showed that the average time between events for the last 25 HAPUs was 13.25 days. There was one 57-day period between two incidences during the observation period. A p-chart addressing Braden scale assessments showed that 40.5% of all patients were risk-stratified for HAPUs upon admission. SPC charts complement standard regression analysis. SPC amplifies patient outcomes at the microsystem level and is useful for guiding quality improvement. Macrosystems should monitor effective quality improvement initiatives in microsystems and aid the spread of successful initiatives to other microsystems, followed by system-wide analysis with regression. Although the HAPU incidence in this study is below the national mean, there is still room to improve HAPU incidence in this hospital setting, since 0% incidence is theoretically achievable. Further assessment of pressure ulcer incidence could illustrate improvement in the quality of care and prevent HAPUs.
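The p-chart referenced here has a standard construction: a pooled center line with 3-sigma limits that widen or narrow with each month's patient count. A minimal sketch, with monthly counts as assumed inputs:

```python
import numpy as np

def p_chart_limits(events, patients):
    """Center line and 3-sigma limits for a p-chart with varying
    subgroup sizes (e.g., monthly HAPU counts over monthly censuses)."""
    events = np.asarray(events, float)
    patients = np.asarray(patients, float)
    p_bar = events.sum() / patients.sum()          # pooled incidence rate
    sigma = np.sqrt(p_bar * (1.0 - p_bar) / patients)
    ucl = p_bar + 3.0 * sigma                      # limits vary with n
    lcl = np.clip(p_bar - 3.0 * sigma, 0.0, None)
    return p_bar, lcl, ucl

# Hypothetical monthly counts
print(p_chart_limits([9, 12, 7, 11], [900, 1040, 870, 1010]))
```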
Jung, Kyung-Won; Ahn, Kyu-Hong
2016-01-01
The present study focuses on the application of recovered coagulant (RC), obtained by acidification of drinking water treatment residuals, both for adjusting the initial pH and as a coagulant aid in electrocoagulation. To do this, real cotton textile wastewater was used as the target pollutant, and decolorization and chemical oxygen demand (COD) removal efficiency were monitored. A preliminary test indicated that a stainless steel electrode combined with RC significantly accelerated decolorization and COD removal, by about 52% and 56%, respectively, even at an operating time of 5 min, whereas a single electrocoagulation system required at least 40 min to attain similar removal performance. Subsequently, the interactive effect of three independent variables (applied voltage, initial pH, and reaction time) on the response variables (decolorization and COD removal) was evaluated, and these parameters were statistically optimized using response surface methodology. Analysis of variance showed high coefficients of determination (decolorization, R² = 0.9925; COD removal, R² = 0.9973) and satisfactory second-order polynomial regression models. Average decolorization and COD removal of 89.52% and 94.14%, respectively, were achieved, corresponding to 97.8% and 98.1% of the predicted values under the statistically optimized conditions. The results suggest that the RC effectively played the dual role of adjusting the initial pH and aiding coagulation in the electrocoagulation process.
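Response-surface fits like the one described are full second-order polynomial models in the design factors. A sketch using statsmodels with an invented Box-Behnken-style design; the column names and response values are placeholders, not the study's data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical designed runs: applied voltage V, initial pH, reaction time t,
# and measured decolorization (%).
df = pd.DataFrame({
    "V":  [10, 10, 20, 20, 10, 10, 20, 20, 15, 15, 15, 15, 15, 15, 15],
    "pH": [4,  8,  4,  8,  6,  6,  6,  6,  4,  4,  8,  8,  6,  6,  6],
    "t":  [10, 10, 10, 10, 5,  15, 5,  15, 5,  15, 5,  15, 10, 10, 10],
    "decol": [62, 55, 78, 80, 58, 72, 75, 89, 60, 74, 57, 70, 88, 90, 87],
})

# Full second-order polynomial: linear, interaction, and quadratic terms.
model = smf.ols(
    "decol ~ V + pH + t + V:pH + V:t + pH:t + I(V**2) + I(pH**2) + I(t**2)",
    data=df,
).fit()
print(model.rsquared)   # analogue of the R^2 values reported in the abstract
```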
NASA Astrophysics Data System (ADS)
Ridgley, James Alexander, Jr.
This dissertation is an exploratory quantitative analysis of various independent variables to determine their effect on the professional longevity (years of service) of high school science teachers in the state of Florida for the academic years 2011-2012 to 2013-2014. Data are collected from the Florida Department of Education, National Center for Education Statistics, and the National Assessment of Educational Progress databases. The following research hypotheses are examined: H1 - There are statistically significant differences in Level 1 (teacher variables) that influence the professional longevity of a high school science teacher in Florida. H2 - There are statistically significant differences in Level 2 (school variables) that influence the professional longevity of a high school science teacher in Florida. H3 - There are statistically significant differences in Level 3 (district variables) that influence the professional longevity of a high school science teacher in Florida. H4 - When tested in a hierarchical multiple regression, there are statistically significant differences in Level 1, Level 2, or Level 3 that influence the professional longevity of a high school science teacher in Florida. The professional longevity of a Floridian high school science teacher is the dependent variable. The independent variables are: (Level 1) a teacher's sex, age, ethnicity, earned degree, salary, number of schools taught in, migration count, and various years of service in different areas of education; (Level 2) a school's geographic location, residential population density, average class size, charter status, and SES; and (Level 3) a school district's average SES and average spending per pupil. Statistical analyses using exploratory multiple linear regressions (MLRs) and a hierarchical multiple regression (HMR) are used to test the research hypotheses. The final results of the HMR analysis show a teacher's age, salary, earned degree (unknown, associate, and doctorate), and ethnicity (Hispanic and Native Hawaiian/Pacific Islander); a school's charter status; and a school district's average SES are all significant predictors of a Florida high school science teacher's professional longevity. Although statistically significant in the initial exploratory MLR analyses, a teacher's ethnicity (Asian and Black), a school's geographic location (city and rural), and a school's SES are not statistically significant in the final HMR model.
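Hierarchical multiple regression enters predictor blocks in a fixed order (here Level 1 teacher variables, then Level 2 school variables, then Level 3 district variables) and tests the incremental R² of each block. A sketch with synthetic data and placeholder column names standing in for the dissertation's variables:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Synthetic stand-in data; columns are hypothetical proxies for the
# dissertation's Level 1 (teacher), Level 2 (school), Level 3 (district) variables.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "years_service": rng.gamma(4.0, 3.0, n),
    "age": rng.normal(42, 10, n),
    "salary": rng.normal(48000, 6000, n),
    "degree": rng.choice(["BA", "MA", "PhD"], n),
    "charter": rng.choice([0, 1], n),
    "district_ses": rng.normal(0, 1, n),
})

# Enter blocks in order: Level 1, then Level 2, then Level 3.
m1 = smf.ols("years_service ~ age + salary + C(degree)", df).fit()
m2 = smf.ols("years_service ~ age + salary + C(degree) + C(charter)", df).fit()
m3 = smf.ols("years_service ~ age + salary + C(degree) + C(charter) + district_ses", df).fit()

print(m1.rsquared, m2.rsquared, m3.rsquared)   # incremental R^2 per block
print(anova_lm(m1, m2, m3))                    # F-tests for each added block
```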
Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.
2015-01-01
Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework. PMID:26681992
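The recycling step described in this abstract can be sketched in a few lines: existing posterior draws are reweighted toward a new joint target with self-normalized importance weights. The densities and numbers below are toy stand-ins, not the influenza analysis.

```python
# Minimal sketch (assuming posterior draws from stratified analyses are
# available as arrays) of reusing realizations via importance weights, in the
# spirit of the reweighting idea described above.
import numpy as np

def importance_reweight(draws, log_target, log_proposal):
    """Reweight existing draws toward a new (joint) unnormalized target."""
    logw = log_target(draws) - log_proposal(draws)
    logw -= logw.max()                 # stabilize before exponentiating
    w = np.exp(logw)
    w /= w.sum()                       # self-normalized importance weights
    ess = 1.0 / np.sum(w**2)           # effective sample size diagnostic
    return w, ess

# Toy example: draws from N(0, 1) reweighted toward N(0.5, 1).
rng = np.random.default_rng(0)
draws = rng.normal(0.0, 1.0, size=5000)
w, ess = importance_reweight(
    draws,
    log_target=lambda x: -0.5 * (x - 0.5) ** 2,
    log_proposal=lambda x: -0.5 * x ** 2,
)
print(np.sum(w * draws), ess)          # reweighted posterior mean, ESS
```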
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, James W.; Liu, Da-Jiang
We develop statistical mechanical models amenable to analytic treatment for the dissociative adsorption of O2 at hollow sites on fcc(100) metal surfaces. The models incorporate exclusion of nearest-neighbor pairs of adsorbed O. However, corresponding simple site-blocking models, where adsorption requires a large ensemble of available sites, exhibit an anomalously fast initial decrease in sticking. Thus, in addition to blocking, our models also incorporate more facile adsorption via orientational steering and funneling dynamics (features supported by ab initio Molecular Dynamics studies). Behavior for equilibrated adlayers is distinct from those with finite adspecies mobility. We focus on the low-temperature limited-mobility regime where analysis of the associated master equations readily produces exact results for both short- and long-time behavior. Kinetic Monte Carlo simulation is also utilized to provide a more complete picture of behavior. These models capture both the initial decrease and the saturation of the experimentally observed sticking versus coverage, as well as features of non-equilibrium adlayer ordering as assessed by surface-sensitive diffraction.
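A stripped-down illustration of the site-blocking ingredient above is random sequential adsorption on a square lattice with nearest-neighbor exclusion. This toy deliberately omits the dissociative (pairwise) adsorption and the steering/funneling dynamics of the actual models; it only shows how blocking alone depresses sticking as coverage grows.

```python
# Toy random sequential adsorption with nearest-neighbor exclusion on a
# periodic square lattice; a windowed sticking coefficient is tracked versus
# coverage. Simplified illustration only, not the paper's full model.
import numpy as np

L = 100
occ = np.zeros((L, L), dtype=bool)
rng = np.random.default_rng(1)

attempts = successes = 0
coverage, sticking = [], []
for _ in range(200_000):
    i, j = rng.integers(0, L, size=2)
    attempts += 1
    nn_free = not (occ[(i + 1) % L, j] or occ[(i - 1) % L, j] or
                   occ[i, (j + 1) % L] or occ[i, (j - 1) % L])
    if not occ[i, j] and nn_free:      # adsorb only on an empty, unblocked site
        occ[i, j] = True
        successes += 1
    if attempts == 2000:               # windowed sticking estimate
        coverage.append(occ.mean())
        sticking.append(successes / attempts)
        attempts = successes = 0

print(f"sticking: first window {sticking[0]:.3f}, last window {sticking[-1]:.3f}")
print(f"final coverage ~ {occ.mean():.3f}")   # approaches the jamming limit
```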
NASA Astrophysics Data System (ADS)
Yoshida, Yuki; Karakida, Ryo; Okada, Masato; Amari, Shun-ichi
2017-04-01
Weight normalization, a newly proposed optimization method for neural networks by Salimans and Kingma (2016), decomposes the weight vector of a neural network into a radial length and a direction vector, and the decomposed parameters follow their steepest descent update. They reported that learning with the weight normalization achieves better converging speed in several tasks including image recognition and reinforcement learning than learning with the conventional parameterization. However, it remains theoretically uncovered how the weight normalization improves the converging speed. In this study, we applied a statistical mechanical technique to analyze on-line learning in single layer linear and nonlinear perceptrons with weight normalization. By deriving order parameters of the learning dynamics, we confirmed quantitatively that weight normalization realizes fast converging speed by automatically tuning the effective learning rate, regardless of the nonlinearity of the neural network. This property is realized when the initial value of the radial length is near the global minimum; therefore, our theory suggests that it is important to choose the initial value of the radial length appropriately when using weight normalization.
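The reparameterization itself is compact enough to sketch. Below is a minimal gradient-descent illustration of weight normalization, w = g·v/||v||, on a toy quadratic objective; the learning rate, dimension, and target are arbitrary choices, not values from the paper.

```python
# Sketch of the weight-normalization reparameterization of Salimans & Kingma
# (2016): w = (g / ||v||) * v, with gradient descent on (g, v) instead of w.
# Toy objective 0.5 * ||w - w_star||^2; all constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d = 10
w_star = rng.normal(size=d)          # toy target weights

g, v, lr = 2.0, rng.normal(size=d), 0.05
for _ in range(500):
    nv = np.linalg.norm(v)
    w = g * v / nv                   # weight-normalized parameterization
    grad_w = w - w_star              # gradient of the toy objective w.r.t. w
    grad_g = grad_w @ v / nv         # chain rule through w = g * v / ||v||
    grad_v = (g / nv) * (grad_w - (grad_g / nv) * v)
    g -= lr * grad_g
    v -= lr * grad_v

print(np.linalg.norm(g * v / np.linalg.norm(v) - w_star))   # -> near 0
```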
NASA Astrophysics Data System (ADS)
Saez, Núria; Ruiz, Xavier; Pallarés, Jordi; Shevtsova, Valentina
2013-04-01
An accelerometric record from the IVIDIL experiment (ESA Columbus module) has been exhaustively studied. The analysis involved the determination of basic statistical properties such as the auto-correlation and the power spectrum (second-order statistical analyses). Taking into account the shape of the associated histograms, we also address another important question, the non-Gaussian nature of the time series, using the bispectrum and the bicoherence of the signals. Building on the above-mentioned results, a computational model of a high-temperature shear cell has been developed. A scalar indicator has been used to quantify the accuracy of the diffusion coefficient measurements in the case of binary mixtures involving photovoltaic silicon or liquid Al-Cu binary alloys. Three different initial arrangements have been considered: the so-called interdiffusion, centred thick layer and lateral thick layer. Results allow us to conclude that, under the conditions of the present work, the diffusion coefficient is insensitive to the environmental conditions, that is to say, accelerometric disturbances and initial shear cell arrangement.
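The second-order statistics named above are straightforward to compute; the sketch below does so for a synthetic stand-in signal (the IVIDIL record and its sampling rate are not reproduced here, so all numbers are purely illustrative).

```python
# Sample autocorrelation and Welch power spectrum of a synthetic signal
# standing in for an accelerometric record; sampling rate is assumed.
import numpy as np
from scipy import signal

fs = 100.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)
x = np.sin(2 * np.pi * 0.8 * t) + 0.5 * rng.normal(size=t.size)

x0 = x - x.mean()
acf = np.correlate(x0, x0, mode="full")[x0.size - 1:] / (x0.var() * x0.size)

f, Pxx = signal.welch(x, fs=fs, nperseg=1024)   # power spectral density
print(acf[:5])                               # autocorrelation at short lags
print(f[np.argmax(Pxx)])                     # dominant frequency (~0.8 Hz)
```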
Alsaggaf, Rotana; O'Hara, Lyndsay M; Stafford, Kristen A; Leekha, Surbhi; Harris, Anthony D
2018-02-01
OBJECTIVE A systematic review of quasi-experimental studies in the field of infectious diseases was published in 2005. The aim of this study was to assess improvements in the design and reporting of quasi-experiments 10 years after the initial review. We also aimed to report the statistical methods used to analyze quasi-experimental data. DESIGN Systematic review of articles published from January 1, 2013, to December 31, 2014, in 4 major infectious disease journals. METHODS Quasi-experimental studies focused on infection control and antibiotic resistance were identified and classified based on 4 criteria: (1) type of quasi-experimental design used, (2) justification of the use of the design, (3) use of correct nomenclature to describe the design, and (4) statistical methods used. RESULTS Of 2,600 articles, 173 (7%) featured a quasi-experimental design, compared to 73 of 2,320 articles (3%) in the previous review (P<.01). Moreover, 21 articles (12%) utilized a study design with a control group; 6 (3.5%) justified the use of a quasi-experimental design; and 68 (39%) identified their design using the correct nomenclature. In addition, 2-group statistical tests were used in 75 studies (43%); 58 studies (34%) used standard regression analysis; 18 (10%) used segmented regression analysis; 7 (4%) used standard time-series analysis; 5 (3%) used segmented time-series analysis; and 10 (6%) did not utilize statistical methods for comparisons. CONCLUSIONS While some progress occurred over the decade, it is crucial to continue improving the design and reporting of quasi-experimental studies in the fields of infection control and antibiotic resistance to better evaluate the effectiveness of important interventions. Infect Control Hosp Epidemiol 2018;39:170-176.
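Among the methods tallied above, segmented regression of an interrupted time series is worth a small illustration: level and slope are both allowed to change at the intervention point. The data, column names, and effect sizes below are invented for the example.

```python
# Illustrative segmented (interrupted time-series) regression: the model has a
# baseline slope, a level change at the intervention, and a slope change after
# it. All data here are simulated; names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
months = np.arange(48)
post = (months >= 24).astype(int)
rate = 10 - 0.05 * months - 2.0 * post - 0.1 * post * (months - 24)
df = pd.DataFrame({
    "month": months,
    "post": post,
    "months_after": np.clip(months - 24, 0, None),
    "infection_rate": rate + rng.normal(0, 0.5, months.size),
})

fit = smf.ols("infection_rate ~ month + post + months_after", data=df).fit()
print(fit.params)   # baseline slope, level change, slope change
```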
NASA Astrophysics Data System (ADS)
Moguilnaya, T.; Suminov, Y.; Botikov, A.; Ignatov, S.; Kononenko, A.; Agibalov, A.
2017-01-01
We developed a new automatic method that combines the method of forced luminescence and stimulated Brillouin scattering. This method is used for monitoring pathogens, genetically modified products and nanostructured materials in colloidal solution. We carried out the statistical spectral analysis of pathogens, genetically modified soy and nano-particles of silver in water from different regions in order to determine the statistical errors of the method. We studied the spectral characteristics of these objects in water to perform the initial identification with 95% probability. These results were used to create a model of a device for monitoring pathogenic organisms and a working model of a device for determining genetically modified soy in meat.
Arambasić, M B; Jatić-Slavković, D
2004-05-01
This paper presents the application of a regression analysis program and a program for comparing linear regressions (a modified method for one-way analysis of variance), written in the BASIC programming language, to the determination of the content of Diclofenac-Sodium (active ingredient in DIKLOFEN injections, ampoules of 75 mg/3 ml). Stability testing of Diclofenac-Sodium was done by the isothermal method of accelerated aging at 4 different temperatures (30 degrees, 40 degrees, 50 degrees and 60 degrees C) as a function of time (4 different durations of treatment: 0-155, 0-145, 0-74 and 0-44 days). The decrease in stability (decrease in the mean value of the content of Diclofenac-Sodium, in %) at different temperatures as a function of time can be described by a linear dependence. From the regression equation values, the times are estimated in which the content of Diclofenac-Sodium (in %) will decrease by 10% of the initial value. These times are: at 30 degrees C 761.02 days, at 40 degrees C 397.26 days, at 50 degrees C 201.96 days and at 60 degrees C 58.85 days. The estimated times (in days) in which the mean value of Diclofenac-Sodium content (in %) will fall by 10% of the initial value, as a function of temperature, are most suitably described by a 3rd order parabola. Based on the parameter values which describe the 3rd order parabola, the time was estimated in which the Diclofenac-Sodium content mean value (in %) will fall by 10% of the initial one at average ambient temperatures of 20 degrees C and 25 degrees C. These times are: 1409.47 days (20 degrees C) and 1042.39 days (25 degrees C). Based on the value of Fisher's coefficient (F), the comparison of the trend of Diclofenac-Sodium content (in %) under the influence of different temperatures as a function of time shows that, depending on the temperature value, there is: a statistically very significant difference (P < .05) at 50 degrees C and lower toward 60 degrees C; a statistically probably significant difference (P > 0.01) at 40 degrees C and lower toward 50 degrees C; and no statistically significant difference (P > 0.05) at 30 degrees C toward 40 degrees C.
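The core calculation, fitting a straight line to content versus time at one temperature and solving for the time at which 10% of the initial content is lost, is easy to reproduce. The data points below are invented, not the paper's measurements.

```python
# Hedged sketch: linear degradation fit at one temperature and the estimated
# time to a 10% loss of initial content. Data points are invented.
import numpy as np

days = np.array([0.0, 30, 60, 90, 120, 155])
content = np.array([100.0, 98.9, 97.6, 96.5, 95.2, 93.9])   # % of initial

slope, intercept = np.polyfit(days, content, 1)   # content ~ intercept + slope*t
t10 = -0.1 * intercept / slope                    # time to reach 90% of initial
print(f"estimated time to 10% loss: {t10:.0f} days")
```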
Insights into Corona Formation through Statistical Analyses
NASA Technical Reports Server (NTRS)
Glaze, L. S.; Stofan, E. R.; Smrekar, S. E.; Baloga, S. M.
2002-01-01
Statistical analysis of an expanded database of coronae on Venus indicates that the populations of Type 1 (with fracture annuli) and 2 (without fracture annuli) corona diameters are statistically indistinguishable, and therefore we have no basis for assuming different formation mechanisms. Analysis of the topography and diameters of coronae shows that coronae that are depressions, rimmed depressions, and domes tend to be significantly smaller than those that are plateaus, rimmed plateaus, or domes with surrounding rims. This is consistent with the model of Smrekar and Stofan and inconsistent with predictions of the spreading drop model of Koch and Manga. The diameter range for domes, the initial stage of corona formation, provides a broad constraint on the buoyancy of corona-forming plumes. Coronae are only slightly more likely to be topographically raised than depressions, with Type 1 coronae most frequently occurring as rimmed depressions and Type 2 coronae most frequently occurring with flat interiors and raised rims. Most Type 1 coronae are located along chasmata systems or fracture belts, while Type 2 coronae are found predominantly as isolated features in the plains. Coronae at hotspot rises tend to be significantly larger than coronae in other settings, consistent with a hotter upper mantle at hotspot rises and their active state.
Gait patterns for crime fighting: statistical evaluation
NASA Astrophysics Data System (ADS)
Sulovská, Kateřina; Bělašková, Silvie; Adámek, Milan
2013-10-01
Criminality has been omnipresent throughout human history. Modern technology brings novel opportunities for the identification of a perpetrator. One of these opportunities is the analysis of video recordings, which may be taken during the crime itself or before/after the crime. Video analysis can be classed among identification analyses, i.e., identification of a person via external characteristics. Bipedal locomotion analysis focuses on human movement on the basis of anatomical-physiological features. Nowadays, human gait is tested by many laboratories to learn whether identification via bipedal locomotion is possible or not. The aim of our study is to use 2D components out of 3D data from the VICON Mocap system for deep statistical analyses. This paper introduces recent results of a fundamental study focused on various gait patterns under different conditions. The study contains data from 12 participants. Curves obtained from these measurements were sorted, averaged and statistically tested to estimate the stability and distinctiveness of this biometric. Results show satisfactory distinctness of some chosen points, while others do not embody significant differences. However, the results presented in this paper are from the initial phase of deeper and more exacting analyses of gait patterns under different conditions.
Ten Eyck, Raymond P; Tews, Matthew; Ballester, John M; Hamilton, Glenn C
2010-06-01
To determine the impact of simulation-based instruction on student performance in the role of emergency department resuscitation team leader. A randomized, single-blinded, controlled study using an intention to treat analysis. Eighty-three fourth-year medical students enrolled in an emergency medicine clerkship were randomly allocated to two groups differing only by instructional format. Each student individually completed an initial simulation case, followed by a standardized curriculum of eight cases in either group simulation or case-based group discussion format before a second individual simulation case. A remote coinvestigator measured eight objective performance end points using digital recordings of all individual simulation cases. McNemar chi2, Pearson correlation, repeated measures multivariate analysis of variance, and follow-up analysis of variance were used for statistical evaluation. Sixty-eight students (82%) completed both initial and follow-up individual simulations. Eight students were lost from the simulation group and seven from the discussion group. The mean postintervention case performance was significantly better for the students allocated to simulation instruction compared with the group discussion students for four outcomes including a decrease in mean time to (1) order an intravenous line; (2) initiate cardiac monitoring; (3) order initial laboratory tests; and (4) initiate blood pressure monitoring. Paired comparisons of each student's initial and follow-up simulations demonstrated significant improvement in the same four areas, in mean time to order an abdominal radiograph and in obtaining an allergy history. A single simulation-based teaching session significantly improved student performance as a team leader. Additional simulation sessions provided further improvement compared with instruction provided in case-based group discussion format.
Chambers, Georgina M; Randall, Sean; Hoang, Van Phuong; Sullivan, Elizabeth A; Highet, Nicole; Croft, Maxine; Mihalopoulos, Cathrine; Morgan, Vera A; Reilly, Nicole; Austin, Marie-Paule
2016-03-01
To evaluate the impact of the National Perinatal Depression Initiative on access to Medicare services for women at risk of perinatal mental illness. Retrospective cohort study using difference-in-difference analytical methods to quantify the impact of the National Perinatal Depression Initiative policies on Medicare Benefits Schedule mental health usage by Australian women giving birth between 2006 and 2010. A random sample of women of reproductive age enrolled in Medicare who had not given birth were used as controls. The main outcome measures were the proportions of women giving birth each month who accessed a Medicare Benefits Schedule mental health item during the perinatal period (pregnancy through to the end of the first postnatal year) before and after the introduction of the National Perinatal Depression Initiative. The proportion of women giving birth who accessed at least one mental health item during the perinatal period increased from 88 to 141 per 1000 between 2007 and 2010. The difference-in-difference analysis showed that while there was an overall increase in Medicare Benefits Schedule mental health item access as a result of the National Perinatal Depression Initiative, this did not reach statistical significance. However, the National Perinatal Depression Initiative was found to significantly increase access in subpopulations of women, particularly those aged under 25 and over 34 years living in major cities. In the 2 years following its introduction, the National Perinatal Depression Initiative was found to have increased access to Medicare funded mental health services in particular groups of women. However, an overall increase across all groups did not reach statistical significance. Further studies are needed to assess the impact of the National Perinatal Depression Initiative on women during childbearing years, including access to tertiary care, the cost-effectiveness of the initiative, and mental health outcomes. It is recommended that new mental health policy initiatives incorporate a planned strategic approach to evaluation, which includes sufficient follow-up to assess the impact of public health strategies. © The Royal Australian and New Zealand College of Psychiatrists 2015.
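The difference-in-difference logic here compares the change in service use among birthing women with the change among non-birthing controls over the same period; the interaction term carries the policy effect. The sketch below is a generic illustration on simulated data, with invented variable names and effect sizes, not the study's Medicare records.

```python
# Generic difference-in-differences sketch: outcome ~ group * period, where
# the interaction coefficient is the DiD estimate. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 20_000
gave_birth = rng.integers(0, 2, n)          # treated group indicator
post = rng.integers(0, 2, n)                # post-policy period indicator
logit_p = -2.0 + 0.3 * gave_birth + 0.4 * post + 0.25 * gave_birth * post
y = rng.random(n) < 1 / (1 + np.exp(-logit_p))
df = pd.DataFrame({"accessed_item": y.astype(int),
                   "gave_birth": gave_birth, "post": post})

fit = smf.logit("accessed_item ~ gave_birth * post", data=df).fit(disp=0)
print(fit.params["gave_birth:post"])        # DiD estimate (log-odds scale)
```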
An evaluation of various methods of treatment for Legg-Calvé-Perthes disease.
Wang, L; Bowen, J R; Puniak, M A; Guille, J T; Glutting, J
1995-05-01
An analysis of 5 methods of treatment for Legg-Calvé-Perthes disease was done on 124 patients with 141 affected hips. Before treatment, all groups were statistically similar concerning initial Mose measurement, age at onset of the disease, gender, and Catterall class. Treatments included the Scottish Rite orthosis (41 hips), nonweight bearing and exercises (41 hips), Petrie cast (29 hips), femoral varus osteotomy (15 hips), or Salter osteotomy (15 hips). Hips treated by the Scottish Rite orthosis had a significantly worse Mose measurement across time interaction (repeated measures analysis of variance, post hoc analyses, p < 0.05). For the other 4 treatment methods, there was no statistically different change. At followup, the Mose measurements for hips treated with the Scottish Rite orthosis were significantly worse than those for hips treated by nonweight bearing and exercises, Petrie cast, varus osteotomy, or Salter osteotomy (repeated measures analysis of variance, post hoc analyses, p < 0.05). There was, however, no significant difference in the distribution of hips according to the Stulberg et al classification at the last followup.
NASA Technical Reports Server (NTRS)
Duffy, S. F.; Hu, J.; Hopkins, D. A.
1995-01-01
The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials) the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
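The Weibull-based probabilistic approach mentioned above lends itself to a compact Monte Carlo illustration: sample a random applied stress and a Weibull-distributed strength, then count failures. All parameter values below are invented for the sketch.

```python
# Monte Carlo sketch of a probabilistic failure analysis for a brittle
# component with a two-parameter Weibull strength model. Parameters invented.
import numpy as np

rng = np.random.default_rng(6)
m, sigma0 = 10.0, 400.0                     # Weibull modulus, scale (MPa)
applied = rng.normal(250.0, 25.0, 100_000)  # random applied stress (MPa)

strength = sigma0 * rng.weibull(m, applied.size)   # random material strength
pf_mc = np.mean(strength < applied)                # Monte Carlo failure prob.
print(f"P_f ~ {pf_mc:.4f}")
```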
Understanding Evaluation of Learning Support in Mathematics and Statistics
ERIC Educational Resources Information Center
MacGillivray, Helen; Croft, Tony
2011-01-01
With rapid and continuing growth of learning support initiatives in mathematics and statistics found in many parts of the world, and with the likelihood that this trend will continue, there is a need to ensure that robust and coherent measures are in place to evaluate the effectiveness of these initiatives. The nature of learning support brings…
Doing Research That Matters: A Success Story from Statistics Education
ERIC Educational Resources Information Center
Hipkins, Rosemary
2014-01-01
This is the first report from a new initiative called TLRI Project Plus. It aims to add value to the Teaching and Learning Research Initiative (TLRI), which NZCER manages on behalf of the government, by synthesising findings across multiple projects. This report focuses on two projects in statistics education and explores the factors that…
Torok, Michelle; Konings, Paul; Batterham, Philip J; Christensen, Helen
2017-10-06
Rates of suicide appear to be increasing, indicating a critical need for more effective prevention initiatives. To increase the efficacy of future prevention initiatives, we examined the spatial distribution of suicide deaths and suicide attempts in New South Wales (NSW), Australia, to identify where high incidence 'suicide clusters' were occurring. Such clusters represent candidate regions where intervention is critically needed, and likely to have the greatest impact, thus providing an evidence-base for the targeted prioritisation of resources. Analysis is based on official suicide mortality statistics for NSW, provided by the Australian Bureau of Statistics, and hospital separations for non-fatal intentional self-harm, provided through the NSW Health Admitted Patient Data Collection at a Statistical Area 2 (SA2) geography. Geographical Information System (GIS) techniques were applied to detect suicide clusters occurring between 2005 and 2013 (aggregated), for persons aged over 5 years. The final dataset contained 5466 mortality and 86,017 non-fatal intentional self-harm cases. In total, 25 Local Government Areas were identified as primary or secondary likely candidate regions for intervention. Together, these regions contained approximately 200 SA2 level suicide clusters, which represented 46% (n = 39,869) of hospital separations and 43% (n = 2330) of suicide deaths between 2005 and 2013. These clusters primarily converged on the Eastern coastal fringe of NSW. Crude rates of suicide deaths and intentional self-harm differed at the Local Government Areas (LGA) level in NSW. There was a tendency for primary suicide clusters to occur within metropolitan and coastal regions, rather than rural areas. The findings demonstrate the importance of taking geographical variation of suicidal behaviour into account, prior to development and implementation of prevention initiatives, so that such initiatives can target key problem areas where they are likely to have maximal impact.
Uncertainty analysis for the steady-state flows in a dual throat nozzle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Q.-Y.; Gottlieb, David; Hesthaven, Jan S.
2005-03-20
It is well known that the steady state of an isentropic flow in a dual-throat nozzle with equal throat areas is not unique. In particular there is a possibility that the flow contains a shock wave, whose location is determined solely by the initial condition. In this paper, we consider cases with uncertainty in this initial condition and use generalized polynomial chaos methods to study the steady-state solutions for stochastic initial conditions. Special interest is given to the statistics of the shock location. The polynomial chaos (PC) expansion modes are shown to be smooth functions of the spatial variable x, although each solution realization is discontinuous in the spatial variable x. When the variance of the initial condition is small, the probability density function of the shock location is computed with high accuracy. Otherwise, many terms are needed in the PC expansion to produce reasonable results due to the slow convergence of the PC expansion, caused by non-smoothness in random space.
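A generic polynomial chaos construction, unrelated to the nozzle solver itself, can illustrate how statistics of a quantity of interest are read off the expansion coefficients. The sketch projects a smooth toy function of a Gaussian input onto probabilists' Hermite polynomials; the function and truncation order are arbitrary.

```python
# Toy generalized polynomial chaos (gPC) expansion with Hermite polynomials:
# u(xi), xi ~ N(0,1), is projected onto He_k, and mean/variance follow from
# the coefficients. Generic illustration, not the paper's nozzle computation.
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def u(xi):                                  # stand-in quantity of interest
    return np.tanh(2.0 * xi)

nodes, weights = He.hermegauss(40)          # Gauss-Hermite_e quadrature
weights = weights / np.sqrt(2 * np.pi)      # normalize to the N(0,1) measure

P = 8
c = [np.sum(weights * u(nodes) * He.hermeval(nodes, [0] * k + [1]))
     / math.factorial(k) for k in range(P + 1)]

mean = c[0]                                 # E[u] is the zeroth coefficient
var = sum(math.factorial(k) * c[k] ** 2 for k in range(1, P + 1))
print(mean, var)
```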
Sensitivity analysis of static resistance of slender beam under bending
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valeš, Jan
2016-06-08
The paper deals with statistical and sensitivity analyses of the resistance of simply supported I-beams under bending. The resistance was solved by the geometrically nonlinear finite element method in the programme Ansys. The beams are modelled with initial geometrical imperfections following the first eigenmode of buckling. The imperfections were, together with the geometrical characteristics of the cross section and the material characteristics of steel, considered as random quantities. The Latin Hypercube Sampling method was applied to evaluate the statistical and sensitivity analyses of resistance.
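Latin Hypercube Sampling, the sampling scheme named above, is available in SciPy's QMC module. The sketch below draws a small stratified sample for three illustrative random inputs; the choice of inputs and their ranges is assumed for the example, not taken from the paper.

```python
# Minimal Latin Hypercube Sampling sketch with SciPy: stratified samples in
# [0,1)^3 scaled to assumed ranges for yield stress, flange width, and
# initial imperfection amplitude (illustrative inputs only).
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=4)
u = sampler.random(n=100)                   # one point per stratum per dim
lows = np.array([235.0, 99.0, 0.0])         # MPa, mm, mm (assumed ranges)
highs = np.array([355.0, 101.0, 2.0])
samples = qmc.scale(u, lows, highs)
print(samples[:3])
```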
Statistical Analysis of Warfare: Identification of Winning Factors with a Focus on Irregular Warfare
2015-09-01
Park, Rachel; O'Brien, Thomas F.; Huang, Susan S.; Baker, Meghan A.; Yokoe, Deborah S.; Kulldorff, Martin; Barrett, Craig; Swift, Jamie; Stelling, John
2016-01-01
Objectives While antimicrobial resistance threatens the prevention, treatment, and control of infectious diseases, systematic analysis of routine microbiology laboratory test results worldwide can signal new threats and promote timely response. This study explores statistical algorithms for recognizing geographic clustering of multi-resistant microbes within a healthcare network and monitoring the dissemination of new strains over time. Methods Escherichia coli antimicrobial susceptibility data from a three-year period stored in WHONET were analyzed across ten facilities in a healthcare network utilizing SaTScan's spatial multinomial model with two models for defining geographic proximity. We explored geographic clustering of multi-resistance phenotypes within the network and changes in clustering over time. Results Geographic clustering identified from the latitude/longitude and non-parametric facility-grouping geographic models was similar, while the latter offers greater flexibility and generalizability. Iterative application of the clustering algorithms suggested the possible recognition of the initial appearance of invasive E. coli ST131 in the clinical database of a single hospital and subsequent dissemination to others. Conclusion Systematic analysis of routine antimicrobial resistance susceptibility test results supports the recognition of geographic clustering of microbial phenotypic subpopulations with WHONET and SaTScan, and iterative application of these algorithms can detect the initial appearance in and dissemination across a region, prompting early investigation, response, and containment measures. PMID:27530311
NASA Astrophysics Data System (ADS)
Shemer, L.; Sergeeva, A.
2009-12-01
The statistics of a random water wave field determine the probability of appearance of extremely high (freak) waves. This probability is strongly related to the spectral wave field characteristics. Laboratory investigation of the spatial variation of the random wave-field statistics for various initial conditions is thus of substantial practical importance. Unidirectional nonlinear random wave groups are investigated experimentally in the 300 m long Large Wave Channel (GWK) in Hannover, Germany, which is the biggest facility of its kind in Europe. Numerous realizations of a wave field with the prescribed frequency power spectrum, yet randomly distributed initial phases of each harmonic, were generated by a computer-controlled piston-type wavemaker. Several initial spectral shapes with identical dominant wave length but different width were considered. For each spectral shape, the total duration of sampling in all realizations was long enough to yield a sufficient sample size for reliable statistics. Throughout all experiments, an effort was made to retain the characteristic wave height value and thus the degree of nonlinearity of the wave field. Spatial evolution of numerous statistical wave field parameters (skewness, kurtosis and probability distributions) is studied using about 25 wave gauges distributed along the tank. It is found that, depending on the initial spectral shape, the frequency spectrum of the wave field may undergo significant modification in the course of its evolution along the tank; the values of all statistical wave parameters are strongly related to the local spectral width. A sample of the measured wave height probability functions (scaled by the variance of surface elevation) is plotted in Fig. 1 for the initially narrow rectangular spectrum. The results in Fig. 1 resemble findings obtained in [1] for the initial Gaussian spectral shape. The probability of large waves notably surpasses that predicted by the Rayleigh distribution and is the highest at a distance of about 100 m. Acknowledgement This study is carried out in the framework of the EC supported project "Transnational access to large-scale tests in the Large Wave Channel (GWK) of Forschungszentrum Küste" (Contract HYDRALAB III - No. 022441). [1] L. Shemer and A. Sergeeva, J. Geophys. Res. Oceans 114, C01015 (2009). Figure 1. Variation along the tank of the measured wave height distribution for rectangular initial spectral shape; the carrier wave period T0=1.5 s.
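The statistical wave parameters tracked along the tank, skewness, kurtosis, and the wave-height exceedance probability against the Rayleigh reference, can be sketched on a synthetic record. Everything below (the band-pass surrogate signal and the envelope-based height proxy) is a generic stand-in for the GWK gauge data.

```python
# Sketch: skewness/kurtosis of a surrogate surface elevation and an empirical
# wave-height exceedance curve compared with the Rayleigh distribution.
# Synthetic narrow-band Gaussian data stand in for the measured records.
import numpy as np
from scipy import signal, stats

rng = np.random.default_rng(7)
eta = signal.lfilter(*signal.butter(4, [0.08, 0.12], "bandpass"),
                     rng.normal(size=200_000))

print("skewness:", stats.skew(eta), "excess kurtosis:", stats.kurtosis(eta))

H = 2.0 * np.abs(signal.hilbert(eta))        # envelope-based height proxy
sigma = eta.std()
h = np.linspace(0.0, 8.0 * sigma, 5)
emp = np.array([(H > hi).mean() for hi in h])
ray = np.exp(-h**2 / (8.0 * sigma**2))       # Rayleigh exceedance probability
print(np.c_[h / sigma, emp, ray])            # empirical vs Rayleigh
```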
Statistical modeling of an integrated boiler for coal fired thermal power plant.
Chandrasekharan, Sreepradha; Panda, Rames Chandra; Swaminathan, Bhuvaneswari Natrajan
2017-06-01
Coal-fired thermal power plants play a major role in power production worldwide as coal is available in abundance. Many of the existing power plants are based on subcritical technology, which can produce power with an efficiency of around 33%. Newer plants are built on either supercritical or ultra-supercritical technology, whose efficiency can be up to 50%. The main objective of the work is to enhance the efficiency of the existing subcritical power plants to compensate for the increasing demand. To achieve this objective, statistical modeling of the boiler units such as the economizer, drum and superheater is initially carried out. The effectiveness of the developed models is tested using analysis methods like R^2 analysis and ANOVA (Analysis of Variance). The dependence of the process variable (temperature) on different manipulated variables is analyzed in the paper. Validations of the model are provided with their error analysis. Response surface methodology (RSM) supported by DOE (design of experiments) is implemented to optimize the operating parameters. Individual models along with the integrated model are used to study and design the predictive control of the coal-fired thermal power plant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ingram, Jani Cheri; Lehman, Richard Michael; Bauer, William Francis
We report the use of a surface analysis approach, static secondary ion mass spectrometry (SIMS) equipped with a molecular (ReO4-) ion primary beam, to analyze the surface of intact microbial cells. SIMS spectra of 28 microorganisms were compared to fatty acid profiles determined by gas chromatographic analysis of transesterified fatty acids extracted from the same organisms. The results indicate that surface bombardment using the molecular primary beam cleaved the ester linkage characteristic of bacteria at the glycerophosphate backbone of the phospholipid components of the cell membrane. This cleavage enables direct detection of the fatty acid conjugate base of intact microorganisms by static SIMS. The limit of detection for this approach is approximately 10^7 bacterial cells/cm^2. Multivariate statistical methods were applied in a graded approach to the SIMS microbial data. The results showed that the full data set could initially be statistically grouped based upon major differences in biochemical composition of the cell wall. The gram-positive bacteria were further statistically analyzed, followed by final analysis of a specific bacterial genus that was successfully grouped by species. Additionally, the use of SIMS to detect microbes on mineral surfaces is demonstrated by an analysis of Shewanella oneidensis on crushed hematite. The results of this study provide evidence for the potential of static SIMS to rapidly detect bacterial species based on ion fragments originating from cell membrane lipids directly from sample surfaces.
Cerebral network deficits in post-acute catatonic schizophrenic patients measured by fMRI.
Scheuerecker, J; Ufer, S; Käpernick, M; Wiesmann, M; Brückmann, H; Kraft, E; Seifert, D; Koutsouleris, N; Möller, H J; Meisenzahl, E M
2009-03-01
Twelve patients with catatonic schizophrenia and 12 matched healthy controls were examined with functional MRI while performing a motor task. The aim of our study was to identify the intracerebral pathophysiological correlates of motor symptoms in catatonic patients. The motor task included three conditions: a self-initiated (SI), an externally triggered (ET) and a rest condition. Statistical analysis was performed with SPM5. During the self-initiated movements patients showed significantly less activation than healthy controls in the supplementary motor area (SMA), the prefrontal and parietal cortex. Our results suggest a dysfunction of the "medial motor system" in catatonic patients. Self-initiated and externally triggered movements are mediated by different motor loops. The "medial loop" includes the SMA, thalamus and basal ganglia, and is necessary for self-initiated movements. The "lateral loop" includes parts of the cerebellum, lateral premotor cortex, thalamus and parietal association areas. It is involved in the execution of externally triggered movements. Our findings are in agreement with earlier behavioral data, which show deficits in self-initiated movements in catatonic patients but no impairment of externally triggered movements.
SDGs and Geospatial Frameworks: Data Integration in the United States
NASA Astrophysics Data System (ADS)
Trainor, T.
2016-12-01
Responding to the need to monitor a nation's progress towards meeting the Sustainable Development Goals (SDG) outlined in the 2030 U.N. Agenda requires the integration of earth observations with statistical information. The urban agenda proposed in SDG 11 challenges the global community to find a geospatial approach to monitor and measure inclusive, safe, resilient, and sustainable cities and communities. Target 11.7 identifies public safety, accessibility to green and public spaces, and the most vulnerable populations (i.e., women and children, older persons, and persons with disabilities) as the most important priorities of this goal. A challenge for both national statistical organizations and earth observation agencies in addressing SDG 11 is the requirement for detailed statistics at a sufficient spatial resolution to provide the basis for meaningful analysis of the urban population and city environments. Using an example for the city of Pittsburgh, this presentation proposes data and methods to illustrate how earth science and statistical data can be integrated to respond to Target 11.7. Finally, a preliminary series of data initiatives are proposed for extending this method to other global cities.
Analysis of regional deformation and strain accumulation data adjacent to the San Andreas fault
NASA Technical Reports Server (NTRS)
Turcotte, Donald L.
1991-01-01
A new approach to the understanding of crustal deformation was developed under this grant. This approach combined aspects of fractals, chaos, and self-organized criticality to provide a comprehensive theory for deformation on distributed faults. It is hypothesized that crustal deformation is an example of comminution: Deformation takes place on a fractal distribution of faults resulting in a fractal distribution of seismicity. Our primary effort under this grant was devoted to developing an understanding of distributed deformation in the continental crust. An initial effort was carried out on the fractal clustering of earthquakes in time. It was shown that earthquakes do not obey random Poisson statistics, but can be approximated in many cases by coupled, scale-invariant fractal statistics. We applied our approach to the statistics of earthquakes in the New Hebrides region of the southwest Pacific because of the very high level of seismicity there. This work was written up and published in the Bulletin of the Seismological Society of America. This approach was also applied to the statistics of the seismicity on the San Andreas fault system.
Statistical Reviewers Improve Reporting in Biomedical Articles: A Randomized Trial
Cobo, Erik; Selva-O'Callagham, Albert; Ribera, Josep-Maria; Cardellach, Francesc; Dominguez, Ruth; Vilardell, Miquel
2007-01-01
Background Although peer review is widely considered to be the most credible way of selecting manuscripts and improving the quality of accepted papers in scientific journals, there is little evidence to support its use. Our aim was to estimate the effects on manuscript quality of either adding a statistical peer reviewer or suggesting the use of checklists such as CONSORT or STARD to clinical reviewers or both. Methodology and Principal Findings Interventions were defined as 1) the addition of a statistical reviewer to the clinical peer review process, and 2) suggesting reporting guidelines to reviewers; with “no statistical expert” and “no checklist” as controls. The two interventions were crossed in a 2×2 balanced factorial design including original research articles consecutively selected, between May 2004 and March 2005, by the Medicina Clinica (Barc) editorial committee. We randomized manuscripts to minimize differences in terms of baseline quality and type of study (intervention, longitudinal, cross-sectional, others). Sample-size calculations indicated that 100 papers provide an 80% power to test a 55% standardized difference. We specified the main outcome as the increment in quality of papers as measured on the Goodman Scale. Two blinded evaluators rated the quality of manuscripts at initial submission and final post peer review version. Of the 327 manuscripts submitted to the journal, 131 were accepted for further review, and 129 were randomized. Of those, 14 that were lost to follow-up showed no differences in initial quality to the followed-up papers. Hence, 115 were included in the main analysis, with 16 rejected for publication after peer review. 21 (18.3%) of the 115 included papers were interventions, 46 (40.0%) were longitudinal designs, 28 (24.3%) cross-sectional and 20 (17.4%) others. The 16 (13.9%) rejected papers had a significantly lower initial score on the overall Goodman scale than accepted papers (difference 15.0, 95% CI: 4.6–24.4). The effect of suggesting a guideline to the reviewers had no effect on change in overall quality as measured by the Goodman scale (0.9, 95% CI: −0.3–+2.1). The estimated effect of adding a statistical reviewer was 5.5 (95% CI: 4.3–6.7), showing a significant improvement in quality. Conclusions and Significance This prospective randomized study shows the positive effect of adding a statistical reviewer to the field-expert peers in improving manuscript quality. We did not find a statistically significant positive effect by suggesting reviewers use reporting guidelines. PMID:17389922
Three-dimensional images contribute to the diagnosis of mucous retention cyst in maxillary sinus
Donizeth-Rodrigues, Cleomar; Fonseca-Da Silveira, Márcia; Gonçalves-De Alencar, Ana H.; Garcia-Santos-Silva, Maria A.; Francisco-De-Mendonça, Elismauro
2013-01-01
Objective: To evaluate the detection of mucous retention cyst of maxillary sinus (MRCMS) using panoramic radiography and cone beam computed tomography (CBCT). Study Design: A digital database with 6,000 panoramic radiographs was reviewed for MRCMS. Suggestive images of MRCMS were detected on 185 radiographs, and patients were located and invited to return for follow-up. Thirty patients returned, and control panoramic radiographs were obtained 6 to 46 months after the initial radiograph. When MRCMS was found on control radiographs, CBCT scans were obtained. Cysts were measured and compared on radiographs and scans. The Wilcoxon, Spearman and Kolmogorov-Smirnov tests were used for statistical analysis. The level of significance was set at 5%. Results: There were statistically significant differences between the two methods (p<0.05): 23 MRCMS detected on panoramic radiographs were confirmed by CBCT, but 5 MRCMS detected on CBCT images had not been identified by panoramic radiography. Eight MRCMS detected on control radiographs were not confirmed by CBCT. MRCMS size differences from initial to control panoramic radiographs and CBCT scans were not statistically significant (p= 0.617 and p= 0.626). The correlation between time and MRCMS size differences was not significant (r = -0.16, p = 0.381). Conclusion: CBCT scanning detects MRCMS more accurately than panoramic radiography. Key words: Mucous cyst, maxillary sinus, panoramic radiograph, cone beam computed tomography. PMID:23229251
Blunt-Vinti, Heather D; Wheldon, Christopher; McFarlane, Mary; Brogan, Natalie; Walsh-Buhi, Eric R
2016-01-01
Using the Internet to meet new people is becoming more common; however, such behavior is often considered risky, particularly for adolescents. Nevertheless, adolescents are meeting people through online venues and some are forming romantic/sexual relationships. The purpose of this study was to examine the relationship and sexual satisfaction reported by teens in online- and offline-initiated relationships. Data were collected from 273 13-19 year olds visiting a publicly funded clinic through 2010 and 2011. Questions included where respondents met the partner (online vs. offline), time between meeting and first sex, how well they knew the partner, and relationship and sexual (R&S) satisfaction. Analyses consisted of descriptive statistics, t tests, and path analysis, exploring R&S satisfaction in online- and offline-initiated relationships. R&S satisfaction scores were moderate for adolescents who reported meeting partners online and in person but were statistically higher in offline-initiated relationships. There was an inverse relationship between having an online partner and both relationship and sexual satisfaction. Additionally, knowing partners for a longer period of time and feeling more knowledgeable about partners before having sex were statistically significantly related to higher R&S satisfaction. Teens in this study reported more satisfying relationships with partners met offline compared with online. Results suggest that encouraging teens to wait longer and to get to know their partner(s) better before engaging in sex may improve satisfaction with, and quality of, those relationships. These findings provide an important contribution to sexual health promotion among young people, with whom technology use is ubiquitous. Copyright © 2016 Society for Adolescent Health and Medicine. All rights reserved.
Ricker, Martin; Peña Ramírez, Víctor M.; von Rosen, Dietrich
2014-01-01
Growth curves are monotonically increasing functions obtained by measuring the same subjects repeatedly over time. The classical growth curve model in the statistical literature is the Generalized Multivariate Analysis of Variance (GMANOVA) model. In order to model the tree trunk radius (r) over time (t) of trees on different sites, GMANOVA is combined here with the adapted PL regression model Q = A·T + E, where A is the initial relative growth to be estimated and E is an error term for each tree and time point. Furthermore, the definition of the model involves Ei[-b·r], with TPR being the turning point radius in a sigmoid curve, and a calibrating time-radius point is estimated. Advantages of the approach are that growth rates can be compared among growth curves with different turning point radiuses and different starting points, hidden outliers are easily detectable, the method is statistically robust, and heteroscedasticity of the residuals among time points is allowed. The model was implemented with dendrochronological data of 235 Pinus montezumae trees on ten Mexican volcano sites to calculate comparison intervals for the estimated initial relative growth. One site (at the Popocatépetl volcano) stood out, with an initial relative growth 3.9 times the value of the site with the slowest-growing trees. Calculating variance components for the initial relative growth, 34% of the growth variation was found among sites, 31% among trees, and 35% over time. Without the Popocatépetl site, the numbers changed to 7%, 42%, and 51%. Further explanation of differences in growth would need to focus on factors that vary within sites and over time. PMID:25402427
Expression Profiling of Nonpolar Lipids in Meibum From Patients With Dry Eye: A Pilot Study
Chen, Jianzhong; Keirsey, Jeremy K.; Green, Kari B.; Nichols, Kelly K.
2017-01-01
Purpose The purpose of this investigation was to characterize differentially expressed lipids in meibum samples from patients with dry eye disease (DED) in order to better understand the underlying pathologic mechanisms. Methods Meibum samples were collected from postmenopausal women with DED (PW-DED; n = 5) and a control group of postmenopausal women without DED (n = 4). Lipid profiles were analyzed by direct infusion full-scan electrospray ionization mass spectrometry (ESI-MS). An initial analysis of 145 representative peaks from four classes of lipids in PW-DED samples revealed that additional manual corrections for peak overlap and isotopes only slightly affected the statistical analysis. Therefore, analysis of uncorrected data, which can be applied to a greater number of peaks, was used to compare more than 500 lipid peaks common to PW-DED and control samples. Statistical analysis of peak intensities identified several lipid species that differed significantly between the two groups. Data from contact lens wearers with DED (CL-DED; n = 5) were also analyzed. Results Many species of the two types of diesters (DE) and very long chain wax esters (WE) were decreased by ∼20% in PW-DED, whereas levels of triacylglycerols were increased by an average of 39% ± 3% in meibum from PW-DED compared to that in the control group. Approximately the same reduction (20%) of similar DE and WE was observed for CL-DED. Conclusions Statistical analysis of peak intensities from direct infusion ESI-MS results identified differentially expressed lipids in meibum from dry eye patients. Further studies are warranted to support these findings. PMID:28426869
Reischauer, Carolin; Patzwahl, René; Koh, Dow-Mu; Froehlich, Johannes M; Gutzeit, Andreas
2018-04-01
To evaluate whole-lesion volumetric texture analysis of apparent diffusion coefficient (ADC) maps for assessing treatment response in prostate cancer bone metastases. Texture analysis is performed in 12 treatment-naïve patients with 34 metastases before treatment and at one, two, and three months after the initiation of androgen deprivation therapy. Four first-order and 19 second-order statistical texture features are computed on the ADC maps in each lesion at every time point. Repeatability, inter-patient variability, and changes in the feature values under therapy are investigated. Spearman rank's correlation coefficients are calculated across time to demonstrate the relationship between the texture features and the serum prostate specific antigen (PSA) levels. With few exceptions, the texture features exhibited moderate to high precision. At the same time, Friedman's tests revealed that all first-order and second-order statistical texture features changed significantly in response to therapy. Thereby, the majority of texture features showed significant changes in their values at all post-treatment time points relative to baseline. Bivariate analysis detected significant correlations between the great majority of texture features and the serum PSA levels. Thereby, three first-order and six second-order statistical features showed strong correlations with the serum PSA levels across time. The findings in the present work indicate that whole-tumor volumetric texture analysis may be utilized for response assessment in prostate cancer bone metastases. The approach may be used as a complementary measure for treatment monitoring in conjunction with averaged ADC values. Copyright © 2018 Elsevier B.V. All rights reserved.
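Second-order statistical texture features of this kind are typically derived from a gray-level co-occurrence matrix (GLCM). The sketch below computes a few on a synthetic 2D array standing in for a quantized ADC map; the paper's whole-lesion volumetric analysis would aggregate such features over the segmented 3D lesion.

```python
# GLCM-based second-order texture features with scikit-image, computed on a
# synthetic quantized "ADC map". Illustrative 2D stand-in only.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(9)
adc = rng.integers(0, 64, size=(32, 32)).astype(np.uint8)   # quantized map

glcm = graycomatrix(adc, distances=[1], angles=[0, np.pi / 2], levels=64,
                    symmetric=True, normed=True)
for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, graycoprops(glcm, prop).mean())
```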
Claumann, Carlos Alberto; Wüst Zibetti, André; Bolzan, Ariovaldo; Machado, Ricardo A F; Pinto, Leonel Teixeira
2015-12-18
For this work, an analysis of parameter estimation for the retention factor in a GC model was performed, considering two different criteria: the sum of squared errors and the maximum error in absolute value; relevant statistics are described for each case. The main contribution of this work is the implementation of a specialized initialization scheme for the estimated parameters, which features fast convergence (low computational time) and is based on knowledge of the surface of the error criterion. In an application to a series of alkanes, the specialized initialization resulted in a significant reduction of the number of evaluations of the objective function (reducing computational time) in the parameter estimation. The reduction obtained was between one and two orders of magnitude, compared with simple random initialization. Copyright © 2015 Elsevier B.V. All rights reserved.
Scheibel, Paula Cabrini; Ramos, Adilson Luiz; Iwaki, Lilian Cristina Vessoni; Micheletti, Kelly Regina
2014-01-01
OBJECTIVE: The aim of the present study was to investigate the correlation between initial alveolar bone density of upper central incisors (ABD-UI) and external apical root resorption (EARR) after 12 months of orthodontic movement in cases without extraction. METHODS: A total of 47 orthodontic patients 11 years old or older were submitted to periapical radiography of upper incisors prior to treatment (T1) and after 12 months of treatment (T2). ABD-UI and EARR were measured by means of densitometry. RESULTS: No statistically significant correlation was found between initial ABD-UI and EARR at T2 (r = 0.149; p = 0.157). CONCLUSION: Based on the present findings, alveolar density assessed through periapical radiography is not predictive of root resorption after 12 months of orthodontic treatment in cases without extraction. PMID:25715722
The statistical reason why some researchers say some silvicultural treatments "wash-out" over time
David B. South; Curtis L. VanderSchaaf
2006-01-01
The initial effects of a silvicultural treatment on height or volume growth sometimes decline over time, and the early gains eventually disappear with very long rotations. However, in some reports initial gains are maintained until harvest, yet because of the statistical analyses employed, a researcher might conclude the treatment effect has "washed-out" by ages 10 to 18 years...
Reis, Matthias; Kromer, Justus A; Klipp, Edda
2018-01-20
Multimodality is a phenomenon that complicates the analysis of statistical data based exclusively on mean and variance. Here, we present criteria for multimodality in hierarchic first-order reaction networks, consisting of catalytic and splitting reactions. Those networks are characterized by independent and dependent subnetworks. First, we prove the general solvability of the Chemical Master Equation (CME) for this type of reaction network and thereby extend the class of solvable CMEs. Our general solution is analytical in the sense that it allows for a detailed analysis of its statistical properties. Given Poisson/deterministic initial conditions, we then prove the independent species to be Poisson/binomially distributed, while the dependent species exhibit generalized Poisson/Khatri Type B distributions. Generalized Poisson/Khatri Type B distributions are multimodal for an appropriate choice of parameters. We illustrate our criteria for multimodality by several basic models, as well as the well-known two-stage transcription-translation network and Bateman's model from nuclear physics. For both examples, multimodality was previously not reported.
Statistical analysis of the pulse-coupled synchronization strategy for wireless sensor networks
Wang, Yongqiang; Núñez, Felipe; Doyle, Francis J.
2013-01-01
Pulse-coupled synchronization is attracting increased attention in the sensor network community. Yet its properties have not been fully investigated. Using statistical analysis, we prove analytically that by controlling the number of connections at each node, synchronization can be guaranteed for generally pulse-coupled oscillators even in the presence of a refractory period. The approach does not require the initial phases to reside in half an oscillation cycle, which improves existing results. We also find that a refractory period can be strategically included to reduce idle listening at nearly no sacrifice to the synchronization probability. Given that reduced idle listening leads to higher energy efficiency in the synchronization process, the strategically added refractory period makes the synchronization scheme appealing to cheap sensor nodes, where energy is a precious system resource. We also analyzed the pulse-coupled synchronization in the presence of unreliable communication links and obtained similar results. QualNet experimental results are given to confirm the effectiveness of the theoretical predictions. PMID:24324322
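A toy all-to-all simulation in the Mirollo-Strogatz spirit conveys the pulse-coupled mechanism analyzed above; the paper's treatment (controlled node degree, refractory periods, unreliable links) is far more general. All constants below are arbitrary.

```python
# Toy pulse-coupled oscillator network: phases advance freely, a node firing
# at phase 1 resets to 0 and bumps every other node by eps (capped at 1, so
# pushed-over nodes fire and are absorbed in the same event). Illustrative
# all-to-all sketch only, not the paper's exact model.
import numpy as np

rng = np.random.default_rng(8)
N, eps, dt, T = 20, 0.05, 1e-3, 30.0
phase = rng.random(N)                    # random initial phases in [0, 1)

for _ in range(int(T / dt)):
    phase += dt                          # free-running advance
    fired = phase >= 1.0
    in_event = np.zeros(N, dtype=bool)   # nodes that fired in this event
    while fired.any():
        in_event |= fired
        phase[fired] = 0.0
        receivers = ~in_event
        phase[receivers] = np.minimum(phase[receivers] + eps, 1.0)
        fired = (phase >= 1.0) & ~in_event   # absorbed nodes fire too

print("sorted phases:", np.round(np.sort(phase), 3))   # clusters => syncing
```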
Chau, Destiny F; Vasilopoulos, Terrie; Schoepf, Miriam; Zhang, Christina; Fahy, Brenda G
2016-09-01
Complex surgical and critically ill pediatric patients rely on syringe infusion pumps for precise delivery of IV medications. Low flow rates and in-line IV filter use may affect drug delivery. To determine the effects of an in-line filter to remove air and/or contaminants on syringe pump performance at low flow rates, we compared the measured rates with the programmed flow rates with and without in-line IV filters. Standardized IV infusion assemblies with and without IV filters (filter and control groups) attached to a 10-mL syringe were primed and then loaded onto a syringe pump and connected to a 16-gauge, 16-cm single-lumen catheter. The catheter was suspended in a normal saline fluid column to simulate the back pressure from central venous circulation. The delivered infusate was measured by gravimetric methods at predetermined time intervals, and flow rate was calculated. Experimental trials for initial programmed rates of 1.0, 0.8, 0.6, and 0.4 mL/h were performed in control and filter groups. For each trial, the flow rate was changed to double the initial flow rate and was then returned to the initial flow rate to analyze pump performance for titration of rates often required during medication administration. These conditions (initial rate, doubling of initial rate, and return to initial rate) were analyzed separately for steady-state flow rate and time to steady state, whereas their average was used for percent deviation analysis. Differences between control and filter groups were assessed using Student t tests with adjustment for multiplicity (using n = 3 replications per group). Mean time from 0 to initial flow (startup delay) was <1 minute in both groups with no statistical difference between groups (P = 1.0). The average time to reach steady-state flow after infusion startup or rate changes was not statistically different between the groups (range, 0.8-5.5 minutes), for any flow rate or part of the trial (initial rate, doubling of initial rate, and return to initial rate), although the study was underpowered to detect small time differences. Overall, the mean steady-state flow rate for each trial was below the programmed flow rate with negative mean percent deviations for each trial. In the 1.0-mL/h initial rate trial, the steady-state flow rate attained was lower in the filter than the control group for the initial rate (P = 0.04) and doubling of initial rate (P = 0.04) with a trend during the return to initial rate (P = 0.06), although this same effect was not observed when doubling the initial rate trials of 0.8 or 0.6 mL/h or any other rate trials compared with the control group. With low flow rates used in complex surgical and pediatric critically ill patients, the addition of IV filters did not confer statistically significant changes in startup delay, flow variability, or time to reach steady-state flow of medications administered by syringe infusion pumps. The overall flow rate was lower than programmed flow rate with or without a filter.
Improving adherence to the Epic Beacon ambulatory workflow.
Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana
2017-06-01
Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System of computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance with the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve compliance with this workflow in the ambulatory oncology setting. This study was a retrospective, quasi-experimental study which analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was to compare the composite compliance rate with the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, education of infusion center staff, was initiated by ambulatory-based oncology pharmacists and implemented by a multidisciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effects of the intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% (p < 0.001). This study supports that a pharmacist-initiated educational intervention can improve compliance with an ambulatory oncology infusion workflow.
Woldeamanuel, Gashaw Garedew; Wondimu, Diresibachew Haile
2018-01-01
Hematological abnormalities are common in HIV positive patients. Of these, thrombocytopenia is a known complication which has been associated with a variety of bleeding disorders. However, its magnitude and related factors have not been well characterized in the era of highly active antiretroviral therapy (HAART) in Ethiopia. Therefore, this study aimed to assess the prevalence of thrombocytopenia before and after initiation of HAART among HIV positive patients attending Black Lion Specialized Hospital, Addis Ababa, Ethiopia. A cross-sectional study was conducted from February to April 2017 in Black Lion Specialized Hospital, Addis Ababa, Ethiopia. A total of 176 patients on HAART were selected using simple random sampling techniques. Socio-demographic and clinical characteristics of the study patients were collected using a structured questionnaire. Measurements of platelet counts and CD4+ T cell counts were made using a Sysmex XT 2000i hematology analyzer and a BD FACS Count CD4 analyzer, respectively. Statistical analysis of the data (paired t-test and binary logistic regression) was done using SPSS version 20. A P-value < 0.05 was considered statistically significant. A total of 176 patients (age > 18 years) were enrolled in this study, with a mean age of 40.08 ± 9.38 years. There was a significant increase in the mean platelet count (218.44 ± 106.6 × 10³/μl vs 273.65 ± 83.8 × 10³/μl, p < 0.001) after six months of HAART initiation compared to the baseline. Prevalence of thrombocytopenia before and after HAART initiation was 25% and 5.7%, respectively. HIV patients with CD4 counts < 200 cells/μl were more likely to have thrombocytopenia than those with CD4 counts ≥ 350 cells/μl; however, this association was not statistically significant. This study has shown that the prevalence of thrombocytopenia decreased significantly after HAART initiation. Nevertheless, a number of study participants still had thrombocytopenia after initiation of HAART. Therefore, continuous screening for thrombocytopenia among HIV infected patients should be performed to decrease the risk of morbidity and mortality.
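The before/after comparison reported above can be reproduced in outline with a paired t-test. A minimal Python sketch follows, using synthetic platelet counts matching the reported means; the spread of the paired differences and the seed are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 176
# Hypothetical platelet counts (x10^3/ul): baseline vs. six months after HAART;
# the mean rise of ~55 matches the abstract, the SD of the change is assumed
baseline = rng.normal(218.4, 106.6, n).clip(20, None)
month6 = baseline + rng.normal(55.2, 60.0, n)

t_stat, p_value = stats.ttest_rel(baseline, month6)   # paired t-test
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")

# Prevalence of thrombocytopenia (< 150 x10^3/ul) before and after
for label, counts in [("baseline", baseline), ("6 months", month6)]:
    print(label, f"{np.mean(counts < 150):.1%}")
```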
[Normative prenatal evaluation at a philanthropic maternity hospital in São Paulo].
Corrêa, Claudia Regina Hostim; Bonadio, Isabel Cristina; Tsunechiro, Maria Alice
2011-12-01
This cross-sectional study included 301 pregnant women seen in 2009 at a philanthropic maternity hospital in the city of São Paulo (a prenatal support program named Pré-Natal do Amparo Maternal - PN-AM). The objectives of this study were to evaluate prenatal care according to gestational age at entry, number of appointments attended, and continuity of care, and to relate the adequacy of care to socio-demographic and obstetric variables and to the place where prenatal care was initiated. The criteria used for the analysis were initiation of prenatal care before 120 days of gestation and attendance of at least six appointments. The relationship between the variables was analyzed using the chi-square test. Results showed that 41.5% of the pregnant women initiated prenatal care at another health care service and transferred spontaneously to the PN-AM; 74.1% initiated prenatal care early and 80.4% attended at least six appointments; 63.1% met both criteria simultaneously. Appropriate prenatal care showed a statistically significant difference for mother's age, steady partner, employment, place of residence, having a companion during the appointment, and place where prenatal care was initiated.
40 CFR Appendix IV to Part 265 - Tests for Significance
Code of Federal Regulations, 2010 CFR
2010-07-01
... introductory statistics texts. ... Student's t-test involves calculation of the value of a t-statistic for each comparison of the mean... parameter with its initial background concentration or value. The calculated value of the t-statistic must...
Baltzer, Pascal Andreas Thomas; Renz, Diane M; Kullnig, Petra E; Gajda, Mieczyslaw; Camara, Oumar; Kaiser, Werner A
2009-04-01
The identification of the most suspect enhancing part of a lesion is regarded as a major diagnostic criterion in dynamic magnetic resonance mammography. Computer-aided diagnosis (CAD) software allows the semi-automatic analysis of the kinetic characteristics of complete enhancing lesions, providing additional information about lesion vasculature. The diagnostic value of this information has not yet been quantified. Consecutive patients from routine diagnostic studies (1.5 T, 0.1 mmol gadopentetate dimeglumine, dynamic gradient-echo sequences at 1-minute intervals) were analyzed prospectively using CAD. Dynamic sequences were processed and reduced to a parametric map. Curve types were classified by initial signal increase (not significant, intermediate, and strong) and the delayed time course of signal intensity (continuous, plateau, and washout). Lesion enhancement was measured using CAD. The most suspect curve, the curve-type distribution percentage, and combined dynamic data were compared. Statistical analysis included logistic regression analysis and receiver-operating characteristic analysis. Fifty-one patients with 46 malignant and 44 benign lesions were enrolled. On receiver-operating characteristic analysis, the most suspect curve showed diagnostic accuracy of 76.7% ± 5%. In comparison, the curve-type distribution percentage demonstrated accuracy of 80.2% ± 4.9%. Combined dynamic data had the highest diagnostic accuracy (84.3% ± 4.2%). These differences did not achieve statistical significance. With appropriate cutoff values, sensitivity and specificity, respectively, were found to be 80.4% and 72.7% for the most suspect curve, 76.1% and 83.6% for the curve-type distribution percentage, and 78.3% and 84.5% for both parameters. The integration of whole-lesion dynamic data tends to improve specificity; however, this finding did not reach statistical significance.
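As an illustration of the ROC workflow behind these accuracy figures, here is a minimal Python sketch with scikit-learn; the simulated scores and class separations are assumptions standing in for the actual CAD outputs.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
# Hypothetical CAD outputs for 44 benign (0) and 46 malignant (1) lesions
y = np.r_[np.zeros(44), np.ones(46)]
most_suspect = rng.normal(y * 1.1, 1.0)     # most-suspect-curve score
distribution = rng.normal(y * 1.4, 1.0)     # curve-type distribution score

for name, score in [("most suspect curve", most_suspect),
                    ("distribution %", distribution)]:
    print(name, f"AUC = {roc_auc_score(y, score):.3f}")

# Choose a cutoff by the Youden index (sensitivity + specificity - 1)
fpr, tpr, thr = roc_curve(y, distribution)
best = np.argmax(tpr - fpr)
print(f"cutoff = {thr[best]:.2f}, sens = {tpr[best]:.2f}, spec = {1 - fpr[best]:.2f}")
```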
A Mokken scale analysis of the peer physical examination questionnaire.
Vaughan, Brett; Grace, Sandra
2018-01-01
Peer physical examination (PPE) is a teaching and learning strategy utilised in most health profession education programs. Perceptions of participating in PPE have been described in the literature, focusing on areas of the body students are willing, or unwilling, to examine. A small number of questionnaires exist to evaluate these perceptions; however, none has reported the measurement properties that would allow it to be used longitudinally. The present study undertook a Mokken scale analysis of the Peer Physical Examination Questionnaire (PPEQ) to evaluate its dimensionality and structure when used with Australian osteopathy students. Students enrolled in Year 1 of the osteopathy programs at Victoria University (Melbourne, Australia) and Southern Cross University (Lismore, Australia) were invited to complete the PPEQ prior to their first practical skills examination class. R, an open-source statistics program, was used to generate the descriptive statistics and perform a Mokken scale analysis. Mokken scale analysis is a non-parametric item response theory approach that is used to cluster items measuring a latent construct. Initial analysis suggested the PPEQ did not form a single scale. Further analysis identified three subscales: 'comfort', 'concern', and 'professionalism and education'. The properties of each subscale suggested they were unidimensional with variable internal structures. The 'comfort' subscale was the strongest of the three identified. All subscales demonstrated acceptable reliability estimation statistics (McDonald's omega > 0.75), supporting the calculation of a sum score for each subscale. The subscales identified are consistent with the literature. The 'comfort' subscale may be useful for longitudinally evaluating student perceptions of PPE. Further research is required to evaluate changes in perceptions of PPE over time and the utility of the questionnaire in other health profession education programs.
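The study performed its Mokken scale analysis in R (the specific package is not named above). As a language-agnostic illustration of the core quantity involved, the Python sketch below computes the scale-level Loevinger H coefficient for dichotomous items; the latent-trait data generator and the 10% response noise are assumptions for demonstration.

```python
import numpy as np
from itertools import combinations

def loevinger_H(X):
    """Scale-level Loevinger H for a binary item matrix X (rows = respondents).

    H = 1 - (observed Guttman errors) / (errors expected under independence).
    Conventional benchmarks: H >= 0.3 weak, >= 0.4 medium, >= 0.5 strong scale.
    """
    X = np.asarray(X)
    n, k = X.shape
    observed = expected = 0.0
    for i, j in combinations(range(k), 2):
        easy, hard = (i, j) if X[:, i].mean() >= X[:, j].mean() else (j, i)
        # Guttman error: endorsing the harder item while failing the easier one
        observed += np.sum((X[:, easy] == 0) & (X[:, hard] == 1))
        expected += n * (1 - X[:, easy].mean()) * X[:, hard].mean()
    return 1 - observed / expected

# Illustrative: three items driven by one latent trait, with 10% random flips
rng = np.random.default_rng(2)
theta = rng.normal(size=500)
items = np.column_stack([(theta > c).astype(int) for c in (-0.5, 0.0, 0.8)])
items = np.where(rng.random(items.shape) < 0.1, 1 - items, items)
print(f"H = {loevinger_H(items):.2f}")
```

A full Mokken analysis adds automated item selection and monotonicity checks on top of this coefficient; dedicated R packages provide those routines.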
Decaestecker, C; Lopes, B S; Gordower, L; Camby, I; Cras, P; Martin, J J; Kiss, R; VandenBerg, S R; Salmon, I
1997-04-01
The oligoastrocytoma, as a mixed glioma, represents a nosologic dilemma with respect to precisely defining the oligodendroglial and astroglial phenotypes that constitute the neoplastic cell lineages of these tumors. In this study, cell image analysis of Feulgen-stained nuclei was used to distinguish between oligodendroglial and astrocytic phenotypes in oligodendrogliomas and astrocytomas and then applied to mixed oligoastrocytomas. Quantitative features with respect to chromatin pattern (30 variables) and DNA ploidy (8 variables) were evaluated on Feulgen-stained nuclei in a series of 71 gliomas using computer-assisted microscopy. These included 32 oligodendrogliomas (OLG group: 24 grade II and 8 grade III tumors according to the WHO classification), 32 astrocytomas (AST group: 13 grade II and 19 grade III tumors), and 7 oligoastrocytomas (OLGAST group). Initially, image analysis with multivariate statistical analyses (Discriminant Analysis) could identify each glial tumor group. Highly significant statistical differences were obtained distinguishing the morphonuclear features of oligodendrogliomas from those of astrocytomas, regardless of their histological grade. Of the 7 mixed oligoastrocytomas under study, 5 exhibited DNA ploidy and chromatin pattern characteristics similar to grade II oligodendrogliomas, 1 to grade III oligodendrogliomas, and 1 to grade II astrocytomas. Using multifactorial statistical analyses (Discriminant Analysis combined with Principal Component Analysis), it was possible to quantify the proportion of "typical" glial cell phenotypes that compose grade II and III oligodendrogliomas and grade II and III astrocytomas in each mixed glioma. Cytometric image analysis may be an important adjunct to routine histopathology for the reproducible identification of neoplasms containing a mixture of oligodendroglial and astrocytic phenotypes.
Statistical analysis of measured free-space laser signal intensity over a 2.33 km optical path.
Tunick, Arnold
2007-10-17
Experimental research is conducted to determine the characteristic behavior of high frequency laser signal intensity data collected over a 2.33 km optical path. Results focus mainly on calculated power spectra and frequency distributions. In addition, a model is developed to calculate optical turbulence intensity (Cn²) as a function of receiving and transmitting aperture diameter, log-amplitude variance, and path length. Initial comparisons of calculated to measured Cn² data are favorable. It is anticipated that this kind of signal data analysis will benefit laser communication systems development and testing at the U.S. Army Research Laboratory (ARL) and elsewhere.
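The power-spectrum portion of such an analysis can be sketched in a few lines of Python using Welch's method; the sampling rate and the red-noise intensity model below are assumptions standing in for the actual detector record.

```python
import numpy as np
from scipy import signal

fs = 1000.0                                  # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(3)
# Hypothetical intensity record: slow scintillation-like fluctuations
intensity = 1.0 + 0.2 * np.cumsum(rng.normal(0, 0.01, t.size))
intensity -= intensity.mean()

freqs, psd = signal.welch(intensity, fs=fs, nperseg=4096)
print(f"dominant frequency: {freqs[np.argmax(psd)]:.2f} Hz")

# Frequency distribution (histogram) of the raw intensity samples
hist, edges = np.histogram(intensity, bins=50, density=True)
```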
Kiger, Mary; Brown, Catherine S; Watkins, Lynn
2006-10-01
This study compares the outcomes of VitalStim therapy with those of traditional swallowing therapy for deglutition disorders. Twenty-two patients had an initial and a follow-up videofluoroscopic swallowing study or fiberoptic endoscopic evaluation of swallowing and were divided into an experimental group that received VitalStim treatments and a control group that received traditional swallowing therapy. Outcomes were analyzed for changes in oral and pharyngeal phase dysphagia severity, dietary consistency restrictions, and progression from nonoral to oral intake. Results of χ² analysis showed no statistically significant difference in outcomes between the experimental and control groups.
[Relational database for urinary stone ambulatory consultation. Assessment of initial outcomes].
Sáenz Medina, J; Páez Borda, A; Crespo Martinez, L; Gómez Dos Santos, V; Barrado, C; Durán Poveda, M
2010-05-01
To create a relational database for monitoring lithiasic patients. We describe the architectural details and the initial results of the statistical analysis. Microsoft Access 2002 was used as the template. Four different tables were constructed to gather demographic data (table 1), clinical and laboratory findings (table 2), stone features (table 3) and therapeutic approach (table 4). For a reliability analysis of the database, the number of correctly stored data items was counted. To evaluate the performance of the database, a prospective analysis was conducted, from May 2004 to August 2009, on 171 patients who were stone free after treatment (ESWL, surgery or medical treatment) from a total of 511 patients stored in the database. Lithiasic status (stone free or stone relapse) was used as the primary end point, while demographic factors (age, gender), lithiasic history, upper urinary tract alterations and characteristics of the stone (side, location, composition and size) were considered as predictive factors. A univariate analysis was conducted initially with the chi-square test and supplemented by Kaplan-Meier estimates for time to stone recurrence. A multiple Cox proportional hazards regression model was generated to jointly assess the prognostic value of the demographic factors and the predictive value of stone characteristics. For the reliability analysis, 22,084 data items were available, corresponding to 702 consultations on 511 patients. Analysis of the data showed a recurrence rate of 85.4% (146/171; median time to recurrence 608 days, range 70-1758). In the univariate and multivariate analyses, none of the factors under consideration had a significant effect on recurrence rate (p = ns). The relational database is useful for monitoring patients with urolithiasis. It allows easy control and update, as well as data storage for later use. The analysis conducted for its evaluation showed no influence of demographic factors or stone features on stone recurrence.
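The survival-analysis pipeline described here (Kaplan-Meier estimates plus a multiple Cox proportional hazards model) can be sketched in Python with the lifelines library; the toy data frame below is hypothetical and far smaller than the study's 171 evaluable patients.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical extract from the stone database: one row per stone-free patient
df = pd.DataFrame({
    "days_to_event": [608, 120, 1758, 70, 430, 911],   # follow-up time
    "recurred":      [1,   1,   0,    1,  1,   0],     # 1 = stone relapse
    "age":           [54,  61,  47,   39, 66,  58],
    "male":          [1,   0,   1,    1,  0,   1],
})

kmf = KaplanMeierFitter()
kmf.fit(df["days_to_event"], event_observed=df["recurred"])
print(kmf.median_survival_time_)             # median time to recurrence

cph = CoxPHFitter()                          # age and sex as joint predictors
cph.fit(df, duration_col="days_to_event", event_col="recurred")
cph.print_summary()                          # hazard ratios and p-values
```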
Macfarlane, Sarah B.
2005-01-01
Efforts to strengthen health information systems in low- and middle-income countries should include forging links with systems in other social and economic sectors. Governments are seeking comprehensive socioeconomic data on the basis of which to implement strategies for poverty reduction and to monitor achievement of the Millennium Development Goals. The health sector is looking to take action on the social factors that determine health outcomes. But there are duplications and inconsistencies between sectors in the collection, reporting, storage and analysis of socioeconomic data. National offices of statistics give higher priority to collection and analysis of economic than to social statistics. The Report of the Commission for Africa has estimated that an additional US$ 60 million a year is needed to improve systems to collect and analyse statistics in Africa. Some donors recognize that such systems have been weakened by numerous international demands for indicators, and have pledged support for national initiatives to strengthen statistical systems, as well as sectoral information systems such as those in health and education. Many governments are working to coordinate information systems to monitor and evaluate poverty reduction strategies. There is therefore an opportunity for the health sector to collaborate with other sectors to leverage international resources to rationalize definition and measurement of indicators common to several sectors; streamline the content, frequency and timing of household surveys; and harmonize national and subnational databases that store socioeconomic data. Without long-term commitment to improve training and build career structures for statisticians and information technicians working in the health and other sectors, improvements in information and statistical systems cannot be sustained. PMID:16184278
The GEOS-iODAS: Description and Evaluation
NASA Technical Reports Server (NTRS)
Vernieres, Guillaume; Rienecker, Michele M.; Kovach, Robin; Keppenne, Christian L.
2012-01-01
This report documents the GMAO's Goddard Earth Observing System sea ice and ocean data assimilation systems (GEOS-iODAS) and their evolution from the first reanalysis test, through the implementation that was used to initialize the GMAO decadal forecasts, and to the current system that is used to initialize the GMAO seasonal forecasts. The iODAS assimilates a wide range of observations into the ocean and sea ice components: in-situ temperature and salinity profiles, sea level anomalies from satellite altimetry, analyzed SST, and sea-ice concentration. The climatological sea surface salinity is used to constrain the surface salinity prior to the Argo years. Climatological temperature and salinity gridded data sets from the 2009 version of the World Ocean Atlas (WOA09) are used to help constrain the analysis in data sparse areas. The latest analysis, GEOS ODAS5.2, is diagnosed through detailed studies of the statistics of the innovations and analysis departures, comparisons with independent data, and integrated values such as volume transport. Finally, the climatologies of temperature and salinity fields from the Argo era, 2002-2011, are presented and compared with the WOA09.
Time-series analysis of the transcriptome and proteome of Escherichia coli upon glucose repression.
Borirak, Orawan; Rolfe, Matthew D; de Koning, Leo J; Hoefsloot, Huub C J; Bekker, Martijn; Dekker, Henk L; Roseboom, Winfried; Green, Jeffrey; de Koster, Chris G; Hellingwerf, Klaas J
2015-10-01
Time-series transcript and protein profiles were measured upon initiation of carbon catabolite repression (CCR) in Escherichia coli, in order to investigate the extent of post-transcriptional control in this prototypical response. A glucose-limited chemostat culture was used as the CCR-free reference condition. Stopping the pump and simultaneously adding a pulse of glucose, which saturated the cells for at least 1 h, was used to initiate the glucose response. Samples were collected and subjected to quantitative time-series analysis of both the transcriptome (using microarray analysis) and the proteome (through a combination of 15N metabolic labeling and mass spectrometry). Changes in the transcriptome and corresponding proteome were analyzed using statistical procedures designed specifically for time-series data. By comparison of the two data sets, a total of 96 genes were identified that are post-transcriptionally regulated. This gene list provides candidates for future in-depth investigation of the molecular mechanisms involved in post-transcriptional regulation during carbon catabolite repression in E. coli, such as the involvement of small RNAs. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
No association of SORL1 SNPs with Alzheimer's disease.
Minster, Ryan L; DeKosky, Steven T; Kamboh, M Ilyas
2008-08-01
SORL1 is an element of the amyloid precursor protein processing pathway and is therefore a good candidate for affecting Alzheimer's disease (AD) risk. Indeed, there have been reports of associations between variation in SORL1 and AD risk. We examined six statistically significant single-nucleotide polymorphisms from the initial observation in a large Caucasian American case-control cohort (1000 late-onset AD [LOAD] cases and 1000 older controls). Analysis of allele, genotype and haplotype frequencies revealed no association with LOAD risk in our cohort.
Phospholipid Fatty Acid Analysis: Past, Present and Future
NASA Astrophysics Data System (ADS)
Findlay, R. H.
2008-12-01
With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME) and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids with particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased accessibility of the technique, and it has been widely applied to water, sediment, soil and aerosol samples. Whole cell fatty acid analysis, a related but distinct technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to the interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks. Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound-specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM, and the technique promises to be a useful tool for assigning ecological function to microbial populations.
Development of a Research Methods and Statistics Concept Inventory
ERIC Educational Resources Information Center
Veilleux, Jennifer C.; Chapman, Kate M.
2017-01-01
Research methods and statistics are core courses in the undergraduate psychology major. To assess learning outcomes, it would be useful to have a measure that assesses research methods and statistical literacy beyond course grades. In two studies, we developed and provided initial validation results for a research methods and statistical knowledge…
NASA Technical Reports Server (NTRS)
Bauman, William H., III
2010-01-01
The 12-km resolution North American Mesoscale (NAM) model (MesoNAM) is used by the 45th Weather Squadron (45 WS) Launch Weather Officers at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) to support space launch weather operations. The 45 WS tasked the Applied Meteorology Unit (AMU) to conduct an objective statistics-based analysis of MesoNAM output compared to wind tower mesonet observations and then develop an operational tool to display the results. The National Centers for Environmental Prediction began running the current version of the MesoNAM in mid-August 2006. The period of record for the dataset was 1 September 2006 - 31 January 2010. The AMU evaluated MesoNAM hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12 and 18 UTC. The MesoNAM forecast winds, temperature and dew point were compared to the observed values of these parameters from the sensors in the KSC/CCAFS wind tower network. The data sets were stratified by model initialization time, month and onshore/offshore flow for each wind tower. Statistics computed included bias (mean difference), standard deviation of the bias, root mean square error (RMSE) and a hypothesis test for bias = 0. Twelve wind towers located in close proximity to key launch complexes were used for the statistical analysis, with the sensors on the towers positioned at varying heights to include 6 ft, 30 ft, 54 ft, 60 ft, 90 ft, 162 ft, 204 ft and 230 ft, depending on the launch vehicle and associated weather launch commit criteria being evaluated. These twelve wind towers support activities for the Space Shuttle (launch and landing), Delta IV, Atlas V and Falcon 9 launch vehicles. For all twelve towers, the results indicate a diurnal signal in the bias of temperature (T) and a weaker but discernible diurnal signal in the bias of dewpoint temperature (T(sub d)) in the MesoNAM forecasts. Also, the standard deviation of the bias and RMSE of T, T(sub d), wind speed and wind direction indicated that the model error increased with the forecast period for all four parameters. Hypothesis testing uses statistics to determine the probability that a given hypothesis is true. The goal of using the hypothesis test was to determine if the model bias of any of the parameters assessed throughout the model forecast period was statistically zero. For this dataset, if the test produced a value between -1.96 and 1.96 for a data point, the bias at that point was effectively zero and the model forecast for that point was considered to have no error. A graphical user interface (GUI) was developed so the 45 WS would have an operational tool at their disposal that would be easy to navigate among the multiple stratifications of information, including tower locations, month, model initialization times, sensor heights and onshore/offshore flow. The AMU developed the GUI using HyperText Markup Language (HTML) so the tool could be used in most popular web browsers on computers running different operating systems such as Microsoft Windows and Linux.
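The verification statistics listed above (bias, standard deviation of the bias, RMSE, and the test of bias = 0) are simple to compute for any forecast-observation matchup. A minimal Python sketch follows, with synthetic temperature data in place of the MesoNAM/tower pairs; the error magnitudes are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def verify(forecast, observed):
    """Bias, its standard deviation, RMSE, and a test of H0: bias = 0."""
    err = np.asarray(forecast) - np.asarray(observed)
    bias = err.mean()
    sd = err.std(ddof=1)
    rmse = np.sqrt(np.mean(err ** 2))
    t, p = stats.ttest_1samp(err, 0.0)   # |t| <= 1.96 ~ bias effectively zero
    return bias, sd, rmse, t, p

rng = np.random.default_rng(4)
obs = rng.normal(25.0, 3.0, 500)             # tower temperatures (deg C)
fcst = obs + rng.normal(0.4, 1.2, 500)       # model with an assumed warm bias
bias, sd, rmse, t, p = verify(fcst, obs)
print(f"bias = {bias:.2f}, sd = {sd:.2f}, RMSE = {rmse:.2f}, t = {t:.2f}")
```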
ERIC Educational Resources Information Center
Selmer, Sarah J.; Rye, James A.; Malone, Elizabeth; Fernandez, Danielle; Trebino, Kathryn
2014-01-01
Statistical literacy is essential to scientific literacy, and the quest for such is best initiated in the elementary grades. The "Next Generation Science Standards and the Common Core State Standards for Mathematics" set forth practices (e.g., asking questions, using tools strategically to analyze and interpret data) and content (e.g.,…
NASA Astrophysics Data System (ADS)
Srinivasan, Samuelraj; Prabhu, Vijendra; Chandra, Subhash; Koshy, Shalini; Acharya, Shashidhar; Mahato, Krishna K.
2014-02-01
The present era of minimally invasive dentistry emphasizes the early detection and remineralization of initial enamel caries. Ozone has been shown to reverse initial demineralization before the integrity of the enamel surface is lost. Nano-hydroxyapatite is a proven remineralizing agent for early enamel caries. In the present study, the effect of ozone in enhancing the remineralizing potential of nano-hydroxyapatite on artificially demineralized enamel was investigated using laser-induced fluorescence. Thirty-five sound human premolars were collected from healthy subjects undergoing orthodontic treatment. Fluorescence was recorded by exciting the mesial surfaces using a 325 nm He-Cd laser with 2 mW power. Tooth specimens were subjected to demineralization to create initial enamel caries, after which the specimens were divided into three groups: ozone (ozonated water for 2 min), no ozone, and artificial saliva. The remineralization regimen was followed for 3 weeks. The fluorescence spectra of the specimens were recorded for all three experimental groups at baseline, after demineralization and after remineralization. The average spectrum for each experimental group was used for statistical analysis. Fluorescence intensities of ozone-treated specimens following remineralization were higher than those of artificial saliva, and this difference was found to be statistically significant (P < 0.0001). In summary, ozone enhanced the remineralizing potential of nano-hydroxyapatite, and laser-induced fluorescence was found to be effective in assessing surface mineral changes in enamel. Ozone can be considered an effective agent in reversing initial enamel caries, thereby preventing the tooth from entering the repetitive restorative cycle.
Wisniowski, Brendan; Barnes, Mary; Jenkins, Jason; Boyne, Nicholas; Kruger, Allan; Walker, Philip J
2011-09-01
Endovascular abdominal aortic aneurysm (AAA) repair (EVAR) has been associated with lower operative mortality and morbidity than open surgery but comparable long-term mortality and higher delayed complication and reintervention rates. Attention has therefore been directed to identifying preoperative and operative variables that influence outcomes after EVAR. Risk-prediction models, such as the EVAR Risk Assessment (ERA) model, have also been developed to help surgeons plan EVAR procedures. The aims of this study were (1) to describe outcomes of elective EVAR at the Royal Brisbane and Women's Hospital (RBWH), (2) to identify preoperative and operative variables predictive of outcomes after EVAR, and (3) to externally validate the ERA model. All elective EVAR procedures at the RBWH before July 1, 2009, were reviewed. Descriptive analyses were performed to determine the outcomes. Univariate and multivariate analyses were performed to identify preoperative and operative variables predictive of outcomes after EVAR. Binomial logistic regression analyses were used to externally validate the ERA model. Before July 1, 2009, 197 patients (172 men), with a mean age of 72.8 years, underwent elective EVAR at the RBWH. Operative mortality was 1.0%. Survival was 81.1% at 3 years and 63.2% at 5 years. Multivariate analysis showed predictors of survival were age (P = .0126), American Society of Anesthesiologists (ASA) score (P = .0180), and chronic obstructive pulmonary disease (P = .0348) at 3 years and age (P = .0103), ASA score (P = .0006), renal failure (P = .0048), and serum creatinine (P = .0022) at 5 years. Aortic branch vessel score was predictive of initial (30-day) type II endoleak (P = .0015). AAA tortuosity was predictive of midterm type I endoleak (P = .0251). Female sex was associated with lower rates of initial clinical success (P = .0406). The ERA model fitted RBWH data well for early death (C statistic = .906), 3-year survival (C statistic = .735), 5-year survival (C statistic = .800), and initial type I endoleak (C statistic = .850). The outcomes of elective EVAR at the RBWH are broadly consistent with those of a nationwide Australian audit and recent randomized trials. Age and ASA score are independent predictors of midterm survival after elective EVAR. The ERA model predicts mortality-related outcomes and initial type I endoleak well for RBWH elective EVAR patients. Copyright © 2011 Society for Vascular Surgery. All rights reserved.
A new similarity index for nonlinear signal analysis based on local extrema patterns
NASA Astrophysics Data System (ADS)
Niknazar, Hamid; Motie Nasrabadi, Ali; Shamsollahi, Mohammad Bagher
2018-02-01
Common similarity measures for time-domain signals, such as cross-correlation and Symbolic Aggregate approximation (SAX), are not appropriate for nonlinear signal analysis because of the high sensitivity of nonlinear systems to initial points. A similarity measure for nonlinear signal analysis must therefore be invariant to initial points and quantify similarity by considering the main dynamics of the signals. The statistical behavior of local extrema (SBLE) method was previously proposed to address this problem. The SBLE similarity index uses quantized amplitudes of local extrema to quantify the dynamical similarity of signals by considering patterns of sequential local extrema. By adding time information of local extrema and fuzzifying the quantized values, this work proposes a new similarity index for nonlinear and long-term signal analysis that extends the SBLE method. These new features provide more information about the signals, and fuzzification reduces noise sensitivity. A number of practical tests on synthetic data demonstrate the ability of the method in nonlinear signal clustering and classification. In addition, epileptic seizure detection based on electroencephalography (EEG) signal processing was performed using the proposed similarity index, demonstrating the method's potential as a real-world application tool.
Obuchowski, Nancy A; Buckler, Andrew; Kinahan, Paul; Chen-Mayer, Heather; Petrick, Nicholas; Barboriak, Daniel P; Bullen, Jennifer; Barnhart, Huiman; Sullivan, Daniel C
2016-04-01
A major initiative of the Quantitative Imaging Biomarker Alliance is to develop standards-based documents called "Profiles," which describe one or more technical performance claims for a given imaging modality. The term "actor" denotes any entity (device, software, or person) whose performance must meet certain specifications for the claim to be met. The objective of this paper is to present the statistical issues in testing actors' conformance with the specifications. In particular, we present the general rationale and interpretation of the claims, the minimum requirements for testing whether an actor achieves the performance requirements, the study designs used for testing conformity, and the statistical analysis plan. We use three examples to illustrate the process: apparent diffusion coefficient in solid tumors measured by MRI, change in Perc 15 as a biomarker for the progression of emphysema, and percent change in solid tumor volume by computed tomography as a biomarker for lung cancer progression. Copyright © 2016 The Association of University Radiologists. All rights reserved.
Stec, Małgorzata; Grzebyk, Mariola
2018-01-01
The European Union (EU), striving for economic dominance in the global market, has prepared comprehensive development programmes, initially the Lisbon Strategy and then the Europe 2020 Strategy. The attainment of the strategic goals included in these prospective development programmes is intended to transform the EU into the most competitive knowledge-based economy in the world. This paper presents a statistical evaluation of the progress made by EU member states in meeting Europe 2020. As the basis of the assessment, the authors propose a general synthetic measure in dynamic terms, which allows EU member states to be compared objectively across 10 major statistical indicators. The results indicate that most EU countries show only average progress in realising Europe's development programme, which suggests that the goals may not be achieved in the prescribed time. It is particularly important to monitor the implementation of Europe 2020 so as to arrive at the right decisions to guarantee the accomplishment of the EU's development strategy.
Nakata, Norio; Ohta, Tomoyuki; Nishioka, Makiko; Takeyama, Hiroshi; Toriumi, Yasuo; Kato, Kumiko; Nogi, Hiroko; Kamio, Makiko; Fukuda, Kunihiko
2015-11-01
This study was performed to evaluate the diagnostic utility of quantitative analysis of benign and malignant breast lesions using contrast-enhanced sonography. Contrast-enhanced sonography using the perflubutane-based contrast agent Sonazoid (Daiichi Sankyo, Tokyo, Japan) was performed in 94 pathologically proven palpable breast mass lesions, which could be depicted with B-mode sonography. Quantitative analyses using the time-intensity curve on contrast-enhanced sonography were performed in 5 region of interest (ROI) types (manually traced ROI and circular ROIs of 5, 10, 15, and 20 mm in diameter). The peak signal intensity, initial slope, time to peak, positive enhancement integral, and wash-out ratio were investigated in each ROI. There were significant differences between benign and malignant lesions in the time to peak (P < .05), initial slope (P < .001), and positive enhancement integral (P < .05) for the manual ROI. Significant differences were found between benign and malignant lesions in the time to peak (P < .05) for the 5-mm ROI; the time to peak (P < .05) and initial slope (P < .05) for the 10-mm ROI; absolute values of the peak signal intensity (P < .05), time to peak (P < .01), and initial slope (P < .005) for the 15-mm ROI; and the time to peak (P < .05) and initial slope (P < .05) for the 20-mm ROI. There were no statistically significant differences in any wash-out ratio values for the 5 ROI types. Kinetic analysis using contrast-enhanced sonography is useful for differentiation between benign and malignant breast lesions. © 2015 by the American Institute of Ultrasound in Medicine.
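The kinetic descriptors compared above can be extracted from any sampled time-intensity curve. The Python sketch below is a minimal illustration; the operational definitions (initial slope as the steepest pre-peak rise, wash-out ratio relative to the last sample) are assumptions, since the software's exact definitions are not given in the abstract.

```python
import numpy as np

def tic_metrics(t, s, late_idx=-1):
    """Kinetic descriptors of a contrast time-intensity curve."""
    s = np.asarray(s, dtype=float) - s[0]    # enhancement over baseline
    peak_idx = int(np.argmax(s))
    peak = s[peak_idx]
    # Steepest rise before the peak, as a stand-in for the initial slope
    slope = np.max(np.diff(s[: peak_idx + 1]) / np.diff(t[: peak_idx + 1]))
    pei = np.trapz(s, t)                     # positive enhancement integral
    washout = (peak - s[late_idx]) / peak    # relative loss from the peak
    return dict(peak=peak, time_to_peak=t[peak_idx],
                initial_slope=slope, pei=pei, washout_ratio=washout)

t = np.arange(0, 60, 2.0)                    # seconds after injection
s = 10 + 40 * (1 - np.exp(-t / 8)) * np.exp(-t / 45)   # synthetic curve
print(tic_metrics(t, s))
```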
COBRA ATD minefield detection model initial performance analysis
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.
Gay, Hawkins C; Baldridge, Abigail S; Huffman, Mark D
2017-12-01
Data sharing is an expanding initiative for enhancing trust in the clinical research enterprise. To evaluate the feasibility, process, and outcomes of a reproduction analysis of the THERMOCOOL SMARTTOUCH Catheter for the Treatment of Symptomatic Paroxysmal Atrial Fibrillation (SMART-AF) trial using shared clinical trial data. A reproduction analysis of the SMART-AF trial was performed using the data sets, data dictionary, case report file, and statistical analysis plan from the original trial, accessed through the Yale Open Data Access Project using the SAS Clinical Trials Data Transparency platform. SMART-AF was a multicenter, single-arm trial evaluating the effectiveness and safety of an irrigated, contact force-sensing catheter for ablation of drug-refractory, symptomatic paroxysmal atrial fibrillation in 172 participants recruited from 21 sites between June 2011 and December 2011. Analysis of the data was conducted between December 2016 and April 2017. Effectiveness outcomes included freedom from atrial arrhythmias after ablation and the proportion of participants without any arrhythmia recurrence over the 12 months of follow-up after a 3-month blanking period. Safety outcomes included major adverse device- or procedure-related events. The SMART-AF trial participants' mean age was 58.7 (10.8) years, and 72% were men. The time from initial proposal submission to final analysis was 11 months. Freedom from atrial arrhythmias at 12 months postprocedure was similar compared with the primary study report (74.0%; 95% CI, 66.0-82.0 vs 76.4%; 95% CI, 68.7-84.1). The reproduction analysis success rate was higher than in the primary study report (65.8%; 95% CI, 56.5-74.2 vs 75.6%; 95% CI, 67.2-82.5). Adverse events were minimal and similar between the 2 analyses, but contact force range or regression models could not be reproduced. The feasibility of a reproduction analysis of the SMART-AF trial was demonstrated through an academic data-sharing platform. Data sharing can be facilitated through incentivizing collaboration, sharing statistical code, and creating more decentralized data-sharing platforms with fewer restrictions on data access.
Resource utilization with insulin pump therapy for type 2 diabetes mellitus.
Lynch, Peter M; Riedel, Aylin Altan; Samant, Navendu; Fan, Ying; Peoples, Tim; Levinson, Jennifer; Lee, Scott W
2010-01-01
To evaluate the effects of switching from multiple daily injection (MDI) therapy to insulin pump therapy, also called continuous subcutaneous insulin infusion (CSII), on antidiabetic drug and healthcare resource utilization. This study was a retrospective analysis of administrative claims data from a large, geographically diverse health plan in the United States from January 1, 2005, through April 30, 2008. Changes in antidiabetic drug use, antidiabetic drug switching and augmentation, and healthcare utilization during the baseline period and after CSII initiation were assessed using paired t tests. There were 3649 possible subjects, of whom 943 met the criteria for analysis. The mean number of antidiabetic drugs used decreased by 46% after CSII initiation, and the mean reduction in antidiabetic drug utilization was 0.67; both were statistically significant. More than one-third of subjects who were taking antidiabetic drugs before CSII initiation discontinued oral therapy after CSII initiation. The number of subjects using multiple antidiabetic drugs significantly decreased after CSII initiation by 58%, and rates of switching or augmenting significantly decreased from 42% at baseline to 25% after CSII initiation. The rates of emergency department visits and inpatient admissions significantly decreased, and the rate of ambulatory visits significantly increased. CSII was associated with significant decreases in antidiabetic drug and healthcare resource utilization, contributing to stability of care. The evidence from this study indicates that CSII should be considered as an option for patients with type 2 diabetes mellitus who are using MDI and are experiencing a high degree of antidiabetic drug and healthcare resource utilization.
Code of Federal Regulations, 2011 CFR
2011-04-01
... as a nationally recognized statistical rating organization. 240.17g-1 Section 240.17g-1 Commodity and... Statistical Rating Organizations § 240.17g-1 Application for registration as a nationally recognized statistical rating organization. (a) Initial application. A credit rating agency applying to the Commission to...
Code of Federal Regulations, 2010 CFR
2010-04-01
... as a nationally recognized statistical rating organization. 240.17g-1 Section 240.17g-1 Commodity and... Statistical Rating Organizations § 240.17g-1 Application for registration as a nationally recognized statistical rating organization. (a) Initial application. A credit rating agency applying to the Commission to...
Watanabe, Melina Mayumi; Rodrigues, José Augusto; Marchi, Giselle Maria; Ambrosano, Gláucia Maria Bovi
2005-06-01
The aim of this study was to evaluate, in vitro, the cariostatic effect of whitening toothpastes. Ninety-five dental fragments were obtained from nonerupted third molars. The fragments were embedded in polystyrene resin and sequentially polished with abrasive papers (400-, 600-, and 1,000-grit) and diamond pastes of 6, 3, and 1 μm. The fragments were assigned to five groups according to toothpaste treatment: G1 = Rembrandt Plus with Peroxide; G2 = Crest Dual Action Whitening; G3 = Aquafresh Whitening Triple Protection; and the control groups: G4 = Sensodyne Original (without fluoride); G5 = Sensodyne Sodium Bicarbonated (with fluoride). Initial enamel microhardness evaluations were performed. For 2 weeks the fragments were submitted daily to a de-remineralization cycle followed by a 10-minute treatment with toothpaste slurry. After that, the final microhardness tests were performed. The percentage of mineral loss of enamel was determined for statistical analysis. Analysis of variance and the Tukey test were applied. The results did not show statistically significant differences in mineral loss among groups G1, G2, G3, and G5, which differed statistically from G4 (the toothpaste without fluoride). G4 showed the highest mineral loss (P ≤ .05). The whitening toothpastes evaluated showed a cariostatic effect similar to that of regular, nonwhitening toothpaste.
Multivariate Analysis and Prediction of Dioxin-Furan ...
Peer Review Draft of Regional Methods Initiative Final Report. Dioxins, which are bioaccumulative and environmentally persistent, pose an ongoing risk to human and ecosystem health. Fish constitute a significant source of dioxin exposure for humans and fish-eating wildlife. Current dioxin analytical methods are costly, time-consuming, and produce hazardous by-products. A Danish team developed a novel, multivariate statistical methodology based on the covariance of dioxin-furan congener Toxic Equivalences (TEQs) and fatty acid methyl esters (FAMEs) and applied it to North Atlantic Ocean fishmeal samples. The goal of the current study was to attempt to extend this Danish methodology to 77 whole and composite fish samples from three trophic groups: predator (whole largemouth bass), benthic (whole flathead and channel catfish) and forage fish (composite bluegill, pumpkinseed and green sunfish) from two dioxin-contaminated rivers (Pocatalico R. and Kanawha R.) in West Virginia, USA. Multivariate statistical analyses, including Principal Components Analysis (PCA), Hierarchical Clustering, and Partial Least Squares Regression (PLS), were used to assess the relationship between the FAMEs and TEQs in these dioxin-contaminated freshwater fish from the Kanawha and Pocatalico Rivers. These three multivariate statistical methods all confirm that the pattern of Fatty Acid Methyl Esters (FAMEs) in these freshwater fish covaries with and is predictive of the WHO TE
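Of the three multivariate methods named, PLS carries the predictive step (TEQs from FAME profiles). A minimal Python sketch with scikit-learn follows; the data are random with an invented FAME-TEQ relationship, matching only the 77-sample shape of the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(5)
n_fish, n_fames = 77, 20
X = rng.normal(size=(n_fish, n_fames))       # hypothetical FAME profiles
teq = X[:, :3] @ np.array([0.8, -0.5, 0.3]) + rng.normal(0, 0.3, n_fish)

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, X, teq, cv=7).ravel()
r2 = 1 - np.sum((teq - pred) ** 2) / np.sum((teq - teq.mean()) ** 2)
print(f"cross-validated R^2 = {r2:.2f}")
```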
Forecasts of non-Gaussian parameter spaces using Box-Cox transformations
NASA Astrophysics Data System (ADS)
Joachimi, B.; Taylor, A. N.
2011-09-01
Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, and the Fisher matrix calculation is performed on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, at marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between the matter density parameter and the normalization of matter density fluctuations is reproduced for several cases, and the capability of weak-lensing three-point statistics to break this degeneracy is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
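The core mechanic, Gaussianizing a skewed posterior with a Box-Cox transformation and then working with Gaussian statistics in the transformed space, can be shown in a few lines of Python; the log-normal mock posterior is an assumption standing in for a real likelihood evaluation.

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(6)
# Skewed mock posterior samples for a single positive parameter
samples = rng.lognormal(mean=0.0, sigma=0.5, size=20000)

# Box-Cox lambda chosen by maximum likelihood; requires positive values
transformed, lam = stats.boxcox(samples)
print(f"lambda = {lam:.3f}, skew: {stats.skew(samples):.2f} -> "
      f"{stats.skew(transformed):.2f}")

# Gaussian (Fisher-like) summary in the transformed space ...
mu, sigma = transformed.mean(), transformed.std(ddof=1)
# ... mapped back gives an asymmetric interval in the original space
lo, hi = inv_boxcox(np.array([mu - sigma, mu + sigma]), lam)
print(f"68% interval in original space: [{lo:.3f}, {hi:.3f}]")
```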
Probabilistic models for reactive behaviour in heterogeneous condensed phase media
NASA Astrophysics Data System (ADS)
Baer, M. R.; Gartling, D. K.; DesJardin, P. E.
2012-02-01
This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
The influence of control group reproduction on the statistical ...
Because of various Congressional mandates to protect the environment from endocrine disrupting chemicals (EDCs), the United States Environmental Protection Agency (USEPA) initiated the Endocrine Disruptor Screening Program. In the context of this framework, the Office of Research and Development within the USEPA developed the Medaka Extended One Generation Reproduction Test (MEOGRT) to characterize the endocrine action of a suspected EDC. One important endpoint of the MEOGRT is the fecundity of breeding pairs of medaka. Power analyses were conducted to determine the number of replicates needed in proposed test designs and to determine the effects that varying reproductive parameters (e.g. mean fecundity, variance, and days with no egg production) will have on the statistical power of the test. A software tool, the MEOGRT Reproduction Power Analysis Tool, was developed to expedite these power analyses by both calculating estimates of the needed reproductive parameters (e.g. population mean and variance) and performing the power analysis under user-specified scenarios. The manuscript illustrates how the reproductive performance of the control medaka that are used in a MEOGRT influences statistical power, and therefore the successful implementation of the protocol. Example scenarios, based upon medaka reproduction data collected at MED, are discussed that bolster the recommendation that facilities planning to implement the MEOGRT should have a culture of medaka with hi
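While the MEOGRT Reproduction Power Analysis Tool itself is not reproduced here, the underlying Monte Carlo power calculation can be sketched as follows; the negative binomial model for daily egg counts, the 21-day breeding window, the two-sample t-test, and all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def power(n_reps, control_mu, effect, disp, n_sims=2000, alpha=0.05, seed=7):
    """Monte Carlo power to detect a relative drop in mean fecundity."""
    rng = np.random.default_rng(seed)

    def pair_means(mu):
        # Daily egg counts per breeding pair over 21 days, negative binomial
        p = disp / (disp + mu)
        return rng.negative_binomial(disp, p, size=(n_reps, 21)).mean(axis=1)

    hits = 0
    for _ in range(n_sims):
        control = pair_means(control_mu)
        treated = pair_means(control_mu * (1 - effect))
        if stats.ttest_ind(control, treated).pvalue < alpha:
            hits += 1
    return hits / n_sims

# e.g. 12 pairs per group, 30% fecundity reduction, moderate overdispersion
print(power(n_reps=12, control_mu=20.0, effect=0.3, disp=5.0))
```

Varying control_mu or the dispersion in such a simulation shows directly how control reproductive performance drives statistical power, which is the point the study makes.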
Lepesqueur, Laura Soares; de Figueiredo, Viviane Maria Gonçalves; Ferreira, Leandro Lameirão; Sobrinho, Argemiro Soares da Silva; Massi, Marcos; Bottino, Marco Antônio; Nogueira Junior, Lafayette
2015-01-01
To determine the torque maintained after mechanical cycling by abutment screws coated with diamondlike carbon and with diamondlike carbon doped with diamond nanoparticles, in external and internal hex connections. Sixty implants were divided into six groups according to the type of connection (external or internal hex) and the type of abutment screw (uncoated, coated with diamondlike carbon, and coated with diamondlike carbon doped with diamond nanoparticles). The implants were inserted into polyurethane resin, and nickel-chrome crowns were cemented on the implants. The crowns had a hole for access to the screw. The initial torque and the torque after mechanical cycling were measured, and the percentage of torque maintained was evaluated. Statistical analysis was performed using one-way analysis of variance and the Tukey test, with a significance level of 5%. The largest torque value was maintained in uncoated screws with external hex connections, a finding that was statistically significant (P = .0001). No statistically significant differences were seen between the groups with and without coating in maintaining torque for screws with internal hex connections (P = .5476). After mechanical cycling, the diamondlike carbon coatings, with or without diamond doping, on the abutment screws showed no improvement in maintaining torque in external and internal hex connections.
Hauben, Manfred; Hung, Eric Y.
2016-01-01
Introduction: There is an interest in methodologies to expeditiously detect credible signals of drug-induced pancreatitis. An example is the reported signal of pancreatitis with rasburicase emerging from a study [the ‘index publication’ (IP)] combining quantitative signal detection findings from a spontaneous reporting system (SRS) and electronic health records (EHRs). The signal was reportedly supported by a clinical review with a case series manuscript in progress. The reported signal is noteworthy, being initially classified as a false-positive finding for the chosen reference standard, but reclassified as a ‘clinically supported’ signal. Objective: This paper has dual objectives: to revisit the signal of rasburicase and acute pancreatitis and extend the original analysis via reexamination of its findings, in light of more contemporary data; and to motivate discussions on key issues in signal detection and evaluation, including recent findings from a major international pharmacovigilance research initiative. Methodology: We used the same methodology as the IP, including the same disproportionality analysis software/dataset for calculating observed to expected reporting frequencies (O/Es), Medical Dictionary for Regulatory Activities Preferred Term, and O/E metric/threshold combination defining a signal of disproportionate reporting. Baseline analysis results prompted supplementary analyses using alternative analytical choices. We performed a comprehensive literature search to identify additional published case reports of rasburicase and pancreatitis. Results: We could not replicate positive findings (e.g. a signal or statistic of disproportionate reporting) from the SRS data using the same algorithm, software, dataset and vendor specified in the IP. The reporting association was statistically highlighted in the default and supplemental analyses when more sensitive forms of disproportionality analysis were used. Two of three reports in the FAERS database were assessed as likely duplicate reports. We did not identify any additional reports in the FAERS corresponding to the three cases identified in the IP using EHRs. We did not identify additional published reports of pancreatitis associated with rasburicase. Discussion: Our exercise stimulated interesting discussions of key points in signal detection and evaluation, including causality assessment, signal detection algorithm performance, pharmacovigilance terminology, duplicate reporting, mechanisms for communicating signals, the structure of the FAERS database, and recent results from a major international pharmacovigilance research initiative. PMID:27298720
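For readers unfamiliar with disproportionality analysis, the sketch below computes an observed-to-expected ratio and a proportional reporting ratio (PRR) from a 2×2 report table. The IP used a particular commercial O/E metric and threshold; the PRR rule shown here is a common alternative, and the counts are invented, not FAERS figures.

```python
def disproportionality(a, b, c, d):
    """2x2 contingency table for a (drug, event) pair in an SRS database:
    a = drug & event, b = drug & other events,
    c = other drugs & event, d = other drugs & other events."""
    n = a + b + c + d
    expected = (a + b) * (a + c) / n           # expected count under independence
    prr = (a / (a + b)) / (c / (c + d))        # proportional reporting ratio
    signal = prr >= 2 and a >= 3               # one common screening threshold
    return a / expected, prr, signal

# Illustrative counts only.
o_e, prr, signal = disproportionality(3, 250, 4000, 1_200_000)
print(f"O/E = {o_e:.2f}, PRR = {prr:.2f}, signal = {signal}")
```

Note how sensitive the flag is to `a`: with only a handful of reports, duplicate records of the kind identified in this reanalysis can make or break a signal of disproportionate reporting.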
NASA Astrophysics Data System (ADS)
Singh, Jitendra; Sekharan, Sheeba; Karmakar, Subhankar; Ghosh, Subimal; Zope, P. E.; Eldho, T. I.
2017-04-01
Mumbai, the commercial and financial capital of India, experiences intense rain episodes every monsoon; erratic rainfall patterns and an urban heat-island effect driven by escalating urbanization have made the city increasingly vulnerable to frequent flooding. After the infamous Mumbai torrential rains of 2005, when only two rain gauging stations existed, the governing civic body, the Municipal Corporation of Greater Mumbai (MCGM), installed 26 automatic weather stations (AWS) in June 2006 (MCGM 2007), later increased to 60 AWS. A comprehensive statistical analysis of the spatio-temporal pattern of rainfall over Mumbai, or any other coastal city in India, had not been attempted before. In the current study, a thorough analysis of the available 2006-2014 rainfall data from these stations was performed; the 2013-2014 sub-hourly data from 26 AWS were found suitable for further analyses owing to their consistency and continuity. The correlogram cloud indicated no pattern of significant correlation from the closest to the farthest gauging station relative to the base station, an impression also supported by the semivariogram plots. Gini index values, a statistical measure of temporal non-uniformity, exceeded 0.8 at a clear majority of stations and showed an increasing trend at most of them, leading us to conclude that the day-to-day inconsistency of rainfall gradually increases as the monsoon progresses. Interestingly, night-time rainfall was lower than daytime rainfall. The pattern-less, high spatio-temporal variation observed in the Mumbai rainfall data signifies the futility of applying advanced statistical techniques in isolation, and thus calls for the simultaneous inclusion of physics-based models such as mesoscale numerical weather prediction systems, particularly the Weather Research and Forecasting (WRF) model.
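The Gini index used above as a measure of temporal non-uniformity can be computed directly from a season of daily totals. A minimal sketch, with synthetic gamma-distributed rainfall standing in for the AWS records:

```python
import numpy as np

def gini(x):
    """Gini index of daily rainfall totals: 0 = perfectly even rainfall,
    values near 1 = rain concentrated in a few days (dry days are zeros)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

rng = np.random.default_rng(2)
season = rng.gamma(shape=0.2, scale=40.0, size=120)  # skewed synthetic monsoon
print(f"Gini = {gini(season):.2f}")  # highly skewed seasons land near 0.8
```

Values above 0.8, as reported for the Mumbai gauges, indicate that most of the seasonal total falls on a small fraction of days.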
Chaikh, Abdulhamid; Balosso, Jacques
2016-12-01
This study proposes a statistical process to compare treatment plans issued from different irradiation techniques or different treatment phases. The approach aims to provide arguments for discussion about the impact on clinical results of any condition able to significantly alter dosimetric or ballistic data. The principles of the statistical investigation are presented in the framework of a clinical example based on 40 fields of radiotherapy for lung cancers. Two treatment plans were generated for each patient, introducing a change in dose distribution through variation of the lung density correction. Data from the 2D gamma index (γ), including the pixels with γ≤1, were used to determine the capability index (Cp) and the acceptability index (Cpk) of the process. To measure the strength of the relationship between the γ passing rates and the Cp and Cpk indices, Spearman's non-parametric rank test was used to calculate P values. The comparison between reference and tested plans showed that 95% of pixels had γ≤1 with criteria (6%, 6 mm). The values of the Cp and Cpk indices were lower than one, indicating a significant dose difference. The data showed a strong correlation between the γ passing rates and the indices (correlation coefficient above 0.8). The statistical analysis using Cp and Cpk shows the significance of dose differences resulting from two plans in radiotherapy. These indices can be used in adaptive radiotherapy to measure the difference between the initial plan and the daily delivered plan. A significant change in dose distribution raises the question of whether to continue treating the patient with the initial plan or to adjust it.
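A minimal sketch of the capability indices applied to gamma-index data follows. The symmetric tolerance interval of ±1 (corresponding to γ≤1) and the synthetic pixel values are assumptions; the study's exact convention for signed values may differ.

```python
import numpy as np

def capability(values, lsl=-1.0, usl=1.0):
    """Process-capability indices over per-pixel gamma values, with the
    tolerance interval [lsl, usl] standing for the gamma <= 1 criterion."""
    mu, sigma = np.mean(values), np.std(values, ddof=1)
    cp = (usl - lsl) / (6 * sigma)                 # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # capability with centering
    return cp, cpk

rng = np.random.default_rng(3)
pixels = rng.normal(0.1, 0.45, 2000)  # synthetic signed gamma values
cp, cpk = capability(pixels)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # values < 1 flag a dose difference
```

As in the study, Cp and Cpk below one indicate that the compared plans differ by more than the chosen gamma tolerance allows.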
Using HIV&AIDS statistics in pre-service Mathematics Education to integrate HIV&AIDS education.
van Laren, Linda
2012-12-01
In South Africa, the HIV&AIDS education policy documents indicate opportunities for integration across disciplines/subjects. There are different interpretations of integration/inclusion and mainstreaming HIV&AIDS education, and numerous levels of integration. Integration ensures that learners experience the disciplines/subjects as being linked and related, and integration is required to support and expand the learners' opportunities to attain skills, acquire knowledge and develop attitudes and values across the curriculum. This study makes use of self-study methodology where I, a teacher educator, aim to improve my practice through including HIV&AIDS statistics in Mathematics Education. This article focuses on how I used HIV&AIDS statistics to facilitate pre-service teacher reflection and introduce them to integration of HIV&AIDS education across the curriculum. After pre-service teachers were provided with HIV statistics, they drew a pie chart which graphically illustrated the situation and reflected on issues relating to HIV&AIDS. Three themes emerged from the analysis of their reflections. The themes relate to the need for further HIV&AIDS education, the changing pastoral role of teachers and the changing context of teaching. This information indicates that the use of statistics is an appropriate means of initiating the integration of HIV&AIDS education into the academic curriculum.
Functional brain networks for learning predictive statistics.
Giorgio, Joseph; Karlaftis, Vasilis M; Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew; Kourtzi, Zoe
2017-08-18
Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. This skill relies on extracting regular patterns in space and time by mere exposure to the environment (i.e., without explicit feedback). Yet, we know little about the functional brain networks that mediate this type of statistical learning. Here, we test whether changes in the processing and connectivity of functional brain networks due to training relate to our ability to learn temporal regularities. By combining behavioral training and functional brain connectivity analysis, we demonstrate that individuals adapt to the environment's statistics as they change over time from simple repetition to probabilistic combinations. Further, we show that individual learning of temporal structures relates to decision strategy. Our fMRI results demonstrate that learning-dependent changes in fMRI activation within and functional connectivity between brain networks relate to individual variability in strategy. In particular, extracting the exact sequence statistics (i.e., matching) relates to changes in brain networks known to be involved in memory and stimulus-response associations, while selecting the most probable outcomes in a given context (i.e., maximizing) relates to changes in frontal and striatal networks. Thus, our findings provide evidence that dissociable brain networks mediate individual ability in learning behaviorally-relevant statistics.
Harris, Patricia RE; Stein, Phyllis K; Fung, Gordon L; Drew, Barbara J
2013-01-01
Background: We sought to examine the prognostic value of heart rate turbulence derived from electrocardiographic recordings initiated in the emergency department for patients with non-ST elevation myocardial infarction (NSTEMI) or unstable angina. Methods: Twenty-four-hour Holter recordings were started in patients with cardiac symptoms approximately 45 minutes after arrival in the emergency department. Patients subsequently diagnosed with NSTEMI or unstable angina who had recordings with ≥18 hours of sinus rhythm and sufficient data to compute Thrombolysis In Myocardial Infarction (TIMI) risk scores were chosen for analysis (n = 166). Endpoints were emergent re-entry to the cardiac emergency department and/or death at 30 days and one year. Results: In Cox regression models, heart rate turbulence and TIMI risk scores together were significant predictors of 30-day (model chi-square 13.200, P = 0.001, C-statistic 0.725) and one-year (model chi-square 31.160, P < 0.001, C-statistic 0.695) endpoints, outperforming either measure alone. Conclusion: Measurement of heart rate turbulence, initiated upon arrival at the emergency department, may provide additional incremental value in the risk assessment for patients with NSTEMI or unstable angina. PMID:23976860
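A Cox model combining heart rate turbulence with the TIMI score, in the spirit of the study's analysis, might look like the sketch below using the lifelines library. The column names and the synthetic survival data are assumptions, not the study dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 166
df = pd.DataFrame({
    "hrt_abnormal": rng.integers(0, 2, n),  # hypothetical HRT category
    "timi_score": rng.integers(0, 7, n),    # hypothetical TIMI risk score
})
# Synthetic one-year follow-up: higher risk -> earlier events.
risk = 0.4 * df.hrt_abnormal + 0.3 * df.timi_score
df["time_days"] = rng.exponential((365 / np.exp(risk - risk.mean())).to_numpy())
df["event"] = (df.time_days < 365).astype(int)
df.loc[df.event == 0, "time_days"] = 365  # administrative censoring at 1 year

cph = CoxPHFitter()
cph.fit(df, duration_col="time_days", event_col="event")
print(cph.summary[["coef", "p"]])
print("C-statistic:", cph.concordance_index_)
```

The printed concordance index plays the same role as the C-statistics quoted above: a summary of how well the combined predictors rank patients by event risk.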
Many-body localization of bosons in optical lattices
NASA Astrophysics Data System (ADS)
Sierant, Piotr; Zakrzewski, Jakub
2018-04-01
Many-body localization for a system of bosons trapped in a one-dimensional lattice is discussed. Two models that may be realized for cold atoms in optical lattices are considered. The model with a random on-site potential is compared with the previously introduced random-interactions model. While the origin and character of the disorder in the two systems differ, they show interestingly similar properties. In particular, many-body localization appears for sufficiently large disorder, as verified by the time evolution of initial density-wave states as well as by statistical properties of energy levels for small system sizes. Starting with different initial states, we observe that the localization properties are energy dependent, which reveals an inverted many-body localization edge in both systems (a finding also verified by statistical analysis of the energy spectrum). Moreover, we consider the computationally challenging regime of the transition between many-body localized and extended phases, where we observe a characteristic algebraic decay of density correlations that may be attributed to subdiffusion (and Griffiths-like regions) in the studied systems. Ergodicity breaking in the disordered Bose–Hubbard models is compared with the slowing down of the time evolution of the clean system at large interactions.
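One standard statistical property of energy levels used in MBL studies is the mean adjacent-gap ratio; the paper does not necessarily use this exact statistic, so the sketch below is illustrative only.

```python
import numpy as np

def mean_gap_ratio(energies):
    """Mean adjacent-gap ratio <r>: ~0.386 for Poisson level statistics
    (localized phase) and ~0.531 for GOE statistics (ergodic phase)."""
    gaps = np.diff(np.sort(energies))
    r = np.minimum(gaps[1:], gaps[:-1]) / np.maximum(gaps[1:], gaps[:-1])
    return r.mean()

rng = np.random.default_rng(5)
poisson_like = np.cumsum(rng.exponential(1.0, 5000))  # uncorrelated levels
print(f"<r> = {mean_gap_ratio(poisson_like):.3f}")    # close to 0.386
```

Applied to exact-diagonalization spectra at increasing disorder strength, the drift of <r> from the ergodic toward the Poisson value is a common way to locate the many-body localization transition.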
Statistical Downscaling of WRF-Chem Model: An Air Quality Analysis over Bogota, Colombia
NASA Astrophysics Data System (ADS)
Kumar, Anikender; Rojas, Nestor
2015-04-01
Statistical downscaling is a technique used to extract high-resolution information from regional-scale variables produced by coarse-resolution models such as chemical transport models (CTMs). The fully coupled WRF-Chem (Weather Research and Forecasting with Chemistry) model was used to simulate air quality over Bogota, a tropical Andean megacity located on a high-altitude plateau in the middle of very complex terrain. The WRF-Chem model was adopted for simulating hourly ozone concentrations. The computational domains comprised 120×120×32, 121×121×32, and 121×121×32 grid points with horizontal resolutions of 27, 9, and 3 km, respectively. The model was initialized with real boundary conditions using NCAR-NCEP Final Analysis (FNL) data at 1°×1° (~111 km × 111 km) resolution. Boundary conditions were updated every 6 hours using reanalysis data. Emission rates were obtained from global inventories, namely the REanalysis of the TROpospheric chemical composition (RETRO) and the Emission Database for Global Atmospheric Research (EDGAR). Multiple linear regression and artificial neural network techniques were used to downscale the model output at each monitoring station. The results confirm that the statistically downscaled outputs reduce simulated errors by up to 25%. This study provides a general overview of statistical downscaling of chemical transport models and can serve as a reference for future air quality modeling exercises over Bogota and other Colombian cities.
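The multiple-linear-regression flavor of statistical downscaling amounts to a per-station regression of observations on co-located model output and meteorological predictors. The predictor choice and the synthetic data below are assumptions for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
n = 500
X = np.column_stack([
    rng.normal(40, 10, n),   # simulated O3 at the nearest grid point (ppb)
    rng.normal(290, 5, n),   # 2-m temperature (K)
    rng.normal(2, 1, n),     # wind speed (m/s)
])
obs = 0.8 * X[:, 0] + 0.5 * (X[:, 1] - 290) - 2.0 * X[:, 2] + rng.normal(0, 4, n)

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

model = LinearRegression().fit(X[:400], obs[:400])  # train period
pred = model.predict(X[400:])                       # held-out period
print("downscaled RMSE:", round(rmse(pred, obs[400:]), 1), "ppb")
print("raw-model RMSE:", round(rmse(X[400:, 0], obs[400:]), 1), "ppb")
```

The error reduction of the downscaled series relative to the raw grid-point series is the quantity behind the up-to-25% improvement reported above.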
Lefebvre, Alexandre; Rochefort, Gael Y.; Santos, Frédéric; Le Denmat, Dominique; Salmon, Benjamin; Pétillon, Jean-Marc
2016-01-01
Over the last decade, biomedical 3D-imaging tools have gained widespread use in the analysis of prehistoric bone artefacts. While initial attempts to characterise the major categories used in osseous industry (i.e. bone, antler, and dentine/ivory) have been successful, the taxonomic determination of prehistoric artefacts remains to be investigated. The distinction between reindeer and red deer antler can be challenging, particularly in cases of anthropic and/or taphonomic modification. In addition to the range of destructive physicochemical identification methods available (mass spectrometry, isotopic ratio analysis, and DNA analysis), X-ray micro-tomography (micro-CT) provides convincing non-destructive 3D images and analyses. This paper presents the experimental protocol (sample scans, image processing, and statistical analysis) we have developed in order to identify modern and archaeological antler collections (from Isturitz, France). This original method is based on bone microstructure analysis combined with advanced statistical support vector machine (SVM) classifiers. A combination of six microarchitecture biomarkers (bone volume fraction, trabecular number, trabecular separation, trabecular thickness, trabecular bone pattern factor, and structure model index) was screened using micro-CT in order to characterise internal alveolar structure. Overall, reindeer alveoli presented a tighter mesh than red deer alveoli, and statistical analysis allowed us to distinguish archaeological antler by species with an accuracy of 96%, regardless of anatomical location on the antler. In conclusion, micro-CT combined with SVM classifiers proves to be a promising additional non-destructive method for antler identification, suitable for archaeological artefacts whose degree of human modification or whose cultural-heritage or scientific value has previously made identification impossible (tools, ornaments, etc.). PMID:26901355
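The classification step, an SVM over six microarchitecture biomarkers, can be sketched with scikit-learn as below. The synthetic between-species feature shifts are assumptions, not measured reindeer or red deer values.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(7)
n = 120
y = rng.integers(0, 2, n)  # 1 = reindeer, 0 = red deer (synthetic labels)
# Six stand-in biomarkers: BV/TV, Tb.N, Tb.Sp, Tb.Th, Tb.Pf, SMI.
shift = np.array([1.2, 1.0, -0.8, 0.5, -0.6, 0.7])
X = rng.normal(0, 1, (n, 6)) + y[:, None] * shift

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

Standardizing the biomarkers before the kernel SVM mirrors common practice when features (e.g. trabecular thickness vs. bone volume fraction) live on very different scales.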
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, W.G.; Spaletto, M.I.; Lewis, K.
The method of plutonium (Pu) determination at the New Brunswick Laboratory (NBL) consists of ion-exchange purification followed by controlled-potential coulometric analysis (IE/CPC). The purpose of the present report is to quantify any detectable Pu loss occurring in the ion-exchange (IE) purification step, which would cause a negative bias in the NBL method for Pu analysis. The magnitude of any such loss would be contained within the reproducibility (0.05%) of the IE/CPC method, which utilizes a state-of-the-art autocoulometer developed at NBL. When the NBL IE/CPC method is used for Pu analysis, any loss in ion-exchange purification (<0.05%) is confounded with the repeatability of the ion exchange and the precision of the CPC analysis technique (<0.05%). Consequently, detecting a bias in the IE/CPC method due to the IE alone, using the IE/CPC method itself, requires that many randomized analyses on a single material be performed over time and that statistical analysis of the data be performed. The initial approach described in this report to quantify any IE loss was an independent method, isotope dilution mass spectrometry; however, the number of analyses performed was insufficient to assign a statistically significant value to the IE loss (<0.02% of 10 mg samples of Pu). The second method used for quantifying any IE loss of Pu was multiple ion exchanges of the same Pu aliquant; the small number of analyses possible per individual IE, together with the column-to-column variability over multiple ion exchanges, prevented statistical detection of any loss of <0.05%.
Energetic Particles of keV–MeV Energies Observed near Reconnecting Current Sheets at 1 au
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khabarova, Olga V.; Zank, Gary P.
2017-07-01
We provide evidence for particle acceleration up to ∼5 MeV at reconnecting current sheets in the solar wind, based on both case studies and a statistical analysis of the energetic ion and electron flux data from the five Advanced Composition Explorer Electron, Proton, and Alpha Monitor (EPAM) detectors. The case study of a typical reconnection exhaust event reveals (i) a small-scale peak of the energetic ion flux observed in the vicinity of the reconnection exhaust and (ii) a long-timescale atypical energetic particle event (AEPE) encompassing the reconnection exhaust. AEPEs associated with reconnecting strong current sheets last for many hours, even days, as confirmed by statistical studies. The case study shows that time-intensity profiles of the ion flux may vary significantly from one EPAM detector to another, partially because of the local topology of magnetic fields but mainly because of the impact of upstream magnetospheric events; therefore, the occurrence of particle acceleration can be hidden. The finding of significant particle energization within a time interval of ±30 hr around reconnection exhausts is supported by a superposed epoch analysis of 126 reconnection exhaust events. We suggest that energetic particles initially accelerated via prolonged magnetic reconnection are trapped and reaccelerated in small- or medium-scale magnetic islands surrounding the reconnecting current sheet, as predicted by the transport theory of Zank et al. Other mechanisms of initial particle acceleration may also contribute.
[Impact of pharmaceutical intervention in preventing relapses in depression in Primary Care].
Rubio-Valera, María; Peñarrubia-María, M Teresa; Fernández-Vergel, Rita; Carvajal Tejadillo, Andrea Cecilia; Fernández Sánchez, Ana; Aznar-Lou, Ignacio; March-Pujol, Marian; Serrano-Blanco, Antoni
2016-05-01
To evaluate the long-term impact of a brief pharmacist intervention (PI) compared with usual care (UC) on prevention of depression relapse, in a randomised controlled clinical trial in Primary Care. Of the 179 depressed patients initiating antidepressants, the 113 whose clinical symptoms had remitted (main definition) at the 6-month assessment were selected for this secondary study (PI=58; UC=55). The PI was an interview to promote medication adherence when patients collected antidepressants from the pharmacy. Baseline, 3-month, and 6-month follow-up assessments were made. The severity of depressive symptoms was evaluated with the PHQ-9, and patients presenting a remission of symptoms were selected. The patient medical records were reviewed to identify relapse in the following 12 months using 4 indicators. A lower proportion of patients relapsed in the PI group than in the UC group 18 months after initiation of treatment, but the difference was not statistically significant either in the intent-to-treat analysis (OR=0.734 [95% CI 0.273-1.975]) or the per-protocol analysis (OR=0.615 [95% CI 0.183-2.060]). All the sensitivity analyses showed consistent results. The sample size and adherence to the protocol in the intervention group were low. The PI group showed a statistically non-significant tendency towards fewer relapses, which could be related to the improvement in adherence among patients who received the intervention.
Analysis of vector wind change with respect to time for Cape Kennedy, Florida
NASA Technical Reports Server (NTRS)
Adelfang, S. I.
1978-01-01
Multivariate analysis was used to characterize the joint distribution of the four variables represented by the components of the wind vector at an initial time and after a specified elapsed time; this distribution is hypothesized to be quadravariate normal. The fourteen statistics of this distribution, calculated from 15 years of twice-daily rawinsonde data, are presented by monthly reference periods for each month from 0 to 27 km. The hypotheses that the wind component change with respect to time is univariate normal, that the joint distribution of wind component changes is bivariate normal, and that the modulus of vector wind change is Rayleigh are tested by comparison with observed distributions. Statistics of the conditional bivariate normal distributions of vector wind at a future time, given the vector wind at an initial time, are derived. Wind changes over time periods from 1 to 5 hours, calculated from Jimsphere data, are presented. Extension of the theoretical prediction (based on rawinsonde data) of the standard deviation of wind component change to time periods of 1 to 5 hours falls (with a few exceptions) within the 95th-percentile confidence band of the population estimate obtained from the Jimsphere sample data. The joint distributions of wind change components, conditional wind components, and 1 km vector wind shear change components are illustrated by probability ellipses at the 95th-percentile level.
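The conditional bivariate normal distributions mentioned above follow from the standard Gaussian conditioning formula: partition the 4×4 covariance into 2×2 blocks and take a Schur complement. A small worked example with invented means and covariances:

```python
import numpy as np

mu = np.array([5.0, 2.0, 5.5, 2.5])  # [u1, v1, u2, v2] means (m/s), invented
S = np.array([[25.0,  3.0, 18.0,  2.0],
              [ 3.0, 16.0,  2.0, 11.0],
              [18.0,  2.0, 27.0,  3.0],
              [ 2.0, 11.0,  3.0, 18.0]])

S11, S12 = S[:2, :2], S[:2, 2:]
S21, S22 = S[2:, :2], S[2:, 2:]
w1 = np.array([8.0, -1.0])           # observed initial wind (u1, v1)

K = S21 @ np.linalg.inv(S11)
cond_mean = mu[2:] + K @ (w1 - mu[:2])  # E[w2 | w1]
cond_cov = S22 - K @ S12                # Cov[w2 | w1] (Schur complement)
print("conditional mean:", cond_mean)
print("conditional covariance:\n", cond_cov)
```

The probability ellipses referred to in the report are level sets of exactly this conditional bivariate normal.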
Ensink, Elliot; Sinha, Jessica; Sinha, Arkadeep; Tang, Huiyuan; Calderone, Heather M; Hostetter, Galen; Winter, Jordan; Cherba, David; Brand, Randall E; Allen, Peter J; Sempere, Lorenzo F; Haab, Brian B
2015-10-06
Experiments involving the high-throughput quantification of image data require algorithms for automation. A challenge in the development of such algorithms is to properly interpret signals over a broad range of image characteristics, without the need for manual adjustment of parameters. Here we present a new approach for locating signals in image data, called Segment and Fit Thresholding (SFT). The method assesses statistical characteristics of small segments of the image and determines the best-fit trends between the statistics. Based on the relationships, SFT identifies segments belonging to background regions; analyzes the background to determine optimal thresholds; and analyzes all segments to identify signal pixels. We optimized the initial settings for locating background and signal in antibody microarray and immunofluorescence data and found that SFT performed well over multiple, diverse image characteristics without readjustment of settings. When used for the automated analysis of multicolor, tissue-microarray images, SFT correctly found the overlap of markers with known subcellular localization, and it performed better than a fixed threshold and Otsu's method for selected images. SFT promises to advance the goal of full automation in image analysis.
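The segment-statistics idea behind SFT can be conveyed with a deliberately simplified sketch. The published method fits trends between segment statistics to find background; the fixed low-variance percentile rule below is an assumption standing in for that fit.

```python
import numpy as np

def sft_like_threshold(img, seg=16, k=3.0):
    """Tile the image, treat the flattest tiles as background, and derive
    a global signal threshold from the background statistics."""
    h, w = img.shape
    a, b = h // seg, w // seg
    tiles = img[: a * seg, : b * seg].reshape(a, seg, b, seg)
    tiles = tiles.transpose(0, 2, 1, 3).reshape(-1, seg * seg)
    means, sds = tiles.mean(axis=1), tiles.std(axis=1)
    bg = sds <= np.percentile(sds, 25)        # low-variance tiles ~ background
    threshold = means[bg].mean() + k * sds[bg].mean()
    return img > threshold

rng = np.random.default_rng(8)
img = rng.normal(100, 5, (256, 256))
img[100:120, 60:90] += 60                     # one synthetic microarray spot
print("signal pixels found:", int(sft_like_threshold(img).sum()))
```

Because the threshold is derived from segments classified as background rather than from a single global histogram, the rule adapts across images with very different signal densities, which is the property the authors exploit.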
Pavlidis, Paul; Qin, Jie; Arango, Victoria; Mann, John J; Sibille, Etienne
2004-06-01
One of the challenges in the analysis of gene expression data is placing the results in the context of other data available about genes and their relationships to each other. Here, we approach this problem in the study of gene expression changes associated with age in two areas of the human prefrontal cortex, comparing two computational methods. The first method, "overrepresentation analysis" (ORA), is based on statistically evaluating the fraction of genes in a particular gene ontology class found among the set of genes showing age-related changes in expression. The second method, "functional class scoring" (FCS), examines the statistical distribution of individual gene scores among all genes in the gene ontology class and does not involve an initial gene selection step. We find that FCS yields more consistent results than ORA, and the results of ORA depended strongly on the gene selection threshold. Our findings highlight the utility of functional class scoring for the analysis of complex expression data sets and emphasize the advantage of considering all available genomic information rather than sets of genes that pass a predetermined "threshold of significance."
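The contrast between the two methods is easy to demonstrate: ORA tests the overlap of a thresholded gene list with the class via a hypergeometric test, while FCS tests the class's score distribution directly, with no selection step. A sketch with synthetic scores (the threshold, class size and shift are assumptions):

```python
import numpy as np
from scipy.stats import hypergeom, mannwhitneyu

rng = np.random.default_rng(9)
n_genes = 10_000
scores = rng.normal(0, 1, n_genes)                 # per-gene association scores
go_class = rng.choice(n_genes, 200, replace=False)
scores[go_class] += 0.3                            # the class shifts modestly

# ORA: threshold first, then test overrepresentation of the class.
hits = np.argsort(scores)[-500:]                   # top-500 "significant" genes
k = np.intersect1d(hits, go_class).size
p_ora = hypergeom.sf(k - 1, n_genes, go_class.size, hits.size)

# FCS: compare score distributions with no gene-selection step.
in_class = np.zeros(n_genes, dtype=bool)
in_class[go_class] = True
p_fcs = mannwhitneyu(scores[in_class], scores[~in_class]).pvalue

print(f"ORA p = {p_ora:.3g}, FCS p = {p_fcs:.3g}")
```

Re-running with a different cutoff (say the top 200 or top 1000 genes) moves the ORA p-value around substantially while the FCS result is unchanged, which is the threshold dependence the authors report.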
Gordon, John David; DiMattina, Michael; Reh, Andrea; Botes, Awie; Celia, Gerard; Payson, Mark
2013-08-01
To examine the utilization and outcomes of natural cycle (unstimulated) IVF as reported to the Society for Assisted Reproductive Technology (SART) in 2006 and 2007, a retrospective analysis of the SART Clinical Outcome Reporting System national database was performed, covering all patients undergoing IVF as reported to SART in 2006 and 2007, with no interventions. Outcome measures were the utilization of unstimulated IVF, patient demographics, and the comparison of implantation and pregnancy rates between unstimulated and stimulated IVF cycles. During 2006 and 2007, a total of 795 unstimulated IVF cycles were initiated. Success rates were age dependent, with patients <35 years of age demonstrating clinical pregnancy rates per cycle start, retrieval, and transfer of 19.2%, 26.8%, and 35.9%, respectively. Implantation rates were statistically higher for unstimulated compared with stimulated IVF in patients who were 35 to 42 years old. Unstimulated IVF represents <1% of the total IVF cycles initiated in the United States. The pregnancy and live birth rates per initiated cycle were 19.2% and 15.2%, respectively, in patients <35 years old. The implantation rates in unstimulated IVF cycles compared favorably with stimulated IVF. Natural cycle IVF may be considered in a wide range of patients as an alternative therapy for the infertile couple.
Science Initiatives of the US Virtual Astronomical Observatory
NASA Astrophysics Data System (ADS)
Hanisch, R. J.
2012-09-01
The United States Virtual Astronomical Observatory program is the operational facility successor to the National Virtual Observatory development project. The primary goal of the US VAO is to build on the standards, protocols, and associated infrastructure developed by NVO and the International Virtual Observatory Alliance partners and to bring to fruition a suite of applications and web-based tools that greatly enhance the research productivity of professional astronomers. To this end, and guided by the advice of our Science Council (Fabbiano et al. 2011), we have focused on five science initiatives in the first two years of VAO operations: 1) scalable cross-comparisons between astronomical source catalogs, 2) dynamic spectral energy distribution construction, visualization, and model fitting, 3) integration and periodogram analysis of time series data from the Harvard Time Series Center and NASA Star and Exoplanet Database, 4) integration of VO data discovery and access tools into the IRAF data analysis environment, and 5) a web-based portal to VO data discovery, access, and display tools. We are also developing tools for data linking and semantic discovery, and have a plan for providing data mining and advanced statistical analysis resources for VAO users. Initial versions of these applications and web-based services are being released over the course of the summer and fall of 2011, with further updates and enhancements planned for throughout 2012 and beyond.
Understanding evaluation of learning support in mathematics and statistics
NASA Astrophysics Data System (ADS)
MacGillivray, Helen; Croft, Tony
2011-03-01
With rapid and continuing growth of learning support initiatives in mathematics and statistics found in many parts of the world, and with the likelihood that this trend will continue, there is a need to ensure that robust and coherent measures are in place to evaluate the effectiveness of these initiatives. The nature of learning support brings challenges for measurement and analysis of its effects. After briefly reviewing the purpose, rationale for, and extent of current provision, this article provides a framework for those working in learning support to think about how their efforts can be evaluated. It provides references and specific examples of how workers in this field are collecting, analysing and reporting their findings. The framework is used to structure evaluation in terms of usage of facilities, resources and services provided, and also in terms of improvements in performance of the students and staff who engage with them. Very recent developments have started to address the effects of learning support on the development of deeper approaches to learning, the affective domain and the development of communities of practice of both learners and teachers. This article intends to be a stimulus to those who work in mathematics and statistics support to gather even richer, more valuable, forms of data. It provides a 'toolkit' for those interested in evaluation of learning support and closes by referring to an on-line resource being developed to archive the growing body of evidence.
NASA Astrophysics Data System (ADS)
Sharma, Arpita; Saikia, Ananya; Khare, Puja; Dutta, D. K.; Baruah, B. P.
2014-08-01
In Part 1 of the present investigation, 37 representative Eocene coal samples from Meghalaya, India, were analyzed; their physico-chemical characteristics and the major oxides and minerals present in the ash samples were studied to assess the genesis of these coals, with various statistical tools applied. The datasets from Part 1 used in this investigation (Part 2) show the contribution of major oxides to ash fusion temperatures (AFTs). Regression analysis of high-temperature ash (HTA) composition against initial deformation temperature (IDT) shows definite increasing or decreasing trends, which have been used to derive predictive indices for slagging, fouling, and abrasion propensities during combustion. Whether IDT increases or decreases is governed by the respective increases of Fe2O3, Al2O3, SiO2, and CaO. The detrital-authigenic index (DAI), calculated from the ash composition, and its relation with AFT indicate the sialoferric nature of these coals. Correlation analysis, Principal Component Analysis (PCA), and Hierarchical Cluster Analysis (HCA) were used to study the possible fouling, slagging, and abrasion potentials in boilers during coal combustion. A positive relationship between slagging and the heating values of the coal was found in this study.
FGWAS: Functional genome wide association analysis.
Huang, Chao; Thompson, Paul; Wang, Yalin; Yu, Yang; Zhang, Jingwen; Kong, Dehan; Colen, Rivka R; Knickmeyer, Rebecca C; Zhu, Hongtu
2017-10-01
Functional phenotypes (e.g., subcortical surface representation), which commonly arise in imaging genetic studies, have been used to detect putative genes for complexly inherited neuropsychiatric and neurodegenerative disorders. However, existing statistical methods largely ignore the functional features (e.g., functional smoothness and correlation). The aim of this paper is to develop a functional genome-wide association analysis (FGWAS) framework to efficiently carry out whole-genome analyses of functional phenotypes. FGWAS consists of three components: a multivariate varying coefficient model, a global sure independence screening procedure, and a test procedure. Compared with the standard multivariate regression model, the multivariate varying coefficient model explicitly models the functional features of functional phenotypes through the integration of smooth coefficient functions and functional principal component analysis. Statistically, compared with existing methods for genome-wide association studies (GWAS), FGWAS can substantially boost the detection power for discovering important genetic variants influencing brain structure and function. Simulation studies show that FGWAS outperforms existing GWAS methods for searching sparse signals in an extremely large search space, while controlling for the family-wise error rate. We have successfully applied FGWAS to large-scale analysis of data from the Alzheimer's Disease Neuroimaging Initiative for 708 subjects, 30,000 vertices on the left and right hippocampal surfaces, and 501,584 SNPs.
Ha, Young Eun; Song, Jae-Hoon; Kang, Won Ki; Peck, Kyong Ran; Chung, Doo Ryeon; Kang, Cheol-In; Joung, Mi-Kyong; Joo, Eun-Jeong; Shon, Kyung Mok
2011-11-01
Bacteremia is an important clinical condition in febrile neutropenia that can cause clinical failure of antimicrobial therapy. The purpose of this study was to investigate the clinical factors predictive of bacteremia in low-risk febrile neutropenia at initial patient evaluation. We performed a retrospective cohort study in a university hospital in Seoul, Korea, between May 1995 and May 2007. Patients who met the criteria of low-risk febrile neutropenia at the time of visit to the emergency department after anti-cancer chemotherapy were included in the analysis. During the study period, 102 episodes of bacteremia were documented among the 993 episodes of low-risk febrile neutropenia. Single gram-negative bacteremia was most frequent. In multivariate regression analysis, initial body temperature ≥39°C, initial hypotension, presence of clinical sites of infection, presence of a central venous catheter, initial absolute neutrophil count <50/mm³, and CRP ≥10 mg/dL were statistically significant predictors of bacteremia. A scoring system using these variables was derived, and the likelihood of bacteremia was well correlated with the score points, with an area under the ROC curve of 0.785. Patients with low score points had a low rate of bacteremia and thus would be candidates for outpatient-based or oral antibiotic therapy. We identified major clinical factors that can predict bacteremia in low-risk febrile neutropenia.
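A minimal sketch of such an additive risk score and its ROC evaluation follows. The equal one-point weighting, predictor effects and prevalence are assumptions, not the published system.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(10)
n = 993
# Six binary predictors: fever >= 39C, hypotension, infection focus, CVC,
# ANC < 50/mm3, CRP >= 10 mg/dL (synthetic stand-ins).
X = rng.integers(0, 2, (n, 6))
logit = -4.0 + X @ np.array([0.6, 0.9, 0.5, 0.7, 0.8, 0.9])
bacteremia = rng.random(n) < 1 / (1 + np.exp(-logit))

score = X.sum(axis=1)  # one point per predictor present
print("AUC of score:", round(roc_auc_score(bacteremia, score), 3))
for s in range(7):
    sel = score == s
    if sel.any():
        print(f"score {s}: bacteremia rate {bacteremia[sel].mean():.1%} (n={sel.sum()})")
```

The per-score rates at the bottom illustrate the clinical use described above: the lowest strata carry bacteremia risks small enough to justify outpatient or oral therapy.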
Descriptive Statistical Techniques for Librarians. 2nd Edition.
ERIC Educational Resources Information Center
Hafner, Arthur W.
A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…
Rockfalls in the Duratón canyon, central Spain: Inventory and statistical analysis
NASA Astrophysics Data System (ADS)
Tanarro, Luis M.; Muñoz, Julio
2012-10-01
This paper presents an initial analysis of the rockfall processes affecting the walls of the canyon of the River Duratón. This 34 km long meandering canyon in the basin of the River Duero in central Spain (41°18' N, 3°45' W) has evolved in a large-scale outcrop of Late Cretaceous calcareous rocks (dolomite and limestone) deformed into a series of asymmetrical folds. Its vertical scarps range from 80 to 100 m; its width varies from 150 to 300 m; and its floor is between 30 and 50 m wide. The research consisted of drawing up an inventory of rockfalls from a field survey and mapping the fallen blocks deposited on the basal talus or on the canyon floor, which in turn allowed the original location of each block on the scarps to be identified and located on the available orthophotos. A Digital Elevation Model (DEM) was produced using a Geographic Information System (GIS), and aspect and slope maps were derived. The aspect of each rockfall data point was determined, and this initial database was completed with other significant parameters (location on the valley side, relationship with the tectonic structure, and relative age). An approximate delimitation of the potential rockfall source area was also produced by reclassifying the slopes according to morphometric criteria. The result is a geomorphic rockfall inventory map showing the distribution of the rockfalls, together with a basic statistical analysis allowing a preliminary evaluation of rockfall characteristics in relation to topoclimatic location (aspect), structural location (with or counter to the dip of the strata), and the current geomorphic dynamic, through a study of recent scars on the scarps.
Drew, Benjamin T.; Bowes, Michael A.; Redmond, Anthony C.; Dube, Bright; Kingsbury, Sarah R.; Conaghan, Philip G.
2017-01-01
Objectives: Current structural associations of patellofemoral pain (PFP) are based on 2D imaging methodology with inherent measurement uncertainty due to positioning and rotation. This study employed novel technology to create 3D measures of commonly described patellofemoral joint imaging features and compared these features in people with and without PFP in a large cohort. Methods: We compared two groups from the Osteoarthritis Initiative: one with localized PFP and pain on stairs, and a control group with no knee pain; both groups had no radiographic OA. MRI bone surfaces were automatically segmented and aligned using active appearance models. We applied t-tests, logistic regression and linear discriminant analysis to compare 13 imaging features (including patella position, trochlear morphology, facet area and tilt) converted into 3D equivalents, and a measure of overall 3D shape. Results: One hundred and fifteen knees with PFP (mean age 59.7, BMI 27.5 kg/m², female 58.2%) and 438 without PFP (mean age 63.6, BMI 26.9 kg/m², female 52.9%) were included. After correction for multiple testing, no statistically significant differences were found between groups for any of the 3D imaging features or their combinations. A statistically significant discrimination was noted for overall 3D shape between genders, confirming the validity of the 3D measures. Conclusion: Challenging current perceptions, no differences in patellofemoral morphology were found between older people with and without PFP using 3D quantitative imaging analysis. Further work is needed to see if these findings are replicated in a younger PFP population. PMID:28968747
Teacher Professional Development to Foster Authentic Student Research Experiences
NASA Astrophysics Data System (ADS)
Conn, K.; Iyengar, E.
2004-12-01
This presentation reports on a new teacher workshop design that encourages teachers to initiate and support long-term student-directed research projects in the classroom setting. Teachers were recruited and engaged in an intensive marine ecology learning experience at Shoals Marine Laboratory, Appledore Island, Maine. Part of the weeklong summer workshop was spent in field work, part in laboratory work, and part in learning experimental design and basic statistical analysis of experimental results. Teachers were presented with strategies to adapt their workshop learnings to formulate plans for initiating and managing authentic student research projects in their classrooms. The authors will report on the different considerations and constraints facing the teachers in their home school settings and teachers' progress in implementing their plans. Suggestions for replicating the workshop will be offered.
Grinich, E; Schmitt, J; Küster, D; Spuls, P I; Williams, H C; Chalmers, J R; Thomas, K S; Apfelbacher, C; Prinsen, C A C; Furue, M; Stuart, B; Carter, B; Simpson, E
2018-05-10
Several organizations from multiple fields of medicine are setting standards for clinical research, including protocol development [1], harmonization of outcome reporting [2], statistical analysis [3], quality assessment [4] and reporting of findings [1]. Clinical research standardization facilitates the interpretation and synthesis of data, increases the usability of trial results for guideline groups and shared decision-making, and reduces selective outcome reporting bias. The mission of the Harmonising Outcome Measures for Eczema (HOME) initiative is to establish an agreed-upon core set of outcomes to be measured and reported in all clinical trials of atopic dermatitis (AD).
2012-01-01
This paper utilizes a statistical approach, the response surface optimization methodology, to determine the optimum conditions for Acid Black 172 dye removal from aqueous solution by electrocoagulation. The experimental parameters investigated were initial pH (4-10), initial dye concentration (0-600 mg/L), applied current (0.5-3.5 A) and reaction time (3-15 min). These parameters were varied at five levels according to the central composite design, and their effects on decolorization were evaluated through analysis of variance. The high R² value of 94.48% shows a strong correlation between the experimental and predicted values and indicates that the second-order regression model is acceptable for Acid Black 172 dye removal efficiency. Some interaction and quadratic terms, as well as the selected parameters themselves, were also found to influence electrocoagulation performance. An optimum dye removal efficiency of 90.4% was observed experimentally at an initial pH of 7, an initial dye concentration of 300 mg/L, an applied current of 2 A and a reaction time of 9.16 min, close to the model-predicted result (90%). PMID:23369574
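The second-order (quadratic-with-interactions) response-surface fit at the heart of this approach can be sketched as follows. The synthetic removal-efficiency surface is an assumption chosen to peak near the reported optimum.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(11)
# Design columns: pH, dye (mg/L), current (A), time (min).
X = np.column_stack([
    rng.uniform(4, 10, 40),
    rng.uniform(0, 600, 40),
    rng.uniform(0.5, 3.5, 40),
    rng.uniform(3, 15, 40),
])
# Invented removal-efficiency surface peaking near pH 7, 300 mg/L, 2 A, 9 min:
y = (90 - 1.5 * (X[:, 0] - 7) ** 2 - 5e-5 * (X[:, 1] - 300) ** 2
     - 6 * (X[:, 2] - 2) ** 2 - 0.3 * (X[:, 3] - 9) ** 2 + rng.normal(0, 2, 40))

quad = PolynomialFeatures(degree=2, include_bias=False)  # squares + interactions
model = LinearRegression().fit(quad.fit_transform(X), y)
print("R^2 =", round(model.score(quad.transform(X), y), 3))
```

The fitted quadratic is then interrogated (analytically or by grid search) for its stationary point, which is how a central composite design yields predicted optima such as the 90% figure quoted above.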
Gestational Age at First Antenatal Care Visit in Malawi.
Mkandawire, Paul
2015-11-01
This paper examines the gestational age at first antenatal care (ANC) visit and factors associated with timely initiation of ANC in Malawi, in a context where maternal and child health services are generally provided free of charge. Lognormal survival models are applied to Demographic and Health Survey data from a nationally representative sample of women (n = 13,588) of child-bearing age. The findings show that less than 30% of pregnant women initiate ANC within the World Health Organization-recommended gestational timeframe of 16 weeks or earlier. The hazard analysis shows a gradient in the initiation of ANC by maternal education level, with the least educated mothers the most likely to delay their first ANC visit. However, after adjusting for variables capturing intimate partner violence in the multivariate models, the effect of maternal education attenuated and lost statistical significance. Other significant predictors of gestational age at first ANC visit include media exposure, perceived distance from a health facility, age, and birth order. The findings link domestic violence directly with the gestational age at which mothers initiate ANC, suggesting that gender-based violence may operate through delayed initiation of ANC to undermine maternal and child health outcomes.
Kathman, Steven J; Potts, Ryan J; Ayres, Paul H; Harp, Paul R; Wilson, Cody L; Garner, Charles D
2010-10-01
The mouse dermal assay has long been used to assess the dermal tumorigenicity of cigarette smoke condensate (CSC). This mouse skin model has been developed for use in carcinogenicity testing utilizing the SENCAR mouse as the standard strain. Though the model has limitations, it remains the most relevant method available to study the dermal tumor-promoting potential of mainstream cigarette smoke. In the typical SENCAR mouse CSC bioassay, CSC is applied for 29 weeks following the application of a tumor initiator such as 7,12-dimethylbenz[a]anthracene (DMBA). Several endpoints are considered for analysis, including the percentage of animals with at least one mass, latency, and the number of masses per animal. In this paper, a relatively straightforward analytic model and procedure is presented for analyzing the time course of the incidence of masses. The procedure takes advantage of Bayesian statistical techniques, which provide powerful methods for model fitting and simulation. Two datasets are analyzed to illustrate how the model fits the data, how well the model may perform in predicting data from such trials, and how the model may be used as a decision tool when comparing the dermal tumorigenicity of cigarette smoke condensate from multiple cigarette types. The analysis presented here was developed as a statistical decision tool for differentiating between two or more prototype products based on dermal tumorigenicity.
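The published model addresses the full time course of mass incidence; the far simpler conjugate sketch below conveys only the decision-tool flavor, comparing two prototypes on end-of-study incidence with invented counts.

```python
import numpy as np

rng = np.random.default_rng(12)
# Illustrative counts: animals with >= 1 mass at week 29, per product.
n_a, masses_a = 40, 12
n_b, masses_b = 40, 21

# Beta(1, 1) priors updated by the binomial counts, then compared by
# posterior simulation.
post_a = rng.beta(1 + masses_a, 1 + n_a - masses_a, 100_000)
post_b = rng.beta(1 + masses_b, 1 + n_b - masses_b, 100_000)
print("P(incidence_A < incidence_B | data) =", np.mean(post_a < post_b).round(3))
```

A posterior probability near 1 (or 0) supports a decision that the two products differ in dermal tumorigenicity, which is the kind of comparison the authors designed the model for.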
NASA Astrophysics Data System (ADS)
Irvine, John M.; Ghadar, Nastaran; Duncan, Steve; Floyd, David; O'Dowd, David; Lin, Kristie; Chang, Tom
2017-03-01
Quantitative biomarkers for assessing the presence, severity, and progression of age-related macular degeneration (AMD) would benefit research, diagnosis, and treatment. This paper explores the development of quantitative biomarkers derived from OCT imagery of the retina. OCT images for approximately 75 patients with Wet AMD, Dry AMD, and no AMD (healthy eyes) were analyzed to identify image features indicative of the patients' conditions. OCT image features provide a statistical characterization of the retina: healthy eyes exhibit a layered structure, whereas chaotic patterns indicate the deterioration associated with AMD. Our approach uses wavelet and Frangi filtering, combined with statistical features that do not rely on image segmentation, to assess patient condition. Classification analysis indicates clear separability of Wet AMD from other conditions, including Dry AMD and healthy retinas; the probability of correct classification was 95.7%, as determined by cross-validation. Similar classification analysis predicts the response of Wet AMD patients to treatment, as measured by Best Corrected Visual Acuity (BCVA), and a statistical model predicts BCVA from the imagery features with R² = 0.846. This initial analysis indicates that imagery-derived features can provide useful biomarkers for the characterization and quantification of AMD: accurate discrimination of Wet AMD from other conditions; image-based prediction of the outcome of Wet AMD treatment; and accurate prediction of BCVA. Unlike many methods in the literature, our techniques do not rely on segmentation of the OCT image. Next steps include larger-scale testing and validation.
Garrido-Acosta, Osvaldo; Meza-Toledo, Sergio Enrique; Anguiano-Robledo, Liliana; Valencia-Hernández, Ignacio; Chamorro-Cevallos, Germán
2014-01-01
We determined the median effective dose (ED50) values for the anticonvulsants phenobarbital and sodium valproate using a modification of Lorke's method. This modification allowed appropriate statistical analysis and the use of a smaller number of mice per compound tested. The anticonvulsant activities of phenobarbital and sodium valproate were evaluated in male CD1 mice by maximal electroshock (MES) and intraperitoneal administration of pentylenetetrazole (PTZ). The anticonvulsant ED50 values were obtained through modifications of Lorke's method that involved changes in the selection of the first three doses in the initial test and the fourth dose in the second test. Furthermore, a test was added to evaluate the ED50 calculated by the modified Lorke's method, allowing statistical analysis of the data and determination of the confidence limits for ED50. The ED50 for phenobarbital against MES- and PTZ-induced seizures was 16.3 mg/kg and 12.7 mg/kg, respectively; the sodium valproate values were 261.2 mg/kg and 159.7 mg/kg, respectively. These results are similar to those found using the traditional methods, suggesting that the modifications made to Lorke's method generate equal results using fewer mice while increasing confidence in the statistical analysis. This adaptation of Lorke's method can be used to determine the median lethal dose (LD50) or ED50 for compounds with other pharmacological activities.
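The end product of such a procedure is a quantal dose-response fit from which the ED50 follows. A minimal log-dose logistic sketch (illustrative data, not the paper's; confidence limits, e.g. via the delta method, are omitted):

```python
import numpy as np
import statsmodels.api as sm

dose = np.array([5.0, 10.0, 15.0, 20.0, 30.0])   # mg/kg, illustrative
n = np.array([6, 6, 6, 6, 6])                     # mice per dose
protected = np.array([0, 2, 3, 5, 6])             # responders per dose

# Binomial GLM on log-dose: P(protected) = 1 / (1 + exp(-(a + b*log(d)))).
exog = sm.add_constant(np.log(dose))
fit = sm.GLM(np.column_stack([protected, n - protected]), exog,
             family=sm.families.Binomial()).fit()
a, b = fit.params
ed50 = np.exp(-a / b)  # dose where the linear predictor crosses zero
print(f"ED50 ~ {ed50:.1f} mg/kg")
```

Fitting a full curve to all dose groups, rather than bracketing as in the original Lorke design, is what makes formal confidence limits for ED50 available.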
Bolin, Jocelyn Holden; Finch, W Holmes
2014-01-01
Statistical classification of phenomena into observed groups is very common in the social and behavioral sciences. Statistical classification methods, however, are affected by the characteristics of the data under study, and classification can be further complicated by initial misclassification of the observed groups. The purpose of this study is to investigate the impact of initial training-data misclassification on several statistical classification and data mining techniques. Misclassification conditions in the three-group case were simulated, and results are presented in terms of overall as well as subgroup classification accuracy. Results show decreased classification accuracy as sample size, group separation and group-size ratio decrease and as the misclassification percentage increases, with random forests demonstrating the highest accuracy across conditions.
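The design of such a simulation is straightforward to reproduce in outline: corrupt a fraction of training labels, then track held-out accuracy. A sketch with assumed data-generation settings:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(13)
X, y = make_classification(n_samples=1500, n_features=8, n_informative=5,
                           n_classes=3, class_sep=1.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)

for noise in (0.0, 0.1, 0.2, 0.3):
    y_noisy = y_tr.copy()
    flip = rng.random(y_tr.size) < noise             # initial misclassification
    y_noisy[flip] = rng.integers(0, 3, flip.sum())   # random replacement label
    acc = RandomForestClassifier(random_state=0).fit(X_tr, y_noisy).score(X_te, y_te)
    print(f"{noise:.0%} training misclassification -> test accuracy {acc:.3f}")
```

Sweeping sample size, class separation and group-size ratio alongside the noise level reproduces the factorial structure of the study's conditions.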
Sensitivity of wildlife habitat models to uncertainties in GIS data
NASA Technical Reports Server (NTRS)
Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.
1992-01-01
Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. The uncertainties and methods described in the paper have general relevance for many GIS applications.
Hilarion, Pilar; Groene, Oliver; Colom, Joan; Lopez, Rosa M; Suñol, Rosa
2010-10-23
The Health Department of the Regional Government of Catalonia, Spain, issued a quality plan for substance abuse centers. The objective of this paper is to evaluate the impact of a multidimensional quality improvement initiative in the field of substance abuse care and to discuss potentials and limitations for further quality improvement. The study uses an uncontrolled, sector-wide pre-post design. All centers providing services for persons with substance abuse issues in the Autonomous Community of Catalonia participated in this assessment. Measures of compliance were developed based on indicators reported in the literature and with broad stakeholder involvement. We compared pre-post differences in dimension-specific and overall compliance scores using one-way ANOVA for repeated measures and the Friedman statistic, and described the spread of the data using the inter-quartile range and the Fligner-Killeen statistic. Finally, we adjusted compliance scores for location and size using linear and logistic regression models. We performed baseline and follow-up assessments in 22 centers for substance abuse care and observed substantial and statistically significant improvements in overall compliance (pre: 60.9%; post: 79.1%) and in compliance in the dimensions 'care pathway' (pre: 66.5%; post: 83.5%) and 'organization and management' (pre: 50.5%; post: 77.2%). We observed improvements in the dimensions 'environment and infrastructure' (pre: 81.8%; post: 95.5%) and 'relations and user rights' (pre: 66.5%; post: 72.5%); however, these were not statistically significant. The regression analysis suggests that improvements in compliance in the dimension 'relations and user rights' were positively influenced by being located in the Barcelona region. The positive results of this quality improvement initiative are possibly associated with the successful involvement of stakeholders, the consciously constructed feedback reports on individual and sector-wide performance, and the support of evidence-based guidance wherever possible. Further research should address how contextual issues shape the uptake and effectiveness of quality improvement actions and how such improvements can be sustained.
Reynolds number dependence of relative dispersion statistics in isotropic turbulence
NASA Astrophysics Data System (ADS)
Sawford, Brian L.; Yeung, P. K.; Hackl, Jason F.
2008-06-01
Direct numerical simulation results for a range of relative dispersion statistics over Taylor-scale Reynolds numbers up to 650 are presented in an attempt to observe and quantify inertial subrange scaling and, in particular, Richardson's t³ law. The analysis includes the mean-square separation and a range of important but less-studied differential statistics for which the motion is defined relative to that at time t = 0. It seeks to unambiguously identify and quantify the Richardson scaling by demonstrating convergence with both the Reynolds number and initial separation. According to these criteria, the standard compensated plots for these statistics in inertial subrange scaling show clear evidence of a Richardson range but with an imprecise estimate for the Richardson constant. A modified version of the cube-root plots introduced by Ott and Mann [J. Fluid Mech. 422, 207 (2000)] confirms such convergence. It has been used to yield more precise estimates for Richardson's constant g which decrease with Taylor-scale Reynolds numbers over the range of 140-650. Extrapolation to the large Reynolds number limit gives an asymptotic value for Richardson's constant in the range g = 0.55-0.57, depending on the functional form used to make the extrapolation.
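The compensated-plot logic used to detect Richardson scaling can be illustrated in a few lines; the sketch below assumes the inertial-subrange form ⟨r²(t)⟩ = g ε t³ and uses synthetic data in place of DNS output:

```python
# Minimal sketch of the "compensated plot" logic used to test Richardson scaling:
# in the inertial subrange <r^2(t)> ~ g * eps * t^3, so <r^2>/(eps * t^3) should
# plateau at the Richardson constant g.  Data here are synthetic placeholders.
import numpy as np

eps = 0.1                          # mean dissipation rate (assumed known)
t = np.linspace(0.5, 50.0, 200)    # time since particle-pair release
g_true = 0.56
r2 = g_true * eps * t**3 * (1 + 0.2 * np.exp(-t / 2.0))  # fake DNS-like curve

compensated = r2 / (eps * t**3)    # plateau height estimates g
plateau = compensated[t > 10].mean()
print(f"estimated Richardson constant g ~ {plateau:.3f}")
```

The cube-root variant referenced in the abstract plots ⟨r²⟩^(1/3) against t, which is linear under the same scaling and tends to be less noisy.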
Feder, Paul I; Ma, Zhenxu J; Bull, Richard J; Teuschler, Linda K; Rice, Glenn
2009-01-01
In chemical mixtures risk assessment, the use of dose-response data developed for one mixture to estimate risk posed by a second mixture depends on whether the two mixtures are sufficiently similar. While evaluations of similarity may be made using qualitative judgments, this article uses nonparametric statistical methods based on the "bootstrap" resampling technique to address the question of similarity among mixtures of chemical disinfectant by-products (DBP) in drinking water. The bootstrap resampling technique is a general-purpose, computer-intensive approach to statistical inference that substitutes empirical sampling for theoretically based parametric mathematical modeling. Nonparametric, bootstrap-based inference involves fewer assumptions than parametric normal theory based inference. The bootstrap procedure is appropriate, at least in an asymptotic sense, whether or not the parametric, distributional assumptions hold, even approximately. The statistical analysis procedures in this article are initially illustrated with data from 5 water treatment plants (Schenck et al., 2009), and then extended using data developed from a study of 35 drinking-water utilities (U.S. EPA/AMWA, 1989), which permits inclusion of a greater number of water constituents and increased structure in the statistical models.
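As a generic illustration of the bootstrap idea (not the article's DBP-specific procedure), the following Python sketch computes a percentile confidence interval for a mean by empirical resampling:

```python
# Generic nonparametric bootstrap sketch (not the paper's DBP-specific models):
# resample the observed data with replacement and read off a percentile interval.
import numpy as np

rng = np.random.default_rng(2)
data = rng.lognormal(mean=3.0, sigma=0.5, size=35)   # e.g. a DBP concentration

B = 5000
boot_means = np.array([rng.choice(data, size=data.size, replace=True).mean()
                       for _ in range(B)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean={data.mean():.2f}, 95% bootstrap CI=({lo:.2f}, {hi:.2f})")
```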
Elkhoudary, Mahmoud M; Abdel Salam, Randa A; Hadad, Ghada M
2016-11-01
A new simple, sensitive, rapid and accurate gradient reversed-phase high-performance liquid chromatography with photodiode array detection (RP-HPLC-DAD) method was developed and validated for simultaneous analysis of Metronidazole (MNZ), Spiramycin (SPY), Diloxanide furoate (DIX) and Cliquinol (CLQ) using statistical experimental design. Initially, a resolution V fractional factorial design was used in order to screen five independent factors: the column temperature (°C), pH, phosphate buffer concentration (mM), flow rate (ml/min) and the initial fraction of mobile phase B (%). pH, flow rate and initial fraction of mobile phase B were identified as significant, using analysis of variance. The optimum conditions of separation determined with the aid of central composite design were: (1) initial mobile phase concentration: phosphate buffer/methanol (50/50, v/v), (2) phosphate buffer concentration (50 mM), (3) pH (4.72), (4) column temperature 30°C and (5) mobile phase flow rate (0.8 ml min⁻¹). Excellent linearity was observed for all of the standard calibration curves, and the correlation coefficients were above 0.9999. Limits of detection for all of the analyzed compounds ranged between 0.02 and 0.11 μg ml⁻¹; limits of quantitation ranged between 0.06 and 0.33 μg ml⁻¹. The proposed method showed good prediction ability. The optimized method was validated according to ICH guidelines. Three commercially available tablets were analyzed, showing good % recovery and %RSD.
Indigenous Mortality (Revealed): The Invisible Illuminated
Ring, Ian; Arambula Solomon, Teshia G.; Gachupin, Francine C.; Smylie, Janet; Cutler, Tessa Louise; Waldon, John A.
2015-01-01
Inaccuracies in the identification of Indigenous status and in the collection of and access to vital statistics data impede the strategic implementation of evidence-based public health initiatives to reduce avoidable deaths. The impact of colonization and subsequent government initiatives has been commonly observed among the Indigenous peoples of Australia, Canada, New Zealand, and the United States. The quality of the Indigenous data that inform mortality statistics is similarly connected to these distal processes, which began with colonization. We discuss the methodological and technical challenges in measuring mortality for Indigenous populations within a historical and political context, and identify strategies for the accurate ascertainment and inclusion of Indigenous people in mortality statistics. PMID:25211754
Zhang, Z; Zheng, Y; Bian, X
2016-06-01
The results of recently published studies focusing on the effect of azithromycin as an adjunct to scaling and root planing (SRP) in the treatment of chronic periodontitis are inconsistent. We conducted a meta-analysis of randomized controlled clinical trials to examine the effect of azithromycin combined with SRP on periodontal clinical parameters as compared to SRP alone. An electronic search was carried out on Pubmed, Embase and the Cochrane Central Register of Controlled Trials from their earliest records through December 28, 2014 to identify studies that met pre-stated inclusion criteria. Reference lists of retrieved articles were also reviewed. Data were extracted independently by two authors. Either a fixed- or random-effects model was used to calculate the overall effect sizes of azithromycin on probing depth, attachment level (AL) and bleeding on probing (BOP). Heterogeneity was evaluated using the Q test and I² statistic. Publication bias was evaluated by Begg's test and Egger's test. A total of 14 trials were included in the meta-analysis. Compared with SRP alone, locally delivered azithromycin plus SRP statistically significantly reduced probing depth by 0.99 mm (95% CI 0.42-1.57) and increased AL by 1.12 mm (95% CI 0.31-1.92). In addition, systemically administered azithromycin plus SRP statistically significantly reduced probing depth by 0.21 mm (95% CI 0.12-0.29), BOP by 4.50% (95% CI 1.45-7.56) and increased AL by 0.23 mm (95% CI 0.07-0.39). Sensitivity analysis yielded similar results. No evidence of publication bias was observed. The additional benefit of systemic azithromycin was shown at the initially deep probing depth sites, but not at shallow or moderate sites. The overall effect sizes of systemic azithromycin showed a tendency to decrease with time, and meta-regression analysis suggested a negative relation between the length of follow-up and net change in probing depth (r = -0.05, p = 0.02). This meta-analysis provides further evidence that azithromycin used as an adjunct to SRP significantly improves the efficacy of non-surgical periodontal therapy on reducing probing depth, BOP and improving AL, particularly at the initially deep probing depth sites.
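The pooling step of such a meta-analysis can be sketched with the standard DerSimonian-Laird random-effects estimator; the per-study effects and variances below are invented, and the paper's own software and exact methods are not specified here:

```python
# Hedged sketch of a DerSimonian-Laird random-effects pooling step, with
# invented per-trial probing-depth reductions (mm) and within-study variances.
import numpy as np

y = np.array([0.30, 0.15, 0.25, 0.10, 0.40])        # per-study effect estimates
v = np.array([0.010, 0.008, 0.020, 0.005, 0.030])   # within-study variances

w = 1.0 / v
y_fe = np.sum(w * y) / np.sum(w)                # fixed-effect pooled estimate
Q = np.sum(w * (y - y_fe) ** 2)                 # Cochran's Q heterogeneity statistic
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) / (w.sum() - (w**2).sum() / w.sum()))
I2 = max(0.0, (Q - (k - 1)) / Q) * 100          # I^2 statistic (%)

w_re = 1.0 / (v + tau2)                         # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled={y_re:.3f} mm, 95% CI=({y_re-1.96*se:.3f}, {y_re+1.96*se:.3f}), I2={I2:.0f}%")
```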
Percolation Analysis as a Tool to Describe the Topology of the Large Scale Structure of the Universe
NASA Astrophysics Data System (ADS)
Yess, Capp D.
1997-09-01
Percolation analysis is the study of the properties of clusters. In cosmology, it is the statistics of the size and number of clusters. This thesis presents a refinement of percolation analysis and its application to astronomical data. An overview of the standard model of the universe and the development of large scale structure is presented in order to place the study in historical and scientific context. Then using percolation statistics we, for the first time, demonstrate the universal character of a network pattern in the real-space mass distributions resulting from nonlinear gravitational instability of initial Gaussian fluctuations. We also find that the maximum of the number of clusters statistic in the evolved, nonlinear distributions is determined by the effective slope of the power spectrum. Next, we present percolation analyses of Wiener Reconstructions of the IRAS 1.2 Jy Redshift Survey. There are ten reconstructions of galaxy density fields in real space spanning the range β = 0.1 to 1.0, where β = Ω^0.6/b, Ω is the present dimensionless density and b is the linear bias factor. Our method uses the growth of the largest cluster statistic to characterize the topology of a density field, where Gaussian randomized versions of the reconstructions are used as standards for analysis. For the reconstruction volume of radius R ≈ 100 h⁻¹ Mpc, percolation analysis reveals a slight 'meatball' topology for the real-space galaxy distribution of the IRAS survey. Finally, we employ a percolation technique developed for pointwise distributions to analyze two-dimensional projections of the three northern and three southern slices in the Las Campanas Redshift Survey and then give consideration to further study of the methodology, errors and application of percolation. We track the growth of the largest cluster as a topological indicator to a depth of 400 h⁻¹ Mpc, and report an unambiguous signal, with high signal-to-noise ratio, indicating a network topology which in two dimensions is indicative of a filamentary distribution. It is hoped that one day percolation analysis can characterize the structure of the universe to a degree that will aid theorists in confidently describing the nature of our world.
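The largest-cluster statistic at the heart of this kind of percolation analysis is easy to illustrate on a toy two-dimensional Gaussian random field; everything below (grid size, smoothing, thresholds) is an assumption for demonstration:

```python
# Illustrative percolation statistic on a smoothed Gaussian random field:
# threshold the density at decreasing levels and track the largest cluster's
# share of all overdense cells (the "largest cluster statistic").
import numpy as np
from scipy.ndimage import gaussian_filter, label

rng = np.random.default_rng(3)
field = gaussian_filter(rng.normal(size=(128, 128)), sigma=3)
field /= field.std()                            # normalize to unit variance

for thresh in (2.0, 1.0, 0.5, 0.0):
    mask = field > thresh
    labels, n = label(mask)
    if n == 0:
        continue
    sizes = np.bincount(labels.ravel())[1:]     # cluster sizes, background dropped
    frac = sizes.max() / sizes.sum()
    print(f"threshold={thresh:+.1f}: {n} clusters, largest holds {frac:.2f} of cells")
```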
[Generalization of the results of clinical studies through the analysis of subgroups].
Costa, João; Fareleira, Filipa; Ascensão, Raquel; Vaz Carneiro, António
2012-01-01
Subgroup analyses in clinical trials are usually performed to define the potential heterogeneity of treatment effect in relation to baseline risk, physiopathology, the practical application of therapy, or the under-utilization in clinical practice of effective interventions due to uncertainties in their benefit/risk ratio. When appropriately planned, subgroup analyses are a valid methodology to define benefits in subgroups of patients, thus providing good-quality evidence to support clinical decision making. However, to be correct, subgroup analyses should be defined a priori, limited in number, fully reported and, most importantly, supported by statistical tests for interaction. In this paper we present an example from the treatment of post-menopausal osteoporosis, in which the benefit of an intervention with a specific agent (bazedoxifene), a benefit that increases with fracture risk, was only disclosed after a post-hoc analysis of the initial global trial sample.
Incorporating covariates into fisheries stock assessment models with application to Pacific herring.
Deriso, Richard B; Maunder, Mark N; Pearson, Walter H
2008-07-01
We present a framework for evaluating the cause of fishery declines by integrating covariates into a fisheries stock assessment model. This allows the evaluation of fisheries' effects vs. natural and other human impacts. The analyses presented are based on integrating ecological science and statistics and form the basis for environmental decision-making advice. Hypothesis tests are described to rank hypotheses and determine the size of a multiple covariate model. We extend recent developments in integrated analysis and use novel methods to produce effect size estimates that are relevant to policy makers and include estimates of uncertainty. Results can be directly applied to evaluate trade-offs among alternative management decisions. The methods and results are also broadly applicable outside fisheries stock assessment. We show that multiple factors influence populations and that analysis of factors in isolation can be misleading. We illustrate the framework by applying it to Pacific herring of Prince William Sound, Alaska (USA). The Pacific herring stock that spawns in Prince William Sound is a stock that has collapsed, but there are several competing or alternative hypotheses to account for the initial collapse and subsequent lack of recovery. Factors failing the initial screening tests for statistical significance included indicators of the 1989 Exxon Valdez oil spill, coho salmon predation, sea lion predation, Pacific Decadal Oscillation, Northern Oscillation Index, and effects of containment in the herring egg-on-kelp pound fishery. The overall results indicate that the most statistically significant factors related to the lack of recovery of the herring stock involve competition or predation by juvenile hatchery pink salmon on herring juveniles. Secondary factors identified in the analysis were poor nutrition in the winter, ocean (Gulf of Alaska) temperature in the winter, the viral hemorrhagic septicemia virus, and the pathogen Ichthyophonus hoferi. The implication of this result to fisheries management in Prince William Sound is that it may well be difficult to simultaneously increase the production of pink salmon and maintain a viable Pacific herring fishery. The impact can be extended to other commercially important fisheries, and a whole ecosystem approach may be needed to evaluate the costs and benefits of salmon hatcheries.
Velocity bias in the distribution of dark matter halos
NASA Astrophysics Data System (ADS)
Baldauf, Tobias; Desjacques, Vincent; Seljak, Uroš
2015-12-01
The standard formalism for the coevolution of halos and dark matter predicts that any initial halo velocity bias rapidly decays to zero. We argue that, when the purpose is to compute statistics like power spectra etc., the coupling in the momentum conservation equation for the biased tracers must be modified. Our new formulation predicts the constancy in time of any statistical halo velocity bias present in the initial conditions, in agreement with peak theory. We test this prediction by studying the evolution of a conserved halo population in N-body simulations. We establish that the initial simulated halo density and velocity statistics show distinct features of the peak model and, thus, deviate from the simple local Lagrangian bias. We demonstrate, for the first time, that the time evolution of their velocity is in tension with the rapid decay expected in the standard approach.
Observation of non-classical correlations in sequential measurements of photon polarization
NASA Astrophysics Data System (ADS)
Suzuki, Yutaro; Iinuma, Masataka; Hofmann, Holger F.
2016-10-01
A sequential measurement of two non-commuting quantum observables results in a joint probability distribution for all output combinations that can be explained in terms of an initial joint quasi-probability of the non-commuting observables, modified by the resolution errors and back-action of the initial measurement. Here, we show that the error statistics of a sequential measurement of photon polarization performed at different measurement strengths can be described consistently by an imaginary correlation between the statistics of resolution and back-action. The experimental setup was designed to realize variable strength measurements with well-controlled imaginary correlation between the statistical errors caused by the initial measurement of diagonal polarizations, followed by a precise measurement of the horizontal/vertical polarization. We perform the experimental characterization of an elliptically polarized input state and show that the same complex joint probability distribution is obtained at any measurement strength.
The composite sequential clustering technique for analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
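A rough Python sketch of the two-stage idea follows; the sequential variance-analysis stage is replaced here by a simple distance-threshold pass (an assumption, not the paper's algorithm), whose output seeds an iterative K-means refinement:

```python
# Sketch of the composite idea: an initial clustering pass supplies seed
# centroids which a generalized K-means then refines.  The paper's sequential
# variance-analysis stage is stood in for by a crude sequential pass that
# opens a new cluster whenever a sample is far from all current centroids.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(m, 0.4, size=(100, 4)) for m in (0.0, 2.0, 4.0)])

# Stage 1: one sequential pass, threshold-based (stand-in for the paper's method)
centroids = [X[0]]
for x in X[1:]:
    d = np.linalg.norm(np.array(centroids) - x, axis=1)
    if d.min() > 2.0:                 # distance threshold: an assumption
        centroids.append(x)

# Stage 2: iterative K-means refinement seeded with the stage-1 clusters
km = KMeans(n_clusters=len(centroids), init=np.array(centroids), n_init=1).fit(X)
print(f"{len(centroids)} initial clusters; final inertia {km.inertia_:.1f}")
```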
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application that can be accessed through mobile devices, making it easier for users to access. The Statistics Data Analysis application covers various topics in basic statistics along with a parametric statistics data analysis application. The output of this application system is parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. Android application development is carried out using the Java programming language. The server programming language uses PHP with the CodeIgniter framework, and the database uses MySQL. The system development methodology used is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to understand statistical analysis on mobile devices.
Digest of Education Statistics, 2005. NCES 2006-030
ERIC Educational Resources Information Center
Snyder, Thomas D.; Tan, Alexandra G.; Hoffman, Charlene M.
2006-01-01
The 2005 edition of the "Digest of Education Statistics" is the 41st in a series of publications initiated in 1962. Its primary purpose is to provide a compilation of statistical information covering the broad field of American education from prekindergarten through graduate school. The "Digest" includes a selection of data…
Statistical Association Criteria in Forensic Psychiatry–A criminological evaluation of casuistry
Gheorghiu, V; Buda, O; Popescu, I; Trandafir, MS
2011-01-01
Purpose. Identification of potential shared primary psychoprophylaxis and crime prevention is measured by analyzing the rate of commitments for patients subjected to forensic examination. Material and method. The statistical study is retrospective and document-based. The statistical lot consists of 770 initial examination reports performed and completed during the whole year 2007, primarily analyzed in order to summarize the data within the National Institute of Forensic Medicine, Bucharest, Romania (INML), with one of the group variables being 'particularities of the psychiatric patient history', containing the items 'forensic onset', 'commitments within the last year prior to the examination' and 'absence of commitments within the last year prior to the examination'. The method used was the Kendall bivariate correlation. For this study, the authors separately analyze only the two items regarding commitments, by other correlation alternatives and by modern, elaborate statistical analyses, i.e. recording of the standard case study variables, Kendall bivariate correlation, cross tabulation, factor analysis and hierarchical cluster analysis. Results. The results are varied, from theoretically presumed clinical nosography (such as schizophrenia or manic depression), to non-presumed (conduct disorders) or unexpected behavioral acts, and therefore difficult to interpret. Conclusions. The authors took into consideration the features of the batch as well as the results of the previous standard correlation of the whole statistical lot. They emphasize the role of medical security measures that are actually applied in the therapeutic management in general and in risk and second-offence management in particular, as well as the role of forensic psychiatric examinations in the detection of certain aspects related to the monitoring of mental patients. PMID:21505571
Sugita, Mitsuro; Weatherbee, Andrew; Bizheva, Kostadinka; Popov, Ivan; Vitkin, Alex
2016-07-01
The probability density function (PDF) of light scattering intensity can be used to characterize the scattering medium. We have recently shown that in optical coherence tomography (OCT), a PDF formalism can be sensitive to the number of scatterers in the probed scattering volume and can be represented by the K-distribution, a functional descriptor for non-Gaussian scattering statistics. Expanding on this initial finding, here we examine polystyrene microsphere phantoms with different sphere sizes and concentrations, and also human skin and fingernail in vivo. It is demonstrated that the K-distribution offers an accurate representation for the measured OCT PDFs. The behavior of the shape parameter of K-distribution that best fits the OCT scattering results is investigated in detail, and the applicability of this methodology for biological tissue characterization is demonstrated and discussed.
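A moment-based estimate of the K-distribution shape parameter can illustrate the kind of fit described; the sketch below relies on the standard relation E[I²]/E[I]² = 2(1 + 1/ν) for K-distributed intensity and uses simulated, not measured, OCT intensities:

```python
# Moment-based sketch for the K-distribution shape parameter: for K-distributed
# intensity, E[I^2]/E[I]^2 = 2(1 + 1/nu), so nu can be read off the normalized
# second moment.  The intensities below are simulated, not OCT measurements.
import numpy as np

rng = np.random.default_rng(5)
nu_true = 3.0
# K-distributed intensity as exponential speckle modulated by a gamma-distributed mean:
I = rng.exponential(size=20000) * rng.gamma(shape=nu_true, scale=1.0 / nu_true, size=20000)

m = (I**2).mean() / I.mean()**2          # normalized second moment
nu_hat = 2.0 / (m - 2.0)
print(f"true nu={nu_true}, estimated nu={nu_hat:.2f}")
```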
A bootstrap based Neyman-Pearson test for identifying variable importance.
Ditzler, Gregory; Polikar, Robi; Rosen, Gail
2015-04-01
Selection of the most informative features that lead to a small loss on future data is arguably one of the most important steps in classification, data analysis and model selection. Several feature selection (FS) algorithms are available; however, due to noise present in any data set, FS algorithms are typically accompanied by an appropriate cross-validation scheme. In this brief, we propose a statistical hypothesis test derived from the Neyman-Pearson lemma for determining if a feature is statistically relevant. The proposed approach can be applied as a wrapper to any FS algorithm, regardless of the FS criteria used by that algorithm, to determine whether a feature belongs in the relevant set. Perhaps more importantly, this procedure efficiently determines the number of relevant features given an initial starting point. We provide freely available software implementations of the proposed methodology.
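The flavor of the approach can be sketched as follows; this is a hedged stand-in using permutation of a candidate feature across bootstrap replicates and a binomial test, not the authors' exact Neyman-Pearson procedure:

```python
# Hedged stand-in for the paper's idea (bootstrap + a hypothesis test on feature
# relevance): across bootstrap replicates, check whether shuffling a candidate
# feature degrades out-of-bag accuracy, then test the success count against
# chance with a one-sided binomial test.  Not the authors' exact NP test.
import numpy as np
from scipy.stats import binomtest
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
X, y = make_classification(n_samples=400, n_features=10, n_informative=3,
                           random_state=0)
feature = 0                                    # candidate feature to test
wins, B = 0, 50
for _ in range(B):
    idx = rng.integers(0, len(y), len(y))      # bootstrap sample
    oob = np.setdiff1d(np.arange(len(y)), idx) # out-of-bag validation set
    clf = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    acc = clf.score(X[oob], y[oob])
    X_perm = X[oob].copy()
    X_perm[:, feature] = rng.permutation(X_perm[:, feature])
    wins += acc > clf.score(X_perm, y[oob])

p = binomtest(wins, B, 0.5, alternative='greater').pvalue
print(f"feature {feature}: shuffling hurt in {wins}/{B} replicates, p={p:.4f}")
```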
NASA Astrophysics Data System (ADS)
Ceppi, C.; Mancini, F.; Ritrovato, G.
2009-04-01
This study aims at landslide susceptibility mapping in an area of the Daunia (Apulian Apennines, Italy) by a multivariate statistical method and data manipulation in a Geographical Information System (GIS) environment. Among the variety of existing statistical data analysis techniques, logistic regression was chosen to produce a susceptibility map over an area where small settlements are historically threatened by landslide phenomena. In logistic regression, a best fit between the presence or absence of landslides (dependent variable) and the set of independent variables is performed on the basis of a maximum likelihood criterion, leading to the estimation of regression coefficients. The reliability of such an analysis is therefore due to its ability to quantify the proneness to landslide occurrence through the probability level it produces. The inventory of dependent and independent variables was managed in a GIS, where geometric properties and attributes were translated into raster cells in order to proceed with the logistic regression by means of the SPSS (Statistical Package for the Social Sciences) package. A landslide inventory was used to produce the binary dependent variable, whereas the independent variables comprised slope, aspect, elevation, curvature, drained area, lithology and land use, after their reduction to dummy variables. The effect of independent parameters on landslide occurrence was assessed by the corresponding coefficient in the logistic regression function, highlighting a major role played by the land use variable in determining the occurrence and distribution of phenomena. Once the outcomes of the logistic regression were determined, the data were re-introduced into the GIS to produce a map reporting the proneness to landslide as a predicted level of probability. As validation of the results and regression model, a cell-by-cell comparison between the susceptibility map and the initial inventory of landslide events was performed and an agreement at the 75% level was achieved.
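The core modeling step translates readily into code; the sketch below fits a logistic regression to synthetic stand-ins for the rasterized predictors and maps predicted probabilities back to cells (variable names and coefficients are invented):

```python
# Conceptual sketch of the susceptibility-mapping step: fit a logistic
# regression of landslide presence/absence on per-cell predictors, then map
# the predicted probability back onto the grid.  Data are synthetic stand-ins
# for the rasterized GIS variables (slope, land-use dummies, etc.).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n_cells = 5000
slope = rng.uniform(0, 40, n_cells)                  # degrees
landuse_arable = rng.integers(0, 2, n_cells)         # dummy variable
X = np.column_stack([slope, landuse_arable])
logit = -4.0 + 0.08 * slope + 1.2 * landuse_arable   # assumed true model
y = rng.random(n_cells) < 1 / (1 + np.exp(-logit))   # landslide presence

model = LogisticRegression(max_iter=1000).fit(X, y)
susceptibility = model.predict_proba(X)[:, 1]        # probability per raster cell
print("coefficients:", model.coef_[0], "agreement:", model.score(X, y))
```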
Missing Data and Multiple Imputation: An Unbiased Approach
NASA Technical Reports Server (NTRS)
Foy, M.; VanBaalen, M.; Wear, M.; Mendez, C.; Mason, S.; Meyers, V.; Alexander, D.; Law, J.
2014-01-01
The default method of dealing with missing data in statistical analyses is to only use the complete observations (complete case analysis), which can lead to unexpected bias when data do not meet the assumption of missing completely at random (MCAR). For the assumption of MCAR to be met, missingness cannot be related to either the observed or unobserved variables. A less stringent assumption, missing at random (MAR), requires that missingness not be associated with the value of the missing variable itself, but can be associated with the other observed variables. When data are truly MAR as opposed to MCAR, the default complete case analysis method can lead to biased results. There are statistical options available to adjust for data that are MAR, including multiple imputation (MI) which is consistent and efficient at estimating effects. Multiple imputation uses informing variables to determine statistical distributions for each piece of missing data. Then multiple datasets are created by randomly drawing on the distributions for each piece of missing data. Since MI is efficient, only a limited number, usually less than 20, of imputed datasets are required to get stable estimates. Each imputed dataset is analyzed using standard statistical techniques, and then results are combined to get overall estimates of effect. A simulation study will be demonstrated to show the results of using the default complete case analysis, and MI in a linear regression of MCAR and MAR simulated data. Further, MI was successfully applied to the association study of CO2 levels and headaches when initial analysis showed there may be an underlying association between missing CO2 levels and reported headaches. Through MI, we were able to show that there is a strong association between average CO2 levels and the risk of headaches. Each unit increase in CO2 (mmHg) resulted in a doubling in the odds of reported headaches.
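A minimal sketch of the MI workflow, using scikit-learn's IterativeImputer as one possible implementation (the simulation's variables and missingness mechanism are invented, and Rubin's pooling rules are reduced to a mean of point estimates):

```python
# Minimal MI sketch: create several imputed datasets, analyze each with the
# same model, and pool the estimates.  Variables and the MAR mechanism are
# invented; full Rubin's rules would also pool the variances.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(8)
n = 300
co2 = rng.normal(4.0, 1.0, n)                    # exposure (e.g. CO2 in mmHg)
outcome = 0.7 * co2 + rng.normal(0, 1, n)        # e.g. headache score
co2_obs = co2.copy()
co2_obs[outcome > 1.5 * rng.random(n) + 3.0] = np.nan   # MAR-like missingness

data = np.column_stack([co2_obs, outcome])
betas = []
for m in range(10):                              # m imputed datasets
    imp = IterativeImputer(sample_posterior=True, random_state=m)
    filled = imp.fit_transform(data)
    betas.append(LinearRegression()
                 .fit(filled[:, [0]], filled[:, 1]).coef_[0])
print(f"pooled slope estimate: {np.mean(betas):.3f} (true 0.7)")
```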
Smith, Lerissa; Zhang, Shun; Fairchild, Amanda J.; Heiman, Harry J.; Rust, George
2014-01-01
Objectives. We examined whether the timely initiation of antiretroviral therapy (ART) differed by race and comorbidity among older (≥ 50 years) people living with HIV/AIDS (PLWHA). Methods. We conducted frequency and descriptive statistics analysis to characterize our sample, which we drew from 2005–2007 Medicaid claims data from 14 states. We employed univariate and multivariable Cox regression analyses to evaluate the relationship between race, comorbidity, and timely ART initiation (≤ 90 days post-HIV/AIDS diagnosis). Results. Approximately half of the participants did not commence ART promptly. After we adjusted for covariates, we found that older PLWHA who reported a comorbidity were 40% (95% confidence interval = 0.26, 0.61) as likely to commence ART promptly. We found no racial differences in the timely initiation of ART among older PLWHA. Conclusions. Comorbidities affect timely ART initiation in older PLWHA. Older PLWHA may benefit from integrating and coordinating HIV care with care for other comorbidities and the development of ART treatment guidelines specific to older PLWHA. Consistent Medicaid coverage helps ensure consistent access to HIV treatment and care and may eliminate racial disparities in timely ART initiation among older PLWHA. PMID:25211735
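The survival-modeling step might look like the following sketch using the lifelines package; the column names and the ten-row toy dataset are invented stand-ins for the Medicaid claims data:

```python
# Hedged sketch of a Cox proportional hazards model for time to ART
# initiation; all values and column names are invented stand-ins.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_to_art": [30, 95, 90, 20, 90, 60, 45, 90, 75, 10],
    "initiated":   [1,  0,  1,  1,  0,  1,  1,  0,  1,  1],   # 1 = ART begun
    "comorbidity": [0,  1,  1,  0,  1,  0,  1,  1,  0,  0],
    "black":       [1,  0,  1,  1,  0,  0,  1,  0,  1,  0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_art", event_col="initiated")
cph.print_summary()   # hazard ratios for comorbidity and race
```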
Synthetic Training Data Generation for Activity Monitoring and Behavior Analysis
NASA Astrophysics Data System (ADS)
Monekosso, Dorothy; Remagnino, Paolo
This paper describes a data generator that produces synthetic data to simulate observations from an array of environment monitoring sensors. The overall goal of our work is to monitor the well-being of one occupant in a home. Sensors are embedded in a smart home to unobtrusively record environmental parameters. Based on the sensor observations, behavior analysis and modeling are performed. However, behavior analysis and modeling require large data sets to be collected over long periods of time to achieve the expected level of accuracy. A data generator was therefore developed based on initial data, i.e., data collected over periods lasting weeks, to facilitate concurrent data collection and algorithm development. The data generator is based on statistical inference techniques. Variation is introduced into the data using perturbation models.
Experiments with a three-dimensional statistical objective analysis scheme using FGGE data
NASA Technical Reports Server (NTRS)
Baker, Wayman E.; Bloom, Stephen C.; Woollen, John S.; Nestler, Mark S.; Brin, Eugenia
1987-01-01
A three-dimensional (3D), multivariate, statistical objective analysis scheme (referred to as optimum interpolation or OI) has been developed for use in numerical weather prediction studies with the FGGE data. Some novel aspects of the present scheme include: (1) a multivariate surface analysis over the oceans, which employs an Ekman balance instead of the usual geostrophic relationship, to model the pressure-wind error cross correlations, and (2) the capability to use an error correlation function which is geographically dependent. A series of 4-day data assimilation experiments are conducted to examine the importance of some of the key features of the OI in terms of their effects on forecast skill, as well as to compare the forecast skill using the OI with that utilizing a successive correction method (SCM) of analysis developed earlier. For the three cases examined, the forecast skill is found to be rather insensitive to varying the error correlation function geographically. However, significant differences are noted between forecasts from a two-dimensional (2D) version of the OI and those from the 3D OI, with the 3D OI forecasts exhibiting better forecast skill. The 3D OI forecasts are also more accurate than those from the SCM initial conditions. The 3D OI with the multivariate oceanic surface analysis was found to produce forecasts which were slightly more accurate, on the average, than a univariate version.
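The statistical analysis step of any OI scheme reduces to a matrix update; the following generic sketch (tiny state vector and illustrative covariances, not the FGGE system) shows the form x_a = x_b + BHᵀ(HBHᵀ + R)⁻¹(y − Hx_b):

```python
# Generic optimum-interpolation (OI) update in matrix form, a sketch of the
# statistical analysis step.  The tiny state and covariances are illustrative.
import numpy as np

x_b = np.array([10.0, 12.0, 11.0])        # background (first-guess) state
B = np.array([[1.0, 0.5, 0.2],            # background error covariance
              [0.5, 1.0, 0.5],
              [0.2, 0.5, 1.0]])
H = np.array([[1.0, 0.0, 0.0],            # observation operator: obs at points 1 and 3
              [0.0, 0.0, 1.0]])
R = 0.5 * np.eye(2)                       # observation error covariance
y = np.array([10.8, 11.9])                # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain (weight) matrix
x_a = x_b + K @ (y - H @ x_b)                  # analysis
print("analysis increment:", x_a - x_b)
```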
Quench dynamics in superconducting nanojunctions: Metastability and dynamical Yang-Lee zeros
NASA Astrophysics Data System (ADS)
Souto, R. Seoane; Martín-Rodero, A.; Yeyati, A. Levy
2017-10-01
We study the charge transfer dynamics following the formation of a phase or voltage biased superconducting nanojunction using a full counting statistics analysis. We demonstrate that the evolution of the zeros of the generating function allows one to identify the population of different many body states much in the same way as the accumulation of Yang-Lee zeros of the partition function in equilibrium statistical mechanics is connected to phase transitions. We give an exact expression connecting the dynamical zeros to the charge transfer cumulants and discuss when an approximation based on "dominant" zeros is valid. We show that, for generic values of the parameters, the system gets trapped into a metastable state characterized by a nonequilibrium population of the many body states which is dependent on the initial conditions. We study in particular the effect of the switching rates in the dynamics showing that, in contrast to intuition, the deviation from thermal equilibrium increases for the slower rates. In the voltage biased case the steady state is reached independent of the initial conditions. Our method allows us to obtain accurate results for the steady state current and noise in quantitative agreement with steady state methods developed to describe the multiple Andreev reflections regime. Finally, we discuss the system dynamics after a sudden voltage drop showing the possibility of tuning the many body states population by an appropriate choice of the initial voltage, providing a feasible experimental way to access the quench dynamics and control the state of the system.
Empirical analysis on the runners' velocity distribution in city marathons
NASA Astrophysics Data System (ADS)
Lin, Zhenquan; Meng, Fan
2018-01-01
In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have examined the features of the velocity distributions of human mobility patterns. In this paper, we investigated empirically the velocity distributions of finishers in the New York City, Chicago, Berlin and London marathons. By statistical analyses of the finish-time records, we captured some statistical features of human behavior in marathons: (1) The velocity distributions of all finishers, and of partial finishers in the fastest age group, both follow a log-normal distribution; (2) In the New York City marathon, the velocity distribution of all male runners in eight 5-kilometer interval timing courses undergoes two transitions: from a log-normal distribution at the initial stage (several initial courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (several last courses); (3) The intensity of the competition, described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition of the velocity distribution from log-normal to Gaussian, and when the competition strengthens in the last course of the middle stage, the distribution transitions from Gaussian back to log-normal at the last stage. This study may enrich research on human mobility patterns and draw attention to the velocity features of human mobility.
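A distribution check of the kind described can be sketched as follows, fitting log-normal and Gaussian models to simulated velocities and comparing log-likelihoods (real inputs would be split-time records):

```python
# Sketch of the distribution check described above: fit a log-normal to runner
# velocities and compare log-normal vs Gaussian fits by log-likelihood.
# Velocities are simulated placeholders, not marathon data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
v = rng.lognormal(mean=np.log(3.0), sigma=0.15, size=2000)  # m/s, synthetic

shape, loc, scale = stats.lognorm.fit(v, floc=0)
ll_lognorm = stats.lognorm.logpdf(v, shape, loc, scale).sum()
mu, sd = stats.norm.fit(v)
ll_norm = stats.norm.logpdf(v, mu, sd).sum()
print(f"log-normal LL={ll_lognorm:.1f}, Gaussian LL={ll_norm:.1f}")
print("log-normal preferred" if ll_lognorm > ll_norm else "Gaussian preferred")
```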
Applied statistical training to strengthen analysis and health research capacity in Rwanda.
Thomson, Dana R; Semakula, Muhammed; Hirschhorn, Lisa R; Murray, Megan; Ndahindwa, Vedaste; Manzi, Anatole; Mukabutera, Assumpta; Karema, Corine; Condo, Jeanine; Hedt-Gauthier, Bethany
2016-09-29
To guide efficient investment of limited health resources in sub-Saharan Africa, local researchers need to be involved in, and guide, health system and policy research. While extensive survey and census data are available to health researchers and program officers in resource-limited countries, local involvement and leadership in research is limited due to inadequate experience, lack of dedicated research time and weak interagency connections, among other challenges. Many research-strengthening initiatives host prolonged fellowships out-of-country, yet their approaches have not been evaluated for effectiveness in involvement and development of local leadership in research. We developed, implemented and evaluated a multi-month, deliverable-driven, survey analysis training based in Rwanda to strengthen skills of five local research leaders, 15 statisticians, and a PhD candidate. Research leaders applied with a specific research question relevant to country challenges and committed to leading an analysis to publication. Statisticians with prerequisite statistical training and experience with a statistical software applied to participate in class-based trainings and complete an assigned analysis. Both statisticians and research leaders were provided ongoing in-country mentoring for analysis and manuscript writing. Participants reported a high level of skill, knowledge and collaborator development from class-based trainings and out-of-class mentorship that were sustained 1 year later. Five of six manuscripts were authored by multi-institution teams and submitted to international peer-reviewed scientific journals, and three-quarters of the participants mentored others in survey data analysis or conducted an additional survey analysis in the year following the training. Our model was effective in utilizing existing survey data and strengthening skills among full-time working professionals without disrupting ongoing work commitments and using few resources. Critical to our success were a transparent, robust application process and time limited training supplemented by ongoing, in-country mentoring toward manuscript deliverables that were led by Rwanda's health research leaders.
ViPAR: a software platform for the Virtual Pooling and Analysis of Research Data.
Carter, Kim W; Francis, Richard W; Carter, K W; Francis, R W; Bresnahan, M; Gissler, M; Grønborg, T K; Gross, R; Gunnes, N; Hammond, G; Hornig, M; Hultman, C M; Huttunen, J; Langridge, A; Leonard, H; Newman, S; Parner, E T; Petersson, G; Reichenberg, A; Sandin, S; Schendel, D E; Schalkwyk, L; Sourander, A; Steadman, C; Stoltenberg, C; Suominen, A; Surén, P; Susser, E; Sylvester Vethanayagam, A; Yusof, Z
2016-04-01
Research studies exploring the determinants of disease require sufficient statistical power to detect meaningful effects. Sample size is often increased through centralized pooling of disparately located datasets, though ethical, privacy and data ownership issues can often hamper this process. Methods that facilitate the sharing of research data that are sympathetic with these issues and which allow flexible and detailed statistical analyses are therefore in critical need. We have created a software platform for the Virtual Pooling and Analysis of Research data (ViPAR), which employs free and open source methods to provide researchers with a web-based platform to analyse datasets housed in disparate locations. Database federation permits controlled access to remotely located datasets from a central location. The Secure Shell protocol allows data to be securely exchanged between devices over an insecure network. ViPAR combines these free technologies into a solution that facilitates 'virtual pooling' where data can be temporarily pooled into computer memory and made available for analysis without the need for permanent central storage. Within the ViPAR infrastructure, remote sites manage their own harmonized research dataset in a database hosted at their site, while a central server hosts the data federation component and a secure analysis portal. When an analysis is initiated, requested data are retrieved from each remote site and virtually pooled at the central site. The data are then analysed by statistical software and, on completion, results of the analysis are returned to the user and the virtually pooled data are removed from memory. ViPAR is a secure, flexible and powerful analysis platform built on open source technology that is currently in use by large international consortia, and is made publicly available at [http://bioinformatics.childhealthresearch.org.au/software/vipar/].
Radwan, Ahmed Bassiuony; El-Debeiky, Mohammed Soliman; Abdel-Hay, Sameh
2015-08-01
Overflow retentive stool incontinence (ORSI) is secondary to constipation and fecal loading. In our study, the dose and duration of senna-based laxative (SBL) treatment needed to achieve full defecatory control were examined for possible correlation with new parameters measured from the initial contrast enema. Initially, an observational study was conducted prospectively on a group of patients with ORSI to define the optimum dose of SBL to achieve full defecatory control, with measurement of six parameters in the initial contrast enema (level of colonic dilatation, recto-anal angle, ratio of maximal diameter of dilated colon to last lumbar spine, ratio of maximum diameter of dilated colon to normal descending colon, and immediate and 24-h post-evacuation residual contrast). The results were analyzed statistically to identify correlations between the radiological data and the prescribed dose. Over two and a half years, 72 patients were included in the study; their mean age was 6.3 ± 3.33 years. The mean effective starting dose of SBL was 57 ± 18.13 mg/day and the mean effective ending dose was 75 ± 31.68 mg/day. Time elapsed until full defecatory control ranged from 1 to 16 weeks. Statistical correlation revealed that the mean effective ending dose of SBL treatment significantly increased with higher levels of colonic dilatation. A weak positive correlation was found for both the mean effective starting and ending doses with the ratio of maximum colonic diameter to last lumbar spine and with the descending colonic diameter ratio. Senna-based laxatives are an effective treatment for overflow retentive stool incontinence, and their doses can be adjusted initially based on analysis of the radiological data.
Considerations for initial dosing of botulinum toxin in treatment of adductor spasmodic dysphonia.
Rosow, David E; Parikh, Punam; Vivero, Richard J; Casiano, Roy R; Lundy, Donna S
2013-06-01
To assess the effect on voice improvement and duration of breathiness based on initial dose of onabotulinum toxin A (BTX-A) in the management of adductor spasmodic dysphonia (SD) and to compare voice outcomes for initial bilaterally injected doses of 1.25 units (group A) vs 2.5 units (group B) of BTX-A. Case series with chart review of patients with adductor SD treated at a tertiary care facility from 1990 to 2011. Academic subspecialty laryngology practice. Demographic data (age and sex), voice rating, duration of voice improvement, and breathiness were evaluated and compared between groups A and B using the Student t test and χ² analysis. Of 478 patients identified, 305 (223 in group A, 82 in group B) patients met inclusion criteria. The average age was 56.2 years in group A and 57.4 years in group B (P = .5). The female to male ratio was 2.91 for group A vs 3.56 for group B (P = .61). Good voice outcomes (grade 3 or 4) were reported by 91% of group A patients vs 94% of group B (P = .75). The average duration of voice improvement was 99.7 days for group A and 108.3 days for group B (P = .54). The average duration of breathiness was 10.88 days for group A vs 15.42 days for group B (P = .02). Patients injected with 1.25 units bilaterally had a significantly shorter duration of breathiness, without a statistically significant difference in clinical effectiveness or voice outcome. It is therefore recommended that a relatively low initial BTX-A dose be used with subsequent titration to achieve improved voice outcomes.
Common, Jessica L; Mariathas, Hensley H; Parsons, Kaylah; Greenland, Jonathan D; Harris, Scott; Bhatia, Rick; Byrne, Suzanne C
2018-06-04
A multidisciplinary, centralized referral program was established at our institution in 2014 to reduce delays in lung cancer diagnosis and treatment following diagnostic imaging observed with the traditional, primary care provider-led referral process. The main objectives of this retrospective cohort study were to determine if referral to a Thoracic Triage Panel (TTP): 1) expedites lung cancer diagnosis and treatment initiation; and 2) leads to more appropriate specialist consultation. Patients with a diagnosis of lung cancer and initial diagnostic imaging between March 1, 2015, and February 29, 2016, at a Memorial University-affiliated tertiary care centre in St John's, Newfoundland, were identified and grouped according to whether they were referred to the TTP or managed through a traditional referral process. Wait times (in days) from first abnormal imaging to biopsy and treatment initiation were recorded. Statistical analysis was performed using the Wilcoxon rank-sum test. A total of 133 patients who met inclusion criteria were identified. Seventy-nine patients were referred to the TTP and 54 were managed by traditional means. There was a statistically significant reduction in median wait times for patients referred to the TTP. Wait time from first abnormal imaging to biopsy decreased from 61.5 to 36.0 days (P < .0001). Wait time from first abnormal imaging to treatment initiation decreased from 118.0 to 80.0 days (P < .001). The percentage of specialist consultations that led to treatment was also greater for patients referred to the TTP. A collaborative, centralized intake and referral program helps to reduce wait time for diagnosis and treatment of lung cancer.
Educational debt: does it have an influence on initial job location and specialty choice?
Snyder, Jennifer; Nehrenz, Guy; Danielsen, Randy; Pedersen, Donald
2014-01-01
This study applied a quantitative design and analyzed the impact of educational debt on initial specialty and location choices for physician assistant (PA) graduates in Indiana. PAs who graduated between January 1, 2000, and December 31, 2010, and actively practice in Indiana were surveyed. Descriptive statistics and chi-square analyses were performed to determine whether any significant relationships existed among practice specialty, location, and gender. A total of 157 participants (33%) responded to the survey and were considered in the final analysis. Males were more likely than females to be influenced by debt in choosing their specialty and the location of their initial job. A majority of PAs would have reconsidered rural practice if they had received federal and/or state loan forgiveness for educational debt. This study provides evidence that debt may influence practice specialty and location choice. Further studies are needed to determine how gender might account for decisions to practice in certain specialties and locations.
Large deformation image classification using generalized locality-constrained linear coding.
Zhang, Pei; Wee, Chong-Yaw; Niethammer, Marc; Shen, Dinggang; Yap, Pew-Thian
2013-01-01
Magnetic resonance (MR) imaging has been demonstrated to be very useful for clinical diagnosis of Alzheimer's disease (AD). A common approach to using MR images for AD detection is to spatially normalize the images by non-rigid image registration and then perform statistical analysis on the resulting deformation fields. Due to the high nonlinearity of the deformation field, recent studies suggest using the initial momentum instead, as it lies in a linear space and fully encodes the deformation field. In this paper we explore the use of initial momentum for image classification by focusing on the problem of AD detection. Experiments on the public ADNI dataset show that the initial momentum, together with a simple sparse coding technique, locality-constrained linear coding (LLC), can achieve a classification accuracy that is comparable to or even better than the state of the art. We also show that the performance of LLC can be greatly improved by introducing proper weights to the codebook.
No association of SORL1 SNPs with Alzheimer’s disease
Minster, Ryan L.; DeKosky, Steven T.; Kamboh, M. Ilyas
2008-01-01
SORL1 is an element of the amyloid precursor protein processing pathway and is therefore a good candidate for affecting Alzheimer’s disease (AD) risk. Indeed, there have been reports of associations between variation in SORL1 and AD risk. We examined six statistically significant single-nucleotide polymorphisms from the initial observation in a large Caucasian American case-control cohort (1000 late-onset AD [LOAD] cases and 1000 older controls). Analysis of allele, genotype and haplotype frequencies revealed no association with LOAD risk in our cohort. PMID:18562096
Substorm injection boundaries. [magnetospheric electric field model
NASA Technical Reports Server (NTRS)
Mcilwain, C. E.
1974-01-01
An improved magnetospheric electric field model is used to compute the initial locations of particles injected by several substorms. Trajectories are traced from the time of their encounter with the ATS-5 satellite backwards to the onset time given by ground-based magnetometers. A spiral-shaped inner boundary of injection is found which is quite similar to that found by a statistical analysis. This injection boundary is shown to move in an energy-dependent fashion which can explain the soft energy spectra observed at the inner edge of the electron plasma sheet.
An unsupervised classification technique for multispectral remote sensing data.
NASA Technical Reports Server (NTRS)
Su, M. Y.; Cummings, R. E.
1973-01-01
Description of a two-part clustering technique consisting of (a) a sequential statistical clustering, which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum-likelihood classification techniques.
Unsupervised classification of earth resources data.
NASA Technical Reports Server (NTRS)
Su, M. Y.; Jayroe, R. R., Jr.; Cummings, R. E.
1972-01-01
A new clustering technique is presented. It consists of two parts: (a) a sequential statistical clustering which is essentially a sequential variance analysis and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy of the unsupervised technique is found to be comparable to that of existing supervised maximum likelihood classification techniques.
Initial statistics from the Perth Automated Supernova Search
NASA Astrophysics Data System (ADS)
Williams, A. J.
1997-08-01
The Perth Automated Supernova Search uses the 61-cm PLAT (Perth Lowell Automated Telescope) at Perth Observatory, Western Australia. Since 1993 January 1, five confirmed supernovae have been found by the search. The analysis of the first three years of data is discussed, and preliminary results presented. We find a Type Ib/c rate of 0.43 +/- 0.43 SNu, and a Type IIP rate of 0.86 +/- 0.49 SNu, where SNu are 'supernova units'. These values are for a Hubble constant of 75 km s⁻¹ Mpc⁻¹.
Reliability analysis of structural ceramics subjected to biaxial flexure
NASA Technical Reports Server (NTRS)
Chao, Luen-Yuan; Shetty, Dinesh K.
1991-01-01
The reliability of alumina disks subjected to biaxial flexure is predicted on the basis of statistical fracture theory using a critical strain energy release rate fracture criterion. Results on a sintered silicon nitride are consistent with reliability predictions based on pore-initiated penny-shaped cracks with preferred orientation normal to the maximum principal stress. Assumptions with regard to flaw types and their orientations in each ceramic can be justified by fractography. It is shown that there are no universal guidelines for selecting fracture criteria or assuming flaw orientations in reliability analyses.
2012-09-01
Originally, Resio and Vincent (1977) used theoretical results derived from Cardone (1969) to develop curves relating overland to overlake wind speeds... Later, Schwab (1978) proposed the following equation as an approximation to the Cardone curves: [equation garbled in the source; the surviving symbols suggest a relation between the overlake and overland wind speeds (U_L, U_W) and the air-water temperature difference ΔT] ...(1992), the CEM (USACE 2002), and in Smith (1991); Schwab and Morton (1984); Donelan (1980); Schwab (1978); Resio and Vincent (1977); and Cardone
Metrology in health: a pilot study
NASA Astrophysics Data System (ADS)
Ferreira, M.; Matos, A.
2015-02-01
The purpose of this paper is to identify and analyze some relevant issues which arise when the concept of metrological traceability is applied to health care facilities. The discussion is structured around results obtained through a characterization and comparative description of the practices applied in 45 different Portuguese health entities. Following a qualitative exploratory approach, the information collected supported the initial research hypotheses and the development of the questionnaire survey. A quantitative methodology was also applied, including a descriptive and inferential statistical analysis of the experimental data set.
A new criterion for predicting rolling-element fatigue lives of through-hardened steels
NASA Technical Reports Server (NTRS)
Chevalier, J. L.; Zaretsky, E. V.; Parker, R. J.
1972-01-01
A carbide factor was derived based upon a statistical analysis which related rolling-element fatigue life to the total number of residual carbide particles per unit area, median residual carbide size, and percent residual carbide area. An equation was experimentally determined which predicts material hardness as a function of temperature. The limiting temperatures of all of the materials studied were dependent on initial room temperature hardness and tempering temperature. An equation was derived combining the effects of material hardness, carbide factor, and bearing temperature to predict rolling-element bearing life.
Askarian, Mehrdad; Kouchak, Farideh; Youssef, Moussa; Romito, Laura M
2013-10-01
To compare the level of knowledge, the attitudes, and practices with regard to tobacco use between Iranian students at a public (PBU) and an Islamic Azad (IAU) university. A cross-sectional design was used in this study. As the number of students at the IAU was three times greater than that of the PBU, we selected 150 students from the PBU and 450 students from the IAU using simple random sampling. A 57-item survey instrument was utilized for this study. The collected data were recorded in SPSS version 15 software and underwent statistical analysis using descriptive statistics and ANOVA to compare the differences between mean knowledge, attitude and practice scores. Logistic regression analysis was conducted to identify variables that have an independent association with students' smoking and to describe possible variations in these relationships. The level for statistical significance was set at P = 0.05. Of the participants, 46.8% were female, and 10% of 327 students reported being daily smokers; of these, 84% were from the IAU. In total, among the 107 smokers, 61 (57%) and 29 (27.1%) were water pipe and cigarette smokers, respectively. Ninety-three IAU students (21.7%) and 30 PBU students (20.7%) reported smoking during the past 30 days. The mean of the knowledge items among IAU students was lower than among PBU students. Female gender, smoking in the home, and allowing visitors to smoke in the home were significant predictors of smoking in the past 30 days at the PBU. At the IAU, female gender, smoking by friends, and health status were predictors of smoking in the past 30 days. Future studies should assess the factors affecting smoking initiation, as well as effective techniques for the prevention of smoking initiation and substance abuse in Iranian adolescents and young adults.
AG Channel Measurement and Modeling Results for Over-Water and Hilly Terrain Conditions
NASA Technical Reports Server (NTRS)
Matolak, David W.; Sun, Ruoyu
2015-01-01
This report describes work completed over the past year on our project, entitled "Unmanned Aircraft Systems (UAS) Research: The AG Channel, Robust Waveforms, and Aeronautical Network Simulations." This project is funded under the NASA project "Unmanned Aircraft Systems (UAS) in the National Airspace System (NAS)." In this report we provide the following: an update on project progress; a description of the over-freshwater and hilly terrain initial results on path loss, delay spread, small-scale fading, and correlations; complete path loss models for the over-water AG channels; analysis for obtaining parameter statistics required for development of accurate wideband AG channel models; and analysis of an atypical AG channel in which the aircraft flies out of the ground site antenna main beam. We have modeled the small-scale fading of these channels with Ricean statistics, and have quantified the behavior of the Ricean K-factor. We also provide some results for correlations of signal components, both intra-band and inter-band. An updated literature review, and a summary that also describes future work, are also included.
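The Ricean K-factor quantification mentioned above is commonly done with a moment-based estimator; the following sketch simulates Ricean fading and recovers K from the normalized variance of the power (this generic estimator is an assumption, not necessarily the report's exact method):

```python
# Moment-based Ricean K-factor estimator as a sketch of small-scale fading
# characterization; samples are simulated.  With power G = |h|^2 and
# gamma = Var[G]/E[G]^2, one has K = sqrt(1-gamma) / (1 - sqrt(1-gamma)).
import numpy as np

rng = np.random.default_rng(10)
K_true = 5.0                                # linear, not dB
s = np.sqrt(K_true / (K_true + 1))          # line-of-sight amplitude
sigma = np.sqrt(1 / (2 * (K_true + 1)))     # diffuse component std per dimension
h = s + sigma * (rng.normal(size=50000) + 1j * rng.normal(size=50000))

G = np.abs(h) ** 2
gamma = G.var() / G.mean() ** 2
K_hat = np.sqrt(1 - gamma) / (1 - np.sqrt(1 - gamma))
print(f"true K={K_true:.1f} ({10*np.log10(K_true):.1f} dB), estimated K={K_hat:.2f}")
```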
NASA Technical Reports Server (NTRS)
Bauman, William H., III
2010-01-01
The 45th Weather Squadron (45 WS) Launch Weather Officers use the 12-km resolution North American Mesoscale (NAM) model (MesoNAM) text and graphical product forecasts extensively to support launch weather operations. However, the actual performance of the model at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) has not been measured objectively. In order to have tangible evidence of model performance, the 45 WS tasked the Applied Meteorology Unit to conduct a detailed statistical analysis of model output compared to observed values. The model products are provided to the 45 WS by ACTA, Inc. and include hourly forecasts from 0 to 84 hours based on model initialization times of 00, 06, 12 and 18 UTC. The objective analysis compared the MesoNAM forecast winds, temperature and dew point, as well as the changes in these parameters over time, to the observed values from the sensors in the KSC/CCAFS wind tower network. Objective statistics will give the forecasters knowledge of the model's strengths and weaknesses, which will result in improved forecasts for operations.
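Objective model verification of this kind typically reduces to a handful of summary statistics per parameter and lead time. A minimal sketch with synthetic data (the arrays are stand-ins, not MesoNAM output or KSC/CCAFS tower data):

```python
# Bias and RMSE of forecasts against observations on simulated data.
import numpy as np

rng = np.random.default_rng(2)
obs = 20.0 + 5.0 * rng.standard_normal(500)        # observed temperature (C)
fcst = obs + 0.7 + 1.5 * rng.standard_normal(500)  # forecast with a warm bias

bias = np.mean(fcst - obs)                         # mean error
rmse = np.sqrt(np.mean((fcst - obs) ** 2))         # root-mean-square error
print(f"bias = {bias:+.2f} C, RMSE = {rmse:.2f} C")
```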
The accurate assessment of small-angle X-ray scattering data
Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...
2015-01-23
Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, R O; Essington, E H; Brady, D N
Statistical design and analysis activities for the Nevada Applied Ecology Group (NAEG) during 1976 are briefly outlined. This is followed by a description of soil data collected thus far at nuclear study sites. Radionuclide concentrations in surface soil collected along a transect from ground zero (GZ) along the main fallout pattern are given for Nuclear Site (NS) 201. Concentrations in soil collected at 315 locations on a grid system at 200-foot spacings are also given for this site. The ²⁴¹Am to ¹³⁷Cs ratios change over NS 201 depending on location relative to GZ. They range from less than one where ²⁴¹Am is at low levels, to more than fifty where ²⁴¹Am levels are high (near GZ). The estimated median ²³⁹,²⁴⁰Pu to ²⁴¹Am ratio is 11 and appears to be relatively constant over the area (the 95 percent lower and upper limits on the true median ratio are about 8 and 14).
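Limits on a median ratio like those quoted above can be illustrated with a bootstrap; the sketch below uses simulated lognormal ratios, not the NAEG soil measurements.

```python
# Bootstrap 95% limits on a median ratio from simulated data.
import numpy as np

rng = np.random.default_rng(3)
ratios = rng.lognormal(mean=np.log(11.0), sigma=0.4, size=315)

boot_medians = np.array([
    np.median(rng.choice(ratios, size=ratios.size, replace=True))
    for _ in range(10_000)
])
lo, hi = np.percentile(boot_medians, [2.5, 97.5])
print(f"median = {np.median(ratios):.1f}, 95% limits ({lo:.1f}, {hi:.1f})")
```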
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
García-Sempere, Aníbal; Bejarano-Quisoboni, Daniel; Librero, Julián; Rodríguez-Bernal, Clara L; Peiró, Salvador; Sanfélix-Gimeno, Gabriel
2017-01-01
Introduction: Beyond clinical trials, clinical practice guidelines, and administrative regulation, treatment decision-making can be influenced by individual and contextual factors. Our goal was to describe variations in the patterns of initiation of anticoagulation therapy in patients with atrial fibrillation across Health Areas (HA) in the region of Valencia, Spain, and to quantify the influence of the HAs on variations in treatment choice. Methods: We conducted a population-based retrospective cohort study of all atrial fibrillation patients who started treatment with oral anticoagulants between November 2011 and February 2014 in each of the region's 24 HAs. We described patient and utilization characteristics per HA and initiation patterns over time, and we identified contextual and individual factors associated with differences in initiation patterns. Results: 21,879 patients initiated treatment with an oral anticoagulant in the 24 HAs. The rate of initiation with direct oral anticoagulants (DOAC) in the first year was 14.6%. By November 2013 the rate was 25.4%, with HA rates ranging from 3.8% to 57.1%. DOAC-initiating patients had less comorbidity but were more likely to present episodes of previous ischemic stroke, hemorrhagic stroke, or TIA when compared with patients initiating VKA treatment. Variability among HAs was statistically significant, with the majority of HAs ranking above or below the regional initiation average (ICC ≈ 8%). Conclusion: There was high variability in the percentage of DOAC initiation and in the choice of DOAC among HAs. Interventions aimed at improving DOAC initiation decision-making and reducing variations should take the Health Area component into account.
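The reported ICC of roughly 8% is the share of latent-scale variance attributable to Health Areas in a multilevel logistic model. A minimal sketch of that arithmetic, with an assumed illustrative intercept variance (not the study's estimate):

```python
# Latent-scale ICC for a multilevel logistic model of treatment choice.
import numpy as np

var_area = 0.29                                  # hypothetical between-HA variance
icc = var_area / (var_area + np.pi ** 2 / 3.0)   # pi^2/3: logistic residual variance
print(f"ICC = {icc:.1%}")                        # about 8% with this variance
```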
NASA Astrophysics Data System (ADS)
Kolski, Jeffrey
The linear lattice properties of the Proton Storage Ring (PSR) at the Los Alamos Neutron Science Center (LANSCE) in Los Alamos, NM were measured and used to construct an improved linear model of the accelerator. We found that the initial model was deficient in predicting the vertical focusing strength. The additional vertical focusing was located through a fundamental understanding of the experiment and statistically rigorous analysis. An improved model was constructed and compared against the initial model and measurements, at operational set points and at set points far from nominal, and was shown to indeed be an enhanced model. Independent component analysis (ICA) is a tool for data mining in many fields of science. Traditionally, ICA is applied to turn-by-turn beam position data as a means to measure the lattice functions of the real machine. Due to the diagnostic setup of the PSR, this method is not applicable. A new application method for ICA is derived: ICA applied along the length of the bunch. The ICA modes represent motions within the beam pulse. Several of the dominant ICA modes are experimentally identified.
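The bunch-length variant of ICA can be sketched with an off-the-shelf implementation: treat each position along the bunch as an observation and successive turns as mixtures, so the recovered independent components are spatial modes within the pulse. The sketch below uses scikit-learn's FastICA on synthetic waveform data; the actual PSR diagnostics and preprocessing differ.

```python
# ICA along the bunch: rows are samples along the bunch length, columns are
# successive turns. All data here are synthetic.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
n_samples, n_turns = 200, 500
s = np.linspace(0.0, 1.0, n_samples)              # position along the bunch
mode1 = np.sin(2.0 * np.pi * 3.0 * s)             # oscillatory mode
mode2 = np.exp(-((s - 0.5) / 0.1) ** 2)           # localized "breathing" mode
mixing = rng.standard_normal((2, n_turns))        # turn-by-turn amplitudes
X = np.outer(mode1, mixing[0]) + np.outer(mode2, mixing[1])
X += 0.05 * rng.standard_normal(X.shape)          # measurement noise

ica = FastICA(n_components=2, random_state=0)
spatial_modes = ica.fit_transform(X)              # shape (n_samples, 2)
print(spatial_modes.shape)
```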
Chaotic Image Encryption Algorithm Based on Bit Permutation and Dynamic DNA Encoding.
Zhang, Xuncai; Han, Feng; Niu, Ying
2017-01-01
Exploiting the sensitivity of chaos to initial conditions and its pseudorandomness, combined with the spatial configurations of the DNA molecule's inherent and unique information-processing ability, a novel image encryption algorithm based on bit permutation and dynamic DNA encoding is proposed here. The algorithm first uses Keccak to calculate the hash value of a given DNA sequence as the initial value of a chaotic map; second, it uses a chaotic sequence to scramble the image pixel locations, and a butterfly network is used to implement the bit permutation. Then, the image is dynamically coded into a DNA matrix, and an algebraic operation is performed with the DNA sequence to realize the substitution of the pixels, which further improves the security of the encryption. Finally, the confusion and diffusion properties of the algorithm are further enhanced by an operation on the DNA sequence and by ciphertext feedback. The results of the experiment and security analysis show that the algorithm not only has a large key space and strong sensitivity to the key but can also effectively resist attacks such as statistical analysis and exhaustive (brute-force) search.
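A deliberately simplified skeleton of such a scheme follows: a Keccak-family hash of the DNA key seeds a logistic map, whose output permutes pixel positions. The butterfly-network bit permutation, dynamic DNA coding, and ciphertext feedback of the actual algorithm are omitted.

```python
# Hedged sketch: hash-seeded chaotic pixel permutation only.
import hashlib
import numpy as np

dna_key = "ATGCGTACGTTAGC"                            # hypothetical key material
digest = hashlib.sha3_256(dna_key.encode()).digest()  # SHA3-256 (a Keccak instance)
x0 = int.from_bytes(digest[:8], "big") / 2.0 ** 64    # map hash to (0, 1)
x0 = min(max(x0, 1e-12), 1.0 - 1e-12)                 # keep strictly inside (0, 1)

def logistic_sequence(x, n, r=3.99):
    """Iterate x <- r*x*(1-x), discarding a transient before sampling."""
    for _ in range(100):
        x = r * x * (1.0 - x)
    out = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

img = np.arange(64, dtype=np.uint8).reshape(8, 8)     # toy "image"
chaos = logistic_sequence(x0, img.size)
perm = np.argsort(chaos)                              # key-dependent permutation
scrambled = img.ravel()[perm].reshape(img.shape)

inverse = np.argsort(perm)                            # decryption inverts it
assert np.array_equal(scrambled.ravel()[inverse].reshape(img.shape), img)
```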
Trade Studies of Space Launch Architectures using Modular Probabilistic Risk Analysis
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Go, Susie
2006-01-01
A top-down risk assessment in the early phases of space exploration architecture development can provide understanding and intuition of the potential risks associated with new designs and technologies. In this approach, risk analysts draw from their past experience and the heritage of similar existing systems as a source for reliability data. This top-down approach captures the complex interactions of the risk driving parts of the integrated system without requiring detailed knowledge of the parts themselves, which is often unavailable in the early design stages. Traditional probabilistic risk analysis (PRA) technologies, however, suffer several drawbacks that limit their timely application to complex technology development programs. The most restrictive of these is a dependence on static planning scenarios, expressed through fault and event trees. Fault trees incorporating comprehensive mission scenarios are routinely constructed for complex space systems, and several commercial software products are available for evaluating fault statistics. These static representations cannot capture the dynamic behavior of system failures without substantial modification of the initial tree. Consequently, the development of dynamic models using fault tree analysis has been an active area of research in recent years. This paper discusses the implementation and demonstration of dynamic, modular scenario modeling for integration of subsystem fault evaluation modules using the Space Architecture Failure Evaluation (SAFE) tool. SAFE is a C++ code that was originally developed to support NASA's Space Launch Initiative. It provides a flexible framework for system architecture definition and trade studies. SAFE supports extensible modeling of dynamic, time-dependent risk drivers of the system and functions at the level of fidelity for which design and failure data exists. The approach is scalable, allowing inclusion of additional information as detailed data becomes available. The tool performs a Monte Carlo analysis to provide statistical estimates. Example results of an architecture system reliability study are summarized for an exploration system concept using heritage data from liquid-fueled expendable Saturn V/Apollo launch vehicles.
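At its core, the Monte Carlo step of such a tool estimates system-level failure probability by sampling subsystem outcomes. A minimal series-system sketch (the failure probabilities are invented placeholders, not SAFE inputs or Saturn V/Apollo heritage data):

```python
# Monte Carlo estimate of loss-of-mission probability for a series system.
import numpy as np

rng = np.random.default_rng(5)
p_fail = {"engine": 0.010, "avionics": 0.003, "separation": 0.002}
n_trials = 1_000_000

mission_fails = np.zeros(n_trials, dtype=bool)
for p in p_fail.values():                       # series system: any subsystem
    mission_fails |= rng.random(n_trials) < p   # failure fails the mission

p_lom = mission_fails.mean()                    # probability of loss of mission
se = np.sqrt(p_lom * (1.0 - p_lom) / n_trials)  # Monte Carlo standard error
print(f"P(loss of mission) = {p_lom:.4f} +/- {1.96 * se:.5f} (95% CI)")
```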
SETI in vivo: testing the we-are-them hypothesis
NASA Astrophysics Data System (ADS)
Makukov, Maxim A.; Shcherbak, Vladimir I.
2018-04-01
After it was proposed that life on Earth might descend from seeding by an earlier extraterrestrial civilization motivated to secure and spread life, some authors noted that this alternative offers a testable implication: microbial seeds could be intentionally supplied with a durable signature that might be found in extant organisms. In particular, it was suggested that the optimal location for such an artefact is the genetic code, as the least evolving part of cells. However, as the mainstream view goes, this scenario is too speculative and cannot be meaningfully tested because encoding/decoding a signature within the genetic code is ill-defined, so any retrieval attempt is doomed to guesswork. Here we refresh the seeded-Earth hypothesis in light of recent observations, and discuss the motivation for inserting a signature. We then show that 'biological SETI' involves even weaker assumptions than traditional SETI and admits a well-defined methodological framework. After assessing the possibility in terms of molecular and evolutionary biology, we formalize the approach and, adopting the standard guideline of SETI that encoding/decoding should follow from first principles and be convention-free, develop a universal retrieval strategy. Applied to the canonical genetic code, it reveals a non-trivial precision structure of interlocked logical and numerical attributes of systematic character (previously we found these heuristically). To assess this result in view of the initial assumption, we perform statistical, comparison, interdependence and semiotic analyses. Statistical analysis reveals no causal connection of the result to evolutionary models of the genetic code, interdependence analysis precludes overinterpretation, and comparison analysis shows that known variations of the code lack any precision-logic structures, in agreement with these variations being post-LUCA (i.e. post-seeding) evolutionary deviations from the canonical code. Finally, semiotic analysis shows that not only are the found attributes consistent with the initial assumption, but they also make perfect sense from a SETI perspective, as they ultimately maintain some of the most universal codes of culture.
ERIC Educational Resources Information Center
Theoret, Julie M.; Luna, Andrea
2009-01-01
This action research combined qualitative and quantitative techniques to investigate two different types of writing assignments in an introductory undergraduate statistics course. The assignments were written in response to the same set of prompts but in two different ways: homework journal assignments or initial posts to a computer discussion…
Developing Young Children's Emergent Inferential Practices in Statistics
ERIC Educational Resources Information Center
Makar, Katie
2016-01-01
Informal statistical inference has now been researched at all levels of schooling and initial tertiary study. Work in informal statistical inference is least understood in the early years, where children have had little if any exposure to data handling. A qualitative study in Australia was carried out through a series of teaching experiments with…
Minică, Camelia C; Dolan, Conor V; Hottenga, Jouke-Jan; Willemsen, Gonneke; Vink, Jacqueline M; Boomsma, Dorret I
2013-05-01
When phenotypic, but no genotypic, data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost the statistical power when included in such studies. Here, using simulations, we compared the performance of two statistical approaches suitable for modeling imputed genotype data: the mixture approach, which involves the full distribution of the imputed genotypes, and the dosage approach, where the mean of the conditional distribution features as the imputed genotype. Simulations were run by varying sibship size, the size of the phenotypic correlations among siblings, imputation accuracy, and the minor allele frequency of the causal SNP. Furthermore, as imputing sibling data and extending the model to include sibships of size two or greater requires modeling the familial covariance matrix, we inquired whether model misspecification affects power. Finally, the results obtained via simulations were empirically verified in two datasets, one with continuous phenotype data (height) and one with a dichotomous phenotype (smoking initiation). Across the settings considered, the mixture and the dosage approach are equally powerful and both produce unbiased parameter estimates. In addition, the likelihood-ratio test in the linear mixed model appears to be robust to the considered misspecification in the background covariance structure, given low to moderate phenotypic correlations among siblings. Empirical results show that the inclusion of imputed sibling genotypes in association analysis does not always result in a larger test statistic. The actual test statistic may drop in value due to small effect sizes. That is, if the power benefit is small, so that the change in the distribution of the test statistic under the alternative is relatively small, the probability of obtaining a smaller test statistic is greater. As genetic effects are typically hypothesized to be small, in practice the decision on whether family-based imputation should be used as a means to increase power should be informed by prior power calculations and by consideration of the background correlation.
Seven ways to increase power without increasing N.
Hansen, W B; Collins, L M
1994-01-01
Many readers of this monograph may wonder why a chapter on statistical power was included. After all, by now the issue of statistical power is in many respects mundane. Everyone knows that statistical power is a central research consideration, and certainly most National Institute on Drug Abuse grantees or prospective grantees understand the importance of including a power analysis in research proposals. However, there is ample evidence that, in practice, prevention researchers are not paying sufficient attention to statistical power. If they were, the findings observed by Hansen (1992) in a recent review of the prevention literature would not have emerged. Hansen (1992) examined statistical power based on 46 cohorts followed longitudinally, using nonparametric assumptions given the subjects' age at posttest and the numbers of subjects. Results of this analysis indicated that, in order for a study to attain 80-percent power for detecting differences between treatment and control groups, the difference between groups at posttest would need to be at least 8 percent (in the best studies) and as much as 16 percent (in the weakest studies). In order for a study to attain 80-percent power for detecting group differences in pre-post change, 22 of the 46 cohorts would have needed relative pre-post reductions of greater than 100 percent. Thirty-three of the 46 cohorts had less than 50-percent power to detect a 50-percent relative reduction in substance use. These results are consistent with other review findings (e.g., Lipsey 1990) that have shown a similar lack of power in a broad range of research topics. Thus, it seems that, although researchers are aware of the importance of statistical power (particularly of the necessity for calculating it when proposing research), they somehow are failing to end up with adequate power in their completed studies. This chapter argues that the failure of many prevention studies to maintain adequate statistical power is due to an overemphasis on sample size (N) as the only, or even the best, way to increase statistical power. It is easy to see how this overemphasis has come about. Sample size is easy to manipulate, has the advantage of being related to power in a straightforward way, and usually is under the direct control of the researcher, except for limitations imposed by finances or subject availability. Another option for increasing power is to increase the alpha used for hypothesis testing but, as very few researchers seriously consider significance levels much larger than the traditional .05, this strategy seldom is used. Of course, sample size is important, and the authors of this chapter are not recommending that researchers cease choosing sample sizes carefully. Rather, they argue that researchers should not confine themselves to increasing N to enhance power. It is important to take additional measures to maintain and improve power over and above making sure the initial sample size is sufficient. The authors recommend two general strategies. One strategy involves attempting to maintain the effective initial sample size so that power is not lost needlessly. The other strategy is to take measures to maximize the third factor that determines statistical power: effect size.
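The arithmetic behind such power findings is easy to reproduce. The sketch below asks how many subjects per group are needed for 80-percent power to detect 8- and 16-percentage-point differences in proportions at alpha = .05; the baseline rate is an illustrative assumption, not a number from the chapter.

```python
# Sample size per group for 80% power on a difference in proportions.
import numpy as np
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

solver = NormalIndPower()
for p_control, p_treat in [(0.30, 0.22), (0.30, 0.14)]:
    es = abs(proportion_effectsize(p_treat, p_control))  # Cohen's h
    n = solver.solve_power(effect_size=es, power=0.80, alpha=0.05)
    print(f"{p_control:.0%} vs {p_treat:.0%}: ~{int(np.ceil(n))} per group")
```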
Ricker, Martin; Peña Ramírez, Víctor M; von Rosen, Dietrich
2014-01-01
Growth curves are monotonically increasing functions that measure repeatedly the same subjects over time. The classical growth curve model in the statistical literature is the Generalized Multivariate Analysis of Variance (GMANOVA) model. In order to model the tree trunk radius (r) over time (t) of trees on different sites, GMANOVA is combined here with the adapted PL regression model Q = A·T + E, where for b ≠ 0: Q = Ei[−b·r] − Ei[−b·r1], and for b = 0: Q = Ln[r/r1]; A is the initial relative growth to be estimated; T = t − t1; and E is an error term for each tree and time point. Furthermore, Ei[−b·r] = ∫(Exp[−b·r]/r)dr and b = −1/TPR, with TPR being the turning-point radius of a sigmoid curve, and r1 at t1 an estimated calibrating time-radius point. Advantages of the approach are that growth rates can be compared among growth curves with different turning-point radiuses and different starting points, hidden outliers are easily detectable, the method is statistically robust, and heteroscedasticity of the residuals among time points is allowed. The model was implemented with dendrochronological data of 235 Pinus montezumae trees on ten Mexican volcano sites to calculate comparison intervals for the estimated initial relative growth A. One site (at the Popocatépetl volcano) stood out, with A being 3.9 times the value of the site with the slowest-growing trees. Calculating variance components for the initial relative growth, 34% of the growth variation was found among sites, 31% among trees, and 35% over time. Without the Popocatépetl site, the numbers changed to 7%, 42%, and 51%. Further explanation of differences in growth would need to focus on factors that vary within sites and over time.
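The transformation at the heart of this model is available directly in scipy, whose expi implements Ei, so Q can be computed and A fitted by least squares. The sketch below simulates a single tree under assumed parameter values (TPR, A, the calibrating point, and the noise level are all illustrative) and recovers A; it is a toy of the general approach, not the paper's GMANOVA fit.

```python
# Toy fit of the initial relative growth A via the Ei transformation.
import numpy as np
from scipy.special import expi          # expi(x) = Ei(x)
from scipy.optimize import brentq

b = -1.0 / 8.0                          # b = -1/TPR, turning-point radius 8 cm
A_true, r1, t1 = 0.15, 2.0, 0.0         # calibrating time-radius point (r1, t1)

def radius_at(ti):
    """Numerically invert Q(r) = A*(t - t1) to simulate a growth curve."""
    f = lambda r: (expi(-b * r) - expi(-b * r1)) - A_true * (ti - t1)
    return brentq(f, 1e-6, 500.0)

rng = np.random.default_rng(6)
t = np.linspace(1.0, 40.0, 40)
r = np.array([radius_at(ti) for ti in t])
r *= np.exp(0.01 * rng.standard_normal(t.size))   # multiplicative noise

Q = expi(-b * r) - expi(-b * r1)
T = t - t1
A_hat = np.sum(Q * T) / np.sum(T * T)             # LS slope through the origin
print(f"true A = {A_true}, estimated A = {A_hat:.3f}")
```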
Summary Statistics for Homemade "Play Dough" -- Data Acquired at LLNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kallman, J S; Morales, K E; Whipple, R E
Using x-ray computerized tomography (CT), we have characterized the x-ray linear attenuation coefficients (LAC) of a homemade Play Dough(TM)-like material, designated as PDA. Table 1 gives the first-order statistics for each of four CT measurements, estimated with a Gaussian kernel density estimator (KDE) analysis. The mean values of the LAC range from a high of about 2700 LMHU_D at 100 kVp to a low of about 1200 LMHU_D at 300 kVp. The standard deviation of each measurement is around 10% to 15% of the mean. The entropy covers the range from 6.0 to 7.4. Ordinarily, we would model the LAC of the material and compare the modeled values to the measured values. In this case, however, we did not have the detailed chemical composition of the material and therefore did not model the LAC. Using a method recently proposed by Lawrence Livermore National Laboratory (LLNL), we estimate the value of the effective atomic number, Z_eff, to be near 10. LLNL prepared about 50 mL of the homemade 'Play Dough' in a polypropylene vial and firmly compressed it immediately prior to the x-ray measurements. We used the computer program IMGREC to reconstruct the CT images. The values of the key parameters used in the data capture and image reconstruction are given in this report. Additional details may be found in the experimental SOP and a separate document. To characterize the statistical distribution of LAC values in each CT image, we first isolated an 80% central-core segment of volume elements ('voxels') lying completely within the specimen, away from the walls of the polypropylene vial. All of the voxels within this central core, including those comprised of voids and inclusions, are included in the statistics. We then calculated the mean value, standard deviation and entropy for (a) the four image segments and for (b) their digital gradient images. (A digital gradient image of a given image was obtained by taking the absolute value of the difference between the initial image and that same image offset by one voxel horizontally, parallel to the rows of the x-ray detector array.) The statistics of the initial image of LAC values are called 'first-order statistics'; those of the gradient image, 'second-order statistics'.
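The first- and second-order statistics described here (mean, standard deviation, and entropy of the image and of its one-voxel horizontal gradient) are straightforward to compute. The sketch below uses a histogram-based entropy on synthetic data; the report's Gaussian KDE is replaced by a simple histogram for brevity.

```python
# Mean, standard deviation, and entropy of an image and its gradient image.
import numpy as np

rng = np.random.default_rng(7)
image = rng.normal(loc=2700.0, scale=300.0, size=(128, 128))  # toy LAC map

def order_stats(img, bins=256):
    hist, _ = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))           # bits
    return img.mean(), img.std(), entropy

# Gradient image: absolute difference with a one-voxel horizontal offset
grad = np.abs(image[:, 1:] - image[:, :-1])

print("first order :", order_stats(image))
print("second order:", order_stats(grad))
```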
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-23
... in Tables A and B. Table D--Borrower Closing Costs and Seller Concessions Descriptive Statistics by... accuracy of the statistical data illustrating the correlation between higher seller concessions and an...
Rear-End Crashes: Problem Size Assessment And Statistical Description
DOT National Transportation Integrated Search
1993-05-01
KEYWORDS : RESEARCH AND DEVELOPMENT OR R&D, ADVANCED VEHICLE CONTROL & SAFETY SYSTEMS OR AVCSS, INTELLIGENT VEHICLE INITIATIVE OR IVI : THIS DOCUMENT PRESENTS PROBLEM SIZE ASSESSMENTS AND STATISTICAL CRASH DESCRIPTION FOR REAR-END CRASHES, INC...
Maurer, Jürgen
2016-10-01
Influenza vaccination is strongly associated with socioeconomic status, but there is only limited evidence on the respective roles of socioeconomic differences in vaccination intentions versus corresponding differences in follow-through on initial vaccination plans for subsequent socioeconomic differences in vaccine uptake. Nonparametric mean smoothing, linear regression, and probit models were used to analyze longitudinal survey data on perceived influenza risks, behavioral vaccination intentions, and vaccination behavior of adults during the 2009-2010 influenza A/H1N1 ("swine flu") pandemic in the United States. Perceived influenza risks and behavioral vaccination intentions were elicited prior to the availability of H1N1 vaccine using a probability scale question format. H1N1 vaccine uptake was assessed at the end of the pandemic. Education, income, and health insurance coverage displayed positive associations with behavioral intentions to get vaccinated for pandemic influenza while employment was negatively associated with stated H1N1 vaccination intentions. Education and health insurance coverage also displayed significant positive associations with pandemic vaccine uptake. Moreover, behavioral vaccination intentions showed a strong and statistically significant positive partial association with later H1N1 vaccination. Incorporating vaccination intentions in a statistical model for H1N1 vaccine uptake further highlighted higher levels of follow-through on initial vaccination plans among persons with higher education levels and health insurance. Sampling bias, misreporting in self-reported data, and limited generalizability to nonpandemic influenza are potential limitations of the analysis. Closing the socioeconomic gap in influenza vaccination requires multipronged strategies that not only increase vaccination intentions by improving knowledge, attitudes, and beliefs but also facilitate follow-through on initial vaccination plans by improving behavioral control and access to vaccination for individuals with low education, employed persons, and the uninsured. © The Author(s) 2015.
Sandhu, Satpal Singh; Sandhu, Jasleen
2013-01-01
Objective: To investigate and compare the effects of superelastic nickel–titanium and multistranded stainless steel archwires on pain during the initial phase of orthodontic treatment. Design: A double-blind, two-arm, parallel-design, stratified randomized clinical trial. Setting: A single centre in India between December 2010 and June 2012. A total of 96 participants (48 males and 48 females; 14.1±2.1 years old) were randomized (stratified on age, sex and initial crowding) to superelastic nickel–titanium or multistranded stainless steel archwire groups using a computer-generated allocation sequence. Methods: We compared 0.016-inch superelastic nickel–titanium and 0.0175-inch multistranded stainless steel wires in 0.022-inch slot (Roth prescription) preadjusted edgewise appliances. The follow-up period was 14 days. Outcome was assessed with a visual analogue scale at baseline and 32 pre-specified follow-up points. Data were analyzed using mixed-effects model analysis. Results: One participant was lost to follow-up and 10 were excluded from the analysis due to bond failure or incomplete questionnaire answers. Ultimately, 85 participants (42 males and 43 females; 14.1±2.0 years old) were analysed for the final results. No statistically significant difference was found for overall pain [F value = 2.65, degrees of freedom (df) = 92.6; P = 0.1071]. However, compared to multistranded stainless steel wires, pain in subjects with superelastic nickel–titanium archwires was significantly greater at 12 h (t = 2.34; P = 0.0193), as well as at day 1 in the morning (t = 2.21, P = 0.0273), afternoon (t = 2.11, P = 0.0346) and at bedtime (t = 2.03, P = 0.042). Conclusion: For overall pain, there was no statistically significant difference between the two wires. However, subjects with superelastic nickel–titanium archwires had significantly higher pain at peak level. PMID:24297959
Setegn, Tesfaye; Gerbaba, Mulusew; Belachew, Tefera
2011-04-08
Although breastfeeding is universal in Ethiopia, a range of regional differences in the timely initiation of breastfeeding has been documented. Initiation of breastfeeding is strongly bound to cultural factors that may either enhance or inhibit optimal practices. The government of Ethiopia developed the National Infant and Young Child Feeding Guideline in 2004, and behavior change communication on breastfeeding has been ongoing since then. However, there is little information on the practice of timely initiation of breastfeeding and the factors that predict these practices after the implementation of the national guideline. The objective of this study is to determine the prevalence and determinants of timely initiation of breastfeeding among mothers in Bale Goba District, South East Ethiopia. A community-based cross-sectional study was carried out from February to March 2010 using both quantitative and qualitative methods of data collection. A total of 608 mother-infant pairs were selected using simple random sampling, and key informants for the in-depth interviews were selected conveniently. Descriptive statistics, bivariate analysis and multivariable logistic regression analyses were employed to identify factors associated with timely initiation of breastfeeding. The prevalence of timely initiation of breastfeeding was 52.4%. Bivariate analysis showed that attendance of formal education, urban residence, institutional delivery and postnatal counseling on breastfeeding were significantly associated with timely initiation of breastfeeding (P < 0.05). After adjusting for other factors in the multivariable logistic model, living in an urban area [AOR: 4.1 (95% CI: 2.31-7.30)] and receiving postnatal counseling [AOR: 2.7 (1.86-3.94)] were independent predictors of timely initiation of breastfeeding. The practice of timely initiation of breastfeeding is low, as nearly half the mothers did not start breastfeeding within one hour of delivery. The results suggest that breastfeeding behavior change communication, especially during the postnatal period, is critical in promoting optimal practice in the initiation of breastfeeding. Rural mothers need special attention as they are distant from various information sources. © 2011 Gerbaba et al; licensee BioMed Central Ltd.
Statistical analysis of iron geochemical data suggests limited late Proterozoic oxygenation
NASA Astrophysics Data System (ADS)
Sperling, Erik A.; Wolock, Charles J.; Morgan, Alex S.; Gill, Benjamin C.; Kunzmann, Marcus; Halverson, Galen P.; MacDonald, Francis A.; Knoll, Andrew H.; Johnston, David T.
2015-07-01
Sedimentary rocks deposited across the Proterozoic-Phanerozoic transition record extreme climate fluctuations, a potential rise in atmospheric oxygen or re-organization of the seafloor redox landscape, and the initial diversification of animals. It is widely assumed that the inferred redox change facilitated the observed trends in biodiversity. Establishing this palaeoenvironmental context, however, requires that changes in marine redox structure be tracked by means of geochemical proxies and translated into estimates of atmospheric oxygen. Iron-based proxies are among the most effective tools for tracking the redox chemistry of ancient oceans. These proxies are inherently local, but have global implications when analysed collectively and statistically. Here we analyse about 4,700 iron-speciation measurements from shales 2,300 to 360 million years old. Our statistical analyses suggest that subsurface water masses in mid-Proterozoic oceans were predominantly anoxic and ferruginous (depleted in dissolved oxygen and iron-bearing), but with a tendency towards euxinia (sulfide-bearing) that is not observed in the Neoproterozoic era. Analyses further indicate that early animals did not experience appreciable benthic sulfide stress. Finally, unlike proxies based on redox-sensitive trace-metal abundances, iron geochemical data do not show a statistically significant change in oxygen content through the Ediacaran and Cambrian periods, sharply constraining the magnitude of the end-Proterozoic oxygen increase. Indeed, this re-analysis of trace-metal data is consistent with oxygenation continuing well into the Palaeozoic era. Therefore, if changing redox conditions facilitated animal diversification, it did so through a limited rise in oxygen past critical functional and ecological thresholds, as is seen in modern oxygen minimum zone benthic animal communities.
Fordyce, James A
2010-07-23
Phylogenetic hypotheses are increasingly being used to elucidate historical patterns of diversification-rate variation. Hypothesis testing is often conducted by comparing the observed vector of branching times to a null, pure-birth expectation. The gamma statistic is a popular method for inferring a decrease in speciation rate, which might suggest an early burst of diversification followed by a slowdown. Using simulations under varying conditions, I examine the sensitivity of gamma to the distribution of the most recent branching times. Using an exploratory data analysis tool for lineage-through-time plots, tree deviation, I identified trees with a significant gamma statistic that do not show the characteristic early accumulation of lineages consistent with an early, rapid rate of cladogenesis. I further investigated the sensitivity of the gamma statistic to recent diversification by examining the consequences of failing to simulate the full time interval following the most recent cladogenic event. The power of gamma to detect a rate decrease at varying times was assessed for simulated trees with an initial high rate of diversification followed by a relatively low rate. The gamma statistic is extraordinarily sensitive to recent diversification rates and does not necessarily detect early bursts of diversification. This was true for trees of various sizes and degrees of completeness of taxon sampling. The gamma statistic had greater power to detect recent decreases in diversification rate than early bursts of diversification. Caution should be exercised when interpreting the gamma statistic as an indication of early, rapid diversification.
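The gamma statistic of Pybus and Harvey (2000) is computed from internode intervals and is approximately standard normal under a constant-rate pure-birth model, with strongly negative values read as early bursts. A minimal sketch on simulated pure-birth intervals (not empirical trees):

```python
# Gamma statistic from internode intervals (Pybus & Harvey 2000).
import numpy as np

def gamma_statistic(g):
    """g[k-2] is the interval during which k lineages existed, k = 2..n."""
    n = len(g) + 1
    ks = np.arange(2, n + 1)
    weighted = ks * g                       # k * g_k
    T = weighted.sum()
    partial = np.cumsum(weighted)[:-1]      # T_i for i = 2..n-1
    num = partial.mean() - T / 2.0
    den = T * np.sqrt(1.0 / (12.0 * (n - 2)))
    return num / den

rng = np.random.default_rng(8)
n = 50
# Pure birth: the interval with k lineages is Exponential with rate k*lambda
g = rng.exponential(1.0 / np.arange(2, n + 1))
print(f"gamma = {gamma_statistic(g):.2f} (near 0 expected under pure birth)")
```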
George, Barbara Jane; Sobus, Jon R; Phelps, Lara P; Rashleigh, Brenda; Simmons, Jane Ellen; Hines, Ronald N
2015-05-01
Considerable concern has been raised regarding research reproducibility both within and outside the scientific community. Several factors possibly contribute to a lack of reproducibility, including a failure to adequately employ statistical considerations during study design, bias in sample selection or subject recruitment, errors in developing data inclusion/exclusion criteria, and flawed statistical analysis. To address some of these issues, several publishers have developed checklists that authors must complete. Others have either enhanced statistical expertise on existing editorial boards, or formed distinct statistics editorial boards. Although the U.S. Environmental Protection Agency, Office of Research and Development, already has a strong Quality Assurance Program, an initiative was undertaken to further strengthen statistics consideration and other factors in study design and also to ensure these same factors are evaluated during the review and approval of study protocols. To raise awareness of the importance of statistical issues and provide a forum for robust discussion, a Community of Practice for Statistics was formed in January 2014. In addition, three working groups were established to develop a series of questions or criteria that should be considered when designing or reviewing experimental, observational, or modeling focused research. This article describes the process used to develop these study design guidance documents, their contents, how they are being employed by the Agency's research enterprise, and expected benefits to Agency science. The process and guidance documents presented here may be of utility for any research enterprise interested in enhancing the reproducibility of its science. © The Author 2015. Published by Oxford University Press on behalf of the Society of Toxicology.
Drivers of annual to decadal streamflow variability in the lower Colorado River Basin
NASA Astrophysics Data System (ADS)
Lambeth-Beagles, R. S.; Troch, P. A.
2010-12-01
The Colorado River is the main water supply to the southwest region. As demand reaches the limit of supply in the southwest it becomes increasingly important to understand the dynamics of streamflow in the Colorado River and in particular the tributaries to the lower Colorado River. Climate change may pose an additional threat to the already-scarce water supply in the southwest. Due to the narrowing margin for error, water managers are keen on extending their ability to predict streamflow volumes on a mid-range to decadal scale. Before a predictive streamflow model can be developed, an understanding of the physical drivers of annual to decadal streamflow variability in the lower Colorado River Basin is needed. This research addresses this need by applying multiple statistical methods to identify trends, patterns and relationships present in streamflow, precipitation and temperature over the past century in four contributing watersheds to the lower Colorado River. The four watersheds selected were the Paria, Little Colorado, Virgin/Muddy, and Bill Williams. Time series data over a common period from 1906-2007 for streamflow, precipitation and temperature were used for the initial analysis. Through statistical analysis the following questions were addressed: 1) are there observable trends and patterns in these variables during the past century and 2) if there are trends or patterns, how are they related to each other? The Mann-Kendall test was used to identify trends in the three variables. Assumptions regarding autocorrelation and persistence in the data were taken into consideration. Kendall’s tau-b test was used to establish association between any found trends in the data. Initial results suggest there are two primary processes occurring. First, statistical analysis reveals significant upward trends in temperatures and downward trends in streamflow. However, there appears to be no trend in precipitation data. These trends in streamflow and temperature speak to increasing evaporation and transpiration processes. Second, annual variability in streamflow is not statistically correlated with annual temperature variability but appears to be highly correlated with annual precipitation variability. This implies that on a year-to-year basis, changes in streamflow volumes are directly affected by precipitation and not temperature. Future development of a predictive streamflow model will need to take into consideration these two processes to obtain accurate results. In order to extend predictive skill to the multi-year scale relationships between precipitation, temperature and persistent climate indices such as the Pacific Decadal Oscillation, Atlantic Multidecadal Oscillation and El Nino/Southern Oscillation will need to be examined.
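A trend test of the kind described can be sketched via Kendall's tau between time and the series, which is the core of the Mann-Kendall test; the autocorrelation and persistence corrections mentioned in the abstract are omitted here, and the series are synthetic rather than the study's gauge records.

```python
# Mann-Kendall-style trend test via Kendall's tau against time.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(9)
years = np.arange(1906, 2008)
temp = 0.01 * (years - 1906) + 0.3 * rng.standard_normal(years.size)  # trend
precip = 400.0 + 50.0 * rng.standard_normal(years.size)               # no trend

for name, series in [("temperature", temp), ("precipitation", precip)]:
    tau, p = kendalltau(years, series)
    print(f"{name:13s}: tau = {tau:+.2f}, p = {p:.3g}")
```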
AN EXPLORATION OF THE STATISTICAL SIGNATURES OF STELLAR FEEDBACK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyden, Ryan D.; Offner, Stella S. R.; Koch, Eric W.
2016-12-20
All molecular clouds are observed to be turbulent, but the origin, means of sustenance, and evolution of the turbulence remain debated. One possibility is that stellar feedback injects enough energy into the cloud to drive observed motions on parsec scales. Recent numerical studies of molecular clouds have found that feedback from stars, such as protostellar outflows and winds, injects energy and impacts turbulence. We expand upon these studies by analyzing magnetohydrodynamic simulations of molecular clouds, including stellar winds, with a range of stellar mass-loss rates and magnetic field strengths. We generate synthetic ¹²CO(1-0) maps assuming that the simulations are at the distance of the nearby Perseus molecular cloud. By comparing the outputs from different initial conditions and evolutionary times, we identify differences in the synthetic observations and characterize these using common astrostatistics. We quantify the different statistical responses using a variety of metrics proposed in the literature. We find that multiple astrostatistics, including the principal component analysis, the spectral correlation function, and the velocity coordinate spectrum (VCS), are sensitive to changes in stellar mass-loss rates and/or time evolution. A few statistics, including the Cramer statistic and VCS, are sensitive to the magnetic field strength. These findings demonstrate that stellar feedback influences molecular cloud turbulence and can be identified and quantified observationally using such statistics.
Statistical principle and methodology in the NISAN system.
Asano, C
1979-01-01
The NISAN system is a new interactive statistical analysis program package constructed by an organization of Japanese statisticians. The package accommodates both confirmatory and exploratory statistical analysis, and is designed to capture statistical wisdom and to help senior statisticians choose an optimal process of statistical analysis. PMID:540594
Predictors of Breastfeeding in Overweight and Obese Women: Data From Active Mothers Postpartum (AMP)
Krause, Katrina M.; Lovelady, Cheryl A.; Østbye, Truls
2011-01-01
Excess maternal weight has been negatively associated with breastfeeding. We examined correlates of breastfeeding initiation and intensity in a racially diverse sample of overweight and obese women. This paper presents a secondary analysis of data from 450 women enrolled in a postpartum weight loss intervention (Active Mothers Postpartum [AMP]). Sociodemographic measures and body mass index (BMI), collected at 6 weeks postpartum, were examined for associations with breastfeeding initiation and lactation score (a measure combining duration and exclusivity of breastfeeding until 12 months postpartum). Data were collected September 2004–April 2007. In multivariable analyses, BMI was negatively associated with both initiation of breastfeeding (OR: .96; CI: .92–.99) and lactation score (β −0.22; P = 0.01). Education and infant gestational age were additional correlates of initiation, while race, working full-time, smoking, parity, and gestational age were additional correlates of lactation score. Some racial differences in these correlates were noted, but were not statistically significant. Belief that breastfeeding could aid postpartum weight loss was initially high, but unrelated to breastfeeding initiation or intensity. Maintenance of this belief over time, however, was associated with lower lactation scores. BMI was negatively correlated with breastfeeding initiation and intensity. Among overweight and obese women, unrealistic expectations regarding the effect of breastfeeding on weight loss may negatively impact breastfeeding duration. In general, overweight and obese women may need additional encouragement to initiate breastfeeding and to continue breastfeeding during the infant’s first year. PMID:20821042
NASA Astrophysics Data System (ADS)
Hare, B. M.; Dwyer, J. R.; Winner, L. H.; Uman, M. A.; Jordan, D. M.; Kotovsky, D. A.; Caicedo, J. A.; Wilkes, R. A.; Carvalho, F. L.; Pilkey, J. T.; Ngin, T. K.; Gamerota, W. R.; Rassoul, H. K.
2017-08-01
It has been argued in the technical literature, and widely reported in the popular press, that cosmic ray air showers (CRASs) can initiate lightning via a mechanism known as relativistic runaway electron avalanche (RREA), where large numbers of high-energy and low-energy electrons can, somehow, cause the local atmosphere in a thundercloud to transition to a conducting state. In response to this claim, other researchers have published simulations showing that the electron density produced by RREA is far too small to be able to affect the conductivity in the cloud sufficiently to initiate lightning. In this paper, we compare 74 days of cosmic ray air shower data collected in north central Florida during 2013-2015, the recorded CRASs having primary energies on the order of 1016 eV to 1018 eV and zenith angles less than 38°, with Lightning Mapping Array (LMA) data, and we show that there is no evidence that the detected cosmic ray air showers initiated lightning. Furthermore, we show that the average probability of any of our detected cosmic ray air showers to initiate a lightning flash can be no more than 5%. If all lightning flashes were initiated by cosmic ray air showers, then about 1.6% of detected CRASs would initiate lightning; therefore, we do not have enough data to exclude the possibility that lightning flashes could be initiated by cosmic ray air showers.
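The 5% bound quoted above is the kind of number an exact binomial upper limit produces when zero initiations are observed among N detected showers: (1 - p)^N = alpha gives p_upper = 1 - alpha^(1/N). A sketch of that arithmetic (the N values are illustrative, not the paper's shower count):

```python
# Exact (Clopper-Pearson) 95% upper limit with zero observed successes.
alpha = 0.05
for N in (10, 60, 200):
    p_upper = 1.0 - alpha ** (1.0 / N)
    print(f"N = {N:3d}, 0 initiations: p < {p_upper:.3f} at 95% confidence")
```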
Rabal, Obdulia; Link, Wolfgang; Serelde, Beatriz G; Bischoff, James R; Oyarzabal, Julen
2010-04-01
Here we report the development and validation of a complete solution to manage and analyze the data produced by image-based phenotypic screening campaigns of small-molecule libraries. In one step, initial crude images are analyzed for multiple cytological features, statistical analysis is performed, and molecules that produce the desired phenotypic profile are identified. A naïve Bayes classifier, integrating chemical and phenotypic spaces, is built and utilized during the process to assess those images initially classified as "fuzzy" (an automated iterative feedback tuning). Simultaneously, all this information is directly annotated in a relational database containing the chemical data. This novel, fully automated method was validated by re-analyzing the results from a high-content screening campaign involving 33,992 molecules used to identify inhibitors of the PI3K/Akt signaling pathway. Ninety-two percent of the confirmed hits identified by the conventional multistep analysis method were identified using this integrated one-step system, as well as 40 new hits (14.9% of the total) that were originally false negatives. Ninety-six percent of true negatives were also properly recognized. A web-based interface to the database, with customizable data retrieval and visualization tools, facilitates the posterior analysis of annotated cytological features, which allows identification of additional phenotypic profiles; thus, further analysis of the original crude images is not required.
NASA Astrophysics Data System (ADS)
Lagerwall, Gareth; Kiker, Gregory; Muñoz-Carpena, Rafael; Wang, Naiming
2017-01-01
The coupled regional simulation model, and the transport and reaction simulation engine were recently adapted to simulate ecology, specifically Typha domingensis (Cattail) dynamics in the Everglades. While Cattail is a native Everglades species, it has become invasive over the years due to an altered habitat over the last few decades, taking over historically Cladium jamaicense (Sawgrass) areas. Two models of different levels of algorithmic complexity were developed in previous studies, and are used here to determine the impact of various management decisions on the average Cattail density within Water Conservation Area 2A in the Everglades. A Global Uncertainty and Sensitivity Analysis was conducted to test the importance of these management scenarios, as well as the effectiveness of using zonal statistics. Management scenarios included high, medium and low initial water depths, soil phosphorus concentrations, initial Cattail and Sawgrass densities, as well as annually alternating water depths and soil phosphorus concentrations, and a steadily decreasing soil phosphorus concentration. Analysis suggests that zonal statistics are good indicators of regional trends, and that high soil phosphorus concentration is a pre-requisite for expansive Cattail growth. It is a complex task to manage Cattail expansion in this region, requiring the close management and monitoring of water depth and soil phosphorus concentration, and possibly other factors not considered in the model complexities. However, this modeling framework with user-definable complexities and management scenarios, can be considered a useful tool in analyzing many more alternatives, which could be used to aid management decisions in the future.
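A global uncertainty and sensitivity analysis over management-type factors can be sketched with Sobol indices, here assuming the SALib package; the response function and factor ranges below are invented placeholders, not the coupled RSM/TARSE model.

```python
# Hedged sketch of a Sobol sensitivity analysis with SALib.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["water_depth", "soil_P", "init_cattail"],
    "bounds": [[0.1, 1.0], [200.0, 1600.0], [0.0, 1.0]],
}

def cattail_response(x):
    """Toy response: growth driven mainly by soil phosphorus."""
    depth, soil_p, init = x
    return init + 0.002 * soil_p - 0.3 * depth

X = saltelli.sample(problem, 1024)               # N*(2D+2) = 8192 model runs
Y = np.apply_along_axis(cattail_response, 1, X)
Si = sobol.analyze(problem, Y)
for name, s1 in zip(problem["names"], Si["S1"]):
    print(f"{name:13s}: first-order index = {s1:.2f}")
```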
Okuyucu, Kursat; Ozaydın, Sukru; Alagoz, Engin; Ozgur, Gokhan; Oysul, Fahrettin Guven; Ozmen, Ozlem; Tuncel, Murat; Ozturk, Mustafa; Arslan, Nuri
2016-01-01
Background: Non-Hodgkin's lymphomas arising from tissues other than the primary lymphatic organs are named primary extranodal lymphomas. Most studies have evaluated metabolic tumor parameters in different organs and histopathologic variants of this disease, generally for treatment response. In this study, we aimed to evaluate the prognostic value of metabolic tumor parameters derived from initial FDG-PET/CT in patients with a medley of primary extranodal lymphomas. Patients and methods: There were 67 patients with primary extranodal lymphoma for whom FDG-PET/CT was requested for primary staging. Quantitative PET/CT parameters: maximum standardized uptake value (SUVmax), average standardized uptake value (SUVmean), metabolic tumor volume (MTV), and total lesion glycolysis (TLG) were used to estimate disease-free survival and overall survival. Results: SUVmean, MTV, and TLG were found statistically significant after multivariate analysis. SUVmean remained significant after ROC curve analysis. Sensitivity and specificity were calculated as 88% and 64%, respectively, when the cut-off value of SUVmean was chosen as 5.15. After investigation of primary presentation sites and histopathological variants according to recurrence, there was no difference amongst the variants. The primary site of extranodal lymphoma, however, is statistically important (p = 0.014): testis and central nervous system lymphomas have higher recurrence rates (62.5% and 73%, respectively). Conclusions: High SUVmean, MTV, and TLG values obtained from primary staging FDG-PET/CT are potential risk factors for both disease-free survival and overall survival in primary extranodal lymphoma. SUVmean is the most significant amongst them for estimating recurrence/metastasis. PMID:27904443
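For readers who want to reproduce this kind of cutoff selection, the following is a small Python sketch of deriving a biomarker threshold from a ROC curve by maximizing Youden's J; the SUVmean values are synthetic, so the 5.15 cutoff above is not expected to be reproduced.

```python
# Sketch: choosing a biomarker cutoff (e.g., SUVmean) from a ROC curve by
# maximizing Youden's J = sensitivity + specificity - 1. Data are synthetic.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
recurrence = rng.integers(0, 2, size=67)            # 1 = recurrence/metastasis
suv_mean = rng.normal(5.0 + 2.0 * recurrence, 1.5)  # hypothetical SUVmean values

fpr, tpr, thresholds = roc_curve(recurrence, suv_mean)
j = tpr - fpr                     # Youden's J at each candidate threshold
best = np.argmax(j)
print(f"cutoff={thresholds[best]:.2f}, "
      f"sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f}")
```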
Abril Hernández, José-María
2015-05-01
After half a century, the use of unsupported ²¹⁰Pb (²¹⁰Pbexc) is still far from being a well-established dating tool for recent sediments with widespread applicability. Recent results from the statistical analysis of time series of fluxes, mass sediment accumulation rates (SAR), and initial activities, derived from varved sediments, place serious constraints on the assumption of constant fluxes, which is widely used in dating models. The Sediment Isotope Tomography (SIT) model, under the assumption of no post-depositional redistribution, is used for dating recent sediments in scenarios in which fluxes and SAR are uncorrelated and both vary with time. Using a simple graphical analysis, this paper shows that, under the above assumptions, any given ²¹⁰Pbexc profile, even with the restriction of a discrete set of reference points, is compatible with an infinite number of chronological lines, and thus generates an infinite number of mathematically exact solutions for the histories of initial activity concentrations, SAR, and fluxes onto the sediment-water interface (SWI), with the last two ranging from zero up to infinity. In particular, SIT results, without additional assumptions, cannot contain any statistically significant difference with respect to the exact solutions consisting of intervals of constant SAR or constant fluxes (both being consistent with the reference points). Therefore, there is no benefit in its use as a dating tool without the explicit introduction of additional restrictive assumptions about fluxes, SAR, and/or their interrelationship. Copyright © 2015 Elsevier Ltd. All rights reserved.
Guillot, Benoît; Jelsch, Christian; Podjarny, Alberto; Lecomte, Claude
2008-05-01
The valence electron density of the protein human aldose reductase was analyzed at 0.66 Å resolution. The methodological developments in the software MoPro to adapt standard charge-density techniques from small molecules to macromolecular structures are described. The deformation electron density visible in initial residual Fourier difference maps was significantly enhanced after high-order refinement. The protein structure was refined after transfer of the experimental library multipolar atom model (ELMAM). The effects on the crystallographic statistics, on the atomic thermal displacement parameters and on the structure stereochemistry are analyzed. Constrained refinements of the transferred valence populations P_val and multipoles P_lm were performed against the X-ray diffraction data on a selected substructure of the protein with low thermal motion. The resulting charge densities are of good quality, especially for chemical groups with many copies present in the polypeptide chain. To check the effect of the starting point on the result of the constrained multipolar refinement, the same charge-density refinement strategy was applied but using an initial neutral spherical atom model, i.e. without transfer from the ELMAM library. The best starting point for a protein multipolar refinement is the structure with the electron density transferred from the database. This can be assessed by the crystallographic statistical indices, including Rfree, and the quality of the static deformation electron-density maps, notably on the oxygen electron lone pairs. The analysis of the main-chain bond lengths suggests that stereochemical dictionaries would benefit from a revision based on recently determined unrestrained atomic resolution protein structures.
Weber, Joseph J; Mascarenhas, Debra C; Bellin, Lisa S; Raab, Rachel E; Wong, Jan H
2012-10-01
Patient navigation programs are initiated to help guide patients through barriers in a complex cancer care system. We sought to analyze the impact of our patient navigator program on adherence to specific Breast Cancer Care Quality Indicators (BCCQI). A retrospective cohort of patients with stage I-III breast cancer seen in the calendar year prior to the initiation of the patient navigation program was compared with patients treated in the ensuing two calendar years. Quality indicators deemed appropriate for analysis were those associated with overcoming barriers to treatment and those associated with providing health education and improving patient decision-making. A total of 134 consecutive patients between January 1, 2006 and December 31, 2006 and 234 consecutive patients between January 1, 2008 and December 31, 2009 were evaluated for compliance with the BCCQI. There was no significant difference in the mean age or race/ethnic distribution of the study population. In all ten BCCQI evaluated, the percentage of patients in compliance improved from pre- to post-implementation of the patient navigator program (range 2.5-27.0%). Overall, compliance with the BCCQI improved from 74.1% to 95.5% (p < 0.0001). Indicators associated with informed decision-making and patient preference achieved statistical significance, while among the process-of-treatment indicators only completion axillary node dissection in sentinel node-positive biopsies achieved statistical significance. The implementation of a patient navigator program improved breast cancer care as measured by the BCCQI. The impact on disease-free and overall survival remains to be determined.
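A pre/post compliance change like the one reported (74.1% to 95.5%) can be checked with a chi-square test on a 2x2 table; the sketch below reconstructs approximate counts from the reported percentages and cohort sizes, so the numbers are illustrative only.

```python
# Sketch: testing a pre- vs post-navigation change in overall BCCQI
# compliance with a chi-square test on a 2x2 table. Counts are
# reconstructed approximately from the reported percentages.
from scipy.stats import chi2_contingency

pre_compliant, pre_total = 99, 134     # ~74.1% of 134
post_compliant, post_total = 223, 234  # ~95.5% of 234
table = [
    [pre_compliant, pre_total - pre_compliant],
    [post_compliant, post_total - post_compliant],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.1f}, p={p:.2g}")
```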
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D. F.; Freeman, W. A.; Carhart, R. A.
2005-09-23
This report provides technical documentation for values in the Table of Initial Isolation and Protective Action Distances (PADs) in the 2004 Emergency Response Guidebook (ERG2004). The objective for choosing the PADs specified in the ERG2004 is to balance the need to adequately protect the public from exposure to potentially harmful substances against the risks and expenses that could result from overreacting to a spill. To quantify this balance, a statistical approach is adopted, whereby the best available information is used to conduct an accident scenario analysis and develop a set of up to 1,000,000 hypothetical incidents. The set accounts for differences in container types, incident types, accident severity (i.e., amounts released), locations, times of day, times of year, and meteorological conditions. Each scenario is analyzed using detailed emission rate and atmospheric dispersion models to calculate the downwind chemical concentrations, from which a 'safe distance' is determined. The safe distance is defined as the distance downwind from the source at which the chemical concentration falls below health protection criteria. The American Industrial Hygiene Association's Emergency Response Planning Guideline Level 2 (ERPG-2) or equivalent is the health criterion used. The statistical sample of safe distance values for all incidents considered in the analysis is separated into four categories: small spill/daytime release, small spill/nighttime release, large spill/daytime release, and large spill/nighttime release. The 90th-percentile safe distance values for each of these groups became the PADs that appear in the ERG2004.
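The percentile step at the heart of this procedure is simple to sketch; in the toy Python below, the lognormal scenario generator merely stands in for the report's detailed emission and dispersion models, and all parameter values are invented.

```python
# Sketch: deriving a Protective Action Distance as the 90th percentile of
# simulated safe distances within each spill-size/time-of-day category.
# The lognormal generator is a stand-in for the detailed emission and
# dispersion models described in the report; parameters are invented.
import numpy as np

rng = np.random.default_rng(2)
scale = {"small/day": 0.1, "small/night": 0.3,
         "large/day": 0.5, "large/night": 1.5}

for cat, s in scale.items():
    safe_distance_km = rng.lognormal(mean=np.log(s), sigma=0.8, size=100_000)
    pad = np.percentile(safe_distance_km, 90)  # 90th-percentile safe distance
    print(f"{cat}: PAD ~ {pad:.2f} km")
```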
Statistics of work performed on a forced quantum oscillator.
Talkner, Peter; Burada, P Sekhar; Hänggi, Peter
2008-07-01
Various aspects of the statistics of work performed by an external classical force on a quantum mechanical system are elucidated for a driven harmonic oscillator. In this special case, two parameters are introduced that suffice to completely characterize the force protocol. Explicit results for the characteristic function of work and the corresponding probability distribution are provided and discussed for three different types of initial states of the oscillator: microcanonical, canonical, and coherent states. Depending on the choice of the initial state, the probability distributions of the performed work may differ greatly. In particular, this holds true even for identical force protocols. General fluctuation and work theorems holding for microcanonical and canonical initial states are confirmed.
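For context, the characteristic function of work referred to here is conventionally defined through the two-point-measurement scheme; the LaTeX fragment below gives that standard form from the fluctuation-theorem literature (the notation is mine, not copied from the paper).

```latex
% Two-point-measurement work statistics (standard form; notation assumed).
% Work W is the difference of energy outcomes measured at t = 0 and t = \tau.
G(u) = \langle e^{iuW} \rangle
     = \mathrm{Tr}\left[ e^{iu H_{\mathrm{H}}(\tau)} \, e^{-iu H(0)} \, \bar{\rho}(0) \right],
\qquad
p(W) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-iuW} \, G(u) \, du ,
```

where H_H(τ) is the Heisenberg-picture Hamiltonian at the end of the protocol and ρ̄(0) is the initial state projected onto the eigenbasis of H(0); the probability distribution of work follows by Fourier inversion of G(u).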
Detonation mode and frequency analysis under high loss conditions for stoichiometric propane-oxygen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, Scott I.; Lee, Bok Jik; Shepherd, Joseph E.
2016-03-24
In this paper, the propagation characteristics of galloping detonations were quantified with a high-time-resolution velocity diagnostic. Combustion waves were initiated in 30-m lengths of 4.1-mm inner diameter transparent tubing filled with stoichiometric propane–oxygen mixtures. Chemiluminescence from the resulting waves was imaged to determine the luminous wave front position and velocity every 83.3 μs. As the mixture initial pressure was decreased from 20 to 7 kPa, the wave was observed to become increasingly unsteady and transition from steady detonation to a galloping detonation. While wave velocities averaged over the full tube length smoothly decreased with initial pressure down to half of the Chapman–Jouguet detonation velocity (D_CJ) at the quenching limit, the actual propagation mechanism was seen to be a galloping wave with a cycle period of approximately 1.0 ms, corresponding to a cycle length of 1.3–2.0 m or 317–488 tube diameters depending on the average wave speed. The long test section length of 7300 tube diameters allowed observation of up to 20 galloping cycles, allowing for statistical analysis of the wave dynamics. In the galloping regime, a bimodal velocity distribution was observed with peaks centered near 0.4 D_CJ and 0.95 D_CJ. Decreasing initial pressure increasingly favored the low-velocity mode. Galloping frequencies ranged from 0.8 to 1.0 kHz and were insensitive to initial mixture pressure. Wave deflagration-to-detonation transition and detonation failure trajectories were found to be repeatable in a given test and also across different initial mixture pressures. The temporal duration of wave dwell at the low- and high-velocity modes during galloping was also quantified. It was found that the mean wave dwell duration in the low-velocity mode was a weak function of initial mixture pressure, while the mean dwell time in the high-velocity mode depended exponentially on initial mixture pressure. Analysis of the velocity histories using dynamical systems ideas demonstrated trajectories that varied from stable to limit cycles to aperiodic motion with decreasing initial pressure. Finally, the results indicate that galloping detonation is a persistent phenomenon at long tube lengths.
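The reported exponential dependence of mean high-velocity-mode dwell time on initial pressure suggests a simple least-squares fit; the sketch below uses invented pressure/dwell pairs purely to illustrate the fitting step.

```python
# Sketch: fitting an exponential dependence of mean high-mode dwell time
# on initial mixture pressure. Pressure/dwell pairs are invented.
import numpy as np
from scipy.optimize import curve_fit

pressure_kpa = np.array([8.0, 10.0, 12.0, 14.0, 16.0])
dwell_ms = np.array([0.21, 0.35, 0.62, 1.05, 1.90])  # mean high-mode dwell

model = lambda p, a, b: a * np.exp(b * p)            # dwell = a * exp(b * p)
(a, b), _ = curve_fit(model, pressure_kpa, dwell_ms, p0=(0.05, 0.2))
print(f"dwell ~ {a:.3f} * exp({b:.3f} * p) ms")
```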
Educate All Girls and Boys in South Asia: The Global Out-of-School Children Initiative
ERIC Educational Resources Information Center
UNICEF, 2015
2015-01-01
Business as usual has not provided educational opportunities to world's most marginalized children. The South Asia Out-of-School Children Initiative (OOSCI) is part of the global initiative launched by UNICEF and the UNESCO Institute for Statistics (UIS) in 2010. The goal of the initiative is to make significant and sustained reduction in the…
Digest of Education Statistics 2016, 52nd Edition. NCES 2017-094
ERIC Educational Resources Information Center
Snyder, Thomas D.; de Brey, Cristobal; Dillow, Sally A.
2018-01-01
The 2016 edition of the "Digest of Education Statistics" is the 52nd in a series of publications initiated in 1962. The "Digest" has been issued annually except for combined editions for the years 1977-78, 1983-84, and 1985-86. Its primary purpose is to provide a compilation of statistical information covering the broad field…
Digest of Education Statistics, 2010. NCES 2011-015
ERIC Educational Resources Information Center
Snyder, Thomas D.; Dillow, Sally A.
2011-01-01
The 2010 edition of the "Digest of Education Statistics" is the 46th in a series of publications initiated in 1962. The "Digest" has been issued annually except for combined editions for the years 1977-78, 1983-84, and 1985-86. Its primary purpose is to provide a compilation of statistical information covering the broad field…
Digest of Education Statistics, 2009. NCES 2010-013
ERIC Educational Resources Information Center
Snyder, Thomas D.; Dillow, Sally A.
2010-01-01
The 2009 edition of the "Digest of Education Statistics" is the 45th in a series of publications initiated in 1962. The "Digest" has been issued annually except for combined editions for the years 1977-78, 1983-84, and 1985-86. Its primary purpose is to provide a compilation of statistical information covering the broad field…
Digest of Education Statistics 2015, 51st Edition. NCES 2016-014
ERIC Educational Resources Information Center
Snyder, Thomas D.; de Brey, Cristobal; Dillow, Sally A.
2016-01-01
The 2015 edition of the "Digest of Education Statistics" is the 51st in a series of publications initiated in 1962. The "Digest" has been issued annually except for combined editions for the years 1977-78, 1983-84, and 1985-86. Its primary purpose is to provide a compilation of statistical information covering the broad field…
Digest of Education Statistics 2013. NCES 2015-011
ERIC Educational Resources Information Center
Snyder, Thomas D.; Dillow, Sally A.
2015-01-01
The 2013 edition of the "Digest of Education Statistics" is the 49th in a series of publications initiated in 1962. The Digest has been issued annually except for combined editions for the years 1977-78, 1983-84, and 1985-86. Its primary purpose is to provide a compilation of statistical information covering the broad field of American…
Adult Perceptions of In-Class Collaborative Problem Solving as Mitigation for Statistics Anxiety
ERIC Educational Resources Information Center
Kinkead, Karl J.; Miller, Heather; Hammett, Richard
2016-01-01
Two purposes existed for initiating this qualitative case study involving adults who had completed a college-level business statistics course. The first purpose was to explore adult challenges with stress and anxiety during the course: a phenomenon labeled statistics anxiety in the literature. The second purpose was to gain insight into adult…
Orton, Dennis J.; Doucette, Alan A.
2013-01-01
Identification of biomarkers capable of differentiating between pathophysiological states of an individual is a laudable goal in the field of proteomics. Protein biomarker discovery generally employs high-throughput sample characterization by mass spectrometry (MS), which is capable of identifying and quantifying thousands of proteins per sample. While MS-based technologies have rapidly matured, the identification of truly informative biomarkers remains elusive, with only a handful of clinically applicable tests stemming from proteomic workflows. This underlying lack of progress is attributed in large part to erroneous experimental design, biased sample handling, and improper statistical analysis of the resulting data. This review discusses in detail the importance of experimental design and provides some insight into the overall workflow required for biomarker identification experiments. A proper balance between the degrees of biological and technical replication is required for confident biomarker identification. PMID:28250400
Why Flash Type Matters: A Statistical Analysis
NASA Astrophysics Data System (ADS)
Mecikalski, Retha M.; Bitzer, Phillip M.; Carey, Lawrence D.
2017-09-01
While the majority of research only differentiates between intracloud (IC) and cloud-to-ground (CG) flashes, there exists a third flash type, known as hybrid flashes. These flashes have extensive IC components as well as return strokes to ground but are misclassified as CG flashes in current flash type analyses due to the presence of a return stroke. In an effort to show that IC, CG, and hybrid flashes should be separately classified, the two-sample Kolmogorov-Smirnov (KS) test was applied to the flash sizes, flash initiation, and flash propagation altitudes for each of the three flash types. The KS test statistically showed that IC, CG, and hybrid flashes do not have the same parent distributions and thus should be separately classified. Separate classification of hybrid flashes will lead to improved lightning-related research, because unambiguously classified hybrid flashes occur on the same order of magnitude as CG flashes for multicellular storms.
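The core statistical step here is the two-sample KS test; a minimal Python sketch with synthetic flash-altitude samples (values assumed, not from the study) looks like this:

```python
# Sketch: the two-sample Kolmogorov-Smirnov test used to ask whether two
# flash populations (e.g., IC vs hybrid flash initiation altitudes) could
# share a parent distribution. Altitude samples are synthetic.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(3)
ic_altitude_km = rng.normal(9.0, 1.5, size=400)      # hypothetical IC flashes
hybrid_altitude_km = rng.normal(7.5, 1.8, size=150)  # hypothetical hybrid flashes

stat, p = ks_2samp(ic_altitude_km, hybrid_altitude_km)
print(f"KS statistic={stat:.3f}, p={p:.2g}")  # small p => reject common parent
```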
Statistical dynamics of religion evolutions
NASA Astrophysics Data System (ADS)
Ausloos, M.; Petroni, F.
2009-10-01
A religion affiliation can be considered as a “degree of freedom” of an agent on the human genre network. A brief review is given of the state of the art in data analysis and modelling of religious “questions” in order to suggest and, if possible, initiate further research, after applying a “statistical physics filter”. We present a discussion of the evolution of 18 so-called religions, as measured through their number of adherents between 1900 and 2000. Some emphasis is placed on a few cases presenting a minimum or a maximum in the investigated time range, thereby suggesting a competitive ingredient to be considered, besides the well-accepted “at birth” attachment effect. The importance of the “external field” is stressed through an Avrami late-stage crystal-growth-like parameter. The observed features and some intuitive interpretations point to opinion-based models with vector-like, rather than scalar-like, agents.
Predicting September sea ice: Ensemble skill of the SEARCH Sea Ice Outlook 2008-2013
NASA Astrophysics Data System (ADS)
Stroeve, Julienne; Hamilton, Lawrence C.; Bitz, Cecilia M.; Blanchard-Wrigglesworth, Edward
2014-04-01
Since 2008, the Study of Environmental Arctic Change Sea Ice Outlook has solicited predictions of September sea-ice extent from the Arctic research community. Individuals and teams employ a variety of modeling, statistical, and heuristic approaches to make these predictions. Viewed as monthly ensembles, each with one or two dozen individual predictions, they display a bimodal pattern of success. In years when observed ice extent is near its trend, the median predictions tend to be accurate. In years when the observed extent is anomalous, the median and most individual predictions are less accurate. Statistical analysis suggests that year-to-year variability, rather than methods, dominates the variation in ensemble prediction success. Furthermore, ensemble predictions do not improve as the season evolves. We consider the role of initial ice, atmosphere and ocean conditions, and summer storms and weather in contributing to the challenge of sea-ice prediction.
The ASC/SIL ratio for cytopathologists as a quality control measure: a follow-up study.
Nascimento, Alessandra F; Cibas, Edmund S
2007-10-01
Monitoring the relative frequency of the interpretations of atypical squamous cells (ASC) and squamous intraepithelial lesions (SIL) has been proposed as a quality control measure. To assess its value, an ASC/SIL ratio was calculated every 6 months for 3.5 years, and confidential feedback was provided to 10 cytopathologists (CPs). By using simple regression analysis, we analyzed the initial and final ASC/SIL ratios for individual CPs and for the entire group. The ratio was below the upper benchmark of 3:1 for all but 1 CP during every 6-month period. The ratio for all CPs combined showed a downward trend (from 2.05 to 1.73). The ratio for 6 CPs decreased, and for two of them the decrease was statistically significant. One CP showed a statistically significant increase in the ASC/SIL ratio. The decrease for some CPs likely reflects the salutary effect of confidential feedback and counseling.
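The monitoring scheme described, a ratio computed per 6-month period plus a simple regression for the trend, can be sketched in a few lines of Python; the ASC and SIL counts below are invented.

```python
# Sketch: computing a per-cytopathologist ASC/SIL ratio every 6 months and
# testing for a time trend with simple linear regression, mirroring the
# monitoring scheme described above. Counts are invented.
import numpy as np
from scipy.stats import linregress

asc = np.array([82, 76, 70, 69, 64, 60, 58])  # ASC interpretations per period
sil = np.array([40, 41, 38, 42, 39, 41, 40])  # SIL interpretations per period
period = np.arange(len(asc))                  # seven 6-month periods (3.5 years)

ratio = asc / sil
fit = linregress(period, ratio)
print(f"slope={fit.slope:.3f} per period, p={fit.pvalue:.3f}")
```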
Lapalud, P; Rothschild, C; Mathieu-Dupas, E; Balicchi, J; Gruel, Y; Laune, D; Molina, F; Schved, J F; Granier, C; Lavigne-Lissalde, G
2015-04-01
Hemophilia A (HA) is a congenital bleeding disorder resulting from factor VIII (FVIII) deficiency. The most serious complication of HA management is the appearance of inhibitory antibodies (Abs) against injected FVIII concentrates. To eradicate inhibitors, immune tolerance induction (ITI) is usually attempted, but it fails in up to 30% of cases. Currently, no undisputed predictive marker of ITI outcome is available to facilitate the clinical decision. To identify predictive markers of ITI efficacy, the isotypic and epitopic repertoires of inhibitory Abs were analyzed in plasma samples collected before ITI initiation from 15 children with severe HA and high-titer inhibitors, and their levels were compared between the two outcome groups (ITI success [n = 7] and ITI failure [n = 8]). The predictive value of these candidate biomarkers and of the currently used indicators (inhibitor titer and age at ITI initiation, highest inhibitor titer before ITI, and interval between inhibitor diagnosis and ITI initiation) was then compared by statistical analysis (Wilcoxon test and receiver operating characteristic [ROC] curve analysis). Whereas the current indicators seemed to fail in discriminating patients in the two outcome groups (ITI success or failure), anti-A1 and anti-A2 Ab levels before ITI initiation appeared to be good potential predictive markers of ITI outcome (P < 0.018). ROC analysis showed that anti-A1 and anti-A2 Abs were the best at discriminating between outcome groups (area under the ROC curve > 0.875). Anti-A1 and anti-A2 Abs could represent promising new tools for the development of ITI outcome prediction tests for children with severe HA. © 2015 International Society on Thrombosis and Haemostasis.
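The comparison of marker levels between outcome groups pairs a rank-sum test with ROC discrimination; here is a hedged Python sketch (the two-sample Wilcoxon test is run as its Mann-Whitney U equivalent, and the antibody levels are synthetic).

```python
# Sketch: comparing a candidate pre-ITI marker (e.g., anti-A2 antibody
# level) between outcome groups with a rank-sum test, then summarizing
# its discrimination with the area under the ROC curve. Values synthetic.
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
failure = rng.lognormal(1.0, 0.5, size=8)  # marker levels, ITI failure group
success = rng.lognormal(0.2, 0.5, size=7)  # marker levels, ITI success group

stat, p = mannwhitneyu(failure, success)
levels = np.concatenate([failure, success])
labels = np.array([1] * len(failure) + [0] * len(success))
auc = roc_auc_score(labels, levels)
print(f"Mann-Whitney p={p:.3f}, AUC={auc:.2f}")
```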
Scoping review protocol: education initiatives for medical psychiatry collaborative care
Shen, Nelson; Sockalingam, Sanjeev; Abi Jaoude, Alexxa; Bailey, Sharon M; Bernier, Thérèse; Freeland, Alison; Hawa, Aceel; Hollenberg, Elisa; Woldemichael, Bethel; Wiljer, David
2017-01-01
Introduction: The collaborative care model is an approach providing care to those with mental health and addictions disorders in the primary care setting. There is a robust evidence base demonstrating its clinical and cost-effectiveness in comparison with usual care; however, transitioning to this new paradigm of care has been difficult. While there are efforts to train and prepare healthcare professionals, not much is known about the current state of collaborative care training programmes. The objective of this scoping review is to understand how widespread these collaborative care education initiatives are, how they are implemented, and their impacts. Methods and analysis: The scoping review follows the established methodology of Arksey and O’Malley. The search strategy was developed by a medical librarian and will be applied in eight different databases spanning multiple disciplines. A two-stage screening process consisting of a title and abstract scan and a full-text review will be used to determine the eligibility of articles. To be included, articles must report on an existing collaborative care education initiative for healthcare providers. All articles will be independently assessed for eligibility by pairs of reviewers, and all eligible articles will be abstracted and charted in duplicate using a standardised form. The extracted data will undergo a ‘narrative review’, or a descriptive analysis of the contextual or process-oriented data, and simple quantitative analysis using descriptive statistics. Ethics and dissemination: Research ethics approval is not required for this scoping review. The results of this scoping review will inform the development of a collaborative care training initiative emerging from the Medical Psychiatry Alliance, a four-institution philanthropic partnership in Ontario, Canada. The results will also be presented at relevant national and international conferences and published in a peer-reviewed journal. PMID:28871017
Funding community medicines by exception: a descriptive epidemiological study from New Zealand.
Rasiah, Dilky; Edwards, Richard; Crampton, Peter
2012-02-24
To assess rates of approval and identify factors associated with successful applications for funding to the New Zealand Community Exceptional Circumstances (CEC) scheme. Descriptive quantitative analysis of data in the CEC applications database. The main outcome was the initial application approval rate. Analysis included calculation of unadjusted and adjusted associations between potential determinants (for example, patient age and gender) and outcomes using logistic regression analysis. All CEC applications with a decision about approval or decline between 1 October 2001 and 30 September 2008 were included. Application numbers were high, but had declined since 2001. A small number of medicines (11) and indications comprised about a third of the applications to the scheme. While some common applications were clearly outside the remit of the scheme, many applications were for patients who fitted the scheme's eligibility criteria. The overall initial application approval rate was 16% and the renewal application approval rate was 88%. Approval rates varied widely by type of medicine, therapeutic group, and indication. After adjusting for other potential determinants, there were no statistically significant differences in initial approval rates by gender, ethnicity, or socioeconomic status of the patient. There were, however, significant differences in initial application approval by age of the patient, type of applicant doctor, and geographical location of the applicant doctor. There was no evidence that gender, ethnicity, or socioeconomic status of patients were factors associated with successful applications. However, applications for younger patients, those made by specialists, and those made by clinicians from the Auckland District Health Board area were more likely to be successful. This may to some degree be appropriate, but requires further research.
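Estimating adjusted odds ratios of approval, as done here, is a standard logistic-regression exercise; below is a Python sketch using statsmodels with illustrative variable names and simulated data, not the study's dataset.

```python
# Sketch: unadjusted/adjusted associations between applicant and patient
# factors and initial approval via logistic regression, reported as odds
# ratios. Variable names and data are illustrative, not from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({
    "approved": rng.integers(0, 2, size=n),    # 1 = initial approval
    "age": rng.normal(55, 18, size=n),
    "specialist": rng.integers(0, 2, size=n),  # applicant doctor type
    "female": rng.integers(0, 2, size=n),
})

model = smf.logit("approved ~ age + specialist + female", data=df).fit(disp=0)
odds_ratios = np.exp(model.params)             # exponentiated coefficients
print(odds_ratios.round(2))
```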
Dowbor, Tatiana Pluciennik; Westphal, Márcia Faria
2013-08-01
To analyze the current status of interventions related to social determinants of health conducted in the context of the Brazilian Family Health Program. A case study using a mixed-methods approach based on a sequential explanatory strategy with 171 unit managers in the Family Health Care Program in the municipality of São Paulo, SP, Southeastern Brazil, in 2005/2006. Self-administered questionnaires were applied, and semi-structured interviews and focus groups were conducted with a purposive sample of professionals involved in initiatives related to social determinants of health. Quantitative data were analyzed using descriptive statistics, multiple correspondence analysis, cluster analysis, and correlation tests. Qualitative data were analyzed through content analysis and the creation of thematic categories. Despite the concentration of activities directed at disease care, the Family Health Care Program carries out various activities related to the social determination of health, encompassing the entire spectrum of health promotion approaches (biological, behavioral, psychological, social, and structural) and all major social determinants of health described in the literature. There was a significant difference in the scope of the determinants being worked on in the units according to the area of the city. The description of the activities revealed the fragility of the initiatives and a disconnection from the organizational structure of the Family Health Care Program. The quantity and variety of initiatives related to social determinants of health attest to the program's potential to deal with the social determination of health. On the other hand, the fluidity of objectives and the 'out of the ordinary/extraordinary' characterization of the described initiatives raise concern about their sustainability as an integral part of the program's current operational model.
Mutalik, Sunil; Tadinada, Aditya
2017-09-01
Pineal gland calcification has been proposed to play a role in the pathogenesis of Alzheimer disease. This study evaluated the prevalence and extent of pineal gland calcification in cone-beam computed tomography (CBCT) scans of patients referred for dental implant therapy, who could possibly be a vulnerable group for this condition. A retrospective evaluation of 500 CBCT scans was conducted. Scans that showed the area where the pineal gland is located were included. The scans were initially screened by a single observer to record the prevalence and extent of calcification. Six weeks following the completion of the study, another investigator randomly selected and reviewed 50 scans to investigate inter-observer variation, which was evaluated using reliability analysis statistics. The prevalence and measurements of the calcifications were reported using descriptive statistics. The chi-square test was used to compare the prevalence between males and females. The prevalence of pineal gland calcification was 58.8%. There was no statistically significant correlation between age and the extent of the calcification. The prevalence of calcification was 58.6% in females and 59.0% in males. The average anteroposterior measurement was 3.73±1.63 mm, while the average mediolateral measurement was 3.47±1.31 mm. The average total calcified area was 9.79±7.59 mm². The prevalence of pineal gland calcification was high in patients undergoing implant therapy. While not all pineal gland calcifications lead to neurodegenerative disorders, in the presence of any symptoms they should be strongly considered as a reason to initiate further investigations.
Design of point-of-care (POC) microfluidic medical diagnostic devices
NASA Astrophysics Data System (ADS)
Leary, James F.
2018-02-01
Inexpensive, portable, hand-held microfluidic flow/image cytometry devices for initial medical diagnostics at the point of first patient contact by emergency medical personnel in the field require careful design in terms of power and weight to allow realistic portability as a hand-held, point-of-care medical diagnostics device. True portability also requires small micro-pumps for high-throughput capability. Weight and power requirements dictate the use of super-bright LEDs and very small silicon photodiodes or nanophotonic sensors that can be powered by batteries. Signal-to-noise characteristics can be greatly improved by appropriately pulsing the LED excitation sources and by sampling and subtracting noise between excitation pulses. The requirements for basic computing, imaging, GPS, and basic telecommunications can be met simultaneously by use of smartphone technologies, which become part of the overall device. Software for a user-interface system, limited real-time computing, real-time imaging, and offline data analysis can be built with multi-platform software development systems that are well suited to a variety of currently available cellphone technologies, which already contain all of these capabilities. Microfluidic cytometry requires judicious use of small sample volumes and appropriate statistical sampling by microfluidic cytometry or imaging for adequate statistical significance to permit real-time (typically < 15 minutes) medical decisions for patients at the physician's office or real-time decision making in the field. One or two drops of blood obtained by pin-prick should be able to provide statistically meaningful results for use in making real-time medical decisions without the need for blood fractionation, which is not realistic in the field.
NASA Astrophysics Data System (ADS)
Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.
2015-08-01
We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than those of simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. Compared to OPSv5.6, the new method achieves: (1) significant reduction of random errors (standard deviations) of optimized bending angles, down to about half their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall, the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
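The optimization step rests on combining a background profile with observations weighted by their error covariances; the toy numpy sketch below shows that core linear combination, with B and R standing in for the estimated covariance matrices. All values are invented, and this is a generic optimal-estimation form rather than the authors' exact implementation.

```python
# Sketch: inverse-covariance-weighted combination of a background bending-
# angle profile with an observed profile, the core of statistical
# optimization. B and R stand in for the background and observation error
# covariance matrices (estimated in the paper with geographically varying
# uncertainties); the numbers here are toy values.
import numpy as np

n = 5                                    # levels of a toy profile
x_b = np.full(n, 1.0)                    # background bending angles
y = np.array([1.2, 1.1, 0.9, 1.3, 1.0])  # observed bending angles
B = 0.04 * np.exp(-np.abs(np.subtract.outer(range(n), range(n))) / 2.0)
R = 0.09 * np.eye(n)

# Optimal linear estimate: x = x_b + B (B + R)^{-1} (y - x_b)
gain = B @ np.linalg.inv(B + R)
x_opt = x_b + gain @ (y - x_b)
print(x_opt.round(3))
```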
A two-step super-Gaussian independent component analysis approach for fMRI data.
Ge, Ruiyang; Yao, Li; Zhang, Hang; Long, Zhiying
2015-09-01
Independent component analysis (ICA) has been widely applied to functional magnetic resonance imaging (fMRI) data analysis. Although ICA assumes that the sources underlying the data are statistically independent, it usually ignores the sources' additional properties, such as sparsity. In this study, we propose a two-step super-Gaussian ICA (2SGICA) method that incorporates the sparse prior of the sources into the ICA model. 2SGICA uses the super-Gaussian ICA (SGICA) algorithm, based on a simplified Lewicki-Sejnowski model, to obtain the initial source estimate in the first step. Using a kernel estimator technique, the source density is acquired and fitted to a Laplacian function based on the initial source estimates. The fitted Laplacian prior is then used for each source in the second SGICA step. Moreover, an automatic target generation process for initial value generation is used in 2SGICA to guarantee the stability of the algorithm. An adaptive step-size selection criterion is also implemented in the proposed algorithm. We performed experimental tests on both simulated data and real fMRI data to investigate the feasibility and robustness of 2SGICA, and made a performance comparison among Infomax ICA, FastICA, mean field ICA (MFICA) with a Laplacian prior, sparse online dictionary learning (ODL), SGICA, and 2SGICA. Both the simulated and real fMRI experiments showed that 2SGICA was the most robust to noise and had the best spatial detection power and time-course estimation among the six methods. Copyright © 2015. Published by Elsevier Inc.
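The second-step prior construction, estimating the source density with a kernel estimator and fitting a Laplacian to it, can be sketched as follows; the "initial source estimate" here is simulated rather than taken from an actual first-step SGICA run.

```python
# Sketch: estimate a source's density from initial ICA estimates with a
# kernel density estimator, then fit a Laplacian prior to the same samples.
# The step-1 source estimate is simulated, not a real SGICA output.
import numpy as np
from scipy.stats import laplace, gaussian_kde

rng = np.random.default_rng(6)
source_estimate = rng.laplace(loc=0.0, scale=0.8, size=5000)

kde = gaussian_kde(source_estimate)        # nonparametric density estimate
loc, scale = laplace.fit(source_estimate)  # Laplacian fitted to the samples
print(f"fitted Laplacian: loc={loc:.3f}, scale={scale:.3f}")
print(f"density at 0: kde={kde(0.0)[0]:.3f}, "
      f"laplace={laplace.pdf(0.0, loc, scale):.3f}")
```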
Lee, Kenneth K C; Wan, Matthew H S; Fan, Barry S K; Chau, Michelle W Y; Lee, Vivian W Y
2009-03-01
To find the antibiotic treatment regimens with the lowest cost for all-cause bacterial pneumonia, a study was carried out to compare the costs of different antibiotic regimens in the treatment of patients diagnosed with all-cause bacterial pneumonia who required hospitalisation. This was a multicentre, retrospective study of patient medical records. The primary aim was to examine whether the initial choice of antibiotic affected the total cost of treatment, while the secondary aim was to find out whether the initial choice of antibiotic affected initial treatment failure rates and death rates. A cost-minimisation analysis (CMA) from a public hospital perspective was employed. A total of 333 patient medical case notes were reviewed. The most commonly prescribed antibiotic regimen was amoxycillin-clavulanate (AC), followed by amoxycillin-clavulanate plus macrolide (ACM) and quinolone (Q). In the study population, no statistically significant difference could be detected among the mean costs of the three regimens. In the subgroup analysis of patients with a history of chronic obstructive pulmonary disease (COPD) and patients with a history of smoking, the Q regimen appeared to be the least expensive. In the study population, no significant difference could be identified among the mean costs of the three antibiotic regimens. In special populations such as patients with a history of COPD and patients with a history of smoking, the Q regimen appeared to be superior. Further studies in these areas are needed.
Heller, G.
2015-01-01
Surrogate end point research has grown in recent years with the increasing development and usage of biomarkers in clinical research. Surrogacy analysis is derived through randomized clinical trial data and it is carried out at the individual level and at the trial level. A common surrogate analysis at the individual level is the application of the Prentice criteria. An approach for the evaluation of the Prentice criteria is discussed, with a focus on its most difficult component, the determination of whether the treatment effect is captured by the surrogate. An interpretation of this criterion is illustrated using data from a randomized clinical trial in prostate cancer. PMID:26254442
Development and Validation of the Caring Loneliness Scale.
Karhe, Liisa; Kaunonen, Marja; Koivisto, Anna-Maija
2016-12-01
The Caring Loneliness Scale (CARLOS) includes 5 categories derived from earlier qualitative research. This article assesses the reliability and construct validity of a scale designed to measure patient experiences of loneliness in a professional caring relationship. Statistical analysis with 4 different sample sizes included Cronbach's alpha and exploratory factor analysis with principal axis factoring extraction. The sample size of 250 gave the most useful and comprehensible structure, but all 4 samples yielded the underlying content of loneliness experiences. The initial 5 categories were reduced to 4 factors with 24 items, with Cronbach's alpha ranging from .77 to .90. The findings support the reliability and validity of CARLOS for the assessment of Finnish breast cancer and heart surgery patients' experiences but, as with all instruments, further validation is needed.
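Cronbach's alpha, the internal-consistency statistic reported here, is straightforward to compute directly; the Python sketch below uses simulated responses for 24 items and n = 250, matching the figures highlighted above but not the real data.

```python
# Sketch: Cronbach's alpha = k/(k-1) * (1 - sum of item variances /
# variance of total score), computed on simulated item responses.
import numpy as np

rng = np.random.default_rng(7)
true_score = rng.normal(size=(250, 1))                       # latent trait
items = true_score + rng.normal(scale=1.0, size=(250, 24))   # 24 items, n = 250

k = items.shape[1]
item_var = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```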
NASA Astrophysics Data System (ADS)
Lemanzyk, Thomas; Anding, Katharina; Linss, Gerhard; Rodriguez Hernández, Jorge; Theska, René
2015-02-01
The following paper deals with the classification of seeds and seed components of the South American Incanut plant and the modification of a machine to handle this task. Initially, the state of the art is illustrated. The research was executed in Germany, with a relevant part in Peru and Ecuador. Theoretical considerations for the solution of an automated analysis of the Incanut seeds are specified. The optimization of the analysis software and the separation unit of the mechanical hardware are carried out on the basis of recognition results. In a final step, the practical application of the analysis of the Incanut seeds is tested on a trial basis and rated on the basis of statistical values.
NASA Astrophysics Data System (ADS)
Martinez, B. S.; Ye, H.; Levy, R. C.; Fetzer, E. J.; Remer, L.
2017-12-01
Atmospheric aerosols carry high levels of uncertainty with regard to Earth's changing atmospheric energy budget. Continued exploration and analysis are necessary to obtain a more complete understanding of how, and to what degree, aerosols contribute to climate feedbacks and global climate change. With the advent of global satellite retrievals, along with specific aerosol optical depth (AOD) Dark Target and Deep Blue algorithms, aerosols can now be better measured and analyzed. Aerosol effects on climate depend primarily on altitude, the reflectance albedo of the underlying surface, and the presence of clouds and the dynamics thereof. As currently known, the majority of aerosol distribution and mixing occurs in the lower troposphere, from the surface upwards to around 2 km. Additionally, as a primary greenhouse gas, water vapor is significant to climate feedbacks and Earth's radiation budget. Feedbacks are generally reported from the top of atmosphere (TOA). Therefore, little is known of the relationship between water vapor and aerosols, specifically in regional areas of the globe known for aerosol loading, such as anthropogenic biomass burning in South America and naturally occurring dust blowing off the deserts of the African and Arabian peninsulas. Statistical regression and time-series analysis are used to identify statistically significant increasing and decreasing trends in both regional precipitable water (PW) and AOD over the 13-year period 2003-2015. Regions with statistically significant positive or negative trends in AOD and PW are analyzed to determine correlations, or the lack thereof. This initial examination helps in understanding how aerosols contribute to the radiation budget and in assessing climate change.
Exploring the link between meteorological drought and streamflow to inform water resource management
NASA Astrophysics Data System (ADS)
Lennard, Amy; Macdonald, Neil; Hooke, Janet
2015-04-01
Drought indicators are an under-used metric in UK drought management. Standardised drought indicators offer a potential monitoring and management tool for operational water resource management; however, the use of these metrics needs further investigation. This work uses statistical analysis of the climatological drought signal, based on meteorological drought indicators and observed streamflow data, to explore the link between meteorological drought and hydrological drought and to inform water resource management for a single water resource region. The region, covering 21,000 km² of the English Midlands and central Wales, includes a variety of landscapes and climatological conditions. Analysis of the links between meteorological drought and hydrological drought performed using streamflow data from 'natural' catchments indicates a close positive relationship between meteorological drought indicators and streamflow, enhancing confidence in the application of drought indicators for monitoring and management. However, many of the catchments in the region are subject to modification through impoundments, abstractions, and discharges. Therefore, it is beneficial to explore how the climatological drought signal propagates into managed hydrological systems. Using a longitudinal study of catchments and sub-catchments that include natural and modified river reaches, the relationship between meteorological and hydrological drought is explored. Initial statistical analysis of meteorological drought indicators and streamflow data from modified catchments shows a significantly weakened statistical relationship and reveals how anthropogenic activities may alter hydrological drought characteristics in modified catchments. Exploring how meteorological drought indicators link to streamflow across the water supply region helps build an understanding of their utility for operational water resource management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plemons, R.E.; Hopwood, W.H. Jr.; Hamilton, J.H.
For a number of years the Oak Ridge Y-12 Plant Laboratory has been analyzing coal, predominantly for the utilities department of the Y-12 Plant. All laboratory procedures, except a Leco sulfur method which used the Leco Instruction Manual as a reference, were written based on the ASTM coal analyses. Sulfur is analyzed at the present time by two methods, gravimetric and Leco. The laboratory has two major endeavors for monitoring the quality of its coal analyses. (1) A control program by the Plant Statistical Quality Control Department. Quality Control submits one sample for every nine samples submitted by the utilities departments, and the laboratory analyzes a control sample along with the utilities samples. (2) An exchange program with the DOE Coal Analysis Laboratory in Bruceton, Pennsylvania. The Y-12 Laboratory submits to the DOE Coal Laboratory, on even-numbered months, a sample that Y-12 has analyzed. The DOE Coal Laboratory submits, on odd-numbered months, one of their analyzed samples to the Y-12 Plant Laboratory to be analyzed. The results of these control and exchange programs are monitored not only by laboratory personnel, but also by Statistical Quality Control personnel who provide statistical evaluations. After analysis and reporting of results, all utilities samples are retained by the laboratory until the coal contracts have been settled. The utilities departments have responsibility for the initiation and preparation of the coal samples. The samples normally received by the laboratory have been ground to 4-mesh, reduced to 0.5-gallon quantities, and sealed in air-tight containers. Sample identification numbers and a Request for Analysis are generated by the utilities departments.
Raico Gallardo, Yolanda Natali; da Silva-Olivio, Isabela Rodrigues Teixeira; Mukai, Eduardo; Morimoto, Susana; Sesma, Newton; Cordaro, Luca
2017-05-01
To systematically assess the current dental literature comparing the accuracy of computer-aided implant surgery when using different supporting tissues (tooth, mucosa, or bone). Two reviewers searched PubMed (1972 to January 2015) and the Cochrane Central Register of Controlled Trials (Central) (2002 to January 2015). For the assessment of accuracy, studies were included with the following outcome measures: (i) angle deviation, (ii) deviation at the entry point, and (iii) deviation at the apex. Eight clinical studies from the 1602 articles initially identified met the inclusion criteria for the qualitative analysis. Four studies (n = 599 implants) were evaluated using meta-analysis. The bone-supported guides showed a statistically significantly greater deviation in angle (P < 0.001), at the entry point (P = 0.01), and at the apex (P = 0.001) when compared to the tooth-supported guides. Conversely, when only retrospective studies were analyzed, no significant differences were revealed in the deviations at the entry point and apex. The mucosa-supported guides showed statistically significantly smaller angle deviation (P = 0.02), deviation at the entry point (P = 0.002), and deviation at the apex (P = 0.04) when compared to the bone-supported guides. Between the mucosa- and tooth-supported guides, there were no statistically significant differences for any of the outcome measures. It can be concluded that the tissue supporting the guide influences the accuracy of computer-aided implant surgery. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, H.; Kim, Rokho; Korrick, S.
1996-12-31
In an earlier report based on participants in the Veterans Administration Normative Aging Study, we found a significant association between the risk of hypertension and lead levels in tibia. To examine the possible confounding effects of education and occupation, we considered in this study five levels of education and three levels of occupation as independent variables in the statistical model. Of 1,171 active subjects seen between August 1991 and December 1994, 563 provided complete data for this analysis. In the initial logistic regression model, age, body mass index, family history of hypertension, and dietary sodium intake, but neither cumulative smoking nor alcohol ingestion, conferred increased odds ratios for being hypertensive that were statistically significant. When the lead biomarkers were added separately to this initial logistic model, tibia lead and patella lead levels were associated with significantly elevated odds ratios for hypertension. In the final backward-elimination logistic regression model that included categorical variables for education and occupation, the only variables retained were body mass index, family history of hypertension, and tibia lead level. We conclude that education and occupation variables were not confounding the association between the lead biomarkers and hypertension that we reported previously. 27 refs., 3 tabs.
Among long-term crack smokers, who avoids and who succumbs to cocaine addiction?
Falck, Russel S; Wang, Jichuan; Carlson, Robert G
2008-11-01
Crack cocaine is a highly addictive drug. To learn more about crack addiction, long-term crack smokers who had never met the DSM-IV criteria for lifetime cocaine dependence were compared with those who had. The study sample consisted of crack users (n=172) from the Dayton, Ohio, area who were interviewed periodically over 8 years. Data were collected on a range of variables, including age of crack initiation, frequency of recent use, and lifetime cocaine dependence. Cocaine dependence was common, with 62.8% of the sample having experienced it. There were no statistically significant differences between dependent and non-dependent users in age of crack initiation or frequency of crack use. In terms of sociodemographics, only race/ethnicity was significant, with proportionally fewer African-Americans than whites meeting the criteria for cocaine dependence. Controlling for sociodemographics, partial correlation analysis showed positive, statistically significant relationships between lifetime cocaine dependence and anti-social personality disorder, attention deficit/hyperactivity disorder, and lifetime dependence on alcohol, cannabis, amphetamine, sedative-hypnotics, and opioids. These results highlight the importance of addressing race/ethnicity and comorbid disorders when developing, implementing, and evaluating interventions targeting people who use crack cocaine. Additional research is needed to better understand the role of race/ethnicity in the development of cocaine dependence resulting from crack use.
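The partial-correlation step, controlling for sociodemographics, can be implemented by residualizing both variables on the covariates; the sketch below uses simulated data and illustrative covariates.

```python
# Sketch: partial correlation between lifetime cocaine dependence and a
# comorbid disorder, controlling for sociodemographics by residualizing
# both variables on the covariates. Data are simulated.
import numpy as np

rng = np.random.default_rng(8)
n = 172
covariates = rng.normal(size=(n, 3))  # e.g., coded age, sex, race/ethnicity
dependence = rng.integers(0, 2, size=n).astype(float)
comorbidity = rng.integers(0, 2, size=n).astype(float)

X = np.column_stack([np.ones(n), covariates])       # design matrix + intercept
resid = lambda y: y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
r = np.corrcoef(resid(dependence), resid(comorbidity))[0, 1]
print(f"partial r = {r:.3f}")
```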
Dewa, Carolyn S; Zipursky, Robert B; Chau, Nancy; Furimsky, Ivana; Collins, April; Agid, Ofer; Goering, Paula
2009-11-01
This pilot study compared the effectiveness of specialized care that was home based versus hospital based for individuals experiencing their first psychotic episode. A randomized controlled trial design was used. A total of 29 subjects were interviewed at baseline, 3 and 9 months. Repeated measures analysis of variance was employed to test for statistically significant changes over time, within and between groups, with regard to community psychosocial functioning and symptom severity. Our findings indicate that subjects in both the home-based and hospital-based programmes significantly improved with regard to symptoms and community functioning over time. However, the rates of change over time were not significantly different between the two programmes. There was a statistically significant difference between programmes in the proportion of subjects with fewer than two visits (i.e., those who either did not attend their first assessment or did not return for follow-up visits after their assessment). This was a modest pilot study and the sample was too small to allow definitive conclusions to be drawn. However, the results raise questions about differences in initial treatment engagement. They suggest the need for additional research focusing on interventions that promote initial treatment seeking. © 2009 The Authors. Journal compilation © 2009 Blackwell Publishing Asia Pty Ltd.
Gomes, Liliane R.; Gomes, Marcelo; Jung, Bryan; Paniagua, Beatriz; Ruellas, Antonio C.; Gonçalves, João Roberto; Styner, Martin A.; Wolford, Larry; Cevidanes, Lucia
2015-01-01
This study aimed to investigate imaging statistical approaches for classifying three-dimensional (3-D) osteoarthritic morphological variations among 169 temporomandibular joint (TMJ) condyles. Cone-beam computed tomography scans were acquired from 69 subjects with long-term TMJ osteoarthritis (OA), 15 subjects at initial diagnosis of OA, and 7 healthy controls. Three-dimensional surface models of the condyles were constructed, and SPHARM-PDM established correspondent points on each model. Multivariate analysis of covariance and direction-projection-permutation (DiProPerm) were used for testing statistical significance of the differences between the groups determined by clinical and radiographic diagnoses. Unsupervised classification using hierarchical agglomerative clustering was then conducted. Compared with healthy controls, the average OA condyle was significantly smaller in all dimensions except its anterior surface. Significant flattening of the lateral pole was noticed at initial diagnosis. We observed areas of 3.88-mm bone resorption at the superior surface and 3.10-mm bone apposition at the anterior aspect of the long-term OA average model. DiProPerm supported a significant difference between the healthy control and OA group (p-value=0.001). Clinically meaningful unsupervised classification of TMJ condylar morphology determined a preliminary diagnostic index of 3-D osteoarthritic changes, which may be the first step towards a more targeted diagnosis of this condition. PMID:26158119
Krumme, Alexis A; Sanfélix-Gimeno, Gabriel; Franklin, Jessica M; Isaman, Danielle L; Mahesri, Mufaddal; Matlin, Olga S; Shrank, William H; Brennan, Troyen A; Brill, Gregory; Choudhry, Niteesh K
2016-01-01
Objective: The use of retail purchasing data may improve adherence prediction over approaches using healthcare insurance claims alone. Design: Retrospective. Setting and participants: A cohort of patients who received prescription medication benefits through CVS Caremark, used a CVS Pharmacy ExtraCare Health Care (ECHC) loyalty card, and initiated a statin medication in 2011. Outcome: We evaluated associations between retail purchasing patterns and optimal adherence to statins in the 12 subsequent months. Results: Among 11 010 statin initiators, 43% were optimally adherent at 12 months of follow-up. Greater numbers of store visits per month and dollar amount per visit were positively associated with optimal adherence, as was making a purchase on the same day as filling a prescription (p<0.0001 for all). Models to predict adherence using retail purchase variables had low discriminative ability (C-statistic: 0.563), while models with both clinical and retail purchase variables achieved a C-statistic of 0.617. Conclusions: While the use of retail purchases may improve the discriminative ability of claims-based approaches, these data alone appear inadequate for adherence prediction, even with the addition of more complex analytical approaches. Nevertheless, associations between retail purchasing behaviours and adherence could inform the development of quality improvement interventions. PMID:28186924
Comparing Networks from a Data Analysis Perspective
NASA Astrophysics Data System (ADS)
Li, Wei; Yang, Jing-Yu
To probe network characteristics, the two predominant approaches to network comparison are global property statistics and subgraph enumeration. However, both suffer from limited information and exhaustive computation. Here, we present an approach to comparing networks from the perspective of data analysis. Initially, the approach projects each node of the original network as a high-dimensional data point, so the network is seen as a cloud of data points. The dispersion information of the principal component analysis (PCA) projection of the generated data clouds can then be used to distinguish networks. We applied this node projection method to yeast protein-protein interaction networks and Internet Autonomous System networks, two types of networks with several similar higher-order properties. The method can efficiently distinguish one from the other. Identical results on datasets from independent sources also indicate that the method is a robust and universal framework.
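The abstract above does not spell out the exact projection, so the following is a minimal sketch of the node-projection idea, assuming each node is represented by its adjacency-matrix row and dispersion is summarized by the variance captured by the leading principal components; the random graph, its size, and the component count are hypothetical stand-ins.

```python
import numpy as np

def pca_dispersion(adj, k=10):
    """Project each node as its adjacency row and return the fraction of
    variance captured by the first k principal components."""
    X = adj - adj.mean(axis=0)               # center the node "data cloud"
    s = np.linalg.svd(X, compute_uv=False)   # singular values of centered data
    var = s ** 2
    return var[:k].sum() / var.sum()

rng = np.random.default_rng(0)
n, p = 200, 0.05
a = (rng.random((n, n)) < p).astype(float)   # Erdos-Renyi-style toy network
a = np.triu(a, 1)
a = a + a.T                                  # symmetrize, no self-loops
print("dispersion captured by 10 PCs:", round(pca_dispersion(a), 3))
```

Two networks with similar global statistics can still separate on this dispersion profile, which is the kind of comparison the authors exploit.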
Effects of additional data on Bayesian clustering.
Yamazaki, Keisuke
2017-10-01
Hierarchical probabilistic models, such as mixture models, are used for cluster analysis. These models have two types of variables: observable and latent. In cluster analysis, the latent variable is estimated, and it is expected that additional information will improve the accuracy of the estimation of the latent variable. Many proposed learning methods are able to use additional data; these include semi-supervised learning and transfer learning. However, from a statistical point of view, a complex probabilistic model that encompasses both the initial and additional data might be less accurate due to having a higher-dimensional parameter. The present paper presents a theoretical analysis of the accuracy of such a model and clarifies which factor has the greatest effect on its accuracy, the advantages of obtaining additional data, and the disadvantages of increasing the complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.
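As a toy illustration of the trade-off the paper analyzes, the sketch below estimates latent cluster labels with a Gaussian mixture fitted first to a small initial sample and then to the same sample plus additional data, comparing the accuracy of the latent-variable estimates; the simulated data, sample sizes, and use of scikit-learn are assumptions for illustration, not the paper's setup.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(1)
means = np.array([[0.0, 0.0], [3.0, 3.0]])
z = rng.integers(0, 2, size=300)              # true latent cluster labels
X = means[z] + rng.normal(size=(300, 2))      # observable data

for n in (50, 300):                           # initial data vs initial+additional
    gm = GaussianMixture(n_components=2, random_state=0).fit(X[:n])
    est = gm.predict(X[:50])                  # latent estimates on a shared subset
    print(n, "samples -> ARI:", round(adjusted_rand_score(z[:50], est), 3))
```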
Environmental initiative prioritization with a Delphi approach: a case study.
Gokhale, A A
2001-08-01
India is fast finding its place in the industrialized world, and this is beginning to raise its environmental consciousness. The Delphi technique was used to prioritize specific needs and articulate a sustainable urban improvement strategy for the city of Mumbai (formerly Bombay). The Delphi technique is a means of achieving consensual validity among raters by providing them feedback regarding other raters' responses. Mumbai has several indigenous environmental groups that were tapped for activists and leaders; the study was conducted using ten environmentalists. In the initial phases, the responses produced a range of possible program alternatives. The last two stages sought information that generated a consensus within the respondent group. Statistical analysis methods included hierarchical cluster analysis and calculations of the mean, median, mode, and percent agreement using SPSS software. The face-to-face discussion in phase 4 clarified some issues and helped the group as a whole to outline a strategy for putting in place the essential elements of a framework to improve the quality of life in an urban environment.
Thomson, Gill; Bilson, Andy; Dykes, Fiona
2012-04-01
To describe a 'hearts and minds' approach to community Baby Friendly Initiative (BFI) implementation developed from the views of multidisciplinary professionals. A qualitative descriptive study utilising focus groups and interviews, with thematic networks analysis. Forty-seven professionals were consulted from two primary health-care facilities located in the North-West of England. Thematic networks analysis generated a global theme of a 'hearts and minds approach' to BFI implementation, which embodies emotional and rational engagement. The three underpinning organising themes (and their associated basic themes), 'credible leadership', 'engagement of key partners' and 'changing attitudes and practice', reflect the context, processes and outcomes of a 'hearts and minds' approach. A 'hearts and minds' approach transcends the prescriptive aspects of a macro-level intervention, with its emphasis upon audits, training, statistics and 'hard' evidence, by valuing other professionals and engaging staff at all levels. It offers insights into how organisational change may move beyond traditional top-down mechanisms for driving change to incorporate ways that value others and promote cooperation and reflection. Copyright © 2011 Elsevier Ltd. All rights reserved.
Data Analysis And Polarization Measurements With GEMS
NASA Technical Reports Server (NTRS)
Strohmayer, Tod
2011-01-01
The Gravity and Extreme Magnetism SMEX (GEMS) mission was selected by NASA for flight in 2014. GEMS will make the first sensitive survey of X-ray polarization across a wide range of source classes including black hole and neutron star binaries, AGN of different types, rotation and accretion-powered pulsars, magnetars, shell supernova remnants and pulsar wind nebulae. GEMS employs grazing-incidence foil mirrors and novel time-projection chamber (TPC) polarimeters leveraging the photoelectric effect. The GEMS detectors image the charge tracks of photoelectrons produced by 2 - 10 keV X-rays. The initial direction of the photoelectron is determined by the linear polarization of the photon. We present an overview of the data analysis challenges and methods for GEMS, including procedures for producing optimally filtered images of the charge tracks and estimating their initial directions. We illustrate our methods using laboratory measurements of polarized and unpolarized X-rays with flight-like detectors as well as from simulated tracks. We also present detailed simulations exploring the statistics of polarization measurements appropriate for GEMS, and make comparisons with previous work.
Structure and evolution of a European Parliament via a network and correlation analysis
NASA Astrophysics Data System (ADS)
Puccio, Elena; Pajala, Antti; Piilo, Jyrki; Tumminello, Michele
2016-11-01
We present a study of the network of relationships among elected members of the Finnish parliament, based on a quantitative analysis of initiative co-signatures, and its evolution over 16 years. To understand the structure of the parliament, we constructed a statistically validated network of members, based on the similarity between the patterns of initiatives they signed. We looked for communities within the network and characterized them in terms of members' attributes, such as electoral district and party. To gain insight into the nested structure of communities, we constructed a hierarchical tree of members from the correlation matrix. Afterwards, we studied parliament dynamics yearly, with a focus on correlations within and between parties, also distinguishing between government and opposition. Finally, we investigated the role played by specific individuals at a local level, in particular whether they act as proponents who gather consensus or as signers. Our results provide a quantitative background to current theories in political science. From a methodological point of view, our network approach has proven able to highlight both local and global features of a complex social system.
Budhwani, H; De, P
2017-12-01
Vaccine disparities research often focuses on differences between the five main racial and ethnic classifications, ignoring heterogeneity of subpopulations. Considering this knowledge gap, we examined human papillomavirus (HPV) vaccine initiation in Asian Indians and Asian subpopulations. National Health Interview Survey data (2008-2013), collected by the National Center for Health Statistics, were analyzed. Multiple logistic regression analysis was conducted on adults aged 18-26 years (n = 20,040). Asian Indians had high income, education, and health insurance coverage, all positive predictors of preventative health engagement and vaccine uptake. However, we found that Asian Indians had comparatively lower rates of HPV vaccine initiation (odds ratio = 0.41; 95% confidence interval = 0.207-0.832), and foreign-born Asian Indians had the lowest rate of HPV vaccination of all subpopulations (2.3%). Findings substantiate the need for research on disaggregated data rather than evaluating vaccination behaviors solely across standard racial and ethnic categories. We identified two populations that were initiating HPV vaccine at abysmal levels: foreign-born persons and Asian Indians. Development of culturally appropriate messaging has the potential to improve these initiation rates and improve population health. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.
2018-01-01
In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy and is generally accepted as a measure of statistical literacy. We also introduce a new teaching method for the elementary statistics class. Unlike the traditional elementary statistics course, we introduce a simulation-based inference method for conducting hypothesis testing. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.
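The abstract does not include an example, but a simulation-based test of a difference in means, of the kind such a course might demonstrate, can be sketched as a permutation test; the data and group sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1, 40)            # hypothetical control scores
b = rng.normal(0.5, 1, 40)            # hypothetical treatment scores
observed = b.mean() - a.mean()

pooled = np.concatenate([a, b])
count = 0
for _ in range(10_000):               # re-randomize group labels many times
    rng.shuffle(pooled)
    diff = pooled[40:].mean() - pooled[:40].mean()
    count += diff >= observed         # how often chance matches the observed gap
print("one-sided permutation p-value:", count / 10_000)
```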
Frings, Andreas; Steinberg, Johannes; Druchkiv, Vasyl; Linke, Stephan J; Katz, Toam
2017-08-01
This study was initiated to introduce the term non-corneal ocular residual astigmatism (N-CORA) as a new parameter in astigmatic change analysis after implantation of two different types of non-toric, multifocal intraocular lenses (MIOL). Seventy-two eyes from 72 consecutive patients after MIOL surgery were studied in a retrospective, cross-sectional data analysis. Two types of spherical MIOL were used. The surgical technique in all patients was 2.4-mm incision phacoemulsification, performed by one surgeon. To investigate the magnitude and axis of astigmatic changes, the true corneal astigmatism and the Alpins vector method were applied. There were no statistically significant between-group differences related to the preoperative refraction or ocular residual astigmatism (ORA). After surgery, the mean refractive surgically induced astigmatism (RSIA) and the topographic SIA (TSIA) did not differ significantly between the lenses. The magnitude and orientation of ORA and N-CORA changed after surgery. There are no statistically significant differences in postoperative ORA in magnitude or axis when implanting different types of MIOL. The similarity of N-CORA between both MIOL types shows that both diffractive and refractive asymmetric MIOLs with plate haptics have the same pseudolentogenic astigmatic effect, which can be presented in terms of the newly introduced parameter N-CORA.
Replica analysis of overfitting in regression models for time-to-event data
NASA Astrophysics Data System (ADS)
Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.
2017-09-01
Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox's proportional hazards model (the main tool of medical statisticians), one finds in the literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.
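As a hedged illustration of the phenomenon the paper formalizes (not its replica calculation), the sketch below fits Cox models to pure-noise covariates and shows the in-sample concordance index drifting upward as the number of parameters grows relative to the sample size; it assumes the lifelines package and simulated data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n = 100
for p in (2, 20, 50):                       # growing number of noise covariates
    df = pd.DataFrame(rng.normal(size=(n, p)),
                      columns=[f"x{i}" for i in range(p)])
    df["T"] = rng.exponential(1.0, n)       # survival times independent of covariates
    df["E"] = 1                             # all events observed
    cph = CoxPHFitter(penalizer=0.0).fit(df, duration_col="T", event_col="E")
    # higher partial hazard means shorter predicted survival, hence the minus sign
    c = concordance_index(df["T"], -cph.predict_partial_hazard(df), df["E"])
    print(f"p={p:2d}  in-sample concordance: {c:.3f}")
```

Because the covariates carry no signal, any concordance above 0.5 here is pure overfitting, which standard p-values would not reveal.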
Onalan, Reside; Onalan, Gogsen; Tonguc, Esra; Ozdener, Tulin; Dogan, Muammer; Mollamahmutoglu, Leyla
2009-04-01
To determine the subgroup of patients in whom office hysteroscopy should be routinely performed before an in vitro fertilization (IVF) program. Retrospective cohort analysis. Tertiary education and research hospital. Two hundred twenty-three patients who underwent a uterine evaluation by office hysteroscopy before the IVF and embryo transfer cycle. The office hysteroscopy was performed in the follicular phase of the menstrual cycle before the IVF cycle. The outcome measures were the number of polyps, the presence of multiple polyps, and polyp size. Patients with polycystic ovary syndrome (PCOS) had a higher number of endometrial polyps, but the difference was not statistically significant (28.9% vs. 18.3%). When comparing patients according to BMI, those with BMI ≥30 had a statistically significantly higher number of endometrial polyps than those with BMI <30 (52% vs. 15%). Obesity was also positively correlated with the occurrence of polyps, the size of the polyps, and the occurrence of multiple polyps in the correlation analysis. In addition, logistic regression analysis using age, obesity, duration of infertility, and estradiol levels revealed that obesity was an independent prognostic factor for the development of endometrial polyps. Office hysteroscopy should be performed in patients with BMI ≥30 because obesity may act as an initiator in the pathogenesis of endometrial polyps.
Brito, Janaína Salmos; Santos Neto, Alexandrino; Silva, Luciano; Menezes, Rebeca; Araújo, Natália; Carneiro, Vanda; Moreno, Lara Magalhães; Miranda, Jéssica; Álvares, Pâmella; Nevares, Giselle; Xavier, Felipe; Arruda, José Alcides; Bessa-Nogueira, Ricardo; Santos, Natanael; Queiroz, Gabriela; Sobral, Ana Paula; Silveira, Márcia; Albuquerque, Diana; Gerbi, Marleny
2016-01-01
This paper aimed to analyze in vitro the effect of industrialized fruit juices plus soy, to establish the erosive potential of these solutions. Seventy bovine incisors were selected after evaluation under a stereomicroscope. Their crowns were prepared and randomly divided into 7 groups, using microhardness as the allocation criterion. The crowns were exposed to the fruit juices plus soy twice a day for 15 days. The pH values, acid titration, and Knoop microhardness were recorded, and the specimens were evaluated using X-ray microfluorescence (µXRF). The average pH for all juices, including after 3 days, was significantly below the critical value for dental erosion. On average, the pH value decreased 14% between the initial measurement and the measurement after 3 days. Comparing before and after, there was a 49% decrease in microhardness across groups (p < 0.05); groups G1, G2, G5, and G6 were above this average. The µXRF analysis showed a decrease of approximately 7% Ca and 4% P on the bovine crown surfaces. Statistical analysis showed a statistically significant difference between groups. Thus, the chance of a tooth suffering demineralization due to industrialized fruit juices plus soy is real.
3Drefine: an interactive web server for efficient protein structure refinement
Bhattacharya, Debswapna; Nowotny, Jackson; Cao, Renzhi; Cheng, Jianlin
2016-01-01
3Drefine is an interactive web server for consistent and computationally efficient protein structure refinement with the capability to perform web-based statistical and visual analysis. The 3Drefine refinement protocol utilizes iterative optimization of the hydrogen-bonding network combined with atomic-level energy minimization of the optimized model, using composite physics- and knowledge-based force fields, for efficient protein structure refinement. The method has been extensively evaluated on blind CASP experiments as well as on large-scale and diverse benchmark datasets and exhibits consistent improvement over the initial structure in both global and local structural quality measures. The 3Drefine web server allows for convenient protein structure refinement through text or file input submission, email notification, and a provided example submission, and is freely available without any registration requirement. The server also provides comprehensive analysis of submissions through various energy and statistical feedback and interactive visualization of multiple refined models through the JSmol applet, which is equipped with numerous protein model analysis tools. The web server has been extensively tested and used by many users. As a result, the 3Drefine web server conveniently provides a useful tool easily accessible to the community. The 3Drefine web server has been made publicly available at the URL: http://sysbio.rnet.missouri.edu/3Drefine/. PMID:27131371
Mundim, Fabrício M; Antunes, Pedro L; Sousa, Ana Beatriz S; Garcia, Lucas F R; Pires-de-Souza, Fernanda C P
2012-06-01
To evaluate the colour stability of paints used for ocular prosthesis iris painting submitted to accelerated artificial ageing (AAA). Forty specimens of acrylic resin for sclera (16 × 2 mm) were made and separated into eight groups (n = 10) according to the type of paint (gouache, GP; oil, OP; acrylic, AP; and composite resin for characterisation, CR) and the colours used (blue/brown). After drying (72 h), a new layer of colourless acrylic resin was applied and the initial colour readout was performed (Spectrophotometer PCB 6807). New colour readouts were performed after AAA, and ΔE was calculated. Statistical analysis (two-way ANOVA with Bonferroni correction, p < 0.05) demonstrated that the brown colour showed lower ΔE means than the blue colour, with a statistically significant difference for AP only. The blue colour showed no statistically significant difference with regard to the type of paint used. Brown AP showed lower ΔE than the other groups, with significant differences for OP and GP. GP showed the greatest alteration in ΔE for the brown colour, being statistically similar only to OP. Only the AP group for brown pigment shows clinically acceptable values for colour stability after AAA. © 2011 The Gerodontology Society and John Wiley & Sons A/S.
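The colour difference ΔE reported above is conventionally the Euclidean distance between CIELAB readouts (the CIE76 formula); the sketch below computes it for hypothetical before/after readouts, since the paper's raw values are not reproduced in the abstract.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two L*a*b* readouts."""
    return math.dist(lab1, lab2)

before = (62.0, 5.1, 18.3)   # hypothetical L*, a*, b* before ageing
after = (59.4, 6.0, 21.2)    # hypothetical readout after AAA
print(f"dE = {delta_e_cie76(before, after):.2f}")  # ~4.0 for these readouts
```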
Papadia, Andrea; Bellati, Filippo; Bogani, Giorgio; Ditto, Antonino; Martinelli, Fabio; Lorusso, Domenica; Donfrancesco, Cristina; Gasparri, Maria Luisa; Raspagliesi, Francesco
2015-12-01
The aim of this study was to identify clinical variables that may predict the need for adjuvant radiotherapy after neoadjuvant chemotherapy (NACT) and radical surgery in locally advanced cervical cancer patients. A retrospective series of cervical cancer patients with International Federation of Gynecology and Obstetrics (FIGO) stages IB2-IIB treated with NACT followed by radical surgery was analyzed. Clinical predictors of persistence of intermediate- and/or high-risk factors at final pathological analysis were investigated. Statistical analysis was performed using univariate and multivariate analysis and a model based on artificial intelligence known as artificial neural network (ANN) analysis. Overall, 101 patients were available for the analyses. Fifty-two (51%) patients were considered at high risk secondary to parametrial, resection margin, and/or lymph node involvement. When disease was confined to the cervix, four (4%) patients were considered at intermediate risk. At univariate analysis, FIGO grade 3, stage IIB disease at diagnosis, and the presence of enlarged nodes before NACT predicted the presence of intermediate- and/or high-risk factors at final pathological analysis. At multivariate analysis, only FIGO grade 3 and tumor diameter maintained statistical significance. The specificity of ANN models in evaluating predictive variables was slightly superior to that of conventional multivariable models. FIGO grade, stage, tumor diameter, and histology are associated with persistence of pathological intermediate- and/or high-risk factors after NACT and radical surgery. This information is useful in counseling patients at the time of treatment planning with regard to the probability of being subjected to pelvic radiotherapy after completion of the initially planned treatment.
Statistical Analysis of Research Data | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.
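As a minimal illustration of the day-one topics listed above (descriptive statistics and one- and two-sample inference), the following sketch uses SciPy on simulated data; the values are hypothetical and are not course materials.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = rng.normal(5.2, 1.0, 30)                   # hypothetical measurements, group 1
y = rng.normal(4.8, 1.0, 30)                   # hypothetical measurements, group 2

print(np.mean(x), np.std(x, ddof=1))           # descriptive statistics
print(stats.ttest_1samp(x, popmean=5.0))       # one-sample inference vs a reference
print(stats.ttest_ind(x, y, equal_var=False))  # two-sample (Welch) inference
```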
Shaikh, Masood Ali
2017-09-01
Assessment of research articles in terms of the study designs used, statistical tests applied, and statistical analysis programmes employed helps determine the research activity profile and trends in the country. In this descriptive study, all original articles published by the Journal of Pakistan Medical Association (JPMA) and the Journal of the College of Physicians and Surgeons Pakistan (JCPSP) in the year 2015 were reviewed in terms of study designs used, application of statistical tests, and the use of statistical analysis programmes. JPMA and JCPSP published 192 and 128 original articles, respectively, in the year 2015. Results of this study indicate that the cross-sectional study design, bivariate inferential statistical analysis entailing comparison between two variables/groups, and the statistical software programme SPSS were the most common study design, inferential statistical analysis, and statistical analysis software programme, respectively. These results echo the previously published assessment of these two journals for the year 2014.
Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images
NASA Technical Reports Server (NTRS)
Fischer, Bernd
2004-01-01
Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach with symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems, which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model, which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.
Bohnert, Kipling M; Ríos-Bedoya, Carlos F; Breslau, Naomi
2009-12-01
Parental monitoring has been identified as a predictor of adolescent smoking initiation. However, it is uncertain if the association is uniform across different racial groups. Random samples of low birth-weight and normal birth-weight children were drawn from newborn discharge lists (1983-1985) of two major hospitals in southeast Michigan, one serving an inner city and the other serving suburbs. Assessments occurred at ages 6, 11, and 17 years. Statistical analysis was conducted on children with data on parent monitoring at age 11 and tobacco use at age 17 who had never smoked a cigarette up to age 11 (n = 572). Multiple logistic regression was used to examine the association between parent monitoring and children's smoking initiation. Two-way interactions were tested. The relationship between parent monitoring at age 11 and child smoking initiation from ages 11 to 17 varied by race. Among White children, an increase of 1 point on the parent monitoring scale signaled an 11% reduction in the odds of initiating smoking by age 17. In contrast, parent monitoring was not significantly associated with smoking initiation among Black children. The results suggest a differential influence of parent monitoring on adolescent smoking between White and Black children. Future research would benefit from close attention to parental goals and concerns and to extra-familial factors that shape smoking behavior across racially and socially disparate communities.
Impact of Soil Moisture Initialization on Seasonal Weather Prediction
NASA Technical Reports Server (NTRS)
Koster, Randal D.; Suarez, Max J.; Houser, Paul (Technical Monitor)
2002-01-01
The potential role of soil moisture initialization in seasonal forecasting is illustrated through ensembles of simulations with the NASA Seasonal-to-Interannual Prediction Project (NSIPP) model. For each boreal summer during 1997-2001, we generated two 16-member ensembles of 3-month simulations. The first, "AMIP-style" ensemble establishes the degree to which a perfect prediction of SSTs would contribute to the seasonal prediction of precipitation and temperature over continents. The second ensemble is identical to the first, except that the land surface is also initialized with "realistic" soil moisture contents through the continuous prior application (within GCM simulations leading up to the start of the forecast period) of a daily observational precipitation data set and the associated avoidance of model drift through the scaling of all surface prognostic variables. A comparison of the two ensembles shows that soil moisture initialization has a statistically significant impact on summertime precipitation and temperature over only a handful of continental regions. These regions agree, to first order, with regions that satisfy three conditions: (1) a tendency toward large initial soil moisture anomalies, (2) a strong sensitivity of evaporation to soil moisture, and (3) a strong sensitivity of precipitation to evaporation. The degree to which the initialization improves forecasts relative to observations is mixed, reflecting a critical need for the continued development of model parameterizations and data analysis strategies.
Bleidorn, Jutta; Hummers-Pradier, Eva; Schmiemann, Guido; Wiese, Birgitt; Gágyor, Ildikó
2016-01-01
Uncomplicated urinary tract infections (UTI) are common in general practice and are usually treated with antibiotics. Recurrent UTI often pose a serious problem for affected women. Little is known about recurrent UTI and complications when uncomplicated UTI are treated without antibiotics. With ICUTI (Immediate vs. conditional antibiotic use in uncomplicated UTI, funded by BMBF No. 01KG1105) we assessed whether initial symptomatic treatment with ibuprofen could be a treatment alternative for uncomplicated UTI. The presented analysis aims to assess the influence of initial (non-)antibiotic treatment on recurrent UTI rates and pyelonephritis after day 28 and up to 6 months after trial participation. This study is a retrospective long-term follow-up analysis of ICUTI patients, surveyed by telephone six months after inclusion in the trial. Recurrent UTI, pyelonephritis and hospitalizations were documented. Statistical evaluation was performed by descriptive and multivariate analyses with SPSS 21. For the six-month follow-up survey, 386 trial participants could be contacted (494 had been included in ICUTI initially; 446 had completed the trial). From day 28 until 6 months after inclusion in ICUTI, 84 recurrent UTI were reported by 80 patients. Univariate and multivariate analyses showed no effect of initial treatment group or antibiotic treatment on the number of patients with recurrent UTI. Yet, both analyses showed that patients with a history of previous UTI significantly more often had recurrent UTI. Pyelonephritis occurred in two patients in the antibiotic group and in one patient in the non-antibiotic group. This follow-up analysis of a trial comparing antibiotic vs. symptomatic treatment for uncomplicated UTI showed that non-antibiotic treatment has no negative impact on recurrent UTI rates or pyelonephritis after day 28 and up to six months after initial treatment. Thus, a four-week follow-up in UTI trials seems adequate.
Three Years of the New Mexico Laptop Learning Initiative (NMLLI): Stumbling toward Innovation
ERIC Educational Resources Information Center
Rutledge, David; Duran, James; Carroll-Miranda, Joseph
2007-01-01
This article presents qualitative results of the first three years of the New Mexico Laptop Learning Initiative (NMLLI). Results suggest that teachers, students, and their communities support this initiative to improve student learning. Descriptive statistics were used during year two to further understand how the laptops were being used by…
NASA Astrophysics Data System (ADS)
Taheri, Shaghayegh; Fevens, Thomas; Bui, Tien D.
2017-02-01
Computerized assessments for diagnosis or malignancy grading of cyto-histopathological specimens have drawn increased attention in the field of digital pathology. Automatic segmentation of cell nuclei is a fundamental step in such automated systems. Despite considerable research, nuclei segmentation is still a challenging task due to noise, nonuniform illumination, and, most importantly in 2D projection images, overlapping and touching nuclei. In most published approaches, nuclei refinement is a post-processing step after segmentation, which usually refers to the task of detaching aggregated nuclei or merging over-segmented nuclei. In this work, we present a novel segmentation technique which effectively addresses the problem of individually segmenting touching or overlapping cell nuclei during the segmentation process. The proposed framework is a region-based segmentation method, which consists of three major modules: i) the image is passed through a color deconvolution step to extract the desired stains; ii) the generalized fast radial symmetry (GFRS) transform is applied to the image, followed by non-maxima suppression, to specify the initial seed points for nuclei and their corresponding GFRS ellipses, which are interpreted as the initial nuclei borders for segmentation; iii) finally, these initial nuclei border curves are evolved through a statistical level-set approach with topology-preserving criteria, so that segmentation and separation of nuclei occur at the same time. The proposed method is evaluated using Hematoxylin and Eosin, and fluorescent stained images, performing qualitative and quantitative analysis, showing that the method outperforms thresholding and watershed segmentation approaches.
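A rough sketch of steps (i) and (ii) of this pipeline is given below, assuming scikit-image: colour deconvolution uses rgb2hed, and plain local-maxima detection stands in for the generalized fast radial symmetry transform, which is not available off the shelf. The input tile is random noise purely so the snippet runs; a real H&E image would be loaded instead.

```python
import numpy as np
from skimage.color import rgb2hed
from skimage.feature import peak_local_max
from skimage.filters import gaussian

# hypothetical RGB tile; in practice this would be read from an H&E slide image
rgb = np.random.rand(128, 128, 3)

hed = rgb2hed(rgb)                            # step (i): colour deconvolution
hematoxylin = gaussian(hed[..., 0], sigma=2)  # nuclei stain channel, smoothed

# stand-in for step (ii): local maxima as initial nuclei seed points
# (the paper uses the generalized fast radial symmetry transform instead)
seeds = peak_local_max(hematoxylin, min_distance=5)
print(f"{len(seeds)} candidate seed points")
```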
Short communication: Genetic association between schizophrenia and cannabis use.
Verweij, Karin J H; Abdellaoui, Abdel; Nivard, Michel G; Sainz Cort, Alberto; Ligthart, Lannie; Draisma, Harmen H M; Minică, Camelia C; Gillespie, Nathan A; Willemsen, Gonneke; Hottenga, Jouke-Jan; Boomsma, Dorret I; Vink, Jacqueline M
2017-02-01
Previous studies have shown a relationship between schizophrenia and cannabis use. As both traits are substantially heritable, a shared genetic liability could explain the association. We use two recently developed genomics methods to investigate the genetic overlap between schizophrenia and cannabis use. Firstly, polygenic risk scores for schizophrenia were created based on summary statistics from the largest schizophrenia genome-wide association (GWA) meta-analysis to date. We analysed the association between these schizophrenia polygenic scores and multiple cannabis use phenotypes (lifetime use, regular use, age at initiation, and quantity and frequency of use) in a sample of 6,931 individuals. Secondly, we applied LD-score regression to the GWA summary statistics of schizophrenia and lifetime cannabis use to calculate the genome-wide genetic correlation. Polygenic risk scores for schizophrenia were significantly (α<0.05) associated with five of the eight cannabis use phenotypes, including lifetime use, regular use, and quantity of use, with risk scores explaining up to 0.5% of the variance. Associations were not significant for age at initiation of use and two measures of frequency of use analyzed in lifetime users only, potentially because of reduced power due to a smaller sample size. The LD-score regression revealed a significant genetic correlation of r_g = 0.22 (SE = 0.07, p = 0.003) between schizophrenia and lifetime cannabis use. Common genetic variants underlying schizophrenia and lifetime cannabis use are partly overlapping. Individuals with a stronger genetic predisposition to schizophrenia are more likely to initiate cannabis use, use cannabis more regularly, and consume more cannabis over their lifetime. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
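A polygenic risk score of the kind used here is a weighted sum of risk-allele dosages, with weights taken from GWA summary statistics; the sketch below uses simulated dosages and weights, since the real effect sizes are not reproduced in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_snps = 500, 1000
dosages = rng.integers(0, 3, size=(n_people, n_snps))  # 0/1/2 risk-allele counts
weights = rng.normal(0, 0.05, n_snps)  # stand-in for GWA effect sizes (log odds)

prs = dosages @ weights                # polygenic risk score per person
prs = (prs - prs.mean()) / prs.std()   # standardize before association testing
print(prs[:5])
```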
Alasil, Tarek; Wang, Kaidi; Yu, Fei; Field, Matthew G.; Lee, Hang; Baniasadi, Neda; de Boer, Johannes F.; Coleman, Anne L.; Chen, Teresa C.
2015-01-01
Purpose: To determine the retinal nerve fiber layer (RNFL) thickness at which visual field (VF) damage becomes detectable and associated with structural loss. Design: Retrospective cross-sectional study. Methods: Eighty-seven healthy and 108 glaucoma subjects (one eye per subject) were recruited from an academic institution. All patients had VF examinations (Swedish Interactive Threshold Algorithm 24-2 test of the Humphrey visual field analyzer 750i; Carl Zeiss Meditec, Dublin, CA) and spectral domain optical coherence tomography RNFL scans (Spectralis, Heidelberg Engineering, Heidelberg, Germany). Comparison of RNFL thickness values with VF threshold values showed a plateau of VF threshold values at high RNFL thickness values and then a sharp decrease at lower RNFL thickness values. A broken stick statistical analysis was utilized to estimate the tipping point at which RNFL thickness values are associated with VF defects. The slope for the association between structure and function was computed for data above and below the tipping point. Results: The mean RNFL thickness value associated with initial VF loss was 89 μm. The superior RNFL thickness value associated with initial corresponding inferior VF loss was 100 μm. The inferior RNFL thickness value associated with initial corresponding superior VF loss was 73 μm. The differences between all the slopes above and below the aforementioned tipping points were statistically significant (p<0.001). Conclusions: In open angle glaucoma, substantial RNFL thinning or structural loss appears to be necessary before functional visual field defects become detectable. PMID:24487047
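The broken-stick analysis can be illustrated as a two-segment regression with an estimated breakpoint; the sketch below fits such a model to simulated RNFL/VF data with a plateau near 89 μm, mimicking the reported tipping point. The functional form, units, and noise level are assumptions, not the study's exact model.

```python
import numpy as np
from scipy.optimize import curve_fit

def broken_stick(x, bp, a, b1, b2):
    """Two-segment line joined at breakpoint bp (the 'tipping point')."""
    return np.where(x < bp, a + b1 * (x - bp), a + b2 * (x - bp))

rng = np.random.default_rng(3)
rnfl = rng.uniform(40, 120, 200)                           # hypothetical RNFL (um)
vf = np.where(rnfl > 89, 30.0, 30.0 + 0.5 * (rnfl - 89))   # plateau, then decline
vf += rng.normal(0, 1, rnfl.size)                          # measurement noise

(bp, a, b1, b2), _ = curve_fit(broken_stick, rnfl, vf, p0=[80, 30, 0.5, 0.0])
print(f"estimated tipping point: {bp:.1f} um")
```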
Assessing Threat Detection Scenarios through Hypothesis Generation and Testing
2015-12-01
[Fragmentary abstract excerpt: the report cites Field, A. (2005), Discovering Statistics Using SPSS (2nd ed.), Thousand Oaks, CA: Sage Publications, and Fisher, S. D., Gettys, C. F., et al.; F statistics are reported using the Huynh-Feldt correction (Greenhouse-Geisser epsilon > .775); a regression predicted change in hypothesis from experience and initial confidence, and in the Dog Day scenario the regression was not statistically significant.]
Digest of Education Statistics 2014, 50th Edition. NCES 2016-006
ERIC Educational Resources Information Center
Snyder, Thomas D.; de Brey, Cristobal; Dillow, Sally A.
2016-01-01
The 2014 edition of the "Digest of Education Statistics" is the 50th in a series of publications initiated in 1962. The Digest has been issued annually except for combined editions for the years 1977-78, 1983-84, and 1985-86. Its primary purpose is to provide a compilation of statistical information covering the broad field of American…
ERIC Educational Resources Information Center
Schmid, Kendra K.; Blankenship, Erin E.; Kerby, April T.; Green, Jennifer L.; Smith, Wendy M.
2014-01-01
The statistical preparation of in-service teachers, particularly middle school teachers, has been an area of concern for several years. This paper discusses the creation and delivery of an introductory statistics course as part of a master's degree program for in-service mathematics teachers. The initial course development took place before the…
Experimental demonstration of a two-phase population extinction hazard
Drake, John M.; Shapiro, Jeff; Griffen, Blaine D.
2011-01-01
Population extinction is a fundamental biological process with applications to ecology, epidemiology, immunology, conservation biology and genetics. Although a monotonic relationship between initial population size and mean extinction time is predicted by virtually all theoretical models, attempts at empirical demonstration have been equivocal. We suggest that this anomaly is best explained with reference to the transient properties of ensembles of populations. Specifically, we submit that under experimental conditions, many populations escape their initially vulnerable state to reach quasi-stationarity, where effects of initial conditions are erased. Thus, populations initialized far from quasi-stationarity may be exposed to a two-phase extinction hazard. An empirical prediction of this theory is that the fitted Cox proportional hazards regression model for the observed survival time distribution of a group of populations will violate the proportional hazards assumption early in the experiment, but not at later times. We report results of two experiments with the cladoceran zooplankton Daphnia magna designed to exhibit this phenomenon. In one experiment, habitat size was also varied. Statistical analysis showed that in one of these experiments a transformation occurred so that very early in the experiment there existed a transient phase during which the extinction hazard was primarily owing to the initial population size, and that this was gradually replaced by a more stable quasi-stationary phase. In the second experiment, only habitat size unambiguously displayed an effect. Analysis of data pooled from both experiments suggests that the overall extinction time distribution in this system results from the mixture of extinctions during the initial rapid phase, during which the effects of initial population size can be considerable, and a longer quasi-stationary phase, during which only habitat size has an effect. These are the first results, to our knowledge, of a two-phase population extinction process. PMID:21429907
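The empirical prediction above, violation of the proportional hazards assumption early but not late, can be checked with standard tooling; the sketch below fits a Cox model and runs a proportional-hazards test, assuming the lifelines package and simulated two-phase extinction times (the data-generating choices are illustrative only).

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(7)
n = 200
init_size = rng.integers(5, 50, n)                 # initial population size
# hypothetical two-phase times: initial size matters early, then washes out
t = rng.exponential(5 + 0.1 * init_size) + rng.exponential(10.0, n)
df = pd.DataFrame({"T": t, "E": 1, "init_size": init_size})

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
res = proportional_hazard_test(cph, df, time_transform="rank")
res.print_summary()  # an early-time PH violation would surface here
```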
Predicting Subsequent Myopia in Initially Pilot-Qualified USAFA Cadets.
1985-12-27
[Fragmentary report excerpt: DTIC form residue and table-of-contents entries (4.0 Results; 4.1 Descriptive Statistics; 4.2 Predictive Statistics). The surviving text notes that one commissionable subject's data were dropped from the statistical analyses, that three subjects were missing a status, and that relatively equal numbers of participants from all classes appear within the results.]
Mathematical Analysis of a Coarsening Model with Local Interactions
NASA Astrophysics Data System (ADS)
Helmers, Michael; Niethammer, Barbara; Velázquez, Juan J. L.
2016-10-01
We consider particles on a one-dimensional lattice whose evolution is governed by nearest-neighbor interactions where particles that have reached size zero are removed from the system. Concentrating on configurations with infinitely many particles, we prove existence of solutions under a reasonable density assumption on the initial data and show that the vanishing of particles and the localized interactions can lead to non-uniqueness. Moreover, we provide a rigorous upper coarsening estimate and discuss generic statistical properties as well as some non-generic behavior of the evolution by means of heuristic arguments and numerical observations.
Update on ONC's Substellar IMF: A Second Peak in the Brown Dwarf Regime
NASA Astrophysics Data System (ADS)
Drass, Holger; Bayo, A.; Chini, R.; Haas, M.
2017-06-01
The Orion Nebula Cluster (ONC) has become the prototype cluster for studying the Initial Mass Function (IMF). In a deep JHK survey of the ONC with HAWK-I, we detected a large population of 900 brown dwarf and planetary-mass object candidates, presenting a pronounced second peak in the substellar IMF. One of the most obvious issues with this result is the verification of cluster membership. The analysis so far was mainly based on statistical considerations. In this presentation I will show the results of using different high-resolution extinction maps to determine ONC membership.
DOT National Transportation Integrated Search
1998-09-01
In 1971, the Louisiana Department of Transportation and Development initiated a statistically based specification system for asphaltic concrete using historically generated data. A Materials Test Data (MATT) reporting system was also started to archi...
Statistical analyses of commercial vehicle accident factors. Volume 1 Part 1
DOT National Transportation Integrated Search
1978-02-01
Procedures for conducting statistical analyses of commercial vehicle accidents have been established and initially applied. A file of some 3,000 California Highway Patrol accident reports from two areas of California during a period of about one year...
The Essential Genome of Escherichia coli K-12.
Goodall, Emily C A; Robinson, Ashley; Johnston, Iain G; Jabbari, Sara; Turner, Keith A; Cunningham, Adam F; Lund, Peter A; Cole, Jeffrey A; Henderson, Ian R
2018-02-20
Transposon-directed insertion site sequencing (TraDIS) is a high-throughput method coupling transposon mutagenesis with short-fragment DNA sequencing. It is commonly used to identify essential genes. Single-gene deletion libraries are considered the gold standard for identifying essential genes. Currently, the TraDIS method has not been benchmarked against such libraries, and therefore, it remains unclear whether the two methodologies are comparable. To address this, a high-density transposon library was constructed in Escherichia coli K-12. Essential genes predicted from sequencing of this library were compared to existing essential gene databases. To decrease false-positive identification of essential genes, statistical data analysis included corrections for both gene length and genome length. Through this analysis, new essential genes and genes previously incorrectly designated essential were identified. We show that manual analysis of TraDIS data reveals novel features that would not have been detected by statistical analysis alone. Examples include short essential regions within genes, orientation-dependent effects, and fine-resolution identification of genome and protein features. Recognition of these insertion profiles in transposon mutagenesis data sets will assist genome annotation of less well characterized genomes and provides new insights into bacterial physiology and biochemistry. IMPORTANCE: Incentives to define lists of genes that are essential for bacterial survival include the identification of potential targets for antibacterial drug development, genes required for rapid growth for exploitation in biotechnology, and discovery of new biochemical pathways. To identify essential genes in Escherichia coli, we constructed a transposon mutant library of unprecedented density. Initial automated analysis of the resulting data revealed many discrepancies compared to the literature. We now report more extensive statistical analysis supported by both literature searches and detailed inspection of high-density TraDIS sequencing data for each putative essential gene for the E. coli model laboratory organism. This paper is important because it provides a better understanding of the essential genes of E. coli, reveals the limitations of relying on automated analysis alone, and provides a new standard for the analysis of TraDIS data. Copyright © 2018 Goodall et al.
Analysis of Variance: What Is Your Statistical Software Actually Doing?
ERIC Educational Resources Information Center
Li, Jian; Lomax, Richard G.
2011-01-01
Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…
Pan, Larry; Baek, Seunghee; Edmonds, Pamela R; Roach, Mack; Wolkov, Harvey; Shah, Satish; Pollack, Alan; Hammond, M Elizabeth; Dicker, Adam P
2013-04-25
Angiogenesis is a key element in solid-tumor growth, invasion, and metastasis. VEGF is among the most potent angiogenic factors thus far detected. The aim of the present study is to explore the potential of VEGF (also known as VEGF-A) as a prognostic and predictive biomarker among men with locally advanced prostate cancer. The analysis was performed using patients enrolled on RTOG 8610, a phase III randomized control trial of radiation therapy alone (Arm 1) versus short-term neoadjuvant and concurrent androgen deprivation and radiation therapy (Arm 2) in men with locally advanced prostate carcinoma. Tissue samples were obtained from the RTOG tissue repository. Hematoxylin and eosin slides were reviewed, and paraffin blocks were immunohistochemically stained for VEGF expression and graded by intensity score (0-3). Cox or Fine and Gray's proportional hazards models were used. Sufficient pathologic material was available from 103 (23%) of the 456 analyzable patients enrolled in the RTOG 8610 study. There were no statistically significant differences in the pre-treatment characteristics between the patient groups with and without VEGF intensity data. Median follow-up for all surviving patients with VEGF intensity data is 12.2 years. Univariate and multivariate analyses demonstrated no statistically significant correlation between the intensity of VEGF expression and overall survival, distant metastasis, local progression, disease-free survival, or biochemical failure. VEGF expression was also not statistically significantly associated with any of the endpoints when analyzed by treatment arm. This study revealed no statistically significant prognostic or predictive value of VEGF expression for locally advanced prostate cancer. This analysis is among the largest sample bases with long-term follow-up in a well-characterized patient population. There is an urgent need to establish multidisciplinary initiatives for coordinating further research in the area of human prostate cancer biomarkers.
Hierarchical multivariate covariance analysis of metabolic connectivity.
Carbonell, Felix; Charil, Arnaud; Zijdenbos, Alex P; Evans, Alan C; Bedell, Barry J
2014-12-01
Conventional brain connectivity analysis is typically based on the assessment of interregional correlations. Given that correlation coefficients are derived from both covariance and variance, group differences in covariance may be obscured by differences in the variance terms. To facilitate a comprehensive assessment of connectivity, we propose a unified statistical framework that interrogates the individual terms of the correlation coefficient. We have evaluated the utility of this method for metabolic connectivity analysis using [18F]2-fluoro-2-deoxyglucose (FDG) positron emission tomography (PET) data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) study. As an illustrative example of the utility of this approach, we examined metabolic connectivity in angular gyrus and precuneus seed regions of mild cognitive impairment (MCI) subjects with low and high β-amyloid burdens. This new multivariate method allowed us to identify alterations in the metabolic connectome, which would not have been detected using classic seed-based correlation analysis. Ultimately, this novel approach should be extensible to brain network analysis and broadly applicable to other imaging modalities, such as functional magnetic resonance imaging (MRI).
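A minimal sketch of why interrogating the individual terms matters: the two simulated groups below share the same underlying signal but differ in regional variance, so seed covariances differ between groups while seed correlations do not, which is exactly the kind of difference a correlation-only analysis would miss. All data and region counts are hypothetical.

```python
import numpy as np

def seed_connectivity(X, seed):
    """Return covariance and correlation of every region with a seed region."""
    C = np.cov(X, rowvar=False)
    sd = np.sqrt(np.diag(C))
    return C[seed], C[seed] / (sd * sd[seed])

rng = np.random.default_rng(0)
shared = rng.normal(size=(100, 5))               # common signal, 5 "regions"
g1 = shared
g2 = shared * np.array([1, 1, 2, 1, 1])          # inflate variance of region 2

for name, X in (("group 1", g1), ("group 2", g2)):
    cov, corr = seed_connectivity(X, seed=0)
    print(name, "cov:", cov.round(2), "corr:", corr.round(2))
```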
NASA Technical Reports Server (NTRS)
Marr, Greg C.
2003-01-01
The Triana spacecraft was designed to be launched by the Space Shuttle. The nominal Triana mission orbit will be a Sun-Earth L1 libration point orbit. Using the NASA Goddard Space Flight Center's Orbit Determination Error Analysis System (ODEAS), orbit determination (OD) error analysis results are presented for all phases of the Triana mission from the first correction maneuver through approximately launch plus 6 months. Results are also presented for the science data collection phase of the Fourier Kelvin Stellar Interferometer Sun-Earth L2 libration point mission concept with momentum unloading thrust perturbations during the tracking arc. The Triana analysis includes extensive analysis of an initial short arc orbit determination solution and results using both Deep Space Network (DSN) and commercial Universal Space Network (USN) statistics. These results could be utilized in support of future Sun-Earth libration point missions.
Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun
2018-01-01
To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis information in real time. Descriptive statistics, time-series analysis, and multivariate regression analysis were implemented online using SQL and visualization tools on top of the database software. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connections to the database; and generates interface tables that can be exported directly to R, SAS, and SPSS. The air pollution and health impact monitoring information system implements the statistical analysis function online and can provide real-time analysis results to its users.
Intensity changes in future extreme precipitation: A statistical event-based approach.
NASA Astrophysics Data System (ADS)
Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen
2017-04-01
Short-lived precipitation extremes are often responsible for hazards in urban and rural environments, with economic and environmental consequences. Precipitation intensity is expected to increase about 7% per degree of warming, according to the Clausius-Clapeyron (CC) relation. However, observations often show a much stronger increase in sub-daily values. In particular, the behavior of hourly summer precipitation from radar observations with dew point temperature (the Pi-Td relation) for the Netherlands suggests that on moderate to warm days the intensification of precipitation can be even higher than 21% per degree of warming, that is, 3 times the expected CC relation. The rate of change depends on the initial precipitation intensity: low percentiles increase at a rate below CC, medium percentiles at 2CC, and moderate-high and high percentiles at 3CC. This non-linear statistical Pi-Td relation is suggested for use as a delta-transformation to project how a historic extreme precipitation event would intensify under future, warmer conditions. Here, the Pi-Td relation is applied to a selected historic extreme precipitation event to 'up-scale' its intensity to warmer conditions. Additionally, the selected historic event is simulated in the high-resolution, convection-permitting weather model Harmonie. The initial and boundary conditions are altered to represent future conditions. The comparison between the statistical and the numerical method of projecting the historic event to future conditions showed comparable intensity changes, which, depending on the initial percentile intensity, range from below CC to a 3CC rate of change per degree of warming. The model tends to overestimate the future intensities for the low and the very high percentiles, and the clouds are somewhat displaced due to small wind and convection changes. The total spatial cloud coverage in the model remains, as in the statistical method, unchanged. The advantages of the suggested Pi-Td method of projecting future precipitation events from historic events are that it is simple to use and less expensive in time, computation, and resources than a numerical model. The outcome can be used directly for hydrological and climatological studies and for impact analysis such as flood risk assessments.
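A sketch of the delta-transformation as described: scale a historic intensity by a percentile-dependent rate per degree of dew point warming. The interpolation nodes below (sub-CC for low percentiles, 2CC for medium, 3CC for high) are an assumption loosely following the abstract, not the authors' calibrated Pi-Td curve.

```python
import numpy as np

def scale_intensity(p_hist, percentile, dtd):
    """Scale a historic precipitation intensity to a warmer dew point.

    Rates are assumed: ~CC (7 %/K) at the median, 2CC (14 %/K) around the
    90th percentile, 3CC (21 %/K) at the highest percentiles.
    """
    rate = np.interp(percentile, [50, 90, 99], [0.07, 0.14, 0.21])
    return p_hist * (1 + rate) ** dtd

# a hypothetical 99th-percentile hourly sum of 30 mm, projected for +2 K
print(f"{scale_intensity(30.0, 99, 2.0):.1f} mm")   # ~43.9 mm at a 3CC rate
```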
Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley
2017-03-01
Cancer is the most rapidly spreading disease in the world, especially in developing countries, including Libya. Cancer represents a significant burden on patients, families, and their societies, but the disease can be controlled if detected early. Disease mapping has therefore recently become an important method in public health research and disease epidemiology, and the correct choice of statistical model is a very important step toward producing a good disease map. Libya was selected for this work in order to examine geographical variation in the incidence of lung cancer, and the objective of this paper is to estimate the relative risk for lung cancer. Four statistical models, together with population censuses of the study area for the period 2006 to 2011, were used to estimate the relative risk: the Standardized Morbidity Ratio (SMR), the most popular statistic in disease mapping; the Poisson-gamma model, one of the earliest applications of Bayesian methodology; the Besag, York and Mollié (BYM) model; and the Mixture model. As an initial step, this study provides a review of all proposed models, which are then applied to lung cancer data in Libya. Maps, tables, graphs, and goodness-of-fit (GOF) criteria, which are commonly used in statistical modelling to compare fitted models, were used to compare and present the preliminary results. The main results show that the Poisson-gamma, BYM, and Mixture models can overcome the problem of the first model (SMR) when there are no observed lung cancer cases in certain districts. The Mixture model is the most robust and provides better relative risk estimates across the range of models considered.
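To make the SMR's zero-case problem concrete, the following sketch contrasts the SMR with a Poisson-gamma (empirical Bayes) estimate on toy district counts; the hyperparameters and data are illustrative, not values from the study.

```python
import numpy as np

# Toy district data: observed lung cancer cases and expected counts
# (expected = district population x overall incidence rate).
observed = np.array([0.0, 3.0, 7.0, 12.0])
expected = np.array([1.2, 2.5, 6.8, 9.0])

# Standardized Morbidity Ratio: collapses to 0 wherever no cases are
# observed, regardless of district population -- the problem noted above.
smr = observed / expected

# Poisson-gamma (empirical Bayes) smoothing with a Gamma(a, b) prior on
# the relative risk; a and b are placeholder hyperparameters, not fitted.
a, b = 1.0, 1.0
rr_pg = (observed + a) / (expected + b)

for i, (s, r) in enumerate(zip(smr, rr_pg)):
    print(f"district {i}: SMR = {s:.2f}, Poisson-gamma RR = {r:.2f}")
```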
Mohammadi, Mohammad Javad; Takdastan, Afshin; Jorfi, Sahand; Neisi, Abdolkazem; Farhadi, Majid; Yari, Ahmad Reza; Dobaradaran, Sina; Khaniabadi, Yusef Omidi
2017-04-01
In this work, we present the results of an electrocoagulation process with iron and aluminum electrodes for the removal of chemical and biological oxygen demand (COD and BOD) from grey water collected at different car washes in Ahvaz, Iran. Car wash effluent is nowadays one of the important hazards that can contaminate water resources used for drinking, agriculture, and industry [1,2]. In this study, the effects of initial COD and BOD concentration, solution pH, applied voltage, and reaction time were investigated. The remaining COD and BOD concentrations in the samples were measured using a HACH DR/5000 UV-Vis spectrophotometer [3,4]. The effects of contact time, initial pH, and applied voltage on COD and BOD removal are presented. Statistical analysis of the data was carried out using the Statistical Package for the Social Sciences (SPSS, version 16).
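The percent-removal figure that such studies report is the standard efficiency formula; a short sketch with illustrative (not measured) concentrations:

```python
def removal_efficiency(c_initial, c_final):
    """Percent removal of a pollutant (e.g. COD or BOD) from grey water."""
    return 100.0 * (c_initial - c_final) / c_initial

# Illustrative concentrations in mg/L, not measurements from the study.
print(f"COD removal: {removal_efficiency(850.0, 120.0):.1f}%")
print(f"BOD removal: {removal_efficiency(400.0, 70.0):.1f}%")
```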
Pencina, Michael J; Louzao, Darcy M; McCourt, Brian J; Adams, Monique R; Tayyabkhan, Rehbar H; Ronco, Peter; Peterson, Eric D
2016-02-01
There are growing calls for sponsors to increase transparency by providing access to clinical trial data. In response, Bristol-Myers Squibb and the Duke Clinical Research Institute have collaborated on a new initiative, Supporting Open Access to Researchers. The aim is to facilitate open sharing of Bristol-Myers Squibb trial data with interested researchers. Key features of the Supporting Open Access to Researchers data sharing model include an independent review committee that ensures expert consideration of each proposal, stringent data deidentification/anonymization and protection of patient privacy, requirement of prespecified statistical analysis plans, and independent review of manuscripts before submission for publication. We believe that these approaches will promote open science by allowing investigators to verify trial results as well as to pursue interesting secondary uses of trial data without compromising scientific integrity.
[The dynamic binding of resources for health in Brazil: 1995 to 2004].
de França, José Rivaldo Melo; do Rosário Costa, Nilson
2011-01-01
The aim of this study is to discuss federal participation in the financing of the Brazilian health sector from 1995 to 2004, based on the resources of the institutional initiatives related to the indexation of expenditures, in light of the macroeconomic adjustment practiced in that period and the role of institutions in protecting the sector's cash flow. It examines the performance of the institutional mechanisms actually adopted to guarantee the regularity and size of the flow of funds, through analysis of the Temporary Contribution on Financial Transactions (TCFM) and Constitutional Amendment No. 29 (CA 29), initiatives whose efficiency has been questioned. The study demonstrates the impact of these measures through statistical analysis of the use of TCFM resources relative to the amounts levied, and of the effects of CA 29 on the indexation of federal (Union) resources before and after the application of its provisions.
Climate Change: Modeling the Human Response
NASA Astrophysics Data System (ADS)
Oppenheimer, M.; Hsiang, S. M.; Kopp, R. E.
2012-12-01
Integrated assessment models have historically relied on forward modeling, including, where possible, process-based representations, to project climate change impacts. Some recent impact studies incorporate the effects of human responses to initial physical impacts, such as adaptation in agricultural systems, migration in response to drought, and climate-related changes in worker productivity. Sometimes the human response ameliorates the initial physical impact, sometimes it aggravates it, and sometimes it displaces it onto others. In these arenas, understanding of the underlying socioeconomic mechanisms is extremely limited. Consequently, for some sectors where sufficient data have accumulated, empirically based statistical models of human responses to past climate variability and change have been used to infer response sensitivities that may apply, under certain conditions, to future impacts, allowing a broad extension of integrated assessment into the realm of human adaptation. We discuss the insights gained from, and the limitations of, such modeling for benefit-cost analysis of climate change.
Hyperopic photorefractive keratectomy and central islands
NASA Astrophysics Data System (ADS)
Gobbi, Pier Giorgio; Carones, Francesco; Morico, Alessandro; Vigo, Luca; Brancato, Rosario
1998-06-01
We have evaluated the refractive evolution of patients treated with hyperopic PRK to assess the extent of the initial overcorrection and the time constant of regression. To this end, the time history of the refractive error (i.e., the difference between achieved and intended refractive correction) was fitted with an exponential statistical model, yielding parameters that characterize the surgical procedure with a direct clinical meaning. Both hyperopic and myopic PRK procedures were analyzed by this method. Analysis of the fitted model parameters shows that hyperopic PRK patients exhibit a markedly higher initial overcorrection than myopic ones, and a much longer regression time constant. A common mechanism is proposed to be responsible for the refractive outcomes in hyperopic treatments and in myopic patients exhibiting significant central islands. The interpretation is in terms of superhydration of the central cornea, and is based on a simple physical model evaluating the amount of centripetal compression in the apical cornea.
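A hedged sketch of fitting such an exponential regression model to follow-up refractive errors; the parameterization (initial overcorrection, time constant, residual plateau) and the toy data are assumptions, not the paper's.

```python
import numpy as np
from scipy.optimize import curve_fit

def refractive_error(t, e0, tau, e_inf):
    """Initial overcorrection e0 decaying toward a residual error e_inf
    with time constant tau (months); the paper's exact parameterization
    may differ."""
    return e_inf + e0 * np.exp(-t / tau)

# Toy follow-up data (months, diopters), not the study's measurements.
t = np.array([0.25, 1.0, 3.0, 6.0, 12.0])
err = np.array([1.8, 1.4, 0.9, 0.5, 0.3])

(e0, tau, e_inf), _ = curve_fit(refractive_error, t, err, p0=(2.0, 3.0, 0.0))
print(f"overcorrection = {e0:.2f} D, time constant = {tau:.1f} months, "
      f"plateau = {e_inf:.2f} D")
```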
Arts, Science, Engineering and Medicine Collaborate to Educate Public on Bioenergetics.
Tompkins, Emily; Faris, Sarah; Hughes, Laura; Maurakis, Eugene; Lesnefsky, Edward Joseph; Rao, Raj Raghavendra; Iyer, Shilpa
2017-01-01
Mitochondrial dysfunction has been correlated with a rise in energy deficiency disorders (EDDs), which include mitochondrial disorders, obesity, metabolic disorders, and cardiovascular and neurodegenerative disorders. Many individuals in our communities are at high risk of developing these disorders, yet are unaware of it. Our goal was to increase public awareness of mitochondrial health while providing students with an innovative educational experience. We designed a 'Bioenergetics exhibition' by introducing Arts into the traditional STEM (Science, Technology, Engineering & Mathematics) disciplines to create a new STEAM-(Health) initiative. Approximately 120,000 guests visited the exhibition, including many school-aged children, teachers, and families. Comparative analysis of surveys of random first-time vs. repeat visitors demonstrated a statistically significant increase in knowledge of mitochondrial disease and bioenergetics (8.25%, p = 0.006). Our findings clearly support the power of the STEAM-H initiative in creatively communicating complex science to a broader community.
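One way such a first-time vs. repeat visitor comparison of proportions could be tested is a two-sample proportions z-test; the counts below are hypothetical, since the abstract does not report the underlying sample sizes.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts of visitors answering a knowledge item correctly;
# the exhibition's real survey sizes are not given in the abstract.
correct = [520, 610]   # first-time visitors, repeat visitors
asked = [1000, 1000]

stat, pval = proportions_ztest(count=correct, nobs=asked)
diff = correct[1] / asked[1] - correct[0] / asked[0]
print(f"knowledge gain = {diff:.1%}, p = {pval:.4f}")
```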
Experiences in using DISCUS for visualizing human communication
NASA Astrophysics Data System (ADS)
Groehn, Matti; Nieminen, Marko; Haho, Paeivi; Smeds, Riitta
2000-02-01
In this paper, we present further improvements to the DISCUS software, which can be used to record and analyze the flow and content of discussion in business process simulation sessions. The tool was initially introduced at the 'Visual Data Exploration and Analysis IV' conference. Its initial features enabled the visualization of discussion flow in business process simulation sessions and the creation of SOM analyses. The improvements to the tool consist of additional visualization possibilities that enable quick on-line analyses, and improved graphical statistics. We have also created the very first interface to audio data and implemented two ways to visualize it. We also outline additional possibilities for using the tool in other application areas, including usability testing and capturing design rationale in a product development process. The data gathered with DISCUS may be used in other applications, and further work may be done with data mining techniques.
Single-Molecule Probing the Energy Landscape of Enzymatic Reaction and Non-Covalent Interactions
NASA Astrophysics Data System (ADS)
Lu, H. Peter; Hu, Dehong; Chen, Yu; Vorpagel, Erich R.
2002-03-01
We have applied single-molecule spectroscopy under physiological conditions to study the mechanisms and dynamics of T4 lysozyme enzymatic reactions, characterizing mode-specific protein conformational dynamics. Enzymatic reaction turnovers and the associated structural changes of individual protein molecules were observed simultaneously in real time. The overall reaction rates were found to vary widely from molecule to molecule, and the initial non-specific binding of the enzyme to the substrate was seen to dominate this inhomogeneity. The reaction steps subsequent to the initial binding were found to have homogeneous rates. Molecular dynamics (MD) simulation was applied to elucidate the mechanism and intermediate states of the single-molecule enzymatic reaction. Combining the analysis of single-molecule experimental trajectories, MD simulation trajectories, and statistical modeling, we have revealed the nature of the multiple intermediate states involved in the formation of the active enzyme-substrate complex and the associated conformational change mechanism and dynamics.
Second trimester serum cortisol and preterm birth: an analysis by timing and subtype.
Bandoli, Gretchen; Jelliffe-Pawlowski, Laura L; Feuer, Sky K; Liang, Liang; Oltman, Scott P; Paynter, Randi; Ross, Kharah M; Schetter, Christine Dunkel; Ryckman, Kelli K; Chambers, Christina D
2018-05-24
We hypothesized second trimester serum cortisol would be higher in spontaneous preterm births compared to provider-initiated (previously termed 'medically indicated') preterm births. We used a nested case-control design with a sample of 993 women with live births. Cortisol was measured from serum samples collected as part of routine prenatal screening. We tested whether mean-adjusted cortisol fold-change differed by gestational age at delivery or preterm birth subtype using multivariable linear regression. An inverse association between cortisol and gestational age category (trend p = 0.09) was observed. Among deliveries prior to 37 weeks, the mean-adjusted cortisol fold-change values were highest for preterm premature rupture of the membranes (1.10), followed by premature labor (1.03) and provider-initiated preterm birth (1.01), although they did not differ statistically. Cortisol continues to be of interest as a marker of future preterm birth. Augmentation with additional biomarkers should be explored.
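A sketch of the kind of multivariable linear model described, run on synthetic data; the variable names, covariate, and distributions are assumptions for illustration, not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data standing in for the cohort (n = 993 in the study).
rng = np.random.default_rng(0)
n = 993
df = pd.DataFrame({
    "log_fold": rng.normal(0.0, 0.2, n),        # log cortisol fold-change
    "ga_cat": rng.integers(0, 4, n),            # 0 = term ... 3 = earliest
    "maternal_age": rng.normal(30.0, 5.0, n),   # an assumed covariate
})

# Multivariable linear regression of fold-change on gestational-age
# category, adjusted for a covariate, as the abstract describes.
fit = smf.ols("log_fold ~ C(ga_cat) + maternal_age", data=df).fit()
print(fit.summary().tables[1])
```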
Initial assessment of hearing loss using a mobile application for audiological evaluation.
Derin, S; Cam, O H; Beydilli, H; Acar, E; Elicora, S S; Sahan, M
2016-03-01
This study aimed to compare an Apple iOS mobile operating system application for audiological evaluation with conventional audiometry, and to determine its accuracy and reliability in the initial evaluation of hearing loss. The study comprised 32 patients (16 females) diagnosed with hearing loss. The patients were first evaluated with conventional audiometry and the degree of hearing loss was recorded. Then they underwent a smartphone-based hearing test and the data were compared using Cohen's kappa analysis. Patients' mean age was 53.59 ± 18.01 years (range, 19-85 years). The mobile phone audiometry results for 39 of the 64 ears were fully compatible with the conventional audiometry results. There was a statistically significant concordant relationship between the two sets of audiometry results (p < 0.05). Ear Trumpet version 1.0.2 is a compact and simple mobile application on the Apple iPhone 5 that can measure hearing loss with reliable results.
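Cohen's kappa, the agreement statistic used in the study, can be computed directly; the per-ear grades below are hypothetical, not the study's data.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-ear hearing-loss grades (0 = normal ... 3 = severe)
# from conventional audiometry vs. the mobile app; not the study's data.
conventional = [0, 1, 1, 2, 2, 3, 0, 1, 2, 3]
mobile_app = [0, 1, 2, 2, 2, 3, 0, 1, 2, 2]

kappa = cohen_kappa_score(conventional, mobile_app)
print(f"Cohen's kappa = {kappa:.2f}")  # agreement beyond chance
```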
Lee, Kian Mun; Hamid, Sharifah Bee Abd
2015-01-19
The performance of advanced photocatalytic degradation of 4-chlorophenoxyacetic acid (4-CPA) strongly depends on photocatalyst dosage, initial concentration, and initial pH. In the present study, a simple response surface methodology (RSM) was applied to investigate the interactions among these three independent factors. The photocatalytic degradation of 4-CPA in aqueous medium assisted by an ultraviolet-active ZnO photocatalyst was thus systematically investigated, with the aim of determining the optimum processing parameters to maximize 4-CPA degradation. Based on the results obtained, a maximum of 91% of the 4-CPA was successfully degraded under optimal conditions (0.02 g ZnO dosage, 20.00 mg/L of 4-CPA, and pH 7.71). All experimental data showed good agreement with the predicted results obtained from the statistical analysis.
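A minimal RSM-style sketch, fitting a second-order polynomial response surface over the three factors and searching a grid for the predicted optimum; the design points and responses are illustrative, not the paper's measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Toy design over the three factors: ZnO dosage (g), initial 4-CPA
# concentration (mg/L), and initial pH. Points and responses are invented.
X = np.array([[0.01, 10, 5], [0.01, 10, 9], [0.01, 30, 5], [0.01, 30, 9],
              [0.03, 10, 5], [0.03, 10, 9], [0.03, 30, 5], [0.03, 30, 9],
              [0.02, 20, 5], [0.02, 20, 9], [0.02, 20, 7], [0.02, 20, 8]])
y = np.array([60, 66, 58, 64, 70, 76, 68, 74, 80, 84, 89, 91])  # % degraded

# Second-order (quadratic) response surface, the usual RSM model form.
surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surface.fit(X, y)

# Coarse grid search for the predicted optimum within the design region.
grid = np.array([[d, c, p] for d in np.linspace(0.01, 0.03, 9)
                           for c in np.linspace(10, 30, 9)
                           for p in np.linspace(5, 9, 9)])
best = grid[surface.predict(grid).argmax()]
print("predicted optimum (dosage g, conc mg/L, pH):", best)
```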
Antoszewska-Smith, Joanna; Sarul, Michał; Łyczek, Jan; Konopka, Tomasz; Kawala, Beata
2017-03-01
The aim of this systematic review was to compare the effectiveness of orthodontic miniscrew implants-temporary intraoral skeletal anchorage devices (TISADs)-in anchorage reinforcement during en-masse retraction in relation to conventional methods of anchorage. A search of PubMed, Embase, the Cochrane Central Register of Controlled Trials, and Web of Science was performed using the keywords orthodontic, mini-implants, miniscrews, miniplates, and temporary anchorage device. Relevant articles were assessed for quality according to Cochrane guidelines, and the data were extracted for statistical analysis. A meta-analysis of raw mean differences in anchorage loss, tipping of molars, retraction of incisors, tipping of incisors, and treatment duration was carried out. Initially, 10,038 articles were retrieved; the selection process finally yielded 14 articles including 616 patients (451 female, 165 male) for detailed analysis. The quality of the included studies was assessed as moderate. The meta-analysis showed that TISADs facilitate better anchorage reinforcement than conventional methods, enabling on average 1.86 mm more anchorage preservation (P < 0.001). This average difference of approximately 2 mm seems not only statistically but also clinically significant. However, the results should be interpreted with caution because of the moderate quality of the included studies, and more high-quality studies on this issue are necessary to enable drawing more reliable conclusions.
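The core computation in such a meta-analysis, fixed-effect inverse-variance pooling of raw mean differences, can be sketched as follows; the per-study mean differences and standard errors are hypothetical, not the data extracted from the 14 included studies.

```python
import numpy as np

# Hypothetical per-study raw mean differences in anchorage loss
# (mm, TISAD minus conventional) with standard errors; invented values.
md = np.array([-1.5, -2.1, -1.8, -2.3])
se = np.array([0.30, 0.45, 0.25, 0.50])

w = 1.0 / se**2                       # inverse-variance weights
pooled = np.sum(w * md) / np.sum(w)   # fixed-effect pooled mean difference
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled MD = {pooled:.2f} mm, 95% CI ({lo:.2f}, {hi:.2f})")
```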
Access to Oral Osteoporosis Drugs among Female Medicare Part D Beneficiaries
Lin, Chia-Wei; Karaca-Mandic, Pinar; McCullough, Jeffrey S.; Weaver, Lesley
2014-01-01
Background For women living with osteoporosis, high out-of-pocket drug costs may prevent drug therapy initiation. We investigate the association between oral osteoporosis out-of-pocket medication costs and female Medicare beneficiaries’ initiation of osteoporosis drug therapy. Methods We used 2007 and 2008 administrative claims and enrollment data for a 5% random sample of Medicare beneficiaries. Our study sample included age-qualified, female beneficiaries who had no prior history of osteoporosis but were diagnosed with osteoporosis in 2007 or 2008. Additionally, we only included beneficiaries continuously enrolled in standalone prescription drug plans. We excluded beneficiaries who had a chronic condition that was contraindicated with osteoporosis drug utilization. Our final sample included 25,069 beneficiaries. Logistic regression analysis was used to examine the association between the out-of-pocket costs and initiation of oral osteoporosis drug therapy during the year of diagnosis. Findings Twenty-six percent of female Medicare beneficiaries newly diagnosed with osteoporosis initiated oral osteoporosis drug therapy. Beneficiaries’ out-of-pocket costs were not associated with the initiation of drug therapy for osteoporosis. However, there were statistically significant racial disparities in beneficiaries’ initiation of drug therapy. African Americans were 3 percentage points less likely to initiate drug therapy than whites. In contrast, Asian/Pacific Islander and Hispanic beneficiaries were 8 and 18 percentage points respectively more likely to initiate drug therapy than whites. Additionally, institutionalized beneficiaries were 11 percentage points less likely to initiate drug therapy than other beneficiaries. Conclusions Access barriers for drug therapy initiation may be driven by factors other than patients’ out-of-pocket costs. These results suggest that improved osteoporosis treatment requires a more comprehensive approach that goes beyond payment policies. PMID:24837398
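A sketch of the kind of logistic regression analysis described, run on synthetic claims-style data; the variable names, categories, and effect sizes are assumptions for illustration, not estimates from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the claims cohort (names and effects invented).
rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "oop_cost": rng.gamma(2.0, 20.0, n),                    # $ per fill
    "race": rng.choice(["white", "black", "hispanic"], n),
    "institutionalized": rng.integers(0, 2, n),
})
# Assumed data-generating process, loosely echoing the reported pattern:
# race and institutional status matter, out-of-pocket cost does not.
lin = -1.0 - 0.4 * (df.race == "black") + 0.8 * (df.race == "hispanic") \
      - 0.6 * df.institutionalized
df["initiated"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

fit = smf.logit("initiated ~ oop_cost + C(race) + institutionalized",
                data=df).fit(disp=0)
print(np.exp(fit.params))  # odds ratios for therapy initiation
```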