Descriptive Statistical Techniques for Librarians. 2nd Edition.
ERIC Educational Resources Information Center
Hafner, Arthur W.
A thorough understanding of the uses and applications of statistical techniques is integral to gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…
ERIC Educational Resources Information Center
Karadag, Engin
2010-01-01
To assess the research methods and statistical analysis techniques employed by educational researchers, this study surveyed unpublished doctoral dissertations from 2003 to 2007. Frequently used research methods included experimental research, surveys, correlational studies, and case studies. Descriptive statistics, t-test, ANOVA, factor…
A Multidisciplinary Approach for Teaching Statistics and Probability
ERIC Educational Resources Information Center
Rao, C. Radhakrishna
1971-01-01
The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/covariance (33%), chi-square test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data.
Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
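The survey above finds Student's t-test to be the single most common inferential method. As a minimal illustrative sketch (the data and function name here are hypothetical, not taken from the surveyed articles), the pooled two-sample t statistic can be computed with the standard library alone:

```python
from statistics import mean, variance

def pooled_t_statistic(a, b):
    """Two-sample Student's t statistic with pooled (equal-variance) SD."""
    na, nb = len(a), len(b)
    # Pooled sample variance, weighted by degrees of freedom
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# Hypothetical burn-unit measurements for two treatment groups
group_a = [10.1, 9.8, 10.4, 10.0, 9.9]
group_b = [11.2, 11.0, 10.8, 11.4, 11.1]
t = pooled_t_statistic(group_a, group_b)
```

In practice the statistic would be compared against the t distribution with na + nb − 2 degrees of freedom; statistical packages report the corresponding p-value directly.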
An overview of data acquisition, signal coding and data analysis techniques for MST radars
NASA Technical Reports Server (NTRS)
Rastogi, P. K.
1986-01-01
An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, which are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing are considered, including signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques.
Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria
2009-09-01
Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
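The central point above, that the level of measurement determines which statistical values are appropriate, can be sketched in a few lines (the data and function name are hypothetical, for illustration only):

```python
from collections import Counter
from statistics import mean, stdev, median

def describe(values, level):
    """Return the descriptive summary appropriate to the level of measurement."""
    if level == "metric":    # continuous, quantitative: location and spread
        return {"mean": mean(values), "sd": stdev(values), "median": median(values)}
    if level == "ordinal":   # ranked categories: median and frequencies
        return {"median": median(values), "counts": Counter(values)}
    return {"counts": Counter(values)}  # nominal: frequencies only

ages = [34, 45, 29, 51, 40]                  # metric variable
blood_groups = ["A", "O", "O", "B", "A"]     # nominal variable
```

Calling `describe(ages, "metric")` yields mean, SD, and median, while `describe(blood_groups, "nominal")` yields only a frequency table, mirroring the hierarchy described in the abstract.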
42 CFR 402.7 - Notice of proposed determination.
Code of Federal Regulations, 2010 CFR
2010-10-01
... and a brief description of the statistical sampling technique CMS or OIG used. (3) The reason why the... is relying upon statistical sampling to project the number and types of claims or requests for...
[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors increasingly make use of statistical experts' advice and statistical software. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
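The abstract names ROC analysis as the most common advanced technique. Its summary statistic, the area under the ROC curve (AUC), equals the probability that a randomly chosen positive case outscores a randomly chosen negative one; a minimal sketch with hypothetical diagnostic scores:

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney formulation: the fraction of
    positive/negative pairs in which the positive case scores higher
    (ties count one half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical classifier scores for diseased and healthy subjects
diseased = [0.9, 0.8, 0.7, 0.6]
healthy = [0.5, 0.4, 0.7, 0.2]
auc = roc_auc(diseased, healthy)
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation; real ROC software additionally traces the sensitivity/specificity curve across thresholds.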
Hayat, Matthew J.; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L.
2017-01-01
Statistical literacy and knowledge are needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top-tier general public health journals. Studies were reviewed by two readers and a standardized data collection form completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as making informed decisions about continuing education for public health professionals. PMID:28591190
Econ Simulation Cited as Success
ERIC Educational Resources Information Center
Workman, Robert; Maher, John
1973-01-01
A brief description of a computerized economics simulation model which provides students with an opportunity to apply microeconomic principles along with elementary accounting and statistical techniques. (Author/AK)
Statistical description of tectonic motions
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
1993-01-01
This report summarizes investigations regarding tectonic motions. The topics discussed include statistics of crustal deformation, Earth rotation studies, using multitaper spectrum analysis techniques applied to both space-geodetic data and conventional astrometric estimates of the Earth's polar motion, and the development, design, and installation of high-stability geodetic monuments for use with the global positioning system.
Donato, David I.
2012-01-01
This report presents the mathematical expressions and the computational techniques required to compute maximum-likelihood estimates for the parameters of the National Descriptive Model of Mercury in Fish (NDMMF), a statistical model used to predict the concentration of methylmercury in fish tissue. The expressions and techniques reported here were prepared to support the development of custom software capable of computing NDMMF parameter estimates more quickly and using less computer memory than is currently possible with available general-purpose statistical software. Computation of maximum-likelihood estimates for the NDMMF by numerical solution of a system of simultaneous equations through repeated Newton-Raphson iterations is described. This report explains the derivation of the mathematical expressions required for computational parameter estimation in sufficient detail to facilitate future derivations for any revised versions of the NDMMF that may be developed.
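The NDMMF expressions themselves are not reproduced here, but the numerical core of the report, maximum-likelihood estimation by repeated Newton-Raphson iterations, can be sketched on a simpler stand-in model. The example below (hypothetical data; an exponential-rate likelihood, not the NDMMF) iterates on the score and its derivative exactly as the report describes:

```python
def exp_rate_mle(data, lam=0.5, tol=1e-10):
    """Newton-Raphson iteration on the score (first derivative of the
    log-likelihood) for the rate of an exponential distribution.
    The closed form is 1/mean(data); the loop illustrates the method."""
    n, s = len(data), sum(data)
    for _ in range(100):
        score = n / lam - s        # d/d(lam) of log-likelihood
        hessian = -n / lam ** 2    # second derivative
        step = score / hessian
        lam -= step                # Newton-Raphson update
        if abs(step) < tol:
            break
    return lam

data = [0.5, 1.2, 0.8, 2.0, 0.5]   # hypothetical observations, mean 1.0
rate = exp_rate_mle(data)
```

For the NDMMF the same scheme applies to a system of simultaneous score equations, with the scalar second derivative replaced by the Hessian matrix.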
Goddard trajectory determination subsystem: Mathematical specifications
NASA Technical Reports Server (NTRS)
Wagner, W. E. (Editor); Velez, C. E. (Editor)
1972-01-01
The mathematical specifications of the Goddard trajectory determination subsystem of the flight dynamics system are presented. These specifications include the mathematical description of the coordinate systems, dynamic and measurement model, numerical integration techniques, and statistical estimation concepts.
Fourier Descriptor Analysis and Unification of Voice Range Profile Contours: Method and Applications
ERIC Educational Resources Information Center
Pabon, Peter; Ternstrom, Sten; Lamarche, Anick
2011-01-01
Purpose: To describe a method for unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. Method: A morphologic modeling technique, which is based on Fourier descriptors (FDs), is applied to the VRP contour. The technique, which essentially involves resampling of the curve of the…
Evaluation of wind field statistics near and inside clouds using a coherent Doppler lidar
NASA Astrophysics Data System (ADS)
Lottman, Brian Todd
1998-09-01
This work proposes advanced techniques for measuring the spatial wind field statistics near and inside clouds using a vertically pointing solid state coherent Doppler lidar on a fixed ground based platform. The coherent Doppler lidar is an ideal instrument for high spatial and temporal resolution velocity estimates. The basic parameters of lidar are discussed, including a complete statistical description of the Doppler lidar signal. This description is extended to cases with simple functional forms for aerosol backscatter and velocity. An estimate for the mean velocity over a sensing volume is produced by estimating the mean spectra. There are many traditional spectral estimators, which are useful for conditions with slowly varying velocity and backscatter. A new class of 'novel' estimators is introduced that produces reliable velocity estimates for conditions with large variations in aerosol backscatter and velocity with range, such as cloud conditions. Performance of traditional and novel estimators is computed for a variety of deterministic atmospheric conditions using computer simulated data. Wind field statistics are produced from actual data for a cloud deck and for multi-layer clouds. Unique results include detection of possible spectral signatures for rain, estimates for the structure function inside a cloud deck, reliable velocity estimation techniques near and inside thin clouds, and estimates for simple wind field statistics between cloud layers.
An unsupervised classification technique for multispectral remote sensing data.
NASA Technical Reports Server (NTRS)
Su, M. Y.; Cummings, R. E.
1973-01-01
Description of a two-part clustering technique consisting of (a) a sequential statistical clustering, which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum-likelihood classification techniques.
Technical Note: The Initial Stages of Statistical Data Analysis
Tandy, Richard D.
1998-01-01
Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
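The decision rule described above, nominal and ordinal data take nonparametric techniques while interval and ratio data take parametric ones, reduces to a small lookup. A minimal sketch (the example tests named are common choices, not an exhaustive prescription):

```python
# Map each level of measurement to the properties and technique family
# described in the note. "example_test" entries are illustrative only.
LEVELS = {
    "nominal": {"ranked": False, "equal_intervals": False,
                "family": "nonparametric", "example_test": "chi-square"},
    "ordinal": {"ranked": True, "equal_intervals": False,
                "family": "nonparametric", "example_test": "Mann-Whitney U"},
    "interval": {"ranked": True, "equal_intervals": True,
                 "family": "parametric", "example_test": "t-test"},
    "ratio": {"ranked": True, "equal_intervals": True,
              "family": "parametric", "example_test": "t-test"},
}

def technique_family(level):
    """Return the class of statistical techniques for a level of measurement."""
    return LEVELS[level]["family"]
```

So a researcher coding eye color (nominal) is pointed toward nonparametric methods, while one measuring joint angle (ratio) may use parametric methods, assuming the usual distributional conditions also hold.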
Logo image clustering based on advanced statistics
NASA Astrophysics Data System (ADS)
Wei, Yi; Kamel, Mohamed; He, Yiwei
2007-11-01
In recent years, there has been a growing interest in the research of image content description techniques. Among those, image clustering is one of the most frequently discussed topics. Similar to image recognition, image clustering is also a high-level representation technique. However it focuses on the coarse categorization rather than the accurate recognition. Based on wavelet transform (WT) and advanced statistics, the authors propose a novel approach that divides various shaped logo images into groups according to the external boundary of each logo image. Experimental results show that the presented method is accurate, fast and insensitive to defects.
NASA Astrophysics Data System (ADS)
Sarmini; Suyanto, Totok; Nadiroh, Ulin
2018-01-01
In general, corruption is very harmful to society. One effort to prevent corruption is to build a culture of anti-corruption education in the young generation through teaching materials in schools. The research method used was qualitative description. The sample was 60 junior high school Citizenship Education teachers in Surabaya. Data were analyzed using descriptive statistics with percentages. The results indicate that embedding the values of anti-corruption education in teaching materials is very important for fostering those values in the young generation.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
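Of the model families the standard names (linear, quadratic, exponential), the linear case has a simple closed-form least-squares fit. A hedged sketch with hypothetical monthly anomaly counts (not data from the standard):

```python
def linear_trend(ts):
    """Least-squares line through equally spaced time-series points.
    Returns (intercept, slope) for y = intercept + slope * t."""
    n = len(ts)
    xbar = (n - 1) / 2           # mean of 0, 1, ..., n-1
    ybar = sum(ts) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in enumerate(ts))
    sxx = sum((x - xbar) ** 2 for x in range(n))
    slope = sxy / sxx
    return ybar - slope * xbar, slope

failures = [3, 5, 7, 9, 11]      # hypothetical monthly anomaly counts
intercept, slope = linear_trend(failures)
```

A positive slope flags an upward trend for further qualitative assessment; quadratic and exponential fits extend the same least-squares idea with transformed or additional terms.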
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
An Automated Energy Detection Algorithm Based on Morphological Filter…
US Army Research Laboratory (ARL-TR-8270)
2018-01-01
…collected data. These statistical techniques are under the area of descriptive statistics, which is a methodology to condense the data in quantitative…
Modelling the effect of structural QSAR parameters on skin penetration using genetic programming
NASA Astrophysics Data System (ADS)
Chung, K. K.; Do, D. Q.
2010-09-01
In order to model relationships between chemical structures and biological effects in quantitative structure-activity relationship (QSAR) data, an alternative artificial intelligence technique, genetic programming (GP), was investigated and compared with traditional statistical methods. GP, whose primary advantage is that it generates explicit mathematical equations, was employed to model QSAR data and to identify the most important molecular descriptors in the data. The models produced by GP agreed with the statistical results, and ANOVA showed that the most predictive GP models were significantly improved compared with the statistical models. Recently, artificial intelligence techniques have been applied widely to analyse QSAR data. With the capability of generating mathematical equations, GP can be considered an effective and efficient method for modelling QSAR data.
Noise characteristics of the Skylab S-193 altimeter altitude measurements
NASA Technical Reports Server (NTRS)
Hatch, W. E.
1975-01-01
The statistical characteristics of the SKYLAB S-193 altimeter altitude noise are considered. These results are reported in a concise format for use and analysis by the scientific community. In most instances the results have been grouped according to satellite pointing so that the effects of pointing on the statistical characteristics can be readily seen. The altimeter measurements and the processing techniques are described. The mathematical descriptions of the computer programs used for these results are included.
Mathematical problem solving ability of sport students in the statistical study
NASA Astrophysics Data System (ADS)
Sari, E. F. P.; Zulkardi; Putri, R. I. I.
2017-12-01
This study aims to determine the mathematical problem-solving ability of fifth-semester sport students of PGRI Palembang in the statistics course. The subjects were 31 fifth-semester sport students of PGRI Palembang. The research method used was a quasi-experimental one-shot case study. Data were collected with a test and analyzed using quantitative descriptive statistics. The study concludes that the mathematical problem-solving ability of the students in the statistics course is categorized as good, with an average final test score of 80.3.
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
The composite sequential clustering technique for analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
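The two-part structure described above, a sequential pass that forms initial clusters followed by K-means refinement, can be sketched in one dimension. This is an illustrative simplification (hypothetical data; a spread threshold stands in for the report's sequential variance analysis), not the program documented in the report:

```python
def sequential_clusters(values, max_spread):
    """Part 1: scan in sequence, opening a new cluster whenever adding a
    point would push the current cluster's spread past max_spread
    (a crude 1-D stand-in for sequential variance analysis)."""
    clusters = [[values[0]]]
    for v in values[1:]:
        c = clusters[-1]
        if max(c + [v]) - min(c + [v]) <= max_spread:
            c.append(v)
        else:
            clusters.append([v])
    return clusters

def kmeans_1d(values, centers, iters=20):
    """Part 2: refine the initial cluster centers with standard K-means."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda j: abs(v - centers[j]))
            groups[i].append(v)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers

data = [1.0, 1.2, 0.9, 5.1, 5.0, 4.8, 9.9, 10.1]   # hypothetical 1-D pixels
initial = [sum(c) / len(c) for c in sequential_clusters(data, max_spread=1.0)]
centers = kmeans_1d(data, initial)
```

The output of part 1 (three initial clusters) seeds part 2, whose iterative reassignment and re-averaging is exactly the generalized K-means step; in the multispectral case the scalars become feature vectors and the distance becomes Euclidean.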
Point pattern analysis of FIA data
Chris Woodall
2002-01-01
Point pattern analysis is a branch of spatial statistics that quantifies the spatial distribution of points in two-dimensional space. Point pattern analysis was conducted on stand stem-maps from FIA fixed-radius plots to explore point pattern analysis techniques and to determine the ability of pattern descriptions to describe stand attributes. Results indicate that the...
ERIC Educational Resources Information Center
Subramaniam, Maithreyi; Hanafi, Jaffri; Putih, Abu Talib
2016-01-01
This study examined the artwork of 30 first-year graphic design students, analyzed critically using Feldman's model of art criticism. Data were analyzed quantitatively; descriptive statistical techniques were employed. The scores were viewed in the form of mean scores and frequencies to determine students' performance in their critical ability.…
ERIC Educational Resources Information Center
Okoza, Jolly; Aluede, Oyaziwo; Owens-Sogolo, Osasere
2013-01-01
This study examined metacognitive awareness of learning strategies among Secondary School Students in Edo State, Nigeria. The study was an exploratory one, which utilized descriptive statistics. A total number of 1200 students drawn through multistage proportionate random sampling technique participated in the study. The study found that secondary…
2017-10-01
…casualty care using descriptive statistical analysis and modeling techniques. Aim 2: Identify the ideal provider training and competency assessment… (methodologies, course type, course availability, assessment criteria, requirements, funding, alignment with Clinical Practice Guidelines (CPGs))… Aim 1: Descriptive study of all available information for combat casualties in Afghanistan. Specific tasks: (1) who – patients treated; clinician mix…
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
Use of statistical procedures in Brazilian and international dental journals.
Ambrosano, Gláucia Maria Bovi; Reis, André Figueiredo; Giannini, Marcelo; Pereira, Antônio Carlos
2004-01-01
A descriptive survey was performed in order to assess the statistical content and quality of Brazilian and international dental journals, and compare their evolution throughout the last decades. The authors identified the reporting and accuracy of statistical techniques in 1000 papers published from 1970 to 2000 in seven dental journals: three Brazilian (Brazilian Dental Journal, Revista de Odontologia da Universidade de Sao Paulo and Revista de Odontologia da UNESP) and four international journals (Journal of the American Dental Association, Journal of Dental Research, Caries Research and Journal of Periodontology). Papers were divided into two time periods: from 1970 to 1989, and from 1990 to 2000. A slight increase in the number of articles that presented some form of statistical technique was noticed for Brazilian journals (from 61.0 to 66.7%), whereas for international journals, a significant increase was observed (65.8 to 92.6%). In addition, a decrease in the number of statistical errors was verified. The most commonly used statistical tests as well as the most frequent errors found in dental journals were assessed. Hopefully, this investigation will encourage dental educators to better plan the teaching of biostatistics, and to improve the statistical quality of submitted manuscripts.
ERIC Educational Resources Information Center
Luna-Torres, Maria; McKinney, Lyle; Horn, Catherine; Jones, Sara
2018-01-01
This study examined a sample of community college students from a diverse, large urban community college system in Texas. To gain a deeper understanding about the effects of background characteristics on student borrowing behaviors and enrollment outcomes, the study employed descriptive statistics and regression techniques to examine two separate…
A detailed description of the sequential probability ratio test for 2-IMU FDI
NASA Technical Reports Server (NTRS)
Rich, T. M.
1976-01-01
The sequential probability ratio test (SPRT) for 2-IMU FDI (inertial measuring unit failure detection/isolation) is described. The SPRT is a statistical technique for detecting and isolating soft IMU failures originally developed for the strapdown inertial reference unit. The flowchart of a subroutine incorporating the 2-IMU SPRT is included.
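The SPRT's core idea, accumulating a log-likelihood ratio sample by sample until it crosses an accept or reject threshold, can be sketched for a Gaussian mean shift. This is a generic textbook formulation with hypothetical readings, not the 2-IMU flowchart from the report:

```python
from math import log

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test between H0: mean = mu0
    and H1: mean = mu1 for Gaussian observations with known sigma.
    Returns (decision, samples_used); decision is 'H0', 'H1', or
    'continue' if neither boundary was crossed."""
    upper = log((1 - beta) / alpha)   # cross above: accept H1
    lower = log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # Log-likelihood ratio increment for one Gaussian observation
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", len(samples)

# Hypothetical sensor-difference readings sitting at the failure level
readings = [1.1, 0.9, 1.2, 1.0, 1.1, 0.95, 1.05, 1.0]
decision, n_used = sprt(readings, mu0=0.0, mu1=1.0, sigma=0.5)
```

The appeal for failure detection is that the test commits as soon as the evidence suffices (here after a few samples) rather than after a fixed sample size, while the thresholds bound both false-alarm and missed-detection rates.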
Eng, Kevin H; Schiller, Emily; Morrell, Kayla
2015-11-03
Researchers developing biomarkers for cancer prognosis from quantitative gene expression data are often faced with an odd methodological discrepancy: while Cox's proportional hazards model, the appropriate and popular technique, produces a continuous and relative risk score, it is hard to cast the estimate in clear clinical terms like median months of survival and percent of patients affected. To produce a familiar Kaplan-Meier plot, researchers commonly make the decision to dichotomize a continuous (often unimodal and symmetric) score. It is well known in the statistical literature that this procedure induces significant bias. We illustrate the liabilities of common techniques for categorizing a risk score and discuss alternative approaches. We promote the use of the restricted mean survival (RMS) and the corresponding RMS curve that may be thought of as an analog to the best fit line from simple linear regression. Continuous biomarker workflows should be modified to include the more rigorous statistical techniques and descriptive plots described in this article. All statistics discussed can be computed via standard functions in the Survival package of the R statistical programming language. Example R language code for the RMS curve is presented in the appendix.
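The restricted mean survival (RMS) promoted above can be computed directly from a Kaplan-Meier estimate as the area under the step-shaped survival curve up to a horizon tau. The abstract points to R's Survival package; as a language-neutral illustration only, here is a minimal NumPy sketch (the data and helper names are illustrative, not the authors' code):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates S(t) at the distinct event times.
    events: 1 = event observed, 0 = censored."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    event_times = np.unique(times[events == 1])
    at_risk = np.array([(times >= t).sum() for t in event_times])
    deaths = np.array([((times == t) & (events == 1)).sum() for t in event_times])
    return event_times, np.cumprod(1.0 - deaths / at_risk)

def restricted_mean_survival(times, events, tau):
    """RMS = area under the Kaplan-Meier step curve on [0, tau]."""
    t, s = kaplan_meier(times, events)
    keep = t <= tau
    grid = np.concatenate(([0.0], t[keep], [tau]))
    vals = np.concatenate(([1.0], s[keep]))  # survival level on each interval
    return float(np.sum(np.diff(grid) * vals))
```

For complete (uncensored) data the quantity reduces to the ordinary mean survival time, which makes the clinical reading ("average months of survival up to tau") concrete in a way a relative risk score is not.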
Pupil Size in Outdoor Environments
2007-04-06
[Table-of-contents fragment] Table 3: Descriptive statistics for pupils measured over luminance range. Table 4: N in each strata for all pupil measurements. Table 5: Descriptive statistics stratified against eye color. Table 6: Descriptive statistics stratified against gender. Table 7: Descriptive…
An attribute-driven statistics generator for use in a G.I.S. environment
NASA Technical Reports Server (NTRS)
Thomas, R. W.; Ritter, P. R.; Kaugars, A.
1984-01-01
When performing research using digital geographic information it is often useful to produce quantitative characterizations of the data, usually within some constraints. In the research environment the different combinations of required data and constraints can often become quite complex. This paper describes a technique that gives the researcher a powerful and flexible way to set up many possible combinations of data and constraints without having to perform numerous intermediate steps or create temporary data bands. This method provides an efficient way to produce descriptive statistics in such situations.
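The idea of producing constrained descriptive statistics without intermediate steps or temporary data bands can be sketched with boolean masks over an in-memory raster band. The function below is a hypothetical illustration of that pattern, not the paper's actual generator or its G.I.S. environment:

```python
import numpy as np

def masked_stats(band, constraints):
    """Descriptive statistics of `band` restricted to cells where every
    constraint mask holds. `constraints` is a list of boolean arrays with
    the same shape as `band` (e.g. slope < 10, landcover == forest)."""
    mask = np.logical_and.reduce(constraints) if constraints else np.ones(band.shape, bool)
    vals = band[mask]
    return {"n": int(vals.size),
            "mean": float(vals.mean()),
            "std": float(vals.std(ddof=1)),
            "min": float(vals.min()),
            "max": float(vals.max())}
```

Because the constraints are combined on the fly, new combinations of data and criteria cost only a new list of masks rather than a new derived band.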
Effect of different mixing methods on the physical properties of Portland cement.
Shahi, Shahriar; Ghasemi, Negin; Rahimi, Saeed; Yavari, Hamidreza; Samiei, Mohammad; Jafari, Farnaz
2016-12-01
Portland cement is a hydrophilic cement; as a result, the powder-to-liquid ratio affects the properties of the final mix. In addition, the mixing technique affects hydration. The aim of this study was to evaluate the effect of different mixing techniques (conventional, amalgamator and ultrasonic) on selected physical properties of Portland cement. The physical properties evaluated were determined using the ISO 6876:2001 specification. One hundred sixty-two samples of Portland cement were prepared, covering the three mixing techniques for each physical property (six samples per group). Data were analyzed using descriptive statistics, one-way ANOVA and post hoc Tukey tests. Statistical significance was set at P<0.05. The mixing technique had no significant effect on the compressive strength, film thickness and flow of Portland cement (P>0.05). Dimensional changes (shrinkage), solubility and pH increased significantly with the amalgamator and ultrasonic mixing techniques (P<0.05). The ultrasonic technique significantly decreased working time, and the amalgamator and ultrasonic techniques significantly decreased the setting time (P<0.05). The mixing technique exerted no significant effect on the flow, film thickness and compressive strength of Portland cement samples. Key words: Physical properties, Portland cement, mixing methods.
The bio-optical properties of CDOM as descriptor of lake stratification.
Bracchini, Luca; Dattilo, Arduino Massimo; Hull, Vincent; Loiselle, Steven Arthur; Martini, Silvia; Rossi, Claudio; Santinelli, Chiara; Seritti, Alfredo
2006-11-01
Multivariate statistical techniques are used to demonstrate the fundamental role of CDOM optical properties in the description of water masses during the summer stratification of a deep lake. PC1 was linked with dissolved species and PC2 with suspended particles. The first principal component shows that the CDOM bio-optical properties give a better description of the stratification of Salto Lake than temperature does. The proposed multivariate approach can be used for the analysis of different stratified aquatic ecosystems in relation to the interaction between bio-optical properties and stratification of the water body.
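The principal-component decomposition referred to (PC1, PC2) can be sketched as an SVD of the column-standardized data matrix. This is a generic illustration of the multivariate technique, not the authors' analysis; the layout assumed is samples in rows and optical/physical variables in columns:

```python
import numpy as np

def principal_components(X):
    """PC scores and fraction of variance explained, via SVD of the
    column-standardized data matrix X (rows = samples, cols = variables)."""
    Z = (X - X.mean(0)) / X.std(0, ddof=1)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = U * s                    # sample coordinates on PC1, PC2, ...
    explained = s**2 / np.sum(s**2)   # variance fraction per component
    return scores, explained
```

Interpreting PC1 against a physical gradient (here, depth during stratification) then amounts to examining which standardized variables load most strongly on it.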
Accuracy of Physical Self-Description Among Chronic Exercisers and Non-Exercisers.
Berning, Joseph M; DeBeliso, Mark; Sevene, Trish G; Adams, Kent J; Salmon, Paul; Stamford, Bryant A
2014-11-06
This study addressed the role of chronic exercise in enhancing physical self-description as measured by self-estimated percent body fat. Accuracy of physical self-description was determined in normal-weight, regularly exercising and non-exercising males and females with similar body mass indices (BMIs) (n=42 males and 45 females, of whom 23 males and 23 females met the criteria to be considered chronic exercisers). Statistical analyses were conducted to determine the degree of agreement between self-estimated percent body fat and actual laboratory measurements (hydrostatic weighing). Three statistical techniques were employed: Pearson correlation coefficients, Bland-Altman plots, and regression analysis. Agreement between measured and self-estimated percent body fat was superior for males and females who exercised chronically, compared to non-exercisers. The clinical implications are as follows. Satisfaction with one's body can be influenced by several factors, including self-perceived body composition. Dissatisfaction can contribute to maladaptive and destructive weight-management behaviors. The present study suggests that regular exercise provides a basis for more positive weight-management behaviors by enhancing the accuracy of self-assessed body composition.
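Of the three techniques named, the Bland-Altman analysis is the one most directly tied to "agreement". A minimal sketch of its core computation (bias and 95% limits of agreement), with hypothetical data rather than the study's measurements:

```python
import numpy as np

def bland_altman(measured, estimated):
    """Bland-Altman bias and 95% limits of agreement between two methods
    (e.g. hydrostatic weighing vs. self-estimated percent body fat)."""
    measured = np.asarray(measured, float)
    estimated = np.asarray(estimated, float)
    diff = estimated - measured
    bias = diff.mean()                 # mean over- or under-estimation
    sd = diff.std(ddof=1)              # spread of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Narrow limits around a small bias correspond to the "superior agreement" reported for the chronic exercisers.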
Earthquake prediction evaluation standards applied to the VAN Method
NASA Astrophysics Data System (ADS)
Jackson, David D.
Earthquake prediction research must meet certain standards before it can be suitably evaluated for potential application in decision making. For methods that result in a binary (on or off) alarm condition, requirements include (1) a quantitative description of observables that trigger an alarm, (2) a quantitative description, including ranges of time, location, and magnitude, of the predicted earthquakes, (3) documented evidence of all previous alarms, (4) a complete list of predicted earthquakes, (5) a complete list of unpredicted earthquakes. The VAN technique [Varotsos and Lazaridou, 1991; Varotsos et al., 1996] has not yet been stated as a testable hypothesis. It fails criteria (1) and (2) so it is not ready to be evaluated properly. Although telegrams were transmitted in advance of claimed successes, these telegrams did not fully specify the predicted events, and all of the published statistical evaluations involve many subjective ex post facto decisions. Lacking a statistically demonstrated relationship to earthquakes, a candidate prediction technique should satisfy several plausibility criteria, including: (1) a reasonable relationship between the location of the candidate precursor and that of the predicted earthquake, (2) some demonstration that the candidate precursory observations are related to stress, strain, or other quantities related to earthquakes, and (3) the existence of co-seismic as well as pre-seismic variations of the candidate precursor. The VAN technique meets none of these criteria.
NASA Technical Reports Server (NTRS)
1972-01-01
System studies, equipment simulation, hardware development and flight tests which were conducted during the development of an aircraft collision hazard warning system are discussed. The system uses a cooperative, continuous wave Doppler radar principle with pseudo-random frequency modulation. The report presents a description of the system operation and deals at length with the use of pseudo-random coding techniques. In addition, the use of mathematical modeling and computer simulation to determine the alarm statistics and system saturation characteristics in terminal area traffic of variable density is discussed.
NASA Technical Reports Server (NTRS)
Edwards, S. F.; Kantsios, A. G.; Voros, J. P.; Stewart, W. F.
1975-01-01
The development of a radiometric technique for determining the spectral and total normal emittance of materials heated to temperatures of 800, 1100, and 1300 K by direct comparison with National Bureau of Standards (NBS) reference specimens is discussed. Emittances are measured over the spectral range of 1 to 15 microns and are statistically compared with NBS reference specimens. Results are included for NBS reference specimens, Rene 41, alundum, zirconia, AISI type 321 stainless steel, nickel 201, and a space-shuttle reusable surface insulation.
High Accuracy Transistor Compact Model Calibrations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hembree, Charles E.; Mar, Alan; Robertson, Perry J.
2015-09-01
Typically, transistors are modeled by the application of calibrated nominal and range models. These models consist of differing parameter values that describe the location and the upper and lower limits of a distribution of some transistor characteristic such as current capacity. Correspondingly, when using this approach, high degrees of accuracy of the transistor models are not expected, since the set of models is a surrogate for a statistical description of the devices. The use of these types of models describes expected performances considering the extremes of process or transistor deviations. In contrast, circuits that have very stringent accuracy requirements require modeling techniques with higher accuracy. Since these accurate models have low error in transistor descriptions, they can be used to describe part-to-part variations as well as to give an accurate description of a single circuit instance. Thus, models that meet these stipulations also enable the quantification of margins with respect to a functional threshold and of the uncertainties in these margins. Given this need, new high-accuracy model calibration techniques for bipolar junction transistors have been developed and are described in this report.
1984-11-01
well. The subspace is found by using the usual linear eigenvector solution in the new enlarged space. This technique was first suggested by Gnanadesikan and Wilk (1966, 1968), and a good description can be found in Gnanadesikan (1977). They suggested using polynomial functions of the original p co… Heidelberg, Springer Verlag. Gnanadesikan, R. (1977), Methods for Statistical Data Analysis of Multivariate Observations, Wiley, New York.
The importance of anti corruption education teaching materials for the young generation
NASA Astrophysics Data System (ADS)
Sarmini; Made Swanda, I.; Nadiroh, Ulin
2018-01-01
Corruption is one of the most serious issues in many countries. The purpose of this paper is to identify the importance of anti-corruption education teaching materials for the younger generation. The research method used was qualitative description, with a questionnaire as the data collection tool. The sample in this research was 150 junior high school teachers in Surabaya. The data analysis technique used in this research was descriptive statistics with percentages. The result of this research was that Social Studies teachers in Surabaya realize that teaching materials on anti-corruption education are very important in Social Studies learning activities. A recommendation for further research is to examine anti-corruption education teaching materials that contain the value of anti-corruption character. Anti-corruption education is expected to give awareness and change to the younger generation, so that they understand and realize the importance of having an anti-corruption character and can put it into practice in society.
Analysis of laparoscopic port site complications: A descriptive study
Karthik, Somu; Augustine, Alfred Joseph; Shibumon, Mundunadackal Madhavan; Pai, Manohar Varadaraya
2013-01-01
CONTEXT: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. AIMS: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. SETTINGS AND DESIGN: Prospective descriptive study. MATERIALS AND METHODS: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. STATISTICAL ANALYSIS USED: Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. RESULTS: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). CONCLUSIONS: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit. PMID:23741110
Quality of reporting statistics in two Indian pharmacology journals.
Jaykaran; Yadav, Preeti
2011-04-01
To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. All original articles published since 2002 were downloaded from the journals' (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) websites. These articles were evaluated on the basis of the appropriateness of descriptive and inferential statistics. Descriptive statistics was evaluated on the basis of the reporting of the method of description and central tendencies. Inferential statistics was evaluated on the basis of the fulfilment of the assumptions of the statistical methods and the appropriateness of the statistical tests. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Inappropriate descriptive statistics was observed in 150 (78.1%, 95% CI 71.7-83.3%) articles. The most common reason for this was the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most common statistical method used was one-way ANOVA (58.4%). Information regarding the checking of the assumptions of statistical tests was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles. The most common reason for an inappropriate statistical test was the use of a two-group test for three or more groups. Articles published in two Indian pharmacology journals are not devoid of statistical errors.
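The most common error reported, mean ± SEM in place of mean ± SD, is easy to illustrate: the SD describes the spread of the observations, while the SEM (= SD/√n) describes the precision of the estimated mean and shrinks as n grows. A small NumPy sketch with hypothetical values:

```python
import numpy as np

values = np.array([4.0, 6.0, 8.0, 10.0])   # hypothetical measurements
mean = values.mean()
sd = values.std(ddof=1)                     # spread of the data itself
sem = sd / np.sqrt(values.size)             # precision of the estimated mean
# Reporting "mean ± SEM" as if it were data spread understates variability,
# and the understatement worsens with larger samples.
```

Here sd ≈ 2.58 while sem ≈ 1.29; with n = 100 equally spread observations the SEM would be ten times smaller than the SD, which is why the two must not be interchanged in descriptive reporting.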
NASA Astrophysics Data System (ADS)
Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.
1991-03-01
To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end, we have developed an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the tasks of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages: G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).
Statistical iterative reconstruction to improve image quality for digital breast tomosynthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Shiyu, E-mail: shiyu.xu@gmail.com; Chen, Ying, E-mail: adachen@siu.edu; Lu, Jianping
2015-09-15
Purpose: Digital breast tomosynthesis (DBT) is a novel modality with the potential to improve early detection of breast cancer by providing three-dimensional (3D) imaging with a low radiation dose. 3D image reconstruction presents some challenges: cone-beam and flat-panel geometry, and highly incomplete sampling. A promising means to overcome these challenges is statistical iterative reconstruction (IR), since it provides the flexibility of accurate physics modeling and a general description of system geometry. The authors' goal was to develop techniques for applying statistical IR to tomosynthesis imaging data. Methods: These techniques include the following: a physics model with a local voxel-pair based prior with flexible parameters to fine-tune image quality; a precomputed parameter λ in the prior, to remove data dependence and to achieve a uniform resolution property; an effective ray-driven technique to compute the forward and backprojection; and an oversampled, ray-driven method to perform high resolution reconstruction with a practical region-of-interest technique. To assess the performance of these techniques, the authors acquired phantom data on the stationary DBT prototype system. To solve the estimation problem, the authors proposed an optimization-transfer based algorithm framework that potentially allows fewer iterations to achieve an acceptably converged reconstruction. Results: IR improved the detectability of low-contrast and small microcalcifications, reduced cross-plane artifacts, improved spatial resolution, and lowered noise in reconstructed images. Conclusions: Although the computational load remains a significant challenge for practical development, the superior image quality provided by statistical IR, combined with advancing computational techniques, may bring benefits to screening, diagnostics, and intraoperative imaging in clinical applications.
Methods and application of system identification in shock and vibration.
NASA Technical Reports Server (NTRS)
Collins, J. D.; Young, J. P.; Kiefling, L.
1972-01-01
A logical picture is presented of current useful system identification techniques in the shock and vibration field. A technology tree diagram is developed for the purpose of organizing and categorizing the widely varying approaches according to the fundamental nature of each. Specific examples of accomplished activity for each identification category are noted and discussed. To provide greater insight into the most current trends in the system identification field, a somewhat detailed description is presented of the essential features of a recently developed technique that is based on making the maximum use of all statistically known information about a system.
Quality of reporting statistics in two Indian pharmacology journals
Jaykaran; Yadav, Preeti
2011-01-01
Objective: To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals. Materials and Methods: All original articles published since 2002 were downloaded from the journals' (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) websites. These articles were evaluated on the basis of the appropriateness of descriptive and inferential statistics. Descriptive statistics was evaluated on the basis of the reporting of the method of description and central tendencies. Inferential statistics was evaluated on the basis of the fulfilment of the assumptions of the statistical methods and the appropriateness of the statistical tests. Values are described as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Results: Inappropriate descriptive statistics was observed in 150 (78.1%, 95% CI 71.7–83.3%) articles. The most common reason for this was the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most common statistical method used was one-way ANOVA (58.4%). Information regarding the checking of the assumptions of statistical tests was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6–38.6%) articles. The most common reason was the use of a two-group test for three or more groups. Conclusion: Articles published in two Indian pharmacology journals are not devoid of statistical errors. PMID:21772766
Oral health literacy: awareness and practices among pediatric dentists.
Stowers, Megan E; Lee, Jessica Y; Majewski, Robert F; Estrella, Maria Regina P; Taylor, George W; Boynton, James R
2013-01-01
The purpose of this study was to examine pediatric dentists' awareness of and experiences with oral health literacy and to identify communication techniques used with parents. Active North American members of the American Academy of Pediatric Dentistry were invited to participate in the survey. Descriptive statistical analyses were completed, and Pearson's chi-square crosstab tests were used to compare categorical data between groups. Data were collected from 22 percent (N=1,059) of pediatric dentists; 68 to 87 percent use basic communication techniques routinely, while 36 to 79 percent routinely use enhanced communication techniques. Approximately 59 percent (N=620) reported having had an experience with health literacy miscommunication, while 11 percent (N=116) are aware of an error in patient care that resulted from oral health literacy miscommunication. Respondents who had had an experience with miscommunication were statistically significantly more likely to perceive barriers to effective communication as more significant than those without a history of miscommunication (P<.001). Most pediatric dentists have experienced situations in which a parent has misunderstood information. Basic communication techniques were most commonly used, while enhanced communication techniques were used less routinely. Those who have had experience with oral health literacy miscommunication events perceive barriers to effective communication as more significant.
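The Pearson chi-square crosstab test used in this survey can be sketched from first principles: expected counts come from the outer product of the row and column margins, and the statistic sums the squared standardized deviations. A minimal implementation (without continuity correction; the counts below are hypothetical, not the study's data):

```python
import numpy as np

def chi_square_independence(table):
    """Pearson chi-square statistic and degrees of freedom for a
    two-way contingency table of observed counts."""
    table = np.asarray(table, float)
    expected = np.outer(table.sum(1), table.sum(0)) / table.sum()
    stat = ((table - expected) ** 2 / expected).sum()
    dof = (table.shape[0] - 1) * (table.shape[1] - 1)
    return stat, dof
```

The statistic is then compared against the chi-square distribution with the returned degrees of freedom to obtain a p-value.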
Difficulties in learning and teaching statistics: teacher views
NASA Astrophysics Data System (ADS)
Koparan, Timur
2015-01-01
The purpose of this study is to define teacher views about the difficulties in learning and teaching middle school statistics subjects. To serve this aim, a number of interviews were conducted with 10 middle school maths teachers in the 2011-2012 school year in the province of Trabzon. Of the qualitative descriptive research methods, the semi-structured interview technique was applied in the research. In accordance with the aim, teacher opinions about the statistics subjects were examined and analysed. Similar responses from the teachers were grouped and evaluated. The teachers stated that it was positive that middle school statistics subjects were taught gradually in every grade, but that some difficulties were experienced in the teaching of this subject. The findings are presented in eight themes, which are context, sample, data representation, central tendency and dispersion measures, probability, variance, and other difficulties.
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for the interpretation of the results of a statistical analysis.
NASA Technical Reports Server (NTRS)
Fechtig, H.
1973-01-01
A description of techniques used in recent experiments to detect and analyze cosmic dust and micrometeorites is given and the results both from the study of lunar crater statistics and from in situ measurements are reviewed. The results from lunar crater statistics show an agreement with the results obtained from in situ measurements in interplanetary space and derived from zodiacal light measurements. The near earth results show an enhancement in the flux numbers. This can be caused either by secondary lunar debris or by disintegration of low density fireballs in the outer atmosphere.
The role of drop velocity in statistical spray description
NASA Technical Reports Server (NTRS)
Groeneweg, J. F.; El-Wakil, M. M.; Myers, P. S.; Uyehara, O. A.
1978-01-01
The justification for describing a spray by treating drop velocity as a random variable on an equal statistical basis with drop size was studied experimentally. A double-exposure technique using fluorescent drop photography was used to make size and velocity measurements at selected locations in a steady ethanol spray formed by a swirl atomizer. The size-velocity data were categorized to construct bivariate spray density functions describing the spray immediately after formation and during downstream propagation. Bimodal density functions were formed by environmental interaction during downstream propagation. Large differences were also found between the spatial mass density and the mass flux size distribution at the same location.
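Categorizing size-velocity pairs into a bivariate spray density function amounts to building a normalized two-dimensional histogram. A sketch with synthetic (uniform, hence unphysical) data, purely to show the mechanics; the bin counts and ranges are arbitrary choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
size = rng.uniform(10.0, 100.0, 500)     # hypothetical drop diameters, microns
velocity = rng.uniform(1.0, 20.0, 500)   # hypothetical drop velocities, m/s

# Bivariate spray density: probability density over size-velocity cells.
density, size_edges, vel_edges = np.histogram2d(
    size, velocity, bins=(8, 6), density=True)
cell_area = np.outer(np.diff(size_edges), np.diff(vel_edges))
# density * cell_area gives the probability mass in each cell; it sums to 1.
```

A bimodal spray would show up as two separated peaks in this joint density, which a size-only (marginal) distribution can hide.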
Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes
NASA Technical Reports Server (NTRS)
Williams, Colin P.
1999-01-01
Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long term behavior of such processes is only tractable for very simple types of stochastic processes such as Markovian processes. However, in real world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(square root of N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.
NASA Astrophysics Data System (ADS)
Bouhaj, M.; von Estorff, O.; Peiffer, A.
2017-09-01
In the application of Statistical Energy Analysis (SEA) to complex assembled structures, a purely predictive model often exhibits errors. These errors are mainly due to a lack of accurate modelling of the power transmission mechanism described through the Coupling Loss Factors (CLF). Experimental SEA (ESEA) is used in practice by the automotive and aerospace industry to verify and update the model, or to derive the CLFs for use in an SEA predictive model when analytical estimates cannot be made. This work is particularly motivated by the lack of procedures that allow an estimate to be made of the variance and confidence intervals of the statistical quantities when using the ESEA technique. The aim of this paper is to introduce procedures enabling a statistical description of measured power input, vibration energies and the derived SEA parameters. Particular emphasis is placed on the identification of structural CLFs of complex built-up structures, comparing different methods. By adopting a Stochastic Energy Model (SEM), the ensemble average in ESEA is also addressed. For this purpose, expressions are obtained to randomly perturb the energy matrix elements and generate individual samples for the Monte Carlo (MC) technique applied to derive the ensemble averaged CLF. From the results of ESEA tests conducted on an aircraft fuselage section, the SEM approach provides better estimates of the CLFs than classical matrix inversion methods. The expected range of CLF values and the synthesized energy are used as quality criteria for the matrix inversion, allowing one to assess critical SEA subsystems, which might require a more refined statistical description of the excitation and the response fields. Moreover, the impact of the variance of the normalized vibration energy on the uncertainty of the derived CLFs is outlined.
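The Monte Carlo step described, randomly perturbing the energy matrix elements and deriving an ensemble-averaged loss-factor matrix, can be sketched as follows. This is a generic illustration of the MC idea around a simplified ESEA inversion L = P E⁻¹; the paper's actual SEM formulation, angular-frequency normalization, and perturbation model are not reproduced, and all names here are illustrative:

```python
import numpy as np

def esea_clf_samples(energy, power, rel_sd, n_samples=2000, seed=0):
    """Monte Carlo ensemble of ESEA loss-factor matrices.
    energy: (m, m) measured energy matrix (column j = subsystem energies
    when subsystem j is excited); power: (m, m) input-power matrix.
    Each sample perturbs the energies with multiplicative Gaussian noise
    of relative spread rel_sd, then inverts the perturbed matrix."""
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(n_samples):
        e = energy * rng.normal(1.0, rel_sd, energy.shape)
        samples.append(power @ np.linalg.inv(e))
    samples = np.array(samples)
    return samples.mean(axis=0), samples.std(axis=0)
```

The sample standard deviation across the ensemble is exactly the kind of variance estimate for the derived CLFs that the abstract says classical single-inversion ESEA lacks.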
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
Buttigieg, Pier Luigi; Ramette, Alban
2014-12-01
The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynamic, web-based resource providing accessible descriptions of numerous multivariate techniques relevant to microbial ecologists. A combination of interactive elements allows users to discover and navigate between methods relevant to their needs and examine how they have been used by others in the field. We have designed GUSTA ME to become a community-led and -curated service, which we hope will provide a common reference and forum to discuss and disseminate analytical techniques relevant to the microbial ecology community. © 2014 The Authors. FEMS Microbiology Ecology published by John Wiley & Sons Ltd on behalf of Federation of European Microbiological Societies.
NASA Astrophysics Data System (ADS)
Leung, Juliana Y.; Srinivasan, Sanjay
2016-09-01
Modeling transport processes at large scale requires proper scale-up of subsurface heterogeneity and an understanding of its interaction with the underlying transport mechanisms. A technique based on volume averaging is applied to quantitatively assess the scaling characteristics of the effective mass transfer coefficient in heterogeneous reservoir models. The effective mass transfer coefficient represents the combined contribution of diffusion and dispersion to the transport of non-reactive solute particles within a fluid phase. Although treatment of transport problems with the volume averaging technique has been published in the past, application to geological systems exhibiting realistic spatial variability remains a challenge. Previously, the authors developed a new procedure in which results from a fine-scale numerical flow simulation, reflecting the full physics of the transport process albeit over a sub-volume of the reservoir, are integrated with the volume averaging technique to provide an effective description of transport properties. The procedure is extended such that spatial averaging is performed at the local-heterogeneity scale. In this paper, the transport of a passive (non-reactive) solute is simulated on multiple reservoir models exhibiting different patterns of heterogeneity, and the scaling behavior of the effective mass transfer coefficient (Keff) is examined and compared. One such set of models exhibits power-law (fractal) characteristics, and the variability of dispersion and Keff with scale is in good agreement with analytical expressions described in the literature. This work offers an insight into the impacts of heterogeneity on the scaling of effective transport parameters. A key finding is that spatial heterogeneity models with similar univariate and bivariate statistics may exhibit different scaling characteristics because of the influence of higher order statistics. More mixing is observed in the channelized models with higher-order continuity.
It reinforces the notion that the flow response is influenced by the higher-order statistical description of heterogeneity. An important implication is that when scaling-up transport response from lab-scale results to the field scale, it is necessary to account for the scale-up of heterogeneity. Since the characteristics of higher-order multivariate distributions and large-scale heterogeneity are typically not captured in small-scale experiments, a reservoir modeling framework that captures the uncertainty in heterogeneity description should be adopted.
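As a toy illustration of the spatial-averaging idea (not the authors' procedure), the sketch below block-averages a hypothetical heterogeneous property field and shows how the mean is preserved while variability shrinks as the averaging scale grows; the field, sizes, and distribution are all illustrative assumptions:

```python
import numpy as np

def block_average(field, block):
    """Average a 2-D fine-scale property field over non-overlapping
    block x block sub-volumes (a crude stand-in for volume averaging)."""
    n = field.shape[0] // block
    trimmed = field[:n * block, :n * block]
    return trimmed.reshape(n, block, n, block).mean(axis=(1, 3))

rng = np.random.default_rng(0)
# hypothetical heterogeneous fine-scale field (lognormal, like permeability)
fine = rng.lognormal(mean=0.0, sigma=1.0, size=(64, 64))

for scale in (1, 4, 16):
    coarse = block_average(fine, scale)
    # averaging preserves the mean but suppresses variability with scale
    print(scale, round(float(coarse.mean()), 3), round(float(coarse.var()), 3))
```

Higher-order structure (e.g. channel connectivity) is exactly what such simple block statistics miss, which is the point the abstract makes.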
Otwombe, Kennedy N.; Petzold, Max; Martinson, Neil; Chirwa, Tobias
2014-01-01
Background Research on the predictors of all-cause mortality in HIV-infected people has been widely reported in the literature. Making an informed decision requires understanding the methods used. Objectives We present a review of study designs, statistical methods and their appropriateness in original articles reporting on predictors of all-cause mortality in HIV-infected people between January 2002 and December 2011. Statistical methods were compared between 2002–2006 and 2007–2011. Time-to-event analysis techniques were considered appropriate. Data Sources Pubmed/Medline. Study Eligibility Criteria Original English-language articles were abstracted. Letters to the editor, editorials, reviews, systematic reviews, meta-analyses, case reports and any other ineligible articles were excluded. Results A total of 189 studies were identified (n = 91 in 2002–2006 and n = 98 in 2007–2011), of which 130 (69%) were prospective and 56 (30%) were retrospective. One hundred and eighty-two (96%) studies described their sample using descriptive statistics while 32 (17%) made comparisons using t-tests. Kaplan-Meier methods for time-to-event analysis were more commonly used in the earlier period (n = 69, 76% vs. n = 53, 54%, p = 0.002). Predictors of mortality in the two periods were commonly determined using Cox regression analysis (n = 67, 75% vs. n = 63, 64%, p = 0.12). Only 7 (4%) used the advanced survival analysis method of Cox regression with frailty, of which 6 (3%) were in the later period. Thirty-two (17%) used logistic regression while 8 (4%) used other methods. There were significantly more articles from the first period using appropriate methods compared to the second (n = 80, 88% vs. n = 69, 70%, p-value = 0.003). Conclusion Descriptive statistics and survival analysis techniques remain the most common methods of analysis in publications on predictors of all-cause mortality in HIV-infected cohorts, while prospective research designs are favoured. 
Sophisticated techniques of time-dependent Cox regression and Cox regression with frailty are scarce. This motivates more training in the use of advanced time-to-event methods. PMID:24498313
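For readers unfamiliar with the product-limit method the review tallies, here is a minimal sketch of the Kaplan-Meier estimator on hypothetical follow-up data (not drawn from the reviewed studies):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.
    times  : follow-up time for each subject
    events : 1 if the event (death) was observed, 0 if censored
    Returns (distinct event times, survival probability after each)."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, out_t, out_s = 1.0, [], []
    n_at_risk = len(times)
    for t in np.unique(times):
        at_this = times == t
        d = events[at_this].sum()          # deaths at time t
        if d > 0:
            surv *= 1.0 - d / n_at_risk    # product-limit update
            out_t.append(float(t))
            out_s.append(surv)
        n_at_risk -= at_this.sum()         # deaths and censorings leave the risk set
    return out_t, out_s

# hypothetical follow-up times (months): 1 = died, 0 = censored
t, s = kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 1, 0, 0])
print(list(zip(t, [round(x, 3) for x in s])))
```

Cox regression models covariate effects on the hazard and is not sketched here; dedicated libraries are the practical choice for both.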
Overholser, Brian R; Sowinski, Kevin M
2007-12-01
Biostatistics is the application of statistics to biologic data. The field of statistics can be broken down into 2 fundamental parts: descriptive and inferential. Descriptive statistics are commonly used to categorize, display, and summarize data. Inferential statistics can be used to make predictions based on a sample obtained from a population or some large body of information. It is these inferences that are used to test specific research hypotheses. This 2-part review will outline important features of descriptive and inferential statistics as they apply to commonly conducted research studies in the biomedical literature. Part 1 in this issue will discuss fundamental topics of statistics and data analysis. Additionally, some of the most commonly used statistical tests found in the biomedical literature will be reviewed in Part 2 in the February 2008 issue.
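A compact sketch of the two branches the review distinguishes: descriptive summaries of a sample, and an inferential comparison (here Welch's t statistic, with degrees of freedom and p-value omitted for brevity). The samples are made up for illustration:

```python
import numpy as np
from math import sqrt

def describe(x):
    """The descriptive side: categorize and summarize a sample."""
    x = np.asarray(x, float)
    return {"n": len(x), "mean": x.mean(), "sd": x.std(ddof=1),
            "range": (x.min(), x.max())}

def welch_t(x, y):
    """The inferential side: Welch's t statistic for two independent samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    se = sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    return (x.mean() - y.mean()) / se

a = [5.1, 4.9, 5.6, 5.0, 5.3]   # hypothetical sample, group A
b = [4.2, 4.5, 4.1, 4.8, 4.4]   # hypothetical sample, group B
print(describe(a))
print(round(welch_t(a, b), 3))
```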
Densely calculated facial soft tissue thickness for craniofacial reconstruction in Chinese adults.
Shui, Wuyang; Zhou, Mingquan; Deng, Qingqiong; Wu, Zhongke; Ji, Yuan; Li, Kang; He, Taiping; Jiang, Haiyan
2016-09-01
Craniofacial reconstruction (CFR) is used to recreate a likeness of original facial appearance for an unidentified skull; this technique has been applied in both forensics and archeology. Many CFR techniques rely on the average facial soft tissue thickness (FSTT) of anatomical landmarks, related to ethnicity, age, sex, body mass index (BMI), etc. Previous studies typically employed FSTT at sparsely distributed anatomical landmarks, and differing landmark definitions may affect comparisons between studies. In the present study, a total of 90,198 one-to-one correspondence skull vertices are established on 171 head CT-scans and the FSTT of each corresponding vertex is calculated (hereafter referred to as densely calculated FSTT) for statistical analysis and CFR. Basic descriptive statistics (i.e., mean and standard deviation) for densely calculated FSTT are reported separately according to sex and age. Results show that the FSTT is greater in males than in females at 76.12% of vertices, with the exception of vertices around the zygoma, zygomatic arch and mid-lateral orbit. Statistically significant sex-related differences are found at 55.12% of all vertices, and significant age-related differences between the three age groups at a majority of vertices (73.31% for males and 63.43% for females). Five non-overlapping categories are given and the descriptive statistics (i.e., mean, standard deviation, local standard deviation and percentage) are reported. Multiple appearances are produced using the densely calculated FSTT of various age and sex groups, and a quantitative assessment is provided to examine how relevant the choice of FSTT is to increasing the accuracy of CFR. In conclusion, this study provides a new perspective in understanding the distribution of FSTT and the construction of a new densely calculated FSTT database for craniofacial reconstruction. Copyright © 2016. Published by Elsevier Ireland Ltd.
Neutral gas sympathetic cooling of an ion in a Paul trap.
Chen, Kuang; Sullivan, Scott T; Hudson, Eric R
2014-04-11
A single ion immersed in a neutral buffer gas is studied. An analytical model is developed that gives a complete description of the dynamics and steady-state properties of the ions. An extension of this model, using techniques employed in the mathematics of economics and finance, is used to explain the recent observation of non-Maxwellian statistics for these systems. Taken together, these results offer an explanation of the long-standing issues associated with sympathetic cooling of an ion by a neutral buffer gas.
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
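As a companion to the regression discussion, here is a hedged sketch of logistic regression fitted by plain gradient descent on simulated data; the data, learning rate, and step count are illustrative assumptions, and real analyses would use an established statistics package:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Logistic regression by gradient descent on the mean log-loss.
    X : (n, p) predictor matrix (a column of ones supplies the intercept)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))    # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)    # gradient of the mean log-loss
    return w

rng = np.random.default_rng(1)
x = rng.normal(size=200)
# simulate a binary outcome whose log-odds rise with x (true coefs 0.5, 2.0)
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * x)))).astype(float)
X = np.column_stack([np.ones_like(x), x])
w = fit_logistic(X, y)
print(np.round(w, 2))  # estimates should land near the true coefficients
```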
Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Jong, Jen-Yi
1986-01-01
An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
Comparative Research of Navy Voluntary Education at Operational Commands
2017-03-01
Keywords: return on investment (ROI); logistic regression; multivariate analysis; descriptive statistics; Markov; time-series; linear programming. [Front-matter fragments: Descriptive Statistics Tables; Privacy Considerations; List of Tables; Table 1, Variables and Descriptions, adapted from NETC (2016).]
Analysis of Professional and Pre-Accession Characteristics and Junior Naval Officer Performance
2018-03-01
[Table-of-contents fragments: Review; A. Navy Performance Evaluation System; B. Professional ...; A. Data Description; B. Summary Statistics; C. Descriptive Statistics.]
Recognition of speaker-dependent continuous speech with KEAL
NASA Astrophysics Data System (ADS)
Mercier, G.; Bigorgne, D.; Miclet, L.; Le Guennec, L.; Querre, M.
1989-04-01
A description of the speaker-dependent continuous speech recognition system KEAL is given. An unknown utterance is recognized by means of the following procedures: acoustic analysis, phonetic segmentation and identification, word and sentence analysis. The combination of feature-based, speaker-independent coarse phonetic segmentation with speaker-dependent statistical classification techniques is one of the main design features of the acoustic-phonetic decoder. The lexical access component is essentially based on a statistical dynamic programming technique which aims at matching a phonemic lexical entry containing various phonological forms, against a phonetic lattice. Sentence recognition is achieved by use of a context-free grammar and a parsing algorithm derived from Earley's parser. A speaker adaptation module allows some of the system parameters to be adjusted by matching known utterances with their acoustical representation. The task to be performed, described by its vocabulary and its grammar, is given as a parameter of the system. Continuously spoken sentences extracted from a 'pseudo-Logo' language are analyzed and results are presented.
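The dynamic-programming matching idea can be sketched as a plain edit-distance alignment between a phonemic lexical entry and an observed phone string; this is a toy stand-in for KEAL's lattice matching, and the phone strings are invented:

```python
def dp_match(lexical, observed, sub_cost=1, ins_cost=1, del_cost=1):
    """Minimal dynamic-programming alignment (edit distance) between a
    phonemic lexical entry and an observed phone string."""
    m, n = len(lexical), len(observed)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * del_cost
    for j in range(1, n + 1):
        d[0][j] = j * ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            same = 0 if lexical[i - 1] == observed[j - 1] else sub_cost
            d[i][j] = min(d[i - 1][j - 1] + same,   # substitution or match
                          d[i - 1][j] + del_cost,   # phone missing
                          d[i][j - 1] + ins_cost)   # phone inserted
    return d[m][n]

# hypothetical phone sequences
print(dp_match("kamyn", "kamin"))   # one substitution
print(dp_match("kamyn", "kamn"))    # one deletion
```

A real recognizer matches against a lattice of scored phone hypotheses rather than a single string, but the recurrence is the same shape.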
Histometric analyses of cancellous and cortical interface in autogenous bone grafting
Netto, Henrique Duque; Olate, Sergio; Klüppel, Leandro; do Carmo, Antonio Marcio Resende; Vásquez, Bélgica; Albergaria-Barbosa, Jose
2013-01-01
Surgical procedures involving rehabilitation of the maxillofacial region frequently require bone grafts; the aim of this research was to evaluate the interface between recipient and graft with cortical or cancellous contact. Six adult beagle dogs weighing 15 kg were included in the study. Under general anesthesia, an 8 mm diameter block was obtained from the parietal bone of each animal and fixed to the frontal bone with a 12 mm 1.5 screw. The lag screw technique was used for better contact between the recipient site and the graft. Euthanasia at 3 weeks and 6 weeks was chosen for histometric evaluation. Hematoxylin-eosin staining was used in a routine histologic technique, and histomorphometry was performed with the IMAGEJ software. The t-test was used for data analysis, with p<0.05 for statistical significance. The results show some differences in descriptive histology but no statistically significant differences in the interface between cortical or cancellous bone at 3 or 6 weeks; as expected, after 6 weeks bone integration was better and statistically superior to the 3-week analyses. We conclude that integration of cortical or cancellous bone can be useful without differences. PMID:23923071
Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.
Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E
2016-01-01
Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), which acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.
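One multivariate technique commonly used for single-cell heterogeneity of the kind mentioned above is principal component analysis; the sketch below applies PCA via SVD to simulated single-cell time courses whose main variation is cell-to-cell amplitude (the data and parameters are invented, and this is not the authors' analysis):

```python
import numpy as np

rng = np.random.default_rng(2)
# simulated single-cell time courses: a shared damped oscillation with
# cell-to-cell variation in amplitude, plus measurement noise (hypothetical)
t = np.linspace(0, 10, 50)
amplitude = rng.normal(1.0, 0.3, size=100)          # heterogeneity across cells
cells = amplitude[:, None] * np.exp(-0.2 * t) * np.cos(2 * t)
cells += rng.normal(0.0, 0.05, size=cells.shape)

# PCA via SVD of the mean-centred cells x timepoints matrix
centred = cells - cells.mean(axis=0)
_, sv, vt = np.linalg.svd(centred, full_matrices=False)
explained = sv**2 / (sv**2).sum()
print(round(float(explained[0]), 3))  # one component dominates the heterogeneity
```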
Learning moment-based fast local binary descriptor
NASA Astrophysics Data System (ADS)
Bellarbi, Abdelkader; Zenati, Nadia; Otmane, Samir; Belghit, Hayet
2017-03-01
Recently, binary descriptors have attracted significant attention due to their speed and low memory consumption; however, using intensity differences to calculate the binary descriptive vector is not efficient enough. We propose an approach to binary description called POLAR_MOBIL, in which we perform binary tests between geometrical and statistical information using moments in the patch instead of the classical intensity binary test. In addition, we introduce a learning technique used to select an optimized set of binary tests with low correlation and high variance. This approach offers high distinctiveness against affine transformations and appearance changes. An extensive evaluation on well-known benchmark datasets reveals the robustness and the effectiveness of the proposed descriptor, as well as its good performance in terms of low computation complexity when compared with state-of-the-art real-time local descriptors.
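To make the moment-based binary-test idea concrete, here is a toy descriptor (explicitly not the POLAR_MOBIL implementation): split a patch into cells, compute each cell's mean and variance, and set bits by comparing moments of randomly chosen cell pairs; matching uses Hamming distance:

```python
import numpy as np

def moment_bits(patch, cells=4, rng_seed=0):
    """Toy moment-based binary descriptor: one bit per fixed random pair
    of grid cells, comparing their means (even bits) or variances (odd)."""
    h, w = patch.shape
    gh, gw = h // cells, w // cells
    moments = []
    for i in range(cells):
        for j in range(cells):
            c = patch[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
            moments.append((c.mean(), c.var()))
    moments = np.array(moments)
    rng = np.random.default_rng(rng_seed)          # fixed test pairs
    pairs = rng.integers(0, len(moments), size=(32, 2))
    bits = [int(moments[a, k % 2] > moments[b, k % 2])
            for k, (a, b) in enumerate(pairs)]
    return np.array(bits, dtype=np.uint8)

def hamming(d1, d2):
    return int(np.count_nonzero(d1 != d2))

rng = np.random.default_rng(3)
patch = rng.random((32, 32))
noisy = patch + rng.normal(0, 0.01, patch.shape)   # mild appearance change
other = rng.random((32, 32))                       # unrelated patch
print(hamming(moment_bits(patch), moment_bits(noisy)),
      hamming(moment_bits(patch), moment_bits(other)))
```

Because moments pool many pixels, the bits are more stable under noise than single-pixel intensity comparisons, which is the intuition behind the paper's approach.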
Statistical methods for investigating quiescence and other temporal seismicity patterns
Matthews, M.V.; Reasenberg, P.A.
1988-01-01
We propose a statistical model and a technique for objective recognition of one of the most commonly cited seismicity patterns: microearthquake quiescence. We use a Poisson process model for seismicity and define a process with quiescence as one with a particular type of piecewise constant intensity function. From this model, we derive a statistic for testing stationarity against a 'quiescence' alternative. The large-sample null distribution of this statistic is approximated from simulated distributions of appropriate functionals applied to Brownian bridge processes. We point out the restrictiveness of the particular model we propose and of the quiescence idea in general. The fact that there are many point processes which have neither constant nor quiescent rate functions underscores the need to test for and describe nonuniformity thoroughly. We advocate the use of the quiescence test in conjunction with various other tests for nonuniformity and with graphical methods such as density estimation. Ideally, these methods may promote accurate description of temporal seismicity distributions and useful characterizations of interesting patterns. © 1988 Birkhäuser Verlag.
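A simplified stand-in for the idea (not the authors' statistic or its Brownian-bridge null distribution): simulate a Poisson process with a genuine quiet interval, then compute a log-likelihood ratio comparing a constant rate against a piecewise-constant rate with a candidate quiescent window:

```python
import numpy as np

def loglik_constant(n, T):
    """Maximized log-likelihood of n Poisson events in (0, T), constant rate."""
    rate = n / T
    return n * np.log(rate) - rate * T

def quiescence_lr(event_times, T, t0, t1):
    """Log-likelihood ratio for a candidate quiescent window (t0, t1):
    two constant rates (inside vs. outside the window) vs. one."""
    events = np.asarray(event_times)
    n_in = np.count_nonzero((events >= t0) & (events < t1))
    n_out = len(events) - n_in
    dur_in, dur_out = t1 - t0, T - (t1 - t0)
    ll_alt = 0.0
    for n, d in ((n_in, dur_in), (n_out, dur_out)):
        if n > 0:
            ll_alt += n * np.log(n / d) - n   # MLE rate n/d; rate*d = n
        # n == 0 contributes 0 at the MLE rate of 0
    return ll_alt - loglik_constant(len(events), T)

rng = np.random.default_rng(4)
T = 100.0
# background at rate ~1/unit time, thinned to 10% inside (40, 60)
times = np.sort(rng.uniform(0, T, 100))
times = times[(times < 40) | (times >= 60) | (rng.random(100) < 0.1)]
print(round(float(quiescence_lr(times, T, 40, 60)), 2),
      round(float(quiescence_lr(times, T, 0, 20)), 2))
```

The ratio is large at the true quiet window and near zero at a window with no rate change; calibrating its null distribution is the hard part the paper addresses.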
Ene-Obong, Henrietta Nkechi; Onuoha, Nne Ola; Eme, Paul Eze
2017-11-01
This study examined gender roles, family relationships, food security, and nutritional status of households in Ohafia: a matrilineal society in Nigeria. A cross-sectional descriptive study was conducted. A multistage sampling technique was used to select 287 households from three villages: Akanu, Amangwu, and Elu. Qualitative and quantitative data collection methods were adopted, namely, focus group discussions and questionnaires. Anthropometric measurements (height and weight for mothers and children and Mid-Upper Arm Circumference for young children) were taken using standard techniques. The body mass index of women was calculated. All nutritional indices were compared with reference standards. Food insecurity was assessed using the Household Hunger Scale and Dietary Diversity Score, then analysed using the Statistical Product and Service Solutions (SPSS) software, version 21. Data analysis used descriptive statistics. Most (91.2%) of the respondents were female. The matrilineal system known as ikwu nne or iri ala a nne (inheritance through mothers' lineage) is still in place but is changing. One important benefit of the system is the access to land by women. Whereas women participated actively in agriculture, food preparation, and care of family, the men were moving to off-farm activities. High prevalence of household food insecurity (66%) and signs of malnutrition including moderate to severe stunting (48.4%) and wasting (31.7%) in children, household hunger (34.5%), and overweight (27.5%) and obesity (19.2%) among mothers were observed. These communities urgently need gender sensitive food and nutrition interventions. © 2018 John Wiley & Sons Ltd.
Ghaibeh, A Ammar; Kasem, Asem; Ng, Xun Jin; Nair, Hema Latha Krishna; Hirose, Jun; Thiruchelvam, Vinesh
2018-01-01
The analysis of Electronic Health Records (EHRs) is attracting a lot of research attention in the medical informatics domain. Hospitals and medical institutes started to use data mining techniques to gain new insights from the massive amounts of data that can be made available through EHRs. Researchers in the medical field have often used descriptive statistics and classical statistical methods to prove assumed medical hypotheses. However, discovering new insights from large amounts of data solely based on experts' observations is difficult. Using data mining techniques and visualizations, practitioners can find hidden knowledge, identify interesting patterns, or formulate new hypotheses to be further investigated. This paper describes a work in progress on using data mining methods to analyze clinical data of Nasopharyngeal Carcinoma (NPC) cancer patients. NPC is the fifth most common cancer among Malaysians, and the data analyzed in this study was collected from three states in Malaysia (Kuala Lumpur, Sabah and Sarawak), and is considered to be the largest up-to-date dataset of its kind. This research is addressing the issue of cancer recurrence after the completion of radiotherapy and chemotherapy treatment. We describe the procedure, problems, and insights gained during the process.
Pabon, Peter; Ternström, Sten; Lamarche, Anick
2011-06-01
To describe a method for unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. A morphologic modeling technique, which is based on Fourier descriptors (FDs), is applied to the VRP contour. The technique, which essentially involves resampling of the curve of the contour, is assessed and also is compared to density-based VRP averaging methods that use the overlap count. VRP contours can be usefully described and compared using FDs. The method also permits the visualization of the local covariation along the contour average. For example, the FD-based analysis shows that the population variance for ensembles of VRP contours is usually smallest at the upper left part of the VRP. To illustrate the method's advantages and possible further application, graphs are given that compare the averaged contours from different authors and recording devices--for normal, trained, and untrained male and female voices as well as for child voices. The proposed technique allows any VRP shape to be brought to the same uniform base. On this uniform base, VRP contours or contour elements coming from a variety of sources may be placed within the same graph for comparison and for statistical analysis.
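The general Fourier-descriptor mechanism can be sketched with numpy's FFT (this illustrates FDs of a closed contour in general, not the authors' exact VRP parameterization): treat contour points as complex numbers, keep only low-order coefficients, and reconstruct a smoothed, uniformly resampled curve. The noisy ellipse here is a hypothetical stand-in for a measured contour:

```python
import numpy as np

def fourier_descriptors(x, y, n_keep=8):
    """Fourier descriptors of a closed contour: treat (x, y) as complex
    points, take the FFT, and keep only the n_keep lowest frequencies
    (positive and negative). This puts any contour on a uniform basis
    so that curves from different sources can be averaged and compared."""
    z = np.asarray(x) + 1j * np.asarray(y)
    Z = np.fft.fft(z)
    keep = np.zeros_like(Z)
    keep[:n_keep] = Z[:n_keep]       # DC and positive low frequencies
    keep[-n_keep:] = Z[-n_keep:]     # matching negative frequencies
    return keep, np.fft.ifft(keep)

theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
rng = np.random.default_rng(5)
x = 3 * np.cos(theta) + rng.normal(0, 0.05, 128)
y = 1 * np.sin(theta) + rng.normal(0, 0.05, 128)
coeffs, smooth = fourier_descriptors(x, y)
err = float(np.mean(np.abs(smooth - (x + 1j * y))))
print(round(err, 3))  # small: a few low-order FDs capture the shape
```

Averaging the retained coefficients across subjects then yields an average contour plus local covariation, which is the statistical use the abstract describes.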
Kurji, Zahra A; Sigal, Michael J; Andrews, Paul; Titley, Keith
2011-01-01
The purpose of this study was to assess the clinical and radiographic outcomes of a 1-minute application of full-strength Buckley's formocresol with concurrent hemostasis using the medicated cotton pledget in human primary teeth. Using a retrospective chart review, clinical and radiographic data were available for 557 primary molars in 320 patients. Descriptive statistics and survival analysis were used to assess outcomes. Overall clinical success, radiographic success, and cumulative 5-year survival rates were approximately 99%, 90%, and 87%, respectively. Internal root resorption (∼5%) and pulp canal obliteration (∼2%) were the most frequently observed radiographic failures. Thirty-nine teeth were extracted due to clinical and/or radiographic failure. Mandibular molars were 6 times more prone to radiographic failure than maxillary molars. Success rates for the modified technique are comparable to techniques that use the 5-minute diluted or full-strength solutions reported in the literature. This 1-minute full-strength formocresol technique is an acceptable alternative to published traditional techniques.
Statistics 101 for Radiologists.
Anvari, Arash; Halpern, Elkan F; Samir, Anthony E
2015-10-01
Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. © RSNA, 2015.
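The diagnostic-test quantities listed at the end follow directly from a 2x2 table; a minimal sketch with hypothetical counts:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Core diagnostic-test statistics from a 2x2 confusion table."""
    sens = tp / (tp + fn)                  # sensitivity
    spec = tn / (tn + fp)                  # specificity
    acc = (tp + tn) / (tp + fp + fn + tn)  # accuracy
    lr_pos = sens / (1 - spec)             # positive likelihood ratio
    lr_neg = (1 - sens) / spec             # negative likelihood ratio
    return sens, spec, acc, lr_pos, lr_neg

# hypothetical counts: 90 TP, 20 FP, 10 FN, 180 TN
sens, spec, acc, lr_pos, lr_neg = diagnostic_metrics(90, 20, 10, 180)
print(sens, spec, acc, round(lr_pos, 1), round(lr_neg, 3))
```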
Writing to Learn Statistics in an Advanced Placement Statistics Course
ERIC Educational Resources Information Center
Northrup, Christian Glenn
2012-01-01
This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…
The Greyhound Strike: Using a Labor Dispute to Teach Descriptive Statistics.
ERIC Educational Resources Information Center
Shatz, Mark A.
1985-01-01
A simulation exercise of a labor-management dispute is used to teach psychology students some of the basics of descriptive statistics. Using comparable data sets generated by the instructor, students work in small groups to develop a statistical presentation that supports their particular position in the dispute. (Author/RM)
Simultaneous binary hash and feature learning for image retrieval
NASA Astrophysics Data System (ADS)
Frantc, V. A.; Makov, S. V.; Voronin, V. V.; Marchuk, V. I.; Semenishchev, E. A.; Egiazarian, K. O.; Agaian, S.
2016-05-01
Content-based image retrieval systems have many applications in the modern world. The most important is image search by query image or by semantic description. Approaches to this problem are employed in personal photo-collection management systems, web-scale image search engines, medical systems, etc. Automatic analysis of large unlabeled image datasets is virtually impossible without a satisfactory image-retrieval technique. This is the main reason why this kind of automatic image processing has attracted so much attention in recent years. Despite considerable progress in the field, semantically meaningful image retrieval remains a challenging task. The main issue is the demand to provide reliable results in a short amount of time. This paper addresses the problem with a novel technique for simultaneous learning of global image features and binary hash codes. Our approach provides a mapping from a pixel-based image representation to a hash-value space while trying to preserve as much of the semantic image content as possible. We use a deep learning methodology to generate image descriptions with the properties of similarity preservation and statistical independence. The main advantage of our approach, in contrast to existing ones, is the ability to fine-tune the retrieval procedure for a very specific application, which allows us to provide better results than general techniques. The framework presented in the paper for data-dependent image hashing is based on the use of two different kinds of neural networks: convolutional neural networks for image description and an autoencoder for feature-to-hash-space mapping. Experimental results confirm that our approach shows promising results in comparison to other state-of-the-art methods.
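The similarity-preserving binary hashing the paper learns can be illustrated by a much simpler untrained scheme, sign-of-random-projection hashing (this is a generic stand-in, not the paper's CNN + autoencoder): nearby feature vectors tend to receive codes with small Hamming distance:

```python
import numpy as np

def random_projection_hash(features, n_bits, rng):
    """Binary codes from the signs of random projections: vectors with a
    small angle between them tend to agree on most bits."""
    planes = rng.normal(size=(features.shape[1], n_bits))
    return (features @ planes > 0).astype(np.uint8)

rng = np.random.default_rng(8)
base = rng.normal(size=(1, 64))            # an "image description" vector
near = base + rng.normal(0, 0.1, (1, 64))  # a visually similar image
far = rng.normal(size=(1, 64))             # an unrelated image

codes = random_projection_hash(np.vstack([base, near, far]), 64, rng)
d_near = int(np.count_nonzero(codes[0] != codes[1]))
d_far = int(np.count_nonzero(codes[0] != codes[2]))
print(d_near, d_far)   # the similar pair should be closer in Hamming space
```

Learned hashes replace the random planes with data-dependent projections, which is where the paper's networks come in.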
NASA Technical Reports Server (NTRS)
Kummerow, Christian; Giglio, Louis
1994-01-01
This paper describes a multichannel physical approach for retrieving rainfall and vertical structure information from satellite-based passive microwave observations. The algorithm makes use of statistical inversion techniques based upon theoretically calculated relations between rainfall rates and brightness temperatures. Potential errors introduced into the theoretical calculations by the unknown vertical distribution of hydrometeors are overcome by explicitly accounting for diverse hydrometeor profiles. This is accomplished by allowing for a number of different vertical distributions in the theoretical brightness temperature calculations and requiring consistency between the observed and calculated brightness temperatures. This paper will focus primarily on the theoretical aspects of the retrieval algorithm, which includes a procedure used to account for inhomogeneities of the rainfall within the satellite field of view as well as a detailed description of the algorithm as it is applied over both ocean and land surfaces. The residual error between observed and calculated brightness temperatures is found to be an important quantity in assessing the uniqueness of the solution. It is further found that the residual error is a meaningful quantity that can be used to derive expected accuracies from this retrieval technique. Examples comparing the retrieved results as well as the detailed analysis of the algorithm performance under various circumstances are the subject of a companion paper.
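A toy version of the inversion idea (every number here is synthetic; the real algorithm uses radiative-transfer calculations over many hydrometeor profiles): precompute brightness temperatures for candidate profiles, then retrieve by minimizing the residual between observed and calculated values, with the residual itself indicating fit quality:

```python
import numpy as np

rng = np.random.default_rng(6)
# toy database: 50 candidate rain-rate "profiles", each with brightness
# temperatures precomputed for 4 channels via a fake linear forward model
rain_rates = np.linspace(0.5, 25.0, 50)            # mm/h
channels = np.array([0.8, 1.5, -0.6, 2.1])         # invented sensitivities
tb_db = 250.0 + np.outer(rain_rates, channels)     # K

true_idx = 30
observed = tb_db[true_idx] + rng.normal(0, 0.5, 4) # noisy observation

# retrieval: pick the candidate minimizing the rms Tb residual
residuals = np.sqrt(((tb_db - observed) ** 2).mean(axis=1))
best = int(np.argmin(residuals))
print(best, round(float(residuals[best]), 2))
```

A small residual at a single candidate suggests a well-constrained solution; comparably small residuals across many candidates would flag non-uniqueness, matching the abstract's point.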
Rear-End Crashes: Problem Size Assessment And Statistical Description
DOT National Transportation Integrated Search
1993-05-01
Keywords: research and development (R&D); advanced vehicle control and safety systems (AVCSS); intelligent vehicle initiative (IVI). This document presents problem size assessments and statistical crash description for rear-end crashes, inc...
Statistics in the pharmacy literature.
Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R
2004-09-01
Research in statistical methods is essential for maintenance of high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-square (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.
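The tallying step in such surveys amounts to counting, per article, which terms appear; a minimal sketch with invented stand-in texts (real reviews use human coders, not substring matching):

```python
from collections import Counter

TERMS = ["mean", "standard deviation", "range", "percentage",
         "chi-square", "t-test", "anova", "logistic regression"]

# hypothetical stand-ins for article texts
articles = [
    "We report the mean and standard deviation of costs; a t-test compared groups.",
    "Percentage of patients responding was analysed with a chi-square test.",
    "Mean age and range are reported; logistic regression identified predictors.",
]

counts = Counter()
for text in articles:
    lower = text.lower()
    for term in TERMS:
        if term in lower:
            counts[term] += 1          # count articles, not occurrences

for term in TERMS:
    pct = 100.0 * counts[term] / len(articles)
    print(f"{term}: {counts[term]} articles ({pct:.0f}%)")
```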
ERIC Educational Resources Information Center
Perrett, Jamis J.
2012-01-01
This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…
2016-02-02
Descriptive Statistics for Enlisted Service Applicants and Accessions ... Summary Statistics for Applicants and Accessions for Enlisted Service ... Applicants and ... utilization among Soldiers screened using TAPAS. Section 2 of this report includes the descriptive statistics AMSARA compiles and publishes
Statistical description of tectonic motions
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
1991-01-01
The behavior of stochastic processes whose power spectra exhibit power-law behavior was studied. The details of the analysis and the conclusions that were reached are presented. This analysis was extended to compare detection capabilities of different measurement techniques (e.g., gravimetry and GPS for the vertical, and seismometers and GPS for the horizontal), both in general and for the specific case of the deformations produced by a dislocation in a half-space (which applies to seismic or preseismic sources). The time-domain behavior of power-law noises is also investigated.
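A power-law process of the kind studied here can be simulated by shaping the Fourier amplitudes of white Gaussian noise; the sketch below illustrates the idea (the function name and the amplitude-shaping convention are assumptions for illustration, not Agnew's code):

```python
import numpy as np

def power_law_noise(n, alpha, dt=1.0, seed=0):
    """Generate a series whose power spectrum falls off as f**(-alpha)
    by filtering white Gaussian noise in the frequency domain."""
    rng = np.random.default_rng(seed)
    spec = np.fft.rfft(rng.standard_normal(n))
    f = np.fft.rfftfreq(n, d=dt)
    scale = np.empty_like(f)
    scale[0] = 0.0                      # drop the mean (f = 0) term
    scale[1:] = f[1:] ** (-alpha / 2)   # amplitude ~ f^(-alpha/2) -> power ~ f^(-alpha)
    return np.fft.irfft(spec * scale, n)

x = power_law_noise(1024, alpha=2.0)    # alpha = 2 gives random-walk-like noise
```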
Bangdiwala, Shrikant I
2017-01-01
When studying the agreement between two observers rating the same n units into the same k discrete ordinal categories, Bangdiwala (1985) proposed using the "agreement chart" to visually assess agreement. This article proposes that often it is more interesting to focus on the patterns of disagreement and visually understanding the departures from perfect agreement. The article reviews the use of graphical techniques for descriptively assessing agreement and disagreements, and also reviews some of the available summary statistics that quantify such relationships.
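One of the summary statistics associated with Bangdiwala's agreement chart, the B statistic, can be computed directly from the k x k agreement table; a short sketch, assuming the standard definition B = Σ n_ii² / Σ (n_i+ · n_+i):

```python
def bangdiwala_b(table):
    """Bangdiwala's B from a k x k agreement table: the area of the
    dark (exact-agreement) squares over the area of the marginal
    rectangles in the agreement chart. B = 1 is perfect agreement."""
    k = len(table)
    diag = sum(table[i][i] ** 2 for i in range(k))
    rows = [sum(row) for row in table]
    cols = [sum(table[i][j] for i in range(k)) for j in range(k)]
    return diag / sum(rows[i] * cols[i] for i in range(k))

# two raters, three ordinal categories, all ratings on the diagonal
b = bangdiwala_b([[10, 0, 0], [0, 5, 0], [0, 0, 5]])
```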
Classification software technique assessment
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.
1976-01-01
A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.
C-statistic fitting routines: User's manual and reference guide
NASA Technical Reports Server (NTRS)
Nousek, John A.; Farwana, Vida
1991-01-01
The computer program discussed can read several input files and provide a best-fit set of values for the functions provided by the user, using either the C-statistic or the chi-squared statistic method. The program consists of one main routine and several functions and subroutines. Detailed descriptions of each function and subroutine are presented. A brief description of the C-statistic and the reason for its application is also presented.
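The C-statistic is the Poisson maximum-likelihood fit statistic used when counts are too sparse for chi-squared fitting; the sketch below implements one common (likelihood-ratio) form, which may differ in detail from the manual's exact definition:

```python
import math

def c_statistic(data, model):
    """Poisson likelihood-ratio fit statistic:
    C = 2 * sum_i [ m_i - d_i + d_i * ln(d_i / m_i) ],
    where d_i are observed counts and m_i > 0 are model predictions.
    C = 0 when the model reproduces the data exactly."""
    c = 0.0
    for d, m in zip(data, model):
        c += m - d
        if d > 0:                      # d * ln(d/m) -> 0 as d -> 0
            c += d * math.log(d / m)
    return 2.0 * c
```

Like chi-squared, smaller C means a better fit, but C remains well behaved for bins with zero counts.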
Wanlass, Paul W; Sikorski, David M; Kizhakkeveettil, Anupama; Tobias, Gene S
2018-03-12
To assess students' opinions of the potential influence of taking elective courses in chiropractic techniques and their future practice preferences. An anonymous, voluntary survey was conducted among graduating students from a doctor of chiropractic program. The survey included questions regarding the chiropractic technique elective courses they had completed and the potential influence of these courses on their chiropractic technique choices in future practice. Surveys were pretested for face validity, and data were analyzed using descriptive and inferential statistics. Of the 56 surveys distributed, 46 were completed, for a response rate of 82%. More than half of the students reported having taken at least 1 elective course in diversified technique (80%), Cox technique (76%), Activator Methods (70%), or sacro-occipital technique (63%). Less than half of the respondents reported taking technique elective courses in Gonstead or Thompson techniques. More than half of the students stated they were more likely to use Activator (72%), Thompson (68%), diversified (57%), or Cox (54%) techniques in their future practice after taking an elective course in that technique. Females stated that they were more likely to use Activator Methods ( p = .006) in future practice. Chiropractic technique elective courses in the doctor of chiropractic curriculum may influence students' choices of future practice chiropractic technique.
Validating Future Force Performance Measures (Army Class): Concluding Analyses
2016-06-01
Table 3.10. Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores ... Table 4.7. Descriptive Statistics for Analysis Criteria ... Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness
Review and classification of variability analysis techniques with clinical applications.
Bravi, Andrea; Longtin, André; Seely, Andrew J E
2011-10-10
Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
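As an illustration of two of the review's proposed domains, the snippet below computes one "statistical" measure (sample standard deviation) and one "informational" measure (histogram-based Shannon entropy) of a time series; the bin count is an arbitrary choice for the example:

```python
import math

def sd(xs):
    """Sample standard deviation: a 'statistical-domain' variability measure."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def shannon_entropy(xs, bins=8):
    """Histogram-based Shannon entropy in bits: an 'informational-domain'
    variability measure. A constant series has zero entropy."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / bins or 1.0      # guard: constant series
    counts = [0] * bins
    for x in xs:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in counts if c)
```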
Review and classification of variability analysis techniques with clinical applications
2011-01-01
PMID:21985357
[Introduction to Exploratory Factor Analysis (EFA)].
Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón
2012-03-01
Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. To present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use with a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation so as not to produce erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
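The factor-extraction step at the heart of EFA can be sketched as an eigen-decomposition of the correlation matrix (principal-component extraction, without the communality estimation and rotation steps a full EFA adds); the function name and simulated data are illustrative:

```python
import numpy as np

def efa_loadings(data, n_factors):
    """Extract unrotated factor loadings from the correlation matrix of
    `data` (rows = observations, columns = variables) by keeping the
    eigenvectors with the largest eigenvalues, scaled by sqrt(eigenvalue)."""
    r = np.corrcoef(data, rowvar=False)
    vals, vecs = np.linalg.eigh(r)              # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_factors]  # take the largest
    return vecs[:, order] * np.sqrt(vals[order])

# simulate 4 observed variables driven by one common factor
rng = np.random.default_rng(1)
g = rng.standard_normal((200, 1))
x = np.hstack([g + 0.3 * rng.standard_normal((200, 1)) for _ in range(4)])
load = efa_loadings(x, 1)                       # all 4 load on one factor
```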
An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin
Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least-squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90°N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5°C, but choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. As a result, this paper describes our library of least-squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.
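The regression method described here amounts to regressing each grid cell's local change on the global mean temperature change, so that local change ≈ pattern × ΔT_global; a toy sketch with invented variable names and data:

```python
import numpy as np

def regression_pattern(global_t, local_t):
    """Least-squares pattern scaling: for each grid cell, the slope of
    local change versus global mean temperature. `local_t` has shape
    (time, cells); returns one scaling coefficient per cell."""
    g = global_t - global_t.mean()
    l = local_t - local_t.mean(axis=0)
    return (g @ l) / (g @ g)

years = np.arange(30, dtype=float)
gmt = 0.02 * years                        # toy global warming trend (K)
local = np.outer(gmt, [1.5, 0.5])         # two cells warming at 1.5x and 0.5x
pat = regression_pattern(gmt, local)
```

The delta method would instead difference two epochs and divide by the global mean change; the regression form uses all time steps, which is why its errors tend to be smaller.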
An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology
Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin; ...
2017-05-15
Predicting Subsequent Myopia in Initially Pilot-Qualified USAFA Cadets.
1985-12-27
Refraction Measurement ... 4.0 RESULTS ... 4.1 Descriptive Statistics ... 4.2 Predictive Statistics ...mentioned), and three were missing a status. The data of the subject who was commissionable were dropped from the statistical analyses. Of the 91...relatively equal numbers of participants from all classes will become obvious within the results. 4.1 Descriptive Statistics: In the original plan
Evidence-based orthodontics. Current statistical trends in published articles in one journal.
Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J
2010-09-01
To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared to data from three previous years: 1975, 1985, and 2003. The frequency and distribution of statistics used in the AJODO original articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; the share of original articles using statistics stayed relatively stable from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5 percentage-point increase in articles using only descriptive statistics (from 7.1% to 15.6%), and an 8.5 percentage-point decrease in articles using inferential statistics (from 92.9% to 84.4%).
Developing an industry-oriented safety curriculum using the Delphi technique.
Chen, Der-Fa; Wu, Tsung-Chih; Chen, Chi-Hsiang; Chang, Shu-Hsuan; Yao, Kai-Chao; Liao, Chin-Wen
2016-09-01
In this study, we examined the development of industry-oriented safety degree curricula at a college level. Based on a review of literature on the practices and study of the development of safety curricula, we classified occupational safety and health curricula into the following three domains: safety engineering, health engineering, and safety and health management. We invited 44 safety professionals to complete a four-round survey that was designed using a modified Delphi technique. We used Chi-square statistics to test the panel experts' consensus on the significance of the items in the three domains and employed descriptive statistics to rank the participants' rating of each item. The results showed that the top three items for each of the three domains were Risk Assessment, Dangerous Machinery and Equipment, and Fire and Explosion Prevention for safety engineering; Ergonomics, Industrial Toxicology, and Health Risk Assessment for health engineering; and Industrial Safety and Health Regulations, Accident Investigation and Analysis, and Emergency Response for safety and health management. Only graduates from safety programmes who possess practical industry-oriented abilities can satisfy industry demands and provide value to the existence of college safety programmes.
Job Satisfaction DEOCS 4.1 Construct Validity Summary
2017-08-01
focuses more specifically on satisfaction with the job. Included is a review of the 4.0 description and items, followed by the proposed modifications to...the factor. The DEOCS 4.0 description provided for job satisfaction is “the perception of personal fulfillment in a specific vocation, and sense of...piloting items on the DEOCS; (4) examining the descriptive statistics, exploratory factor analysis results, and aggregation statistics; and (5
[Experience in the surgical treatment of paranasal sinus mucoceles in a university hospital].
Waizel-Haiat, Salomón; Díaz-Lara, Ivette Margarita; Vargas-Aguayo, Alejandro Martin; Santiago-Cordova, Jorge Luis
Mucoceles are benign cystic lesions of the paranasal sinuses. Endoscopic marsupialisation is considered the first choice of treatment, due to its low morbidity and recurrence rates. To establish the number of patients with recurrence, who were diagnosed clinically or by computed tomography, and who were submitted to surgery in the Ear, Nose and Throat Unit in a tertiary university hospital. A clinical, cross-sectional, descriptive, observational and retrospective study was conducted on patients with a mucocele diagnosis operated on in the period from January 2006 to December 2013. A descriptive statistical analysis was performed to obtain the frequencies, ratios and proportions. Measures of central tendency and dispersion were obtained. The recurrence rates of each surgical technique were compared using the Chi-squared test. Of the 59 patients included in the study, 39 were female and 20 were male. The most common location was in the maxillary sinus (22 patients) followed by frontoethmoidal (20 patients). The recurrence rate was 9% among those submitted to a surgical procedure. The endoscopic approach was used in 51 patients, 8 cases were combined (open plus endoscopic), and there was no open approach. Recurrence occurred in 7 of the 51 patients with endoscopic surgery, and in 1 of the 8 patients with the combined technique. No statistically significant relationship was found between the type of surgery and recurrence, or between the presence or absence of a predisposing factor and recurrence. Copyright © 2016 Academia Mexicana de Cirugía A.C. Publicado por Masson Doyma México S.A. All rights reserved.
NASA Technical Reports Server (NTRS)
Davis, B. J.; Feiveson, A. H.
1975-01-01
Results are presented of CITARS data processing in raw form. Tables of descriptive statistics are given along with descriptions and results of inferential analyses. The inferential results are organized by questions which CITARS was designed to answer.
Alar-columellar and lateral nostril changes following tongue-in-groove rhinoplasty.
Shah, Ajul; Pfaff, Miles; Kinsman, Gianna; Steinbacher, Derek M
2015-04-01
Repositioning the medial crura cephalically onto the caudal septum (tongue-in-groove; TIG) allows alteration of the columella, ala, and nasal tip to address alar-columellar disproportion as seen from the lateral view. To date, quantitative analysis of nostril dimension, alar-columellar relationship, and nasal tip changes following the TIG rhinoplasty technique has not been described. The present study aims to evaluate post-operative lateral morphometric changes following TIG. Pre- and post-operative lateral views of a series of consecutive patients who underwent TIG rhinoplasty were produced from 3D images at multiple time points (≤2 weeks, 4-10 weeks, and >10 weeks post-operatively) for analysis. The 3D images were converted to 2D and set to scale. Exposed lateral nostril area, alar-columellar disproportion (divided into superior and inferior heights), nasolabial angle, nostril height, and nostril length were calculated and statistically analyzed using a pairwise t test. A P ≤ 0.05 was considered statistically significant. Ninety-four lateral views were analyzed from 20 patients (16 females; median age: 31.8). One patient had a history of current tobacco cigarette use. Lateral nostril area decreased at all time points post-operatively, in a statistically significant fashion. Alar-columellar disproportion was reduced following TIG at all time points. The nasolabial angle increased post-operatively at ≤2 weeks, 4-10 weeks, and >10 weeks, all in a statistically significant fashion. Nostril height and nostril length decreased at all post-operative time points. Morphometric analysis reveals reduction in alar-columellar disproportion and lateral nostril show following TIG rhinoplasty. Tip rotation, as a function of nasolabial angle, also increased. These results provide quantitative substantiation for qualitative descriptions attributed to the TIG technique. Future studies will focus on area and volumetric measurements, and assessment of long-term stability.
Prison Radicalization: The New Extremist Training Grounds?
2007-09-01
distributing and collecting survey data , and the data analysis. The analytical methodology includes descriptive and inferential statistical methods, in... statistical analysis of the responses to identify significant correlations and relationships. B. SURVEY DATA COLLECTION To effectively access a...Q18, Q19, Q20, and Q21. Due to the exploratory nature of this small survey, data analyses were confined mostly to descriptive statistics and
A heuristic statistical stopping rule for iterative reconstruction in emission tomography.
Ben Bouallègue, F; Crouzet, J F; Mariano-Goulart, D
2013-01-01
We propose a statistical stopping criterion for iterative reconstruction in emission tomography based on a heuristic statistical description of the reconstruction process. The method was assessed for MLEM reconstruction. Based on Monte-Carlo numerical simulations and using a perfectly modeled system matrix, our method was compared with classical iterative reconstruction followed by low-pass filtering in terms of Euclidian distance to the exact object, noise, and resolution. The stopping criterion was then evaluated with realistic PET data of a Hoffman brain phantom produced using the GATE platform for different count levels. The numerical experiments showed that compared with the classical method, our technique yielded significant improvement of the noise-resolution tradeoff for a wide range of counting statistics compatible with routine clinical settings. When working with realistic data, the stopping rule allowed a qualitatively and quantitatively efficient determination of the optimal image. Our method appears to give a reliable estimation of the optimal stopping point for iterative reconstruction. It should thus be of practical interest as it produces images with similar or better quality than classical post-filtered iterative reconstruction with a mastered computation time.
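MLEM itself has a compact multiplicative update; the sketch below pairs it with a simple relative-change stopping rule as a stand-in for the paper's heuristic statistical criterion, which is not reproduced here:

```python
import numpy as np

def mlem(A, y, n_iter=50, tol=1e-6):
    """MLEM reconstruction sketch for emission tomography:
    x <- x * A.T(y / A x) / A.T(1), starting from a uniform image.
    Stops when the relative image change falls below `tol`
    (a generic rule, not the paper's criterion)."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])          # sensitivity image A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)  # guard against division by zero
        x_new = x * (A.T @ ratio) / sens
        if np.abs(x_new - x).sum() / np.abs(x).sum() < tol:
            return x_new
        x = x_new
    return x

# tiny noiseless system: 3 detector bins, 2 image pixels
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
x_true = np.array([2.0, 4.0])
x_hat = mlem(A, A @ x_true)
```

With noisy data the iterates first sharpen and then amplify noise, which is why a well-chosen stopping point matters.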
Laser amplification of incoherent radiation
NASA Technical Reports Server (NTRS)
Menegozzi, L. N.; Lamb, W. E., Jr.
1978-01-01
The amplification of noise in a laser amplifier is treated theoretically. The model for the active medium and its description using density-matrix techniques are taken from the theory of laser operation. The spectral behavior of the radiation in the nonlinear regime is studied and the formalism is written from the onset in the frequency domain. The statistics of the light are gradually modified by the nonlinear amplification process, and expressions are derived for the rate of change of fluctuations in intensity as a measure of statistical changes. In addition, the range of validity of Litvak's Gaussian-statistics approximation is discussed. In the homogeneous-broadening case, the evolution of initially broadband Gaussian radiation toward quasimonochromatic oscillations with laserlike statistics is explored in several numerical examples. The connections of this study with the time-domain work on self-pulsing in a ring-laser configuration are established. Finally, spectral-narrowing and -rebroadening effects in Doppler-broadened media are discussed both analytically and with numerical examples. These examples show the distinct contribution of pulsations in the population ('Raman-type terms'), and saturation phenomena.
Sikorski, David M.; KizhakkeVeettil, Anupama; Tobias, Gene S.
2016-01-01
Objective: Surveys for the National Board of Chiropractic Examiners indicate that diversified chiropractic technique is the most commonly used chiropractic manipulation method. The study objective was to investigate the influences of our diversified core technique curriculum, a technique survey course, and extracurricular technique activities on students' future practice technique preferences. Methods: We conducted an anonymous, voluntary survey of 1st, 2nd, and 3rd year chiropractic students at our institution. Surveys were pretested for face validity, and data were analyzed using descriptive and inferential statistics. Results: We had 164 students (78% response rate) participate in the survey. Diversified was the most preferred technique for future practice by students, and more than half who completed the chiropractic technique survey course reported changing their future practice technique choice as a result. The students surveyed agreed that the chiropractic technique curriculum and their experiences with chiropractic practitioners were the two greatest bases for their current practice technique preference, and that their participation in extracurricular technique clubs and seminars was less influential. Conclusions: Students appear to have the same practice technique preferences as practicing chiropractors. The chiropractic technique curriculum and the students' experience with chiropractic practitioners seem to have the greatest influence on their choice of chiropractic technique for future practice. Extracurricular activities, including technique clubs and seminars, although well attended, showed a lesser influence on students' practice technique preferences. PMID:26655282
Sikorski, David M; KizhakkeVeettil, Anupama; Tobias, Gene S
2016-03-01
Surveys for the National Board of Chiropractic Examiners indicate that diversified chiropractic technique is the most commonly used chiropractic manipulation method. The study objective was to investigate the influences of our diversified core technique curriculum, a technique survey course, and extracurricular technique activities on students' future practice technique preferences. We conducted an anonymous, voluntary survey of 1st, 2nd, and 3rd year chiropractic students at our institution. Surveys were pretested for face validity, and data were analyzed using descriptive and inferential statistics. We had 164 students (78% response rate) participate in the survey. Diversified was the most preferred technique for future practice by students, and more than half who completed the chiropractic technique survey course reported changing their future practice technique choice as a result. The students surveyed agreed that the chiropractic technique curriculum and their experiences with chiropractic practitioners were the two greatest bases for their current practice technique preference, and that their participation in extracurricular technique clubs and seminars was less influential. Students appear to have the same practice technique preferences as practicing chiropractors. The chiropractic technique curriculum and the students' experience with chiropractic practitioners seem to have the greatest influence on their choice of chiropractic technique for future practice. Extracurricular activities, including technique clubs and seminars, although well attended, showed a lesser influence on students' practice technique preferences.
NASA Technical Reports Server (NTRS)
Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.
1990-01-01
This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running PC-DOS/MS-DOS or IBM/MS-OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of shape and scale parameters of the two-parameter Weibull distribution using the least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants, are also described.
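The least-squares estimation of the two-parameter Weibull mentioned here linearizes the cumulative distribution, ln(-ln(1 - F)) = m ln(s) - m ln(s0), and regresses on ranked strengths; a sketch using median-rank plotting positions (one common convention; PC-CARES's exact procedure may differ):

```python
import math

def weibull_lsq(strengths):
    """Fit shape m and scale s0 of a two-parameter Weibull to fracture
    strengths by least squares on the linearized CDF, with median-rank
    probability estimates F_i = (i - 0.3) / (n + 0.4)."""
    s = sorted(strengths)
    n = len(s)
    xs = [math.log(v) for v in s]
    ys = [math.log(-math.log(1 - (i + 1 - 0.3) / (n + 0.4))) for i in range(n)]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))     # slope = Weibull modulus
    s0 = math.exp(xbar - ybar / m)               # intercept gives the scale
    return m, s0
```

Maximum likelihood, the other method the manual covers, instead solves the Weibull likelihood equations iteratively and is generally preferred for censored samples.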
[Health-related behavior in a sample of Brazilian college students: gender differences].
Colares, Viviane; Franca, Carolina da; Gonzalez, Emília
2009-03-01
This study investigated whether undergraduate students' health-risk behaviors differed according to gender. The sample consisted of 382 subjects, aged 20-29 years, from public universities in Pernambuco State, Brazil. Data were collected using the National College Health Risk Behavior Survey, previously validated in Portuguese. Descriptive and inferential statistical techniques were used. Associations were analyzed with the chi-square test or Fisher's exact test. Statistical significance was set at p ≤ 0.05. In general, females engaged in the following risk behaviors less frequently than males: alcohol consumption (p = 0.005), smoking (p = 0.002), experimenting with marijuana (p = 0.002), consumption of inhalants (p ≤ 0.001), steroid use (p = 0.003), carrying weapons (p = 0.001), and involvement in physical fights (p = 0.014). Meanwhile, female students displayed more concern about losing or maintaining weight, although they exercised less frequently than males. The findings thus showed statistically different health behaviors between genders. In conclusion, different approaches need to be used for the two genders.
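For a 2x2 table such as gender versus engaging in a risk behavior, the chi-square test used in these comparisons reduces to a short computation; a self-contained sketch (the table values are illustrative, not the study's data):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic for a 2x2 table [[a, b], [c, d]],
    comparing observed cell counts with the counts expected under
    independence of rows and columns."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    chi2 = 0.0
    for obs, r, col in ((a, row1, col1), (b, row1, col2),
                        (c, row2, col1), (d, row2, col2)):
        exp = r * col / n               # expected count under independence
        chi2 += (obs - exp) ** 2 / exp
    return chi2
```

When expected counts are small (conventionally below 5), Fisher's exact test is preferred, as in the study above.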
NASA Astrophysics Data System (ADS)
Niedermeier, Dennis; Ervens, Barbara; Clauss, Tina; Voigtländer, Jens; Wex, Heike; Hartmann, Susan; Stratmann, Frank
2014-01-01
In a recent study, the Soccer ball model (SBM) was introduced for modeling and/or parameterizing heterogeneous ice nucleation processes. The model applies classical nucleation theory. It allows for a consistent description of both apparently singular and stochastic ice nucleation behavior, by distributing contact angles over the nucleation sites of a particle population assuming a Gaussian probability density function. The original SBM utilizes the Monte Carlo technique, which hampers its usage in atmospheric models, as fairly time-consuming calculations must be performed to obtain statistically significant results. Thus, we have developed a simplified and computationally more efficient version of the SBM. We successfully used the new SBM to parameterize experimental nucleation data of, e.g., bacterial ice nucleation. Both SBMs give identical results; however, the new model is computationally less expensive as confirmed by cloud parcel simulations. Therefore, it is a suitable tool for describing heterogeneous ice nucleation processes in atmospheric models.
NASA Astrophysics Data System (ADS)
Busthanul, N.; Lumoindong, Y.; Syafiuddin, M.; Heliawaty; Lanuhu, N.; Ibrahim, T.; Ambrosius, R. R.
2018-05-01
Farmers' attitudes and perceptions may underlie the ineffective implementation of conservation farming for agricultural sustainability, since the application of conservation techniques varies widely. The purpose of this research was to determine farmers' attitudes and perceptions toward the application of conservation techniques, and the correlation between the two. The research was carried out in Kanreapia Village, Tombolo Pao District, Gowa Regency, South Sulawesi Province, Indonesia. A random sample of 30 farmers was drawn; the analysis used non-parametric statistics with a quantitative and qualitative descriptive approach, based on a Likert scale. The results showed that the conservation technique rated highest (appropriate) in farmers' attitudes and perceptions was seasonal crop rotation, while the lowest (less appropriate) was tilling the land along the contour and planting accordingly. There is a very strong relationship between farmer attitude and perception. The findings imply that the implementation of conservation farming techniques should be improved by improving perceptions.
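A minimal sketch of a non-parametric correlation between Likert-scale attitude and perception scores, using Spearman's rank correlation from scipy. The scores below are hypothetical, and the abstract does not specify which non-parametric statistic the authors used:

```python
from scipy.stats import spearmanr

# Hypothetical 5-point Likert scores for 10 farmers (not the study's data)
attitude   = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]
perception = [4, 5, 2, 4, 3, 5, 4, 2, 4, 4]

rho, p = spearmanr(attitude, perception)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```

Spearman's rho is rank-based, so it tolerates the ordinal (rather than interval) nature of Likert responses and handles ties.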
2016-11-15
participants who were followed for the development of back pain for an average of 3.9 years. Methods. Descriptive statistics and longitudinal... health, military personnel, occupational health, outcome assessment, statistics, survey methodology. Level of Evidence: 3. Spine 2016;41:1754-1763... based on the National Health and Nutrition Examination Survey.21 Statistical Analysis. Descriptive and univariate analyses compared characteristics
Rebuilding Government Legitimacy in Post-conflict Societies: Case Studies of Nepal and Afghanistan
2015-09-09
administered via the verbal scales due to reduced time spent explaining the visual show cards. Statistical results corresponded with observations from...a three-step strategy for dealing with item non-response. First, basic descriptive statistics are calculated to determine the extent of item...descriptive statistics for all items in the survey), however this section of the report highlights just some of the findings. Thus, the results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre
Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ2 independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference from moment-based statistics (which we discussed in [1]), where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
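The derived statistics listed above can all be read directly off a contingency table. A minimal (serial) numpy sketch on a hypothetical 2x2 table of counts:

```python
import numpy as np

# Hypothetical contingency table of counts for two binary variables
counts = np.array([[40.0, 10.0],
                   [20.0, 30.0]])
n = counts.sum()

joint = counts / n                      # joint probability P(x, y)
px = joint.sum(axis=1, keepdims=True)   # marginal P(x)
py = joint.sum(axis=0, keepdims=True)   # marginal P(y)

# Point-wise mutual information per cell (log2 scale; all cells non-zero here)
pmi = np.log2(joint / (px * py))

# Joint information entropy H(X, Y)
entropy = -np.sum(joint * np.log2(joint))

# Pearson chi-square statistic for independence (no continuity correction)
expected = px * py * n
chi2 = np.sum((counts - expected) ** 2 / expected)

print(f"H(X,Y) = {entropy:.3f} bits, chi2 = {chi2:.2f}")
# → H(X,Y) = 1.846 bits, chi2 = 16.67
```

In the paper's distributed setting, each processor would hold a partial table of counts; the tables are summed across processors before these statistics are derived, which is exactly the communication step the abstract discusses.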
Assuring reliability program effectiveness.
NASA Technical Reports Server (NTRS)
Ball, L. W.
1973-01-01
An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.
The method of expected number of deaths, 1786-1886-1986.
Keiding, N
1987-04-01
"The method of expected number of deaths is an integral part of standardization of vital rates, which is one of the oldest statistical techniques. The expected number of deaths was calculated in 18th century actuarial mathematics...but the method seems to have been forgotten, and was reinvented in connection with 19th century studies of geographical and occupational variations of mortality.... It is noted that standardization of rates is intimately connected to the study of relative mortality, and a short description of very recent developments in the methodology of that area is included." (Summary in French) excerpt
DOE Office of Scientific and Technical Information (OSTI.GOV)
KL Gaustad; DD Turner
2007-09-30
This report provides a short description of the Atmospheric Radiation Measurement (ARM) microwave radiometer (MWR) RETrieval (MWRRET) Value-Added Product (VAP) algorithm. This algorithm utilizes complementary physical and statistical retrieval methods and applies brightness temperature offsets to reduce spurious liquid water path (LWP) bias in clear skies, resulting in significantly improved precipitable water vapor (PWV) and LWP retrievals. We present a general overview of the technique, input parameters, and output products, and describe data quality checks. A more complete discussion of the theory and results is given in Turner et al. (2007b).
Applying Descriptive Statistics to Teaching the Regional Classification of Climate.
ERIC Educational Resources Information Center
Lindquist, Peter S.; Hammel, Daniel J.
1998-01-01
Describes an exercise for college and high school students that relates descriptive statistics to the regional climatic classification. The exercise introduces students to simple calculations of central tendency and dispersion, the construction and interpretation of scatterplots, and the definition of climatic regions. Forces students to engage…
de Sá, Joceline Cássia Ferezini; Marini, Gabriela; Gelaleti, Rafael Bottaro; da Silva, João Batista; de Azevedo, George Gantas; Rudge, Marilza Vieira Cunha
2013-11-01
To evaluate the evolution of the methodological and statistical design of publications in the Brazilian Journal of Gynecology and Obstetrics (RBGO) since resolution 196/96. A review of 133 articles published in 1999 (65) and 2009 (68) was performed by two independent reviewers trained in clinical epidemiology and scientific research methodology. We included all original clinical articles and case and series reports, and excluded editorials, letters to the editor, systematic reviews, experimental studies, opinion articles, and abstracts of theses and dissertations. Characteristics related to the methodological quality of the studies were analyzed in each article using a checklist that evaluated two criteria: methodological aspects and statistical procedures. We used descriptive statistics and the χ2 test to compare the two years. There was a difference between 1999 and 2009 in study and statistical design, with more accurate procedures and more robust tests in the later year. In RBGO, we observed an evolution in the methods of published articles and a more in-depth use of statistical analyses, with more sophisticated tests such as regression and multilevel analyses, which are essential techniques for understanding and planning health interventions, leading to fewer interpretation errors.
Statistics in three biomedical journals.
Pilcík, T
2003-01-01
In this paper we analyze the use of statistics, and the associated problems, in three Czech biological journals in the year 2000. We investigated 23 articles in Folia Biologica, 60 articles in Folia Microbiologica, and 88 articles in Physiological Research. Among publications with statistical content, descriptive statistics and the t-test were used most frequently. The most common mistakes were failure to name the statistical software used and insufficient description of the data. We compared our results with those of similar studies of other medical journals. The use of the important statistical methods is comparable with that in most medical journals, and the proportion of articles in which the applied method is insufficiently described is moderately low.
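A minimal sketch of the pattern the survey found most common, descriptive statistics reported alongside a two-sample t-test, using scipy on synthetic data (the groups and values below are invented):

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

# Hypothetical measurements for two groups (invented data)
control = rng.normal(5.0, 1.0, 30)
treated = rng.normal(5.8, 1.0, 30)

# Report the descriptive statistics alongside the test, and name the
# software used, to avoid the reporting gaps noted above
for name, x in (("control", control), ("treated", treated)):
    print(f"{name}: n={x.size}, mean={x.mean():.2f}, sd={x.std(ddof=1):.2f}")

t, p = ttest_ind(control, treated)
print(f"two-sample t-test: t = {t:.2f}, p = {p:.4f}")
```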
An open-access CMIP5 pattern library for temperature and precipitation: description and methodology
NASA Astrophysics Data System (ADS)
Lynch, Cary; Hartin, Corinne; Bond-Lamberty, Ben; Kravitz, Ben
2017-05-01
Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90° N/S). Bias and mean errors between modeled and pattern-predicted output were smaller for the linear regression method than for the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5 °C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. This paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns. The dataset and netCDF data generation code are available at doi:10.5281/zenodo.495632.
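The two pattern-generation methods can be sketched for a single grid cell with synthetic data: the delta method divides an epoch-mean local change by the corresponding global mean change, while the regression method fits a least squares slope of local on global temperature. This is an illustration of the two estimators only, not the paper's CMIP5 processing:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic example: local annual temperature at one grid cell that scales
# with global mean temperature plus noise (illustration, not CMIP5 data)
years = 100
t_global = np.linspace(0.0, 3.0, years)             # global mean warming (K)
t_local = 1.4 * t_global + rng.normal(0, 0.3, years)

# Delta method: (epoch-mean difference in local T) / (difference in global T)
base, future = slice(0, 20), slice(80, 100)
pattern_delta = (t_local[future].mean() - t_local[base].mean()) / \
                (t_global[future].mean() - t_global[base].mean())

# Least squares regression method: slope of local T against global mean T
pattern_reg = np.polyfit(t_global, t_local, 1)[0]

print(f"delta pattern = {pattern_delta:.2f}, "
      f"regression pattern = {pattern_reg:.2f}")
```

Because the regression slope uses all years rather than two epoch means, it averages over more of the internal variability, which is consistent with the smaller errors the paper reports for the regression method.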
Unlawful Discrimination DEOCS 4.1 Construct Validity Summary
2017-08-01
Included is a review of the 4.0 description and items, followed by the proposed modifications to the factor. The current DEOCS (4.0) contains multiple...Officer (E7 – E9) 586 10.8% Junior Officer (O1 – O3) 474 9% Senior Officer (O4 and above) 391 6.1% Descriptive Statistics and Reliability This section...displays descriptive statistics for the items on the Unlawful Discrimination scale. All items had a range from 1 to 7 (strongly disagree to strongly
Shahi, Shahriar; Ghasemi, Negin; Rahimi, Saeed; Yavari, Hamidreza; Janani, Maryam; Mokhtari, Hadi; Bahari, Mahmood; Rabbani, Parastu
2015-01-01
The aim of the present study was to evaluate the effect of different mixing techniques (conventional, amalgamator and ultrasonic mixing) on the physical properties [working time (WT), setting time (ST), dimensional changes (DC) and film thickness (FT)] of calcium-enriched mixture (CEM) cement and mineral trioxide aggregate (MTA). The physical properties were determined using the ISO 6876:2001 specification. Six samples of each material were prepared for each of the three mixing techniques (36 samples in total). Data were analyzed using descriptive statistics, two-way ANOVA and post hoc Tukey tests. The level of significance was set at 0.05. Irrespective of mixing technique, there was no significant difference between the WT and FT of the tested materials. Except for the DC of MTA and the FT of all the materials, the other properties were significantly affected by the mixing technique (P<0.05). The ultrasonic technique decreased the ST of MTA and CEM cement and increased the WT of CEM cement (P<0.05). The mixing technique had no significant effect on the dimensional changes of MTA or the film thickness of either material.
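As a hedged sketch of this kind of comparison (the study itself used two-way ANOVA with post hoc Tukey tests), a one-way ANOVA across the three mixing techniques can be run with scipy on invented setting-time data:

```python
from scipy.stats import f_oneway

# Hypothetical setting times (minutes) for one material under three mixing
# techniques, six samples each as in the study's design; values are invented
conventional = [52, 55, 50, 53, 54, 51]
amalgamator  = [48, 47, 50, 49, 46, 48]
ultrasonic   = [41, 43, 40, 42, 44, 41]

f, p = f_oneway(conventional, amalgamator, ultrasonic)
print(f"F = {f:.2f}, p = {p:.6f}")
# p below the 0.05 threshold would indicate the technique affects setting time
```

A significant omnibus F would then be followed by pairwise post hoc comparisons (e.g., Tukey's HSD) to locate which techniques differ.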
Descriptive Statistics of the Genome: Phylogenetic Classification of Viruses.
Hernandez, Troy; Yang, Jie
2016-10-01
The typical process for classifying and submitting a newly sequenced virus to the NCBI database involves two steps. First, a BLAST search is performed to determine likely family candidates. That is followed by checking the candidate families with the pairwise sequence alignment tool for similar species. The submitter's judgment is then used to determine the most likely species classification. The aim of this article is to show that this process can be automated into a fast, accurate, one-step process using the proposed alignment-free method and properly implemented machine learning techniques. We present a new family of alignment-free vectorizations of the genome, the generalized vector, that maintains the speed of existing alignment-free methods while outperforming all available methods. This new alignment-free vectorization uses the frequency of genomic words (k-mers), as is done in the composition vector, and incorporates descriptive statistics of those k-mers' positional information, as inspired by the natural vector. We analyze five different characterizations of genome similarity using k-nearest neighbor classification and evaluate these on two collections of viruses totaling over 10,000 viruses. We show that our proposed method performs better than, or as well as, other methods at every level of the phylogenetic hierarchy. The data and R code are available upon request.
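One way the described combination of k-mer frequencies with positional descriptive statistics might be sketched is below; the function is an illustration of the idea, not the paper's exact generalized-vector construction:

```python
import itertools
import numpy as np

def generalized_vector(seq, k=2):
    """k-mer frequencies plus mean/std of each k-mer's positions
    (a sketch of the idea, not the paper's exact construction)."""
    kmers = ["".join(p) for p in itertools.product("ACGT", repeat=k)]
    positions = {km: [] for km in kmers}
    for i in range(len(seq) - k + 1):
        km = seq[i:i + k]
        if km in positions:
            positions[km].append(i)
    feats = []
    n = len(seq) - k + 1
    for km in kmers:
        pos = np.array(positions[km], dtype=float)
        freq = len(pos) / n
        mean = pos.mean() / len(seq) if len(pos) else 0.0
        std = pos.std() / len(seq) if len(pos) else 0.0
        feats.extend([freq, mean, std])
    return np.array(feats)

v = generalized_vector("ACGTACGTTTACGA", k=2)
print(v.shape)  # 16 dimers x 3 statistics = (48,)
```

Genomes vectorized this way can be compared with ordinary Euclidean distance, so a standard k-nearest neighbor classifier (as used in the paper) applies directly.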
Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs
ERIC Educational Resources Information Center
Carr, Nathan T.
2008-01-01
Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…
Self-Esteem and Academic Achievement of High School Students
ERIC Educational Resources Information Center
Moradi Sheykhjan, Tohid; Jabari, Kamran; Rajeswari, K.
2014-01-01
The primary purpose of this study was to determine the influence of self-esteem on academic achievement among high school students in Miandoab City of Iran. The methodology of the research is descriptive and correlation that descriptive and inferential statistics were used to analyze the data. Statistical Society includes male and female high…
A Statistical Description of Neural Ensemble Dynamics
Long, John D.; Carmena, Jose M.
2011-01-01
The growing use of multi-channel neural recording techniques in behaving animals has produced rich datasets that hold immense potential for advancing our understanding of how the brain mediates behavior. One limitation of these techniques is they do not provide important information about the underlying anatomical connections among the recorded neurons within an ensemble. Inferring these connections is often intractable because the set of possible interactions grows exponentially with ensemble size. This is a fundamental challenge one confronts when interpreting these data. Unfortunately, the combination of expert knowledge and ensemble data is often insufficient for selecting a unique model of these interactions. Our approach shifts away from modeling the network diagram of the ensemble toward analyzing changes in the dynamics of the ensemble as they relate to behavior. Our contribution consists of adapting techniques from signal processing and Bayesian statistics to track the dynamics of ensemble data on time-scales comparable with behavior. We employ a Bayesian estimator to weigh prior information against the available ensemble data, and use an adaptive quantization technique to aggregate poorly estimated regions of the ensemble data space. Importantly, our method is capable of detecting changes in both the magnitude and structure of correlations among neurons missed by firing rate metrics. We show that this method is scalable across a wide range of time-scales and ensemble sizes. Lastly, the performance of this method on both simulated and real ensemble data is used to demonstrate its utility. PMID:22319486
Nieri, Michele; Clauser, Carlo; Franceschi, Debora; Pagliaro, Umberto; Saletta, Daniele; Pini-Prato, Giovanpaolo
2007-08-01
The aim of the present study was to investigate the relationships among reported methodological, statistical, clinical and paratextual variables of randomized clinical trials (RCTs) in implant therapy, and their influence on subsequent research. The material consisted of the RCTs in implant therapy published through the end of the year 2000. Methodological, statistical, clinical and paratextual features of the articles were assessed and recorded. The perceived clinical relevance was subjectively evaluated by an experienced clinician on anonymous abstracts. The impact on research was measured by the number of citations found in the Science Citation Index. A new statistical technique (Structural learning of Bayesian Networks) was used to assess the relationships among the considered variables. Descriptive statistics revealed that the reported methodology and statistics of RCTs in implant therapy were defective. Follow-up of the studies was generally short. The perceived clinical relevance appeared to be associated with the objectives of the studies and with the number of published images in the original articles. The impact on research was related to the nationality of the involved institutions and to the number of published images. RCTs in implant therapy (until 2000) show important methodological and statistical flaws and may not be appropriate for guiding clinicians in their practice. The methodological and statistical quality of the studies did not appear to affect their impact on practice and research. Bayesian Networks suggest new and unexpected relationships among the methodological, statistical, clinical and paratextual features of RCTs.
NASA Astrophysics Data System (ADS)
Jin, Zhenyu; Lin, Jing; Liu, Zhong
2008-07-01
By studying the classical techniques (such as the Shack-Hartmann wave-front sensor) adopted for testing the aberrations of ground-based astronomical optical telescopes, we put forward two testing methods founded on high-resolution image reconstruction: one based on the averaged short-exposure OTF and the other on Antoine Labeyrie's speckle interferometric OTF. Research by J. Ohtsubo, F. Roddier, Richard Barakat and J.-Y. Zhang indicated that the speckle interferometric transfer function (SITF) statistics are affected by the telescope's optical aberrations; that is, the SITF statistics are a function of the optical system aberration and the atmospheric Fried parameter (seeing). Diffraction-limited information about the telescope can be obtained through two statistical treatments of abundant speckle images: with the first method, we can extract low-frequency information such as the full width at half maximum (FWHM) of the telescope PSF to estimate the optical quality; with the second, we can obtain a more precise description of the telescope PSF, including high-frequency information. We will apply the two testing methods to the 2.4 m optical telescope of the GMG Observatory in China to validate their repeatability and correctness, and compare the results with those obtained from the Shack-Hartmann wave-front sensor. This is described in detail in our paper.
NASA Astrophysics Data System (ADS)
Haviz, M.
2018-04-01
The purpose of this article is to design and develop an interactive CD on spermatogenesis. This is a research and development study. The development procedure consisted of outlining the media program, making a flowchart, making a storyboard, gathering materials, programming, and finishing. The quantitative data obtained were analyzed by descriptive statistics; the qualitative data were analyzed with Miles and Huberman techniques. The instrument used was a validation sheet. The CD, designed with the Macromedia Flash MX program, comprises 17 generated slides. The prototype obtained a valid rating after a self-review technique with many revisions, especially to sound and programming. This finding suggests that process-oriented spermatogenesis can be audio-visualized into a more comprehensive form of learning media, but the interactive CD product needs further testing to determine its consistency and resistance to revisions.
Implementation of a new algorithm for Density Equalizing Map Projections (DEMP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Close, E.R.; Merrill, D.W.; Holmes, H.H.
The purpose of the PAREP (Populations at Risk to Environmental Pollution) Project at Lawrence Berkeley National Laboratory (LBNL), an ongoing Department of Energy (DOE) project since 1978, is to develop resources (data, computing techniques, and biostatistical methodology) applicable to DOE's needs. Specifically, the PAREP project has developed techniques for statistically analyzing disease distributions in the vicinity of supposed environmental hazards. Such techniques can be applied to assess the health risks in populations residing near DOE installations, provided adequate small-area health data are available. The FY 1994 task descriptions for the PAREP project were determined in discussions at LBNL on 11/2/93. The FY94 PAREP Work Authorization specified three major tasks: a prototype small area study, a feasibility study for obtaining small-area data, and preservation of the PAREP data archive. The complete FY94 work plan, and the subtasks accomplished to date, were included in the Cumulative FY94 progress report.
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2011 CFR
2011-04-01
21 CFR 820.250, Food and Drugs; Medical Devices; Quality System Regulation; Statistical Techniques. § 820.250 Statistical techniques. (a) ...statistical techniques required for establishing, controlling, and verifying the acceptability of process...
Low, Diana H P; Motakis, Efthymios
2013-10-01
Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task by moving averages at certain decorrelation times, assuming that the system comes from a single conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelets denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package that performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into the molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.
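A toy sketch of the macrostate-estimation idea: denoise a free-energy series, then hierarchically cluster segment means into distinct subpopulations. The actual deltaGseg pipeline uses statistical modeling and wavelet denoising; the moving average below is a simple stand-in, and the data are synthetic:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(3)

# Synthetic binding free-energy series with two macrostates (illustration)
series = np.concatenate([rng.normal(-8.0, 0.5, 200),
                         rng.normal(-5.0, 0.5, 200)])

# Denoise with a simple moving average (deltaGseg itself uses wavelets)
w = 20
smooth = np.convolve(series, np.ones(w) / w, mode="valid")

# Cluster non-overlapping segment means to find distinct subpopulations
seg_means = smooth[::w].reshape(-1, 1)
labels = fcluster(linkage(seg_means, method="average"),
                  t=2, criterion="maxclust")
print(sorted(set(labels)))  # two macrostates recovered
```

Each recovered cluster corresponds to a candidate macrostate whose statistical distinctness would then be tested, as the package does across replicated series.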
NASA Astrophysics Data System (ADS)
Böhm, Fabian; Grosse, Nicolai B.; Kolarczik, Mirco; Herzog, Bastian; Achtstein, Alexander; Owschimikow, Nina; Woggon, Ulrike
2017-09-01
Quantum state tomography and the reconstruction of the photon number distribution are techniques to extract the properties of a light field from measurements of its mean and fluctuations. These techniques are particularly useful when dealing with macroscopic or mesoscopic systems, where a description limited to the second order autocorrelation soon becomes inadequate. In particular, the emission of nonclassical light is expected from mesoscopic quantum dot systems strongly coupled to a cavity or in systems with large optical nonlinearities. We analyze the emission of a quantum dot-semiconductor optical amplifier system by quantifying the modifications of a femtosecond laser pulse propagating through the device. Using a balanced detection scheme in a self-heterodyning setup, we achieve precise measurements of the quadrature components and their fluctuations at the quantum noise limit [1]. We resolve the photon number distribution and the thermal-to-coherent evolution in the photon statistics of the emission. The interferometric detection achieves a high sensitivity in the few photon limit. From our data, we can also reconstruct the second order autocorrelation function with higher precision and time resolution compared with classical Hanbury Brown-Twiss experiments.
Statistical description and transport in stochastic magnetic fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vanden Eijnden, E.; Balescu, R.
1996-03-01
The statistical description of particle motion in a stochastic magnetic field is presented. Starting from the stochastic Liouville equation (or hybrid kinetic equation) associated with the equations of motion of a test particle, the probability distribution function of the system is obtained for various magnetic fields and collisional processes. The influence of these two ingredients on the statistics of the particle dynamics is stressed. In all cases, transport properties of the system are discussed. © 1996 American Institute of Physics.
ERIC Educational Resources Information Center
Jabari, Kamran; Moradi Sheykhjan, Tohid
2015-01-01
Present study examined the relationship between stress among academic staff and students' satisfaction of their performances in Payame Noor University (PNU) of Miandoab City, Iran in 2014. The methodology of the research is descriptive and correlation that descriptive and inferential statistics were used to analyze the data. Statistical Society…
ERIC Educational Resources Information Center
Brattin, Barbara C.
Content analysis was performed on the top six core journals for 1990 in library and information science to determine the extent of research in the field. Articles (n=186) were examined for descriptive or inferential statistics and separately for the presence of mathematical models. Results show a marked (14%) increase in research for 1990,…
Smith, Ben J; Zehle, Katharina; Bauman, Adrian E; Chau, Josephine; Hawkshaw, Barbara; Frost, Steven; Thomas, Margaret
2006-04-01
This study examined the use of quantitative methods in Australian health promotion research in order to identify methodological trends and priorities for strengthening the evidence base for health promotion. Australian health promotion articles were identified by hand searching publications from 1992-2002 in six journals: Health Promotion Journal of Australia, Australian and New Zealand Journal of Public Health, Health Promotion International, Health Education Research, Health Education and Behavior, and the American Journal of Health Promotion. The study designs and statistical methods used in articles presenting quantitative research were recorded. 591 (57.7%) of the 1,025 articles used quantitative methods. Cross-sectional designs were used in the majority (54.3%) of studies, with pre- and post-test (14.6%) and post-test only (9.5%) the next most common designs. Bivariate statistical methods were used in 45.9% of papers, multivariate methods in 27.1%, and simple numbers and proportions in 25.4%. Few studies used higher-level statistical techniques. While most studies used quantitative methods, the majority were descriptive in nature. The study designs and statistical methods used provided limited scope for demonstrating intervention effects or understanding the determinants of change.
Khalaf, K A; Parnianpour, M; Sparto, P J; Barin, K
1999-01-01
In any quantitative gait or occupational biomechanics investigation, the quantification of the different kinematic, kinetic, and electromyographic parameters is essential towards assessment of functional capacity and development of a biomechanical profile of the task demands. In the current study, the authors presented a methodology for using inferential statistics to evaluate the effect of lift characteristics on phase-dependent and phase-independent variability in performance. Using a database of kinematic and kinetic profiles obtained from a manual lifting study, the phase-dependent effects of lift characteristics: box mass (load), mode (technique of lift), and speed (frequency of lift) were investigated through the use of analysis of variance (ANOVA) techniques, which recognize the vectorial constitution of the profiles. In addition, the Karhunen-Loeve Expansion (KLE) feature extraction method was used for representing the lifting patterns of measured joint angular position, velocity, acceleration, and net muscular torque profiles obtained from a 2-D biomechanical lifting model in order to study the phase-independent effects. In comparison to traditional descriptive statistical analyses currently used in various occupational biomechanics experimental investigations, this method allows the significant information content of the time varying signal to be captured, enhancing the sensitivity of subsequent hypothesis testing procedures. The application of this technique to MMH investigations allows identification of the lift characteristics that dominate the variability of task demands, hence aiding in the design and assessment of ergonomic solutions.
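The Karhunen-Loeve expansion used above for phase-independent feature extraction is, for centered data, equivalent to principal component analysis. A minimal numpy sketch on synthetic joint-angle profiles (the movement patterns and noise levels below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic trials: 30 joint-angle profiles, 100 time points each, built from
# two underlying movement patterns plus noise (illustration only)
t = np.linspace(0, 1, 100)
basis = np.vstack([np.sin(np.pi * t), np.sin(2 * np.pi * t)])
scores = rng.normal(0, [[2.0], [0.5]], (2, 30))
profiles = (basis.T @ scores).T + rng.normal(0, 0.05, (30, 100))

# Karhunen-Loeve expansion = PCA: singular vectors of the centered data
X = profiles - profiles.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)

print(f"first two modes explain {explained[:2].sum():.1%} of the variance")
```

The per-trial coefficients (rows of `U * s`) are low-dimensional features summarizing each lifting profile, which can then be fed into hypothesis tests such as the ANOVA described above.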
Gibert, Karina; García-Rudolph, Alejandro; García-Molina, Alberto; Roig-Rovira, Teresa; Bernabeu, Montse; Tormos, José María
2008-01-01
To develop a classificatory tool to identify different populations of patients with traumatic brain injury based on the characteristics of deficit and response to treatment. A KDD framework was used: first, descriptive statistics for every variable were produced, followed by data cleaning and selection of relevant variables. The data were then mined using a generalization of Clustering based on rules (CIBR), a hybrid AI and statistics technique that combines inductive learning (AI) and clustering (statistics). A prior knowledge base (KB) is used to properly bias the clustering; semantic constraints implied by the KB hold in the final clusters, guaranteeing interpretability of the results. A generalization (Exogenous Clustering based on rules, ECIBR) is presented, allowing the KB to be defined in terms of variables that are not themselves used in the clustering process, for greater flexibility. Several tools, such as the class panel graph, are introduced in the methodology to assist final interpretation. A set of 5 classes was recommended by the system, and interpretation permitted the labeling of profiles. From the medical point of view, the composition of the classes corresponds well with different patterns of increasing response to rehabilitation treatment. All the patients who were initially assessable formed a single group. Severely impaired patients were subdivided into four profiles with clearly distinct response patterns. Particularly interesting is the partial-response profile, in which patients could not improve executive functions. Meaningful classes were obtained and, from a semantic point of view, the results were appreciably better than those of classical clustering, supporting our view that hybrid AI and statistics techniques are more powerful for KDD than pure ones.
Okafoagu, Nneka Christina; Oche, Mansur; Awosan, Kehinde Joseph; Abdulmulmuni, Hashim Bala; Gana, Godwin Jiya; Ango, Jessica Timane; Raji, Ismail
2017-06-23
Textile dye workers are exposed to occupational hazards on a daily basis owing to precarious workplace conditions. This study aimed to assess knowledge, attitude and safety practices, and their determinants, among textile dye workers in Sokoto metropolis, Nigeria. This was a descriptive cross-sectional study of 200 textile dye workers; respondents were selected by a multistage sampling technique. Data were collected using an interviewer-administered questionnaire, processed using IBM SPSS version 20, and analyzed using descriptive and inferential statistics. The majority of respondents (74.0%) had good knowledge of workplace hazards, 81.0% had a positive attitude, and only 20% observed all safety practices. Formal education (P=0.047), working fewer than 5 days a week (P=0.001) and permanent employment (P=0.013) were found to be determinants of respondents' knowledge of, and attitude towards, workplace hazards. Although respondents had good knowledge and a positive attitude, their lack of observance of safety practices brings to the fore the need for direct safety instruction and for the training and retraining of textile dye workers on workplace hazards and safety practices.
Accurate Identification of MCI Patients via Enriched White-Matter Connectivity Network
NASA Astrophysics Data System (ADS)
Wee, Chong-Yaw; Yap, Pew-Thian; Browndyke, Jeffrey N.; Potter, Guy G.; Steffens, David C.; Welsh-Bohmer, Kathleen; Wang, Lihong; Shen, Dinggang
Mild cognitive impairment (MCI), often a prodromal phase of Alzheimer's disease (AD), is frequently considered a good target for early diagnosis and therapeutic intervention in AD. The recent emergence of reliable network characterization techniques has made it possible to understand neurological disorders at the level of whole-brain connectivity. Accordingly, we propose a network-based multivariate classification algorithm, using a collection of measures derived from white-matter (WM) connectivity networks, to accurately distinguish MCI patients from normal controls. An enriched description of WM connections, utilizing six physiological parameters, i.e., fiber penetration count, fractional anisotropy (FA), mean diffusivity (MD), and principal diffusivities (λ1, λ2, λ3), yields six connectivity networks for each subject that account for both the connection topology and the biophysical properties of the connections. After parcellating the brain into 90 regions-of-interest (ROIs), the average statistics of each ROI in relation to the remaining ROIs are extracted as features for classification. These features are then sieved to select the most discriminant subset for building an MCI classifier via support vector machines (SVMs). Cross-validation results indicate better diagnostic power for the proposed enriched WM connection description than for a simple description with any single physiological parameter.
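A rough sketch of such a pipeline using scikit-learn, with synthetic data standing in for the connectivity features (dimensions chosen to mirror the 90-ROI, six-network setup; the paper's exact feature-sieving procedure may differ from the univariate F-test used here):

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical features: 540 per subject (90 ROIs x 6 networks),
# 40 subjects, binary label (MCI vs. normal control).
X = rng.normal(size=(40, 540))
y = np.repeat([0, 1], 20)
X[y == 1, :10] += 1.0   # inject weak group differences

# Sieve features, then classify with a linear SVM; selection is done
# inside the pipeline so cross-validation stays unbiased.
clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=20),
                    SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

Keeping the feature selection inside the cross-validation loop is the key design point: sieving on the full dataset first would leak label information into the accuracy estimate.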
1985-09-01
[Front-matter and output fragments from a C/SCSC Cost Performance Report Analysis (CPRA) study: a table of contents listing C/SCSC terms and definitions and CPRA terms and formulas; a general linear test with statistic F* = [SSE1/(n1 - 2)] / [SSE2/(n2 - 2)] and its critical F value; and a summary table for the general linear test of EAC1 versus EAC5 (significance F = .0000).]
Nuclear Deformation at Finite Temperature
NASA Astrophysics Data System (ADS)
Alhassid, Y.; Gilbreth, C. N.; Bertsch, G. F.
2014-12-01
Deformation, a key concept in our understanding of heavy nuclei, is based on a mean-field description that breaks the rotational invariance of the nuclear many-body Hamiltonian. We present a method to analyze nuclear deformations at finite temperature in a framework that preserves rotational invariance. The auxiliary-field Monte Carlo method is used to generate a statistical ensemble and calculate the probability distribution associated with the quadrupole operator. Applying the technique to nuclei in the rare-earth region, we identify model-independent signatures of deformation and find that deformation effects persist to temperatures higher than the spherical-to-deformed shape phase-transition temperature of mean-field theory.
Employee resourcing strategies and universities' corporate image: A survey dataset.
Falola, Hezekiah Olubusayo; Oludayo, Olumuyiwa Akinrole; Olokundun, Maxwell Ayodele; Salau, Odunayo Paul; Ibidunni, Ayodotun Stephen; Igbinoba, Ebe
2018-06-01
The data examined the effect of employee resourcing strategies on corporate image. The data were generated from a total of 500 copies of a questionnaire administered to the academic staff of six selected private universities in Southwest Nigeria, of which 443 were retrieved. Stratified and simple random sampling techniques were used to select the respondents for this study. Descriptive statistics and linear regression were used to present the data, with the mean score as the statistical tool of analysis. The data presented in this article are made available to facilitate further and more comprehensive investigation of the subject matter.
NASA Astrophysics Data System (ADS)
Del Giudice, Dario; Löwe, Roland; Madsen, Henrik; Mikkelsen, Peter Steen; Rieckermann, Jörg
2015-07-01
In urban rainfall-runoff, commonly applied statistical techniques for uncertainty quantification mostly ignore systematic output errors originating from simplified models and erroneous inputs. Consequently, the resulting predictive uncertainty is often unreliable. Our objective is to present two approaches which use stochastic processes to describe systematic deviations and to discuss their advantages and drawbacks for urban drainage modeling. The two methodologies are an external bias description (EBD) and an internal noise description (IND, also known as stochastic gray-box modeling). They emerge from different fields and have not yet been compared in environmental modeling. To compare the two approaches, we develop a unifying terminology, evaluate them theoretically, and apply them to conceptual rainfall-runoff modeling in the same drainage system. Our results show that both approaches can provide probabilistic predictions of wastewater discharge in a similarly reliable way, both for periods ranging from a few hours up to more than 1 week ahead of time. The EBD produces more accurate predictions on long horizons but relies on computationally heavy MCMC routines for parameter inferences. These properties make it more suitable for off-line applications. The IND can help in diagnosing the causes of output errors and is computationally inexpensive. It produces best results on short forecast horizons that are typical for online applications.
75 FR 4323 - Additional Quantitative Fit-testing Protocols for the Respiratory Protection Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-27
... respirators (500 and 1000 for protocols 1 and 2, respectively). However, OSHA could not evaluate the results... the values of these descriptive statistics for revised PortaCount[supreg] QNFT protocols 1 (at RFFs of 100 and 500) and 2 (at RFFs of 200 and 1000). Table 2--Descriptive Statistics for RFFs of 100 and 200...
Regression modeling of ground-water flow
Cooley, R.L.; Naff, R.L.
1985-01-01
Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)
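As a minimal illustration of the nonlinear regression machinery the text describes (not one of the report's own programs), one might fit a hypothetical head-decline model with SciPy and recover parameter standard errors from the estimated covariance:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Hypothetical example: fit an exponential head-decline model
# h(t) = h0 * exp(-a * t) to noisy observations.
def model(t, h0, a):
    return h0 * np.exp(-a * t)

t = np.linspace(0.0, 10.0, 40)
h_obs = model(t, 5.0, 0.3) + rng.normal(scale=0.05, size=t.size)

# Nonlinear regression: estimate parameters and their covariance,
# from which standard errors and linearized confidence intervals follow.
params, cov = curve_fit(model, t, h_obs, p0=(1.0, 0.1))
std_err = np.sqrt(np.diag(cov))
print(params, std_err)
```

The covariance-based standard errors are exactly the kind of statistical procedure the text recommends for analyzing a fitted regression model, though for strongly nonlinear flow models they are only a linearized approximation.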
Exploring Marine Corps Officer Quality: An Analysis of Promotion to Lieutenant Colonel
2017-03-01
[Front-matter table-of-contents fragments: descriptive statistics; dependent-variable summary statistics; performance; further research; Appendix A, summary statistics of FITREP data.]
Aquino-Pérez, Dulce María; Peña-Cadena, Daniel; Trujillo-García, José Ubaldo; Jiménez-Sandoval, Jaime Omar; Machorro-Muñoz, Olga Stephanie
2013-01-01
The use of a metered-dose inhaler (MDI) is key in the treatment of asthma; its effectiveness is related to proper technique. The purpose of this study was to evaluate metered-dose inhaler technique among the parents or guardians of school children with asthma. In this cross-sectional study, we used a sample of 221 individual caregivers (a parent or guardian) of asthmatic children aged 5 to 12 years who use an MDI. We designed a validated questionnaire of 27 items addressing the handling of the inhaler. Descriptive statistics were used. Caregivers rated as having a "good technique" comprised 41 fathers (18.6%), 77 mothers (34.8%) and 9 guardians (4.1%); those with a "regular technique" comprised 32 fathers (14.5%), 48 mothers (21.2%) and 14 guardians (6.3%). Among the asthmatic children, 24 of those aged 9 (10.9%) were rated as having a "good technique". By gender, we found a "good technique" in 80 boys (36.2%) and 47 girls (21.3%), and a "regular technique" in 59 boys (26.7%) and 35 girls (15.8%) (P = 0.0973, PR = 0.9). A "regular technique" was found mainly in asthmatic children diagnosed between 1 and 3 years of age. Most participants had a good technique rating; however, major mistakes were made at key points in its performance.
Statistics of the geomagnetic secular variation for the past 5Ma
NASA Technical Reports Server (NTRS)
Constable, C. G.; Parker, R. L.
1986-01-01
A new statistical model is proposed for the geomagnetic secular variation over the past 5 Ma. Unlike previous models, this model makes use of statistical characteristics of the present-day geomagnetic field. The spatial power spectrum of the non-dipole field is consistent with a white source near the core-mantle boundary with a Gaussian distribution. After a suitable scaling, the spherical harmonic coefficients may be regarded as statistical samples from a single giant Gaussian process; this constitutes the model of the non-dipole field. The model can be combined with an arbitrary statistical description of the dipole, and probability density functions and cumulative distribution functions can be computed for the declination and inclination that would be observed at any site on Earth's surface. Global paleomagnetic data spanning the past 5 Ma are used to constrain the statistics of the dipole part of the field, and a simple model is found to be consistent with the available data. An advantage of specifying the model in terms of the spherical harmonic coefficients is that it is a complete statistical description of the geomagnetic field, enabling specific properties of a general description to be tested. Both intensity and directional data distributions may be tested to see if they satisfy the expected model distributions.
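A toy sketch of the idea, assuming a fixed axial-dipole contribution plus isotropic Gaussian non-dipole scatter in the local field components (all values illustrative, not the paper's fitted model), shows how declination and inclination distributions follow from Gaussian field statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical site at 45 deg latitude: axial-dipole field direction
# in local north (X), east (Y), down (Z) components (arbitrary units),
# plus isotropic Gaussian non-dipole scatter.
lat = np.radians(45.0)
X0, Z0 = np.cos(lat), 2.0 * np.sin(lat)   # tan(I) = 2 tan(lat) for a dipole
sigma = 0.1                               # non-dipole scatter (illustrative)

n = 100_000
X = X0 + rng.normal(0.0, sigma, n)
Y = rng.normal(0.0, sigma, n)
Z = Z0 + rng.normal(0.0, sigma, n)

# Declination and inclination samples, from which probability density
# and cumulative distribution functions can be estimated.
decl = np.degrees(np.arctan2(Y, X))
incl = np.degrees(np.arctan2(Z, np.hypot(X, Y)))
print(decl.std(), incl.mean())
```

The mean inclination recovers the dipole prediction arctan(2 tan 45°) ≈ 63.4°, while the spread of the simulated declination and inclination is what would be compared against paleomagnetic data distributions.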
Statistics of the geomagnetic secular variation for the past 5 m.y
NASA Technical Reports Server (NTRS)
Constable, C. G.; Parker, R. L.
1988-01-01
A new statistical model is proposed for the geomagnetic secular variation over the past 5 Ma. Unlike previous models, this model makes use of statistical characteristics of the present-day geomagnetic field. The spatial power spectrum of the non-dipole field is consistent with a white source near the core-mantle boundary with a Gaussian distribution. After a suitable scaling, the spherical harmonic coefficients may be regarded as statistical samples from a single giant Gaussian process; this constitutes the model of the non-dipole field. The model can be combined with an arbitrary statistical description of the dipole, and probability density functions and cumulative distribution functions can be computed for the declination and inclination that would be observed at any site on Earth's surface. Global paleomagnetic data spanning the past 5 Ma are used to constrain the statistics of the dipole part of the field, and a simple model is found to be consistent with the available data. An advantage of specifying the model in terms of the spherical harmonic coefficients is that it is a complete statistical description of the geomagnetic field, enabling specific properties of a general description to be tested. Both intensity and directional data distributions may be tested to see if they satisfy the expected model distributions.
Ontology-based, Tissue MicroArray oriented, image centered tissue bank
Viti, Federica; Merelli, Ivan; Caprera, Andrea; Lazzari, Barbara; Stella, Alessandra; Milanesi, Luciano
2008-01-01
Background Tissue MicroArray technique is becoming increasingly important in pathology for the validation of experimental data from transcriptomic analysis. This approach produces many images, which need to be properly managed, if possible with an infrastructure able to support tissue sharing between institutes. Moreover, the available frameworks oriented to Tissue MicroArray provide good storage for clinical patient, sample treatment and block construction information, but their utility is limited by the lack of integration with biomolecular data. Results In this work we propose a web-oriented Tissue MicroArray system that supports researchers in managing bio-samples and, through the use of ontologies, enables the tissue sharing needed for the design of Tissue MicroArray experiments and the evaluation of results. Indeed, our system provides ontological descriptions both for pre-analysis tissue images and for post-process analysis image results, which is crucial for information exchange. Moreover, because the system works with well-defined terms, it is possible to query web resources for literature articles, integrating pathology and bioinformatics data. Conclusions Using this system, users associate an ontology-based description with each image uploaded into the database and also integrate results with the ontological description of the biosequences identified in every tissue. Moreover, it is possible to combine the ontological description provided by the user with a fully compliant Gene Ontology definition, enabling statistical studies of the correlation between the analyzed pathology and the most commonly related biological processes. PMID:18460177
Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J
2009-04-01
An accurate assessment of shoulder kinematics is useful for understanding both healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has the potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency of the results across subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1st and 99th percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects, and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and in refining kinematic measurement techniques.
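The probabilistic approach can be illustrated with a minimal Monte Carlo sketch. The 4 mm perturbation SD follows the study's setup, but the simple three-landmark frame and its coordinates are hypothetical stand-ins for the full 13-landmark shoulder model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical landmarks (mm) defining a segment coordinate frame.
origin = np.array([0.0, 0.0, 0.0])
x_lm = np.array([100.0, 0.0, 0.0])   # landmark along the x axis
y_lm = np.array([0.0, 100.0, 0.0])   # landmark near the y axis

def frame_angles(o, xl, yl):
    """Build an orthonormal frame from three landmarks and return
    Z-Y-X Euler angles (radians) of its rotation matrix."""
    x = xl - o
    x /= np.linalg.norm(x)
    z = np.cross(x, yl - o)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    R = np.column_stack([x, y, z])
    return np.array([np.arctan2(R[1, 0], R[0, 0]),      # yaw
                     -np.arcsin(R[2, 0]),               # pitch
                     np.arctan2(R[2, 1], R[2, 2])])     # roll

# Monte Carlo: perturb each landmark with SD = 4 mm per axis and
# collect the resulting Euler angles.
angles = np.array([frame_angles(origin + rng.normal(0, 4, 3),
                                x_lm + rng.normal(0, 4, 3),
                                y_lm + rng.normal(0, 4, 3))
                   for _ in range(2000)])
lo, hi = np.percentile(np.degrees(angles), [1, 99], axis=0)
print(hi - lo)   # 1-99% envelope per angle, in degrees
```

The 1-99% envelope of the sampled Euler angles is the same summary the study reports; a longer landmark lever arm shrinks the envelope, which is why thorax landmarks dominated the humeral sensitivity.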
Problems faced and coping strategies used by adolescents with mentally ill parents in Delhi.
George, Shoba; Shaiju, Bindu; Sharma, Veena
2012-01-01
The present study was conducted to assess the problems faced by adolescents whose parents suffer from major mental illness, at selected mental health institutes of Delhi. The objectives also included assessment of the coping strategies these adolescents use in dealing with those problems. The Stuart Stress Adaptation Model of Psychiatric Nursing Care was used as the conceptual framework. A descriptive survey approach with a cross-sectional design was used. A structured interview schedule was prepared, and a purposive non-probability sampling technique was employed to interview 50 adolescents whose parents suffer from major mental illness. The data gathered were analysed and interpreted using both descriptive and inferential statistics. The study showed that the majority of the adolescents had moderate problems as a result of their parent's mental illness. Area-wise analysis revealed that the greatest problems were in family relationships and support, and that the majority of the adolescents used maladaptive coping strategies. A set of guidelines on effective coping strategies was disseminated to these adolescents.
Noncontact power/interrogation system for smart structures
NASA Astrophysics Data System (ADS)
Spillman, William B., Jr.; Durkee, S.
1994-05-01
The field of smart structures has been largely driven by the development of new high performance designed materials. Use of these materials has been generally limited due to the fact that they have not been in use long enough for statistical data bases to be developed on their failure modes. Real time health monitoring is therefore required for the benefits of structures using these materials to be realized. In this paper a non-contact method of powering and interrogating embedded electronic and opto-electronic systems is described. The technique utilizes inductive coupling between external and embedded coils etched on thin electronic circuit cards. The technique can be utilized to interrogate embedded sensors and to provide > 250 mW for embedded electronics. The system has been successfully demonstrated with a number of composite and plastic materials through material thicknesses up to 1 cm. An analytical description of the system is provided along with experimental results.
Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard
2016-01-01
In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis for healthcare. The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers' (DMs) preferences using the principal component analysis; and the estimation of criteria weights and their descriptive statistics using the variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. The criteria were ranked from 1-5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated as well as the standard deviation and 95% credibility interval. ELICIT is appropriate in situations where only ordinal DMs' preferences are available to elicit decision criteria weights.
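The Monte Carlo step can be illustrated with a simplified sketch. This is not the ELICIT algorithm itself (which uses principal component analysis and variable interdependent analysis); it is a minimal illustration of deriving weight statistics from a strict rank order by sampling weight vectors consistent with the ranking:

```python
import numpy as np

rng = np.random.default_rng(0)

# Five criteria ranked 1 (most important) .. 5 (least important).
n_criteria = 5
n_draws = 10_000

# Sample weight vectors uniformly from the simplex, then sort each
# draw in decreasing order so every draw respects the strict ranking.
draws = rng.dirichlet(np.ones(n_criteria), size=n_draws)
draws = -np.sort(-draws, axis=1)   # column j = weight of rank j+1

# Deterministic weight estimate plus descriptive statistics and a
# 95% credibility interval for each criterion, as in the case study.
mean_w = draws.mean(axis=0)
sd_w = draws.std(axis=0)
ci = np.percentile(draws, [2.5, 97.5], axis=0)
print(mean_w, sd_w, ci)
```

The mean weights converge to the classic rank-order-centroid values, and the percentile bounds play the role of the credibility intervals reported for each criterion.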
Statistical Analyses of Brain Surfaces Using Gaussian Random Fields on 2-D Manifolds
Staib, Lawrence H.; Xu, Dongrong; Zhu, Hongtu; Peterson, Bradley S.
2008-01-01
Interest in the morphometric analysis of the brain and its subregions has recently intensified because growth or degeneration of the brain in health or illness affects not only the volume but also the shape of cortical and subcortical brain regions, and new image processing techniques permit detection of small and highly localized perturbations in shape or localized volume, with remarkable precision. An appropriate statistical representation of the shape of a brain region is essential, however, for detecting, localizing, and interpreting variability in its surface contour and for identifying differences in volume of the underlying tissue that produce that variability across individuals and groups of individuals. Our statistical representation of the shape of a brain region is defined by a reference region for that region and by a Gaussian random field (GRF) that is defined across the entire surface of the region. We first select a reference region from a set of segmented brain images of healthy individuals. The GRF is then estimated as the signed Euclidean distances between points on the surface of the reference region and the corresponding points on the corresponding region in images of brains that have been coregistered to the reference. Correspondences between points on these surfaces are defined through deformations of each region of a brain into the coordinate space of the reference region using the principles of fluid dynamics. The warped, coregistered region of each subject is then unwarped into its native space, simultaneously bringing into that space the map of corresponding points that was established when the surfaces of the subject and reference regions were tightly coregistered. The proposed statistical description of the shape of surface contours makes no assumptions, other than smoothness, about the shape of the region or its GRF. 
The description also allows for the detection and localization of statistically significant differences in the shapes of the surfaces across groups of subjects at both a fine and coarse scale. We demonstrate the effectiveness of these statistical methods by applying them to study differences in shape of the amygdala and hippocampus in a large sample of normal subjects and in subjects with attention deficit/hyperactivity disorder (ADHD). PMID:17243583
The Performance and Retention of Female Navy Officers with a Military Spouse
2017-03-01
[Front-matter table-of-contents fragments: female officer retention and dual-military couples; demographic statistics; data description and statistics; independent variables; summary statistics.]
Arora, Mansi; Kohli, Shivani; Kalsi, Rupali
2016-05-01
Dual arch impression technique signifies an essential improvement in fixed prosthodontics and has numerous benefits over conventional impression techniques. The accuracy of working dies fabricated with the dual arch impression technique remains in question, because little information is available in the literature. This study was conducted to compare the accuracy of working dies fabricated from impressions made with two different viscosities of impression material using metal dual arch trays, plastic dual arch trays and custom-made acrylic trays. The samples were divided into two groups based on the viscosity of impression material used: Group I (monophase) and Group II (dual-mix technique, using a combination of light- and heavy-body material). These were further divided into three subgroups A, B and C depending on the type of impression tray used (metal dual arch tray, plastic dual arch tray and custom-made tray). Measurements of the master cast were made using a profile projector. Descriptive statistics such as the mean and standard deviation (SD) were calculated for all groups. One-way analysis of variance (ANOVA) was used for multiple group comparisons; a p-value of 0.05 or less was considered statistically significant. The gypsum dies obtained with the three types of impression trays using the two groups of impression materials were smaller than the master models in their dimensions. The plastic dual arch trays produced the least accurate dies of the three groups. There was no significant difference in the die dimensions obtained using the two viscosities of impression material.
Kohli, Shivani; Kalsi, Rupali
2016-01-01
Introduction Dual arch impression technique signifies an essential improvement in fixed prosthodontics and has numerous benefits over conventional impression techniques. The accuracy of working dies fabricated with the dual arch impression technique remains in question, because little information is available in the literature. Aim This study was conducted to compare the accuracy of working dies fabricated from impressions made with two different viscosities of impression material using metal dual arch trays, plastic dual arch trays and custom-made acrylic trays. Materials and Methods The samples were divided into two groups based on the viscosity of impression material used: Group I (monophase) and Group II (dual-mix technique, using a combination of light- and heavy-body material). These were further divided into three subgroups A, B and C depending on the type of impression tray used (metal dual arch tray, plastic dual arch tray and custom-made tray). Measurements of the master cast were made using a profile projector. Descriptive statistics such as the mean and standard deviation (SD) were calculated for all groups. One-way analysis of variance (ANOVA) was used for multiple group comparisons; a p-value of 0.05 or less was considered statistically significant. Results The gypsum dies obtained with the three types of impression trays using the two groups of impression materials were smaller than the master models in their dimensions. Conclusion The plastic dual arch trays produced the least accurate dies of the three groups. There was no significant difference in the die dimensions obtained using the two viscosities of impression material. PMID:27437342
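The group comparison described above can be sketched as a one-way ANOVA with SciPy; the measurement values below are fabricated for illustration and do not come from the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical die-dimension measurements (mm) for three tray types.
metal   = rng.normal(10.00, 0.02, 15)
plastic = rng.normal(9.94, 0.02, 15)   # simulated undersized dies
custom  = rng.normal(10.01, 0.02, 15)

# One-way ANOVA across the three tray groups, as used in the study;
# p <= 0.05 would indicate a significant difference somewhere.
f_stat, p_value = stats.f_oneway(metal, plastic, custom)
print(f_stat, p_value)
```

A significant omnibus F would normally be followed by a post-hoc test (e.g. Tukey's HSD) to locate which tray type differs, which is how "plastic trays were least accurate" would be established.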
Kirkendall, Abbie M; Waldrop, Deborah
2013-09-01
The purpose of the study was to describe the perceptions of community residence (CR) staff who have cared for older adults with developmental disabilities (ADDs) at the end of life. This exploratory, descriptive study utilized qualitative methods involving semistructured interviews with CR staff members. The setting was a CR that was also an intermediate care facility (ICF) providing 24-hour residential treatment for medical and/or behavioral needs, with at least one registered nurse present at all times. A CR with at least one resident who was over the age of 40 and had a diagnosis of a life-limiting illness was chosen. Participants included three frontline workers, four managers, and one registered nurse. In-person interviews included open-ended questions about end-of-life care for older adults with developmental disabilities. Demographics such as age, length of time working with this population, and education were analyzed using descriptive statistics. Interviews were digitally recorded, transcribed, and analyzed using grounded theory techniques. Five themes illuminated unique elements of the provision of end-of-life care in a CR: (1) influence of relationships, (2) expression of individuality, (3) contribution of hospice, (4) grief and bereavement, and (5) challenges to end-of-life care. The results provide insight into the unique needs of older adults with developmental disabilities at the end of life and how these influence their care. Emphasis was also placed on the importance of specialized care involving collaboration with hospice for those who remain in a CR at the end of life.
Back to basics: an introduction to statistics.
Halfens, R J G; Meijers, J M M
2013-05-01
In the second in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.
Students' attitudes towards learning statistics
NASA Astrophysics Data System (ADS)
Ghulami, Hassan Rahnaward; Hamid, Mohd Rashid Ab; Zakaria, Roslinazairimah
2015-05-01
A positive attitude towards learning is vital to mastering the core content of the subject matter under study. This is no exception in learning statistics, especially at the university level. Therefore, this study investigates students' attitudes towards learning statistics. Six variables or constructs were identified: affect, cognitive competence, value, difficulty, interest, and effort. The instrument used was a questionnaire adopted and adapted from the reliable Survey of Attitudes Towards Statistics (SATS©) instrument. The study was conducted among engineering undergraduate students at a university on the East Coast of Malaysia. The respondents consisted of students taking the applied statistics course in different faculties. The results are analysed descriptively and contribute to a descriptive understanding of students' attitudes towards the teaching and learning of statistics.
NASA Astrophysics Data System (ADS)
McKane, Alan
2003-12-01
This is a book about the modelling of complex systems and, unlike many books on this subject, concentrates on the discussion of specific systems and gives practical methods for modelling and simulating them. This is not to say that the author does not devote space to the general philosophy and definition of complex systems and agent-based modelling, but the emphasis is definitely on the development of concrete methods for analysing them. This is, in my view, to be welcomed and I thoroughly recommend the book, especially to those with a theoretical physics background who will be very much at home with the language and techniques which are used. The author has developed a formalism for understanding complex systems which is based on the Langevin approach to the study of Brownian motion. This is a mesoscopic description; details of the interactions between the Brownian particle and the molecules of the surrounding fluid are replaced by a randomly fluctuating force. Thus all microscopic detail is replaced by a coarse-grained description which encapsulates the essence of the interactions at the finer level of description. In a similar way, the influences on Brownian agents in a multi-agent system are replaced by stochastic influences which sum up the effects of these interactions on a finer scale. Unlike Brownian particles, Brownian agents are not structureless particles, but instead have some internal states so that, for instance, they may react to changes in the environment or to the presence of other agents. Most of the book is concerned with developing the idea of Brownian agents using the techniques of statistical physics. This development parallels that for Brownian particles in physics, but the author then goes on to apply the technique to problems in biology, economics and the social sciences. This is a clear and well-written book which is a useful addition to the literature on complex systems. 
It will be interesting to see if the use of Brownian agents becomes a standard tool in the study of complex systems in the future.
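The Langevin description the review refers to can be sketched numerically. A minimal example (all parameter values arbitrary, not drawn from the book) integrates an ensemble of Brownian particles with the Euler-Maruyama scheme and checks the stationary velocity variance against the fluctuation-dissipation result σ²/(2γ):

```python
import numpy as np

rng = np.random.default_rng(0)

# Euler-Maruyama integration of the Langevin equation
#   dv = -gamma * v * dt + sigma * sqrt(dt) * xi,   xi ~ N(0, 1),
# the mesoscopic description underlying the Brownian-agent formalism:
# microscopic collisions are replaced by a random fluctuating force.
gamma, sigma, dt, n_steps, n_agents = 1.0, 0.5, 0.01, 5000, 1000

v = np.zeros(n_agents)
for _ in range(n_steps):
    v += -gamma * v * dt + sigma * np.sqrt(dt) * rng.normal(size=n_agents)

# Stationary velocity variance should approach sigma^2 / (2 * gamma).
print(v.var(), sigma**2 / (2 * gamma))
```

A Brownian agent in the book's sense would extend each particle with internal state variables that feed back into the drift term, but the stochastic core of the dynamics is exactly this update.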
Student's Conceptions in Statistical Graph's Interpretation
ERIC Educational Resources Information Center
Kukliansky, Ida
2016-01-01
Histograms, box plots and cumulative distribution graphs are popular graphic representations for statistical distributions. The main research question that this study focuses on is how college students deal with interpretation of these statistical graphs when translating graphical representations into analytical concepts in descriptive statistics.…
Gamma, Alex; Lehmann, Dietrich; Frei, Edi; Iwata, Kazuki; Pascual-Marqui, Roberto D; Vollenweider, Franz X
2004-06-01
The complementary strengths and weaknesses of established functional brain imaging methods (high spatial, low temporal resolution) and EEG-based techniques (low spatial, high temporal resolution) make their combined use a promising avenue for studying brain processes at a more fine-grained level. However, this strategy requires a better understanding of the relationship between hemodynamic/metabolic and neuroelectric measures of brain activity. We investigated possible correspondences between cerebral blood flow (CBF) as measured by [H2O]-PET and intracerebral electric activity computed by Low Resolution Brain Electromagnetic Tomography (LORETA) from scalp-recorded multichannel EEG in healthy human subjects during cognitive and pharmacological stimulation. The two imaging modalities were compared by descriptive, correlational, and variance analyses, the latter carried out using statistical parametric mapping (SPM99). Descriptive visual comparison showed a partial overlap between the sets of active brain regions detected by the two modalities. A number of exclusively positive correlations of neuroelectric activity with regional CBF were found across the whole EEG frequency range, including slow wave activity, the latter finding being in contrast to most previous studies conducted in patients. Analysis of variance revealed an extensive lack of statistically significant correspondences between brain activity changes as measured by PET vs. EEG-LORETA. In general, correspondences, to the extent they were found, were dependent on experimental condition, brain region, and EEG frequency. Copyright 2004 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Ramirez Cuesta, Timmy
Incoherent inelastic neutron scattering spectroscopy is a very powerful technique that requires ab-initio models to interpret the experimental data. Although not exact, the information obtained from the models gives valuable insight into the dynamics of atoms in solids and molecules and, in turn, provides unique access to the vibrational density of states. The technique is extremely sensitive to hydrogen, since the neutron cross section of hydrogen is the largest of all chemical elements; being the lightest element, hydrogen also exhibits more pronounced quantum effects than the other elements. In the case of non-crystalline or disordered materials, the models provide only partial information, and only a reduced sampling of possible configurations can be done at present. With the very large computing power that exascale computing will provide, a new opportunity arises to study these systems and to introduce a description of statistical configurations, including energetic and dynamic characterization of configurational entropy. As part of the ICE-MAN project, we are developing the tools to manage the workflows and to visualize and analyze the results, with the goal of bringing state-of-the-art computational methods to the many neutron scattering experiments that use atomistic models to interpret experimental data. This work is supported by the Laboratory Directed Research and Development (LDRD 8237) program of UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy.
Nurse's Awareness on Ethico-legal Aspects of Nursing Profession.
Paudel Subedi, Krishna Kumari; Timalsina, Kalpana; Bhele, Raja Laxmi
2018-03-13
Nursing practice combines practical efficacy with ethics. Nowadays, legal and ethical problems associated with client care arise daily, so nurses should have an adequate understanding of basic legal concepts and issues relevant to the nursing profession in order to protect the rights of both clients and nurses. A cross-sectional descriptive design was adopted for the study. A total of 142 nurses were selected using a purposive sampling technique. Data were collected with a self-administered structured questionnaire. Descriptive statistics were used to summarize demographic information, and the Kruskal-Wallis and Mann-Whitney tests were used to examine associations between selected demographic variables and ethico-legal aspects of nursing. The majority of participants were aged 20-29 years. More than half of the nurses had completed a bachelor's degree and had less than 10 years' experience. The majority of participants reported that they had not encountered any legal issues in their professional lives to date. Similarly, the majority of participants had an average level of knowledge and an adequate level of practice. Years of experience and education level did not affect knowledge level or existing practice related to ethico-legal aspects of nursing, and there was no significant relationship between level of knowledge and existing practice. In conclusion, nurses have average knowledge and practice of ethico-legal aspects; there is a positive relationship between knowledge and practice, though it is not statistically significant.
Factors affecting the choice of type of delivery with breast feeding in Iranian mothers.
Sharifi, Farangis; Nouraei, Soheila; Sharifi, Nader
2017-09-01
This study assessed the factors affecting the choice of type of delivery and its relationship with breastfeeding in Iranian mothers. This cross-sectional descriptive-analytic study was performed using a random sampling technique, with data from 400 pregnant women who attended the maternity centers in Borazjan and Kazerun in Iran in 2014. A questionnaire covering demographic characteristics, mode of delivery, and postpartum conditions was completed for each mother. Descriptive analysis and the chi-square test were used, with SPSS 23 software, to analyze the data statistically; a p-value less than 0.05 was considered statistically significant. In this study, the rates of normal delivery and cesarean operation were equal. Among the factors influencing the choice of delivery, mothers' education level (p=0.028) and pregnancy status (p=0.041) showed significant relationships. Although no significant association was found between child nutrition and the type of delivery, the duration of breastfeeding showed a significant association with the type of delivery (p=0.046). Although cesarean delivery is in many cases life-saving for mother and fetus, in addition to medical indications, parental education and pregnancy status are also important factors in increasing the rate of cesarean section compared to vaginal delivery. Babies of mothers with normal delivery were breastfed for a longer time. Further studies in Iran are necessary regarding the reasons for the high cesarean section rate and its outcomes.
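The chi-square test applied in this study (via SPSS) reduces to a short calculation. A minimal hand-rolled version for a 2x2 contingency table, with entirely hypothetical counts, might look like:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n  # expected count under independence
        stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: delivery mode (rows) by education level (columns)
stat = chi_square_2x2([[120, 80], [80, 120]])
# Critical value for p = 0.05 at 1 degree of freedom is 3.841
significant = stat > 3.841
```

For the counts above the statistic is 16.0, well past the 3.841 cutoff, so independence would be rejected at the 0.05 level.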
Vetter, Thomas R
2017-11-01
Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. 
In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
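The descriptive measures and confidence intervals this tutorial discusses are straightforward to compute. A minimal sketch using Python's standard library, with hypothetical data and a normal-approximation interval (at this sample size a t critical value would be more appropriate, as noted in the comment):

```python
import math
import statistics

# Hypothetical outcome scores from a small study
data = [4.2, 5.1, 3.8, 6.0, 5.5, 4.9, 5.2, 4.4, 6.3, 5.0]

mean = statistics.mean(data)
median = statistics.median(data)
stdev = statistics.stdev(data)  # sample standard deviation, reported with the mean
q1, _, q3 = statistics.quantiles(data, n=4)  # quartiles ('exclusive' method)
iqr = q3 - q1  # interquartile range, reported with the median

# 95% confidence interval for the mean (normal approximation; at n = 10 a
# t critical value of about 2.262 would be more appropriate than 1.96)
half_width = 1.96 * stdev / math.sqrt(len(data))
ci = (mean - half_width, mean + half_width)
```

Pairing the standard deviation with the mean and the interquartile range with the median, as the tutorial recommends, falls out naturally from this layout.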
2015-12-01
Table of contents excerpt: Waivers; Appendix C, Descriptive Statistics; Summary Statistics of Dependent Variables; Table 6, Summary Statistics of Academics Variables; Table 7, Summary Statistics of Application Variables; Table 8
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2014 CFR
2014-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
The Practicality of Statistical Physics Handout Based on KKNI and the Constructivist Approach
NASA Astrophysics Data System (ADS)
Sari, S. Y.; Afrizon, R.
2018-04-01
Statistical physics lectures show that: 1) the performance of lecturers, the social climate, and students' competence and the soft skills needed at work are in the "sufficient" category; 2) students find statistical physics lectures difficult to follow because the material is abstract; 3) 40.72% of students need reinforcement in the form of repetition, practice questions, and structured tasks; and 4) the depth of the statistical physics material needs to be improved gradually and in a structured way. This indicates that learning materials aligned with the Indonesian National Qualification Framework (Kerangka Kualifikasi Nasional Indonesia, KKNI) and an appropriate learning approach are needed to help lecturers and students. The authors have designed statistical physics handouts that meet the "very valid" criterion (90.89%) according to expert judgment. In addition, the practicality of the handouts must also be considered, so that they are easy to use, interesting, and efficient in lectures. The purpose of this research is to determine the practicality of a statistical physics handout based on the KKNI and a constructivist approach. This research is part of a research-and-development effort following the 4-D model developed by Thiagarajan and has reached the development-testing phase of the Develop stage. Data were collected using a questionnaire distributed to lecturers and students and analyzed with descriptive techniques in the form of percentages. The analysis of the questionnaire shows that the statistical physics handout meets the "very practical" criterion. The conclusion of this study is that statistical physics handouts based on the KKNI and a constructivist approach are practical for use in lectures.
Naish, Suchithra; Dale, Pat; Mackenzie, John S; McBride, John; Mengersen, Kerrie; Tong, Shilu
2014-01-01
Dengue has been a major public health concern in Australia since it re-emerged in Queensland in 1992-1993. We explored the spatio-temporal characteristics of locally-acquired dengue cases in northern tropical Queensland, Australia during the period 1993-2012. Locally-acquired notified cases of dengue were collected for northern tropical Queensland from 1993 to 2012. Descriptive spatial and temporal analyses were conducted using geographic information system tools and geostatistical techniques. A total of 2,398 locally-acquired dengue cases were recorded in northern tropical Queensland during the study period. The areas affected by dengue cases exhibited spatial and temporal variation over the study period, and notified cases occurred more frequently in autumn. Mapping of dengue by statistical local areas (census units) reveals substantial spatio-temporal variation over time and place. There were statistically significant differences in dengue incidence rates between males and females, with more cases in females (χ² = 15.17, d.f. = 1, p<0.01). Differences were observed among age groups, but these were not statistically significant. There was a significant positive spatial autocorrelation of dengue incidence for the four sub-periods, with Moran's I statistic ranging from 0.011 to 0.463 (p<0.01). Semi-variogram analysis and smoothed maps created using interpolation techniques indicate that the pattern of spatial autocorrelation was not homogeneous across northern Queensland. Tropical areas are potential high-risk areas for mosquito-borne diseases such as dengue. This study demonstrated that locally-acquired dengue cases have exhibited spatial and temporal variation over the past twenty years in northern tropical Queensland, Australia, and it provides an impetus for further investigation of clusters and risk factors in these high-risk areas.
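Moran's I, reported here for the four sub-periods, can be computed directly from incidence values and a spatial weights matrix. A minimal sketch with hypothetical rates on a four-cell line (not the study's data or weighting scheme):

```python
def morans_i(values, weights):
    """Moran's I spatial autocorrelation statistic.
    `values`: list of observations; `weights`: dict mapping (i, j) -> w_ij."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]  # deviations from the mean
    num = sum(w * dev[i] * dev[j] for (i, j), w in weights.items())
    den = sum(d * d for d in dev)
    w_sum = sum(weights.values())
    return (n / w_sum) * (num / den)

# Hypothetical incidence rates on a 4-cell line; neighbours are adjacent cells
rates = [10.0, 12.0, 3.0, 2.0]
w = {(0, 1): 1, (1, 0): 1, (1, 2): 1, (2, 1): 1, (2, 3): 1, (3, 2): 1}
i_stat = morans_i(rates, w)
```

A positive value, as obtained here (about 0.27), indicates that similar rates cluster in neighbouring areas, which is the pattern the study reports for dengue incidence.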
One Yard Below: Education Statistics from a Different Angle.
ERIC Educational Resources Information Center
Education Intelligence Agency, Carmichael, CA.
This report offers a different perspective on education statistics by highlighting rarely used "stand-alone" statistics on public education, inputs, outputs, and descriptions, and it uses interactive statistics that combine two or more statistics in an unusual way. It is a report that presents much evidence, but few conclusions. It is not intended…
A Bibliography of Statistical Applications in Geography, Technical Paper No. 9.
ERIC Educational Resources Information Center
Greer-Wootten, Bryn; And Others
Included in this bibliography are resource materials available to both college instructors and students on statistical applications in geographic research. Two stages of statistical development are treated in the bibliography. They are 1) descriptive statistics, in which the sample is the focus of interest, and 2) analytical statistics, in which…
Research Education in Undergraduate Occupational Therapy Programs.
ERIC Educational Resources Information Center
Petersen, Paul; And Others
1992-01-01
Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)
Policy Safeguards and the Legitimacy of Highway Interdiction
2016-12-01
Table of contents excerpt: B, Bias within Law Enforcement; C, Statistical Data Gathering; 3, Controlling Discretion; 4, Statistical Data Collection for Traffic Stops; A, Description of Statistical Data Collected; B, Data Organization and Analysis
Fish: A New Computer Program for Friendly Introductory Statistics Help
ERIC Educational Resources Information Center
Brooks, Gordon P.; Raffle, Holly
2005-01-01
All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B
2012-01-20
Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and their graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no guide was previously available. We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models, and developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
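The fixed-effect and random-effects calculations that such a spreadsheet guide implements are compact enough to sketch in code. The following uses hypothetical effect sizes and variances, inverse-variance weighting, and the DerSimonian-Laird estimator for between-study variance (a common choice, though not necessarily the authors' exact method):

```python
import math

def meta_analysis(effects, variances):
    """Pool per-study effect sizes with inverse-variance weighting.
    Returns the fixed-effect estimate, the DerSimonian-Laird random-effects
    estimate, and a 95% confidence interval for the latter."""
    w = [1.0 / v for v in variances]
    w_total = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / w_total
    # Cochran's Q measures between-study heterogeneity
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = w_total - sum(wi ** 2 for wi in w) / w_total
    tau2 = max(0.0, (q - df) / c)  # between-study variance estimate
    w_re = [1.0 / (v + tau2) for v in variances]
    random_eff = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se_re = math.sqrt(1.0 / sum(w_re))
    return fixed, random_eff, (random_eff - 1.96 * se_re, random_eff + 1.96 * se_re)

# Hypothetical log risk ratios and within-study variances for three studies
fixed, random_eff, ci = meta_analysis([0.0, 0.5, 0.4], [0.01, 0.04, 0.04])
```

With these inputs the random-effects estimate sits above the fixed-effect one, because the between-study variance shrinks the dominance of the most precise study.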
MEMS reliability: The challenge and the promise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, W.M.; Tanner, D.M.; Miller, S.L.
1998-05-01
MicroElectroMechanical Systems (MEMS) that think, sense, act and communicate will open up a broad new array of cost-effective solutions only if they prove to be sufficiently reliable. A valid reliability assessment of MEMS has three prerequisites: (1) statistical significance; (2) a technique for accelerating fundamental failure mechanisms; and (3) valid physical models to allow prediction of failures during actual use. These already exist for the microelectronics portion of such integrated systems. The challenge lies in the less well understood micromachine portions and their synergistic effects with microelectronics. This paper presents a methodology addressing these prerequisites and a description of the underlying physics of reliability for micromachines.
Study of optimum methods of optical communication
NASA Technical Reports Server (NTRS)
Harger, R. O.
1972-01-01
Optimum methods of optical communication are described, accounting for the effects of the turbulent atmosphere and of quantum mechanics, treated both by the semi-classical method and by the full quantum-theoretical model. The work applies the techniques of communication theory to the novel problems of optical communication through a careful study of realistic models and their statistical descriptions, the derivation of appropriate optimum structures, the calculation of their performance, and, insofar as possible, comparison with conventional and other suboptimal systems. In this unified way the bounds on performance and the structure of optimum communication systems for the transmission of information, imaging, tracking, and estimation can be determined for optical channels.
The impact of Lean bundles on hospital performance: does size matter?
Al-Hyari, Khalil; Abu Hammour, Sewar; Abu Zaid, Mohammad Khair Saleem; Haffar, Mohamed
2016-10-10
Purpose The purpose of this paper is to study the effect of the implementation of Lean bundles on hospital performance in private hospitals in Jordan and to evaluate how far organization size affects the relationship between Lean bundle implementation and hospital performance. Design/methodology/approach The research used quantitative methods (descriptive and hypothesis-testing). Three statistical techniques were adopted to analyse the data: structural equation modeling and multi-group analysis were used to examine the research hypotheses and perform the required statistical analysis of the survey data, while reliability analysis and confirmatory factor analysis were used to test construct validity, reliability, and measurement loadings. Findings Lean bundles have been identified as an effective approach that can dramatically improve the organizational performance of private hospitals in Jordan. The main Lean bundles - just-in-time, human resource management, and total quality management - are applicable to large, small, and medium hospitals without significant size-dependent differences in benefit. Originality/value To the researchers' best knowledge, this is the first research studying the impact of Lean bundle implementation in the healthcare sector in Jordan. It also makes a significant contribution for decision makers in healthcare by increasing their awareness of Lean bundles.
Random field assessment of nanoscopic inhomogeneity of bone
Dong, X. Neil; Luo, Qing; Sparkman, Daniel M.; Millwater, Harry R.; Wang, Xiaodu
2010-01-01
Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to present the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic details. PMID:20817128
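A random field with an exponential covariance function, as used in this study, can be sampled by factoring the covariance matrix. A minimal 1-D sketch (illustrative grid and parameters, not the study's bone data) using a Cholesky factorization:

```python
import math
import random

def exponential_covariance(n, dx, sigma, corr_len):
    """Covariance matrix C_ij = sigma^2 * exp(-|x_i - x_j| / corr_len)
    on a 1-D grid with spacing dx."""
    return [[sigma ** 2 * math.exp(-abs(i - j) * dx / corr_len)
             for j in range(n)] for i in range(n)]

def cholesky(a):
    """Lower-triangular Cholesky factor of a positive-definite matrix."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            l[i][j] = math.sqrt(a[i][i] - s) if i == j else (a[i][j] - s) / l[j][j]
    return l

def sample_field(n=50, dx=0.1, sigma=1.0, corr_len=0.5, seed=1):
    """Draw one realization of the correlated Gaussian random field by
    multiplying the Cholesky factor with independent standard normals."""
    rng = random.Random(seed)
    l = cholesky(exponential_covariance(n, dx, sigma, corr_len))
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(l[i][k] * z[k] for k in range(i + 1)) for i in range(n)]

field = sample_field()
```

The `corr_len` parameter plays the role of the correlation length in the paper: the larger it is, the more slowly the simulated modulus fluctuates from point to point.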
NASA Astrophysics Data System (ADS)
Ilyas, Muhammad; Salwah
2017-02-01
This research was experimental. The purpose of this study was to determine the difference in, and the quality of, students' learning achievement between students taught through a Realistic Mathematics Education (RME) approach and students taught through a problem-solving approach. This was a quasi-experimental study with a non-equivalent group design. The population was all grade VII students in one junior high school in Palopo in the second semester of the 2015/2016 academic year. Two classes were selected purposively as the sample: class VII-5 (28 students) as experiment group I and class VII-6 (23 students) as experiment group II. Group I was taught with the RME approach, and group II with the problem-solving approach. Data were collected by administering a pretest and a posttest to the students and analyzed using descriptive statistics and inferential statistics (t-test). Based on the descriptive analysis, the average mathematics score of students taught with the problem-solving approach was similar to that of students taught with the RME approach, both at the high category. In addition, it can be concluded that (1) there was no difference in mathematics learning outcomes between students taught with the RME approach and those taught with the problem-solving approach, and (2) the quality of learning achievement under the two approaches was the same, at the high category.
Markowski, Alycia; Watkins, Maureen K; Burnett, Todd; Ho, Melissa; Ling, Michael
2018-04-01
Often, physical therapy students struggle with the skill and the confidence to perform manual techniques for musculoskeletal examination, and current teaching methods lack concurrent objective feedback. Real-time ultrasound imaging (RTUI) has the advantage of generating visualization of anatomical structures in real time in an efficient and safe manner. We hypothesized that using RTUI to augment teaching with concurrent objective visual feedback would improve students' ability to create a change in joint space when performing manual knee traction and yield higher confidence scores. Eighty-six students were randomly allocated to a control or an experimental group. All participants received baseline instruction on how to perform knee traction. The control group received standardized lab instruction (visual, video, and instructor/partner feedback); the experimental group received standardized lab instruction augmented with RTUI feedback. Pre- and post-intervention data collection consisted of measuring participants' ability to create changes in joint space when performing knee traction, a confidence survey evaluating perceived ability, and a reflection paper. Joint space changes were compared using a paired t-test; surveys were analyzed with descriptive statistics and compared using the Wilcoxon rank-sum test; and for the reflection papers, themes were identified and descriptive statistics reported. Although there were no statistically significant differences between the control and experimental groups, overall scores improved. Qualitative data suggest that students found the use of ultrasound imaging beneficial and would like more exposure. This novel approach to teaching knee traction with RTUI has potential and may be a basis for further studies. Copyright © 2018 Elsevier Ltd. All rights reserved.
Symbolic dynamics techniques for complex systems: Application to share price dynamics
NASA Astrophysics Data System (ADS)
Xu, Dan; Beck, Christian
2017-05-01
The symbolic dynamics technique is well known for low-dimensional dynamical systems and chaotic maps, and lies at the roots of the thermodynamic formalism of dynamical systems. Here we show that this technique can also be successfully applied to time series generated by complex systems of much higher dimensionality. Our main example is the investigation of share price returns in a coarse-grained way. A nontrivial spectrum of Rényi entropies is found. We study how the spectrum depends on the time scale of returns, the sector of stocks considered, as well as the number of symbols used for the symbolic description. Overall our analysis confirms that in the symbol space transition probabilities of observed share price returns depend on the entire history of previous symbols, thus emphasizing the need for a modelling based on non-Markovian stochastic processes. Our method allows for quantitative comparisons of entirely different complex systems, for example the statistics of symbol sequences generated by share price returns using 4 symbols can be compared with that of genomic sequences.
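A minimal sketch of the symbolic dynamics approach described above, assuming quantile-based coarse-graining of returns into 4 symbols and empirical Rényi entropies estimated from word (symbol-block) frequencies; the returns are simulated, not actual share prices, and the word length is an arbitrary choice:

```python
import math
import random
from collections import Counter

def symbolize(returns, n_symbols=4):
    # Coarse-grain returns into n_symbols bins cut at empirical quantiles.
    s = sorted(returns)
    cuts = [s[int(len(s) * k / n_symbols)] for k in range(1, n_symbols)]
    return [sum(x > c for c in cuts) for x in returns]

def renyi_entropy(symbols, q, word_len=3):
    # Empirical Renyi entropy per symbol, of order q, from word frequencies.
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    probs = [c / total for c in counts.values()]
    if q == 1:  # Shannon limit of the Renyi family
        h = -sum(p * math.log(p) for p in probs)
    else:
        h = math.log(sum(p ** q for p in probs)) / (1 - q)
    return h / word_len

random.seed(0)
returns = [random.gauss(0, 1) for _ in range(5000)]  # stand-in for returns
sym = symbolize(returns)
for q in (0.5, 1, 2):
    print(f"H_{q} = {renyi_entropy(sym, q):.3f}")
```

For an i.i.d. sequence with 4 equiprobable symbols the whole spectrum collapses to log 4 ≈ 1.386 per symbol; a nontrivial, q-dependent spectrum is the signature of correlations in the symbol sequence.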
Mall, Nathan A; Abrams, Geoffrey D; Azar, Frederick M; Traina, Steve M; Allen, Answorth A; Parker, Richard; Cole, Brian J
2014-06-01
Anterior cruciate ligament (ACL) tears are common in athletes. Techniques and methods of treatment for these injuries continue to vary among surgeons. Thirty National Basketball Association (NBA) team physicians were surveyed during the NBA Pre-Draft Combine. Survey questions involved current and previous practice methods of primary and revision ACL reconstruction, including technique, graft choice, rehabilitation, and treatment of combined ACL and medial collateral ligament injuries. Descriptive parametric statistics, Fisher exact test, and logistic regression were used, and significance was set at α = 0.05. All 30 team physicians completed the survey. Eighty-seven percent indicated they use autograft (81% bone-patellar tendon-bone) for primary ACL reconstruction in NBA athletes, and 43% indicated they use autograft for revision cases. Fourteen surgeons (47%) indicated they use an anteromedial portal (AMP) for femoral tunnel drilling, whereas 5 years earlier only 4 (13%) used this technique. There was a significant (P = .009) positive correlation between fewer years in practice and AMP use. NBA team physicians' use of an AMP for femoral tunnel drilling has increased over the past 5 years.
ERIC Educational Resources Information Center
Sharief, Mostafa; Naderi, Mahin; Hiedari, Maryam Shoja; Roodbari, Omolbanin; Jalilvand, Mohammad Reza
2012-01-01
The aim of current study is to determine the strengths and weaknesses of descriptive evaluation from the viewpoint of principals, teachers and experts of Chaharmahal and Bakhtiari province. A descriptive survey was performed. Statistical population includes 208 principals, 303 teachers, and 100 executive experts of descriptive evaluation scheme in…
NASA Astrophysics Data System (ADS)
Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai
2015-06-01
Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. 
The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
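The two families of statistics compared above can be illustrated side by side. This is a sketch with simulated phi-scale data, using the Folk & Ward percentile formulas and conventional moment formulas; it is not the exact Friedman & Johnson or McManus variants used in the study:

```python
import random
import statistics

# Simulated grain sizes in phi units (illustrative, not the Qiantang data).
rng = random.Random(3)
phi = sorted(rng.gauss(3.0, 0.8) for _ in range(1000))

def percentile(data_sorted, p):
    # Linear-interpolation percentile on pre-sorted data.
    idx = (len(data_sorted) - 1) * p / 100
    lo, frac = int(idx), (len(data_sorted) - 1) * p / 100 - int(idx)
    hi = min(lo + 1, len(data_sorted) - 1)
    return data_sorted[lo] * (1 - frac) + data_sorted[hi] * frac

def folk_ward(data_sorted):
    # Folk & Ward graphic (percentile) mean, sorting, and skewness.
    p = lambda q: percentile(data_sorted, q)
    mean = (p(16) + p(50) + p(84)) / 3
    sorting = (p(84) - p(16)) / 4 + (p(95) - p(5)) / 6.6
    skew = ((p(16) + p(84) - 2 * p(50)) / (2 * (p(84) - p(16)))
            + (p(5) + p(95) - 2 * p(50)) / (2 * (p(95) - p(5))))
    return mean, sorting, skew

def moment_stats(data):
    # First three moment statistics: mean, standard deviation, skewness.
    mean = statistics.fmean(data)
    sd = statistics.pstdev(data)
    skew = sum((x - mean) ** 3 for x in data) / (len(data) * sd ** 3)
    return mean, sd, skew

print("Folk & Ward:", folk_ward(phi))
print("Moments:    ", moment_stats(phi))
```

The contrast discussed in the abstract shows up here: the percentile skewness depends only on five percentiles and so ignores small tail components, while the moment skewness cubes every deviation and is therefore dominated by them.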
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018. The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean
Serum proteins by capillary zone electrophoresis: approaches to the definition of reference values.
Petrini, C; Alessio, M G; Scapellato, L; Brambilla, S; Franzini, C
1999-10-01
The Paragon CZE 2000 (Beckman Analytical, Milan, Italy) is an automatic dedicated capillary zone electrophoresis (CZE) system, producing a five-zone serum protein pattern with quantitative estimation of the zones. With the view of substituting this instrument for two previously used serum protein electrophoresis techniques, we planned to produce reference values for the "new" systems leading to compatible interpretation of the results. High resolution cellulose acetate electrophoresis with visual inspection and descriptive reporting (HR-CAE) and five-zone cellulose acetate electrophoresis with densitometry (CAE-D) were the previously used techniques. Serum samples (n = 167) giving "normal pattern" with HR-CAE were assayed with the CZE system, and the results were statistically assessed to yield 0.95 reference intervals. One thousand normal and pathological serum samples were then assayed with the CAE-D and the CZE techniques, and the regression equations of the CAE-D values over the CZE values for the five zones were used to transform the CAE-D reference limits into the CZE reference limits. The two sets of reference values thereby produced were in good agreement with each other and also with reference values previously reported for the CZE system. Thus, reference values for the CZE techniques permit interpretation of results coherent with the previously used techniques and reporting modes.
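The nonparametric route to a 0.95 reference interval mentioned above (taking the central 95% of a "normal" reference sample) can be sketched as follows; the zone values are simulated, not the study's measurements:

```python
import random

# Simulated protein-zone percentages from 167 "normal" sera
# (illustrative values, not the study's measurements).
rng = random.Random(42)
values = sorted(rng.gauss(3.5, 0.6) for _ in range(167))

def reference_interval(sorted_vals, coverage=0.95):
    # Nonparametric central reference interval: the 2.5th and 97.5th
    # percentiles of the reference sample, by linear interpolation.
    lo_p = (1 - coverage) / 2
    n = len(sorted_vals)
    def pct(p):
        idx = (n - 1) * p
        lo = int(idx)
        hi = min(lo + 1, n - 1)
        frac = idx - lo
        return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac
    return pct(lo_p), pct(1 - lo_p)

lo, hi = reference_interval(values)
print(f"0.95 reference interval: {lo:.2f}-{hi:.2f}")
```

The study's second step, transforming old-method reference limits through a regression of one method's values on the other's, would simply apply the fitted regression equation to the two endpoints produced here.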
A descriptive study of "being with woman" during labor and birth.
Hunter, Lauren P
2009-01-01
The objective of this study was to learn more about women's perceptions of the nurse-midwifery practice of "being with woman" during childbirth. The descriptive, correlational design used a convenience sample of 238 low-risk postpartum women in a hospital nurse-midwifery practice with two childbirth settings: a standard labor and delivery unit and an in-hospital birth center. The main outcome measure was a 29-item, seven-response Likert scale questionnaire, the Positive Presence Index (PPI), administered to women cared for during labor and birth by nurse-midwives to measure the concept of being with woman. Statistical analysis demonstrated that women who gave birth in the in-hospital birth center, or who began labor there prior to an indicated transfer to the standard labor and delivery unit, gave higher PPI scores than women who were admitted to and gave birth on the standard labor and delivery unit. Parity, ethnicity, number of midwives attending, presence of personal support persons, length of labor, and pain relief medications were unrelated to PPI scores. Two coping/comfort techniques, music therapy and breathing, were associated with higher PPI scores than those of women who did not use the techniques. These results can be used to encourage continued use of midwifery care, to support low client-to-midwife caseloads during childbirth, and to modify hospital settings to include more in-hospital birth centers.
Compressive Sampling based Image Coding for Resource-deficient Visual Communication.
Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen
2016-04-14
In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced, which is a polyphase down-sampled version of the input image; but the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements placed in the original spatial configuration. The advantages of local random measurements are twofold: 1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; 2) they remain a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered multiple descriptions of the original image, so the proposed scheme also has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from the received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive with existing methods, with a unique strength in recovering fine details and sharp edges at low bit-rates.
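A toy sketch of the measurement step described above, assuming a single local random binary kernel applied blockwise before polyphase down-sampling; the block size, seed, and helper name are illustrative, not the authors' implementation:

```python
import random

def random_binary_measurements(image, block=2, seed=7):
    # Replace low-pass pre-filtering with a local random binary convolution:
    # each retained pixel is a 0/1-weighted sum over its block neighborhood,
    # and the image is then polyphase down-sampled by `block`.
    rng = random.Random(seed)
    h, w = len(image), len(image[0])
    kernel = [[rng.randint(0, 1) for _ in range(block)] for _ in range(block)]
    out = []
    for i in range(0, h - block + 1, block):
        row = []
        for j in range(0, w - block + 1, block):
            row.append(sum(kernel[a][b] * image[i + a][j + b]
                           for a in range(block) for b in range(block)))
        out.append(row)
    return out, kernel

measurements, kernel = random_binary_measurements([[1, 2], [3, 4]] * 2 and
                                                  [[1, 2, 3, 4]] * 4)
print(kernel, measurements)
```

Because the output is itself a small image of integer samples, it could in principle be handed to any standard codec, which is the property the scheme exploits.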
Müller, Erich A; Jackson, George
2014-01-01
A description of fluid systems with molecular-based algebraic equations of state (EoSs) and by direct molecular simulation is common practice in chemical engineering and the physical sciences, but the two approaches are rarely closely coupled. The key for an integrated representation is through a well-defined force field and Hamiltonian at the molecular level. In developing coarse-grained intermolecular potential functions for the fluid state, one typically starts with a detailed, bottom-up quantum-mechanical or atomic-level description and then integrates out the unwanted degrees of freedom using a variety of techniques; an iterative heuristic simulation procedure is then used to refine the parameters of the model. By contrast, with a top-down technique, one can use an accurate EoS to link the macroscopic properties of the fluid and the force-field parameters. We discuss the latest developments in a top-down representation of fluids, with a particular focus on a group-contribution formulation of the statistical associating fluid theory (SAFT-γ). The accurate SAFT-γ EoS is used to estimate the parameters of the Mie force field, which can then be used with confidence in direct molecular simulations to obtain thermodynamic, structural, interfacial, and dynamical properties that are otherwise inaccessible from the EoS. This is exemplified for several prototypical fluids and mixtures, including carbon dioxide, hydrocarbons, perfluorohydrocarbons, and aqueous surfactants.
Ojelabi, Rapheal A; Afolabi, Adedeji O; Oyeyipo, Opeyemi O; Tunji-Olayeni, Patience F; Adewale, Bukola A
2018-06-01
Integrating social client relationship management (CRM 2.0) in the built environment can enhance the relationship between construction organizations and clients, sustaining long-lasting collaboration. This data exploration analyzed the e-readiness of contracting and consulting construction firms in the uptake of CRM 2.0 and the barriers encountered in the adoption of this modern business tool. The targeted organizations consist of seventy-five (75) construction businesses operating in Lagos State, selected from a pool of registered contracting and consulting construction firms using a random sampling technique. Descriptive statistics of the e-readiness of contracting and consulting construction firms for CRM 2.0 adoption and of the barriers limiting its uptake were analyzed. In addition, inferential analysis using Mann-Whitney U tests and independent-samples t-tests was performed on the dataset obtained. The data generated will support construction firms on the necessity of engaging in social client relationship management to ensure sustainable client relationships in the built environment.
Detection and Estimation of an Optical Image by Photon-Counting Techniques. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Wang, Lily Lee
1973-01-01
A statistical description of a photoelectric detector is given. The photosensitive surface of the detector is divided into many small areas, and the moment generating function of the photon-counting statistic is derived for a large time-bandwidth product. The detection of a specified optical image in the presence of background light by means of hypothesis testing is discussed. The ideal detector, based on the likelihood ratio from a set of numbers of photoelectrons ejected from many small areas of the photosensitive surface, is studied and compared with the threshold detector and with a simple detector based on the likelihood ratio obtained by counting the total number of photoelectrons from a finite area of the surface. The intensity of the image is assumed to be spatially Gaussian distributed against the uniformly distributed background light. The numerical approximation by the method of steepest descent is used, and the calculations of the reliabilities of the detectors are carried out by a digital computer.
Brondani, Lucas Pradebon; Pereira-Cenci, Tatiana; Wandsher, Vinicius Felipe; Pereira, Gabriel Kalil; Valandro, Luis Felipe; Bergoli, César Dalmolin
2017-04-10
Resin cements are often used for single-crown cementation due to their physical properties. Self-adhesive resin cements have gained widespread use owing to their simplified technique compared with regular resin cements. However, clinical evidence about the long-term behavior of this material is lacking. The aim of this prospective clinical trial was to assess the survival rates of metal-ceramic crowns cemented with self-adhesive resin cement over up to six years. One hundred and twenty-nine subjects received 152 metal-ceramic crowns. The cementation procedures were standardized and performed by previously trained operators. The crowns were assessed for the primary outcome (debonding) and against FDI criteria. Statistical analysis was performed using Kaplan-Meier statistics and descriptive analysis. Three failures (debondings) occurred, resulting in a 97.6% survival rate. FDI criteria assessment resulted in scores 1 and 2 (clinically acceptable) for all surviving crowns. The use of self-adhesive resin cement is a feasible alternative for metal-ceramic crown cementation, achieving high and adequate survival rates.
Sexual Assault Prevention and Response Climate DEOCS 4.1 Construct Validity Summary
2017-08-01
DEOCS, (7) examining variance and descriptive statistics, (8) examining the relationship among items/areas to reduce multicollinearity, and (9) selecting items that demonstrate the strongest scale properties. Included is a review of the 4.0 description and items, followed by the proposed … Tables 1-7 for the description of each measure and corresponding items. Table 1. DEOCS 4.0 Perceptions of Safety Measure Description
Statistical analysis of content of Cs-137 in soils in Bansko-Razlog region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobilarov, R. G., E-mail: rkobi@tu-sofia.bg
Statistical analysis of the data set consisting of the activity concentrations of ¹³⁷Cs in soils in the Bansko-Razlog region was carried out in order to establish the dependence of the deposition and migration of ¹³⁷Cs on soil type. The descriptive statistics and the test of normality show that the data set does not have a normal distribution. A positively skewed distribution and possible outlying values of the ¹³⁷Cs activity in soils were observed. After reduction of the effects of outliers, the data set was divided into two parts depending on soil type. A test of normality of the two new data sets shows that they have normal distributions. The ordinary kriging technique was used to characterize the spatial distribution of the ¹³⁷Cs activity over an area covering 40 km² (the whole Razlog valley). The result (a map of the spatial distribution of the activity concentration of ¹³⁷Cs) can be used as a reference point for future studies on the assessment of radiological risk to the population and of soil erosion in the study area.
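The descriptive screening described above (checking skewness and flagging outliers before splitting the data set) can be sketched with stdlib tools; the activity values below are invented for illustration, not the measured Bansko-Razlog data:

```python
import statistics

# Hypothetical Cs-137 activity concentrations in Bq/kg (illustrative values).
activity = [12.1, 8.4, 9.7, 15.3, 7.2, 11.0, 9.1, 45.8, 10.5, 8.9, 13.4, 52.0]

def skewness(data):
    # Moment coefficient of skewness; > 0 means a right (positive) skew.
    m = statistics.fmean(data)
    sd = statistics.pstdev(data)
    return sum((x - m) ** 3 for x in data) / (len(data) * sd ** 3)

def iqr_outliers(data):
    # Tukey's rule: flag values beyond 1.5 * IQR outside the quartiles.
    q1, q2, q3 = statistics.quantiles(data, n=4)
    iqr = q3 - q1
    return [x for x in data if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]

print(f"mean = {statistics.fmean(activity):.1f}, skew = {skewness(activity):.2f}")
print("outliers:", iqr_outliers(activity))
```

A clearly positive skewness together with high outliers is exactly the pattern that motivates trimming the outliers and re-testing normality before interpolation by kriging.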
Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M
2015-03-01
Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.
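For short single-case series like the one described, a randomization (permutation) test on the phase means is a common alternative to a t-test and needs only the standard library; this sketch uses invented weekly ratings, not the case study's data:

```python
import random
import statistics

# Hypothetical weekly anxiety ratings (0-10): baseline phase, then CBT phase.
baseline = [8, 7, 8, 9, 7, 8]
intervention = [6, 5, 4, 5, 3, 4, 3]

def permutation_test(a, b, n_iter=10000, seed=1):
    # Two-sided randomization test on the difference of phase means:
    # shuffle the pooled observations and count how often a difference
    # at least as extreme as the observed one arises by chance.
    observed = statistics.fmean(a) - statistics.fmean(b)
    pooled = a + b
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = statistics.fmean(pooled[:len(a)]) - statistics.fmean(pooled[len(a):])
        if abs(diff) >= abs(observed):
            hits += 1
    return hits / n_iter

drop = statistics.fmean(baseline) - statistics.fmean(intervention)
p = permutation_test(baseline, intervention)
print(f"mean drop = {drop:.2f}, p = {p:.4f}")
```

A small p suggests the phase difference is unlikely under random assignment of observations to phases; note that, like the t-test, this ignores the autocorrelation typical of session-by-session symptom data.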
Radiation from quantum weakly dynamical horizons in loop quantum gravity.
Pranzetti, Daniele
2012-07-06
We provide a statistical mechanical analysis of quantum horizons near equilibrium in the grand canonical ensemble. By matching the description of the nonequilibrium phase in terms of weakly dynamical horizons with a local statistical framework, we implement loop quantum gravity dynamics near the boundary. The resulting radiation process provides a quantum gravity description of the horizon evaporation. For large black holes, the spectrum we derive presents a discrete structure which could be potentially observable.
Hardware description languages
NASA Technical Reports Server (NTRS)
Tucker, Jerry H.
1994-01-01
Hardware description languages are special-purpose programming languages. They are primarily used to specify the behavior of digital systems and are rapidly replacing traditional digital system design techniques. This is because they allow the designer to concentrate on how the system should operate rather than on implementation details. Hardware description languages allow a digital system to be described with a wide range of abstraction, and they support top-down design techniques. A key feature of any hardware description language environment is its ability to simulate the modeled system. The two most important hardware description languages are Verilog and VHDL. Verilog has been the dominant language for the design of application-specific integrated circuits (ASICs). However, VHDL is rapidly gaining in popularity.
ERIC Educational Resources Information Center
Chan, Shiau Wei; Ismail, Zaleha
2014-01-01
The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment where more attention has been paid to the core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the significant three types of statistical…
Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A
2011-01-01
Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.
34 CFR 668.49 - Institutional fire safety policies and fire statistics.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false Institutional fire safety policies and fire statistics... fire statistics. (a) Additional definitions that apply to this section. Cause of fire: The factor or... statistics described in paragraph (c) of this section. (2) A description of each on-campus student housing...
Use of communication techniques by Maryland dentists.
Maybury, Catherine; Horowitz, Alice M; Wang, Min Qi; Kleinman, Dushanka V
2013-12-01
Health care providers' use of recommended communication techniques can increase patients' adherence to prevention and treatment regimens and improve patient health outcomes. The authors conducted a survey of Maryland dentists to determine the number and type of communication techniques they use on a routine basis. The authors mailed a 30-item questionnaire to a random sample of 1,393 general practice dentists and all 169 members of the Maryland chapter of the American Academy of Pediatric Dentistry. The overall response rate was 38.4 percent. Analysis included descriptive statistics, analysis of variance, and ordinary least squares regression to examine the association of dentists' characteristics with the number of communication techniques used. The significance level was set at P < .05. General dentists reported routinely using a mean of 7.9 of the 18 communication techniques and 3.6 of the seven basic techniques, whereas pediatric dentists reported using a mean of 8.4 and 3.8 of those techniques, respectively. General dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .01) but not the seven basic techniques (P < .05). Pediatric dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .05) and the seven basic techniques (P < .01). The number of communication techniques that dentists used routinely varied across the 18 techniques and was low for most techniques. Practical implications: professional education is needed both in dental school curricula and in continuing education courses to increase the use of recommended communication techniques. Specifically, dentists and their team members should consider taking communication skills courses and conducting an overall evaluation of their practices for user friendliness.
FAST COGNITIVE AND TASK ORIENTED, ITERATIVE DATA DISPLAY (FACTOID)
2017-06-01
approaches. As a result, the following assumptions guided our efforts in developing modeling and descriptive metrics for evaluation purposes … Application Evaluation. Our analytic workflow for evaluation is to first provide descriptive statistics about applications across metrics (performance … distributions for evaluation purposes because the goal of evaluation is accurate description, not inference (e.g., prediction). Outliers depicted
The Use of Recommended Communication Techniques by Maryland Family Physicians and Pediatricians
Weatherspoon, Darien J.; Horowitz, Alice M.; Kleinman, Dushanka V.; Wang, Min Qi
2015-01-01
Background Health literacy experts and the American Medical Association have developed recommended communication techniques for healthcare providers given that effective communication has been shown to greatly improve health outcomes. The purpose of this study was to determine the number and types of communication techniques routinely used by Maryland physicians. Methods In 2010, a 30-item survey was mailed to a random sample of 1,472 Maryland family physicians and pediatricians, with 294 surveys being returned and usable. The survey contained questions about provider and practice characteristics, and 17 items related to communication techniques, including seven basic communication techniques. Physicians’ use of recommended communication techniques was analyzed using descriptive statistics, analysis of variance, and ordinary least squares regression. Results Family physicians routinely used an average of 6.6 of the 17 total techniques and 3.3 of the seven basic techniques, whereas pediatricians routinely used 6.4 and 3.2 techniques, respectively. The use of simple language was the only technique that nearly all physicians routinely utilized (Family physicians, 91%; Pediatricians, 93%). Physicians who had taken a communications course used significantly more techniques than those who had not. Physicians with a low percentage of patients on Medicaid were significantly less likely to use the recommended communication techniques compared to those providers who had high proportion of their patient population on Medicaid. Conclusions Overall, the use of recommended communication techniques was low. Additionally, many physicians were unsure of the effectiveness of several of the recommended techniques, which could suggest that physicians are unaware of valuable skills that could enhance their communication. The findings of this study suggest that communications training should be given a higher priority in the medical training process in the United States. PMID:25856371
Chadeau-Hyam, Marc; Campanella, Gianluca; Jombart, Thibaut; Bottolo, Leonardo; Portengen, Lutzen; Vineis, Paolo; Liquet, Benoit; Vermeulen, Roel C H
2013-08-01
Recent technological advances in molecular biology have given rise to numerous large-scale datasets whose analysis imposes serious methodological challenges mainly relating to the size and complex structure of the data. Considerable experience in analyzing such data has been gained over the past decade, mainly in genetics, from the Genome-Wide Association Study era, and more recently in transcriptomics and metabolomics. Building upon the corresponding literature, we provide here a nontechnical overview of well-established methods used to analyze OMICS data within three main types of regression-based approaches: univariate models including multiple testing correction strategies, dimension reduction techniques, and variable selection models. Our methodological description focuses on methods for which ready-to-use implementations are available. We describe the main underlying assumptions, the main features, and advantages and limitations of each of the models. This descriptive summary constitutes a useful tool for driving methodological choices while analyzing OMICS data, especially in environmental epidemiology, where the emergence of the exposome concept clearly calls for unified methods to analyze marginally and jointly complex exposure and OMICS datasets. Copyright © 2013 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Quitadamo, L. R.; Cavrini, F.; Sbernini, L.; Riillo, F.; Bianchi, L.; Seri, S.; Saggio, G.
2017-02-01
Support vector machines (SVMs) are widely used classifiers for detecting physiological patterns in human-computer interaction (HCI). Their success is due to their versatility, robustness and large availability of free dedicated toolboxes. Frequently in the literature, insufficient details about the SVM implementation and/or parameter selection are reported, making it impossible to reproduce the study analysis and results. In order to perform an optimized classification and report a proper description of the results, it is necessary to have a comprehensive critical overview of the applications of SVM. The aim of this paper is to provide a review of the usage of SVM in the determination of brain and muscle patterns for HCI, by focusing on electroencephalography (EEG) and electromyography (EMG) techniques. In particular, an overview of the basic principles of SVM theory is outlined, together with a description of several relevant literature implementations. Furthermore, details concerning the reviewed papers are listed in tables, and statistics on the use of SVM in the literature are presented. The suitability of SVM for HCI is discussed, and critical comparisons with other classifiers are reported.
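As an illustration of the underlying principle only (not any reviewed implementation), a linear SVM can be trained from scratch with the Pegasos sub-gradient method; the two-dimensional "feature" data below are toy values, not EEG/EMG measurements:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=500, seed=0):
    # Minimal linear SVM trained with the Pegasos sub-gradient method.
    # A constant feature is appended to each sample so the bias is learned
    # as part of w. Labels y must be in {-1, +1}.
    rng = random.Random(seed)
    X = [list(x) + [1.0] for x in X]  # bias trick
    n, d = len(X), len(X[0])
    w = [0.0] * d
    t = 0
    for _ in range(epochs):
        idx = list(range(n))
        rng.shuffle(idx)
        for i in idx:
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            w = [(1 - eta * lam) * wj for wj in w]  # regularization shrink
            if margin < 1:                           # hinge-loss violator
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
    return w

def predict(w, x):
    score = sum(wj * xj for wj, xj in zip(w, list(x) + [1.0]))
    return 1 if score >= 0 else -1

# Toy, linearly separable feature vectors (illustrative only).
X = [[1.0, 2.0], [1.5, 1.8], [1.2, 2.2], [4.0, 4.5], [4.2, 4.0], [3.8, 4.8]]
y = [-1, -1, -1, 1, 1, 1]
w = train_linear_svm(X, y)
print([predict(w, x) for x in X])
```

In practice one would use a mature toolbox with kernels and cross-validated parameter selection; the point of the sketch is that the reported regularization parameter, number of epochs, and bias handling all change the result, which is why the review stresses documenting them.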
Mathematical and Statistical Software Index. Final Report.
ERIC Educational Resources Information Center
Black, Doris E., Comp.
Brief descriptions are provided of general-purpose mathematical and statistical software, including 27 "stand-alone" programs, three subroutine systems, and two nationally recognized statistical packages, which are available in the Air Force Human Resources Laboratory (AFHRL) software library. This index was created to enable researchers…
Education Statistics Quarterly, Spring 2001.
ERIC Educational Resources Information Center
Education Statistics Quarterly, 2001
2001-01-01
The "Education Statistics Quarterly" gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products and funding opportunities developed over a 3-month period. Each issue…
Radar derived spatial statistics of summer rain. Volume 1: Experiment description
NASA Technical Reports Server (NTRS)
Katz, I.; Arnold, A.; Goldhirsh, J.; Konrad, T. G.; Vann, W. L.; Dobson, E. B.; Rowland, J. R.
1975-01-01
An experiment was performed at Wallops Island, Virginia, to obtain a statistical description of summer rainstorms. Its purpose was to obtain information needed for design of earth and space communications systems in which precipitation in the earth's atmosphere scatters or attenuates the radio signal. Rainstorms were monitored with the high resolution SPANDAR radar and the 3-dimensional structures of the storms were recorded on digital tape. The equipment, the experiment, and tabulated data obtained during the experiment are described.
The value of job analysis, job description and performance.
Wolfe, M N; Coggins, S
1997-01-01
All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.
Cole, William G.; Michael, Patricia; Blois, Marsden S.
1987-01-01
A computer program was created to use information about the statistical distribution of words in journal abstracts to make probabilistic judgments about the level of description (e.g. molecular, cell, organ) of medical text. Statistical analysis of 7,409 journal abstracts taken from three medical journals representing distinct levels of description revealed that many medical words seem to be highly specific to one or another level of description. For example, the word adrenoreceptors occurred only in the American Journal of Physiology, never in Journal of Biological Chemistry or in Journal of American Medical Association. Such highly specific words occurred so frequently that the automatic classification program was able to classify correctly 45 out of 45 test abstracts, with 100% confidence. These findings are interpreted in terms of both a theory of the structure of medical knowledge and the pragmatics of automatic classification.
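A minimal sketch of how word-level statistics can drive this kind of probabilistic classification, assuming a multinomial naive Bayes formulation; the toy vocabularies and levels below are invented for illustration and are not the original program or data.

```python
import math
from collections import Counter

# Toy corpora standing in for abstracts at two levels of description
# (the wording is illustrative, not the original journal data).
train = {
    "molecular": ["enzyme kinase substrate binding affinity",
                  "receptor ligand phosphorylation pathway"],
    "organ":     ["cardiac output ventricular pressure perfusion",
                  "renal blood flow glomerular filtration"],
}

counts = {lvl: Counter(w for doc in docs for w in doc.split())
          for lvl, docs in train.items()}
vocab = set(w for c in counts.values() for w in c)

def classify(text):
    # Multinomial naive Bayes with Laplace smoothing: highly
    # level-specific words dominate the log-likelihood.
    scores = {}
    for lvl, c in counts.items():
        total = sum(c.values())
        scores[lvl] = sum(
            math.log((c[w] + 1) / (total + len(vocab)))
            for w in text.split())
    return max(scores, key=scores.get)

print(classify("substrate binding of the kinase"))   # molecular
print(classify("ventricular perfusion pressure"))    # organ
```

Words that occur in only one class, like adrenoreceptors in the abstract's example, contribute large log-likelihood differences and so dominate the decision.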
78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...
Diaby, Vakaramoko; Sanogo, Vassiki; Moussa, Kouame Richard
2015-01-01
Objective In this paper, the readers are introduced to ELICIT, an imprecise weight elicitation technique for multicriteria decision analysis in healthcare. Methods The application of ELICIT consists of two steps: the rank ordering of evaluation criteria based on decision-makers’ (DMs) preferences using principal component analysis; and the estimation of criteria weights and their descriptive statistics using variable interdependent analysis and the Monte Carlo method. The application of ELICIT is illustrated with a hypothetical case study involving the elicitation of weights for five criteria used to select the best device for eye surgery. Results The criteria were ranked from 1 to 5, based on a strict preference relationship established by the DMs. For each criterion, the deterministic weight was estimated as well as the standard deviation and 95% credibility interval. Conclusions ELICIT is appropriate in situations where only ordinal DMs’ preferences are available to elicit decision criteria weights. PMID:26361235
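ELICIT itself combines principal component analysis and variable interdependent analysis, which are not reproduced here. The sketch below shows only the general Monte Carlo idea of turning a strict criteria ranking into weight estimates with credibility intervals, under the illustrative assumption that weights are drawn uniformly on the simplex and sorted to respect the ranking.

```python
import numpy as np

rng = np.random.default_rng(42)
n_criteria, n_draws = 5, 20_000

# Sample weights uniformly on the simplex, then sort each draw in
# descending order so every draw honours the strict ranking w1 > ... > w5.
w = rng.dirichlet(np.ones(n_criteria), size=n_draws)
w = -np.sort(-w, axis=1)

mean = w.mean(axis=0)
lo, hi = np.percentile(w, [2.5, 97.5], axis=0)
for i in range(n_criteria):
    print(f"criterion {i + 1}: weight {mean[i]:.3f} "
          f"(95% CrI {lo[i]:.3f}-{hi[i]:.3f})")
```

Under this uniform assumption the mean weights converge to the rank-order centroid values, with the credibility intervals quantifying how imprecise a purely ordinal elicitation is.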
Insufficient Knowledge of Breast Cancer Risk Factors Among Malaysian Female University Students
Samah, Asnarulkhadi Abu; Ahmadian, Maryam; Latiff, Latiffah A.
2016-01-01
Background: Despite continuing argument about the efficacy of breast self-examination, it could still be a life-saving technique by inspiring and empowering women to take better control over their bodies and health. This study investigated Malaysian female university students’ knowledge about breast cancer risk factors, signs, and symptoms, and assessed the frequency of breast self-examination among students. Method: A cross-sectional survey was conducted in 2013 in nine public and private universities in the Klang Valley and Selangor. 842 female students responded to the self-administered survey. Simple descriptive and inferential statistics were employed for data analysis. Results: The uptake of breast self-examination (BSE) was less than 50% among the students. Most of the students had insufficient knowledge of several breast cancer risk factors. Conclusion: Efforts should be made to increase knowledge of breast cancer through the development of ethnically and culturally sensitive educational training on BSE and breast cancer literacy. PMID:26234996
Assessment of and standardization for quantitative nondestructive test
NASA Technical Reports Server (NTRS)
Neuschaefer, R. W.; Beal, J. B.
1972-01-01
Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during the design, development, production, and operational phases are assessed. The assessment helps determine what useful quantitative and qualitative structural data may be provided, from raw materials through vehicle refurbishment. It considers metal alloy systems and bonded composites presently applied in active NASA programs or that are strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and are presented along with a description of the structures or standards from which the information was obtained. Examples of NDT technique capabilities and limitations are provided in tabular form. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end-item structure, and refurbishment operations.
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal
2016-01-01
This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, which were then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency increased from 2005 to 2015. However, descriptive statistics were the most frequent method of statistical analysis and the cross-sectional design the most common study design in the published articles.
Geostatistics applied to gas reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meunier, G.; Coulomb, C.; Laille, J.P.
1989-09-01
The spatial distribution of many of the physical parameters connected with a gas reservoir is of primary interest to both engineers and geologists throughout the study, development, and operation of a field. It is therefore desirable for the distribution to be capable of statistical interpretation, to have a simple graphical representation, and to allow data to be entered from either two- or three-dimensional grids. To satisfy these needs while dealing with the geographical variables, new methods have been developed under the name geostatistics. This paper describes briefly the theory of geostatistics and its most recent improvements for the specific problem of subsurface description. The external-drift technique has been emphasized in particular, and in addition, four case studies related to gas reservoirs are presented.
Modern morphometry: new perspectives in physical anthropology.
Mantini, Simone; Ripani, Maurizio
2009-06-01
Over the past one hundred years, physical anthropology has had recourse to ever more efficient methods, which have provided much new information regarding human evolution and biology. Apart from the molecular approach, the introduction of new computer-assisted techniques gave rise to a new concept of morphometry. Computed tomography and 3D imaging allow anatomical description of external and inner structures, overcoming the problems encountered with traditional morphometric methods. Furthermore, geometric morphometrics allows the creation of geometric models to investigate morphological variation in terms of evolution, ontogeny and variability. The integration of these new tools gave rise to virtual anthropology and to a new image of the anthropologist, in which anatomical, biological, mathematical, statistical and data-processing information are fused in a multidisciplinary approach.
Data survey on the effect of product features on competitive advantage of selected firms in Nigeria.
Olokundun, Maxwell; Iyiola, Oladele; Ibidunni, Stephen; Falola, Hezekiah; Salau, Odunayo; Amaihian, Augusta; Peter, Fred; Borishade, Taiye
2018-06-01
The main objective of this study was to present a data article that investigates the effect of product features on a firm's competitive advantage. Few studies have examined how the features of a product could help drive the competitive advantage of a firm. A descriptive research method was used. The Statistical Package for the Social Sciences (SPSS 22) was used to analyze one hundred and fifty (150) valid questionnaires completed by small business owners registered under the Small and Medium Enterprises Development Agency of Nigeria (SMEDAN). Stratified and simple random sampling techniques were employed; reliability and validity procedures were also confirmed. The field data set is made publicly available to enable critical or extended analysis.
Robotic radical cystectomy and intracorporeal urinary diversion: The USC technique.
Abreu, Andre Luis de Castro; Chopra, Sameer; Azhar, Raed A; Berger, Andre K; Miranda, Gus; Cai, Jie; Gill, Inderbir S; Aron, Monish; Desai, Mihir M
2014-07-01
Radical cystectomy is the gold-standard treatment for muscle-invasive and refractory nonmuscle-invasive bladder cancer. We describe our technique for robotic radical cystectomy (RRC) and intracorporeal urinary diversion (ICUD), which replicates open surgical principles, and present our preliminary results. Specific descriptions of preoperative planning, surgical technique, and postoperative care are provided. Demographics and perioperative and 30-day complications data were collected prospectively and analyzed retrospectively. Learning curve trends were analyzed individually for ileal conduits (IC) and neobladders (NB). SAS(®) Software Version 9.3 was used for statistical analyses, with statistical significance set at P < 0.05. Between July 2010 and September 2013, RRC and lymph node dissection with ICUD were performed in 103 consecutive patients (orthotopic NB = 46, IC = 57). All procedures were completed robotically, replicating the open surgical principles. The learning curve trends showed a significant reduction in hospital stay for both IC (11 vs. 6 days, P < 0.01) and orthotopic NB (13 vs. 7.5 days, P < 0.01) when comparing the first third of the cohort with the rest of the group. Overall median (range) operative time and estimated blood loss were 7 h (4.8-13) and 200 mL (50-1200), respectively. Within 30 days postoperatively, complications occurred in 61 (59%) patients, with the majority being low grade (n = 43), and no patient died. Median (range) node yield was 36 (0-106) and 4 (3.9%) specimens had positive surgical margins. Robotic radical cystectomy with totally ICUD is safe and feasible. It can be performed using the established open surgical principles with encouraging perioperative outcomes.
2012 aerospace medical certification statistical handbook.
DOT National Transportation Integrated Search
2013-12-01
The annual Aerospace Medical Certification Statistical Handbook reports descriptive : characteristics of all active U.S. civil aviation airmen and the aviation medical examiners (AMEs) that : perform the required medical examinations. The 2012 annual...
Teaching Statistics Using Classic Psychology Research: An Activities-Based Approach
ERIC Educational Resources Information Center
Holmes, Karen Y.; Dodd, Brett A.
2012-01-01
In this article, we discuss a collection of active learning activities derived from classic psychology studies that illustrate the appropriate use of descriptive and inferential statistics. (Contains 2 tables.)
NASA Astrophysics Data System (ADS)
Trigila, Alessandro; Iadanza, Carla; Esposito, Carlo; Scarascia-Mugnozza, Gabriele
2015-11-01
The aim of this work is to define reliable susceptibility models for shallow landslides using Logistic Regression and Random Forests multivariate statistical techniques. The study area, located in North-East Sicily, was hit on October 1st 2009 by a severe rainstorm (225 mm of cumulative rainfall in 7 h) which caused flash floods and more than 1000 landslides. Several small villages, such as Giampilieri, were hit with 31 fatalities, 6 missing persons and damage to buildings and transportation infrastructures. Landslides, mainly types such as earth and debris translational slides evolving into debris flows, were triggered on steep slopes and involved colluvium and regolith materials which cover the underlying metamorphic bedrock. The work has been carried out with the following steps: i) realization of a detailed event landslide inventory map through field surveys coupled with observation of high resolution aerial colour orthophoto; ii) identification of landslide source areas; iii) data preparation of landslide controlling factors and descriptive statistics based on a bivariate method (Frequency Ratio) to get an initial overview on existing relationships between causative factors and shallow landslide source areas; iv) choice of criteria for the selection and sizing of the mapping unit; v) implementation of 5 multivariate statistical susceptibility models based on Logistic Regression and Random Forests techniques and focused on landslide source areas; vi) evaluation of the influence of sample size and type of sampling on results and performance of the models; vii) evaluation of the predictive capabilities of the models using ROC curve, AUC and contingency tables; viii) comparison of model results and obtained susceptibility maps; and ix) analysis of temporal variation of landslide susceptibility related to input parameter changes. Models based on Logistic Regression and Random Forests have demonstrated excellent predictive capabilities. 
Land use and wildfire variables were found to have a strong control on the occurrence of very rapid shallow landslides.
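Steps v and vii of the workflow above (multivariate susceptibility models and ROC/AUC evaluation) can be sketched with scikit-learn. The synthetic controlling factors and coefficients below are invented stand-ins, not the Sicilian data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
# Synthetic controlling factors standing in for slope, wildfire history, etc.
slope = rng.uniform(0, 60, n)
wildfire = rng.integers(0, 2, n)
logit = 0.08 * slope + 1.2 * wildfire - 4.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # 1 = landslide source cell
X = np.column_stack([slope, wildfire])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
for model in (LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=200, random_state=1)):
    # AUC on held-out mapping units measures predictive capability.
    auc = roc_auc_score(y_te, model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1])
    print(type(model).__name__, round(auc, 2))
```

Holding out a test set before fitting mirrors the study's concern with how sample size and sampling type affect model performance.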
USDA-ARS?s Scientific Manuscript database
The introduction to the second edition of the Compendium of Apple and Pear Diseases contains a general description of genus and species of commercial importance, some general information about growth and fruiting habits as well as recent production statistics. A general description of major scion c...
2011 aerospace medical certification statistical handbook.
DOT National Transportation Integrated Search
2013-01-01
The annual Aerospace Medical Certification Statistical Handbook reports descriptive characteristics of all active U.S. civil aviation airmen and the aviation medical examiners (AMEs) that perform the required medical examinations. The 2011 annual han...
DOT National Transportation Integrated Search
2007-02-01
This annual edition of Large Truck Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks in 2005. Selected crash statistics on passenger vehicles are also presented for comparison pur...
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
Toward improved analysis of concentration data: Embracing nondetects.
Shoari, Niloofar; Dubé, Jean-Sébastien
2018-03-01
Various statistical tests on concentration data serve to support decision-making regarding characterization and monitoring of contaminated media, assessing exposure to a chemical, and quantifying the associated risks. However, the routine statistical protocols cannot be directly applied because of challenges arising from nondetects or left-censored observations, which are concentration measurements below the detection limit of measuring instruments. Despite the existence of techniques based on survival analysis that can adjust for nondetects, these are seldom taken into account properly. A comprehensive review of the literature showed that managing policies regarding analysis of censored data do not always agree and that guidance from regulatory agencies may be outdated. Therefore, researchers and practitioners commonly resort to the most convenient way of tackling the censored data problem by substituting nondetects with arbitrary constants prior to data analysis, although this is generally regarded as a bias-prone approach. Hoping to improve the interpretation of concentration data, the present article aims to familiarize researchers in different disciplines with the significance of left-censored observations and provides theoretical and computational recommendations (under both frequentist and Bayesian frameworks) for adequate analysis of censored data. In particular, the present article synthesizes key findings from previous research with respect to 3 noteworthy aspects of inferential statistics: estimation of descriptive statistics, hypothesis testing, and regression analysis. Environ Toxicol Chem 2018;37:643-656. © 2017 SETAC.
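The contrast between constant substitution and a censoring-aware estimator can be sketched with a censored maximum likelihood fit for a lognormal; the simulated concentrations and detection limit below are illustrative assumptions, not data from the article.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(7)
conc = rng.lognormal(mean=1.0, sigma=1.0, size=500)   # true concentrations
dl = 2.0                                              # detection limit
detected = conc[conc >= dl]
n_cens = int((conc < dl).sum())

# Common shortcut: substitute DL/2 for nondetects (the bias-prone route).
sub_mean = np.mean(np.where(conc < dl, dl / 2, conc))

# Censored maximum likelihood: detected values enter the likelihood
# through the density, nondetects through the CDF at the detection limit.
def nll(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    return -(stats.norm.logpdf(np.log(detected), mu, sigma).sum()
             + n_cens * stats.norm.logcdf(np.log(dl), mu, sigma))

res = optimize.minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
mle_mean = np.exp(mu_hat + sigma_hat**2 / 2)
print(round(sub_mean, 2), round(mle_mean, 2))
print(round(mu_hat, 2), round(sigma_hat, 2))   # should be near 1.0, 1.0
```

Unlike substitution, the likelihood-based estimate recovers the distribution parameters without inventing values for the nondetects.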
2015-03-26
[Thesis front-matter and table-of-contents fragments: acknowledgments to the author's reader, Lieutenant Colonel Robert Overstreet, for helping solidify the research and coaching the statistical analysis; a Descriptive Statistics section; and a note that addressing common-method bias requires careful assessment of potential sources of bias and implementation of procedural and statistical control methods (Podsakoff).]
Using Facebook Data to Turn Introductory Statistics Students into Consultants
ERIC Educational Resources Information Center
Childers, Adam F.
2017-01-01
Facebook provides businesses and organizations with copious data that describe how users are interacting with their page. This data affords an excellent opportunity to turn introductory statistics students into consultants to analyze the Facebook data using descriptive and inferential statistics. This paper details a semester-long project that…
ALISE Library and Information Science Education Statistical Report, 1999.
ERIC Educational Resources Information Center
Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.
This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by…
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow-up to Statistical Analysis of Research Data (SARD), held in April 2018. The tutorial will apply the general principles of statistical analysis of research data, including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and the chi-squared distribution.
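The tutorial topics map directly onto standard SciPy calls; this sketch runs a descriptive summary, a t-test of a mean difference, and a chi-squared test on an invented dataset (all numbers below are illustrative, not course material).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
control = rng.normal(50, 10, 100)      # illustrative measurements
treated = rng.normal(58, 10, 100)

# Descriptive statistics
print(f"mean {treated.mean():.1f}, sd {treated.std(ddof=1):.1f}")

# t-test of the mean difference between two independent groups
t, p = stats.ttest_ind(control, treated)
print(f"t = {t:.2f}, p = {p:.4f}")

# Chi-squared test on a 2x2 contingency table
table = np.array([[30, 10], [18, 22]])
chi2, p_chi, dof, _ = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_chi:.4f}")
```

Reporting the test statistic, degrees of freedom, and exact p-value, as above, follows the reporting practices the surrounding surveys advocate.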
Random field assessment of nanoscopic inhomogeneity of bone.
Dong, X Neil; Luo, Qing; Sparkman, Daniel M; Millwater, Harry R; Wang, Xiaodu
2010-12-01
Bone quality is significantly correlated with the inhomogeneous distribution of material and ultrastructural properties (e.g., modulus and mineralization) of the tissue. Current techniques for quantifying inhomogeneity consist of descriptive statistics such as mean, standard deviation and coefficient of variation. However, these parameters do not describe the spatial variations of bone properties. The objective of this study was to develop a novel statistical method to characterize and quantitatively describe the spatial variation of bone properties at ultrastructural levels. To do so, a random field defined by an exponential covariance function was used to represent the spatial uncertainty of elastic modulus by delineating the correlation of the modulus at different locations in bone lamellae. The correlation length, a characteristic parameter of the covariance function, was employed to estimate the fluctuation of the elastic modulus in the random field. Using this approach, two distribution maps of the elastic modulus within bone lamellae were generated using simulation and compared with those obtained experimentally by a combination of atomic force microscopy and nanoindentation techniques. The simulation-generated maps of elastic modulus were in close agreement with the experimental ones, thus validating the random field approach in defining the inhomogeneity of elastic modulus in lamellae of bone. Indeed, generation of such random fields will facilitate multi-scale modeling of bone in more pragmatic details. Copyright © 2010 Elsevier Inc. All rights reserved.
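The exponential-covariance random field at the heart of the study can be simulated in a few lines via the Cholesky factor of the covariance matrix. This is a generic one-dimensional sketch under assumed modulus values, not the paper's lamellar data or AFM/nanoindentation maps.

```python
import numpy as np

rng = np.random.default_rng(5)
n, dx = 200, 1.0                       # grid points and spacing
corr_len = 10.0                        # correlation length of the modulus

# Exponential covariance C(h) = s^2 * exp(-|h| / L) on a 1-D grid.
x = np.arange(n) * dx
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Simulate a correlated field from the Cholesky factor of the covariance
# (small jitter keeps the factorization numerically stable).
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))
field = 20.0 + 2.0 * (L @ rng.standard_normal(n))   # illustrative GPa scale

# Neighbouring sites are strongly correlated, unlike an uncorrelated
# description by mean and standard deviation alone.
lag1 = np.corrcoef(field[:-1], field[1:])[0, 1]
print(round(lag1, 2))
```

The correlation length parameter plays the same role as in the study: larger values of `corr_len` produce smoother spatial fluctuation of the simulated modulus.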
Obala, A. A.; Simiyu, C. J.; Odhiambo, D. O.; Nanyu, V.; Chege, P.; Downing, R.; Mwaliko, E.; Mwangi, A. W.; Menya, D.; Chelagat, D.; Nyamogoba, H. D. N.; Ayuo, P. O.; O'Meara, W. P.; Twagirumukiza, M.; Vandenbroek, D.; Otsyula, B. B. O.; de Maeseneer, J.
2013-01-01
Background. The intestinal parasitic infections (IPIs) are globally endemic, and they constitute the greatest cause of illness and disease worldwide. Transmission of IPIs occurs as a result of inadequate sanitation, inaccessibility to potable water, and poor living conditions. Objectives. To determine a baseline prevalence of IPIs among children of five years and below at Webuye Health and Demographic Surveillance (HDSS) area in western Kenya. Methods. A cross-sectional survey was used to collect data. Direct saline and formol-ether-sedimentation techniques were used to process the specimens. Descriptive and inferential statistics such as chi-square statistics were used to analyze the data. Results. A prevalence of 52.3% (417/797) was obtained, with male children slightly more infected than female (53.5% versus 51%), but this was not significant (χ2 = 0.482, P > 0.05). Giardia lamblia and Entamoeba histolytica were the most common pathogenic IPIs with a prevalence of 26.1% (208/797) and 11.2% (89/797), respectively. Soil-transmitted helminths (STHs) were less common with a prevalence of 4.8% (38/797), 3.8% (30/797), and 0.13% (1/797) for Ascaris lumbricoides, hookworms, and Trichuris trichiura, respectively. Conclusions. Giardia lamblia and E. histolytica were the most prevalent pathogenic intestinal protozoa, while STHs were less common. Community-based health promotion techniques are recommended for controlling these parasites. PMID:23533444
NASA Astrophysics Data System (ADS)
Pacheco, Adriano M. G.; Freitas, Maria do Carmo; Reis, Miguel A.
2003-06-01
As part of an ongoing evaluation of its suitability for atmospheric biomonitoring, bark from olive trees ( Olea europaea Linn.) has been collected and searched for trace elements by means of two nuclear-analytical techniques—instrumental neutron activation analysis (INAA) and proton-induced X-ray emission (PIXE). The sampling for the present study was carried out across two separate sections of an established grid for air-quality surveys in mainland Portugal. The dual location comprises 58 collection sites—littoral-north (29 sites) and littoral-centre (29 sites). Both techniques are intrinsically accurate and may be seen to complement each other in the way that, as a whole, they yield 46 elements, with an overlap of 16 elements. Among the latter, this paper focuses on four of them and looks into their joint determination. Descriptive statistics for soil-related Al and Ti, and for sea-related Cl and Br, show results for each element to be fairly comparable. The degree of association between elemental patterns by either technique, as seen through nonparametric tests (Kendall's RK), is outstanding. No statistical evidence (Wilcoxon's T) for relative bias in correlated samples—consistently higher or lower results by one technique—could be found as well. As far as this study goes, INAA and PIXE may be used interchangeably for determining the present elements in olive-tree bark.
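The two statistical checks the study describes, association between elemental patterns and relative bias in correlated samples, correspond to Kendall's tau and the Wilcoxon signed-rank test in SciPy. The simulated paired measurements below are illustrative assumptions, not the Portuguese bark data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Simulated concentrations at 20 sites measured by two techniques:
# same underlying spatial pattern, independent measurement noise, no bias.
true_conc = rng.lognormal(3.0, 0.5, 20)
inaa = true_conc * rng.normal(1.0, 0.05, 20)
pixe = true_conc * rng.normal(1.0, 0.05, 20)

# Association between the two elemental patterns (Kendall's tau).
tau, p_tau = stats.kendalltau(inaa, pixe)
# Test for relative bias in correlated samples (Wilcoxon signed-rank).
w, p_w = stats.wilcoxon(inaa, pixe)
print(f"tau = {tau:.2f} (p = {p_tau:.4f}), Wilcoxon p = {p_w:.2f}")
```

A high tau with a non-significant Wilcoxon result is the pattern the study reports: the techniques agree on the elemental pattern and neither runs consistently higher than the other.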
50 CFR Figure 1 to Part 679 - Bering Sea and Aleutian Islands Statistical and Reporting Areas
Code of Federal Regulations, 2011 CFR
2011-10-01
... Statistical and Reporting Areas 1 Figure 1 to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND... Islands Statistical and Reporting Areas ER15NO99.000 b. Coordinates Code Description 300 Russian waters... statistical area is the part of a reporting area contained in the EEZ. [64 FR 61983, Nov. 15, 1999; 65 FR...
50 CFR Figure 1 to Part 679 - Bering Sea and Aleutian Islands Statistical and Reporting Areas
Code of Federal Regulations, 2010 CFR
2010-10-01
... Statistical and Reporting Areas 1 Figure 1 to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND... Islands Statistical and Reporting Areas ER15NO99.000 b. Coordinates Code Description 300 Russian waters... statistical area is the part of a reporting area contained in the EEZ. [64 FR 61983, Nov. 15, 1999; 65 FR...
The Performance of Preparatory School Candidates at the United States Naval Academy
2001-09-01
[Table-of-contents fragments: Differences in Characteristics; Differences in ... Coefficients; Table 3.3, Applicant/Midshipman Background Characteristics; Table 3.4, Descriptive Characteristics for Midshipmen by Accession Source; Table 3.5, Descriptive Statistics.]
Trends in motor vehicle traffic collision statistics, 1988-1997
DOT National Transportation Integrated Search
2001-02-01
This report presents descriptive statistics about Canadian traffic collisions during the ten-year period : from 1988 to 1997, focusing specifically on casualty collisions. Casualty collisions are defined as all : reportable motor vehicle crashes resu...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-23
... in Tables A and B. Table D--Borrower Closing Costs and Seller Concessions Descriptive Statistics by... accuracy of the statistical data illustrating the correlation between higher seller concessions and an...
Episodes of voluntary total fasting (hunger strike) in Spanish prisons: A descriptive analysis.
García-Guerrero, J; Vera-Remartínez, E J
2015-08-01
To provide a description of the frequency and main features of the episodes of voluntary total fasting (VTF) taking place in Spanish prisons. Information on the episodes of VTF reported between 04/01/2013 and 03/31/2014 was gathered. Once the appropriate informed consent was given, other data on social, demographic, penitentiary and clinical aspects were collected. A descriptive study of these variables, together with a bivariate analysis, was carried out by means of standard statistical techniques and binary logistic regression models. IBM SPSS Statistics v.20 software was used for this purpose. The study was approved by an accredited Clinical Research Ethics Committee. 354 episodes of VTF took place among an average population of 29,762 prisoners, an incidence rate of 11.9 VTF episodes per 1,000 inmate-years. Informed consent (IC) was given in 180 cases (50.8%). 114 participants were of Spanish nationality, and the average age was 38.7 years (95% CI 37.2-40.1). The median duration of the episodes was 3 days (IQR 1-10), ranging between 1 and 71 days. The main reason was disagreement with the decisions of treatment groups (57 cases, 31.7%). The average loss was 1.3 kg of weight (70.8 vs. 69.5; p < 0.0001) and 0.7 points of BMI (24.5 vs. 23.8; p < 0.0001). 60 prisoners (33.3%) lost no weight at all and only 8 (4.4%) lost over 12% of their basal weight (8.5 kg). A ketone smell was identified in 61 cases (33.9%) and ketonuria in 63 (35%). Only one third of those who go on hunger strike in prison actually fast. Protest-motivated episodes of voluntary total fasting are somewhat common in Spanish prisons, but they are rarely carried out rigorously enough to entail a risk for those who fast. Copyright © 2015. Published by Elsevier Ltd.
Minimum Information about a Genotyping Experiment (MIGEN)
Huang, Jie; Mirel, Daniel; Pugh, Elizabeth; Xing, Chao; Robinson, Peter N.; Pertsemlidis, Alexander; Ding, LiangHao; Kozlitina, Julia; Maher, Joseph; Rios, Jonathan; Story, Michael; Marthandan, Nishanth; Scheuermann, Richard H.
2011-01-01
Genotyping experiments are widely used in clinical and basic research laboratories to identify associations between genetic variations and normal/abnormal phenotypes. Genotyping assay techniques vary from single genomic regions that are interrogated using PCR reactions to high-throughput assays examining genome-wide sequence and structural variation. The resulting genotype data may include millions of markers from thousands of individuals, requiring various statistical, modeling or other data analysis methodologies to interpret the results. To date, there are no standards for reporting genotyping experiments. Here we present the Minimum Information about a Genotyping Experiment (MIGen) standard, defining the minimum information required for reporting genotyping experiments. The MIGen standard covers experimental design, subject description, genotyping procedure, quality control and data analysis. MIGen is a registered project under MIBBI (Minimum Information for Biological and Biomedical Investigations) and is being developed by an interdisciplinary group of experts in basic biomedical science, clinical science, biostatistics and bioinformatics. To accommodate the wide variety of techniques and methodologies applied in current and future genotyping experiments, MIGen leverages foundational concepts from the Ontology for Biomedical Investigations (OBI) for the description of the various types of planned processes and implements a hierarchical document structure. The adoption of MIGen by the research community will facilitate consistent genotyping data interpretation and independent data validation. MIGen can also serve as a framework for the development of data models for capturing and storing genotyping results and experiment metadata in a structured way, to facilitate the exchange of metadata. PMID:22180825
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shumway, R.H.; McQuarrie, A.D.
Robust statistical approaches to the problem of discriminating between regional earthquakes and explosions are developed. We compare linear discriminant analysis using descriptive features like amplitude and spectral ratios with signal discrimination techniques using the original signal waveforms and spectral approximations to the log likelihood function. Robust information theoretic techniques are proposed and all methods are applied to 8 earthquakes and 8 mining explosions in Scandinavia and to an event from Novaya Zemlya of unknown origin. It is noted that signal discrimination approaches based on discrimination information and Renyi entropy perform better in the test sample than conventional methods based on spectral ratios involving the P and S phases. Two techniques for identifying the ripple-firing pattern for typical mining explosions are proposed and shown to work well on simulated data and on several Scandinavian earthquakes and explosions. We use both cepstral analysis in the frequency domain and a time domain method based on the autocorrelation and partial autocorrelation functions. The proposed approach strips off underlying smooth spectral and seasonal spectral components corresponding to the echo pattern induced by two simple ripple-fired models. For two mining explosions, a pattern is identified whereas for two earthquakes, no pattern is evident.
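The cepstral step can be sketched in a few lines: the real cepstrum is the inverse FFT of the log magnitude spectrum, so an echo at a fixed delay (the ripple-fire signature) appears as a peak at the corresponding quefrency. The signal, echo delay, and echo amplitude below are synthetic illustrations, not the study's seismic data.

```python
import numpy as np

def real_cepstrum(x):
    """Real cepstrum: inverse FFT of the log magnitude spectrum."""
    spectrum = np.abs(np.fft.fft(x))
    return np.fft.ifft(np.log(spectrum + 1e-12)).real

# Synthetic "ripple-fired" signal: white noise plus an echo at a fixed delay.
rng = np.random.default_rng(0)
n, delay = 1024, 50
pulse = rng.standard_normal(n)
signal = pulse.copy()
signal[delay:] += 0.8 * pulse[:-delay]

ceps = real_cepstrum(signal)
peak = int(np.argmax(ceps[10 : n // 2])) + 10  # skip low quefrencies
print(peak)  # a peak near the echo delay of 50 samples
```

The echo contributes a cepstral spike of roughly half the echo amplitude at the delay lag, which stands well above the noise floor of the log spectrum.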
Organizational Commitment DEOCS 4.1 Construct Validity Summary
2017-08-01
commitment construct that targets more specifically on the workgroup frame of reference. Included is a review of the 4.0 description and items...followed by the proposed modifications to the factor. The DEOCS 4.0 description provided for organizational commitment is “members’ dedication to the...5) examining variance and descriptive statistics, and (6) selecting items that demonstrate the strongest scale properties. Table 1. DEOCS 4.0
ERIC Educational Resources Information Center
McClain, Robert L.; Wright, John C.
2014-01-01
A description of shot noise and the role it plays in absorption and emission measurements using photodiode and photomultiplier tube detection systems is presented. This description includes derivations of useful forms of the shot noise equation based on Poisson counting statistics. This approach can deepen student understanding of a fundamental…
Statistical representation of multiphase flow
NASA Astrophysics Data System (ADS)
Subramaniam
2000-11-01
The relationship between two common statistical representations of multiphase flow, namely, the single-point Eulerian statistical representation of two-phase flow (D. A. Drew, Ann. Rev. Fluid Mech. (15), 1983), and the Lagrangian statistical representation of a spray using the droplet distribution function (F. A. Williams, Phys. Fluids 1 (6), 1958) is established for spherical dispersed-phase elements. This relationship is based on recent work which relates the droplet distribution function to single-droplet pdfs starting from a Liouville description of a spray (Subramaniam, Phys. Fluids 10 (12), 2000). The Eulerian representation, which is based on a random-field model of the flow, is shown to contain different statistical information from the Lagrangian representation, which is based on a point-process model. The two descriptions are shown to be simply related for spherical, monodisperse elements in statistically homogeneous two-phase flow, whereas such a simple relationship is precluded by the inclusion of polydispersity and statistical inhomogeneity. The common origin of these two representations is traced to a more fundamental statistical representation of a multiphase flow, whose concepts derive from a theory for dense sprays recently proposed by Edwards (Atomization and Sprays 10 (3-5), 2000). The issue of what constitutes a minimally complete statistical representation of a multiphase flow is resolved.
Long-term strategy for the statistical design of a forest health monitoring system
Hans T. Schreuder; Raymond L. Czaplewski
1993-01-01
A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...
Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.
ERIC Educational Resources Information Center
Ojeda, Mario Miguel; Sahai, Hardeo
2002-01-01
Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…
Computers as an Instrument for Data Analysis. Technical Report No. 11.
ERIC Educational Resources Information Center
Muller, Mervin E.
A review of statistical data analysis involving computers as a multi-dimensional problem provides the perspective for consideration of the use of computers in statistical analysis and the problems associated with large data files. An overall description of STATJOB, a particular system for doing statistical data analysis on a digital computer,…
A Descriptive Study of Individual and Cross-Cultural Differences in Statistics Anxiety
ERIC Educational Resources Information Center
Baloglu, Mustafa; Deniz, M. Engin; Kesici, Sahin
2011-01-01
The present study investigated individual and cross-cultural differences in statistics anxiety among 223 Turkish and 237 American college students. A 2 x 2 between-subjects factorial multivariate analysis of covariance (MANCOVA) was performed on the six dependent variables which are the six subscales of the Statistical Anxiety Rating Scale.…
Children in the UK: Signposts to Statistics.
ERIC Educational Resources Information Center
Grey, Eleanor
This guide indicates statistical sources in the United Kingdom dealing with children and young people. Regular and occasional sources are listed in a three-column format including the name of the source, a brief description, and the geographic area to which statistics refer. Information is classified under 25 topic headings: abortions; accidents;…
An analysis of the relationship of flight hours and naval rotary wing aviation mishaps
2017-03-01
Statistical estimates found enough evidence to support indicators used for sequestration, high flight hours, night flight, and overwater flight had statistically significant effects on…
Practicing Statistics by Creating Exercises for Fellow Students
ERIC Educational Resources Information Center
Bebermeier, Sarah; Reiss, Katharina
2016-01-01
This article outlines the execution of a workshop in which students were encouraged to actively review the course contents on descriptive statistics by creating exercises for their fellow students. In a first-year statistics course in psychology, 39 out of 155 students participated in the workshop. In a subsequent evaluation, the workshop was…
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal
2016-01-01
Objective: This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the study period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency of use increased from 2005 to 2015. However, descriptive statistics remained the most frequent method of statistical analysis in the published articles, and the cross-sectional design was the most common study design. PMID:27022365
Algorithm for computing descriptive statistics for very large data sets and the exa-scale era
NASA Astrophysics Data System (ADS)
Beekman, Izaak
2017-11-01
An algorithm for Single-point, Parallel, Online, Converging Statistics (SPOCS) is presented. It is suited for in situ analysis that traditionally would be relegated to post-processing, and can be used to monitor the statistical convergence and estimate the error/residual in the quantity, which is useful for uncertainty quantification as well. Today, data may be generated at an overwhelming rate by numerical simulations and proliferating sensing apparatuses in experiments and engineering applications. Monitoring descriptive statistics in real time lets costly computations and experiments be gracefully aborted if an error has occurred, and monitoring the level of statistical convergence allows them to be run for the shortest amount of time required to obtain good results. This algorithm extends work by Pébay (Sandia Report SAND2008-6212). Pébay's algorithms are recast into a converging delta formulation, with provably favorable properties. The mean, variance, covariances and arbitrary higher-order statistical moments are computed in one pass. The algorithm is tested using Sillero, Jiménez, & Moser's (2013, 2014) publicly available UPM high Reynolds number turbulent boundary layer data set, demonstrating numerical robustness, efficiency and other favorable properties.
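The single-pass idea behind such converging statistics can be illustrated with the classic Welford-style update, which Pébay's formulas generalize to higher moments and parallel merges; this is a sketch of the general technique, not the SPOCS implementation itself.

```python
import math

class OnlineStats:
    """Single-pass (online) mean and variance via the Welford update."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0   # running sum of squared deviations from the mean

    def push(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        """Population variance of the samples seen so far."""
        return self.m2 / self.n if self.n else float("nan")

    def stderr(self):
        """Standard error of the mean: a simple convergence monitor."""
        return math.sqrt(self.variance / self.n) if self.n > 1 else float("inf")

s = OnlineStats()
for x in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    s.push(x)
print(s.mean, s.variance)  # 5.0 4.0
```

Because each sample is folded in and then discarded, the stream never needs to be stored, and `stderr()` can be polled in situ to decide when the statistic has converged.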
Barth, Jürgen; Michlig, Nadja; Munder, Thomas
2014-01-01
Randomised controlled trials (RCTs) of psychotherapeutic interventions assume that specific techniques are used in treatments, which are responsible for changes in the client's symptoms. This assumption also holds true for meta-analyses, where evidence for specific interventions and techniques is compiled. However, it has also been argued that different treatments share important techniques and that an emerging consensus about useful treatment strategies is leading to a greater integration of treatments. This makes assumptions about the effectiveness of specific intervention ingredients questionable if the shared (common) techniques are used more often in interventions than the unique techniques. This study investigated the unique or shared techniques in RCTs of cognitive-behavioural therapy (CBT) and short-term psychodynamic psychotherapy (STPP). Psychotherapeutic techniques were coded from 42 masked treatment descriptions of RCTs in the field of depression (1979–2010). CBT techniques were often used in studies identified as either CBT or STPP. However, STPP techniques were only used in STPP-identified studies. Empirical clustering of treatment descriptions did not confirm the original distinction of CBT versus STPP, but instead showed substantial heterogeneity within both approaches. Extraction of psychotherapeutic techniques from the treatment descriptions is feasible and could be used as a content-based approach to classify treatments in systematic reviews and meta-analyses. PMID:25750827
Large truck and bus crash facts, 2010.
DOT National Transportation Integrated Search
2012-09-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2010. Selected crash statistics on passenger vehicles are also presented ...
Large truck and bus crash facts, 2007.
DOT National Transportation Integrated Search
2009-03-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2007. Selected crash statistics on passenger vehicles are also presented ...
Large truck and bus crash facts, 2008.
DOT National Transportation Integrated Search
2010-03-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2008. Selected crash statistics on passenger vehicles are also presented ...
Large truck and bus crash facts, 2011.
DOT National Transportation Integrated Search
2013-10-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2011. Selected crash statistics on passenger vehicles are also presented ...
Large truck and bus crash facts, 2013.
DOT National Transportation Integrated Search
2015-04-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2013. Selected crash statistics on passenger vehicles are also presented ...
Large truck and bus crash facts, 2009.
DOT National Transportation Integrated Search
2011-10-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2009. Selected crash statistics on passenger vehicles are also presented ...
Large truck and bus crash facts, 2012.
DOT National Transportation Integrated Search
2014-06-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2012. Selected crash statistics on passenger vehicles are also presented ...
Surgical Techniques at Cesarean Delivery: A U.S. Survey
Lyell, Deirdre J.; Power, Michael; Murtough, Katie; Ness, Amen; Anderson, Britta; Erickson, Kristine; Schulkin, Jay
2016-01-01
Objective To assess the frequency of surgical techniques at cesarean delivery (CD) among U.S. obstetricians. Methods Members of the American College of Obstetricians and Gynecologists were randomly selected and e-mailed an online survey that assessed surgical closure techniques, demographics, and reasons. Data were analyzed using SPSS (IBM Corp., Armonk, New York, United States), descriptive statistics, and analysis of variance. Results Our response rate was 53%, and 247 surveys were analyzed. A similar number of respondents either “always or usually” versus “rarely or never” reapproximate the rectus muscles (38.4% versus 43.3%, p = 0.39), and close parietal peritoneum (42.5% versus 46.9%, p = 0.46). The most frequently used techniques were double-layer hysterotomy closure among women planning future children (73.3%) and suturing versus stapling skin (67.6%); the least frequent technique was closure of visceral peritoneum (12.2%). Surgeons who perform double-layer hysterotomy closure had fewer years in practice (15.0 versus 18.7 years, p = 0.021); surgeons who close visceral peritoneum were older (55.5 versus 46.4 years old, p < 0.001) and had more years in practice (23.8 versus 13.8 years practice; p < 0.001). Conclusion Similar numbers of obstetricians either reapproximate or leave open the rectus muscles and parietal peritoneum at CD, suggesting that wide variation in practice exists. Surgeon demographics and safety concerns play a role in some techniques. PMID:28825004
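As a sketch of the kind of analysis reported (analysis of variance comparing surgeon characteristics across response groups), a one-way ANOVA with scipy might look like the following; the group labels and all numbers are invented illustrations, not the survey's data.

```python
import numpy as np
from scipy import stats

# Hypothetical years-in-practice for three closure-technique response groups
# ("always", "sometimes", "never" are invented labels, not survey data).
rng = np.random.default_rng(1)
always = rng.normal(15.0, 5.0, 40)
sometimes = rng.normal(17.0, 5.0, 40)
never = rng.normal(19.0, 5.0, 40)

# One-way ANOVA: do the group means differ more than within-group noise?
f_stat, p_value = stats.f_oneway(always, sometimes, never)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```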
Realistic finite temperature simulations of magnetic systems using quantum statistics
NASA Astrophysics Data System (ADS)
Bergqvist, Lars; Bergman, Anders
2018-01-01
We have performed realistic atomistic simulations at finite temperatures using Monte Carlo and atomistic spin dynamics simulations incorporating quantum (Bose-Einstein) statistics. The description is much improved at low temperatures compared to the classical (Boltzmann) statistics normally used in these kinds of simulations, while at higher temperatures the classical statistics are recovered. This corrected low-temperature description is reflected in both the magnetization and the magnetic specific heat, the latter allowing for improved modeling of the magnetic contribution to free energies. A central property in the method is the magnon density of states at finite temperatures, and we have compared several different implementations for obtaining it. The method has no restrictions regarding the chemical and magnetic order of the considered materials. This is demonstrated by applying the method to elemental ferromagnetic systems, including Fe and Ni, as well as Fe-Co random alloys and the ferrimagnetic system GdFe3.
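The quantum correction amounts to weighting each magnon mode by Bose-Einstein occupation rather than the classical (equipartition) value kT/E; the two coincide at high temperature, which is why classical statistics are recovered there. A minimal sketch, with an assumed 10 meV magnon energy purely for illustration:

```python
import math

KB = 8.617333262e-5  # Boltzmann constant, eV/K

def bose_einstein(e, t):
    """Bose-Einstein occupation of a magnon mode of energy e (eV) at t (K)."""
    return 1.0 / math.expm1(e / (KB * t))

def classical(e, t):
    """Classical (equipartition) occupation kT/e, the high-temperature limit."""
    return KB * t / e

E = 0.010  # assumed 10 meV magnon energy (illustrative)
for T in (10, 100, 1000):
    print(T, bose_einstein(E, T), classical(E, T))
```

At 10 K the Bose-Einstein occupation is exponentially suppressed while the classical value is not, which is the low-temperature discrepancy the method corrects; by 1000 K the two agree to within a few percent.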
The integration of system specifications and program coding
NASA Technical Reports Server (NTRS)
Luebke, W. R.
1970-01-01
Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that has long and highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.
McClean, Stuart; Brilleman, Sam; Wye, Lesley
2015-07-28
Randomised controlled trial evidence indicates that the Alexander Technique is clinically and cost effective for chronic back pain. The aim of this mixed methods evaluation was to explore the role and perceived impact of Alexander Technique lessons in the naturalistic setting of an acute hospital Pain Management Clinic in England. To capture changes in health status and resource use amongst service users, 43 service users were administered three widely used questionnaires (Brief Pain Inventory, MYMOP and Client Service Resource Inventory) at three time points: baseline, six weeks and three months after baseline. We also carried out 27 telephone interviews with service users and seven face-to-face interviews with pain clinic staff and Alexander Technique teachers. Quantitative data were analysed using descriptive statistics and qualitative data were analysed thematically. Those taking Alexander Technique lessons reported small improvements in health outcomes, and condition-related costs fell. However, due to the non-randomised, uncontrolled nature of the study design, changes cannot be attributed to the Alexander Technique lessons. Service users stated that their relationship to pain and pain management had changed, especially those who were more committed to practising the techniques regularly. These changes may explain the reported reduction in pain-related service use and the corresponding lower associated costs. Alexander Technique lessons may be used as another approach to pain management. The findings suggest that Alexander Technique lessons can help improve self-efficacy for those who are sufficiently motivated, which in turn may have an impact on service utilisation levels.
Ayna, Buket; Yılmaz, Berivan D; Izol, Bozan S; Ayna, Emrah; Tacir, İbrahim Halil
2018-06-15
BACKGROUND The purpose of this study was to determine the influence of 2 different esthetic post materials on the final color of direct-composite restorations by using a digital technique under in vivo conditions. MATERIAL AND METHODS We included 22 pulpless incisor teeth treated with conventionally cemented zirconia (n=11) and polyethylene fiber (n=11) posts in the study. Teeth were restored with a hybrid resin. The color of direct-composite restorations and contralateral control teeth was measured using a digital technique. The Commission Internationale de L'Eclairage, or CIE, L*a*b* and RGB color systems were investigated. Descriptive statistical analysis was performed for the CIE L*a*b* values. Color differences (ΔE) for the average L*, a*, and b* color parameters between every pair of groups were calculated (P>.05). RESULTS Significant differences were not found in the color difference luminosity (lum), R, G, B, and L* a* b* values between the zircon-rich glass fiber post (Z) and contralateral control teeth (Cz) (P>.05) and between the polyethylene fiber post (P) and contralateral control teeth (Cp) (P>.05). However, there was a statistically significant difference between the color a* values of the polyethylene fiber post (P) and contralateral control teeth (Cp) (P<.05). Color differences (ΔE) between the zircon-rich glass fiber post (Z) and contralateral control teeth, and the polyethylene fiber post (P) and contralateral teeth were not statistically significant (P>.05). CONCLUSIONS Definitive restorations were equally affected by the 2 materials. Both materials can be used reliably in clinical practice. However, further research that focuses on the effect of intraoral conditions is needed.
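The ΔE values reported are presumably the Euclidean (CIE76) distance in L*a*b* space, which is a one-liner to compute; the restoration and control readings below are hypothetical, not the study's measurements.

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two L*a*b* triples."""
    return math.dist(lab1, lab2)

# Hypothetical L*, a*, b* readings for a restoration and its control tooth.
restoration = (72.1, 1.8, 18.4)
control = (70.9, 1.2, 17.0)
print(round(delta_e(restoration, control), 2))  # 1.94
```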
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Samuel S. P.
2013-09-01
The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the present project being reported on here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been an interdisciplinary collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen). The motivation and long-term goal underlying this work is the utilization of stochastic radiative transfer theory (Lane-Veron and Somerville, 2004; Lane et al., 2002) to develop a new class of parametric representations of cloud-radiation interactions and closely related processes for atmospheric models. The theoretical advantage of the stochastic approach is that it can accurately calculate the radiative heating rates through a broken cloud layer without requiring an exact description of the cloud geometry.
Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.
Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana
2014-06-01
Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.
Bridging stylized facts in finance and data non-stationarities
NASA Astrophysics Data System (ADS)
Camargo, Sabrina; Duarte Queirós, Sílvio M.; Anteneodo, Celia
2013-04-01
Employing a recent technique which allows the representation of nonstationary data by means of a juxtaposition of locally stationary paths of different length, we introduce a comprehensive analysis of the key observables in a financial market: the trading volume and the price fluctuations. From the segmentation procedure we are able to introduce a quantitative description of statistical features of these two quantities, which are often named stylized facts, namely the tails of the distribution of trading volume and price fluctuations and a dynamics compatible with the U-shaped profile of the volume in a trading session and the slow decay of the autocorrelation function. The segmentation of the trading volume series provides evidence of slow evolution of the fluctuating parameters of each patch, pointing to the mixing scenario. Assuming that long-term features are the outcome of a statistical mixture of simple local forms, we test and compare different probability density functions to provide the long-term distribution of the trading volume, concluding that the log-normal gives the best agreement with the empirical distribution. Moreover, the segmentation results for the magnitude of price fluctuations are quite different from those for the trading volume, indicating that changes in the statistics of price fluctuations occur on a faster scale than in the case of trading volume.
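The mixing scenario can be illustrated on synthetic data: a juxtaposition of locally stationary log-normal patches whose parameters drift slowly, with a single aggregate log-normal fitted via the moments of the log-volume. All patch parameters below are illustrative assumptions, not estimates from market data.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic "trading volume": locally stationary log-normal patches whose
# location parameter drifts slowly across patches (a mixing scenario).
patches = [rng.lognormal(mean=m, sigma=0.5, size=500) for m in (0.0, 0.3, 0.6)]
volume = np.concatenate(patches)

# Fit one aggregate log-normal by the moments of log-volume: the between-patch
# drift inflates the fitted sigma above the within-patch value of 0.5.
log_v = np.log(volume)
mu_hat, sigma_hat = log_v.mean(), log_v.std()
print(round(mu_hat, 2), round(sigma_hat, 2))
```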
A statistical pixel intensity model for segmentation of confocal laser scanning microscopy images.
Calapez, Alexandre; Rosa, Agostinho
2010-09-01
Confocal laser scanning microscopy (CLSM) has been widely used in the life sciences for the characterization of cell processes because it allows the recording of the distribution of fluorescence-tagged macromolecules on a section of the living cell. It is in fact the cornerstone of many molecular transport and interaction quantification techniques where the identification of regions of interest through image segmentation is usually a required step. In many situations, because of the complexity of the recorded cellular structures or because of the amounts of data involved, image segmentation either is too difficult or inefficient to be done by hand and automated segmentation procedures have to be considered. Given the nature of CLSM images, statistical segmentation methodologies appear as natural candidates. In this work we propose a model to be used for statistical unsupervised CLSM image segmentation. The model is derived from the CLSM image formation mechanics and its performance is compared to the existing alternatives. Results show that it provides a much better description of the data on classes characterized by their mean intensity, making it suitable not only for segmentation methodologies with known number of classes but also for use with schemes aiming at the estimation of the number of classes through the application of cluster selection criteria.
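A minimal sketch of unsupervised segmentation into classes characterized by their mean intensity, using a plain 1-D k-means rather than the paper's CLSM-specific pixel intensity model; the two pixel populations below are synthetic.

```python
import numpy as np

def two_class_means(pixels, iters=50):
    """Unsupervised two-class clustering of pixel intensities (1-D k-means):
    alternately assign pixels to the nearest class mean, then update means."""
    lo, hi = pixels.min(), pixels.max()
    means = np.array([lo + 0.25 * (hi - lo), lo + 0.75 * (hi - lo)])
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        labels = np.abs(pixels[:, None] - means[None, :]).argmin(axis=1)
        for k in (0, 1):
            if np.any(labels == k):
                means[k] = pixels[labels == k].mean()
    return means, labels

rng = np.random.default_rng(7)
background = rng.normal(20.0, 5.0, 2000)   # dim, untagged pixels
foreground = rng.normal(80.0, 10.0, 500)   # bright fluorescence-tagged pixels
means, labels = two_class_means(np.concatenate([background, foreground]))
print(np.round(means, 1))  # recovered class means near 20 and 80
```

A model-based scheme like the one proposed replaces the hard nearest-mean assignment with class likelihoods derived from the image formation mechanics, which also opens the door to estimating the number of classes via cluster selection criteria.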
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2010 CFR
2010-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2013 CFR
2013-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2014 CFR
2014-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2012 CFR
2012-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2011 CFR
2011-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
Han, Kyunghwa; Jung, Inkyung
2018-05-01
This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
Stochastic Optimally Tuned Range-Separated Hybrid Density Functional Theory.
Neuhauser, Daniel; Rabani, Eran; Cytter, Yael; Baer, Roi
2016-05-19
We develop a stochastic formulation of the optimally tuned range-separated hybrid density functional theory that enables significant reduction of the computational effort and scaling of the nonlocal exchange operator at the price of introducing a controllable statistical error. Our method is based on stochastic representations of the Coulomb convolution integral and of the generalized Kohn-Sham density matrix. The computational cost of the approach is similar to that of usual Kohn-Sham density functional theory, yet it provides a much more accurate description of the quasiparticle energies for the frontier orbitals. This is illustrated for a series of silicon nanocrystals up to sizes exceeding 3000 electrons. Comparison with the stochastic GW many-body perturbation technique indicates excellent agreement for the fundamental band gap energies, good agreement for the band edge quasiparticle excitations, and very low statistical errors in the total energy for large systems. The present approach has a major advantage over one-shot GW by providing a self-consistent Hamiltonian that is central for additional postprocessing, for example, in the stochastic Bethe-Salpeter approach.
NASA Astrophysics Data System (ADS)
Szücs, T.; Kiss, G. G.; Gyürky, Gy.; Halász, Z.; Fülöp, Zs.; Rauscher, T.
2018-01-01
The stellar reaction rates of radiative α-capture reactions on heavy isotopes are of crucial importance for γ process network calculations. These rates are usually derived from statistical model calculations, which need to be validated, but the experimental database is very scarce. This paper presents the results of α-induced reaction cross section measurements on iridium isotopes, carried out for the first time close to the astrophysically relevant energy region. Thick target yields of the 191Ir(α,γ)195Au, 191Ir(α,n)194Au, 193Ir(α,n)196mAu, and 193Ir(α,n)196Au reactions have been measured with the activation technique between Eα = 13.4 MeV and 17 MeV. For the first time, the thick target yield was determined with X-ray counting, which led to unprecedented sensitivity. From the measured thick target yields, reaction cross sections are derived and compared with statistical model calculations. The recently suggested energy-dependent modification of the α + nucleus optical potential gives a good description of the experimental data.
Temporal changes and variability in temperature series over Peninsular Malaysia
NASA Astrophysics Data System (ADS)
Suhaila, Jamaludin
2015-02-01
With the current concern over climate change, descriptions of how temperature series have changed over time are very useful. Annual mean temperature has been analyzed for several stations over Peninsular Malaysia. Non-parametric statistical techniques such as the Mann-Kendall test and Theil-Sen slope estimation are used primarily for assessing the significance and detection of trends, while the nonparametric Pettitt's test and sequential Mann-Kendall test are adopted to detect any abrupt climate change. Statistically significant increasing trends in annual mean temperature are detected for almost all studied stations, with the magnitude of the significant trends varying from 0.02°C to 0.05°C per year. The results show that the climate over Peninsular Malaysia is getting warmer. In addition, the abrupt changes in temperature detected using Pettitt's and the sequential Mann-Kendall tests reveal onsets of trends that can be related to El Niño episodes affecting Malaysia. In general, the analysis results can help local stakeholders and water managers to understand the risks and vulnerabilities related to climate change in terms of mean events in the region.
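The trend-detection workflow the abstract describes (a Mann-Kendall-style test plus Theil-Sen slope estimation) can be sketched with scipy. The synthetic temperature series below is purely illustrative, not the study's data; for an untied time series, Kendall's tau between the time index and the values gives the Mann-Kendall trend test.

```python
import numpy as np
from scipy.stats import kendalltau, theilslopes

# Synthetic annual mean temperatures (illustrative): a warming trend of
# ~0.03 degC/year plus measurement noise.
rng = np.random.default_rng(42)
years = np.arange(1980, 2015)
temps = 26.5 + 0.03 * (years - years[0]) + rng.normal(0, 0.1, years.size)

# Mann-Kendall-style trend test: Kendall's tau between time and values.
tau, p_value = kendalltau(years, temps)

# Theil-Sen estimator: the median of all pairwise slopes, robust to outliers.
slope, intercept, lo, hi = theilslopes(temps, years)

print(f"tau={tau:.3f}, p={p_value:.4f}, slope={slope:.4f} degC/year")
```

The Theil-Sen slope recovers a magnitude in the same range as the trends reported above (0.02-0.05 °C/year) because that is how the synthetic series was built.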
Graefe, F.; Marschke, J.; Dimpfl, T.; Tunn, R.
2012-01-01
Vaginal vault suspension during hysterectomy for prolapse both treats apical insufficiency and helps prevent recurrence. Numerous techniques exist, with different anatomical results and differing complications. The description of the different approaches, together with a description of the vaginal vault suspension technique used at the Department for Urogynaecology at St. Hedwig Hospital, could serve as a basis for reassessment and for recommendations by scientific associations regarding general standards. PMID:25278621
Castor oil as a natural alternative to labor induction: A retrospective descriptive study.
DeMaria, Andrea L; Sundstrom, Beth; Moxley, Grace E; Banks, Kendall; Bishop, Ashlan; Rathbun, Lesley
2018-04-01
To describe birthing outcomes among women who consumed castor oil cocktail as part of a freestanding birth center labor induction protocol. De-identified data from birth logs and electronic medical records were entered into SPSS Statistics 22.0 for analysis for all women who received the castor oil cocktail (n=323) to induce labor between January 2008 and May 2015 at a birth center in the United States. Descriptive statistics were analyzed for trends in safety and birthing outcomes. Of the women who utilized the castor oil cocktail to stimulate labor, 293 (90.7%) birthed vaginally at the birth center or hospital. The incidence of maternal adverse effects (e.g., nausea, vomiting, extreme diarrhea) was less than 7%, and adverse effects of any kind were reported in less than 15% of births. An independent sample t-test revealed that parous women were more likely to birth vaginally at the birth center after using the castor oil cocktail than their nulliparous counterparts (p<.010), while gestational age (p=.26), woman's age (p=.23), and body mass index (p=.28) were not significantly associated. Nearly 91% of women in the study who consumed the castor oil cocktail to induce labor were able to give birth vaginally with little to no maternal or fetal complications. Findings indicate further research is needed to compare the safety and effectiveness of natural labor induction methodologies, including castor oil, to commonly used labor induction techniques in a prospective study or clinical trial. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
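The parity comparison above rests on an independent-samples t-test. A minimal sketch with scipy follows; the 0/1 vaginal-birth arrays are hypothetical counts chosen for illustration, not the study's records.

```python
import numpy as np
from scipy.stats import ttest_ind

# Hypothetical 0/1 vaginal-birth outcomes for parous vs nulliparous women
# (illustrative group sizes and rates, not the study's data).
parous = np.array([1] * 190 + [0] * 10)       # 95% vaginal-birth rate
nulliparous = np.array([1] * 98 + [0] * 22)   # ~82% vaginal-birth rate

# Welch's t-test avoids assuming equal variances between the two groups.
t_stat, p_value = ttest_ind(parous, nulliparous, equal_var=False)
print(f"t={t_stat:.2f}, p={p_value:.4f}")
```

A t-test on binary outcomes is what the abstract reports; with larger samples a chi-squared or logistic-regression approach would be a common alternative.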
Model documentation report: Transportation sector model of the National Energy Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-03-01
This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. This document serves three purposes. First, it is a reference document providing a detailed description of TRAN for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, 57(b)(1)). Third, it permits continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements.
Forecast of future aviation fuels: The model
NASA Technical Reports Server (NTRS)
Ayati, M. B.; Liu, C. Y.; English, J. M.
1981-01-01
A conceptual model of the commercial air transportation industry is developed which can be used to predict trends in economics, demand, and consumption. The methodology is based on digraph theory, which considers the interaction of variables and the propagation of changes. Air transportation economics are treated by examination of major variables, their relationships, historic trends, and calculation of regression coefficients. A description of the modeling technique and a compilation of historic airline industry statistics used to determine interaction coefficients are included. Results of model validation show negligible differences between actual and projected values over the period 1959 to 1976. A limited application of the method presents forecasts of air transportation industry demand, growth, revenue, costs, and fuel consumption to 2020 for two scenarios of future economic growth and energy consumption.
Impact of an Onsite Clinic on Utilization of Preventive Services.
Ostovari, Mina; Yu, Denny; Yih, Yuehwern; Steele-Morris, Charlotte Joy
2017-07-01
To assess the impact of an onsite clinic on healthcare utilization of preventive services for employees of a public university and their dependents. Descriptive statistics, logistic regression, and classification tree techniques were used to analyze health claim data to identify changes in patterns of healthcare utilization and factors affecting usage of the onsite clinic. Utilization of preventive services significantly increased for female and male employees by 9% and 14%, respectively, one year after implementation of the onsite clinic. Hourly-paid employees, employees without diabetes, and employees whose spouses opted out or had no coverage were more likely to use the onsite clinic. An adapted framework for assessing the performance of onsite clinics based on usage of health informatics would help to identify health utilization patterns and the interaction between the onsite clinic and offsite health providers.
Autofluorescent polarimetry of bile films in the liver pathology differentiation
NASA Astrophysics Data System (ADS)
Prysyazhnyuk, V. P.; Ushenko, Yu. O.; Dubolazov, O. V.; Ushenko, A. G.; Savich, V. O.; Karachevtsev, A. O.
2015-09-01
A new information-optical technique for diagnosing the structure of polycrystalline bile films is proposed. A model of the Mueller-matrix description of the mechanisms of optical anisotropy of such objects, namely optical activity, birefringence, and linear and circular dichroism, is suggested. The ensemble of informationally relevant, azimuthally stable Mueller-matrix invariants is determined. Within the statistical analysis of the distributions of such parameters, objective criteria were determined for differentiating the polycrystalline bile films taken from patients with fatty degeneration (group 1) and chronic hepatitis (group 2) of the liver. From the standpoint of evidence-based medicine, the operational characteristics (sensitivity, specificity and accuracy) of the information-optical method of Mueller-matrix mapping of polycrystalline bile films were found, and its efficiency in the diagnostics of pathological changes was demonstrated.
Accurate simulations of helium pick-up experiments using a rejection-free Monte Carlo method
NASA Astrophysics Data System (ADS)
Dutra, Matthew; Hinde, Robert
2018-04-01
In this paper, we present Monte Carlo simulations of helium droplet pick-up experiments with the intention of developing a robust and accurate theoretical approach for interpreting experimental helium droplet calorimetry data. Our approach is capable of capturing the evaporative behavior of helium droplets following dopant acquisition, allowing for a more realistic description of the pick-up process. Furthermore, we circumvent the traditional assumption of bulk helium behavior by utilizing density functional calculations of the size-dependent helium droplet chemical potential. The results of this new Monte Carlo technique are compared to commonly used Poisson pick-up statistics for simulations that reflect a broad range of experimental parameters. We conclude by offering an assessment of both of these theoretical approaches in the context of our observed results.
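The Poisson pick-up statistics used as the baseline for comparison have a closed form: with mean pick-up number λ, the probability of acquiring k dopants is P(k) = e^(-λ)λ^k/k!. The short Monte Carlo check below is only an illustration of that baseline; it does not reproduce the paper's rejection-free method or its evaporation treatment.

```python
import math
import random

# Poisson pick-up statistics: probability of a droplet acquiring k dopants.
lam = 2.0
analytic = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(6)]

# Monte Carlo check: count pick-up events (unit-rate exponential arrivals
# occurring before "time" lam) over many simulated droplets.
random.seed(1)
trials = 100_000
counts = [0] * 6
for _ in range(trials):
    k, t = 0, -math.log(random.random())
    while t < lam:
        k += 1
        t += -math.log(random.random())
    if k < 6:
        counts[k] += 1
simulated = [c / trials for c in counts]
print(f"P(0): analytic={analytic[0]:.4f}, simulated={simulated[0]:.4f}")
```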
Phospholipid Fatty Acid Analysis: Past, Present and Future
NASA Astrophysics Data System (ADS)
Findlay, R. H.
2008-12-01
With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME) and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased accessibility of the technique and it has been widely applied to water, sediment, soil and aerosol samples. Whole cell fatty acid analysis, a related but not equivalent technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks. 
Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinates of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM and the technique promises to be a useful tool for assigning ecological function to microbial populations.
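The principal component analysis step described above can be sketched with plain numpy. The small profile matrix below is a made-up illustration of samples-by-fatty-acids data (mole-percent per FAME); real PLFA profiles contain many more fatty acids.

```python
import numpy as np

# Illustrative PLFA profiles: rows are samples, columns are mole-percent
# of individual fatty acid methyl esters (hypothetical values).
profiles = np.array([
    [40.0, 30.0, 20.0, 10.0],
    [38.0, 32.0, 19.0, 11.0],
    [20.0, 25.0, 35.0, 20.0],
    [22.0, 24.0, 34.0, 20.0],
])

# Centre the data, then use the SVD to obtain the principal components.
centred = profiles - profiles.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ Vt.T            # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)    # variance fraction per component

print("PC1 explains", round(explained[0] * 100, 1), "% of the variance")
```

With two well-separated community types, the first component absorbs almost all the variance, which is exactly the pattern that makes ordination plots of PLFA profiles interpretable.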
Yang, Yongji; Moser, Michael A J; Zhang, Edwin; Zhang, Wenjun; Zhang, Bing
2018-01-01
The aim of this study was to develop a statistical model for cell death by irreversible electroporation (IRE) and to show that the statistical model is more accurate than the electric field threshold model in the literature, using cervical cancer cells in vitro. The HeLa cell line was cultured and treated with different IRE protocols in order to obtain data for modeling the statistical relationship between cell death and pulse-setting parameters. In total, 340 in vitro experiments were performed with a commercial IRE pulse system, including a pulse generator and an electric cuvette. The trypan blue staining technique was used to evaluate cell death after 4 hours of incubation following IRE treatment. The Peleg-Fermi model was used in the study to build the statistical relationship using the cell viability data obtained from the in vitro experiments. A finite element model of IRE for the electric field distribution was also built. Comparison of ablation zones between the statistical model and the electric threshold model (drawn from the finite element model) was used to show the accuracy of the proposed statistical model in the description of the ablation zone and its applicability under different pulse-setting parameters. The statistical models describing the relationships between HeLa cell death and pulse length and the number of pulses, respectively, were built. The values of the curve fitting parameters were obtained using the Peleg-Fermi model for the treatment of cervical cancer with IRE. The difference in the ablation zone between the statistical model and the electric threshold model was also illustrated to show the accuracy of the proposed statistical model in the representation of the ablation zone in IRE. 
This study concluded that: (1) the proposed statistical model accurately described the ablation zone of IRE with cervical cancer cells, and was more accurate compared with the electric field model; (2) the proposed statistical model was able to estimate the value of electric field threshold for the computer simulation of IRE in the treatment of cervical cancer; and (3) the proposed statistical model was able to express the change in ablation zone with the change in pulse-setting parameters.
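The curve-fitting step can be sketched with scipy's `curve_fit` applied to a Peleg-Fermi survival curve, in which viability S falls sigmoidally with field strength E around a critical field Ec with kinetic constant A. The field range and the values of Ec and A below are illustrative assumptions, not the study's fitted parameters (in the full model Ec and A also depend on the number of pulses).

```python
import numpy as np
from scipy.optimize import curve_fit

# Peleg-Fermi survival curve: viability as a function of electric field E.
def peleg_fermi(E, Ec, A):
    return 1.0 / (1.0 + np.exp((E - Ec) / A))

# Synthetic viability data (illustrative Ec, A in V/cm) with small noise.
E = np.linspace(200, 2000, 25)
true_Ec, true_A = 1000.0, 150.0
rng = np.random.default_rng(3)
viability = peleg_fermi(E, true_Ec, true_A) + rng.normal(0, 0.01, E.size)

# Nonlinear least-squares fit recovers the curve parameters.
(Ec_fit, A_fit), _ = curve_fit(peleg_fermi, E, viability, p0=(800.0, 100.0))
print(f"Ec = {Ec_fit:.0f} V/cm, A = {A_fit:.0f} V/cm")
```

The fitted Ec is the quantity the study uses as an estimated electric field threshold for simulation, which is why the statistical model can feed directly into the finite element computation.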
ERIC Educational Resources Information Center
Bailey, Thomas; Jenkins, Davis; Leinbach, Timothy
2005-01-01
This report summarizes the latest available national statistics on access and attainment by low income and minority community college students. The data come from the National Center for Education Statistics' (NCES) Integrated Postsecondary Education Data System (IPEDS) annual surveys of all postsecondary educational institutions and the NCES…
A First Assignment to Create Student Buy-In in an Introductory Business Statistics Course
ERIC Educational Resources Information Center
Newfeld, Daria
2016-01-01
This paper presents a sample assignment to be administered after the first two weeks of an introductory business focused statistics course in order to promote student buy-in. This assignment integrates graphical displays of data, descriptive statistics and cross-tabulation analysis through the lens of a marketing analysis study. A marketing sample…
Xu, Yiling; Oh, Heesoo; Lagravère, Manuel O
2017-09-01
The purpose of this study was to locate traditionally-used landmarks in two-dimensional (2D) images and newly-suggested ones in three-dimensional (3D) images (cone-beam computed tomographies [CBCTs]) and determine possible relationships between them to categorize patients with Class II-1 malocclusion. CBCTs from 30 patients diagnosed with Class II-1 malocclusion were obtained from the University of Alberta Graduate Orthodontic Program database. The reconstructed images were downloaded and visualized using the software platform AVIZO®. Forty-two landmarks were chosen, and their coordinates were obtained and analyzed using linear and angular measurements. Ten images were analyzed three times to determine the reliability and measurement error of each landmark using the intra-class correlation coefficient (ICC). Descriptive statistics were calculated using the SPSS statistical package to determine any relationships. ICC values were excellent for all landmarks in all axes, with the highest measurement error of 2mm in the y-axis for the Gonion Left landmark. Linear and angular measurements were calculated using the coordinates of each landmark. Descriptive statistics showed that the linear and angular measurements used in the 2D images did not correlate well with the 3D images. The lowest standard deviation obtained was 0.6709 for S-GoR/N-Me, with a mean of 0.8016. The highest standard deviation was 20.20704 for ANS-InfraL, with a mean of 41.006. The traditional landmarks used for 2D malocclusion analysis show good reliability when transferred to 3D images. However, they did not reveal specific skeletal or dental patterns when used to analyze 3D images for malocclusion. Thus, another technique should be considered when classifying 3D CBCT images for Class II-1 malocclusion. Copyright © 2017 CEO. Published by Elsevier Masson SAS. All rights reserved.
TPS as an Effective Technique to Enhance the Students' Achievement on Writing Descriptive Text
ERIC Educational Resources Information Center
Sumarsih, M. Pd.; Sanjaya, Dedi
2013-01-01
Students' achievement in writing descriptive text is very low, in this study Think Pair Share (TPS) is applied to solve the problem. Action research is conducted for the result. Additionally, qualitative and quantitative techniques are applied in this research. The subject of this research is grade VIII in Junior High School in Indonesia. From…
Maljaei, Ensiyeh; Pourkazemi, Maryam; Ghanizadeh, Milad; Ranjbar, Rana
2017-01-01
During the early mixed dentition period, the location of the deciduous maxillary second molar results in ineffectiveness of the infiltration technique in this area. In such cases, administration of a posterior superior alveolar (PSA) nerve block is recommended; however, this technique has some complications. The present study was undertaken to compare the effects of buccal infiltration of 4% Articaine and the PSA technique with 2% Lidocaine on the success of anesthesia of maxillary deciduous second molars in 6- to 9-year-old children. In the present double-blind randomized clinical trial, 56 children aged 6-9 years requiring vital pulp therapy of a deciduous maxillary second molar were included. In group 1, 4% Articaine was injected using a buccal infiltration technique. In group 2, 2% Lidocaine was injected using the PSA nerve block technique. After 10 min, the caries was removed and access cavity preparation was instituted. The patients were asked to report the presence or absence of pain during the procedure; therefore, the existence of pain was measured by the patient's self-report. Data were analyzed with descriptive statistical methods and the chi-squared test. Pain was reported by 6 (21.4%) and 9 (32.1%) subjects in the Articaine and Lidocaine groups, respectively. The chi-squared test did not reveal any significant differences between the two groups (P=0.54). Within the limitations of the present study, there were no significant differences between the results of Articaine buccal infiltration and the Lidocaine PSA technique, so Articaine buccal infiltration can be used as a substitute for the PSA technique.
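The chi-squared comparison can be reconstructed from the reported counts: 6 of 28 children reported pain in the Articaine group and 9 of 28 in the Lidocaine group. Scipy's Yates-corrected test on this 2×2 table gives a P-value in line with the reported P=0.54.

```python
from scipy.stats import chi2_contingency

# 2x2 contingency table from the reported counts: (pain, no pain) per group.
table = [[6, 22],    # Articaine buccal infiltration
         [9, 19]]    # Lidocaine PSA nerve block

# chi2_contingency applies the Yates continuity correction for 2x2 tables.
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.3f}, p={p_value:.2f}")
```

With cell counts this small, Fisher's exact test would be a common alternative; the conclusion (no significant difference) is the same.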
Sylvester, B.D.; Zammit, K.; Fong, A.J.; Sabiston, C.M.
2017-01-01
Background Cancer centre Web sites can be a useful tool for distributing information about the benefits of physical activity for breast cancer (bca) survivors, and they hold potential for supporting health behaviour change. However, the extent to which cancer centre Web sites use evidence-based behaviour change techniques to foster physical activity behaviour among bca survivors is currently unknown. The aim of our study was to evaluate the presentation of behaviour-change techniques on Canadian cancer centre Web sites to promote physical activity behaviour for bca survivors. Methods All Canadian cancer centre Web sites (n = 39) were evaluated by two raters using the Coventry, Aberdeen, and London–Refined (calo-re) taxonomy of behaviour change techniques and the eEurope 2002 Quality Criteria for Health Related Websites. Descriptive statistics were calculated. Results The most common behaviour change techniques used on Web sites were providing information about consequences in general (80%), suggesting goal-setting behaviour (56%), and planning social support or social change (46%). Overall, Canadian cancer centre Web sites presented an average of M = 6.31 behaviour change techniques (of 40 that were coded) to help bca survivors increase their physical activity behaviour. Evidence of quality factors ranged from 90% (sites that provided evidence of readability) to 0% (sites that provided an editorial policy). Conclusions Our results provide preliminary evidence that, of 40 behaviour-change techniques that were coded, fewer than 20% were used to promote physical activity behaviour to bca survivors on cancer centre Web sites, and that the most effective techniques were inconsistently used. On cancer centre Web sites, health promotion specialists could focus on emphasizing knowledge mobilization efforts using available research into behaviour-change techniques to help bca survivors increase their physical activity. PMID:29270056
Sordi, Marina de; Mourão, Lucia Figueiredo; Silva, Ariovaldo Armando da; Flosi, Luciana Claudia Leite
2009-01-01
Patients with dysphagia have impairments in many aspects, and an interdisciplinary approach is fundamental to define diagnosis and treatment. A joint approach in the clinical and videoendoscopic evaluation is paramount. Aim: to study the correlation between the clinical assessment (ACD) and the videoendoscopic (VED) assessment of swallowing by classifying the degree of severity and through qualitative/descriptive analyses of the procedures. Study design: cross-sectional, descriptive and comparative, carried out from March to December 2006 at the otolaryngology/dysphagia ward of a hospital in the countryside of São Paulo. Thirty dysphagic patients with different disorders were assessed by ACD and VED, and the data were classified by means of severity scales and qualitative/descriptive analysis. Results: the correlation between the ACD and VED severity scales showed a statistically significant but low agreement (kappa = 0.4) (p = 0.006), whereas the correlation between the qualitative/descriptive analyses showed an excellent and statistically significant agreement (kappa = 0.962) (p < 0.001) for the entire sample. Conclusion: the low agreement between the severity scales points to the need to perform both procedures, reinforcing VED as a feasible procedure; the excellent agreement in the descriptive qualitative analysis reinforces the need to understand swallowing as a process.
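The kappa agreement statistic used in this study can be computed directly from a confusion matrix of paired ratings. The matrix below holds hypothetical severity counts for illustration, not the study's data.

```python
import numpy as np

# Hypothetical confusion matrix of paired severity ratings (counts):
# rows = clinical assessment, columns = videoendoscopic assessment,
# categories mild / moderate / severe.
confusion = np.array([
    [8, 3, 0],
    [2, 7, 2],
    [0, 2, 6],
])

n = confusion.sum()
observed = np.trace(confusion) / n            # observed agreement
row_m = confusion.sum(axis=1) / n             # rater 1 marginals
col_m = confusion.sum(axis=0) / n             # rater 2 marginals
expected = float(row_m @ col_m)               # chance agreement
kappa = (observed - expected) / (1 - expected)
print(f"kappa = {kappa:.3f}")
```

Kappa corrects the raw agreement for the agreement expected by chance, which is why a 70% observed agreement can still yield only a moderate kappa.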
R is an open source language and environment for statistical computing and graphics that can also be used for both spatial analysis (i.e. geoprocessing and mapping of different types of spatial data) and spatial data analysis (i.e. the application of statistical descriptions and ...
Survey of statistical techniques used in validation studies of air pollution prediction models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bornstein, R D; Anderson, S F
1979-03-01
Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
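Typical summary statistics in such validation programs include the mean bias, root-mean-square error, and the correlation between observations and predictions. The helper below is a small illustrative sketch (the observed/predicted values are invented), not a method from the survey itself.

```python
import math

# Summary statistics for comparing model predictions with observations.
def validation_summary(observed, predicted):
    n = len(observed)
    bias = sum(p - o for o, p in zip(observed, predicted)) / n
    rmse = math.sqrt(sum((p - o) ** 2 for o, p in zip(observed, predicted)) / n)
    mo = sum(observed) / n
    mp = sum(predicted) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(observed, predicted))
    so = math.sqrt(sum((o - mo) ** 2 for o in observed))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    r = cov / (so * sp)   # Pearson correlation coefficient
    return bias, rmse, r

obs = [30.0, 45.0, 60.0, 80.0]   # e.g. measured pollutant concentrations
pred = [28.0, 50.0, 58.0, 84.0]  # model predictions for the same periods
bias, rmse, r = validation_summary(obs, pred)
print(f"bias={bias:.2f}, rmse={rmse:.2f}, r={r:.3f}")
```

Reporting all three together is useful because a model can score well on correlation while carrying a systematic bias, or vice versa.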
WASP (Write a Scientific Paper) using Excel - 2: Pivot tables.
Grech, Victor
2018-02-01
Data analysis at the descriptive stage and the eventual presentation of results requires the tabulation and summarisation of data. This exercise should always precede inferential statistics. Pivot tables and pivot charts are one of Excel's most powerful and underutilised features, with tabulation functions that immensely facilitate descriptive statistics. Pivot tables permit users to dynamically summarise and cross-tabulate data, create tables in several dimensions, offer a range of summary statistics and can be modified interactively with instant outputs. Large and detailed datasets are thereby easily manipulated making pivot tables arguably the best way to explore, summarise and present data from many different angles. This second paper in the WASP series in Early Human Development provides pointers for pivot table manipulation in Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
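For readers scripting rather than working in Excel, a pandas `pivot_table` is the closest analogue of the feature described (this parallel is our illustration, not part of the article); the records below are invented.

```python
import pandas as pd

# Illustrative records: cross-tabulate group against sex and summarise
# a value column, exactly as an Excel pivot table would.
records = pd.DataFrame({
    "sex":    ["F", "M", "F", "M", "F", "M"],
    "group":  ["term", "term", "preterm", "preterm", "term", "preterm"],
    "weight": [3.4, 3.6, 2.1, 2.3, 3.2, 2.5],
})

pivot = pd.pivot_table(records, values="weight", index="group",
                       columns="sex", aggfunc="mean")
print(pivot)
```

Changing `aggfunc` (e.g. to `"count"`, `"median"`, or a list of functions) mirrors the range of summary statistics the article highlights in Excel's pivot tables.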
Derivation of exact master equation with stochastic description: dissipative harmonic oscillator.
Li, Haifeng; Shao, Jiushu; Wang, Shikuan
2011-11-01
A systematic procedure for deriving the master equation of a dissipative system is reported in the framework of stochastic description. For the Caldeira-Leggett model of the harmonic-oscillator bath, a detailed and elementary derivation of the bath-induced stochastic field is presented. The dynamics of the system is thereby fully described by a stochastic differential equation, and the desired master equation would be acquired with statistical averaging. It is shown that the existence of a closed-form master equation depends on the specificity of the system as well as the feature of the dissipation characterized by the spectral density function. For a dissipative harmonic oscillator it is observed that the correlation between the stochastic field due to the bath and the system can be decoupled, and the master equation naturally results. Such an equation possesses the Lindblad form in which time-dependent coefficients are determined by a set of integral equations. It is proved that the obtained master equation is equivalent to the well-known Hu-Paz-Zhang equation based on the path-integral technique. The procedure is also used to obtain the master equation of a dissipative harmonic oscillator in time-dependent fields.
Frequency of depression, anxiety and stress among the undergraduate physiotherapy students.
Syed, Annosha; Ali, Syed Shazad; Khan, Muhammad
2018-01-01
To assess the frequency of depression, anxiety and stress (DAS) among undergraduate physiotherapy students. A descriptive cross-sectional study was conducted among undergraduate physiotherapy students at various physiotherapy institutes in Sindh, Pakistan. The total duration of the study was 4 months, from September 2016 to January 2017. Data were collected from 267 students with no physical or mental illness; most (75.3%) were female. They were selected through a non-probability purposive sampling technique. A self-administered standardized DASS (Depression, Anxiety and Stress Scale) was used to collect data, and results were analyzed using its severity rating index. Data were entered and analyzed using SPSS version 21. Descriptive statistics, including the frequency of depression, anxiety and stress and the demographic characteristics of the participants, were computed. The mean age of students was 19.34±1.19 years. The frequencies of depression, anxiety and stress found among undergraduate physiotherapy students were 48.0%, 68.5% and 53.2%, respectively. The observed frequencies were high, suggesting an urgent need for evidence-based psychological health promotion for undergraduate physiotherapy students to control this growing problem.
The influence of care interventions on the continuity of sleep of intensive care unit patients1
Hamze, Fernanda Luiza; de Souza, Cristiane Chaves; Chianca, Tânia Couto Machado
2015-01-01
Objective: to identify care interventions, performed by the health team, and their influence on the continuity of sleep of patients hospitalized in the Intensive Care Unit. Method: descriptive study with a sample of 12 patients. A filming technique was used for the data collection. The awakenings from sleep were measured using the actigraphy method. The analysis of the data was descriptive, processed using the Statistical Package for the Social Sciences software. Results: 529 care interventions were identified, grouped into 28 different types, of which 12 (42.8%) caused awakening from sleep for the patients. A mean of 44.1 interventions/patient/day was observed, with 1.8 interventions/patient/hour. The administration of oral medicine and food were the interventions that caused higher frequencies of awakenings in the patients. Conclusion: it was identified that the health care interventions can harm the sleep of ICU patients. It is recommended that health professionals rethink the planning of interventions according to the individual demand of the patients, with the diversification of schedules and introduction of new practices to improve the quality of sleep of Intensive Care Unit patients. PMID:26487127
Factors Affecting Jordanian School Adolescents' Experience of Being Bullied.
Shaheen, Abeer M; Hammad, Sawsan; Haourani, Eman M; Nassar, Omayyah S
The purpose of this study was to identify Jordanian school adolescents' experience of being bullied and to examine its association with selected socio-demographic variables. This cross-sectional descriptive study used a multi-stage cluster sampling technique to recruit a sample of in-school adolescents in Jordan (N=436). The Personal Experiences Checklist was used to measure the experience of bullying. Descriptive statistics and parametric tests were used in the analysis. Relational-verbal bullying was the most common form of bullying, while cyberbullying was the least common. Male adolescents experienced bullying more than females. In addition, adolescents belonging to low-income families experienced bullying more than those from moderate-income families. Finally, being bullied was negatively correlated with students' academic performance. This study indicated that the risk factors for bullying are multifaceted, which necessitates the development of prevention and intervention strategies that take these factors into consideration. Schools should introduce environmental changes to discourage bullying and establish a policy with specific guidelines on what constitutes bullying behavior and the expected disciplinary procedures. Staff training on the definition of bullying, current trends, and the effects of bullying is also recommended. Copyright © 2017 Elsevier Inc. All rights reserved.
Predictors of Errors of Novice Java Programmers
ERIC Educational Resources Information Center
Bringula, Rex P.; Manabat, Geecee Maybelline A.; Tolentino, Miguel Angelo A.; Torres, Edmon L.
2012-01-01
This descriptive study determined which of the sources of errors would predict the errors committed by novice Java programmers. Descriptive statistics revealed that the respondents perceived that they committed the identified eighteen errors infrequently. Thought error was perceived to be the main source of error during the laboratory programming…
The Status of Child Nutrition Programs in Colorado.
ERIC Educational Resources Information Center
McMillan, Daniel C.; Vigil, Herminia J.
This report provides descriptive and statistical data on the status of child nutrition programs in Colorado. The report contains descriptions of the National School Lunch Program, school breakfast programs, the Special Milk Program, the Summer Food Service Program, the Nutrition Education and Training Program, state dietary guidelines, Colorado…
NASA Technical Reports Server (NTRS)
Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.
1983-01-01
The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
2016-06-01
Topics include theories of the mammalian visual system and the exploitation of descriptive text that may accompany a still image for improved inference; the focus of the Brown team was on single images. Keywords: computer vision, semantic description, street scenes, belief propagation, generative models, nonlinear filtering, sufficient statistics.
Yang, Hyeri; Na, Jihye; Jang, Won-Hee; Jung, Mi-Sook; Jeon, Jun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Lim, Kyung-Min; Bae, SeungJin
2015-05-05
The mouse local lymph node assay (LLNA, OECD TG429) is an alternative test replacing conventional guinea pig tests (OECD TG406) for skin sensitization, but its use of a radioisotopic agent, (3)H-thymidine, deters its active dissemination. A new non-radioisotopic LLNA, LLNA:BrdU-FCM, employs a non-radioisotopic analog, 5-bromo-2'-deoxyuridine (BrdU), and flow cytometry. For an analogous method, the OECD TG429 performance standard (PS) advises that two reference compounds be tested repeatedly and that the ECt (threshold) values obtained fall within acceptable ranges to demonstrate within- and between-laboratory reproducibility. However, these criteria are somewhat arbitrary and the sample size for ECt is less than 5, raising concerns about insufficient reliability. Here, we explored various statistical methods to evaluate the reproducibility of LLNA:BrdU-FCM using the stimulation index (SI), the raw data for ECt calculation, produced by 3 laboratories. Descriptive statistics along with graphical representation of SI are presented. For inferential statistics, parametric and non-parametric methods were applied to test the reproducibility of the SI of a concurrent positive control, and the robustness of the results was investigated. Descriptive statistics and graphical representation of SI alone could illustrate within- and between-laboratory reproducibility. Inferential statistics employing parametric and non-parametric methods drew similar conclusions. While all labs passed the within- and between-laboratory reproducibility criteria given by the OECD TG429 PS based on ECt values, statistical evaluation based on SI values showed that only two labs achieved within-laboratory reproducibility. For those two labs, between-laboratory reproducibility could also be attained based on inferential as well as descriptive statistics. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
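The kind of parametric versus non-parametric between-laboratory comparison described can be sketched with SciPy (the SI values below are simulated for three hypothetical labs, not the study's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated stimulation-index (SI) measurements from three hypothetical labs
labs = [rng.normal(loc=3.0, scale=0.5, size=10) for _ in range(3)]

# Parametric between-laboratory comparison: one-way ANOVA
f_stat, p_anova = stats.f_oneway(*labs)

# Non-parametric counterpart: Kruskal-Wallis H test
h_stat, p_kw = stats.kruskal(*labs)

# A non-significant p-value here is consistent with between-lab reproducibility
print(f"ANOVA p = {p_anova:.3f}, Kruskal-Wallis p = {p_kw:.3f}")
```

Running both families of tests, as the study did, checks whether the conclusion is robust to the normality assumption.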
Statistical Analysis of Research Data | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.
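The univariate topics listed for the first day (descriptive statistics, one- and two-sample inference) can be illustrated with SciPy (the sample values are invented):

```python
import numpy as np
from scipy import stats

sample_a = np.array([5.1, 4.9, 5.6, 5.3, 5.0, 5.4])
sample_b = np.array([4.4, 4.8, 4.6, 4.2, 4.7, 4.5])

# Descriptive statistics: mean and sample standard deviation
print(sample_a.mean(), sample_a.std(ddof=1))

# One-sample t-test against a hypothesised mean of 5.0
t1, p1 = stats.ttest_1samp(sample_a, popmean=5.0)

# Two-sample (Welch) t-test comparing the two groups
t2, p2 = stats.ttest_ind(sample_a, sample_b, equal_var=False)
print(f"one-sample p = {p1:.3f}, two-sample p = {p2:.3f}")
```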
[Application of statistics in chronic-disease-related observational research papers].
Hong, Zhi-heng; Wang, Ping; Cao, Wei-hua
2012-09-01
To study the application of statistics in chronic-disease-related observational research papers recently published in Chinese Medical Association journals with an impact factor above 0.5. Using a self-developed criterion, two investigators independently assessed the application of statistics in these papers, resolving differences of opinion through discussion. A total of 352 papers from 6 journals, including the Chinese Journal of Epidemiology, Chinese Journal of Oncology, Chinese Journal of Preventive Medicine, Chinese Journal of Cardiology, Chinese Journal of Internal Medicine and Chinese Journal of Endocrinology and Metabolism, were reviewed. The rates of clear statement of research objectives, target population, sample issues, inclusion criteria and variable definitions were 99.43%, 98.57%, 95.43%, 92.86% and 96.87%, respectively. The rates of correct description of quantitative and qualitative data were 90.94% and 91.46%, respectively. The rates of correctly expressed results for statistical inference methods related to quantitative data, qualitative data and modeling were 100%, 95.32% and 87.19%, respectively. In 89.49% of the papers, the conclusions directly responded to the research objectives. However, 69.60% of the papers did not state the exact name of the study design used, and 11.14% lacked a statement of the exclusion criteria. Only 5.16% of the papers clearly explained the sample size estimation, and only 24.21% clearly described the variable value assignment. The rate of introduction of the statistical software and database methods used was only 24.15%. 18.75% of the papers did not describe the statistical inference methods sufficiently, and a quarter of the papers did not use standardization appropriately.
Regarding statistical inference, the prerequisites of the statistical tests were described in only 24.12% of the papers, and 9.94% did not employ the statistical inference method that should have been used. The main deficiencies in the application of statistics in chronic-disease-related observational research papers were as follows: lack of sample-size determination, insufficient description of variable value assignment, statistical methods not introduced clearly or properly, and lack of consideration of the prerequisites for the statistical inference methods used.
Environmental statistics with S-Plus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millard, S.P.; Neerchal, N.K.
1999-12-01
The combination of easy-to-use software with easy access to a description of the statistical methods (definitions, concepts, etc.) makes this book an excellent resource. One of the major features of this book is the inclusion of general information on environmental statistical methods and examples of how to implement these methods using the statistical software package S-Plus and the add-in modules Environmental-Stats for S-Plus, S+SpatialStats, and S-Plus for ArcView.
Statistical summaries of New Jersey streamflow records
Laskowski, Stanley L.
1970-01-01
In 1961 the U.S. Geological Survey prepared a report which was published by the State of New Jersey as Water Resources Circular 6, "New Jersey Streamflow Records Analyzed with Electronic Computer" by Miller and McCall. Basic discharge data for periods of record through 1958 were analyzed for 59 stream-gaging stations in New Jersey, and flow-duration, low-flow, and high-flow tables were presented. The purpose of the current report is to update and expand Circular 6 by presenting, with a few meaningful statistics and tables, the bulk of the information that may be obtained from the mass of streamflow records available. The records for 79 of approximately 110 stream-gaging stations presently or previously operated in New Jersey, plus records for three stations in Pennsylvania and one in New York, are presented in summarized form. In addition to including a greater number of stations, this report lists more years of record and more tables for each station. A description of the station, three arrangements of data summarizing the daily flow records and one table listing statistics of the monthly mean flows are provided. No data representing instantaneous extreme flows are given. Plotting positions for the three types of curves describing the characteristics of daily discharge are listed for each station. Statistical parameters are also presented so that alternate curves may be drawn. All stations included in this report have 5 or more years of record. The data presented herein are based on observed flow past the gaging station. For any station where the observed flow is affected by regulation or diversion, a "Remarks" paragraph explaining the possible effect on the data is included in the station description. Since any streamflow record is a sample in time, the data derived from these records can provide only a guide to expected future flows.
For this reason the flow records are analyzed by statistical techniques, and the magnitude of sampling errors should be recognized. These analyzed data will be useful to a large number of municipal, state, and federal agencies, industries, utilities, engineers, and hydrologists concerned with the availability, conservation, control, and use of surface waters. The tabulated data and curves illustrated herein can be used to select sites for water supplies, to determine flood or drought storage requirements, and to appraise the adequacy of flows for dilution of wastes or generation of power. The statistical values presented herein can be used in computer programs, available in many universities, federal and state agencies, and engineering firms, for a broad spectrum of research and other studies.
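A flow-duration table of the kind described can be computed from a daily-discharge series with NumPy percentiles (the discharge record below is synthetic; percentile conventions vary between agencies):

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic daily mean discharge record (cfs); streamflow is often
# approximately log-normal, so a log-normal draw stands in for real data
discharge = rng.lognormal(mean=4.0, sigma=0.8, size=365)

# Flow-duration curve: the flow equalled or exceeded a given percent of the time
exceedance = [5, 10, 25, 50, 75, 90, 95]
flows = np.percentile(discharge, [100 - p for p in exceedance])
for p, q in zip(exceedance, flows):
    print(f"flow exceeded {p:3d}% of the time: {q:9.1f} cfs")
```

High-exceedance values characterize low flows (relevant to drought storage and waste dilution), while low-exceedance values characterize flood flows.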
Diagnostics of Thermal Spraying Plasma Jets
NASA Astrophysics Data System (ADS)
Fauchais, P.; Coudert, J. F.; Vardelle, M.; Vardelle, A.; Denoirjean, A.
D.C. thermal plasma jets are strongly affected on the one hand by the arc root fluctuations at the anode, resulting in a type of pulsed flow and enhanced turbulence, and on the other hand by the entrainment of surrounding cold gas into the plasma jet. These phenomena and the resulting temperature distributions have been studied using a wide range of diagnostic techniques including fast cameras, laser Doppler anemometry (LDA), coherent anti-Stokes Raman spectroscopy (CARS), Rayleigh scattering, emission spectroscopy, Schlieren photography, enthalpy probes and sampling probes. The information given by these techniques is evaluated and compared. The effect of the arc fluctuations on the spectroscopic measurements is emphasized, and the possibility of using these fluctuations to obtain information on the arc behaviour and the axial velocity of the jet is presented. Optimization of plasma processing of solid particles requires information about their size and surface temperature, as well as number flux and velocity distributions at various locations in the flow field. The different statistical techniques of in-flight measurement are discussed together with their limitations. A method to determine the temperature and species density of the vapor cloud or comet travelling with each particle in flight is then presented. However, such statistical measurements present ambiguities in their interpretation, which can be addressed only by additional measurements to determine the velocity, diameter, and surface temperature of a single particle in flight. Moreover, information on single particles is required in order to understand the coating properties, which depend strongly on the way the particles flatten and solidify upon impact. A method to obtain data related to a single particle in flight and to follow the temperature evolution of the corresponding splat upon cooling is presented.
The paper concludes with a description of the experimental techniques used to follow the temperature evolution of the successive layers and passes. This is important because the temperature distribution within the coating and substrate controls the adhesion and cohesion of coatings as well as their residual stress.
Saleem, Fahad; Hassali, Mohamed Azmi; Shafie, Asrul Akmal; Atif, Muhammad; Ul Haq, Noman; Aljadhey, Hisham
2012-07-01
This study aims to evaluate the association between health-related quality of life (HRQoL) and disease-state knowledge among the hypertensive population of Pakistan. A cross-sectional descriptive study was undertaken with a representative cohort of hypertension patients. Using a prevalence-based sampling technique, a total of 385 hypertensive patients were selected from two public hospitals of Quetta city, Pakistan. The Hypertension Fact Questionnaire (HFQ) and the European Quality of Life scale (EQ-5D) were used for data collection. Statistical Package for the Social Sciences 16.0 was used to compute descriptive analyses of patients' demographic and disease-related information. Categorical variables were described as percentages, while continuous variables were expressed as mean ± standard deviation (SD). Spearman's rho correlation was used to identify the association between study variables. The mean (SD) age of the patients was 39.02 (6.59) years, with 68.8% males (n=265). The mean (SD) duration of hypertension was 3.01 (0.93) years. Forty percent (n=154) had a bachelor degree, with 34.8% (n=134) working in the private sector. Almost forty-one percent (n=140) had a monthly income of more than 15,000 Pakistani rupees, and 75.1% (n=289) had urban residency. The mean EQ-5D descriptive score (0.46±0.28) and EQ-VAS score (63.97±6.62) indicated lower HRQoL in our study participants. The mean knowledge score was 8.03±0.42. The correlation coefficient between HRQoL and knowledge was 0.208 (p<0.001), indicating a weak positive association. The results of this study highlight that hypertension knowledge is only weakly associated with HRQoL, suggesting that imparting knowledge to patients does not necessarily improve HRQoL. More attention should be given to identifying individualized factors affecting HRQoL.
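The Spearman's rho association used in the study can be computed as follows (the paired scores here are toy values, not the study data):

```python
from scipy import stats

# Hypothetical paired scores: disease knowledge vs. EQ-5D index (illustrative)
knowledge = [6, 7, 8, 8, 9, 10, 11, 12]
eq5d = [0.31, 0.40, 0.38, 0.45, 0.44, 0.52, 0.50, 0.61]

# Spearman's rho is a rank correlation, so it does not assume normality
# or a linear relationship between the raw scores
rho, p_value = stats.spearmanr(knowledge, eq5d)
print(f"rho = {rho:.3f}, p = {p_value:.3f}")
```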
ERIC Educational Resources Information Center
Sinaga, Megawati
2017-01-01
The objective of this experimental research was to investigate the effect of the Roundtable and Clustering teaching techniques and students' personal traits on students' achievement in descriptive writing. The grade IX students of SMP Negeri 2 Pancurbatu in the 2016/2017 academic year were chosen as the population of this research. The…
Statistical Package User’s Guide.
1980-08-01
C. STACH — Nonparametric Descriptive Statistics. D. CHIRA — Coefficient of Concordance. Test data: the program was tested using data from John Neter and William Wasserman, Applied Linear Statistical Models: Regression. Inputs include the length of the data file and a new file name (not the same as the raw data file); printout is produced as optioned. Comments: ranked data are used for program CHIRA.
Anger and depression levels of mothers with premature infants in the neonatal intensive care unit.
Kardaş Özdemir, Funda; Akgün Şahin, Zümrüt
2016-02-04
The aim of this study was to examine the anger and depression levels of mothers who had a premature infant in the NICU, and the factors affecting them. This descriptive study was performed in the level I and II NICU units of three state hospitals in Turkey. The data were collected with a demographic questionnaire, the Beck Depression Inventory and the Anger Expression Scale. Descriptive statistics, parametric and nonparametric statistical tests and Pearson correlation were used in the data analysis. Mothers whose infants were under care in the NICU had moderate depression. Mothers' educational level, income level and the gender of the infants were statistically significant factors (p<0.05). A statistically significant positive relationship was found between depression and trait anger scores, and a statistically significant negative relationship between depression and anger-control scores (p<0.05). Based on these results, it is recommended that mothers at risk of depression and anger in the NICU be evaluated by nurses, and that nurses develop their counselling roles.
Gómez-Escolar Larrañaga, Lucía; Delgado Martínez, Julio; Miguelena Bobadilla, José María
2017-12-01
It has been shown that breast reconstruction after a mastectomy has a great psycho-social impact on patients. For this reason, it is performed in an increasing percentage of cases. There are two major groups of reconstructive techniques: reconstruction with implants and reconstruction with autologous tissue of the patient. In order to make a more objective assessment of the results, it is important to know how satisfied these patients are with them. Therefore, we performed a study using Q-BREAST, the aim of which was to analyze the satisfaction of mastectomized patients with the different surgical reconstruction techniques. A retrospective, descriptive and observational study of patients reconstructed in our service from 2008 to 2011 was carried out. Patient satisfaction levels were compared according to the surgical technique used in breast reconstruction using the Q-BREAST test, which was mailed to them. There were no statistical differences in the levels of satisfaction in terms of age, type of mastectomy, adjuvant treatment or existence of complications. Higher levels of satisfaction were observed in patients reconstructed with autologous tissue versus implants (P=.028). Patients reconstructed with autologous tissue have higher levels of satisfaction than those reconstructed with implants. Copyright © 2017 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.
Alhadlaq, Adel; Alkhadra, Thamer; El-Bialy, Tarek
2016-05-01
To compare anchorage conditions in cases in which a transpalatal arch was used to enhance anchorage in both continuous and segmented arch techniques. Twenty cases that required first premolar extraction for orthodontic treatment and a transpalatal arch to enhance anchorage were included in this study. Ten cases were treated using the continuous arch technique, while the other 10 were treated using 0.019 × 0.025-inch TMA T-loops with a posterior anchorage bend according to the Burstone and Marcotte description. Lateral cephalometric analysis before and after canine retraction was performed using Ricketts analysis to measure the anteroposterior position of the upper first molar relative to the vertical line from the Pt point. Data were analyzed using an independent-sample t-test. There was a statistically significant forward movement of the upper first molar in cases treated with continuous arch mechanics (4.5 ± 3.0 mm) compared with segmented arch mechanics (-0.7 ± 1.4 mm; P = .01). The posterior anchorage bend added to the T-loop used to retract the maxillary canine can enhance anchorage during maxillary canine retraction.
Regression analysis for solving diagnosis problem of children's health
NASA Astrophysics Data System (ADS)
Cherkashina, Yu A.; Gerget, O. M.
2016-04-01
This paper reports research devoted to the application of statistical techniques, namely regression analysis, to assess the health status of children in the neonatal period based on medical data (hemostatic parameters, blood test parameters, gestational age, vascular endothelial growth factor) measured at 3-5 days of life. A detailed description of the studied medical data is given, and a binary logistic regression procedure is discussed. The basic results of the research are presented: a classification table of predicted versus observed values is shown and the overall percentage of correct recognition is determined; the regression equation coefficients are calculated and the general regression equation is written from them. Based on the results of the logistic regression, ROC analysis was performed: the sensitivity and specificity of the model were calculated and ROC curves were constructed. These mathematical techniques allow the diagnosis of children's health with a high quality of recognition. The results contribute to the development of evidence-based medicine and have high practical importance in the professional activity of the author.
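The binary logistic regression and ROC workflow described can be sketched with scikit-learn (synthetic features stand in for the clinical measurements, which are not available here):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

# Synthetic stand-in for the clinical dataset (features are placeholders)
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

model = LogisticRegression().fit(X, y)

# Classification table of predicted vs. observed values,
# and the overall percentage of correct recognition
pred = model.predict(X)
print(confusion_matrix(y, pred))
print(f"overall % correct: {100 * (pred == y).mean():.1f}")

# ROC analysis: area under the curve from predicted probabilities
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"AUC = {auc:.3f}")
```

Sweeping the probability threshold behind `predict` traces the ROC curve, from which sensitivity and specificity at any operating point can be read off.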
De Sousa Fontes, Aderito; Sandrea Jiménez, Minaret; Chacaltana Ayerve, Rosa R
2013-01-01
The microdebrider is a surgical tool which has been used successfully in many endoscopic surgical procedures in otolaryngology. In this study, we analysed our experience using this powered instrument in the resection of obstructive nasal septum deviations. This was a longitudinal, prospective, descriptive study conducted between January and June 2007 on 141 patients who consulted for chronic nasal obstruction caused by a septal deviation or deformity and underwent powered endoscopic septoplasty (PES). The mean age was 39.9 years (range 15-63 years); 60.28% were male (n=85). Nasal symptom severity decreased after surgery, from 6.12 preoperatively to 2.01 postoperatively. Patients undergoing PES had a reduction of nasal symptoms between the pre- and postoperative periods that was statistically significant (P<.05). There were no statistically significant differences between the results at the 2nd week, 6th week and 5th year after surgery. All patients (100%) were satisfied with the results of surgery, and no patient answered "No" to the question added to assess patient satisfaction after surgery. Minor complications in the postoperative period were present in 4.96% of the cases. Powered endoscopic septoplasty allows accurate, conservative repair of obstructive nasal septum deviations, with fewer complications and better functional results. In our experience, this technique offered significant perioperative advantages with high postoperative patient satisfaction in terms of reducing the severity of nasal symptoms. Copyright © 2012 Elsevier España, S.L. All rights reserved.
Statistics of high-level scene context.
Greene, Michelle R
2013-01-01
Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information.
Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics rather than intuition.
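A "bag of words" scene classifier of the kind evaluated can be sketched as a linear model over object-count vectors (the object vocabulary, counts and labels below are toy inventions, not the LabelMe data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy bag-of-objects counts per scene; columns = a hypothetical object
# vocabulary (car, tree, sink, bed) -- purely illustrative
X = np.array([
    [3, 1, 0, 0],  # street scenes
    [4, 2, 0, 0],
    [2, 0, 0, 0],
    [0, 0, 1, 1],  # indoor scenes
    [0, 1, 2, 1],
    [0, 0, 1, 2],
])
y = np.array([0, 0, 0, 1, 1, 1])  # 0 = street, 1 = indoor

# A linear classifier over object counts: the scene is categorized purely
# by which objects it contains, ignoring their spatial arrangement
clf = LogisticRegression().fit(X, y)
print(clf.predict(X))
```

Because only the object list is used, this level of description deliberately discards the structural (spatial) information the study found to add little.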
Lyssenko, Nathalie; Redies, Christoph; Hayn-Leichsenring, Gregor U
2016-01-01
One of the major challenges in experimental aesthetics is the uncertainty of the terminology used in experiments. In this study, we recorded terms that are spontaneously used by participants to describe abstract artworks and studied their relation to the second-order statistical image properties of the same artworks (Experiment 1). We found that the usage frequency of some structure-describing terms correlates with statistical image properties, such as PHOG Self-Similarity, Anisotropy and Complexity. Additionally, emotion-associated terms correlate with measured color values. Next, based on the most frequently used terms, we created five different rating scales (Experiment 2) and obtained ratings of participants for the abstract paintings on these scales. We found significant correlations between descriptive score ratings (e.g., between structure and subjective complexity), between evaluative and descriptive score ratings (e.g., between preference and subjective complexity/interest) and between descriptive score ratings and statistical image properties (e.g., between interest and PHOG Self-Similarity, Complexity and Anisotropy). Additionally, we determined the participants' personality traits as described in the 'Big Five Inventory' (Goldberg, 1990; Rammstedt and John, 2005) and correlated them with the ratings and preferences of individual participants. Participants with higher scores for Neuroticism showed preferences for objectively more complex images, as well as a different notion of the term complex when compared with participants with lower scores for Neuroticism. In conclusion, this study demonstrates an association between objectively measured image properties and the subjective terms that participants use to describe or evaluate abstract artworks. Moreover, our results suggest that the description of abstract artworks, their evaluation and the preference of participants for their low-level statistical properties are linked to personality traits.
Lyssenko, Nathalie; Redies, Christoph; Hayn-Leichsenring, Gregor U.
2016-01-01
One of the major challenges in experimental aesthetics is the uncertainty of the terminology used in experiments. In this study, we recorded terms that are spontaneously used by participants to describe abstract artworks and studied their relation to the second-order statistical image properties of the same artworks (Experiment 1). We found that the usage frequency of some structure-describing terms correlates with statistical image properties, such as PHOG Self-Similarity, Anisotropy and Complexity. Additionally, emotion-associated terms correlate with measured color values. Next, based on the most frequently used terms, we created five different rating scales (Experiment 2) and obtained ratings of participants for the abstract paintings on these scales. We found significant correlations between descriptive score ratings (e.g., between structure and subjective complexity), between evaluative and descriptive score ratings (e.g., between preference and subjective complexity/interest) and between descriptive score ratings and statistical image properties (e.g., between interest and PHOG Self-Similarity, Complexity and Anisotropy). Additionally, we determined the participants’ personality traits as described in the ‘Big Five Inventory’ (Goldberg, 1990; Rammstedt and John, 2005) and correlated them with the ratings and preferences of individual participants. Participants with higher scores for Neuroticism showed preferences for objectively more complex images, as well as a different notion of the term complex when compared with participants with lower scores for Neuroticism. In conclusion, this study demonstrates an association between objectively measured image properties and the subjective terms that participants use to describe or evaluate abstract artworks. Moreover, our results suggest that the description of abstract artworks, their evaluation and the preference of participants for their low-level statistical properties are linked to personality traits. 
PMID:27445933
Targeted Muscle Reinnervation for Transradial Amputation: Description of Operative Technique.
Morgan, Emily N; Kyle Potter, Benjamin; Souza, Jason M; Tintle, Scott M; Nanos, George P
2016-12-01
Targeted muscle reinnervation (TMR) is a revolutionary surgical technique that, together with advances in upper extremity prostheses and advanced neuromuscular pattern recognition, allows intuitive and coordinated control in multiple planes of motion for shoulder disarticulation and transhumeral amputees. TMR also may provide improvement in neuroma-related pain and may represent an opportunity for sensory reinnervation as advances in prostheses and haptic feedback progress. Although most commonly utilized following shoulder disarticulation and transhumeral amputations, TMR techniques also represent an exciting opportunity for improvement in integrated prosthesis control and neuroma-related pain improvement in patients with transradial amputations. As there are no detailed descriptions of this technique in the literature to date, we provide our surgical technique for TMR in transradial amputations.
Feasibility study consisting of a review of contour generation methods from stereograms
NASA Technical Reports Server (NTRS)
Kim, C. J.; Wyant, J. C.
1980-01-01
A review of techniques for obtaining contour information from stereo pairs is given. Photogrammetric principles, including a description of stereoscopic vision, are presented. The use of conventional contour generation methods, such as the photogrammetric plotting technique, electronic correlator, and digital correlator, is described. Coherent optical techniques for contour generation are discussed and compared to the electronic correlator. The optical techniques are divided into two categories: (1) image plane operation and (2) frequency plane operation. The image plane correlators are further divided into three categories: (1) image-to-image correlators, (2) interferometric correlators, and (3) positive-negative transparencies. The frequency plane correlators are divided into two categories: (1) correlation of Fourier transforms and (2) filtering techniques.
Current practice in airway management: A descriptive evaluation.
Kjonegaard, Rebecca; Fields, Willa; King, Major L
2010-03-01
Ventilator-associated pneumonia, a common complication of mechanical ventilation, could be reduced if health care workers implemented evidence-based practices that decrease the risk for this complication. To determine current practice and differences in practices between registered nurses and respiratory therapists in managing patients receiving mechanical ventilation. A descriptive comparative design was used. A convenience sample of 41 registered nurses and 25 respiratory therapists who manage critical care patients treated with mechanical ventilation at Sharp Grossmont Hospital, La Mesa, California, completed a survey on suctioning techniques and airway management practices. Descriptive and inferential statistics were used to analyze the data. Significant differences existed between nurses and respiratory therapists for hyperoxygenation before suctioning (P =.03). In the 2 groups, nurses used the ventilator for hyper-oxygenation more often, and respiratory therapists used a bag-valve device more often (P =.03). Respiratory therapists instilled saline (P <.001) and rinsed the closed system with saline after suctioning (P =.003) more often than nurses did. Nurses suctioned oral secretions (P <.001) and the nose of orally intubated patients (P =.01), brushed patients' teeth with a toothbrush (P<.001), and used oral swabs to clean the mouth (P <.001) more frequently than respiratory therapists did. Nurses and respiratory therapists differed significantly in the management of patients receiving mechanical ventilation. To reduce the risk of ventilator-associated pneumonia, both nurses and respiratory therapists must be consistent in using best practices when managing patients treated with mechanical ventilation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borovsky, Joseph E; Cayton, Thomas E; Denton, Michael H
Electron flux measurements from 7 satellites in geosynchronous orbit from 1990-2007 are fit with relativistic bi-Maxwellians, yielding a number density n and temperature T description of the outer electron radiation belt. For 54.5 spacecraft years of measurements the median value of n is 3.7×10⁻⁴ cm⁻³ and the median value of T is 142 keV. General statistical properties of n, T, and the 1.1-1.5 MeV flux J are investigated, including local-time and solar-cycle dependencies. Using superposed-epoch analysis triggered on storm onset, the evolution of the outer electron radiation belt through high-speed-stream-driven storms is investigated. The number density decay during the calm before the storm is seen, relativistic-electron dropouts and recoveries from dropout are investigated, and the heating of the outer electron radiation belt during storms is examined. Using four different triggers (SSCs, southward-IMF CME sheaths, southward-IMF magnetic clouds, and minimum Dst), CME-driven storms are analyzed with superposed-epoch techniques. For CME-driven storms an absence of a density decay prior to storm onset is found, the compression of the outer electron radiation belt at the time of SSC is analyzed, the number-density increase and temperature decrease during the storm main phase are seen, and the increase in density and temperature during the storm recovery phase is observed. Differences are found between the density-temperature and the flux descriptions, with more information for analysis being available in the density-temperature description.
Molecular filter based planar Doppler velocimetry
NASA Astrophysics Data System (ADS)
Elliott, Gregory S.; Beutner, Thomas J.
1999-11-01
Molecular filter based diagnostics are continuing to gain popularity as a research tool for investigations in areas of aerodynamics, fluid mechanics, and combustion. This class of diagnostics has gone by many terms including Filtered Rayleigh Scattering, Doppler Global Velocimetry, and Planar Doppler Velocimetry. The majority of this article reviews recent advances in Planar Doppler Velocimetry in measuring up to three velocity components over a planar region in a flowfield. The history of the development of these techniques is given with a description of typical systems, components, and levels of uncertainty in the measurement. Current trends indicate that uncertainties on the order of 1 m/s are possible with these techniques. A comprehensive review is also given on the application of Planar Doppler Velocimetry to laboratory flows, supersonic flows, and large scale subsonic wind tunnels. The article concludes with a description of future trends, which may simplify the technique, followed by a description of techniques which allow multi-property measurements (i.e. velocity, density, temperature, and pressure) simultaneously.
2011-01-01
Background Improvements in the techniques for metabolomics analyses and growing interest in metabolomic approaches are resulting in the generation of increasing numbers of metabolomic profiles. Platforms are required for profile management, as a function of experimental design, and for metabolite identification, to facilitate the mining of the corresponding data. Various databases have been created, including organism-specific knowledgebases and analytical technique-specific spectral databases. However, there is currently no platform meeting the requirements for both profile management and metabolite identification for nuclear magnetic resonance (NMR) experiments. Description MeRy-B, the first platform for plant 1H-NMR metabolomic profiles, is designed (i) to provide a knowledgebase of curated plant profiles and metabolites obtained by NMR, together with the corresponding experimental and analytical metadata, (ii) for queries and visualization of the data, (iii) to discriminate between profiles with spectrum visualization tools and statistical analysis, (iv) to facilitate compound identification. It contains lists of plant metabolites and unknown compounds, with information about experimental conditions, the factors studied and metabolite concentrations for several plant species, compiled from more than one thousand annotated NMR profiles for various organs or tissues. Conclusion MeRy-B manages all the data generated by NMR-based plant metabolomics experiments, from description of the biological source to identification of the metabolites and determinations of their concentrations. It is the first database allowing the display and overlay of NMR metabolomic profiles selected through queries on data or metadata. MeRy-B is available from http://www.cbib.u-bordeaux2.fr/MERYB/index.php. PMID:21668943
Skin tumor area extraction using an improved dynamic programming approach.
Abbas, Qaisar; Celebi, M E; Fondón García, Irene
2012-05-01
Border (B) description of melanoma and other pigmented skin lesions is one of the most important tasks for the clinical diagnosis of dermoscopy images using the ABCD rule. For an accurate description of the border, there must be an effective skin tumor area extraction (STAE) method. However, this task is complicated due to uneven illumination, artifacts present in the lesions and smooth areas or fuzzy borders of the desired regions. In this paper, a novel STAE algorithm based on improved dynamic programming (IDP) is presented. The STAE technique consists of the following four steps: color space transform, pre-processing, rough tumor area detection and refinement of the segmented area. The procedure is performed in the CIE L(*) a(*) b(*) color space, which is approximately uniform and is therefore related to dermatologist's perception. After pre-processing the skin lesions to reduce artifacts, the DP algorithm is improved by introducing a local cost function, which is based on color and texture weights. The STAE method is tested on a total of 100 dermoscopic images. In order to compare the performance of STAE with other state-of-the-art algorithms, various statistical measures based on dermatologist-drawn borders are utilized as a ground truth. The proposed method outperforms the others with a sensitivity of 96.64%, a specificity of 98.14% and an error probability of 5.23%. The results demonstrate that this STAE method by IDP is an effective solution when compared with other state-of-the-art segmentation techniques. The proposed method can accurately extract tumor borders in dermoscopy images. © 2011 John Wiley & Sons A/S.
Antweiler, Ronald C.; Taylor, Howard E.
2008-01-01
The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substituting zero or the detection limit value to censored data.
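The substitution treatments evaluated above can be sketched in a few lines. The snippet below is an illustrative sketch, not the authors' code: the detection limit, sample size, and lognormal population are invented for the demonstration, and only the substitution class of treatments is shown.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "true" concentrations (lognormal, as is typical of trace elements)
true = rng.lognormal(mean=0.0, sigma=1.0, size=1000)
dl = 0.5  # hypothetical detection limit; roughly a quarter of draws fall below it

observed = true.copy()
censored = observed < dl  # flags for below-detection-limit values

def substitute(values, mask, fill):
    out = values.copy()
    out[mask] = fill
    return out

# Three common substitution treatments for left-censored data
est_zero = substitute(observed, censored, 0.0).mean()
est_half = substitute(observed, censored, dl / 2).mean()
est_rand = observed.copy()
est_rand[censored] = rng.uniform(0.0, dl, size=censored.sum())
est_rand = est_rand.mean()

print(f"true mean          {true.mean():.3f}")
print(f"substitute 0       {est_zero:.3f}")
print(f"substitute DL/2    {est_half:.3f}")
print(f"substitute U(0,DL) {est_rand:.3f}")
```

On data this lightly censored, assigning one-half the detection limit tracks the true mean closely, while substituting zero biases the estimate low, consistent with the relative performance reported above.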
Statistical analysis of vehicle crashes in Mississippi based on crash data from 2010 to 2014.
DOT National Transportation Integrated Search
2017-08-15
Traffic crash data from 2010 to 2014 were collected by Mississippi Department of Transportation (MDOT) and extracted for the study. Three tasks were conducted in this study: (1) geographic distribution of crashes; (2) descriptive statistics of crash ...
Using Carbon Emissions Data to "Heat Up" Descriptive Statistics
ERIC Educational Resources Information Center
Brooks, Robert
2012-01-01
This article illustrates using carbon emissions data in an introductory statistics assignment. The carbon emissions data has desirable characteristics including: choice of measure; skewness; and outliers. These complexities allow research and public policy debate to be introduced. (Contains 4 figures and 2 tables.)
Statistical mechanics of economics I
NASA Astrophysics Data System (ADS)
Kusmartsev, F. V.
2011-02-01
We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical interference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws which govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the economy in the USA between 1996 and 2008 and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters differ for each year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.
How Good Are Statistical Models at Approximating Complex Fitness Landscapes?
du Plessis, Louis; Leventhal, Gabriel E.; Bonhoeffer, Sebastian
2016-01-01
Fitness landscapes determine the course of adaptation by constraining and shaping evolutionary trajectories. Knowledge of the structure of a fitness landscape can thus predict evolutionary outcomes. Empirical fitness landscapes, however, have so far only offered limited insight into real-world questions, as the high dimensionality of sequence spaces makes it impossible to exhaustively measure the fitness of all variants of biologically meaningful sequences. We must therefore revert to statistical descriptions of fitness landscapes that are based on a sparse sample of fitness measurements. It remains unclear, however, how much data are required for such statistical descriptions to be useful. Here, we assess the ability of regression models accounting for single and pairwise mutations to correctly approximate a complex quasi-empirical fitness landscape. We compare approximations based on various sampling regimes of an RNA landscape and find that the sampling regime strongly influences the quality of the regression. On the one hand it is generally impossible to generate sufficient samples to achieve a good approximation of the complete fitness landscape, and on the other hand systematic sampling schemes can only provide a good description of the immediate neighborhood of a sequence of interest. Nevertheless, we obtain a remarkably good and unbiased fit to the local landscape when using sequences from a population that has evolved under strong selection. Thus, current statistical methods can provide a good approximation to the landscape of naturally evolving populations. PMID:27189564
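A regression of the kind described above, with main (single-mutation) effects plus pairwise interaction terms fitted to a sparse sample of genotype-fitness pairs, can be sketched as follows. This is a toy illustration on an invented binary landscape, not the RNA landscape used in the study.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
L = 6  # hypothetical sequence length, binary alleles

# Synthetic "true" landscape: additive effects plus pairwise epistasis
main = rng.normal(0, 1.0, L)
pairs = list(itertools.combinations(range(L), 2))
epi = rng.normal(0, 0.3, len(pairs))

def fitness(g):
    f = g @ main
    f += sum(w * g[i] * g[j] for w, (i, j) in zip(epi, pairs))
    return f

def features(g):
    # Design row: intercept, single-mutation terms, pairwise terms
    return np.concatenate(([1.0], g, [g[i] * g[j] for i, j in pairs]))

# Sparse random sample of genotypes, a stand-in for measured fitnesses
G = rng.integers(0, 2, size=(120, L)).astype(float)
X = np.array([features(g) for g in G])
y = np.array([fitness(g) for g in G])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Evaluate the approximation on unseen genotypes
G_test = rng.integers(0, 2, size=(50, L)).astype(float)
pred = np.array([features(g) for g in G_test]) @ coef
truth = np.array([fitness(g) for g in G_test])
r = np.corrcoef(pred, truth)[0, 1]
print(f"correlation on held-out genotypes: {r:.3f}")
```

Because this toy landscape contains only single and pairwise effects, the regression recovers it essentially exactly; on real landscapes with higher-order epistasis the fit degrades, which is the gap the study quantifies.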
Attitude towards Pre-Marital Genetic Screening among Students of Osun State Polytechnics in Nigeria
ERIC Educational Resources Information Center
Odelola, J. O.; Adisa, O.; Akintaro, O. A.
2013-01-01
This study investigated the attitude towards pre-marital genetic screening among students of Osun State Polytechnics. Descriptive survey design was used for the study. The instrument for data collection was self developed and structured questionnaire in four-point likert scale format. Descriptive statistics of frequency count and percentages were…
Basic School Teachers' Perceptions about Curriculum Design in Ghana
ERIC Educational Resources Information Center
Abudu, Amadu Musah; Mensah, Mary Afi
2016-01-01
This study focused on teachers' perceptions about curriculum design and barriers to their participation. The sample size was 130 teachers who responded to a questionnaire. The analyses made use of descriptive statistics and descriptions. The study found that the level of teachers' participation in curriculum design is low. The results further…
Descriptive and dynamic psychiatry: a perspective on DSM-III.
Frances, A; Cooper, A M
1981-09-01
The APA Task Force on Nomenclature and Statistics attempted to make DSM-III a descriptive nosology that is atheoretical in regard to etiology. The authors believe that a sharp polarity between morphological classification and explanatory formulation is artificial and misleading, and they critically review DSM-III from a psychodynamic perspective. They compare and contrast the descriptive orientation in psychiatry with the psychodynamic orientation and conclude that the two approaches overlap, that they are complementary and necessary to each other, and that there is a descriptive data base underlying dynamic psychiatry which may be usefully included in future nomenclatures.
Constante, Isa Geralda Teixeira; Davidowicz, Harry; Barletta, Fernando Branco; de Moura, Abilio Albuquerque Maranhão
2007-01-01
The purpose of this study was to compare, in vitro, by means of computerized analysis of digital radiographic images, the anatomic alterations produced in the mandibular molar tooth dentinal walls of mesiobuccal canals with severe curvature by three different endodontic techniques: Progressive Preparation, Staged and Serial Preparation. A selection was made of 45 extracted human mandibular molars with root curvatures greater than 25 degrees. They were divided into three groups, one for each technique studied, which were then sub-divided into three sub-groups in accordance with the position of the curvature along the root: cervical, median or apical. After access surgery and tooth length determination, the canals were filled with 100% barium sulphate radiological contrast and the teeth were then radiographed with a direct digital radiography system, using a special apparatus capable of keeping the samples in the same spatial position during the different radiographic takes. After the above-mentioned endodontic techniques had been performed, the teeth were again filled with barium sulphate and were also radiographed under the same previously mentioned conditions. The pre- and post-operative digital images were then analyzed in two computer programs, AutoCAD 2004 and CorelDraw 10, to assess, respectively, the areas and the horizontal alterations which occurred in the internal and external walls of the root canals. The results indicated that although no significant differences among the techniques were shown in the statistical analysis, in a descriptive analysis the Progressive Preparation technique was shown to be more regular, uniform and effective.
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro
2012-06-01
We developed a quantum-like model describing the gene regulation of glucose/lactose metabolism in a bacterium, Escherichia coli. Our quantum-like model can be considered as a kind of operational formalism for microbiology and genetics. Instead of trying to describe processes in a cell in full detail, we propose a formal operator description. Such a description may be very useful in situations in which a detailed description of the processes is impossible or extremely complicated. We analyze statistical data obtained from experiments, and we compute the degree of E. coli's preference within adaptive dynamics. It is known that there are several types of E. coli characterized by the metabolic system. We demonstrate that the same type of E. coli can be described by well-determined operators; we find invariant operator quantities characterizing each type. Such invariant quantities can be calculated from the obtained statistical data.
Interactive application of quadratic expansion of chi-square statistic to nonlinear curve fitting
NASA Technical Reports Server (NTRS)
Badavi, F. F.; Everhart, Joel L.
1987-01-01
This report contains a detailed theoretical description of an all-purpose, interactive curve-fitting routine that is based on P. R. Bevington's description of the quadratic expansion of the Chi-Square statistic. The method is implemented in the associated interactive, graphics-based computer program. Taylor's expansion of Chi-Square is first introduced, and justifications for retaining only the first term are presented. From the expansion, a set of n simultaneous linear equations is derived, then solved by matrix algebra. A brief description of the code is presented along with a limited number of changes that are required to customize the program of a particular task. To evaluate the performance of the method and the goodness of nonlinear curve fitting, two typical engineering problems are examined and the graphical and tabular output of each is discussed. A complete listing of the entire package is included as an appendix.
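A minimal version of the scheme (expand chi-square to first order, solve the resulting simultaneous linear equations by matrix algebra, and iterate) can be sketched as below. The exponential-decay model and its parameters are invented for the demonstration; the report's interactive graphics are not reproduced.

```python
import numpy as np

def fit_gauss_newton(model, jac, p0, x, y, sigma, n_iter=20):
    """Minimize chi-square by repeatedly solving the linearized
    normal equations obtained from a truncated Taylor expansion."""
    p = np.array(p0, dtype=float)
    for _ in range(n_iter):
        r = (y - model(x, p)) / sigma      # weighted residuals
        J = jac(x, p) / sigma[:, None]     # weighted Jacobian
        # Solve (J^T J) dp = J^T r, the set of simultaneous linear equations
        dp = np.linalg.solve(J.T @ J, J.T @ r)
        p += dp
    return p

# Example: exponential decay with hypothetical parameters a, b
model = lambda x, p: p[0] * np.exp(-p[1] * x)
jac = lambda x, p: np.column_stack(
    [np.exp(-p[1] * x), -p[0] * x * np.exp(-p[1] * x)]
)

rng = np.random.default_rng(1)
x = np.linspace(0, 4, 40)
sigma = np.full_like(x, 0.02)
y = model(x, [2.0, 0.7]) + rng.normal(0, 0.02, x.size)

p = fit_gauss_newton(model, jac, [1.0, 1.0], x, y, sigma)
chi2 = np.sum(((y - model(x, p)) / sigma) ** 2)
print(p, chi2)
```

Each iteration solves the n simultaneous linear equations derived from retaining only the first term of the expansion; in practice a damping term (as in Levenberg-Marquardt) is often added when the starting guess is poor.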
Definition of a near real time microbiological monitor for space vehicles
NASA Technical Reports Server (NTRS)
Kilgore, Melvin V., Jr.; Zahorchak, Robert J.; Arendale, William F.
1989-01-01
Efforts to identify the ideal candidate to serve as the biological monitor on the space station Freedom are discussed. The literature review, the evaluation scheme, descriptions of candidate monitors, experimental studies, test beds, and culture techniques are discussed. Particular attention is given to descriptions of five candidate monitors or monitoring techniques: laser light scattering, primary fluorescence, secondary fluorescence, the volatile product detector, and the surface acoustic wave detector.
CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.
ERIC Educational Resources Information Center
Shermis, Mark D.; Albert, Susan L.
A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…
SEP thrust subsystem performance sensitivity analysis
NASA Technical Reports Server (NTRS)
Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.
1973-01-01
This is a two-part report on a solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of SEP thrust system performance for an Encke rendezvous mission. A detailed description of the effects of thrust subsystem hardware tolerances on mission performance is included, together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and the graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.
NASA Technical Reports Server (NTRS)
Park, Steve
1990-01-01
A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.
Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio
2009-12-01
Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.
Incorporating principal component analysis into air quality ...
The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Principal Component Analysis (PCA) with the intent of motivating its use by the evaluation community. One of the main objectives of PCA is to identify, through data reduction, the recurring and independent modes of variations (or signals) within a very large dataset, thereby summarizing the essential information of that dataset so that meaningful and descriptive conclusions can be made. In this demonstration, PCA is applied to a simple evaluation metric – the model bias associated with EPA's Community Multi-scale Air Quality (CMAQ) model when compared to weekly observations of sulfate (SO₄²⁻) and ammonium (NH₄⁺) ambient air concentrations measured by the Clean Air Status and Trends Network (CASTNet). The advantages of using this technique are demonstrated as it identifies strong and systematic patterns of CMAQ model bias across a myriad of spatial and temporal scales that are neither constrained to geopolitical boundaries nor monthly/seasonal time periods (a limitation of many current studies). The technique also identifies locations (station–grid cell pairs) that are used as indicators for a more thorough diagnostic evaluation thereby hastening and facilitating understanding of the prob
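For readers unfamiliar with the approach, the core computation is compact. The sketch below uses synthetic data, not CASTNet observations: it builds a site-by-week bias matrix carrying one shared seasonal signal and recovers that signal as the leading principal component.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sites, n_weeks = 30, 104

# Synthetic bias matrix (sites x weeks): one systematic seasonal signal
# shared across sites, plus noise; a stand-in for weekly model bias.
t = np.arange(n_weeks)
seasonal = np.sin(2 * np.pi * t / 52)          # common temporal mode
loadings = rng.uniform(0.5, 1.5, n_sites)      # site-specific strength
bias = np.outer(loadings, seasonal) + rng.normal(0, 0.2, (n_sites, n_weeks))

# PCA: center each site's series, then decompose with the SVD
centered = bias - bias.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S**2 / np.sum(S**2)

pc1 = Vt[0]                                    # leading temporal signal
corr = abs(np.corrcoef(pc1, seasonal)[0, 1])   # should track the seasonal mode
print(f"PC1 explains {explained[0]:.1%}; |corr with seasonal| = {corr:.3f}")
```

The leading component dominates the variance and reproduces the planted seasonal signal, which is exactly the kind of recurring, systematic bias pattern the paper extracts from the real model-observation pairs.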
Elderly quality of life impacted by traditional chinese medicine techniques
Figueira, Helena A; Figueira, Olivia A; Figueira, Alan A; Figueira, Joana A; Giani, Tania S; Dantas, Estélio HM
2010-01-01
Background: The shift in age structure is having a profound impact, suggesting that the aged should be consulted as reporters on the quality of their own lives. Objectives: The aim of this research was to establish the possible impact of traditional Chinese medicine (TCM) techniques on the quality of life (QOL) of the elderly. Sample: Two non-selected, volunteer groups of Rio de Janeiro municipality inhabitants: a control group (36 individuals), not using TCM, and an experimental group (28 individuals), using TCM at ABACO/Sohaku-in Institute, Brazil. Methods: A questionnaire on elderly QOL devised by the World Health Organization, the WHOQOL-Old, was adopted and descriptive statistical techniques were used: mean and standard deviation. The Shapiro–Wilk test checked the normality of the distribution. Furthermore, based on its normality distribution for the intergroup comparison, the Student t test was applied to facets 2, 4, 5, 6, and total score, and the Mann–Whitney U rank test to facets 1 and 3, both tests aiming to analyze the P value between experimental and control groups. The significance level utilized was 95% (P < 0.05). Results: The experimental group reported the highest QOL for every facet and the total score. Conclusions: The results suggest that TCM raises the level of QOL. PMID:21103400
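The analysis pipeline described above (descriptive statistics, a Shapiro-Wilk normality check, then Student's t test or the Mann-Whitney U test depending on the result) can be sketched with scipy.stats. The group sizes match the study, but the scores themselves are simulated, not the WHOQOL-Old data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical facet scores for two independent groups
control = rng.normal(60, 10, 36)        # 36 controls
experimental = rng.normal(67, 10, 28)   # 28 TCM users

# Descriptive statistics: mean and standard deviation
for name, sample in [("control", control), ("experimental", experimental)]:
    print(f"{name}: mean={sample.mean():.1f} sd={sample.std(ddof=1):.1f}")

# Shapiro-Wilk normality check decides which comparison to apply,
# mirroring the t test vs Mann-Whitney U choice described above
normal = (stats.shapiro(control).pvalue > 0.05
          and stats.shapiro(experimental).pvalue > 0.05)
if normal:
    test = stats.ttest_ind(experimental, control)
else:
    test = stats.mannwhitneyu(experimental, control)
print(f"{'t test' if normal else 'Mann-Whitney U'}: p = {test.pvalue:.4f}")
```

The same gate (parametric test when both samples pass the normality check, rank test otherwise) is what the study applied facet by facet.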
The discovery of indicator variables for QSAR using inductive logic programming
NASA Astrophysics Data System (ADS)
King, Ross D.; Srinivasan, Ashwin
1997-11-01
A central problem in forming accurate regression equations in QSAR studies is the selection of appropriate descriptors for the compounds under study. We describe a novel procedure for using inductive logic programming (ILP) to discover new indicator variables (attributes) for QSAR problems, and show that these improve the accuracy of the derived regression equations. ILP techniques have previously been shown to work well on drug design problems where there is a large structural component or where clear comprehensible rules are required. However, ILP techniques have had the disadvantage of only being able to make qualitative predictions (e.g. active, inactive) and not to predict real numbers (regression). We unify ILP and linear regression techniques to give a QSAR method that has the strength of ILP at describing steric structure, with the familiarity and power of linear regression. We evaluated the utility of this new QSAR technique by examining the prediction of biological activity with and without the addition of new structural indicator variables formed by ILP. In three out of five datasets examined the addition of ILP variables produced statistically better results (P < 0.01) over the original description. The new ILP variables did not increase the overall complexity of the derived QSAR equations and added insight into possible mechanisms of action. We conclude that ILP can aid in the process of drug design.
Rees, Terry F.
1990-01-01
Colloidal materials, dispersed phases with dimensions between 0.001 and 1 μm, are potential transport media for a variety of contaminants in surface and ground water. Characterization of these colloids, and identification of the parameters that control their movement, are necessary before transport simulations can be attempted. Two techniques that can be used to determine the particle-size distribution of colloidal materials suspended in natural waters are compared. Photon correlation spectroscopy (PCS) utilizes the Doppler frequency shift of photons scattered off particles undergoing Brownian motion to determine the size of colloids suspended in water. Photosedimentation analysis (PSA) measures the time-dependent change in optical density of a suspension of colloidal particles undergoing centrifugation. A description of both techniques, important underlying assumptions, and limitations is given. Results for a series of river water samples show that the colloid-size distribution means are statistically identical as determined by both techniques. This also is true of the mass median diameter (MMD), even though MMD values determined by PSA are consistently smaller than those determined by PCS. Because of this small negative bias, the skew parameters for the distributions are generally smaller for the PCS-determined distributions than for the PSA-determined distributions. Smaller polydispersity indices for the distributions are also determined by PCS.
Hatam, Nahid; Kafashi, Shahnaz; Kavosi, Zahra
2015-07-01
The importance of health indicators in recent years has created challenges in resource allocation. Balanced and fair distribution of health resources is one of the main principles in achieving equity. The goal of this cross-sectional descriptive study, conducted in 2010, was to classify health structural indicators in the Fars province using the scalogram technique. Health structural indicators were selected and classified in three categories, namely institutional, human resources, and rural health. The data were obtained from the statistical yearbook of Iran and were analyzed according to the scalogram technique. The distribution map of the Fars province was drawn using ArcGIS (geographic information system). The results showed an interesting health structural indicator map across the province. Our findings revealed that the city of Mohr, with a score of 85, and Zarindasht, with 36, had the highest and lowest scores, respectively. This information is valuable to provincial health policymakers to plan appropriately based on factual data and minimize chaos in allocating health resources. Based on such data and reflecting on local needs, one could develop equity-based resource allocation policies and prevent inequality. It is concluded that, as a top priority, the provincial policymakers should place dedicated deprivation programs in the Farashband, Eghlid and Zarindasht regions.
Quasi-Monochromatic Visual Environments and the Resting Point of Accommodation
1988-01-01
accommodation. No statistically significant differences were revealed to support the possibility of color-mediated differential regression to resting… discussed with respect to the general findings of the total sample as well as the specific behavior of individual participants. The summarized statistics… remaining ten varied considerably with respect to the averaged trends reported in the above descriptive statistics as well as with respect to precision
Physics-based statistical learning approach to mesoscopic model selection.
Taverniers, Søren; Haut, Terry S; Barros, Kipton; Alexander, Francis J; Lookman, Turab
2015-11-01
In materials science and many other research areas, models are frequently inferred without considering their generalization to unseen data. We apply statistical learning using cross-validation to obtain an optimally predictive coarse-grained description of a two-dimensional kinetic nearest-neighbor Ising model with Glauber dynamics (GD) based on the stochastic Ginzburg-Landau equation (sGLE). The latter is learned from GD "training" data using a log-likelihood analysis, and its predictive ability for various complexities of the model is tested on GD "test" data independent of the data used to train the model. Using two different error metrics, we perform a detailed analysis of the error between magnetization time trajectories simulated using the learned sGLE coarse-grained description and those obtained using the GD model. We show that both for equilibrium and out-of-equilibrium GD training trajectories, the standard phenomenological description using a quartic free energy does not always yield the most predictive coarse-grained model. Moreover, increasing the amount of training data can shift the optimal model complexity to higher values. Our results are promising in that they pave the way for the use of statistical learning as a general tool for materials modeling and discovery.
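The model-selection logic described here (fit on training folds, score on held-out folds, keep the complexity with the lowest cross-validated error) can be illustrated on a toy regression; polynomial degree stands in for the paper's coarse-grained model complexity, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data: a cubic trend plus noise stands in for the GD trajectories.
x = np.linspace(-1.0, 1.0, 60)
y = x ** 3 - x + rng.normal(0.0, 0.1, x.size)

def cv_error(degree, k=5):
    # K-fold cross-validation: fit on k-1 folds, score on the held-out fold.
    idx = np.arange(x.size)
    errs = []
    for held in np.array_split(idx, k):
        train = np.setdiff1d(idx, held)
        coeffs = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coeffs, x[held]) - y[held]) ** 2))
    return np.mean(errs)

errors = {d: cv_error(d) for d in range(1, 10)}
best = min(errors, key=errors.get)
print("most predictive model complexity (degree):", best)
```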
Performing Inferential Statistics Prior to Data Collection
ERIC Educational Resources Information Center
Trafimow, David; MacDonald, Justin A.
2017-01-01
Typically, in education and psychology research, the investigator collects data and subsequently performs descriptive and inferential statistics. For example, a researcher might compute group means and use the null hypothesis significance testing procedure to draw conclusions about the populations from which the groups were drawn. We propose an…
Inside Rural Pennsylvania: A Statistical Profile.
ERIC Educational Resources Information Center
Center for Rural Pennsylvania, Harrisburg.
Graphs, data tables, maps, and written descriptions give a statistical overview of rural Pennsylvania. A section on rural demographics covers population changes, racial and ethnic makeup, age cohorts, and families and income. Pennsylvania's rural population, the nation's largest, has increased more than its urban population since 1950, with the…
Education Statistics Quarterly, Summer 2002.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2002-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message…
Education Statistics Quarterly, Spring 2002.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2002-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message…
From creation and annihilation operators to statistics
NASA Astrophysics Data System (ADS)
Hoyuelos, M.
2018-01-01
A procedure to derive the partition function of non-interacting particles with exotic or intermediate statistics is presented. The partition function is directly related to the associated creation and annihilation operators that obey some specific commutation or anti-commutation relations. The cases of Gentile statistics, quons, Polychronakos statistics, and ewkons are considered. Ewkons statistics was recently derived from the assumption of free diffusion in energy space (Hoyuelos and Sisterna, 2016); an ideal gas of ewkons has negative pressure, a feature that makes them suitable for the description of dark energy.
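For the simplest of the listed cases, Gentile statistics, the single-state factor of the partition function is a finite geometric sum; the standard textbook form (a sketch, not taken from the paper) is:

```latex
% Gentile statistics: at most p particles per single-particle state,
% interpolating between Fermi-Dirac (p = 1) and Bose-Einstein (p -> infinity).
Z_p(\epsilon) = \sum_{n=0}^{p} e^{-n\beta(\epsilon - \mu)}
              = \frac{1 - e^{-(p+1)\beta(\epsilon - \mu)}}{1 - e^{-\beta(\epsilon - \mu)}}
```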
Kent, Peter; Stochkendahl, Mette Jensen; Christensen, Henrik Wulff; Kongsted, Alice
2015-01-01
Recognition of homogeneous subgroups of patients can usefully improve prediction of their outcomes and the targeting of treatment. There are a number of research approaches that have been used to recognise homogeneity in such subgroups and to test their implications. One approach is to use statistical clustering techniques, such as Cluster Analysis or Latent Class Analysis, to detect latent relationships between patient characteristics. Influential patient characteristics can come from diverse domains of health, such as pain, activity limitation, physical impairment, social role participation, psychological factors, biomarkers and imaging. However, such 'whole person' research may result in data-driven subgroups that are complex, difficult to interpret and challenging to recognise clinically. This paper describes a novel approach to applying statistical clustering techniques that may improve the clinical interpretability of derived subgroups and reduce sample size requirements. This approach involves clustering in two sequential stages. The first stage involves clustering within health domains and therefore requires creating as many clustering models as there are health domains in the available data. This first stage produces scoring patterns within each domain. The second stage involves clustering using the scoring patterns from each health domain (from the first stage) to identify subgroups across all domains. We illustrate this using chest pain data from the baseline presentation of 580 patients. The new two-stage clustering resulted in two subgroups that approximated the classic textbook descriptions of musculoskeletal chest pain and atypical angina chest pain. The traditional single-stage clustering resulted in five clusters that were also clinically recognisable but displayed less distinct differences. In this paper, a new approach to using clustering techniques to identify clinically useful subgroups of patients is suggested. 
Research designs, statistical methods and outcome metrics suitable for performing that testing are also described. This approach has potential benefits but requires broad testing, in multiple patient samples, to determine its clinical value. The usefulness of the approach is likely to be context-specific, depending on the characteristics of the available data and the research question being asked of it.
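A minimal sketch of the two-stage idea, using k-means at both stages; the domain names and data are made up (the paper's chest-pain variables are not reproduced in the abstract):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

np.random.seed(0)  # kmeans2 draws its initial centroids from NumPy's RNG
rng = np.random.default_rng(2)
n = 120
# Hypothetical health domains; each is a patients-by-measures matrix.
domains = {
    "pain": rng.normal(0, 1, (n, 4)),
    "activity": rng.normal(0, 1, (n, 3)),
    "psychology": rng.normal(0, 1, (n, 5)),
}

# Stage 1: one clustering model per domain -> a scoring pattern per domain.
stage1 = np.column_stack(
    [kmeans2(data, 3, minit="++")[1] for data in domains.values()]
)

# Stage 2: cluster patients on their domain-level scoring patterns.
_, subgroup = kmeans2(stage1.astype(float), 2, minit="++")
print("subgroup sizes:", np.bincount(subgroup))
```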
Polarization-correlation analysis of maps of optical anisotropy biological layers
NASA Astrophysics Data System (ADS)
Ushenko, Yu. A.; Dubolazov, A. V.; Prysyazhnyuk, V. S.; Marchuk, Y. F.; Pashkovskaya, N. V.; Motrich, A. V.; Novakovskaya, O. Y.
2014-08-01
A new information-optical technique for diagnostics of the structure of polycrystalline films of bile is proposed. A model of the Mueller-matrix description of the mechanisms of optical anisotropy of such objects (optical activity, birefringence, and linear and circular dichroism) is suggested. The ensemble of informationally topical, azimuthally stable Mueller-matrix invariants is determined. Within the statistical analysis of the distributions of these parameters, objective criteria for differentiating films of bile taken from healthy donors and from patients with type 2 diabetes were determined. From the point of view of evidence-based medicine, the operational characteristics (sensitivity, specificity and accuracy) of the information-optical method of Mueller-matrix mapping of polycrystalline films of bile were found, and its efficiency in diagnosing the extent of type 2 diabetes was demonstrated. Prospects for applying this method to the diagnosis of cirrhosis are also considered.
Medical Image Retrieval Using Multi-Texton Assignment.
Tang, Qiling; Yang, Jirong; Xia, Xianfu
2018-02-01
In this paper, we present a multi-texton representation method for medical image retrieval, which utilizes the locality constraint to encode each filter bank response within its local-coordinate system consisting of the k nearest neighbors in the texton dictionary, and subsequently employs the spatial pyramid matching technique to implement feature vector representation. Compared with the traditional method of nearest-neighbor assignment followed by texton histogram statistics, our strategy reduces the quantization errors in the mapping process and adds information about the spatial layout of texton distributions, thus increasing the descriptive power of the image representation. We investigate the effects of different parameters on system performance in order to choose the appropriate ones for our datasets and carry out experiments on the IRMA-2009 medical collection and the mammographic patch dataset. The extensive experimental results demonstrate that the proposed method has superior performance.
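A sketch of the locality-constrained assignment step, with a random texton dictionary and a simple exponential weighting chosen for illustration (the paper's exact encoding rule is not specified in the abstract):

```python
import numpy as np

rng = np.random.default_rng(5)
dictionary = rng.normal(0, 1, (32, 8))  # 32 textons, 8-dim filter responses
response = rng.normal(0, 1, 8)          # one filter bank response to encode
k = 5

# Local-coordinate system: the k nearest textons to this response.
dists = np.linalg.norm(dictionary - response, axis=1)
nearest = np.argsort(dists)[:k]

# Soft weights over the neighborhood (one simple choice) instead of a
# hard nearest-neighbor assignment; this spreads quantization error.
weights = np.exp(-dists[nearest])
weights /= weights.sum()

code = np.zeros(len(dictionary))
code[nearest] = weights
print("nonzero code entries:", np.count_nonzero(code))
```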
Multi-Group Maximum Entropy Model for Translational Non-Equilibrium
NASA Technical Reports Server (NTRS)
Jayaraman, Vegnesh; Liu, Yen; Panesi, Marco
2017-01-01
The aim of the current work is to describe a new model for flows in translational non-equilibrium. Starting from the statistical description of a gas proposed by Boltzmann, the model relies on a domain decomposition technique in velocity space. Using the maximum entropy principle, the logarithm of the distribution function in each velocity sub-domain (group) is expressed with a power series in molecular velocity. New governing equations are obtained using the method of weighted residuals by taking the velocity moments of the Boltzmann equation. The model is applied to a spatially homogeneous Boltzmann equation with a Bhatnagar-Gross-Krook (BGK) model collision operator and the relaxation of an initial non-equilibrium distribution to a Maxwellian is studied using the model. In addition, numerical results obtained using the model for a 1D shock tube problem are also reported.
Vibrational Mode-Specific Reaction of Methane on a Nickel Surface
NASA Astrophysics Data System (ADS)
Beck, Rainer D.; Maroni, Plinio; Papageorgopoulos, Dimitrios C.; Dang, Tung T.; Schmid, Mathieu P.; Rizzo, Thomas R.
2003-10-01
The dissociation of methane on a nickel catalyst is a key step in steam reforming of natural gas for hydrogen production. Despite substantial effort in both experiment and theory, there is still no atomic-scale description of this important gas-surface reaction. We report quantum state-resolved studies, using pulsed laser and molecular beam techniques, of vibrationally excited methane reacting on the nickel (100) surface. For doubly deuterated methane (CD2H2), we observed that the reaction probability with two quanta of excitation in one C-H bond was greater (by as much as a factor of 5) than with one quantum in each of two C-H bonds. These results clearly exclude the possibility of statistical models correctly describing the mechanism of this process and attest to the importance of full-dimensional calculations of the reaction dynamics.
Vibrational mode-specific reaction of methane on a nickel surface.
Beck, Rainer D; Maroni, Plinio; Papageorgopoulos, Dimitrios C; Dang, Tung T; Schmid, Mathieu P; Rizzo, Thomas R
2003-10-03
The dissociation of methane on a nickel catalyst is a key step in steam reforming of natural gas for hydrogen production. Despite substantial effort in both experiment and theory, there is still no atomic-scale description of this important gas-surface reaction. We report quantum state-resolved studies, using pulsed laser and molecular beam techniques, of vibrationally excited methane reacting on the nickel (100) surface. For doubly deuterated methane (CD2H2), we observed that the reaction probability with two quanta of excitation in one C-H bond was greater (by as much as a factor of 5) than with one quantum in each of two C-H bonds. These results clearly exclude the possibility of statistical models correctly describing the mechanism of this process and attest to the importance of full-dimensional calculations of the reaction dynamics.
Takoo, Sarla; Chhugani, Manju; Sharma, Veena
2013-01-01
The present study was conducted to evaluate the effectiveness of an Information, Education and Communication (IEC) programme on the knowledge of pregnant mothers regarding prevention and management of warning signs during pregnancy in a selected health care setting in New Delhi. An evaluative research approach with a one-group pre-test and post-test design was adopted for the present study. A structured interview schedule was prepared. A purposive non-probability sampling technique was employed to interview 30 pregnant mothers who attended the antenatal clinic. The data gathered were analysed and interpreted using both descriptive and inferential statistics. The study revealed that there was maximum knowledge deficit regarding warning signs of pregnancy. The IEC programme was effective in enhancing the knowledge of pregnant mothers on prevention and management of warning signs during pregnancy.
Multispectral scanner system parameter study and analysis software system description, volume 2
NASA Technical Reports Server (NTRS)
Landgrebe, D. A. (Principal Investigator); Mobasseri, B. G.; Wiersma, D. J.; Wiswell, E. R.; Mcgillem, C. D.; Anuta, P. E.
1978-01-01
The author has identified the following significant results. The integration of the available methods provided the analyst with the unified scanner analysis package (USAP), the flexibility and versatility of which was superior to many previous integrated techniques. The USAP consisted of three main subsystems: (1) a spatial path, (2) a spectral path, and (3) a set of analytic classification accuracy estimators which evaluated the system performance. The spatial path consisted of satellite and/or aircraft data, data correlation analyzer, scanner IFOV, and random noise model. The output of the spatial path was fed into the analytic classification and accuracy predictor. The spectral path consisted of laboratory and/or field spectral data, EXOSYS data retrieval, optimum spectral function calculation, data transformation, and statistics calculation. The output of the spectral path was fed into the stratified posterior performance estimator.
Geodetic positioning using a global positioning system of satellites
NASA Technical Reports Server (NTRS)
Fell, P. J.
1980-01-01
Geodetic positioning using range, integrated Doppler, and interferometric observations from a constellation of twenty-four Global Positioning System satellites is analyzed. A summary of the proposals for geodetic positioning and baseline determination is given, which includes a description of measurement techniques and comments on rank deficiency and error sources. An analysis of variance comparison of range, Doppler, and interferometric time delay to determine their relative geometric strength for baseline determination is included. An analytic examination of the effect of a priori constraints on positioning using simultaneous observations from two stations is presented. Dynamic point positioning and baseline determination using range and Doppler is examined in detail. Models for the error sources influencing dynamic positioning are developed. Included is a discussion of atomic clock stability, and range and Doppler observation error statistics based on random correlated atomic clock error are derived.
Knowledge Discovery and Data Mining in Iran's Climatic Researches
NASA Astrophysics Data System (ADS)
Karimi, Mostafa
2013-04-01
Advances in measurement technology and data collection have made databases ever larger, and large databases require powerful tools for data analysis. The iterative process of acquiring knowledge from information obtained from data processing is carried out, in various forms, in all scientific fields. When the data volume is large, however, traditional methods cannot cope with many of the problems. In recent years the use of databases has expanded in various scientific fields, particularly atmospheric databases in climatology. In addition, the increasing amount of data generated by climate models poses a challenge for analysis aimed at extracting hidden patterns and knowledge. The approach to this problem developed in recent years uses the process of knowledge discovery in databases (KDD) and data mining techniques, drawing on concepts from machine learning, artificial intelligence and expert systems. Data mining is an analytical process for mining massive volumes of data; its ultimate goal is access to information and, finally, knowledge. Climatology is a science that uses varied and massive data, and the goal of climate data mining is to derive information from varied and massive atmospheric and non-atmospheric data. In fact, knowledge discovery performs these activities in a logical, predetermined and almost automatic process. The goal of this research is to survey the use of knowledge discovery and data mining techniques in Iranian climate research. To achieve this goal, a content (descriptive) analysis was carried out, classified by method and issue. The results show that in Iranian climatic research, clustering (mostly k-means and Ward's method) is the technique most often applied, and that precipitation and atmospheric circulation patterns are the issues most often addressed.
Although several studies of geographic and climatic issues have used statistical techniques such as clustering and pattern extraction, given the distinct natures of statistics and data mining one cannot yet say that Iranian climate studies employ data mining and knowledge discovery techniques as such. It is nevertheless necessary to apply the KDD approach and DM techniques in climatic studies, in particular to the interpretation of climate modeling results.
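As a concrete illustration of the clustering methods the survey found most common (k-means and Ward's method), here is a hedged sketch on a made-up station-by-variable matrix, not data from the paper:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(4)
# Hypothetical data: 40 stations described by 6 climate variables
# (e.g. monthly precipitation summaries).
stations = rng.normal(0, 1, (40, 6))

Z = linkage(stations, method="ward")  # Ward's hierarchical clustering
regions = fcluster(Z, t=4, criterion="maxclust")
print("stations per climate region:", np.bincount(regions)[1:])
```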
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
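A minimal sketch of the metamodeling idea, assuming a quadratic response surface fitted by least squares to a small sampled design; the "expensive" analysis here is a stand-in function, not any code from the survey:

```python
import numpy as np

def expensive_analysis(x1, x2):
    # Stand-in for a costly computer analysis code.
    return (1 - x1) ** 2 + 100 * (x2 - x1 ** 2) ** 2

# Design of experiments: a small grid of sample points.
g = np.linspace(-2.0, 2.0, 5)
X1, X2 = np.meshgrid(g, g)
x1, x2 = X1.ravel(), X2.ravel()
y = expensive_analysis(x1, x2)

# Response surface: quadratic basis fitted by least squares.
A = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x1 * x2, x2 ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def metamodel(p1, p2):
    # Cheap surrogate evaluated in place of the expensive analysis.
    return coef @ np.array([1.0, p1, p2, p1 ** 2, p1 * p2, p2 ** 2])

print("surrogate prediction at (0, 0):", metamodel(0.0, 0.0))
```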
Parallel auto-correlative statistics with VTK.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille
2013-08-01
This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
Effect of different mixing methods on the bacterial microleakage of calcium-enriched mixture cement.
Shahi, Shahriar; Jeddi Khajeh, Soniya; Rahimi, Saeed; Yavari, Hamid R; Jafari, Farnaz; Samiei, Mohammad; Ghasemi, Negin; Milani, Amin S
2016-10-01
Calcium-enriched mixture (CEM) cement is used in the field of endodontics. It is similar to mineral trioxide aggregate in its main ingredients. The present study investigated the effect of different mixing methods on the bacterial microleakage of CEM cement. A total of 55 single-rooted human permanent teeth were decoronated so that 14-mm-long samples were obtained and obturated with AH26 sealer and gutta-percha using the lateral condensation technique. Three millimeters of the root end were cut off and the samples were randomly divided into 3 groups of 15 each (3 mixing methods: amalgamator, ultrasonic and conventional) and 2 negative and positive control groups (each containing 5 samples). BHI (brain-heart infusion agar) suspension containing Enterococcus faecalis was used for bacterial leakage assessment. Statistical analysis was carried out using descriptive statistics, Kaplan-Meier survival analysis with censored data and the log rank test. Statistical significance was set at P<0.05. The survival means for the conventional, amalgamator and ultrasonic methods were 62.13±12.44, 68.87±12.79 and 77.53±12.52 days, respectively. The log rank test showed no significant differences between the groups. Based on the results of the present study it can be concluded that different mixing methods had no significant effect on the bacterial microleakage of CEM cement.
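The survival analysis used here can be sketched with a hand-rolled Kaplan-Meier product-limit estimator; the leakage times and censoring flags below are invented for illustration, not the study's data:

```python
import numpy as np

# Hypothetical days-to-bacterial-leakage for one group; 0 marks samples
# censored (no leakage observed by the end of follow-up).
times = np.array([30, 45, 45, 60, 62, 70, 75, 80, 80, 90])
observed = np.array([1, 1, 0, 1, 1, 0, 1, 1, 1, 0])

# Kaplan-Meier product-limit estimate. A stable sort preserves input
# order at tied times; events are listed before censored observations
# here, the usual convention.
order = np.argsort(times, kind="stable")
surv, at_risk = 1.0, len(times)
for event in observed[order]:
    if event:
        surv *= 1.0 - 1.0 / at_risk   # step down at each observed event
    at_risk -= 1
print(f"estimated survival at last follow-up: {surv:.3f}")
```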
de Agostino Biella Passos, Vivian; de Carvalho Carrara, Cleide Felício; da Silva Dalben, Gisele; Costa, Beatriz; Gomide, Marcia Ribeiro
2014-03-01
To evaluate the prevalence of fistulas after palate repair and analyze their location and association with possible causal factors. Retrospective analysis of patient records and evaluation of preoperative initial photographs. Tertiary craniofacial center. Five hundred eighty-nine individuals with complete unilateral cleft lip and palate who underwent palate repair at the age of 12 to 36 months by the von Langenbeck technique, in a single stage, by the plastic surgery team of the hospital, from January 2003 to July 2007. The cleft width was visually classified by a single examiner as narrow, regular, or wide. The following regions of the palate were considered for the location: anterior, medium, transition (between hard and soft palate), and soft palate. Descriptive statistics and analysis of association between the occurrence of fistula and the different parameters were evaluated. Palatal fistulas were observed in 27% of the sample, with a greater proportion at the anterior region (37.11%). The chi-square statistical test revealed a statistically significant association (P ≤ .05) between the fistulas and initial cleft width (P = .0003), intraoperative problems (P = .0037), and postoperative problems (P = .00002). The prevalence of palatal fistula was similar to mean values reported in the literature. Analysis of causal factors showed a positive association of palatal fistulas with wide and regular initial cleft width and with intraoperative and postoperative problems. The anterior region presented the greatest occurrence of fistulas.
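The association test reported here is the chi-square test on a contingency table; a sketch with illustrative counts (made up to total 589, not the paper's actual cross-tabulation):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Fistula occurrence versus initial cleft width; counts are invented.
#                 narrow  regular  wide
table = np.array([[5,     50,     90],    # fistula
                  [145,   190,    109]])  # no fistula
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, P = {p:.4g}")
```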
2010-01-01
Background To assess the reliability of the measurements obtained with the PalmScan™, when compared with another standardized A-mode ultrasound device, and assess the consistency and correlation between the two methods. Methods Transversal, descriptive, and comparative study. We recorded the axial length (AL), anterior chamber depth (ACD) and lens thickness (LT) obtained with two A-mode ultrasounds (PalmScan™ A2000 and Eye Cubed™) using an immersion technique. We compared the measurements with a two-sample t-test. Agreement between the two devices was assessed with Bland-Altman plots and 95% limits of agreement. Results 70 eyes of 70 patients were enrolled in this study. The measurements with the Eye Cubed™ of AL and ACD were shorter than the measurements taken by the PalmScan™. The differences were not statistically significant regarding AL (p < 0.4) but significant regarding ACD (p < 0.001). The highest agreement between the two devices was obtained during LT measurement. The PalmScan™ measurements were shorter, but not statistically significantly so (p < 0.2). Conclusions The values of AL and LT obtained with both devices are not identical, but within the limits of agreement. The agreement is not affected by the magnitude of the ocular dimensions (but only within a range of 20 mm to 27 mm of AL and 3.5 mm to 5.7 mm of LT). A correction of about 0.5 D could be considered if an intraocular lens is being calculated. However, due to the large variability of the results, the authors recommend discretion in using this conversion factor, and to adjust the power of the intraocular lenses based upon the personal experience of the surgeon. PMID:20334670
Velez-Montoya, Raul; Shusterman, Eugene Mark; López-Miranda, Miriam Jessica; Mayorquin-Ruiz, Mariana; Salcedo-Villanueva, Guillermo; Quiroz-Mercado, Hugo; Morales-Cantón, Virgilio
2010-03-24
To assess the reliability of the measurements obtained with the PalmScan, when compared with another standardized A-mode ultrasound device, and assess the consistency and correlation between the two methods. Transversal, descriptive, and comparative study. We recorded the axial length (AL), anterior chamber depth (ACD) and lens thickness (LT) obtained with two A-mode ultrasounds (PalmScan A2000 and Eye Cubed) using an immersion technique. We compared the measurements with a two-sample t-test. Agreement between the two devices was assessed with Bland-Altman plots and 95% limits of agreement. 70 eyes of 70 patients were enrolled in this study. The measurements with the Eye Cubed of AL and ACD were shorter than the measurements taken by the PalmScan. The differences were not statistically significant regarding AL (p < 0.4) but significant regarding ACD (p < 0.001). The highest agreement between the two devices was obtained during LT measurement. The PalmScan measurements were shorter, but not statistically significantly so (p < 0.2). The values of AL and LT obtained with both devices are not identical, but within the limits of agreement. The agreement is not affected by the magnitude of the ocular dimensions (but only within a range of 20 mm to 27 mm of AL and 3.5 mm to 5.7 mm of LT). A correction of about 0.5 D could be considered if an intraocular lens is being calculated. However, due to the large variability of the results, the authors recommend discretion in using this conversion factor, and to adjust the power of the intraocular lenses based upon the personal experience of the surgeon.
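The agreement analysis used here (Bland-Altman bias and 95% limits of agreement) reduces to a few lines; the axial-length values below are simulated stand-ins, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(3)
# Simulated axial lengths (mm) for 70 eyes measured by both devices.
eye_cubed = rng.normal(23.5, 1.0, 70)
palmscan = eye_cubed + rng.normal(0.05, 0.15, 70)  # small systematic offset

diff = palmscan - eye_cubed
bias = diff.mean()                          # mean difference between devices
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias = {bias:.3f} mm, limits of agreement = "
      f"[{loa[0]:.3f}, {loa[1]:.3f}] mm")
```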
Fossil diatoms and neogene paleolimnology
Bradbury, J. Platt
1988-01-01
Diatoms have played an important role in the development of Neogene continental biostratigraphy and paleolimnology since the mid-19th Century. The history of progress in Quaternary diatom biostratigraphy has developed as a result of improved coring techniques that enable sampling sediments beneath existing lakes coupled with improved chronological control (including radiometric dating and varve enumeration), improved statistical treatment of fossil diatom assemblages (from qualitative description to influx calculations of diatom numbers or volumes), and improved ecological information about analogous living diatom associations. The last factor, diatom ecology, is the most critical in many ways, but progresses slowly. Fortunately, statistical comparison of modern diatom assemblages and insightful studies of the nutrient requirements of some common freshwater species are enabling diatom paleolimnologists to make more detailed interpretations of the Quaternary record than had been possible earlier, and progress in the field of diatom biology and ecology will continue to refine paleolimnological studies. The greater age and geologic setting of Tertiary diatomaceous deposits has prompted their study in the contexts of geologic history, biochronology and evolution. The distribution of diatoms of marine affinities in continental deposits has given geologists insights about tectonism and sea-level change, and the distribution of distinctive (extinct?) diatoms has found utilization both in making stratigraphic correlations between outcrops of diatomaceous deposits and in various types of biochronological studies that involve dating deposits in different areas. A continental diatom biochronologic scheme will rely upon evolution, such as the appearance of new genera within a family, in combination with regional environmental changes that are responsible for the wide distribution of distinctive diatom species. 
The increased use of the scanning electron microscope for the detailed descriptions of fossil diatoms will provide the basis for making more accurate correlations and identifications, and the micromorphological detail for speculations about evolutionary relationships. © 1988.
Validation and extraction of molecular-geometry information from small-molecule databases.
Long, Fei; Nicholls, Robert A; Emsley, Paul; Gražulis, Saulius; Merkys, Andrius; Vaitkus, Antanas; Murshudov, Garib N
2017-02-01
A freely available small-molecule structure database, the Crystallography Open Database (COD), is used for the extraction of molecular-geometry information on small-molecule compounds. The results are used for the generation of new ligand descriptions, which are subsequently used by macromolecular model-building and structure-refinement software. To increase the reliability of the derived data, and therefore the new ligand descriptions, the entries from this database were subjected to very strict validation. The selection criteria made sure that the crystal structures used to derive atom types, bond and angle classes are of sufficiently high quality. Any suspicious entries at a crystal or molecular level were removed from further consideration. The selection criteria included (i) the resolution of the data used for refinement (entries solved at 0.84 Å resolution or higher) and (ii) the structure-solution method (structures must be from a single-crystal experiment and all atoms of generated molecules must have full occupancies), as well as basic sanity checks such as (iii) consistency between the valences and the number of connections between atoms, (iv) acceptable bond-length deviations from the expected values and (v) detection of atomic collisions. The derived atom types and bond classes were then validated using high-order moment-based statistical techniques. The results of the statistical analyses were fed back to fine-tune the atom typing. The developed procedure was repeated four times, resulting in fine-grained atom typing, bond and angle classes. The procedure will be repeated in the future as and when new entries are deposited in the COD. The whole procedure can also be applied to any source of small-molecule structures, including the Cambridge Structural Database and the ZINC database.
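The abstract does not spell out the moment-based validation procedure itself, but a generic screen of a single bond class, using sample moments to detect contaminated classes and flag outlying entries, might look like the sketch below. The bond lengths and the 2.5-sigma cut-off are illustrative assumptions, not values from the paper.

```python
import math

# Hypothetical bond-length class (in ångströms) with one suspect entry.
lengths = [1.52, 1.53, 1.51, 1.52, 1.54, 1.50, 1.53, 1.52, 1.51, 1.90]
n = len(lengths)
mean = sum(lengths) / n

# Central sample moments of the class
m2 = sum((x - mean) ** 2 for x in lengths) / n
m3 = sum((x - mean) ** 3 for x in lengths) / n
m4 = sum((x - mean) ** 4 for x in lengths) / n
sd = math.sqrt(m2)

skewness = m3 / sd ** 3                 # asymmetry of the class
excess_kurtosis = m4 / m2 ** 2 - 3.0    # > 0 suggests a heavy-tailed (contaminated) class

# Flag entries far from the class mean (2.5 sigma is an illustrative cut-off)
flagged = [x for x in lengths if abs(x - mean) / sd > 2.5]
```

High skewness or kurtosis for a class signals that it mixes distinct chemical environments or contains bad entries, which is the kind of feedback the paper describes using to fine-tune atom typing.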
A comparison of two- and three-dimensional stochastic models of regional solute movement
Shapiro, A.M.; Cvetkovic, V.D.
1990-01-01
Recent models of solute movement in porous media that are based on a stochastic description of the porous medium properties have been dedicated primarily to a three-dimensional interpretation of solute movement. In many practical problems, however, it is more convenient and consistent with measuring techniques to consider flow and solute transport as an areal, two-dimensional phenomenon. The physics of solute movement, however, is dependent on the three-dimensional heterogeneity in the formation. A comparison of two- and three-dimensional stochastic interpretations of solute movement in a porous medium having a statistically isotropic hydraulic conductivity field is investigated. To provide an equitable comparison between the two- and three-dimensional analyses, the stochastic properties of the transmissivity are defined in terms of the stochastic properties of the hydraulic conductivity. The variance of the transmissivity is shown to be significantly reduced in comparison to that of the hydraulic conductivity, and the transmissivity is spatially correlated over larger distances. These factors influence the two-dimensional interpretations of solute movement by underestimating the longitudinal and transverse growth of the solute plume in comparison to its description as a three-dimensional phenomenon. Although this analysis is based on small perturbation approximations and the special case of a statistically isotropic hydraulic conductivity field, it casts doubt on the use of a stochastic interpretation of the transmissivity in describing regional-scale movement. However, by assuming the transmissivity to be the vertical integration of the hydraulic conductivity field at a given position, the stochastic properties of the hydraulic conductivity can be estimated from the stochastic properties of the transmissivity and applied to obtain a more accurate interpretation of solute movement. © 1990 Kluwer Academic Publishers.
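The variance reduction that the abstract attributes to vertical averaging can be illustrated with a toy Monte Carlo. Everything here is hypothetical: the layer count, the correlation, the one-factor correlation model, and the use of averaged layer log-conductivities as a crude stand-in for the vertically integrated transmissivity.

```python
import random
import statistics

random.seed(1)

n_layers = 10     # vertical discretization of the formation (hypothetical)
rho = 0.3         # correlation between layer log-conductivities (hypothetical)
samples = 20000

point_vals, avg_vals = [], []
for _ in range(samples):
    common = random.gauss(0, 1)   # vertically correlated component
    layers = [rho ** 0.5 * common + (1 - rho) ** 0.5 * random.gauss(0, 1)
              for _ in range(n_layers)]
    point_vals.append(layers[0])              # point value (hydraulic-conductivity analog)
    avg_vals.append(sum(layers) / n_layers)   # vertical average (transmissivity analog)

var_point = statistics.pvariance(point_vals)
var_avg = statistics.pvariance(avg_vals)
# Closed form for this model with unit point variance:
# var_avg = rho + (1 - rho) / n_layers  (= 0.37 here, versus 1.0 for the point value)
```

The averaged field also inherits the long-range (common-factor) correlation while the layer-to-layer variability cancels, which is the qualitative behavior the paper reports for the transmissivity.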
Jorgenson, Andrew K; Clark, Brett
2013-01-01
This study examines the regional and temporal differences in the statistical relationship between national-level carbon dioxide emissions and national-level population size. The authors analyze panel data from 1960 to 2005 for a diverse sample of nations, and employ descriptive statistics and rigorous panel regression modeling techniques. Initial descriptive analyses indicate that all regions experienced overall increases in carbon emissions and population size during the 45-year period of investigation, but with notable differences. For carbon emissions, the sample of countries in Asia experienced the largest percent increase, followed by countries in Latin America, Africa, and lastly the sample of relatively affluent countries in Europe, North America, and Oceania combined. For population size, the sample of countries in Africa experienced the largest percent increase, followed by countries in Latin America, Asia, and the combined sample of countries in Europe, North America, and Oceania. Findings for two-way fixed effects panel regression elasticity models of national-level carbon emissions indicate that the estimated elasticity coefficient for population size is much smaller for nations in Africa than for nations in other regions of the world. Regarding potential temporal changes, from 1960 to 2005 the estimated elasticity coefficient for population size decreased by 25% for the sample of African countries, 14% for the sample of Asian countries, and 6.5% for the sample of Latin American countries, but remained unchanged for the sample of countries in Europe, North America, and Oceania. Overall, while population size continues to be the primary driver of total national-level anthropogenic carbon dioxide emissions, the findings of this study highlight the need for future research and policies to recognize that the actual impacts of population size on national-level carbon emissions differ across both time and region.
Maljaei, Ensiyeh; Pourkazemi, Maryam; Ghanizadeh, Milad; Ranjbar, Rana
2017-01-01
Introduction: During the early mixed dentition period, the location of the deciduous maxillary second molar renders the infiltration technique ineffective in this area. In such cases, administration of a posterior superior alveolar (PSA) nerve block is recommended; however, this technique has some complications. The present study was undertaken to compare the effects of buccal infiltration of 4% Articaine and the PSA technique with 2% Lidocaine on the success of anesthesia of maxillary deciduous second molars in 6- to 9-year-old children. Methods and Materials: In the present double-blind randomized clinical trial, 56 children aged 6-9 years requiring vital pulp therapy of a deciduous maxillary second molar were included. In group 1, 4% Articaine was injected using a buccal infiltration technique. In group 2, 2% Lidocaine was injected using the PSA nerve block technique. After 10 min, the caries was removed and access cavity preparation was instituted. The patients were asked to report the presence or absence of pain during the procedure; thus, the existence of pain was measured by the patient's self-report. Data were analyzed with descriptive statistical methods and the chi-squared test. Results: Pain was reported by 6 (21.4%) and 9 (32.1%) subjects in the Articaine and Lidocaine groups, respectively. The chi-squared test did not reveal any significant difference between the two groups (P=0.54). Conclusion: Under the limitations of the present study, there were no significant differences between the results of Articaine buccal infiltration and the Lidocaine PSA technique, so Articaine buccal infiltration can be used as a substitute for the PSA technique. PMID:28808450
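The reported comparison can be reproduced from the abstract's own counts with a 2×2 chi-squared test. Applying Yates' continuity correction is an assumption (the abstract does not say which variant was used), but it yields a p-value consistent with the reported P=0.54.

```python
import math

# Pain / no-pain counts from the abstract:
# Articaine: 6 of 28 reported pain; Lidocaine: 9 of 28.
table = [[6, 22],
         [9, 19]]

row = [sum(r) for r in table]
col = [sum(c) for c in zip(*table)]
total = sum(row)

chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row[i] * col[j] / total
        # Yates' continuity correction for a 2x2 table
        chi2 += (abs(table[i][j] - expected) - 0.5) ** 2 / expected

# p-value for 1 degree of freedom via the complementary error function
p = math.erfc(math.sqrt(chi2 / 2))
```

With these counts chi2 ≈ 0.364 and p ≈ 0.55, so the difference between the two anesthesia techniques is not significant at the 5% level.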
Description of the control system design for the SSF PMAD DC testbed
NASA Technical Reports Server (NTRS)
Baez, Anastacio N.; Kimnach, Greg L.
1991-01-01
The Power Management and Distribution (PMAD) DC Testbed Control System for Space Station Freedom was developed using a top down approach based on classical control system and conventional terrestrial power utilities design techniques. The design methodology includes the development of a testbed operating concept. This operating concept describes the operation of the testbed under all possible scenarios. A unique set of operating states was identified and a description of each state, along with state transitions, was generated. Each state is represented by a unique set of attributes and constraints, and its description reflects the degree of system security within which the power system is operating. Using the testbed operating states description, a functional design for the control system was developed. This functional design consists of a functional outline, a text description, and a logical flowchart for all the major control system functions. Described here are the control system design techniques, various control system functions, and the status of the design and implementation.
NASA Technical Reports Server (NTRS)
Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.
1992-01-01
The Penn State Finite Difference Time Domain Electromagnetic Scattering Code version D is a 3-D numerical electromagnetic scattering code based upon the finite difference time domain technique (FDTD). The manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction; description of the FDTD method; operation; resource requirements; version D code capabilities; a brief description of the default scattering geometry; a brief description of each subroutine; a description of the include file; a section briefly discussing Radar Cross Section computations; a section discussing some scattering results; a sample problem setup section; a new problem checklist; references and figure titles. The FDTD technique models transient electromagnetic scattering and interactions with objects of arbitrary shape and/or material composition. In the FDTD method, Maxwell's curl equations are discretized in time-space and all derivatives (temporal and spatial) are approximated by central differences.
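The central-difference updates the manual describes can be illustrated with a minimal one-dimensional free-space loop (this sketch is not the Penn State code, which is a full 3-D implementation; units are normalized and the Courant number is 1, and the grid size, source position, and pulse width are arbitrary choices).

```python
import math

nz, nt = 200, 120          # grid cells, time steps
ez = [0.0] * nz            # electric-field samples
hy = [0.0] * nz            # magnetic-field samples

for n in range(nt):
    # Update H from the spatial central difference of E
    for k in range(nz - 1):
        hy[k] += ez[k + 1] - ez[k]
    # Soft Gaussian source injected at the grid center
    ez[nz // 2] += math.exp(-((n - 30) / 8.0) ** 2)
    # Update E from the spatial central difference of H
    for k in range(1, nz):
        ez[k] += hy[k] - hy[k - 1]

peak = max(abs(v) for v in ez)
```

The leapfrog interleaving of the E and H updates is the essence of the FDTD method: each field is advanced in time from the spatial central difference of the other, exactly as Maxwell's curl equations prescribe after discretization.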
Maternal characteristics and immunization status of children in North Central of Nigeria
Adenike, Olugbenga-Bello; Adejumoke, Jimoh; Olufunmi, Oke; Ridwan, Oladejo
2017-01-01
Introduction Routine immunization coverage in Nigeria is one of the lowest national coverage rates in the world. The objective of this study was to compare the mother’ characteristics and the child’s Immunization status in some selected rural and urban communities in the North central part of Nigeria. Methods A descriptive cross sectional study, using a multistage sampling technique to select 600 respondent women with an index child between 0-12 months. Results Mean age of rural respondents was 31.40±7.21 years and 32.72+6.77 years among urban respondents, though there was no statistically significant difference in age between the 2 locations (p-0.762). One hundred and ninetyseven (65.7%) and 241(80.3%) of rural and urban respondents respectively were aware of immunization, the difference was statistically significant (p-0.016). knowledge in urban areas was better than among rural respondents. There was statistically significant association between respondents age, employment status, mothers' educational status and the child's immunization status (P<0.05), while variables like parity, age at marriage, marital status, No of children, household income and place of index were not statistically associated with immunization status as P>0.05. More than half 179(59.7%) of rural and 207(69.0%) of urban had good practice of immunization though the difference was not statistically significant (p-0.165) Conclusion The immunization coverage in urban community was better than that of the rural community. The result of this study has clearly indicated that mothers in Nigeria have improved on taking their children for immunization in both rural and urban area compared to previous reports PMID:28588745
WASP (Write a Scientific Paper) using Excel - 6: Standard error and confidence interval.
Grech, Victor
2018-03-01
The calculation of descriptive statistics includes the calculation of standard error and confidence interval, an inevitable component of data analysis in inferential statistics. This paper provides pointers as to how to do this in Microsoft Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
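The quantities the paper computes in Excel can be sketched in Python for comparison. The sample data below are made up, and the 95% interval uses the normal-approximation multiplier 1.96; for small samples a t critical value would be more appropriate.

```python
import math
import statistics

# Hypothetical sample; the paper's worked examples use Microsoft Excel instead.
data = [2, 4, 4, 4, 5, 5, 7, 9]
n = len(data)

mean = statistics.mean(data)     # arithmetic mean
sd = statistics.stdev(data)      # sample standard deviation (n - 1 denominator)
se = sd / math.sqrt(n)           # standard error of the mean

# 95% confidence interval, normal approximation (z = 1.96)
ci_low, ci_high = mean - 1.96 * se, mean + 1.96 * se
```

In Excel the same chain is AVERAGE, STDEV.S, division by SQRT(COUNT(...)), and the interval endpoints; the Python version is just a cross-check of the arithmetic.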
Education Statistics Quarterly, Fall 2002.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2003-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…
BLS Machine-Readable Data and Tabulating Routines.
ERIC Educational Resources Information Center
DiFillipo, Tony
This report describes the machine-readable data and tabulating routines that the Bureau of Labor Statistics (BLS) is prepared to distribute. An introduction discusses the LABSTAT (Labor Statistics) database and the BLS policy on release of unpublished data. Descriptions summarizing data stored in 25 files follow this format: overview, data…
Education Statistics Quarterly, Fall 2001.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2001-01-01
The publication gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message from…
ERIC Educational Resources Information Center
Tryon, Warren W.; Lewis, Charles
2009-01-01
Tryon presented a graphic inferential confidence interval (ICI) approach to analyzing two independent and dependent means for statistical difference, equivalence, replication, indeterminacy, and trivial difference. Tryon and Lewis corrected the reduction factor used to adjust descriptive confidence intervals (DCIs) to create ICIs and introduced…
Examples of Data Analysis with SPSS-X.
ERIC Educational Resources Information Center
MacFarland, Thomas W.
Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics using SPSS-X Release 3.0 for VAX/UNIX. Statistical measures covered include Chi-square analysis; Spearman's rank correlation coefficient; Student's t-test with two independent samples; Student's t-test with a paired sample; One-way analysis…
Statistics as Unbiased Estimators: Exploring the Teaching of Standard Deviation
ERIC Educational Resources Information Center
Wasserman, Nicholas H.; Casey, Stephanie; Champion, Joe; Huey, Maryann
2017-01-01
This manuscript presents findings from a study about the knowledge for and planned teaching of standard deviation. We investigate how understanding variance as an unbiased (inferential) estimator--not just a descriptive statistic for the variation (spread) in data--is related to teachers' instruction regarding standard deviation, particularly…
Education Statistics Quarterly. Volume 5, Issue 1.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2003-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data product, and funding opportunities developed over a 3-month period. Each issue also contains a message…
Education Statistics Quarterly, Winter 2001.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2002-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…
76 FR 60817 - Notice of Proposed Information Collection Requests
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... Statistics (NCES) is seeking a three-year clearance for a new survey data collection for the College... most recent data are available. The clearance being requested is to survey the institutions on this... and sector specific findings from the CATE using descriptive statistics. The main cost areas showing...
Basic Statistical Concepts and Methods for Earth Scientists
Olea, Ricardo A.
2008-01-01
INTRODUCTION Statistics is the science of collecting, analyzing, interpreting, modeling, and displaying masses of numerical data primarily for the characterization and understanding of incompletely known systems. Over the years, these objectives have led to a fair amount of analytical work to achieve, substantiate, and guide descriptions and inferences.
North Carolina Migrant Education Program. 1971 Project Evaluation Reports, Vol. I.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh.
Evaluation reports for 10 of the 23 1971 Summer Migrant Projects in North Carolina are presented in Volume I of this compilation. Each report contains the following information: (1) descriptive statistics and results of student achievement; (2) description of the project as obtained from site team reports and other available information; and (3)…
The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis
ERIC Educational Resources Information Center
Buri, Olga Elizabeth Minchala; Stefos, Efstathios
2017-01-01
The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and a multidimensional statistical analysis were carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…
Policymakers Dependence on Evidence in Education Decision Making in Oyo State Ministry of Education
ERIC Educational Resources Information Center
Babalola, Joel B.; Gbolahan, Sowunmi
2016-01-01
This study investigated policymakers' dependence on evidence in education decision making in the Oyo State Ministry of Education. The study was conducted under a descriptive survey design; 44 of the 290 policymakers of the Ministry and Board of Education across the state were purposively selected for the study. Descriptive statistics of frequency…
Comparison of online marketing techniques on food and beverage companies' websites in six countries.
Bragg, Marie A; Eby, Margaret; Arshonsky, Josh; Bragg, Alex; Ogedegbe, Gbenga
2017-10-26
Food and beverage marketing contributes to poor dietary choices among adults and children. As consumers spend more time on the Internet, food and beverage companies have increased their online marketing efforts. Studies have shown food companies' online promotions use a variety of marketing techniques to promote mostly energy-dense, nutrient-poor products, but no studies have compared the online marketing techniques and nutritional quality of products promoted on food companies' international websites. For this descriptive study, we developed a qualitative codebook to catalogue the marketing themes used on 18 international corporate websites associated with the world's three largest fast food and beverage companies (i.e. Coca-Cola, McDonald's, Kentucky Fried Chicken). Nutritional quality of foods featured on those websites was evaluated based on quantitative Nutrient Profile Index scores and food category (e.g. fried, fresh). Beverages were sorted into categories based on added sugar content. We report descriptive statistics to compare the marketing techniques and nutritional quality of products featured on the company websites for the food and beverage company websites in two high-income countries (HICs), Germany and the United States, two upper-middle-income countries (UMICs), China and Mexico, and two lower-middle-income countries (LMICs), India and the Philippines. Of the 406 screenshots captured from company websites, 67.8% depicted a food or beverage product. HICs' websites promoted diet food or beverage products/healthier alternatives (e.g. baked chicken sandwich) significantly more often on their pages (25%), compared to LMICs (14.5%). Coca-Cola featured diet products significantly more frequently on HIC websites compared to LMIC websites. Charities were featured more often on webpages in LMICs (15.4%) compared to UMICs (2.6%) and HICs (2.3%).
This study demonstrates that companies showcase healthier products in wealthier countries and advertise their philanthropic activities in lower income countries, which is concerning given the negative effect of nutrition transition (double burden of overnutrition and undernutrition) on burden of non-communicable diseases and obesity in lower income countries.
Analysis of statistical misconception in terms of statistical reasoning
NASA Astrophysics Data System (ADS)
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the era of globalization, because every person must be able to manage and use information from all over the world, which can now be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done at various levels of education. However, the skill remains low because many people, students included, assume that statistics is merely counting and applying formulas. Students also still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by examining the effect of students' misconceptions on statistical reasoning skill. The sample consisted of 32 mathematics education students who had taken the descriptive statistics course. The mean score on the misconception test was 49.7 (standard deviation 10.6), whereas the mean score on the statistical reasoning skill test was 51.8 (standard deviation 8.5). If 65 is taken as the minimum score for achieving the course competence, the students' mean scores fall below that standard. The misconception results indicate which subtopics deserve particular attention. Based on the assessment results, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. In statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
Application of pedagogy reflective in statistical methods course and practicum statistical methods
NASA Astrophysics Data System (ADS)
Julie, Hongki
2017-08-01
The subjects Elementary Statistics, Statistical Methods, and Statistical Methods Practicum aim to equip Mathematics Education students with descriptive and inferential statistics. An understanding of descriptive and inferential statistics is important for students in the Mathematics Education Department, especially those whose final project involves quantitative research. In quantitative research, students are required to present and describe quantitative data appropriately, to draw conclusions from the data, and to establish the relationships between the independent and dependent variables defined in their research. In fact, when students carried out final projects involving quantitative research, it was not rare to find them making mistakes in drawing conclusions and choosing the hypothesis-testing procedure; as a result, they reached incorrect conclusions. This is a fatal mistake for anyone doing quantitative research. Several outcomes were gained from implementing reflective pedagogy in the teaching and learning process of the Statistical Methods and Statistical Methods Practicum courses: 1. Twenty-two students passed the course and one student did not. 2. The highest grade, A, was achieved by 18 students. 3. According to all students, they were able to develop their critical stance and build care for one another through the learning process in this course. 4. All students agreed that through the learning process they underwent in the course, they could build care for one another.
A new instrument and technique for lung biopsy using local anaesthesia
Thompson, D. T.
1973-01-01
The more generally used techniques of lung biopsy are examined. There follows a description of a new instrument and technique for removing a small portion of lung using local anaesthesia. The technique, its advantages, and possible shortcomings are discussed. PMID:4731122
Designing an Error Resolution Checklist for a Shared Manned-Unmanned Environment
2010-06-01
Reframing Serial Murder Within Empirical Research.
Gurian, Elizabeth A
2017-04-01
Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.
NASA Astrophysics Data System (ADS)
Guala, M.; Liu, M.
2017-12-01
The kinematics of sediment particles is investigated by non-intrusive imaging methods to provide a statistical description of bedload transport in conditions near the threshold of motion. In particular, we focus on the cyclic transition between motion and rest regimes to quantify the waiting time statistics inferred to be responsible for anomalous diffusion, and so far elusive. Despite obvious limitations in the spatio-temporal domain of the observations, we are able to identify the probability distributions of the particle step time and length, velocity, acceleration, waiting time, and thus distinguish which quantities exhibit well converged mean values, based on the thickness of their respective tails. The experimental results shown here for four different transport conditions highlight the importance of the waiting time distribution and represent a benchmark dataset for the stochastic modeling of bedload transport.
NASA Astrophysics Data System (ADS)
Csordás, A.; Graham, R.; Szépfalusy, P.; Vattay, G.
1994-01-01
One wall of Artin's billiard on the Poincaré half-plane is replaced by a one-parameter (cp) family of nongeodetic walls. A brief description of the classical phase space of this system is given. In the quantum domain, the continuous and gradual transition from Poisson-like to Gaussian-orthogonal-ensemble (GOE) level statistics, due to the small perturbations breaking the symmetry responsible for the "arithmetic chaos" at cp=1, is studied. Another GOE-to-Poisson transition, due to the mixed phase space for large perturbations, is also investigated. A satisfactory description of the intermediate level statistics by the Brody distribution was found in both cases. The study supports the existence of a scaling region around cp=1. A finite-size scaling relation for the Brody parameter as a function of 1-cp and the number of levels considered can be established.
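The Brody distribution used above to describe the intermediate level statistics interpolates between the Poisson spacing law (Brody parameter q = 0) and the GOE Wigner surmise (q = 1). A quick numerical sketch of the distribution and a check of its normalization:

```python
import math

def brody(s, q):
    """Brody nearest-neighbor spacing distribution P_q(s).

    q = 0 reduces to the Poisson law exp(-s); q = 1 to the Wigner surmise.
    """
    b = math.gamma((q + 2) / (q + 1)) ** (q + 1)
    return (1 + q) * b * s ** q * math.exp(-b * s ** (1 + q))

def integral(q, upper=12.0, steps=24000):
    """Trapezoid-rule integral of P_q over [0, upper]; should be close to 1."""
    h = upper / steps
    total = 0.5 * (brody(0.0, q) + brody(upper, q))
    for i in range(1, steps):
        total += brody(i * h, q)
    return total * h
```

The normalization constant b is fixed so that the mean spacing is 1 for every q, which is what makes a single fitted q a convenient one-number summary of the Poisson-to-GOE transition.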
NASA Technical Reports Server (NTRS)
Field, G. B.
1974-01-01
Measurements are described of atmospheric conditions affecting astronomical observations at White Mountain, California. Measurements were made on more than 1400 occasions spread over more than 170 days at the Summit Laboratory, and on a small number of days at the Barcroft Laboratory. The recorded quantities were ten-micron sky noise and precipitable water vapor, plus wet- and dry-bulb temperatures, wind speed and direction, brightness of the sky near the sun, fisheye-lens photographs of the sky, descriptions of cloud cover and other observable parameters, color photographs of air pollution, astronomical seeing, and occasional determinations of the visible-light brightness of the night sky. Measurements of some of these parameters have been made for over twenty years at the Barcroft and Crooked Creek Laboratories, and statistical analyses were made of them. These results and interpretations are given. The bulk of the collected data are statistically analyzed, and the disposition of the detailed data is described. Most of the data are available in machine-readable form. A detailed discussion of the techniques proposed for operation at White Mountain is given, showing how to cope with the mountain and climatic problems.
Comparative analysis of profitability of honey production using traditional and box hives.
Al-Ghamdi, Ahmed A; Adgaba, Nuru; Herab, Ahmed H; Ansari, Mohammad J
2017-07-01
Information on the profitability and productivity of box hives is important to encourage beekeepers to adopt the technology. However, comparative analyses of the profitability and productivity of box and traditional hives are not adequately available. The study was carried out on 182 beekeepers using a cross-sectional survey and a random sampling technique. The data were analyzed using descriptive statistics, analysis of variance (ANOVA), the Cobb-Douglas (CD) production function and partial budgeting. The CD production function revealed that supplementary bee feeds, labor and medication were statistically significant for both box and traditional hives. Generally, labor for bee management, supplementary feeding, and medication led to productivity differences of approximately 42.83%, 7.52%, and 5.34%, respectively, between box and traditional hives. The study indicated that the productivity of box hives was 72% higher than that of traditional hives. The average net incomes of beekeepers using box and traditional hives were 33,699.7 SR/annum and 16,461.4 SR/annum respectively. The incremental net benefit of box hives over traditional hives was nearly double. Our results clearly show the importance of adopting box hives for better productivity in the beekeeping subsector.
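The Cobb-Douglas form used in the analysis can be sketched as follows. The coefficients and input levels here are hypothetical; in practice the elasticities are estimated by ordinary least squares on the log-linear form ln Q = ln A + Σ βᵢ ln xᵢ.

```python
def cobb_douglas(A, betas, inputs):
    """Q = A * x1^b1 * x2^b2 * ...; each beta is the output elasticity of its input."""
    q = A
    for b, x in zip(betas, inputs):
        q *= x ** b
    return q

# Hypothetical elasticities for labor, supplementary feed, and medication
A, betas = 2.0, (0.4, 0.2, 0.1)
q1 = cobb_douglas(A, betas, (10.0, 5.0, 2.0))
q2 = cobb_douglas(A, betas, (20.0, 10.0, 4.0))   # all inputs doubled

# With sum(betas) = 0.7 < 1, doubling every input scales output by 2^0.7
# (decreasing returns to scale).
ratio = q2 / q1
```

Because the coefficients are elasticities, a statement like "labor accounts for a 42.83% productivity difference" follows directly from the estimated β for labor and the observed input gap between hive types.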
Vaidya, Prutha; Mahale, Swapna; Badade, Pallavi; Warang, Ayushya; Kale, Sunila; Kalekar, Lavanya
2017-01-01
Widespread interest in epidermal ridges developed only in the last several decades; however, the field is still in its infancy in the world of dentistry. The word "dermatoglyphics" comes from two Greek words (derma: skin; glyphe: carve) and refers to the epidermal skin ridge formations which appear on the fingers, palms of the hands, and soles of the feet. This study aims to assess the relationship between fingerprints and chronic periodontitis. Two hundred patients were equally divided into a chronic periodontitis group and a periodontally healthy group. The fingerprint patterns of the participants were recorded with a rolling impression technique using duplicating ink on executive bond paper. The descriptive analysis of the data was presented as percentage frequency. The percentage frequencies of each pattern on each individual finger were calculated, and statistical tests were applied. The unpaired t-test was used for intergroup comparisons (P < 0.05). There were statistically significantly more whorls and fewer arches in both right and left hands in patients with chronic periodontitis. Dermatoglyphics can lead to early diagnosis, treatment, and better prevention of many genetic disorders of the oral cavity and other diseases whose etiology may be influenced directly or indirectly by genetic inheritance.
Gambling market and individual patterns of gambling in Germany.
Albers, N; Hübl, L
1997-01-01
In this paper individual patterns of gambling in Germany are estimated for the first time. The probit technique is used to test the influence of a set of individual characteristics on the probability of participating in each of the various legal games. A sample of 1,586 adults collected for the pool of German lotteries provides a reliable dataset. All disaggregated estimations of participation are statistically significant at least at the 5 percent level. The basic findings suggest that gambling is a widespread normal (superior) consumption good, because gambling participation tends to rise with income. Moreover, no demand anomaly can be found to justify assessing gambling as a social demerit. Only participation in gaming machines is higher among younger, unemployed and less educated adults. While a moral evaluation of gambling is beyond the scope of this paper, the legislator's preference for a highly taxed state monopoly in gambling markets is to be rejected, at least for Germany. Additional statistical findings suggest distinct consumer perceptions of the characteristics of the various games and may be used for market segmentation. The paper starts with a descriptive introduction to the German gambling market.
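The probit technique models the participation probability as Φ(xβ), the standard normal CDF of a linear index in the covariates. Below is a hedged sketch on synthetic data (the single `income` covariate, the coefficients, and the data-generating process are invented for illustration), fitted by maximizing the log-likelihood directly:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Synthetic probit data: P(participate = 1) = Phi(b0 + b1 * income)
rng = np.random.default_rng(1)
n = 1586
income = rng.normal(0, 1, n)
b0_true, b1_true = -0.5, 0.8
y = (rng.normal(0, 1, n) < b0_true + b1_true * income).astype(float)

def neg_loglik(beta):
    # Negative Bernoulli log-likelihood with probit link
    z = beta[0] + beta[1] * income
    p = np.clip(norm.cdf(z), 1e-10, 1 - 1e-10)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p)).sum()

fit = minimize(neg_loglik, x0=[0.0, 0.0])
b0_hat, b1_hat = fit.x
```

A positive estimated income coefficient is exactly the pattern the paper reads as gambling being a normal good; in practice a statistics package would also report standard errors for such a model.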
NASA Astrophysics Data System (ADS)
Al-Ajmi, R. M.; Abou-Ziyan, H. Z.; Mahmoud, M. A.
2012-01-01
This paper reports the results of a comprehensive study aimed at identifying the best neural network architecture and parameters to predict subcooled boiling characteristics of engine oils. A total of 57 different neural networks (NNs), derived from 14 different NN architectures, were evaluated for four different prediction cases. The NNs were trained on experimental datasets obtained for five engine oils of different chemical compositions. The performance of each NN was evaluated using a rigorous statistical analysis as well as careful examination of the smoothness of the predicted boiling curves. One NN, out of the 57 evaluated, correctly predicted the boiling curves for all cases considered, either for individual oils or for all oils taken together. It was found that the pattern selection and weight update techniques strongly affect the performance of the NNs. It was also revealed that the use of descriptive statistical analysis, such as R2, mean error, standard deviation, and T and slope tests, is a necessary but not sufficient condition for evaluating NN performance. The performance criteria should also include inspection of the smoothness of the predicted curves, either visually or by plotting the slopes of these curves.
García-Pérez, M A
2001-11-01
This paper presents an analysis of research published in the decade 1989-1998 by Spanish faculty members in the areas of statistical methods, research methodology, and psychometric theory. Database search and direct correspondence with faculty members in Departments of Methodology across Spain rendered a list of 193 papers published in these broad areas by 82 faculty members. These and other faculty members had actually published 931 papers over the decade of analysis, but 738 of them addressed topics not appropriate for description in this report. Classification and analysis of these 193 papers revealed topics that have attracted the most interest (psychophysics, item response theory, analysis of variance, sequential analysis, and meta-analysis) as well as other topics that have received less attention (scaling, factor analysis, time series, and structural models). A significant number of papers also dealt with various methodological issues (software, algorithms, instrumentation, and techniques). A substantial part of this report is devoted to describing the issues addressed across these 193 papers--most of which are written in the Spanish language and published in Spanish journals--and some representative references are given.
Jeukens, Cécile R L P N; Lalji, Ulrich C; Meijer, Eduard; Bakija, Betina; Theunissen, Robin; Wildberger, Joachim E; Lobbes, Marc B I
2014-10-01
Contrast-enhanced spectral mammography (CESM) shows promising initial results but comes at the cost of an increased dose compared with full-field digital mammography (FFDM). We aimed to quantitatively assess the dose increase of CESM in comparison with FFDM. Radiation exposure-related data (such as kilovoltage, compressed breast thickness, glandularity, entrance skin air kerma (ESAK), and average glandular dose (AGD)) were retrieved for 47 CESM and 715 FFDM patients. All examinations were performed on one mammography unit. Radiation dose values reported by the unit were validated by phantom measurements. Descriptive statistics of the patient data were generated using a statistical software package. Dose values reported by the mammography unit were in good qualitative agreement with those of phantom measurements. Mean ESAK was 10.5 mGy for a CESM exposure and 7.46 mGy for an FFDM exposure. Mean AGD was 2.80 mGy for a CESM exposure and 1.55 mGy for an FFDM exposure. Compared with our institutional FFDM, the AGD of a single CESM exposure is increased by 1.25 mGy (+81%), whereas ESAK is increased by 3.07 mGy (+41%). Dose values of both techniques meet the recommendations for maximum dose in mammography.
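The reported relative dose increases can be checked directly from the quoted means; a quick arithmetic sketch (small discrepancies against the abstract's absolute differences are expected, since the quoted means are rounded):

```python
# Mean doses quoted in the abstract (mGy)
agd_cesm, agd_ffdm = 2.80, 1.55     # average glandular dose
esak_cesm, esak_ffdm = 10.5, 7.46   # entrance skin air kerma

# Relative increase of CESM over FFDM, in percent
agd_pct = 100 * (agd_cesm - agd_ffdm) / agd_ffdm
esak_pct = 100 * (esak_cesm - esak_ffdm) / esak_ffdm
print(round(agd_pct), round(esak_pct))  # ≈ 81 and 41
```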
Substance Use as a Strong Predictor of Poor Academic Achievement among University Students
Fekadu, Wubalem; Mekonnen, Tefera Chane; Workie, Shimelash Bitew
2017-01-01
Background: Substance use is a growing concern globally, and its association with students' academic performance is not well studied. Objective: This study aimed to assess the prevalence of substance use (alcohol, tobacco, and khat) and its association with academic performance among university students. Methods: A cross-sectional study was conducted among Wolaita Sodo University students. A total of 747 students were selected using a cluster sampling technique. Data were collected by a pretested self-administered questionnaire and examined using descriptive statistics and linear regression with 95% confidence intervals. Variables with a p value of less than 0.05 were considered statistically significant. Result: The prevalence of substance use (alcohol, tobacco, and khat) was 28.6%. Substance use (current smoking, chewing khat at least weekly, drinking alcohol on a daily basis, and having an intimate friend who uses substances) was significantly and negatively associated with students' academic performance. Conclusion: Substance use among Wolaita Sodo University students was as common as in other studies in Sub-Saharan countries and negatively associated with students' academic achievement. The common practice of substance use and its association with poor academic performance demand that universities exercise good control of substances and implement youth-friendly activities. PMID:28680879
Modeling Stochastic Kinetics of Molecular Machines at Multiple Levels: From Molecules to Modules
Chowdhury, Debashish
2013-01-01
A molecular machine is either a single macromolecule or a macromolecular complex. In spite of the striking superficial similarities between these natural nanomachines and their man-made macroscopic counterparts, there are crucial differences. Molecular machines in a living cell operate stochastically in an isothermal environment far from thermodynamic equilibrium. In this mini-review we present a catalog of the molecular machines and an inventory of the essential toolbox for theoretically modeling these machines. The tool kits include 1), nonequilibrium statistical-physics techniques for modeling machines and machine-driven processes; and 2), statistical-inference methods for reverse engineering a functional machine from the empirical data. The cell is often likened to a microfactory in which the machineries are organized in modular fashion; each module consists of strongly coupled multiple machines, but different modules interact weakly with each other. This microfactory has its own automated supply chain and delivery system. Buoyed by the success achieved in modeling individual molecular machines, we advocate integration of these models in the near future to develop models of functional modules. A system-level description of the cell from the perspective of molecular machinery (the mechanome) is likely to emerge from further integrations that we envisage here. PMID:23746505
How to heat up from the cold: examining the preconditions for (unconscious) mood effects.
Ruys, Kirsten I; Stapel, Diederik A
2008-05-01
What are the necessary preconditions to make people feel good or bad? In this research, the authors aimed to uncover the bare essentials of mood induction. Several induction techniques exist, and most of these techniques demand a relatively high amount of cognitive capacity. Moreover, to be effective, most techniques require conscious awareness. The authors proposed that the common and defining element in all effective mood induction techniques is the dominating salience of evaluative tone over descriptive meaning. This evaluative-tone hypothesis was tested in two paradigms in which the evaluative meaning of the "primed" concept was more salient than its descriptive meaning (i.e., when subliminal stimulus exposure was so short that mainly the evaluative meaning was activated [see D. A. Stapel, W. Koomen, & K. I. Ruys, 2002] and when the primed concepts were sufficiently extreme such that evaluative meaning always dominated descriptive meaning). Explicit and implicit mood measures showed that the activation of a dominating evaluative tone affected people's mood states. Implications of these findings for theories on unconscious mood induction are discussed. (c) 2008 APA, all rights reserved
Football fever: goal distributions and non-Gaussian statistics
NASA Astrophysics Data System (ADS)
Bittner, E.; Nußbaumer, A.; Janke, W.; Weigel, M.
2009-02-01
Analyzing football score data with statistical techniques, we investigate how the not purely random, but highly co-operative nature of the game is reflected in averaged properties such as the probability distributions of goals scored by the home and away teams. As it turns out, especially the tails of the distributions are not well described by the Poissonian or binomial model resulting from the assumption of uncorrelated random events. Instead, a good effective description of the data is provided by less basic distributions such as the negative binomial one or the probability densities of extreme value statistics. To understand this behavior from a microscopic point of view, however, no waiting time problem or extremal process need be invoked. Instead, modifying the Bernoulli random process underlying the Poissonian model to include a simple component of self-affirmation seems to describe the data surprisingly well and makes it possible to understand the observed deviation from Gaussian statistics. The phenomenological distributions used before can be understood as special cases within this framework. We analyzed historical football score data from many leagues in Europe as well as from international tournaments, including data from all past tournaments of the “FIFA World Cup” series, and found the proposed models to be applicable rather universally. In particular, here we analyze the results of the German women’s premier football league and consider the two separate German men’s premier leagues in the East and West during the cold war times as well as the unified league after 1990 to see how scoring in football and the component of self-affirmation depend on cultural and political circumstances.
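As a hedged illustration of the paper's point about heavy tails, the sketch below fits a negative binomial distribution to a small, invented sample of per-match goal counts by the method of moments and compares its upper tail with a Poisson distribution of the same mean:

```python
import numpy as np
from scipy import stats

# Illustrative per-match goal counts (invented, not the paper's data)
goals = np.array([0, 1, 1, 2, 0, 3, 1, 2, 4, 0, 1, 2, 1, 0, 2, 3, 1, 5, 0, 1])
m, v = goals.mean(), goals.var(ddof=1)

# Method of moments for the negative binomial (scipy parameterization):
# mean = r(1-p)/p and variance = r(1-p)/p^2, hence p = mean/var, r = m*p/(1-p).
# This requires overdispersion (variance > mean), as observed in real score data.
p = m / v
r = m * p / (1 - p)

# Probability of a high-scoring match (more than 4 goals) under each model
tail_pois = 1 - stats.poisson.cdf(4, m)
tail_nb = 1 - stats.nbinom.cdf(4, r, p)
```

Whenever the sample variance exceeds the mean, the fitted negative binomial assigns more probability to high-scoring matches than the equal-mean Poisson model, which is the tail behavior the authors report.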
Cassetta, M; Altieri, F
2017-07-01
The brass wire ligature is an efficient method to correct a moderately mesially impacted mandibular second molar (MM2). The aim of this prospective clinical pilot study was to evaluate the influence of mandibular third molar (MM3) germectomy on the treatment time for this procedure and to determine its impact on oral health-related quality of life (OHRQoL) using the short-form Oral Health Impact Profile (OHIP-14). The STROBE guidelines were followed. Impacted MM2 were randomly assigned to receive brass wire ligature treatment either with germectomy (group A) or without germectomy (group B). Descriptive statistics and the Student t-test were used in the statistical analysis; significance was set at P ≤ 0.05. One thousand and thirty patients were assessed. Fourteen subjects with 20 mesially angulated (range 25-40°) impacted MM2 were identified. Paired comparisons of groups A and B showed no statistically significant difference in treatment time (171 days for group A and 174 days for group B; P = 0.440), but a statistically significant difference in OHIP-14 values at the 3-day (P = 0.017) and 7-day (P = 0.002) follow-ups. The brass wire technique can be used effectively in moderately impacted MM2, but the combined use of MM3 germectomy does not influence the treatment time and has a negative impact on OHRQoL. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.
Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo
2015-11-01
The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. A Visual Analogue Scale assessed the pain variable after anesthetic infiltration. Patient satisfaction was evaluated using a Likert scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Mean pain scores were higher for the conventional technique than for the computed one, 3.45 ± 2.73 versus 2.86 ± 1.96, respectively, but no statistically significant difference was found (P > 0.05). Patient satisfaction showed no statistically significant differences. The mean times to perform the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but did not show a statistically significant difference when contrasted with the conventional technique.
Antropov, K M; Varaksin, A N
2013-01-01
This paper provides a description of Land Use Regression (LUR) modeling and the results of its application in a study of nitrogen dioxide air pollution in Ekaterinburg. The paper describes the difficulties of modeling air pollution caused by motor vehicle exhaust, and the ways to address these challenges. To create the LUR model of NO2 air pollution in Ekaterinburg, concentrations of NO2 were measured, data on factors affecting air pollution were collected, and a statistical analysis of the data was performed. A statistical model of NO2 air pollution (coefficient of determination R2 = 0.70) and a map of pollution were created.
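A land-use regression model of this kind is, at its core, a multiple linear regression of measured concentrations on geographic predictors. A minimal sketch with synthetic data (the two predictors, the coefficients, and the noise level are invented; the abstract does not list the real model's predictor set):

```python
import numpy as np

# Synthetic LUR data: NO2 regressed on hypothetical land-use predictors,
# e.g. traffic intensity and (normalized) distance to a major road.
rng = np.random.default_rng(2)
n = 100
traffic = rng.uniform(0, 1, n)
dist = rng.uniform(0, 1, n)
no2 = 20 + 15 * traffic - 8 * dist + rng.normal(0, 2, n)  # measured concentrations

# Ordinary least squares fit and coefficient of determination R2
X = np.column_stack([np.ones(n), traffic, dist])
beta, *_ = np.linalg.lstsq(X, no2, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((no2 - pred) ** 2) / np.sum((no2 - no2.mean()) ** 2)
```

Once fitted, such a model can predict concentrations at unmeasured locations from their predictor values, which is how a pollution map like the one described is produced.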
NASA Astrophysics Data System (ADS)
Oktavia, Y.
2018-03-01
This research aims to: (1) analyze the level of socio-cultural dynamics of agribusiness aquaculture actors; (2) analyze the influence of socio-cultural dynamics on the convergence communication of capacity development of agribusiness aquaculture actors. Data were collected by questionnaire and interviews of group members in agribusiness. Data analysis was done by descriptive and inferential statistics using the SEM method. The descriptive statistics on 284 agribusiness members showed that the socio-cultural dynamics of agribusiness aquaculture actors were in the low category, as shown by the weak role of customary institutions and the quality of local leadership. Communication convergence is significantly and positively influenced by the communication behavior of agribusiness actors in accessing information.
Positional estimation techniques for an autonomous mobile robot
NASA Technical Reports Server (NTRS)
Nandhakumar, N.; Aggarwal, J. K.
1990-01-01
Techniques for positional estimation of a mobile robot navigating in an indoor environment are described. A comprehensive review of the various positional estimation techniques studied in the literature is first presented. The techniques are divided into four different types and each of them is discussed briefly. Two different kinds of environments are considered for positional estimation: mountainous natural terrain and an urban, man-made environment with polyhedral buildings. In both cases, the robot is assumed to be equipped with a single visual camera that can be panned and tilted, and a 3-D description (world model) of the environment is given. Such a description could be obtained from a stereo pair of aerial images or from the architectural plans of the buildings. Techniques for positional estimation using the camera input and the world model are presented.
Mazaheri, H; Ghaedi, M; Ahmadi Azqhandi, M H; Asfaram, A
2017-05-10
Analytical chemists apply statistical methods for both the validation and prediction of proposed models. Methods are required that are adequate for finding the typical features of a dataset, such as nonlinearities and interactions. Boosted regression trees (BRTs), as an ensemble technique, are fundamentally different from other conventional techniques, which aim to fit a single parsimonious model. In this work, BRT, artificial neural network (ANN) and response surface methodology (RSM) models have been used for the optimization and/or modeling of the stirring time (min), pH, adsorbent mass (mg) and concentrations of MB and Cd2+ ions (mg L-1) in order to develop respective predictive equations for simulation of the efficiency of MB and Cd2+ adsorption based on the experimental data set. Activated carbon, as an adsorbent, was synthesized from walnut wood waste which is abundant, non-toxic, cheap and locally available. This adsorbent was characterized using different techniques such as FT-IR, BET, SEM, point of zero charge (pHpzc) and also the determination of oxygen-containing functional groups. The influence of various parameters (i.e. pH, stirring time, adsorbent mass and concentrations of MB and Cd2+ ions) on the percentage removal was calculated by investigation of the sensitivity function, variable importance rankings (BRT) and analysis of variance (RSM). Furthermore, a central composite design (CCD) combined with a desirability function approach (DFA) as a global optimization technique was used for the simultaneous optimization of the effective parameters. The applicability of the BRT, ANN and RSM models for the description of the experimental data was examined using four statistical criteria (absolute average deviation (AAD), mean absolute error (MAE), root mean square error (RMSE) and coefficient of determination (R2)). All three models demonstrated good predictions in this study.
The BRT model was more precise than the other models, showing that BRT could be a powerful tool for modeling and optimizing the removal of MB and Cd(II). Sensitivity analysis (calculated from the weights of neurons in the ANN) confirmed that the adsorbent mass and pH were the essential factors affecting the removal of MB and Cd(II), with relative importances of 28.82% and 38.34%, respectively. A good agreement (R2 > 0.960) between the predicted and experimental values was obtained. Maximum removal (R% > 99) was achieved at an initial dye concentration of 15 mg L-1, a Cd2+ concentration of 20 mg L-1, a pH of 5.2, an adsorbent mass of 0.55 g and a time of 35 min.
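The four statistical criteria named above can be computed directly from observed and predicted values. A small sketch (the values are invented, and AAD is taken here as the mean absolute percentage deviation, one common definition; the paper's exact formula may differ):

```python
import numpy as np

def criteria(y_true, y_pred):
    """Return (AAD %, MAE, RMSE, R2) for paired observed/predicted values."""
    err = y_true - y_pred
    mae = np.mean(np.abs(err))                       # mean absolute error
    rmse = np.sqrt(np.mean(err ** 2))                # root mean square error
    aad = 100 * np.mean(np.abs(err / y_true))        # absolute average deviation
    r2 = 1 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    return aad, mae, rmse, r2

y_true = np.array([90.0, 95.0, 99.0, 85.0, 92.0])  # e.g. observed removal %
y_pred = np.array([91.0, 94.0, 98.5, 86.0, 91.5])  # e.g. model predictions
aad, mae, rmse, r2 = criteria(y_true, y_pred)
```

RMSE always equals or exceeds MAE (it penalizes large errors more heavily), which is why reporting both gives a rough sense of how concentrated the errors are.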
Martín-González, Jenifer; Echevarría-Pérez, Marta; Sánchez-Domínguez, Benito; Tarilonte-Delgado, Maria L.; Castellanos-Cosano, Lizett; López-Frías, Francisco J.
2012-01-01
Objective: To analyse the influence of root canal instrumentation and obturation techniques on intra-operative pain experienced by patients during endodontic therapy. Method and Materials: A descriptive cross-sectional study was carried out in Ponferrada and Sevilla, Spain, including 80 randomly recruited patients (46 men and 34 women) with ages ranging from 10 to 74 years. Patient gender and age, affected tooth, pulpal diagnosis, periapical status, previous NSAID or antibiotic (AB) treatment, and root canal instrumentation and obturation techniques were recorded. After root canal treatment (RCT), patients completed a 10-cm visual analogue scale (VAS) that ranked the level of pain. Results were analysed statistically using the chi-square and ANOVA tests and logistic regression analysis. Results: The mean pain level during root canal treatment was 2.9 ± 3.0 (median = 2) on a VAS between 0 and 10. Forty percent of patients experienced no pain. Gender, age, arch, previous NSAID or AB treatment and anaesthetic type did not significantly influence the pain level (p > 0.05). Pain during root canal treatment was significantly greater in molar teeth (OR = 10.1; 95% C.I. = 1.6-63.5; p = 0.013). Root canal instrumentation and obturation techniques did not significantly affect patients' pain during root canal treatment (p > 0.05). Conclusion: Patients feel more pain when RCT is carried out on molar teeth. The root canal instrumentation and obturation techniques do not significantly affect patients' pain during RCT. Key words: Anaesthesia, endodontic pain, pulpitis, root canal instrumentation, root canal obturation, rotary files. PMID:22549694
Toth, Thomas L; Lee, Malinda S; Bendikson, Kristin A; Reindollar, Richard H
2017-04-01
To better understand practice patterns and opportunities for standardization of embryo transfer (ET), an anonymous 82-question cross-sectional survey was emailed to the medical directors of 286 Society for Assisted Reproductive Technology member IVF practices. A follow-up survey composed of three questions specific to ET technique was emailed to the same medical directors. Descriptive statistics of the results were compiled. The survey assessed policies, protocols, restrictions, and specifics pertinent to the technique of ET. There were 117 (41%) responses; 32% practice in academic settings and 68% in private practice. Responders were experienced clinicians, half of whom had performed fewer than 10 procedures during training. Ninety-eight percent of practices allowed all practitioners to perform ET; half did not follow a standardized ET technique. Multiple steps in the ET process were identified as "highly conserved"; others demonstrated discordance. ET technique is divided among [1] trial transfer followed immediately by ET (40%); [2] afterload transfer (30%); and [3] direct transfer without prior trial or afterload (27%). Embryos are discharged in the upper (66%) and middle thirds (29%) of the endometrial cavity and not closer than 1-1.5 cm from the fundus (87%). Details of each step were reported and allowed the development of a "common" practice ET procedure. ET training and practices vary widely. Improved training and standardization based on outcomes data and best practices are warranted. A common practice procedure is suggested for validation by a systematic literature review. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Harper, Roosevelt
2014-01-01
This research study examined the specific categories of IT control deficiencies and their related effects on financial reporting. The approach to this study was considered non-experimental, an approach sometimes called descriptive. Descriptive statistics are used to describe the basic features of the data in a study, providing simple summaries…
ERIC Educational Resources Information Center
Mercado, Claudia
2012-01-01
The purpose of this study was to learn more about the Hispanic students attending Northeastern Illinois University, a four-year institution in Chicago, IL, and their student success. Little is known descriptively and statistically about this population at NEIU, which serves as a Hispanic-Serving Institution. In addition, little is known about…
ERIC Educational Resources Information Center
Ho, Andrew D.; Yu, Carol C.
2015-01-01
Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micerri similarly showed that the normality assumption is met rarely in educational and psychological…
An Analysis of Research Trends in Dissertations and Theses Studying Blended Learning
ERIC Educational Resources Information Center
Drysdale, Jeffery S.; Graham, Charles R.; Spring, Kristian J.; Halverson, Lisa R.
2013-01-01
This article analyzes the research of 205 doctoral dissertations and masters' theses in the domain of blended learning. A summary of trends regarding the growth and context of blended learning research is presented. Methodological trends are described in terms of qualitative, inferential statistics, descriptive statistics, and combined approaches…
In Search of the Most Likely Value
ERIC Educational Resources Information Center
Letkowski, Jerzy
2014-01-01
Descriptive statistics provide methodology and tools for user-friendly presentation of random data. Among the summary measures that describe central tendencies in random data, the mode is given the least amount of attention, and it is frequently misinterpreted in many introductory textbooks on statistics. The purpose of the paper is to provide a…
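A small sketch of two properties of the mode that introductory texts often gloss over: it applies to categorical data, and it need not be unique. The helper below is illustrative, not taken from the paper:

```python
from collections import Counter

def modes(data):
    """Return all most-frequent values, sorted; a sample may be multimodal."""
    counts = Counter(data)
    top = max(counts.values())
    return sorted(v for v, c in counts.items() if c == top)

print(modes([2, 3, 3, 5, 7, 3, 5]))           # [3]
print(modes(["red", "blue", "red", "blue"]))  # ['blue', 'red'] -- two modes
```

Unlike the mean, the mode is well defined for nominal data such as color names, and unlike the median it requires no ordering at all.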
[Artificial neural networks for decision making in urologic oncology].
Remzi, M; Djavan, B
2007-06-01
This chapter presents a detailed introduction to Artificial Neural Networks (ANNs) and their contribution to modern urologic oncology. It includes a description of ANN methodology and points out the differences between Artificial Intelligence and traditional statistical models in terms of usefulness for patients and clinicians, and its advantages over current statistical analysis.
Financial Statistics, 1980-81. Our Colleges and Universities Today. Volume XIX, Number 8.
ERIC Educational Resources Information Center
Hottinger, Gerald W.
Financial statistics for Pennsylvania colleges and universities for the fiscal year (FY) ending 1981, for 1971-1972 through 1980-1981, and for 1977-1978 through 1980-1981 are presented, along with narrative descriptions of financial trends at the institutions. Information includes the following: current-funds revenues by institutional control;…
ERIC Educational Resources Information Center
Spinella, Sarah
2011-01-01
Because result replicability is essential to science and difficult to achieve through external replication, the present paper notes the insufficiency of null hypothesis statistical significance testing (NHSST) and explains the bootstrap as a plausible alternative, with a heuristic example to illustrate the bootstrap method. The bootstrap relies on…
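A minimal sketch of the bootstrap method described above: resample the observed data with replacement many times, recompute the statistic of interest each time, and read a confidence interval off the percentiles of the resampled statistics (the data values and replicate count here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
sample = np.array([4.1, 5.0, 3.8, 6.2, 5.5, 4.9, 5.3, 4.4, 6.0, 5.1])

# 5000 bootstrap replicates of the sample mean
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(5000)
])

# 95% percentile confidence interval for the mean
lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

Because each replicate simulates drawing a new sample, the spread of `boot_means` stands in for the sampling distribution, giving an internal check on replicability without collecting new data.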
Difficulties in Learning and Teaching Statistics: Teacher Views
ERIC Educational Resources Information Center
Koparan, Timur
2015-01-01
The purpose of this study is to define teacher views about the difficulties in learning and teaching middle school statistics subjects. To serve this aim, a number of interviews were conducted with 10 middle school maths teachers in 2011-2012 school year in the province of Trabzon. Of the qualitative descriptive research methods, the…
Seed Dispersal Near and Far: Patterns Across Temperate and Tropical Forests
James S. Clark; Miles Silman; Ruth Kern; Eric Macklin; Janneke HilleRisLambers
1999-01-01
Dispersal affects community dynamics and vegetation response to global change. Understanding these effects requires descriptions of dispersal at local and regional scales and statistical models that permit estimation. Classical models of dispersal describe local or long-distance dispersal, but not both. The lack of statistical methods means that models have rarely been...
ERIC Educational Resources Information Center
Farmer, Tod Allen
2012-01-01
The study assessed the need for learning organizations to implement evidence-based policies and practices designed to enhance the academic and social success of Hispanic learners. Descriptive statistics and longitudinal data from the National Center for Educational Statistics (NCES) and the National Clearinghouse for English Language Acquisition…
Damned Lies. And Statistics. Otto Neurath and Soviet Propaganda in the 1930s.
ERIC Educational Resources Information Center
Chizlett, Clive
1992-01-01
Examines the philosophical and historical context in which Otto Neurath (1882-1945) worked. Examines critically (in the light of descriptive statistics) the principles of his Isotype Picture Language. Tests Neurath's personal credibility and scientific integrity by looking at his contributions to Soviet propaganda in the early 1930s. (SR)
Forest statistics for the upper Koyukuk River, Alaska, 1971.
Karl M. Hegg
1974-01-01
Area and volume statistics from the first intensive forest inventory of the upper Koyukuk River drainage, in north-central Alaska, are given. Observations are made on forest location, description, defect, regeneration, growth, and mortality. Commercial forests, although generally restricted to a narrow band along drainages, were found as far as 70 miles (113 kilometers...
Education Statistics Quarterly. Volume 4 Issue 4, 2002.
ERIC Educational Resources Information Center
National Center for Education Statistics, 2002
2002-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…
Fotina, I; Lütgendorf-Caucig, C; Stock, M; Pötter, R; Georg, D
2012-02-01
Inter-observer studies represent a valid method for the evaluation of target definition uncertainties and contouring guidelines. However, data from the literature do not yet give clear guidelines for reporting contouring variability. Thus, the purpose of this work was to compare and discuss various methods to determine variability on the basis of clinical cases and a literature review. In this study, 7 prostate and 8 lung cases were contoured on CT images by 8 experienced observers. Analysis of variability included descriptive statistics, calculation of overlap measures, and statistical measures of agreement. Cross tables with ratios and correlations were established for overlap parameters. It was shown that the minimal set of parameters to be reported should include at least one of three volume overlap measures (i.e., generalized conformity index, Jaccard coefficient, or conformation number). High correlation between these parameters and scatter of the results was observed. A combination of descriptive statistics, overlap measure, and statistical measure of agreement or reliability analysis is required to fully report the interrater variability in delineation.
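Of the overlap measures mentioned, the Jaccard coefficient is the simplest to state: the intersection of two delineated volumes divided by their union. A toy sketch on 2-D binary masks standing in for two observers' contours (the masks are invented for illustration):

```python
import numpy as np

# Two observers' contours of the same target, as boolean masks on a 10x10 grid
a = np.zeros((10, 10), dtype=bool)
a[2:7, 2:7] = True   # observer A: 5x5 region
b = np.zeros((10, 10), dtype=bool)
b[3:8, 3:8] = True   # observer B: 5x5 region, shifted by one voxel

inter = np.logical_and(a, b).sum()   # voxels both observers included
union = np.logical_or(a, b).sum()    # voxels either observer included
jaccard = inter / union              # 1.0 = perfect agreement, 0.0 = disjoint
```

With more than two observers, a generalized conformity index extends this intersection-over-union idea by averaging agreement over all observer pairs.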
Carter, Laura; Wilson, Stephen; Tumer, Erwin G
2010-01-01
The purpose of this retrospective chart review was to document sedation and analgesic medications administered preoperatively, intraoperatively, and during postanesthesia care for children undergoing dental rehabilitation using general anesthesia (GA). Patient gender, age, procedure type performed, and ASA status were recorded from the medical charts of children undergoing GA for dental rehabilitation. The sedative and analgesic drugs administered pre-, intra-, and postoperatively were recorded. Statistical analysis included descriptive statistics and cross-tabulation. A sample of 115 patients with a mean age of 64 (±30) months was studied; 47% were females, and 71% were healthy. Over 80% of the patients were administered medications primarily during pre- and intraoperative phases, with fewer than 25% receiving medications postoperatively. Morphine and fentanyl were the most frequently administered agents intraoperatively. The procedure type, gender, and health status were not statistically associated with the number of agents administered. Younger patients, however, were statistically more likely to receive additional analgesic medications. Our study suggests that a minority of patients have postoperative discomfort in the postanesthesia care unit; mild to moderate analgesics were administered during intraoperative phases of dental rehabilitation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neilson, James R.; McQueen, Tyrel M.
With the increased availability of high-intensity time-of-flight neutron and synchrotron X-ray scattering sources that can access wide ranges of momentum transfer, the pair distribution function method has become a standard analysis technique for studying disorder of local coordination spheres and at intermediate atomic separations. In some cases, rational modeling of the total scattering data (Bragg and diffuse) becomes intractable with least-squares approaches, necessitating reverse Monte Carlo simulations using large atomistic ensembles. However, the extraction of meaningful information from the resulting atomistic ensembles is challenging, especially at intermediate length scales. Representational analysis is used here to describe the displacements of atoms in reverse Monte Carlo ensembles from an ideal crystallographic structure in an approach analogous to tight-binding methods. Rewriting the displacements in terms of a local basis that is descriptive of the ideal crystallographic symmetry provides a robust approach to characterizing medium-range order (and disorder) and symmetry breaking in complex and disordered crystalline materials. Lastly, this method enables the extraction of statistically relevant displacement modes (orientation, amplitude and distribution) of the crystalline disorder and provides directly meaningful information in a locally symmetry-adapted basis set that is most descriptive of the crystal chemistry and physics.
2015-09-20
Free-Form Region Description with Second-Order Pooling.
Carreira, João; Caseiro, Rui; Batista, Jorge; Sminchisescu, Cristian
2015-06-01
Semantic segmentation and object detection are nowadays dominated by methods operating on regions obtained as a result of a bottom-up grouping process (segmentation) but use feature extractors developed for recognition on fixed-form (e.g. rectangular) patches, with full images as a special case. This is most likely suboptimal. In this paper we focus on feature extraction and description over free-form regions and study the relationship with their fixed-form counterparts. Our main contributions are novel pooling techniques that capture the second-order statistics of local descriptors inside such free-form regions. We introduce second-order generalizations of average and max-pooling that together with appropriate non-linearities, derived from the mathematical structure of their embedding space, lead to state-of-the-art recognition performance in semantic segmentation experiments without any type of local feature coding. In contrast, we show that codebook-based local feature coding is more important when feature extraction is constrained to operate over regions that include both foreground and large portions of the background, as typical in image classification settings, whereas for high-accuracy localization setups, second-order pooling over free-form regions produces results superior to those of the winning systems in the contemporary semantic segmentation challenges, with models that are much faster in both training and testing.
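The second-order average pooling described above can be sketched as the mean of outer products of a region's local descriptors, followed by a matrix logarithm reflecting the geometry of the space of symmetric positive-definite matrices. This is a minimal reimplementation of that idea, not the authors' code; the regularization constant and the random test features are assumptions.

```python
import numpy as np

def second_order_avg_pool(descriptors, eps=1e-6):
    """Second-order average pooling: mean of outer products of local
    descriptors, mapped by the matrix logarithm (log-Euclidean metric).
    `descriptors` is an (n, d) array of local features from one region."""
    X = np.asarray(descriptors, dtype=float)
    G = X.T @ X / len(X)                 # (d, d) average outer product
    G += eps * np.eye(G.shape[0])        # regularize to keep it SPD
    w, V = np.linalg.eigh(G)             # symmetric eigendecomposition
    return V @ np.diag(np.log(w)) @ V.T  # principal matrix logarithm

rng = np.random.default_rng(0)
feats = rng.normal(size=(50, 4))         # 50 local descriptors, d = 4
pooled = second_order_avg_pool(feats)    # symmetric (4, 4) region feature
```

The pooled matrix (or its upper triangle, flattened) would then feed a linear classifier, which is the role second-order pooling plays in the recognition pipeline described.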
Frequency of depression, anxiety and stress among the undergraduate physiotherapy students
Syed, Annosha; Ali, Syed Shazad; Khan, Muhammad
2018-01-01
Objectives: To assess the frequency of depression, anxiety and stress (DAS) among undergraduate physiotherapy students. Methods: A descriptive cross-sectional study was conducted among undergraduate physiotherapy students in various physiotherapy institutes in Sindh, Pakistan. The total duration of this study was 4 months, from September 2016 to January 2017. Data were collected from 267 students with no physical or mental illness; 75.3% were female. Students were selected through non-probability purposive sampling. A self-administered, standardized DASS (Depression, Anxiety and Stress Scale) was used to collect data, and results were analyzed using its severity rating index. Data were entered and analyzed using SPSS version 21. Descriptive statistics, including the frequencies of depression, anxiety and stress and the demographic characteristics of the participants, were computed. Results: The mean age of students was 19.3371 ± 1.18839 years. The frequencies of depression, anxiety and stress among undergraduate physiotherapy students were 48.0%, 68.54% and 53.2%, respectively. Conclusions: The frequencies of depression, anxiety and stress among physiotherapy undergraduates were high. This suggests an urgent need to carry out evidence-based psychological health promotion for undergraduate physiotherapy students to control this growing problem. PMID:29805428
Weekes, D P; Kagan, S H; James, K; Seboni, N
1993-01-01
The purpose of this study was to understand the phenomenon of hand holding as a coping strategy used by adolescents to deal with treatment-related pain. The convenience sample consisted of 20 adolescents whose ages were 11 to 19 years: 10 had cancer and 10 had renal disease (this served as the comparison group). Using a descriptive design, a semistructured interview was conducted with each adolescent. To supplement and support interview data, structured observations were conducted as adolescents underwent painful treatments (eg, blood draws, shunt placement, peripheral chemotherapy, lumbar punctures, and bone marrow aspirations). Data were analyzed using descriptive statistics and qualitative analytic techniques similar to those delineated by Strauss and Corbin. The results of this study indicated that subjects in both the cancer and the renal disease group perceived hand holding to be a very effective coping strategy in ameliorating treatment-related pain. Overwhelmingly the patients preferred to hold their mother's hand. When the mother was unavailable, they preferred to hold a specific nurse's hand. Hand holding functioned to reduce tension associated with impending treatments, as a source of distraction, and as a source of security. Accordingly, adolescents' subjective experience of treatment-related pain was reduced when they felt more secure, less tense, and were distracted.
Avecilla-Ramírez, G N; Ruiz-Correa, S; Marroquin, J L; Harmony, T; Alba, A; Mendoza-Montoya, O
2011-12-01
This study presents evidence suggesting that electrophysiological responses to language-related auditory stimuli recorded at 46 weeks postconceptional age (PCA) are associated with language development, particularly in infants with periventricular leukomalacia (PVL). In order to investigate this hypothesis, electrophysiological responses to a set of auditory stimuli consisting of series of syllables and tones were recorded from a population of infants with PVL at 46 weeks PCA. A communicative development inventory (i.e., parent report) was applied to this population during a follow-up study performed at 14 months of age. The results of this later test were analyzed with a statistical clustering procedure, which resulted in two well-defined groups identified as the high-score (HS) and low-score (LS) groups. The event-induced power of the EEG data recorded at 46 weeks PCA was analyzed using a dimensionality reduction approach, resulting in a new set of descriptive variables. The LS and HS groups formed well-separated clusters in the space spanned by these descriptive variables, which can therefore be used to predict whether a new subject will belong to either of these groups. A predictive classification rate of 80% was obtained by using a linear classifier that was trained with a leave-one-out cross-validation technique. 2011 Elsevier Inc. All rights reserved.
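The validation protocol described, a linear classifier assessed with leave-one-out cross-validation, can be sketched as follows. This is an illustrative nearest-class-mean classifier on synthetic, perfectly separable data, not the study's pipeline or its EEG-derived descriptive variables.

```python
import numpy as np

def loo_accuracy(X, y):
    """Leave-one-out cross-validation of a nearest-class-mean (linear)
    classifier: each subject is held out once and predicted from the rest."""
    X, y = np.asarray(X, float), np.asarray(y)
    correct = 0
    for i in range(len(X)):
        mask = np.arange(len(X)) != i               # training fold
        Xtr, ytr = X[mask], y[mask]
        centroids = {c: Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
        pred = min(centroids, key=lambda c: np.linalg.norm(X[i] - centroids[c]))
        correct += int(pred == y[i])
    return correct / len(X)

# Two well-separated synthetic groups standing in for "LS" and "HS"
X = np.vstack([np.zeros((10, 2)), np.ones((10, 2))])
y = np.array(["LS"] * 10 + ["HS"] * 10)
acc = loo_accuracy(X, y)   # perfectly separable, so accuracy is 1.0
```

With overlapping clusters, as in real data, the accuracy drops toward the kind of 80% classification rate the study reports.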
Nieuwenhuijsen, Mark; Paustenbach, Dennis; Duarte-Davidson, Raquel
2006-12-01
The field of exposure assessment has matured significantly over the past 10-15 years. Dozens of studies have measured the concentrations of numerous chemicals in many media to which humans are exposed. Others have catalogued the various exposure pathways and identified typical values which can be used in the exposure calculations for the general population, such as the amount of water or soil ingested per day or the percent of a chemical that can pass through the skin. In addition, studies of the duration of exposure for many tasks (e.g. showering, jogging, working in the office) have been conducted which allow for more general descriptions of the likely range of exposures. All of this information, as well as the development of new and better models (e.g. air dispersion or groundwater models), allows for better estimates of exposure. In addition to identifying better exposure factors and better mathematical models for predicting the aerial distribution of chemicals, the conduct of simulation studies and dose-reconstruction studies can offer extraordinary opportunities for filling in data gaps regarding historical exposures which are critical to improving the power of epidemiology studies. The use of probabilistic techniques such as Monte Carlo analysis and Bayesian statistics has revolutionized the practice of exposure assessment and greatly enhanced the quality of the risk characterization. Lastly, the field of epidemiology is about to undergo a sea change with respect to the exposure component because each year better environmental and exposure models, statistical techniques and new biological monitoring techniques are being introduced. This paper reviews these techniques and discusses where additional research is likely to pay a significant dividend. Exposure assessment techniques are now available which can significantly improve the quality of epidemiology and health risk assessment studies and vastly improve their usefulness.
As more quantitative exposure components can now be incorporated into these studies, they can be better used to identify safe levels of exposure using customary risk assessment methodologies. Examples are drawn from both environmental and occupational studies illustrating how these techniques have been used to better understand exposure to specific chemicals. Some thoughts are also presented on what lessons have been learned about conducting exposure assessment for health risk assessments and epidemiological studies.
Auteur Description: From the Director's Creative Vision to Audio Description
ERIC Educational Resources Information Center
Szarkowska, Agnieszka
2013-01-01
In this report, the author follows the suggestion that a film director's creative vision should be incorporated into Audio description (AD), a major technique for making films, theater performances, operas, and other events accessible to people who are blind or have low vision. The author presents a new type of AD for auteur and artistic films:…
Nonlinear Curve-Fitting Program
NASA Technical Reports Server (NTRS)
Everhart, Joel L.; Badavi, Forooz F.
1989-01-01
Nonlinear optimization algorithm helps in finding best-fit curve. Nonlinear Curve Fitting Program, NLINEAR, interactive curve-fitting routine based on description of quadratic expansion of the chi-squared (χ²) statistic. Utilizes nonlinear optimization algorithm calculating best statistically weighted values of parameters of fitting function, with χ² minimized. Provides user with such statistical information as goodness of fit and estimated values of parameters producing highest degree of correlation between experimental data and mathematical model. Written in FORTRAN 77.
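NLINEAR itself is FORTRAN 77, but the underlying idea, iterating the quadratic (Gauss-Newton) expansion of the χ² statistic to find statistically weighted best-fit parameters, can be sketched in a few lines. This is not the NLINEAR code; the exponential model, starting values, and data below are illustrative assumptions.

```python
import numpy as np

def gauss_newton_fit(f, jac, x, y, sigma, p0, iters=15):
    """Minimize the weighted chi-square sum(((y - f(x, p)) / sigma)**2)
    by iterating the quadratic (Gauss-Newton) expansion of chi^2 around p."""
    p = np.asarray(p0, float)
    for _ in range(iters):
        r = (y - f(x, p)) / sigma                  # weighted residuals
        J = jac(x, p) / sigma[:, None]             # weighted Jacobian of f
        p = p + np.linalg.solve(J.T @ J, J.T @ r)  # normal-equations step
    chi2 = float(np.sum(((y - f(x, p)) / sigma) ** 2))
    return p, chi2

# Toy model y = a * exp(b * x) with exact synthetic data, so chi^2 -> 0
f = lambda x, p: p[0] * np.exp(p[1] * x)
jac = lambda x, p: np.column_stack([np.exp(p[1] * x),
                                    p[0] * x * np.exp(p[1] * x)])
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(0.5 * x)
sigma = np.full_like(x, 0.1)                       # measurement uncertainties
p_fit, chi2 = gauss_newton_fit(f, jac, x, y, sigma, [1.8, 0.4])
```

The goodness-of-fit information NLINEAR reports corresponds to the final χ² value and the curvature of the expansion (J.T @ J) at the minimum.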
ERIC Educational Resources Information Center
Kadhi, T.; Rudley, D.; Holley, D.; Krishna, K.; Ogolla, C.; Rene, E.; Green, T.
2010-01-01
The following report of descriptive statistics addresses the attendance of the 2012 class and the average Actual and Predicted 1L Grade Point Averages (GPAs). Correlational and Inferential statistics are also run on the variables of Attendance (Y/N), Attendance Number of Times, Actual GPA, and Predictive GPA (Predictive GPA is defined as the Index…
Visually guided tube thoracostomy insertion comparison to standard of care in a large animal model.
Hernandez, Matthew C; Vogelsang, David; Anderson, Jeff R; Thiels, Cornelius A; Beilman, Gregory; Zielinski, Martin D; Aho, Johnathon M
2017-04-01
Tube thoracostomy (TT) is a lifesaving procedure for a variety of thoracic pathologies. The most commonly utilized method for placement involves open dissection and blind insertion. Image guided placement is commonly utilized but is limited by an inability to see the distal placement location. Unfortunately, TT is not without complications. We aim to demonstrate the feasibility of a disposable device to allow for visually directed TT placement compared to the standard of care in a large animal model. Three swine were sequentially orotracheally intubated and anesthetized. TT was conducted utilizing a novel visualization device, the tube thoracostomy visual trocar (TTVT), and the standard of care (open technique). The position of the TT in the chest cavity was recorded using direct thoracoscopic inspection and radiographic imaging, with the operator blinded to results. Complications were evaluated using a validated complication grading system. Standard descriptive statistical analyses were performed. Thirty TT were placed, 15 using the TTVT technique and 15 using the standard of care open technique. All of the TT placed using TTVT were without complication and in optimal position. Conversely, 27% of TT placed using the standard of care open technique resulted in complications. Necropsy revealed no injury to intrathoracic organs. Visually directed TT placement using TTVT is feasible and non-inferior to the standard of care in a large animal model. This improvement in instrumentation has the potential to greatly improve the safety of TT. Further study in humans is required. Therapeutic Level II. Copyright © 2017 Elsevier Ltd. All rights reserved.
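Complication counts in two small groups like these are typically compared with Fisher's exact test (the abstract itself reports only descriptive statistics). A from-scratch two-sided version is sketched below; the 0/15 vs. 4/15 counts are assumed from the reported 0% and 27% rates, not taken from the paper's tables.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]]:
    sums the hypergeometric probabilities of all tables at least as
    improbable as the observed one, with the margins held fixed."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):  # P(first cell = x) under the hypergeometric null
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Illustrative counts: 0/15 complications with the device vs 4/15 with
# the open technique (4 assumed from the reported 27% of 15 placements)
p = fisher_exact_two_sided(0, 15, 4, 11)
```

With these assumed counts the two-sided p-value is just under 0.1, a reminder that small-sample feasibility studies rarely reach conventional significance on safety endpoints.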
Collecting and Analyzing Patient Experiences of Health Care From Social Media.
Rastegar-Mojarad, Majid; Ye, Zhan; Wall, Daniel; Murali, Narayana; Lin, Simon
2015-07-02
Social media, such as Yelp, provide rich information about consumer experience. Previous studies suggest that Yelp can serve as a new source to study patient experience. However, the lack of a corpus of patient reviews creates a major bottleneck for applying computational techniques. The objective of this study is to create a corpus of patient experience (COPE) and report descriptive statistics to characterize COPE. Yelp reviews about health care-related businesses were extracted from the Yelp Academic Dataset. Natural language processing (NLP) tools were used to split reviews into sentences, extract noun phrases and adjectives from each sentence, and generate parse trees and dependency trees for each sentence. Sentiment analysis techniques and Hadoop were used to calculate a sentiment score for each sentence and for parallel processing, respectively. COPE contains 79,173 sentences from 6914 patient reviews of 985 health care facilities near 30 universities in the United States. We found that patients wrote longer reviews when they rated the facility poorly (1 or 2 stars). We demonstrated that the computed sentiment scores correlated well with consumer-generated ratings. A consumer vocabulary to describe their health care experience was constructed by a statistical analysis of word counts and co-occurrences in COPE. A corpus called COPE was built as an initial step toward utilizing social media to understand patient experiences at health care facilities. The corpus is available to download, and COPE can be used in future studies to extract knowledge of patients' experiences from their perspectives. Such information can subsequently inform and provide opportunities to improve the quality of health care.
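The reported agreement between computed sentiment scores and star ratings is a correlation analysis. A minimal sketch with hypothetical per-review values (not COPE data) might look like this:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-review star ratings and computed sentiment scores
stars     = [1, 2, 3, 4, 5, 1, 5, 4]
sentiment = [-0.8, -0.4, 0.1, 0.5, 0.9, -0.6, 0.8, 0.4]
r = pearson_r(stars, sentiment)   # strongly positive for this toy data
```

A high r, as in this toy example, is what "sentiment scores correlated well with consumer-generated ratings" amounts to quantitatively.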
Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali
2017-06-01
Traffic accidents are one of the more important national and international issues, and their consequences are important at the political, economic, and social levels in a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified via literature retrieved from the Internet and based on the inclusion criteria. Review of the literature was performed until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency personnel, statisticians, and IT experts in trauma, emergency, and police centers. Sampling was purposive. Data were collected using a questionnaire based on the first-step data; validity and reliability were determined by content validity and a Cronbach's alpha of 75%. Data were analyzed using the decision Delphi technique. GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, and analyzing these data, were the most important capabilities of GIS in traffic accident information management. Storing and retrieving descriptive and spatial data, providing statistical analyses in table, chart, and zoning formats, managing ill-structured issues, and determining the cost-effectiveness of decisions and prioritizing their implementation were the most important GIS capabilities that can make the management of traffic accident information efficient.
Sevcik, Emily E; Jones, Jennifer D; Myers, Charles E
2017-11-01
Given the rise in music therapy master's programs that offer dual degrees in music therapy and counseling or programs that satisfy state mental health counseling licensure laws, the professional counseling field is playing an increased role in the advanced education and professional practices of music therapists. To identify factors that lead music therapists to pursue advanced education with an emphasis in professional counseling, perceptions about benefits and drawbacks for three advanced degree options (i.e., music therapy, counseling, and music therapy/counseling dual degree), and describe the professional practices and identity of dual-trained music therapists as counselors. A convenience sample of music therapists (n = 123) who held board certification, and held a master's degree or higher that emphasized professional counseling, completed an online survey. We used descriptive statistics to analyze categorical and numeric survey data. Eligibility for licensure as a professional counselor was the most important decisional factor in selecting a specific master's degree program. Respondents also reported favorable perceptions of the dual degree in music therapy and counseling. With regard to professional practice and identity, respondents reported high use of verbal processing techniques alongside music therapy interventions, and dual-trained music therapists retained their professional identity as a music therapist. The reported view of licensure in a related field as beneficial and frequent use of verbal processing techniques warrants future study into the role of counseling in the advanced training of music therapists. Given contradictory findings across studies, we recommend investigators also explore how a degree in a related field affects career longevity of music therapists. © the American Music Therapy Association 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
ERIC Educational Resources Information Center
Williams, Immanuel James; Williams, Kelley Kim
2016-01-01
Understanding summary statistics and graphical techniques is a building block to comprehending concepts beyond basic statistics. It is known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.
Statistics of high-level scene context
Greene, Michelle R.
2013-01-01
Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed “things” in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information.
Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics rather than intuition. PMID:24194723
Complex networks as a unified framework for descriptive analysis and predictive modeling in climate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinhaeuser, Karsten J K; Chawla, Nitesh; Ganguly, Auroop R
The analysis of climate data has relied heavily on hypothesis-driven statistical methods, while projections of future climate are based primarily on physics-based computational models. However, in recent years a wealth of new datasets has become available. Therefore, we take a more data-centric approach and propose a unified framework for studying climate, with an aim towards characterizing observed phenomena as well as discovering new knowledge in the climate domain. Specifically, we posit that complex networks are well-suited for both descriptive analysis and predictive modeling tasks. We show that the structural properties of climate networks have useful interpretation within the domain. Further, we extract clusters from these networks and demonstrate their predictive power as climate indices. Our experimental results establish that the network clusters are statistically significantly better predictors than clusters derived using a more traditional clustering approach. Using complex networks as data representation thus enables the unique opportunity for descriptive and predictive modeling to inform each other.
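A minimal version of the network-construction step, linking grid-point time series whose correlation exceeds a threshold and then reading off structural properties such as node degree, can be sketched as follows. The synthetic two-region data and the 0.8 threshold are assumptions for illustration, not the authors' configuration.

```python
import numpy as np

def build_climate_network(series, threshold=0.5):
    """Build an unweighted network whose nodes are grid points and whose
    edges link pairs of time series with |Pearson correlation| >= threshold."""
    C = np.corrcoef(series)                # (n, n) correlation matrix
    A = (np.abs(C) >= threshold).astype(int)
    np.fill_diagonal(A, 0)                 # no self-loops
    return A

rng = np.random.default_rng(42)
t = np.linspace(0, 4 * np.pi, 200)
# Two synthetic "regions": nodes 0-2 share one signal, nodes 3-5 another
signal_a, signal_b = np.sin(t), np.cos(3 * t)
series = np.vstack([signal_a + 0.1 * rng.normal(size=t.size) for _ in range(3)]
                   + [signal_b + 0.1 * rng.normal(size=t.size) for _ in range(3)])
A = build_climate_network(series, threshold=0.8)
degrees = A.sum(axis=1)                    # a basic structural property
```

Clustering this adjacency matrix (the paper's next step) would recover the two synthetic regions, which is the sense in which network clusters can act as climate indices.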
Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2006
Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl
2008-01-01
Between January 1 and December 31, 2006, AVO located 8,666 earthquakes of which 7,783 occurred on or near the 33 volcanoes monitored within Alaska. Monitoring highlights in 2006 include: an eruption of Augustine Volcano, a volcanic-tectonic earthquake swarm at Mount Martin, elevated seismicity and volcanic unrest at Fourpeaked Mountain, and elevated seismicity and low-level tremor at Mount Veniaminof and Korovin Volcano. A new seismic subnetwork was installed on Fourpeaked Mountain. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field during 2006, (2) a description of earthquake detection, recording, analysis, and data archival systems, (3) a description of seismic velocity models used for earthquake locations, (4) a summary of earthquakes located in 2006, and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2006.
Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.
1996-08-12
Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with up to 10^30 states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique. We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (Mean ± SD). No RCT reported descriptions and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. 
The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
Occurrence and transport of pesticides and alkylphenols in water samples along the Ebro River Basin
NASA Astrophysics Data System (ADS)
Navarro, Alícia; Tauler, Romà; Lacorte, Sílvia; Barceló, Damià
2010-03-01
We report the temporal and geographical variations of a set of 30 pesticides (including triazines, organophosphorus compounds and acetanilides) and industrial compounds in surface waters along the Ebro River during the period 2004-2006. Using descriptive statistics, we found that the compounds of industrial origin (tributylphosphate, octylphenol and nonylphenol) appeared in over 60% of the samples analyzed and at very high concentrations, while pesticides had a point-source origin in the Ebro delta area and overall low levels, between 0.005 and 2.575 μg L^-1. Correlations among pollutants and their distributions were studied using Principal Component Analysis (PCA), a multivariate exploratory data analysis technique that permitted us to discern between agricultural and industrial sources of contamination. Over the three-year period, a seasonal trend revealed the highest pesticide concentrations during the spring-summer period, following pesticide application.
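PCA on an autoscaled samples-by-variables concentration matrix can be sketched via the singular value decomposition. This is a generic sketch of the technique, not the study's analysis; the lognormal toy data below stand in for measured concentrations and are purely illustrative.

```python
import numpy as np

def pca_scores(data, n_components=2):
    """PCA of a samples-by-variables matrix via SVD of the
    column-centered (and here autoscaled) data."""
    X = np.asarray(data, float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # autoscale each variable
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]    # sample scores
    loadings = Vt[:n_components].T                     # variable loadings
    explained = s ** 2 / np.sum(s ** 2)                # variance fractions
    return scores, loadings, explained

# Hypothetical concentrations (rows: water samples; columns: e.g. an
# industrial compound and a pesticide) drawn from two independent sources
rng = np.random.default_rng(1)
industrial = rng.lognormal(mean=0.0, sigma=1.0, size=40)
pesticide = rng.lognormal(mean=-2.0, sigma=0.5, size=40)
scores, loadings, explained = pca_scores(np.column_stack([industrial, pesticide]))
```

In a source-apportionment setting like the one described, samples dominated by industrial versus agricultural inputs separate along the leading components, and the loadings show which compounds drive each component.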
Communication Skill Attributes Needed for Vocational Education Students Entering the Workplace
NASA Astrophysics Data System (ADS)
Wahyuni, L. M.; Masih, I. K.; Rejeki, I. N. Mei
2018-01-01
Communication skills are generic skills that need to be developed for success as vocational education students enter the workforce. This study aimed to discover the attributes of communication skill considered important in entering the workforce as perceived by vocational education students. The research was conducted by the survey method using a questionnaire as the data collection tool. The research population consisted of final-year students of the D3 Vocational Education Program and the D4 Managerial Vocational Education Program in the academic year 2016/2017 who had completed fieldwork practice in industry. The sampling technique was proportional random sampling. Data were analyzed with descriptive statistics and independent-samples t-tests. Ten communication skill attributes were rated as most important for entering the workplace as perceived by the vocational education diploma students. These results indicate a shared need for communication skills in entering the workforce.
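The independent-samples t-test used in the analysis can be sketched in Welch's form, which drops the equal-variance assumption. The two small rating samples below are hypothetical cohort responses, not the study's data.

```python
def welch_t(sample_a, sample_b):
    """Welch's independent-samples t statistic and degrees of freedom
    (no equal-variance assumption)."""
    def mean_var(s):
        n = len(s)
        m = sum(s) / n
        v = sum((x - m) ** 2 for x in s) / (n - 1)   # sample variance
        return n, m, v
    na, ma, va = mean_var(sample_a)
    nb, mb, vb = mean_var(sample_b)
    se2 = va / na + vb / nb                          # squared standard error
    t = (ma - mb) / se2 ** 0.5
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical importance ratings of one attribute from two cohorts
d3 = [4.2, 4.5, 4.1, 4.4, 4.3, 4.6]
d4 = [3.8, 4.0, 3.9, 4.1, 3.7, 4.0]
t_stat, df = welch_t(d3, d4)
```

The t statistic and degrees of freedom would then be compared against the t distribution to decide whether the two cohorts rate the attribute differently.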
Structural equation modeling: building and evaluating causal models: Chapter 8
Grace, James B.; Scheiner, Samuel M.; Schoolmaster, Donald R.
2015-01-01
Scientists frequently wish to study hypotheses about causal relationships, rather than just statistical associations. This chapter addresses the question of how scientists might approach this ambitious task. Here we describe structural equation modeling (SEM), a general modeling framework for the study of causal hypotheses. Our goals are to (a) concisely describe the methodology, (b) illustrate its utility for investigating ecological systems, and (c) provide guidance for its application. Throughout our presentation, we rely on a study of the effects of human activities on wetland ecosystems to make our description of methodology more tangible. We begin by presenting the fundamental principles of SEM, including both its distinguishing characteristics and the requirements for modeling hypotheses about causal networks. We then illustrate SEM procedures and offer guidelines for conducting SEM analyses. Our focus in this presentation is on basic modeling objectives and core techniques. Pointers to additional modeling options are also given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zucchiatti, Alessandro
2013-07-18
The Centro de Micro Analisis de Materiales (CMAM) is a research centre of the Universidad Autonoma de Madrid dedicated to the modification and analysis of materials using ion beam techniques. The infrastructure, based on an HVEE 5 MV tandem accelerator provided with a coaxial Cockcroft-Walton charging system, is fully open to research groups of the UAM, to other public research institutions and to private enterprises. The CMAM research covers a few important lines such as advanced materials, surface science, biomedical materials, cultural heritage, and materials for energy production. The Centre also gives support to university teaching and technical training. A more detailed description of the research infrastructures and their use statistics will be given. Some of the main research results will be presented to show the progress of research in the Centre in the past few years and to motivate the strategic plans for the forthcoming years.
Giacomo, Della Riccia; Stefania, Del Zotto
2013-12-15
Fumonisins are mycotoxins produced by Fusarium species that commonly live in maize. Whereas the fungi damage plants, fumonisins cause disease in both cattle and humans. Legal limits set the tolerable daily intake of fumonisins with respect to several maize-based feeds and foods. Chemical techniques assure the most reliable and accurate measurements, but they are expensive and time consuming. A method based on near-infrared spectroscopy and multivariate statistical regression is described as a simpler, cheaper and faster alternative. We apply partial least squares (PLS) with full cross-validation. Two models are described, having high correlations of calibration (0.995, 0.998) and of validation (0.908, 0.909), respectively. Description of the observed phenomenon is accurate and overfitting is avoided. Screening of contaminated maize with respect to the European legal limit of 4 mg kg(-1) should be assured. Copyright © 2013 Elsevier Ltd. All rights reserved.
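PLS regression, as used above, projects the spectra onto latent components that covary maximally with the reference concentrations. A minimal one-component PLS1 sketch in pure Python (a NIPALS-style weight/score computation; the data shapes are illustrative, not the study's spectra):

```python
import math

def pls1_one_component(X, y):
    """One-component PLS1: weight vector proportional to X'y, scores t = Xw,
    and predictions from regressing y on t. X is a list of sample rows."""
    n, p = len(X), len(X[0])
    # Mean-center predictors and response
    xm = [sum(row[j] for row in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
    yc = [v - ym for v in y]
    # Weight vector: covariance direction between each wavelength and y
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = math.sqrt(sum(v * v for v in w)) or 1.0
    w = [v / norm for v in w]
    # Scores and inner regression coefficient
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    tt = sum(v * v for v in t)
    b = sum(t[i] * yc[i] for i in range(n)) / tt
    preds = [ym + b * t[i] for i in range(n)]
    return w, preds
```

A full PLS model deflates X and y and repeats this step for further components; "full cross-validation" then refits the model leaving each sample out in turn to obtain the validation correlation.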
Methods of Fitting a Straight Line to Data: Examples in Water Resources
Hirsch, Robert M.; Gilroy, Edward J.
1984-01-01
Three methods of fitting straight lines to data are described and their purposes are discussed and contrasted in terms of their applicability in various water resources contexts. The three methods are ordinary least squares (OLS), least normal squares (LNS), and the line of organic correlation (OC). In all three methods the parameters are based on moment statistics of the data. When estimation of an individual value is the objective, OLS is the most appropriate. When estimation of many values is the objective and one wants the set of estimates to have the appropriate variance, then OC is most appropriate. When one wishes to describe the relationship between two variables and measurement error is unimportant, then OC is most appropriate. Where the error is important in descriptive problems or in calibration problems, then structural analysis techniques may be most appropriate. Finally, if the problem is one of describing some geographic trajectory, then LNS is most appropriate.
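The three fitting methods contrasted above differ only in the criterion they minimize, and all three slopes can be written in terms of the moment statistics of the data. A minimal sketch (pure Python; the OC slope is the sign-adjusted ratio of standard deviations, and the LNS slope is the major-axis solution, assuming a nonzero cross-product):

```python
import math
import statistics

def fit_lines(x, y):
    """Slopes of three straight-line fits through the centroid:
    OLS (vertical residuals), OC (variance-preserving), LNS (perpendicular residuals)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((v - mx) ** 2 for v in x)
    syy = sum((v - my) ** 2 for v in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    ols = sxy / sxx
    # OC: geometric mean of the OLS slopes of y-on-x and x-on-y
    oc = math.copysign(math.sqrt(syy / sxx), sxy)
    # LNS (major axis): minimizes squared perpendicular distances
    d = (syy - sxx) / (2 * sxy)
    lns = d + math.copysign(math.sqrt(d * d + 1), sxy)
    return ols, oc, lns
```

On noisy data the OLS slope is attenuated toward zero relative to OC, which is why OC is preferred when the set of estimates should reproduce the observed variance.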
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harstad, E. N.; Harlow, Francis Harvey; Schreyer, H. L.
Our goal is to develop constitutive relations for the behavior of a solid polymer during high-strain-rate deformations. In contrast to the classic thermodynamic techniques for deriving stress-strain response in static (equilibrium) circumstances, we employ a statistical-mechanics approach, in which we evolve a probability distribution function (PDF) for the velocity fluctuations of the repeating units of the chain. We use a Langevin description for the dynamics of a single repeating unit and a Liouville equation to describe the variations of the PDF. Moments of the PDF give the conservation equations for a single polymer chain embedded in other similar chains. To extract single-chain analytical constitutive relations, these equations have been solved for representative loading paths. By this process we discover that a measure of nonuniform chain-link displacement serves this purpose very well. We then derive an evolution equation for the descriptor function, with the result being a history-dependent constitutive relation.
NASA Astrophysics Data System (ADS)
Ushenko, Yu. A.; Prysyazhnyuk, V. P.; Gavrylyak, M. S.; Gorsky, M. P.; Bachinskiy, V. T.; Vanchuliak, O. Ya.
2015-02-01
A new information-optical technique for diagnosing the structure of polycrystalline films of blood plasma is proposed. A model of the Mueller-matrix description of the mechanisms of optical anisotropy of such objects (optical activity, birefringence, and linear and circular dichroism) is suggested. The ensemble of informationally relevant, azimuthally stable Mueller-matrix invariants is determined. Within the statistical analysis of the distributions of these parameters, objective criteria for differentiating blood-plasma films taken from healthy donors and from patients with liver cirrhosis were determined. From the standpoint of evidence-based medicine, the operational characteristics (sensitivity, specificity and accuracy) of the information-optical method of Mueller-matrix mapping of polycrystalline blood-plasma films were found, and its efficiency in the diagnostics of liver cirrhosis was demonstrated. Prospects for applying the method in experimental medicine to differentiate postmortem changes of the myocardial tissue were examined.
NASA Technical Reports Server (NTRS)
Farmer, F. H.; Jarrett, O., Jr.; Brown, C. A., Jr.
1983-01-01
The concentration and composition of phytoplankton populations are measured by an optical method which can be used either in situ or remotely. This method is based upon the in vivo light absorption characteristics of phytoplankton. To provide a data base for testing assumptions relative to the proposed method, visible absorbance spectra of pure cultures of 20 marine phytoplankton were obtained under laboratory conditions. Descriptive and analytical statistics were computed for the absorbance spectra and were used to make comparisons between members of major taxonomic groups and between groups. Spectral variation between the members of the major taxonomic groups was observed to be considerably less than the spectral variation between these groups. In several cases the differences between the mean absorbance spectra of major taxonomic groups are significant enough to be detected with passive remote sensing techniques.
Gender differences in farmers' responses to climate change adaptation in Yongqiao District, China.
Jin, Jianjun; Wang, Xiaomin; Gao, Yiwei
2015-12-15
This study examines the gender differences in farmers' responses to climate change adaptation in Yongqiao District, China. A random sampling technique was used to select 220 household heads, while descriptive statistics and binary logit models were used to analyze the data obtained from the households. We determine that male and female respondents are not significantly different in their knowledge and perceptions of climate change, but there is a gender difference in adopting climate change adaptation measures. Male-headed households are more likely to adopt new technology for water conservation and to increase investment in irrigation infrastructure. The research also indicates that the adaptation decisions of male and female heads are influenced by different sets of factors. The findings of this research help to elucidate the determinants of climate change adaptation decisions for male- and female-headed households and the strategic interventions necessary for effective adaptation. Copyright © 2015 Elsevier B.V. All rights reserved.
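The binary logit model used above relates a yes/no adoption decision to household covariates through the logistic link. A minimal sketch fitted by gradient ascent on the log-likelihood (single covariate; the variable names and data are illustrative, not the survey's):

```python
import math

def fit_logit(x, y, lr=0.5, iters=5000):
    """Binary logit P(adopt | x) = sigmoid(b0 + b1*x), fitted by
    gradient ascent on the average log-likelihood."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p          # score contribution for intercept
            g1 += (yi - p) * xi   # score contribution for slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1
```

With a binary covariate (e.g., a male/female household-head indicator) the fitted b1 is the log odds ratio of adoption between the two groups, which is how such models are typically interpreted in adoption studies.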
An uncertainty analysis of the flood-stage upstream from a bridge.
Sowiński, M
2006-01-01
The paper begins with the formulation of the problem in the form of a general performance function. Next, the Latin hypercube sampling (LHS) technique, a modified version of the Monte Carlo method, is briefly described. The uncertainty analysis of the flood stage upstream from a bridge starts with a description of the hydraulic model. The model concept is based on the HEC-RAS model developed for subcritical flow under a bridge without piers, in which the energy equation is applied. The next section characterizes the basic variables, including a specification of their statistics (means and variances). Next, the problem of correlated variables is discussed and assumptions concerning correlation among the basic variables are formulated. The analysis of results is based on LHS ranking lists obtained from the computer package UNCSAM. Results for two examples are given: one for independent and the other for correlated variables.
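Latin hypercube sampling, as described above, stratifies each input's range into n equal-probability intervals, samples each interval exactly once, and pairs strata across inputs by random permutation. A minimal sketch on the unit hypercube (mapping the samples to the actual input distributions via inverse CDFs is omitted):

```python
import random

def latin_hypercube(n, dims, seed=42):
    """n points in [0, 1)^dims: each dimension is split into n equal strata,
    each stratum is sampled exactly once, and strata are paired by shuffling."""
    rng = random.Random(seed)
    cols = []
    for _ in range(dims):
        perm = list(range(n))
        rng.shuffle(perm)  # random pairing of strata across dimensions
        # One uniform draw inside each stratum [k/n, (k+1)/n)
        cols.append([(k + rng.random()) / n for k in perm])
    return [tuple(col[i] for col in cols) for i in range(n)]
```

Compared with plain Monte Carlo, this guarantees full coverage of each input's marginal distribution with far fewer runs, which is why LHS is popular for uncertainty analyses of expensive hydraulic models.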
Anatomical appraisal of the skulls and teeth associated with the family of Tsar Nicolay Romanov.
Kolesnikov, L L; Pashinyan, G A; Abramov, S S
2001-02-01
This article describes the identification of skeletal remains attributed to the family of Tsar Nicolay Romanov and other persons buried together at a site near present-day Ekaterinburg, Russia. Detailed descriptions are given regarding the objective methods of craniofacial and odontological identification that were used. Employing computer-assisted photographic superimposition techniques and statistical analysis of morphologic and other characteristics of the specimens, this study identifies with a high degree of certainty the remains of the Tsar, his wife, three of his four daughters, and four household assistants. Very strong evidence is presented that the Tsar's daughter Anastasia was killed in 1918. This study demonstrates the effectiveness of the methods and trustworthiness of the results, as well as the prospects of future application of the methods for the identification of skeletonized human remains. Anat Rec (New Anat) 265:15-32, 2001. Copyright 2001 Wiley-Liss, Inc.
Rainfall: State of the Science
NASA Astrophysics Data System (ADS)
Testik, Firat Y.; Gebremichael, Mekonnen
Rainfall: State of the Science offers the most up-to-date knowledge on the fundamental and practical aspects of rainfall. Each chapter, self-contained and written by prominent scientists in their respective fields, provides three forms of information: fundamental principles, a detailed overview of current knowledge and description of existing methods, and emerging techniques and future research directions. The book discusses
• Rainfall microphysics: raindrop morphodynamics, interactions, size distribution, and evolution
• Rainfall measurement and estimation: ground-based direct measurement (disdrometer and rain gauge), weather radar rainfall estimation, polarimetric radar rainfall estimation, and satellite rainfall estimation
• Statistical analyses: intensity-duration-frequency curves, frequency analysis of extreme events, spatial analyses, simulation and disaggregation, ensemble approach for radar rainfall uncertainty, and uncertainty analysis of satellite rainfall products
The book is tailored to be an indispensable reference for researchers, practitioners, and graduate students who study any aspect of rainfall or utilize rainfall information in various science and engineering disciplines.
The art and science of weed mapping
Barnett, David T.; Stohlgren, Thomas J.; Jarnevich, Catherine S.; Chong, Geneva W.; Ericson, Jenny A.; Davern, Tracy R.; Simonson, Sara E.
2007-01-01
Land managers need cost-effective and informative tools for non-native plant species management. Many local, state, and federal agencies adopted mapping systems designed to collect comparable data for the early detection and monitoring of non-native species. We compared mapping information to statistically rigorous, plot-based methods to better understand the benefits and compatibility of the two techniques. Mapping non-native species locations provided a species list, associated species distributions, and infested area for subjectively selected survey sites. The value of this information may be compromised by crude estimates of cover and incomplete or biased estimations of species distributions. Incorporating plot-based assessments guided by a stratified-random sample design provided a less biased description of non-native species distributions and increased the comparability of data over time and across regions for the inventory, monitoring, and management of non-native and native plant species.
Quantitative study of Xanthosoma violaceum leaf surfaces using RIMAPS and variogram techniques.
Favret, Eduardo A; Fuentes, Néstor O; Molina, Ana M
2006-08-01
Two new imaging techniques (rotated image with maximum averaged power spectrum (RIMAPS) and variogram) are presented for the study and description of leaf surfaces. Xanthosoma violaceum was analyzed to illustrate the characteristics of both techniques. Both techniques produce a quantitative description of leaf surface topography. RIMAPS combines digitized images rotation with Fourier transform, and it is used to detect patterns orientation and characteristics of surface topography. Variogram relates the mathematical variance of a surface with the area of the sample window observed. It gives the typical scale lengths of the surface patterns. RIMAPS detects the morphological variations of the surface topography pattern between fresh and dried (herbarium) samples of the leaf. The variogram method finds the characteristic dimensions of the leaf microstructure, i.e., cell length, papillae diameter, etc., showing that there are not significant differences between dry and fresh samples. The results obtained show the robustness of RIMAPS and variogram analyses to detect, distinguish, and characterize leaf surfaces, as well as give scale lengths. Both techniques are tools for the biologist to study variations of the leaf surface when different patterns are present. The use of RIMAPS and variogram opens a wide spectrum of possibilities by providing a systematic, quantitative description of the leaf surface topography.
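The variogram technique described above relates surface variance to the scale of the observation window; in its common empirical form, the semivariance at lag h is half the mean squared height difference between points h apart, and characteristic pattern dimensions (e.g., cell length or papilla diameter) appear as lags where the semivariance dips. A minimal 1-D sketch on an illustrative periodic profile:

```python
def variogram(z, max_lag):
    """Empirical semivariogram of a 1-D height profile:
    gamma(h) = 0.5 * mean((z[i+h] - z[i])^2) over all valid i."""
    gamma = {}
    for h in range(1, max_lag + 1):
        diffs = [(z[i + h] - z[i]) ** 2 for i in range(len(z) - h)]
        gamma[h] = 0.5 * sum(diffs) / len(diffs)
    return gamma
```

For a surface with a repeating pattern, the semivariance falls back toward zero at lags equal to the pattern period, which is how characteristic scale lengths are read off the curve; the 2-D image case averages over all pixel pairs at a given separation.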