Applying Descriptive Statistics to Teaching the Regional Classification of Climate.
ERIC Educational Resources Information Center
Lindquist, Peter S.; Hammel, Daniel J.
1998-01-01
Describes an exercise for college and high school students that relates descriptive statistics to the regional climatic classification. The exercise introduces students to simple calculations of central tendency and dispersion, the construction and interpretation of scatterplots, and the definition of climatic regions. Forces students to engage…
Using Microsoft Excel[R] to Calculate Descriptive Statistics and Create Graphs
ERIC Educational Resources Information Center
Carr, Nathan T.
2008-01-01
Descriptive statistics and appropriate visual representations of scores are important for all test developers, whether they are experienced testers working on large-scale projects, or novices working on small-scale local tests. Many teachers put in charge of testing projects do not know "why" they are important, however, and are utterly convinced…
Rebuilding Government Legitimacy in Post-conflict Societies: Case Studies of Nepal and Afghanistan
2015-09-09
administered via the verbal scales due to reduced time spent explaining the visual show cards. Statistical results corresponded with observations from… a three-step strategy for dealing with item non-response. First, basic descriptive statistics are calculated to determine the extent of item… descriptive statistics for all items in the survey; however, this section of the report highlights just some of the findings. Thus, the results…
WASP (Write a Scientific Paper) using Excel - 6: Standard error and confidence interval.
Grech, Victor
2018-03-01
The calculation of descriptive statistics includes the calculation of standard error and confidence interval, an inevitable component of data analysis in inferential statistics. This paper provides pointers as to how to do this in Microsoft Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
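The standard error and normal-approximation confidence interval described for Excel can be sketched in Python as well; this is a minimal sketch, with hypothetical sample data and an assumed two-sided z multiplier of 1.96 for a 95% interval:

```python
import math
import statistics

def se_and_ci(data, z=1.96):
    """Mean, standard error of the mean, and a normal-approximation 95% CI."""
    mean = statistics.mean(data)
    sd = statistics.stdev(data)        # sample SD, n-1 denominator (Excel's STDEV.S)
    se = sd / math.sqrt(len(data))     # standard error of the mean
    return mean, se, (mean - z * se, mean + z * se)

mean, se, (lo, hi) = se_and_ci([4.1, 5.0, 4.8, 5.3, 4.6, 5.1])
```

The equivalent Excel cells would combine AVERAGE, STDEV.S, SQRT, and COUNT; for small samples the t-distribution (see the WASP-7 entry below on this page) gives a wider, more appropriate interval than z = 1.96.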
Calculation of streamflow statistics for Ontario and the Great Lakes states
Piggott, Andrew R.; Neff, Brian P.
2005-01-01
Basic, flow-duration, and n-day frequency statistics were calculated for 779 current and historical streamflow gages in Ontario and 3,157 streamflow gages in the Great Lakes states with length-of-record daily mean streamflow data ending on December 31, 2000 and September 30, 2001, respectively. The statistics were determined using the U.S. Geological Survey's SWSTAT and IOWDM, ANNIE, and LIBANNE software and Linux shell and Perl programming that enabled the mass processing of the data and calculation of the statistics. Verification exercises were performed to assess the accuracy of the processing and calculations. The statistics and descriptions, longitudes and latitudes, and drainage areas for each of the streamflow gages are summarized in ASCII text files and ESRI shapefiles.
Vetter, Thomas R
2017-11-01
Descriptive statistics are specific methods basically used to calculate, describe, and summarize collected research data in a logical, meaningful, and efficient way. Descriptive statistics are reported numerically in the manuscript text and/or in its tables, or graphically in its figures. This basic statistical tutorial discusses a series of fundamental concepts about descriptive statistics and their reporting. The mean, median, and mode are 3 measures of the center or central tendency of a set of data. In addition to a measure of its central tendency (mean, median, or mode), another important characteristic of a research data set is its variability or dispersion (ie, spread). In simplest terms, variability is how much the individual recorded scores or observed values differ from one another. The range, standard deviation, and interquartile range are 3 measures of variability or dispersion. The standard deviation is typically reported for a mean, and the interquartile range for a median. Testing for statistical significance, along with calculating the observed treatment effect (or the strength of the association between an exposure and an outcome), and generating a corresponding confidence interval are 3 tools commonly used by researchers (and their collaborating biostatistician or epidemiologist) to validly make inferences and more generalized conclusions from their collected data and descriptive statistics. A number of journals, including Anesthesia & Analgesia, strongly encourage or require the reporting of pertinent confidence intervals. A confidence interval can be calculated for virtually any variable or outcome measure in an experimental, quasi-experimental, or observational research study design. Generally speaking, in a clinical trial, the confidence interval is the range of values within which the true treatment effect in the population likely resides. 
In an observational study, the confidence interval is the range of values within which the true strength of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"
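The central-tendency and dispersion measures this tutorial pairs together (SD with the mean, IQR with the median) can be sketched directly from Python's standard library; the scores below are hypothetical:

```python
import statistics

# Hypothetical set of recorded scores
scores = [2, 3, 3, 4, 5, 5, 5, 6, 7, 9]

center = {
    "mean": statistics.mean(scores),
    "median": statistics.median(scores),
    "mode": statistics.mode(scores),
}

q1, _, q3 = statistics.quantiles(scores, n=4)   # quartiles (Python 3.8+)
spread = {
    "range": max(scores) - min(scores),
    "sd": statistics.stdev(scores),             # reported alongside the mean
    "iqr": q3 - q1,                             # reported alongside the median
}
```

Note that `statistics.quantiles` defaults to the "exclusive" method; other quartile conventions (and other software) can give slightly different IQR values for small samples.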
Methodological reporting of randomized trials in five leading Chinese nursing journals.
Shi, Chunhu; Tian, Jinhui; Ren, Dan; Wei, Hongli; Zhang, Lihuan; Wang, Quan; Yang, Kehu
2014-01-01
Randomized controlled trials (RCTs) are not always well reported, especially in terms of their methodological descriptions. This study aimed to investigate the adherence of methodological reporting complying with CONSORT and explore associated trial level variables in the Chinese nursing care field. In June 2012, we identified RCTs published in five leading Chinese nursing journals and included trials with details of randomized methods. The quality of methodological reporting was measured through the methods section of the CONSORT checklist and the overall CONSORT methodological items score was calculated and expressed as a percentage. Meanwhile, we hypothesized that some general and methodological characteristics were associated with reporting quality and conducted a regression with these data to explore the correlation. The descriptive and regression statistics were calculated via SPSS 13.0. In total, 680 RCTs were included. The overall CONSORT methodological items score was 6.34 ± 0.97 (Mean ± SD). No RCT reported descriptions and changes in "trial design," changes in "outcomes" and "implementation," or descriptions of the similarity of interventions for "blinding." Poor reporting was found in detailing the "settings of participants" (13.1%), "type of randomization sequence generation" (1.8%), calculation methods of "sample size" (0.4%), explanation of any interim analyses and stopping guidelines for "sample size" (0.3%), "allocation concealment mechanism" (0.3%), additional analyses in "statistical methods" (2.1%), and targeted subjects and methods of "blinding" (5.9%). More than 50% of trials described randomization sequence generation, the eligibility criteria of "participants," "interventions," and definitions of the "outcomes" and "statistical methods." The regression analysis found that publication year and ITT analysis were weakly associated with CONSORT score. 
The completeness of methodological reporting of RCTs in the Chinese nursing care field is poor, especially with regard to the reporting of trial design, changes in outcomes, sample size calculation, allocation concealment, blinding, and statistical methods.
"Magnitude-based inference": a statistical review.
Welsh, Alan H; Knight, Emma J
2015-04-01
We consider "magnitude-based inference" and its interpretation by examining in detail its use in the problem of comparing two means. We extract from the spreadsheets, which are provided to users of the analysis (http://www.sportsci.org/), a precise description of how "magnitude-based inference" is implemented. We compare the implemented version of the method with general descriptions of it and interpret the method in familiar statistical terms. We show that "magnitude-based inference" is not a progressive improvement on modern statistics. The additional probabilities introduced are not directly related to the confidence interval but, rather, are interpretable either as P values for two different nonstandard tests (for different null hypotheses) or as approximate Bayesian calculations, which also lead to a type of test. We also discuss sample size calculations associated with "magnitude-based inference" and show that the substantial reduction in sample sizes claimed for the method (30% of the sample size obtained from standard frequentist calculations) is not justifiable so the sample size calculations should not be used. Rather than using "magnitude-based inference," a better solution is to be realistic about the limitations of the data and use either confidence intervals or a fully Bayesian analysis.
NASA Astrophysics Data System (ADS)
Mumpower, M. R.; Kawano, T.; Ullmann, J. L.; Krtička, M.; Sprouse, T. M.
2017-08-01
Radiative neutron capture is an important nuclear reaction whose accurate description is needed for many applications ranging from nuclear technology to nuclear astrophysics. The description of such a process relies on the Hauser-Feshbach theory which requires the nuclear optical potential, level density, and γ -strength function as model inputs. It has recently been suggested that the M 1 scissors mode may explain discrepancies between theoretical calculations and evaluated data. We explore statistical model calculations with the strength of the M 1 scissors mode estimated to be dependent on the nuclear deformation of the compound system. We show that the form of the M 1 scissors mode improves the theoretical description of evaluated data and the match to experiment in both the fission product and actinide regions. Since the scissors mode occurs in the range of a few keV to a few MeV, it may also impact the neutron capture cross sections of neutron-rich nuclei that participate in the rapid neutron capture process of nucleosynthesis. We comment on the possible impact to nucleosynthesis by evaluating neutron capture rates for neutron-rich nuclei with the M 1 scissors mode active.
WASP (Write a Scientific Paper) using Excel - 7: The t-distribution.
Grech, Victor
2018-03-01
The calculation of descriptive statistics after data collection provides researchers with an overview of the shape and nature of their datasets, along with basic descriptors, and may help identify true or incorrect outlier values. This exercise should always precede inferential statistics, when possible. This paper provides some pointers for doing so in Microsoft Excel, both statically and dynamically, with Excel's functions, including the calculation of standard deviation and variance and the relevance of the t-distribution. Copyright © 2018 Elsevier B.V. All rights reserved.
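The relevance of the t-distribution mentioned here is that it widens the confidence interval for small samples; a minimal sketch with hypothetical data:

```python
import math
import statistics
from scipy import stats

def t_ci(data, conf=0.95):
    """t-based confidence interval for a mean; appropriate for small samples."""
    n = len(data)
    mean = statistics.mean(data)
    se = statistics.stdev(data) / math.sqrt(n)
    t_crit = stats.t.ppf(0.5 + conf / 2, df=n - 1)   # Excel: T.INV.2T(0.05, n-1)
    return mean - t_crit * se, mean + t_crit * se

lo, hi = t_ci([5, 6, 7, 8, 9])
```

For n = 5 the critical value is about 2.78 rather than the normal 1.96, so the interval is noticeably wider; the two converge as n grows.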
Nonlinear Curve-Fitting Program
NASA Technical Reports Server (NTRS)
Everhart, Joel L.; Badavi, Forooz F.
1989-01-01
Nonlinear optimization algorithm helps in finding best-fit curve. Nonlinear Curve Fitting Program, NLINEAR, is an interactive curve-fitting routine based on a description of the quadratic expansion of the χ² statistic. Utilizes a nonlinear optimization algorithm to calculate the best statistically weighted values of the parameters of the fitting function at which χ² is minimized. Provides user with such statistical information as goodness of fit and estimated values of parameters producing highest degree of correlation between experimental data and mathematical model. Written in FORTRAN 77.
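The routine's approach, minimizing a statistically weighted χ² to obtain best-fit parameters and their uncertainties, can be sketched with SciPy; the exponential model, data, and uncertainties below are hypothetical stand-ins, not NLINEAR itself:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    """Hypothetical two-parameter fitting function."""
    return a * np.exp(-b * x)

# Synthetic 'experimental' data with known per-point uncertainty
x = np.linspace(0.0, 4.0, 20)
rng = np.random.default_rng(0)
y = model(x, 2.5, 1.3) + rng.normal(0.0, 0.05, x.size)
sigma = np.full(x.size, 0.05)

# curve_fit minimizes the weighted sum of squared residuals (the chi-square)
popt, pcov = curve_fit(model, x, y, p0=(1.0, 1.0), sigma=sigma, absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties
```

The covariance matrix returned alongside the parameters plays the role of NLINEAR's goodness-of-fit and parameter-uncertainty output.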
Descriptive and inferential statistical methods used in burns research.
Al-Benna, Sammy; Al-Ajam, Yazan; Way, Benjamin; Steinstraesser, Lars
2010-05-01
Burns research articles utilise a variety of descriptive and inferential methods to present and analyse data. The aim of this study was to determine the descriptive methods (e.g. mean, median, SD, range, etc.) and survey the use of inferential methods (statistical tests) used in articles in the journal Burns. This study defined its population as all original articles published in the journal Burns in 2007. Letters to the editor, brief reports, reviews, and case reports were excluded. Study characteristics, use of descriptive statistics and the number and types of statistical methods employed were evaluated. Of the 51 articles analysed, 11 (22%) were randomised controlled trials, 18 (35%) were cohort studies, 11 (22%) were case control studies and 11 (22%) were case series. The study design and objectives were defined in all articles. All articles made use of continuous and descriptive data. Inferential statistics were used in 49 (96%) articles. Data dispersion was calculated by standard deviation in 30 (59%). Standard error of the mean was quoted in 19 (37%). The statistical software product was named in 33 (65%). Of the 49 articles that used inferential statistics, the tests were named in 47 (96%). The 6 most common tests used (Student's t-test (53%), analysis of variance/co-variance (33%), χ² test (27%), Wilcoxon and Mann-Whitney tests (22%), Fisher's exact test (12%)) accounted for the majority (72%) of statistical methods employed. A specified significance level was named in 43 (88%) and the exact significance levels were reported in 28 (57%). Descriptive analysis and basic statistical techniques account for most of the statistical tests reported. This information should prove useful in deciding which tests should be emphasised in educating burn care professionals. These results highlight the need for burn care professionals to have a sound understanding of basic statistics, which is crucial in interpreting and reporting data. 
Advice should be sought from professionals in the fields of biostatistics and epidemiology when using more advanced statistical techniques. Copyright 2009 Elsevier Ltd and ISBI. All rights reserved.
Tools for Basic Statistical Analysis
NASA Technical Reports Server (NTRS)
Luz, Paul L.
2005-01-01
Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3 . . . ) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will do an ANOVA to check its significance.
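The Normal Distribution Estimates calculation described above maps a cumulative probability back to a value given a mean and standard deviation; this is the inverse normal CDF. A minimal sketch with hypothetical numbers:

```python
from scipy.stats import norm

def normal_estimate(p, mean, sd):
    """Value whose cumulative probability is p under Normal(mean, sd^2)."""
    return norm.ppf(p, loc=mean, scale=sd)

x95 = normal_estimate(0.95, mean=100.0, sd=15.0)   # Excel: NORM.INV(0.95, 100, 15)
```

The 95th-percentile value here is mean + 1.645 × SD, about 124.67 for these inputs.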
Wegner, S.J.
1989-01-01
Multiple water samples from 115 wells and 3 surface water sites were collected between 1980 and 1988 for the ongoing quality assurance program at the Idaho National Engineering Laboratory. The reported results from the six laboratories involved were analyzed for agreement using descriptive statistics. The constituents and properties included: tritium, plutonium-238, plutonium-239, -240 (undivided), strontium-90, americium-241, cesium-137, total dissolved chromium, selected dissolved trace metals, sodium, chloride, nitrate, selected purgeable organic compounds, and specific conductance. Agreement could not be calculated for purgeable organic compounds, trace metals, some nitrates and blank sample analyses because analytical uncertainties were not consistently reported. However, differences between results for most of these data were calculated. The blank samples were not analyzed for differences. The laboratory results analyzed using descriptive statistics showed a median agreement between all useable data pairs of 95%. (USGS)
Probabilistic Meteorological Characterization for Turbine Loads
NASA Astrophysics Data System (ADS)
Kelly, M.; Larsen, G.; Dimitrov, N. K.; Natarajan, A.
2014-06-01
Beyond the existing, limited IEC prescription to describe fatigue loads on wind turbines, we look towards probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface layer. Based on both data from multiple sites as well as theoretical bases from boundary-layer meteorology and atmospheric turbulence, we offer probabilistic descriptions of shear and turbulence intensity, elucidating the connection of each to the other as well as to atmospheric stability and terrain. These are used as input to loads calculation, and with a statistical loads output description, they allow for improved design and loads calculations.
NASA Astrophysics Data System (ADS)
Sergeenko, N. P.
2017-11-01
An adequate statistical method should be developed in order to probabilistically predict the range of ionospheric parameters. This problem is solved in this paper. Time series of the critical frequency of the F2 layer, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order are calculated. The analysis shows that the distributions differ from the Gaussian law during disturbances. At levels of sufficiently small probability, the distributions show arbitrarily large deviations from the model of a normal process. Therefore, an attempt is made to describe the statistical samples {δfoF2} with a Poisson model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability-density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the obtained model distribution function. According to Kolmogorov's criterion, the probabilities of coincidence of the a posteriori distributions with the theoretical ones are P ≈ 0.7-0.9. The analysis supports the applicability of a model based on a Poisson random process for the statistical description of the variations {δfoF2} and for probabilistic estimates of their range during heliogeophysical disturbances.
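The comparison of empirical and model distributions via Kolmogorov's criterion corresponds to a one-sample Kolmogorov-Smirnov test; a sketch with synthetic data standing in for a {δfoF2} sample (the Gaussian model here is only a placeholder for whatever candidate distribution is under test):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample = rng.normal(0.0, 1.0, 500)   # synthetic stand-in for a {δfoF2} sample

# One-sample KS test of the sample against a candidate model CDF;
# a large p-value means the model distribution is not rejected.
stat, p = stats.kstest(sample, stats.norm(loc=0.0, scale=1.0).cdf)
```

The test statistic is the maximum distance between the empirical and model CDFs, which is exactly the quantity Kolmogorov's criterion bounds.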
Densely calculated facial soft tissue thickness for craniofacial reconstruction in Chinese adults.
Shui, Wuyang; Zhou, Mingquan; Deng, Qingqiong; Wu, Zhongke; Ji, Yuan; Li, Kang; He, Taiping; Jiang, Haiyan
2016-09-01
Craniofacial reconstruction (CFR) is used to recreate a likeness of original facial appearance for an unidentified skull; this technique has been applied in both forensics and archeology. Many CFR techniques rely on the average facial soft tissue thickness (FSTT) of anatomical landmarks, related to ethnicity, age, sex, body mass index (BMI), etc. Previous studies typically employed FSTT at sparsely distributed anatomical landmarks, where different landmark definitions may affect the contrasting results. In the present study, a total of 90,198 one-to-one correspondence skull vertices are established on 171 head CT-scans and the FSTT of each corresponding vertex is calculated (hereafter referred to as densely calculated FSTT) for statistical analysis and CFR. Basic descriptive statistics (i.e., mean and standard deviation) for densely calculated FSTT are reported separately according to sex and age. Results show that 76.12% of overall vertices indicate that the FSTT is greater in males than females, with the exception of vertices around the zygoma, zygomatic arch and mid-lateral orbit. These sex-related significant differences are found at 55.12% of all vertices and the statistically age-related significant differences are depicted between the three age groups at a majority of all vertices (73.31% for males and 63.43% for females). Five non-overlapping categories are given and the descriptive statistics (i.e., mean, standard deviation, local standard deviation and percentage) are reported. Multiple appearances are produced using the densely calculated FSTT of various age and sex groups, and a quantitative assessment is provided to examine how relevant the choice of FSTT is to increasing the accuracy of CFR. In conclusion, this study provides a new perspective in understanding the distribution of FSTT and the construction of a new densely calculated FSTT database for craniofacial reconstruction. Copyright © 2016. Published by Elsevier Ireland Ltd.
Self-consistent mean-field approach to the statistical level density in spherical nuclei
NASA Astrophysics Data System (ADS)
Kolomietz, V. M.; Sanzhur, A. I.; Shlomo, S.
2018-06-01
A self-consistent mean-field approach within the extended Thomas-Fermi approximation with Skyrme forces is applied to the calculations of the statistical level density in spherical nuclei. Landau's concept of quasiparticles with the nucleon effective mass and the correct description of the continuum states for the finite-depth potentials are taken into consideration. The A dependence and the temperature dependence of the statistical inverse level-density parameter K are obtained in good agreement with experimental data.
Gorobets, Yu I; Gorobets, O Yu
2015-01-01
A statistical model is proposed in this paper for the description of the orientation of trajectories of unicellular diamagnetic organisms in a magnetic field. A statistical parameter, the effective energy, is calculated on the basis of this model. The resulting effective energy is a statistical characteristic of the trajectories of diamagnetic microorganisms in a magnetic field connected with their metabolism. The statistical model is applicable when the energy of the thermal motion of the bacteria is negligible in comparison with their energy in a magnetic field and the bacteria manifest significant "active random movement", i.e., randomizing motion of a nonthermal nature, for example, movement by means of flagella. The energy of this randomizing active self-motion is characterized by a new statistical parameter for biological objects, which replaces the energy of randomizing thermal motion in the calculation of the statistical distribution. Copyright © 2014 Elsevier Ltd. All rights reserved.
New statistical scission-point model to predict fission fragment observables
NASA Astrophysics Data System (ADS)
Lemaître, Jean-François; Panebianco, Stefano; Sida, Jean-Luc; Hilaire, Stéphane; Heinrich, Sophie
2015-09-01
The development of high performance computing facilities makes possible a massive production of nuclear data in a full microscopic framework. Taking advantage of the individual potential calculations of more than 7000 nuclei, a new statistical scission-point model, called SPY, has been developed. It gives access to the absolute available energy at the scission point, which allows the use of a parameter-free microcanonical statistical description to calculate the distributions and the mean values of all fission observables. SPY uses the richness of microscopy in a rather simple theoretical framework, without any parameter except the scission-point definition, to draw clear answers based on perfect knowledge of the ingredients involved in the model, with very limited computing cost.
NASA Technical Reports Server (NTRS)
Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.
1990-01-01
This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS-OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. Included in the manual is the description of the calculation of the shape and scale parameters of the two-parameter Weibull distribution using least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics with complete and censored samples. The methods for detecting outliers and for calculating the Kolmogorov-Smirnov and the Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, as well as the techniques for calculating the Batdorf flaw-density constants, are also described.
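The maximum-likelihood estimation of the two-parameter Weibull shape and scale parameters described above can be sketched with SciPy; the simulated fracture strengths below are hypothetical, and this covers only the complete (uncensored) sample case:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Simulated fracture strengths from a Weibull with shape m = 10, scale 300 MPa
strengths = stats.weibull_min.rvs(10.0, scale=300.0, size=200, random_state=rng)

# Maximum-likelihood fit of the two-parameter Weibull (location fixed at zero)
shape, loc, scale = stats.weibull_min.fit(strengths, floc=0)
```

Fixing the location at zero via `floc=0` is what makes this the two-parameter form; the fitted shape is the Weibull modulus m and the scale is the characteristic strength.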
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro
2012-06-01
We developed a quantum-like model describing the gene regulation of glucose/lactose metabolism in a bacterium, Escherichia coli. Our quantum-like model can be considered as a kind of the operational formalism for microbiology and genetics. Instead of trying to describe processes in a cell in the very detail, we propose a formal operator description. Such a description may be very useful in situation in which the detailed description of processes is impossible or extremely complicated. We analyze statistical data obtained from experiments, and we compute the degree of E. coli's preference within adaptive dynamics. It is known that there are several types of E. coli characterized by the metabolic system. We demonstrate that the same type of E. coli can be described by the well determined operators; we find invariant operator quantities characterizing each type. Such invariant quantities can be calculated from the obtained statistical data.
Moments of inclination error distribution computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1981-01-01
A FORTRAN coded computer program is described which calculates orbital inclination error statistics using a closed-form solution. This solution uses a data base of trajectory errors from actual flights to predict the orbital inclination error statistics. The Scott flight history data base consists of orbit insertion errors in the trajectory parameters - altitude, velocity, flight path angle, flight azimuth, latitude and longitude. The methods used to generate the error statistics are of general interest since they have other applications. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included.
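The error statistics such a program derives from a flight-history data base amount to distribution moments; a sketch computing the first four (mean, variance, skewness, excess kurtosis) for a hypothetical error sample:

```python
import numpy as np
from scipy import stats

# Hypothetical orbit-insertion errors (units arbitrary)
errors = np.array([-0.12, 0.05, -0.03, 0.20, -0.08, 0.11, -0.15, 0.02])

moments = {
    "mean": errors.mean(),
    "variance": errors.var(ddof=1),       # unbiased sample variance
    "skewness": stats.skew(errors),       # third standardized moment
    "kurtosis": stats.kurtosis(errors),   # excess (Fisher) kurtosis
}
```

The closed-form propagation the program performs is a separate step; these moments are just the descriptive summary of the underlying error data base.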
Effects of Alzheimer’s Disease in the Prediagnosis Period on Financial Outcomes
2017-10-01
merged data; derived key dependent and independent variables and calculated descriptive statistics; and performed initial analyses of the effect of AD on...during the period before it is diagnosable on financial outcomes differ depending on whether the financial head of household is afflicted or the spouse
The Relationship between Attendance Policies and Student Grades
ERIC Educational Resources Information Center
Aaron, Michael D.
2012-01-01
The relationship between attendance policies and student grades in college courses was investigated. Specifically, a calculated grade point average was determined for all academic classes taught at Shelton State Community College between 2000 and 2008. These grade point averages were compared descriptively and statistically in an effort to…
NASA Technical Reports Server (NTRS)
Krajewski, Witold F.; Rexroth, David T.; Kiriaki, Kiriakie
1991-01-01
Two problems related to radar rainfall estimation are described. The first part is a description of a preliminary data analysis for the purpose of statistical estimation of rainfall from multiple (radar and raingage) sensors. Raingage, radar, and joint radar-raingage estimation are described, and some results are given. Statistical parameters of rainfall spatial dependence are calculated and discussed in the context of optimal estimation. Quality control of radar data is also described. The second part describes radar scattering by ellipsoidal raindrops. An analytical solution is derived for the Rayleigh scattering regime. Single and volume scattering are presented. Comparison calculations with the known results for spheres and oblate spheroids are shown.
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2002-01-01
The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type 1 censoring. The software was verified by reproducing results published by others.
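The censored-likelihood idea used by such software can be sketched as follows: failed units contribute the Weibull density, while run-outs (type 1 censoring) contribute the survivor function. The crude grid-refinement optimizer below is only an illustration under those assumptions, not the software's algorithm, and all names are ours.

```python
import math

def censored_weibull_loglik(shape, scale, failures, censored):
    """Log-likelihood of a two-parameter Weibull with type 1 censoring.
    failures: observed failure times; censored: suspension (run-out) times."""
    ll = 0.0
    for t in failures:
        # log of the Weibull density at a failure time
        ll += math.log(shape / scale) + (shape - 1) * math.log(t / scale) \
              - (t / scale) ** shape
    for t in censored:
        # log survivor function ln S(t) = -(t/scale)**shape
        ll -= (t / scale) ** shape
    return ll

def fit_censored_weibull(failures, censored, iters=15):
    """Crude maximum-likelihood fit by iterative grid refinement."""
    k_lo, k_hi = 0.1, 20.0
    s_lo, s_hi = min(failures) / 2, max(failures + censored) * 3
    best = None
    for _ in range(iters):
        ks = [k_lo + i * (k_hi - k_lo) / 10 for i in range(11)]
        ss = [s_lo + i * (s_hi - s_lo) / 10 for i in range(11)]
        best = max(((censored_weibull_loglik(k, s, failures, censored), k, s)
                    for k in ks for s in ss), key=lambda t: t[0])
        _, kb, sb = best
        dk, ds = (k_hi - k_lo) / 10, (s_hi - s_lo) / 10
        k_lo, k_hi = max(0.05, kb - dk), kb + dk
        s_lo, s_hi = max(1e-6, sb - ds), sb + ds
    return best[1], best[2]
```

Because the censored log-likelihood is smooth and unimodal in (shape, scale), repeatedly shrinking an 11x11 grid around the current maximum converges to the ML estimates; a production code would instead use a proper root-finder on the likelihood equations.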
ERIC Educational Resources Information Center
Dinkel, Danae M.; Lee, Jung-Min; Schaffer, Connie
2016-01-01
This study examined teachers' zone of proximal development for classroom physical activity breaks by assessing teachers' knowledge and capacity for implementing classroom physical activity breaks. Five school districts of various sizes (n = 346 teachers) took part in a short online survey. Descriptive statistics were calculated and chi-square…
2012-01-01
EMG studies). Data Management and Analysis: Descriptive statistics for subject demographics and nerve conduction study variables were calculated using… [fragment of a subject-characteristics table on history and occupation omitted]
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.
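As a companion to the article's pointers, the standard error of the mean and a normal-approximation confidence interval can be computed in a few lines. This is a sketch assuming a reasonably large simple random sample, where z = 1.96 gives approximately 95% coverage.

```python
import math
import statistics

def mean_confidence_interval(sample, z=1.96):
    """Point estimate, standard error, and z-based confidence interval
    for a population mean, from a simple random sample."""
    n = len(sample)
    mean = statistics.fmean(sample)
    # standard error of the mean: sample standard deviation / sqrt(n)
    se = statistics.stdev(sample) / math.sqrt(n)
    return mean, se, (mean - z * se, mean + z * se)
```

For small samples one would replace the fixed z with a t critical value on n - 1 degrees of freedom.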
[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. A total of 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors increasingly make use of statistical experts' advice and statistical software. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
Fotina, I; Lütgendorf-Caucig, C; Stock, M; Pötter, R; Georg, D
2012-02-01
Inter-observer studies represent a valid method for the evaluation of target definition uncertainties and contouring guidelines. However, data from the literature do not yet give clear guidelines for reporting contouring variability. Thus, the purpose of this work was to compare and discuss various methods to determine variability on the basis of clinical cases and a literature review. In this study, 7 prostate and 8 lung cases were contoured on CT images by 8 experienced observers. Analysis of variability included descriptive statistics, calculation of overlap measures, and statistical measures of agreement. Cross tables with ratios and correlations were established for overlap parameters. It was shown that the minimal set of parameters to be reported should include at least one of three volume overlap measures (i.e., generalized conformity index, Jaccard coefficient, or conformation number). High correlation between these parameters and scatter of the results was observed. A combination of descriptive statistics, overlap measure, and statistical measure of agreement or reliability analysis is required to fully report the interrater variability in delineation.
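The volume overlap measures named above can be computed directly from contours stored as sets of voxel indices. The generalized conformity index below follows the common pairwise definition (sum of pairwise intersections over sum of pairwise unions); whether this matches the exact variant used in the study is an assumption, and the function names are ours.

```python
def jaccard(a, b):
    """Jaccard coefficient of two contours given as collections of voxel
    indices: |A intersect B| / |A union B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def generalized_conformity_index(contours):
    """Generalized conformity index over N observers' contours (sets of
    voxels): sum of pairwise intersections over sum of pairwise unions."""
    pairs = [(set(x), set(y)) for i, x in enumerate(contours)
             for y in contours[i + 1:]]
    inter = sum(len(x & y) for x, y in pairs)
    union = sum(len(x | y) for x, y in pairs)
    return inter / union
```

Both indices run from 0 (no overlap) to 1 (identical delineations), which is what makes them convenient single-number summaries of interrater variability.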
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bochicchio, Davide; Panizon, Emanuele; Ferrando, Riccardo
2015-10-14
We compare the performance of two well-established computational algorithms for the calculation of free-energy landscapes of biomolecular systems, umbrella sampling and metadynamics. We look at benchmark systems composed of polyethylene and polypropylene oligomers interacting with lipid (phosphatidylcholine) membranes, aiming at the calculation of the oligomer water-membrane free energy of transfer. We model our test systems at two different levels of description, united-atom and coarse-grained. We provide optimized parameters for the two methods at both resolutions. We devote special attention to the analysis of statistical errors in the two different methods and propose a general procedure for the error estimation in metadynamics simulations. Metadynamics and umbrella sampling yield the same estimates for the water-membrane free energy profile, but metadynamics can be more efficient, providing lower statistical uncertainties within the same simulation time.
Ho, Andrew D; Yu, Carol C
2015-06-01
Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micceri similarly showed that the normality assumption is met rarely in educational and psychological practice. In this article, the authors extend these previous analyses to state-level educational test score distributions that are an increasingly common target of high-stakes analysis and interpretation. Among 504 scale-score and raw-score distributions from state testing programs from recent years, nonnormal distributions are common and are often associated with particular state programs. The authors explain how scaling procedures from item response theory lead to nonnormal distributions as well as unusual patterns of discreteness. The authors recommend that distributional descriptive statistics be calculated routinely to inform model selection for large-scale test score data, and they illustrate consequences of nonnormality using sensitivity studies that compare baseline results to those from normalized score scales.
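The routine distributional descriptive statistics recommended above reduce, at a minimum, to sample skewness and excess kurtosis, both of which are 0 for a normal distribution. A minimal moment-based sketch (the function name is ours):

```python
def skewness_kurtosis(scores):
    """Sample skewness and excess kurtosis from central moments; both are
    approximately 0 when the score distribution is close to normal."""
    n = len(scores)
    mean = sum(scores) / n
    m2 = sum((x - mean) ** 2 for x in scores) / n
    m3 = sum((x - mean) ** 3 for x in scores) / n
    m4 = sum((x - mean) ** 4 for x in scores) / n
    skew = m3 / m2 ** 1.5
    excess_kurt = m4 / m2 ** 2 - 3.0  # subtract 3 so the normal scores 0
    return skew, excess_kurt
```

Large absolute values of either statistic are a quick screen suggesting that normal-theory models may be a poor fit for a given score distribution.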
Automated objective characterization of visual field defects in 3D
NASA Technical Reports Server (NTRS)
Fink, Wolfgang (Inventor)
2006-01-01
A method and apparatus for electronically performing a visual field test for a patient. A visual field test pattern is displayed to the patient on an electronic display device and the patient's responses to the visual field test pattern are recorded. A visual field representation is generated from the patient's responses. The visual field representation is then used as an input into a variety of automated diagnostic processes. In one process, the visual field representation is used to generate a statistical description of the rapidity of change of a patient's visual field at the boundary of a visual field defect. In another process, the area of a visual field defect is calculated using the visual field representation. In another process, the visual field representation is used to generate a statistical description of the volume of a patient's visual field defect.
A statistical method to estimate low-energy hadronic cross sections
NASA Astrophysics Data System (ADS)
Balassa, Gábor; Kovács, Péter; Wolf, György
2018-02-01
In this article we propose a model based on the Statistical Bootstrap approach to estimate the cross sections of different hadronic reactions up to a few GeV in c.m.s. energy. The method is based on the idea that, when two particles collide, a so-called fireball is formed, which after a short time decays statistically into a specific final state. To calculate the probabilities we use a phase-space description extended with quark combinatorial factors and the possibility of more than one fireball being formed. In a few simple cases the probability of a specific final state can be calculated analytically, and we show that the model is able to reproduce the ratios of the considered cross sections. We also show that the model is able to describe proton-antiproton annihilation at rest. In the latter case we used a numerical method to calculate the more complicated final-state probabilities. Additionally, we examined the formation of strange and charmed mesons as well, where we used existing data to fit the relevant model parameters.
Non-Equilibrium Properties from Equilibrium Free Energy Calculations
NASA Technical Reports Server (NTRS)
Pohorille, Andrew; Wilson, Michael A.
2012-01-01
Calculating free energy in computer simulations is of central importance in the statistical mechanics of condensed media and its applications to chemistry and biology, not only because it is the most comprehensive and informative quantity that characterizes the equilibrium state, but also because it often provides an efficient route to access dynamic and kinetic properties of a system. Most applications of equilibrium free energy calculations to non-equilibrium processes rely on a description in which a molecule or an ion diffuses in the potential of mean force. In the general case this description is a simplification, but it might be satisfactorily accurate in many instances of practical interest. This hypothesis has been tested in the example of the electrodiffusion equation. The conductance of model ion channels has been calculated directly by counting the number of ion-crossing events observed during long molecular dynamics simulations and has been compared with the conductance obtained from solving the generalized Nernst-Planck equation. It has been shown that under relatively modest conditions the agreement between these two approaches is excellent, thus demonstrating that the assumptions underlying the diffusion equation are fulfilled. Under these conditions the electrodiffusion equation provides an efficient approach to calculating the full voltage-current dependence routinely measured in electrophysiological experiments.
NASA Astrophysics Data System (ADS)
Fan, Daidu; Tu, Junbiao; Cai, Guofu; Shang, Shuai
2015-06-01
Grain-size analysis is a basic routine in sedimentology and related fields, but diverse methods of sample collection, processing and statistical analysis often make direct comparisons and interpretations difficult or even impossible. In this paper, 586 published grain-size datasets from the Qiantang Estuary (East China Sea) sampled and analyzed by the same procedures were merged and their textural parameters calculated by a percentile and two moment methods. The aim was to explore which of the statistical procedures performed best in the discrimination of three distinct sedimentary units on the tidal flats of the middle Qiantang Estuary. A Gaussian curve-fitting method served to simulate mixtures of two normal populations having different modal sizes, sorting values and size distributions, enabling a better understanding of the impact of finer tail components on textural parameters, as well as the proposal of a unifying descriptive nomenclature. The results show that percentile and moment procedures yield almost identical results for mean grain size, and that sorting values are also highly correlated. However, more complex relationships exist between percentile and moment skewness (kurtosis), changing from positive to negative correlations when the proportions of the finer populations decrease below 35% (10%). This change results from the overweighting of tail components in moment statistics, which stands in sharp contrast to the underweighting or complete amputation of small tail components by the percentile procedure. Intercomparisons of bivariate plots suggest an advantage of the Friedman & Johnson moment procedure over the McManus moment method in terms of the description of grain-size distributions, and over the percentile method by virtue of a greater sensitivity to small variations in tail components. 
The textural parameter scalings of Folk & Ward were translated into their Friedman & Johnson moment counterparts by application of mathematical functions derived by regression analysis of measured and modeled grain-size data, or by determining the abscissa values of intersections between auxiliary lines running parallel to the x-axis and vertical lines corresponding to the descriptive percentile limits along the ordinate of representative bivariate plots. Twofold limits were extrapolated for the moment statistics in relation to single descriptive terms in the cases of skewness and kurtosis by considering both positive and negative correlations between percentile and moment statistics. The extrapolated descriptive scalings were further validated by examining entire size-frequency distributions simulated by mixing two normal populations of designated modal size and sorting values, but varying in mixing ratios. These were found to match well in most of the proposed scalings, although platykurtic and very platykurtic categories were questionable when the proportion of the finer population was below 5%. Irrespective of the statistical procedure, descriptive nomenclatures should therefore be cautiously used when tail components contribute less than 5% to grain-size distributions.
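The Folk & Ward percentile statistics discussed above follow standard graphic formulas, e.g. mean = (phi16 + phi50 + phi84)/3. The sketch below, built on a simple linear-interpolation percentile, is illustrative and may differ in detail from the procedures used in the study; the function names are ours.

```python
def percentile(sorted_phi, p):
    """Linear-interpolation percentile of pre-sorted phi values (0 < p < 100)."""
    idx = (len(sorted_phi) - 1) * p / 100.0
    lo = int(idx)
    frac = idx - lo
    hi = min(lo + 1, len(sorted_phi) - 1)
    return sorted_phi[lo] * (1 - frac) + sorted_phi[hi] * frac

def folk_ward(phi_values):
    """Folk & Ward graphic (percentile) statistics of a grain-size sample
    expressed in phi units: mean, sorting, skewness, kurtosis."""
    s = sorted(phi_values)
    p = {q: percentile(s, q) for q in (5, 16, 25, 50, 75, 84, 95)}
    mean = (p[16] + p[50] + p[84]) / 3
    sorting = (p[84] - p[16]) / 4 + (p[95] - p[5]) / 6.6
    skew = ((p[16] + p[84] - 2 * p[50]) / (2 * (p[84] - p[16]))
            + (p[5] + p[95] - 2 * p[50]) / (2 * (p[95] - p[5])))
    kurt = (p[95] - p[5]) / (2.44 * (p[75] - p[25]))
    return mean, sorting, skew, kurt
```

Because these formulas use only seven percentiles, tail components contributing small proportions of the distribution barely move the results, which is exactly the underweighting of tails relative to moment statistics that the abstract describes.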
Nursing education: contradictions and challenges of pedagogical practice.
Pinto, Joelma Batista Tebaldi; Pepe, Alda Muniz
2007-01-01
This study deals with the nursing curriculum, pedagogical practice and education. Nowadays, this theme has taken up considerable space in academic debates. Thus, this study aimed to gain empirical knowledge and provide an analytical description of the academic reality of nursing education in the undergraduate nursing course at Santa Cruz State University. This is a descriptive study, which may provide a new view of the problem through careful observation, description, and exploration of aspects of the situation, interpreting the reality without interfering in it and, consequently, remaining open to new studies. Descriptive statistics with simple frequency and percentage calculations were applied. In summary, the results indicate that professors and students have difficulties evaluating the curriculum. In addition, the curriculum under study is characterized as a collection curriculum, with a pedagogical practice predominantly directed at the traditional model. Hence, nursing education still shows features of the biomedical-technical model.
OCT Amplitude and Speckle Statistics of Discrete Random Media.
Almasian, Mitra; van Leeuwen, Ton G; Faber, Dirk J
2017-11-01
Speckle, amplitude fluctuations in optical coherence tomography (OCT) images, contains information on sub-resolution structural properties of the imaged sample. Speckle statistics could therefore be utilized in the characterization of biological tissues. However, a rigorous theoretical framework relating OCT speckle statistics to structural tissue properties has yet to be developed. As a first step, we present a theoretical description of OCT speckle, relating the OCT amplitude variance to size and organization for samples of discrete random media (DRM). Starting the calculations from the size and organization of the scattering particles, we analytically find expressions for the OCT amplitude mean, amplitude variance, the backscattering coefficient and the scattering coefficient. We assume fully developed speckle and verify the validity of this assumption by experiments on controlled samples of silica microspheres suspended in water. We show that the OCT amplitude variance is sensitive to sub-resolution changes in size and organization of the scattering particles. Experimentally determined and theoretically calculated optical properties are compared and in good agreement.
Ion Channel Conductance Measurements on a Silicon-Based Platform
2006-01-01
calculated using the molecular dynamics code GROMACS. Reasonable agreement is obtained in the simulated versus measured conductance over the range of...measurements of the lipid giga-seal characteristics have been performed, including AC conductance measurements and statistical analysis in order to...Dynamics kernel self-consistently coupled to Poisson equations using a P3M force field scheme and the GROMACS description of protein structure and
Quantal diffusion description of multinucleon transfers in heavy-ion collisions
NASA Astrophysics Data System (ADS)
Ayik, S.; Yilmaz, B.; Yilmaz, O.; Umar, A. S.
2018-05-01
Employing the stochastic mean-field (SMF) approach, we develop a quantal diffusion description of multinucleon transfer in heavy-ion collisions at finite impact parameters. The quantal transport coefficients are determined by the occupied single-particle wave functions of the time-dependent Hartree-Fock equations. As a result, the primary fragment mass and charge distribution functions are determined entirely in terms of the mean-field properties. This powerful description does not involve any adjustable parameter, includes the effects of shell structure, and is consistent with the fluctuation-dissipation theorem of nonequilibrium statistical mechanics. As a first application of the approach, we analyze the fragment mass distribution in 48Ca + 238U collisions at the center-of-mass energy Ec.m. = 193 MeV and compare the calculations with the experimental data.
Yang, Hyeri; Na, Jihye; Jang, Won-Hee; Jung, Mi-Sook; Jeon, Jun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Lim, Kyung-Min; Bae, SeungJin
2015-05-05
The mouse local lymph node assay (LLNA, OECD TG429) is an alternative test replacing conventional guinea pig tests (OECD TG406) for skin sensitization testing, but its use of a radioisotopic agent, (3)H-thymidine, deters its active dissemination. A new non-radioisotopic LLNA, LLNA:BrdU-FCM, employs a non-radioisotopic analog, 5-bromo-2'-deoxyuridine (BrdU), and flow cytometry. For an analogous method, the OECD TG429 performance standard (PS) advises that two reference compounds be tested repeatedly and that the ECt (threshold) values obtained must fall within acceptable ranges to prove within- and between-laboratory reproducibility. However, these criteria are somewhat arbitrary, and the sample size for ECt is less than 5, raising concerns about insufficient reliability. Here, we explored various statistical methods to evaluate the reproducibility of LLNA:BrdU-FCM with the stimulation index (SI), the raw data for ECt calculation, produced by 3 laboratories. Descriptive statistics along with graphical representations of SI were presented. For inferential statistics, parametric and non-parametric methods were applied to test the reproducibility of the SI of a concurrent positive control, and the robustness of the results was investigated. Descriptive statistics and graphical representation of SI alone could illustrate the within- and between-laboratory reproducibility. Inferential statistics employing parametric and nonparametric methods drew similar conclusions. While all labs passed the within- and between-laboratory reproducibility criteria given by the OECD TG429 PS based on ECt values, statistical evaluation based on SI values showed that only two labs succeeded in achieving within-laboratory reproducibility. For the two labs that satisfied within-laboratory reproducibility, between-laboratory reproducibility could also be attained based on inferential as well as descriptive statistics. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
The best motivator priorities parents choose via analytical hierarchy process
NASA Astrophysics Data System (ADS)
Farah, R. N.; Latha, P.
2015-05-01
Motivation is probably the most important factor that educators can target in order to improve learning. Numerous cross-disciplinary theories have been postulated to explain motivation. While each of these theories has some truth, no single theory seems to adequately explain all human motivation. The fact is that human beings in general, and pupils in particular, are complex creatures with complex needs and desires. In this paper, the Analytic Hierarchy Process (AHP) is proposed as an emerging solution to large, dynamic and complex real-world multi-criteria decision-making problems, applied here to parents selecting the most suitable motivator when choosing a school for their children. Data were analyzed using SPSS 17.0 ("Statistical Package for Social Science") software. Both descriptive and inferential statistical testing were used. Descriptive statistics were used to identify the demographic factors of the pupil and parent respondents. Inferential testing was used to determine the pupils' and parents' highest motivator priorities, and AHP was used to rank the criteria chosen by parents, namely school principals, teachers, pupils and parents. The moderating factors are schools selected on the basis of "Standard Kualiti Pendidikan Malaysia" (SKPM) in Ampang. Inferential statistics such as one-way ANOVA were used to test for significance, and the data were used to calculate the AHP weightings. School principals were found to be the best motivator for parents in choosing a school for their pupils, followed by teachers, parents and pupils.
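The AHP weight calculation itself is compact: given a pairwise comparison matrix on Saaty's 1-9 scale, the row geometric means approximate the principal eigenvector, and a consistency ratio checks the coherence of the judgments. This is a generic AHP sketch, not the study's SPSS workflow, and the function names are ours.

```python
import math

def ahp_weights(matrix):
    """Priority weights from an AHP pairwise-comparison matrix, using the
    geometric-mean (row) approximation to the principal eigenvector."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

def consistency_ratio(matrix, weights):
    """Saaty consistency ratio (CR) for matrices of order 3 to 7; values
    below 0.1 are conventionally considered acceptable."""
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}  # Saaty random indices
    n = len(matrix)
    # lambda_max estimated by averaging (A w)_i / w_i over the rows
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / ri[n]
```

For a perfectly consistent matrix (every entry equals the ratio of the underlying weights), the recovered weights are exact and the consistency ratio is 0.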
A new equation of state Based on Nuclear Statistical Equilibrium for Core-Collapse Simulations
NASA Astrophysics Data System (ADS)
Furusawa, Shun; Yamada, Shoichi; Sumiyoshi, Kohsuke; Suzuki, Hideyuki
2012-09-01
We calculate a new equation of state for baryons at sub-nuclear densities for use in core-collapse simulations of massive stars. The formulation is based on the nuclear statistical equilibrium description and the liquid-drop approximation of nuclei. The model free energy to be minimized is calculated by relativistic mean field theory for nucleons and by the mass formula for nuclei with atomic numbers up to ~1000. We have also taken into account the pasta phase. We find that the free energy and other thermodynamical quantities are not very different from those given by the standard EOSs that adopt the single-nucleus approximation. On the other hand, the average mass is systematically different, which may have an important effect on the rates of electron captures and coherent neutrino scattering on nuclei in supernova cores.
Pupil Size in Outdoor Environments
2007-04-06
Table 3: Descriptive statistics for pupils measured over luminance range. Table 4: N in each stratum for all pupil measurements. Table 5: Descriptive statistics stratified against eye color. Table 6: Descriptive statistics stratified against gender.
NASA Technical Reports Server (NTRS)
Kummerow, Christian; Giglio, Louis
1994-01-01
This paper describes a multichannel physical approach for retrieving rainfall and vertical structure information from satellite-based passive microwave observations. The algorithm makes use of statistical inversion techniques based upon theoretically calculated relations between rainfall rates and brightness temperatures. Potential errors introduced into the theoretical calculations by the unknown vertical distribution of hydrometeors are overcome by explicitly accounting for diverse hydrometeor profiles. This is accomplished by allowing for a number of different vertical distributions in the theoretical brightness temperature calculations and requiring consistency between the observed and calculated brightness temperatures. This paper will focus primarily on the theoretical aspects of the retrieval algorithm, which includes a procedure used to account for inhomogeneities of the rainfall within the satellite field of view as well as a detailed description of the algorithm as it is applied over both ocean and land surfaces. The residual error between observed and calculated brightness temperatures is found to be an important quantity in assessing the uniqueness of the solution. It is further found that the residual error is a meaningful quantity that can be used to derive expected accuracies from this retrieval technique. Examples comparing the retrieved results as well as the detailed analysis of the algorithm performance under various circumstances are the subject of a companion paper.
Brudnik, Katarzyna; Twarda, Maria; Sarzyński, Dariusz; Jodkowski, Jerzy T
2013-10-01
Ab initio calculations at the G3 level were used in a theoretical description of the kinetics and mechanism of the chlorine abstraction reactions from mono-, di-, tri- and tetra-chloromethane by chlorine atoms. The calculated profiles of the potential energy surface of the reaction systems show that the mechanism of the studied reactions is complex and that the Cl-abstraction proceeds via the formation of intermediate complexes. The multi-step reaction mechanism consists of two elementary steps in the case of CCl4 + Cl, and three for the other reactions. Rate constants were calculated using the theoretical method based on the RRKM theory and the simplified version of the statistical adiabatic channel model. The temperature dependencies of the calculated rate constants can be expressed, in the temperature range of 200-3,000 K, as [Formula: see text]. The rate constants for the reverse reactions CH3/CH2Cl/CHCl2/CCl3 + Cl2 were calculated via the equilibrium constants derived theoretically. The kinetic equations [Formula: see text] allow a very good description of the reaction kinetics. The derived expressions are a substantial supplement to the kinetic data necessary to describe and model the complex gas-phase reactions of importance in combustion and atmospheric chemistry.
Aljasser, Faisal; Vitevitch, Michael S
2018-02-01
A number of databases (Storkel Behavior Research Methods, 45, 1159-1167, 2013) and online calculators (Vitevitch & Luce Behavior Research Methods, Instruments, and Computers, 36, 481-487, 2004) have been developed to provide statistical information about various aspects of language, and these have proven to be invaluable assets to researchers, clinicians, and instructors in the language sciences. The number of such resources for English is quite large and continues to grow, whereas the number of such resources for other languages is much smaller. This article describes the development of a Web-based interface to calculate phonotactic probability in Modern Standard Arabic (MSA). A full description of how the calculator can be used is provided. It can be freely accessed at http://phonotactic.drupal.ku.edu/ .
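The kind of computation such calculators perform can be illustrated with a toy version of positional segment probability. The simplification below uses type counts over a hypothetical mini-lexicon, whereas the Vitevitch & Luce measure weights segments by log word frequency; the function name and example words are ours, not from the MSA calculator.

```python
def positional_segment_probability(word, lexicon):
    """Average positional segment probability of `word`: for each position,
    the proportion of lexicon words with the same segment at that position
    (a type-count simplification of the phonotactic-probability measure)."""
    probs = []
    for i, seg in enumerate(word):
        at_i = [w[i] for w in lexicon if len(w) > i]  # segments at position i
        probs.append(at_i.count(seg) / len(at_i))
    return sum(probs) / len(probs)
```

Words whose segments are common in their positions score high (dense phonotactic neighborhoods), which is the quantity researchers typically want to control or manipulate in stimulus design.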
Understanding Optimal Military Decision Making: Year 2 Progress Report
2014-01-01
measures. ARMY RELEVANCY AND MILITARY APPLICATION AREAS Objectively defining, measuring, and developing a means to assess military optimal decision making...has the potential to enhance training and refine procedures supporting more efficient learning and task accomplishment. Through the application of...26.79 (12.39) 7.94 (62.38) N/A = Not applicable; as it is not possible to calculate this particular variable. Table 2. Descriptive statistics of
Methods for collection and analysis of aquatic biological and microbiological samples
Greeson, Phillip E.; Ehlke, T.A.; Irwin, G.A.; Lium, B.W.; Slack, K.V.
1977-01-01
Chapter A4 contains methods used by the U.S. Geological Survey to collect, preserve, and analyze waters to determine their biological and microbiological properties. Part 1 discusses biological sampling and sampling statistics. The statistical procedures are accompanied by examples. Part 2 consists of detailed descriptions of more than 45 individual methods, including those for bacteria, phytoplankton, zooplankton, seston, periphyton, macrophytes, benthic invertebrates, fish and other vertebrates, cellular contents, productivity, and bioassays. Each method is summarized, and the application, interferences, apparatus, reagents, collection, analysis, calculations, reporting of results, precision and references are given. Part 3 consists of a glossary. Part 4 is a list of taxonomic references.
Testing the system detection unit for measuring solid minerals bulk density
NASA Astrophysics Data System (ADS)
Voytyuk, I. N.; Kopteva, A. V.
2017-10-01
The paper provides a brief description of a system for measuring flux per volume of solid minerals, using mineral coal as an example. The paper discloses the operational principle of the detection unit and provides a full description of the testing methodology, as well as the practical implementation of the detection unit testing. The paper describes the recording of two data arrays via the channels of scattered and direct radiation for detection units of two generations, and describes the Matlab software used to determine the statistical characteristics of the studied objects. The mean number of pulses per cycle, and the pulse-counting error relative to that mean, were determined in order to assess the counting stability of the detection units.
ERIC Educational Resources Information Center
Wendling, Wayne
This report is divided into four sections. Section 1 is a short discussion of the economic theory underlying the construction of the cost of education index and an example of how the index is calculated. Also presented are descriptions of the factors included in the statistical analysis to control for quality, quantity, and cost differences and…
Estimating chronic disease rates in Canada: which population-wide denominator to use?
Ellison, J; Nagamuthu, C; Vanderloo, S; McRae, B; Waters, C
2016-10-01
Chronic disease rates are produced from the Public Health Agency of Canada's Canadian Chronic Disease Surveillance System (CCDSS) using administrative health data from provincial/territorial health ministries. Denominators for these rates are based on estimates of populations derived from health insurance files. However, these data may not be accessible to all researchers. Another source for population size estimates is the Statistics Canada census. The purpose of our study was to calculate the major differences between the CCDSS and Statistics Canada's population denominators and to identify the sources or reasons for the potential differences between these data sources. We compared the 2009 denominators from the CCDSS and Statistics Canada. The CCDSS denominator was adjusted for the growth components (births, deaths, emigration and immigration) from Statistics Canada's census data. The unadjusted CCDSS denominator was 34 429 804, 3.2% higher than Statistics Canada's estimate of population in 2009. After the CCDSS denominator was adjusted for the growth components, the difference between the two estimates was reduced to 431 323 people, a difference of 1.3%. The CCDSS overestimates the population relative to Statistics Canada overall. The largest difference between the two estimates was from the migrant growth component, while the smallest was from the emigrant component. By using data descriptions by data source, researchers can make decisions about which population to use in their calculations of disease frequency.
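The reconciliation described above is simple arithmetic: a registry-based denominator is adjusted by census growth components (births, deaths, immigration, emigration), and the remaining gap is expressed as a percentage of the reference estimate. A minimal sketch; the component signs and the census figure below are illustrative assumptions, not CCDSS values:

```python
def adjust_denominator(unadjusted, births, deaths, immigrants, emigrants):
    """Adjust a registry-based population denominator by census growth
    components (signs assumed for illustration)."""
    return unadjusted + births + immigrants - deaths - emigrants

def pct_diff(estimate, reference):
    """Percentage difference of `estimate` relative to `reference`."""
    return 100.0 * (estimate - reference) / reference

# Illustrative figures: the CCDSS 2009 denominator from the abstract, and a
# hypothetical census estimate chosen to reproduce the reported 3.2% gap.
ccdss = 34_429_804
census = 33_362_000
gap = round(pct_diff(ccdss, census), 1)  # ~3.2% overestimate
```

The same two helpers cover both comparisons in the abstract: the raw gap, and the smaller gap after the denominator has been adjusted.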
Quality of reporting statistics in two Indian pharmacology journals.
Jaykaran; Yadav, Preeti
2011-04-01
To evaluate the reporting of the statistical methods in articles published in two Indian pharmacology journals. All original articles published since 2002 were downloaded from the journals' (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)) website. These articles were evaluated on the basis of appropriateness of descriptive statistics and inferential statistics. Descriptive statistics was evaluated on the basis of reporting of method of description and central tendencies. Inferential statistics was evaluated on the basis of fulfilling of assumption of statistical methods and appropriateness of statistical tests. Values are described as frequencies, percentage, and 95% confidence interval (CI) around the percentages. Inappropriate descriptive statistics was observed in 150 (78.1%, 95% CI 71.7-83.3%) articles. The most common reason for this inappropriate descriptive statistics was the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most common statistical method used was one-way ANOVA (58.4%). Information regarding checking of assumptions of statistical tests was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles. The most common reason for an inappropriate statistical test was the use of a two-group test for three or more groups. Articles published in two Indian pharmacology journals are not devoid of statistical errors.
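The reported interval "78.1%, 95% CI 71.7-83.3%" is consistent with a Wilson score interval for a proportion; a minimal sketch, assuming n = 192 articles (inferred from 150/0.781, not stated in the abstract), together with the SD/SEM distinction the authors flag:

```python
import math

def sd(xs):
    """Sample standard deviation: describes the spread of the data."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def sem(xs):
    """Standard error of the mean: describes the precision of the mean,
    not the spread -- hence "mean (SD)" is preferred for description."""
    return sd(xs) / math.sqrt(len(xs))

def wilson_ci(k, n, z=1.959964):
    """95% Wilson score interval for a proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(150, 192)  # reproduces roughly 71.7%-83.3%
```

Because SEM shrinks with √n, reporting mean ± SEM makes data look far less variable than mean ± SD, which is exactly the error the survey found most often.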
User’s guide for MapMark4GUI—A graphical user interface for the MapMark4 R package
Shapiro, Jason
2018-05-29
MapMark4GUI is an R graphical user interface (GUI) developed by the U.S. Geological Survey to support user implementation of the MapMark4 R statistical software package. MapMark4 was developed by the U.S. Geological Survey to implement probability calculations for simulating undiscovered mineral resources in quantitative mineral resource assessments. The GUI provides an easy-to-use tool to input data, run simulations, and format output results for the MapMark4 package. The GUI is written and accessed in the R statistical programming language. This user’s guide includes instructions on installing and running MapMark4GUI and descriptions of the statistical output processes, output files, and test data files.
NASA Astrophysics Data System (ADS)
Szücs, T.; Kiss, G. G.; Gyürky, Gy.; Halász, Z.; Fülöp, Zs.; Rauscher, T.
2018-01-01
The stellar reaction rates of radiative α-capture reactions on heavy isotopes are of crucial importance for γ-process network calculations. These rates are usually derived from statistical model calculations, which need to be validated, but the experimental database is very scarce. This paper presents the results of α-induced reaction cross-section measurements on iridium isotopes, carried out for the first time close to the astrophysically relevant energy region. Thick-target yields of the 191Ir(α,γ)195Au, 191Ir(α,n)194Au, 193Ir(α,n)196mAu, and 193Ir(α,n)196Au reactions have been measured with the activation technique between Eα = 13.4 MeV and 17 MeV. For the first time, the thick-target yield was determined with X-ray counting, leading to unprecedented sensitivity. From the measured thick-target yields, reaction cross sections are derived and compared with statistical model calculations. The recently suggested energy-dependent modification of the α + nucleus optical potential gives a good description of the experimental data.
Role strain among male RNs in the critical care setting: Perceptions of an unfriendly workplace.
Carte, Nicholas S; Williams, Collette
2017-12-01
Traditionally, nursing has been a female-dominated profession. Men employed as registered nurses have been in the minority, and little is known about the experiences of this demographic. The purpose of this descriptive, quantitative study was to understand the relationship between demographic variables and causes of role strain among male nurses in critical care settings. The Sherrod Role Strain Scale assesses role strain within the context of role conflict, role overload, role ambiguity, and role incongruity. Data analysis included descriptive and inferential statistics. Inferential statistics involved repeated-measures ANOVA testing for significant differences in the causes of role strain among male nurses employed in critical care settings, and a post hoc comparison of specific demographic data using multivariate analyses of variance (MANOVAs). Data from 37 male nurses in critical care settings in the northeast of the United States were used to calculate descriptive statistics (mean and standard deviation) and the results of the repeated-measures ANOVA and the post hoc secondary MANOVA analysis. The descriptive data showed that all participants worked full-time. Participants were nearly evenly split between day shift (46%) and night shift (43%), and most of the participants indicated they had 15 years or more of experience as a registered nurse (54%). Significant findings of this study include two causes of role strain in male nurses employed in critical care settings: role ambiguity, and role overload based on ethnicity. Consistent with previous research findings, the results of this study suggest that male registered nurses employed in critical care settings do experience role strain, the two main causes being role ambiguity and role overload. Copyright © 2017. Published by Elsevier Ltd.
2016-12-22
included assessments and instruments, descriptive statistics were calculated. Independent-samples t-tests were conducted using participant survey scores...integrity tests within a multimodal system. Both conditions included the Military Acute Concussion Evaluation (MACE) and an Ease-of-Use survey. Mean scores...for the Ease-of-Use survey and mean test administration times for each measure were compared. Administrative feedback was also considered for
Lognormal-like statistics of a stochastic squeeze process
NASA Astrophysics Data System (ADS)
Shapira, Dekel; Cohen, Doron
2017-10-01
We analyze the full statistics of a stochastic squeeze process. The model's two parameters are the bare stretching rate w and the angular diffusion coefficient D . We carry out an exact analysis to determine the drift and the diffusion coefficient of log(r ) , where r is the radial coordinate. The results go beyond the heuristic lognormal description that is implied by the central limit theorem. Contrary to the common "quantum Zeno" approximation, the radial diffusion is not simply Dr=(1 /8 ) w2/D but has a nonmonotonic dependence on w /D . Furthermore, the calculation of the radial moments is dominated by the far non-Gaussian tails of the log(r ) distribution.
A Divergence Statistics Extension to VTK for Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille
This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], where we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order, and auto-correlative statistics engines which we developed within the Visualization Toolkit (VTK) as a scalable, parallel, and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
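The core notion, a divergence that quantifies the discrepancy between an observed empirical distribution and a theoretical "ideal" one, can be made concrete with a short example. The Kullback-Leibler divergence is used here purely as an illustration; the report does not specify which divergence the VTK engine implements:

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q): zero iff p == q,
    positive otherwise; a divergence, not a symmetric metric."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

observed = [0.50, 0.30, 0.20]   # empirical histogram (hypothetical)
ideal = [1 / 3, 1 / 3, 1 / 3]   # theoretical "ideal" distribution
d = kl_divergence(observed, ideal)
```

In a performance-analysis setting, `observed` would be a measured histogram (e.g. of task run times) and `ideal` the distribution expected under balanced load.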
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Barber, Julie A; Thompson, Simon G
1998-01-01
Objective To review critically the statistical methods used for health economic evaluations in randomised controlled trials where an estimate of cost is available for each patient in the study. Design Survey of published randomised trials including an economic evaluation with cost values suitable for statistical analysis; 45 such trials published in 1995 were identified from Medline. Main outcome measures The use of statistical methods for cost data was assessed in terms of the descriptive statistics reported, use of statistical inference, and whether the reported conclusions were justified. Results Although all 45 trials reviewed apparently had cost data for each patient, only 9 (20%) reported adequate measures of variability for these data and only 25 (56%) gave results of statistical tests or a measure of precision for the comparison of costs between the randomised groups. Only 16 (36%) of the articles gave conclusions which were justified on the basis of results presented in the paper. No paper reported sample size calculations for costs. Conclusions The analysis and interpretation of cost data from published trials reveal a lack of statistical awareness. Strong and potentially misleading conclusions about the relative costs of alternative therapies have often been reported in the absence of supporting statistical evidence. Improvements in the analysis and reporting of health economic assessments are urgently required. Health economic guidelines need to be revised to incorporate more detailed statistical advice. 
Key messages:
- Health economic evaluations required for important healthcare policy decisions are often carried out in randomised controlled trials.
- A review of such published economic evaluations assessed whether statistical methods for cost outcomes have been appropriately used and interpreted.
- Few publications presented adequate descriptive information for costs or performed appropriate statistical analyses.
- In at least two thirds of the papers, the main conclusions regarding costs were not justified.
- The analysis and reporting of health economic assessments within randomised controlled trials urgently need improving.
PMID:9794854
Mean-field approximation for spacing distribution functions in classical systems
NASA Astrophysics Data System (ADS)
González, Diego Luis; Pimpinelli, Alberto; Einstein, T. L.
2012-01-01
We propose a mean-field method to calculate approximately the spacing distribution functions p(n)(s) in one-dimensional classical many-particle systems. We compare our method with two other commonly used methods, the independent interval approximation and the extended Wigner surmise. In our mean-field approach, p(n)(s) is calculated from a set of Langevin equations, which are decoupled by using a mean-field approximation. We find that in spite of its simplicity, the mean-field approximation provides good results in several systems. We offer many examples illustrating that the three previously mentioned methods give a reasonable description of the statistical behavior of the system. The physical interpretation of each method is also discussed.
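The object being approximated, a spacing distribution function p(n)(s), is easy to estimate empirically. A minimal sketch for the simplest case, uncorrelated (Poisson) particles on a line, where the nearest-neighbour law is p(0)(s) = e^(-s); this toy model is our illustration, not the interacting systems treated in the paper:

```python
import math
import random

def spacing_distribution(points):
    """Nearest-neighbour spacings of sorted 1-D points, normalized to mean 1."""
    pts = sorted(points)
    gaps = [b - a for a, b in zip(pts, pts[1:])]
    mean = sum(gaps) / len(gaps)
    return [g / mean for g in gaps]

random.seed(1)
# Uncorrelated particles: expect p(s) = exp(-s) for the normalized spacings.
s = spacing_distribution([random.random() for _ in range(20000)])
mean_s = sum(s) / len(s)                                 # 1 by construction
frac_below_1 = sum(1 for x in s if x < 1) / len(s)       # ~ 1 - e^-1 ≈ 0.63
```

Replacing the uniform sample with positions generated by coupled Langevin equations would give the empirical curves against which a mean-field p(n)(s) is checked.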
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
Campos-Filho, N; Franco, E L
1989-02-01
A frequent procedure in matched case-control studies is to report results from the multivariate unmatched analyses if they do not differ substantially from the ones obtained after conditioning on the matching variables. Although conceptually simple, this rule requires that an extensive series of logistic regression models be evaluated by both the conditional and unconditional maximum likelihood methods. Most computer programs for logistic regression employ only one maximum likelihood method, which requires that the analyses be performed in separate steps. This paper describes a Pascal microcomputer (IBM PC) program that performs multiple logistic regression by both maximum likelihood estimation methods, which obviates the need for switching between programs to obtain relative risk estimates from both matched and unmatched analyses. The program calculates most standard statistics and allows factoring of categorical or continuous variables by two distinct methods of contrast. A built-in, descriptive statistics option allows the user to inspect the distribution of cases and controls across categories of any given variable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petukhov, B. V., E-mail: petukhov@ns.crys.ras.r
2010-01-15
A model has been proposed for describing the influence of impurities adsorbed by dislocation cores on the mobility of dislocation kinks in materials with a high crystalline relief (Peierls barriers). The delay time spectrum of kinks at statistical fluctuations of the impurity density has been calculated for a sufficiently high energy of interaction between impurities and dislocations, when the migration potential is not reduced to a random Gaussian potential. It has been shown that fluctuations in the impurity distribution substantially change the character of the migration of dislocation kinks due to the slow decrease in the probability of long delay times. The dependences of the position of the boundary of the dynamic phase transition to a sublinear drift of kinks x ∝ t^δ (δ ≤ 1), and of the characteristics of the anomalous mobility, on the physical parameters (stress, impurity concentration, experimental temperature, etc.) have been calculated.
Extended optical model for fission
Sin, M.; Capote, R.; Herman, M. W.; ...
2016-03-07
A comprehensive formalism to calculate fission cross sections based on the extension of the optical model for fission is presented. It can be used for the description of nuclear reactions on actinides featuring multi-humped fission barriers with partial absorption in the wells and direct transmission through discrete and continuum fission channels. The formalism describes the gross fluctuations observed in the fission probability due to vibrational resonances, and can be easily implemented in existing statistical reaction model codes. The extended optical model for fission is applied to neutron-induced fission cross-section calculations on 234,235,238U and 239Pu targets. A triple-humped fission barrier is used for 234,235U(n,f), while a double-humped fission barrier is used for the 238U(n,f) and 239Pu(n,f) reactions, as predicted by theoretical barrier calculations. The impact of partial damping of class-II/III states, and of direct transmission through discrete and continuum fission channels, is shown to be critical for a proper description of the measured fission cross sections for the 234,235,238U(n,f) reactions. The 239Pu(n,f) reaction can be calculated in the complete damping approximation. Calculated cross sections for the 235,238U(n,f) and 239Pu(n,f) reactions agree within 3% with the corresponding cross sections derived within the Neutron Standards least-squares fit of available experimental data. Lastly, the extended optical model for fission can be used for both theoretical fission studies and nuclear data evaluation.
Fisher statistics for analysis of diffusion tensor directional information.
Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P
2012-04-30
A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups, however in the hilus, significant (p<0.0005) differences were found that robustly confirmed observations that were suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
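The descriptive half of the Fisher framework reduces to a resultant-vector calculation: the mean direction is the normalized vector sum of the unit direction vectors, and R̄ = R/n (the normalized resultant length) acts as a concentration measure. A minimal sketch; the sample vectors are hypothetical, not DTI data:

```python
import math

def mean_direction(vectors):
    """Mean direction and R-bar of a set of 3-D unit vectors
    (the descriptive statistics of the Fisher framework)."""
    n = len(vectors)
    sx = sum(v[0] for v in vectors)
    sy = sum(v[1] for v in vectors)
    sz = sum(v[2] for v in vectors)
    r = math.sqrt(sx * sx + sy * sy + sz * sz)   # resultant length R
    return (sx / r, sy / r, sz / r), r / n       # unit mean vector, R-bar

# Hypothetical ROI direction vectors tightly grouped about the z-axis:
vs = [(0.1, 0.0, math.sqrt(0.99)),
      (0.0, 0.1, math.sqrt(0.99)),
      (0.0, 0.0, 1.0)]
(mx, my, mz), rbar = mean_direction(vs)
```

R̄ near 1 indicates tightly clustered directions (high Fisher concentration); values near 0 indicate near-uniform scatter on the sphere. Hypothesis tests such as Watson's F build on these quantities.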
Anharmonic effects in the quantum cluster equilibrium method
NASA Astrophysics Data System (ADS)
von Domaros, Michael; Perlt, Eva
2017-03-01
The well-established quantum cluster equilibrium (QCE) model provides a statistical thermodynamic framework to apply high-level ab initio calculations of finite cluster structures to macroscopic liquid phases using the partition function. So far, the harmonic approximation has been applied throughout the calculations. In this article, we apply an important correction in the evaluation of the one-particle partition function and account for anharmonicity. Therefore, we implemented an analytical approximation to the Morse partition function and the derivatives of its logarithm with respect to temperature, which are required for the evaluation of thermodynamic quantities. This anharmonic QCE approach has been applied to liquid hydrogen chloride; cluster distributions, the molar volume, the volumetric thermal expansion coefficient, and the isobaric heat capacity have been calculated. An improved description of all properties is observed when anharmonic effects are considered.
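The direction of the anharmonic correction can be illustrated by summing the Morse partition function directly over its bound states: because Morse level spacings shrink with the quantum number, the anharmonic partition function exceeds the harmonic one at the same temperature. A sketch with assumed HCl-like spectroscopic constants, not the paper's ab initio data or its analytical approximation:

```python
import math

# Assumed HCl-like constants in cm^-1 (illustrative values):
WE, WEXE = 2990.9, 52.8
KT = 208.5  # roughly 300 K expressed in cm^-1

def morse_levels(we, wexe):
    """Bound-state energies E_n = we*(n+1/2) - wexe*(n+1/2)^2 of a Morse
    oscillator; levels exist while the spacing stays positive."""
    n_max = int(we / (2 * wexe) - 0.5)
    return [we * (n + 0.5) - wexe * (n + 0.5) ** 2 for n in range(n_max + 1)]

def partition_function(levels, kt):
    """Direct Boltzmann sum, with energies measured from the zero-point level."""
    e0 = levels[0]
    return sum(math.exp(-(e - e0) / kt) for e in levels)

q_morse = partition_function(morse_levels(WE, WEXE), KT)
q_harm = 1.0 / (1.0 - math.exp(-WE / KT))  # harmonic oscillator, same we
```

At this low reduced temperature both values are close to 1, but q_morse is systematically larger; the gap (and its temperature derivative, which feeds the heat capacity) grows as kT approaches the level spacing.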
Vendrell, Oriol; Brill, Michael; Gatti, Fabien; Lauvergnat, David; Meyer, Hans-Dieter
2009-06-21
Quantum dynamical calculations are reported for the zero point energy, several low-lying vibrational states, and the infrared spectrum of the H(5)O(2)(+) cation. The calculations are performed by the multiconfiguration time-dependent Hartree (MCTDH) method. A new vector parametrization based on a mixed Jacobi-valence description of the system is presented. With this parametrization the potential energy surface coupling is reduced with respect to a full Jacobi description, providing a better convergence of the n-mode representation of the potential. However, new coupling terms appear in the kinetic energy operator. These terms are derived and discussed. A mode-combination scheme based on six combined coordinates is used, and the representation of the 15-dimensional potential in terms of a six-combined mode cluster expansion including up to some 7-dimensional grids is discussed. A statistical analysis of the accuracy of the n-mode representation of the potential at all orders is performed. Benchmark, fully converged results are reported for the zero point energy, which lie within the statistical uncertainty of the reference diffusion Monte Carlo result for this system. Some low-lying vibrationally excited eigenstates are computed by block improved relaxation, illustrating the applicability of the approach to large systems. Benchmark calculations of the linear infrared spectrum are provided, and convergence with increasing size of the time-dependent basis and as a function of the order of the n-mode representation is studied. The calculations presented here make use of recent developments in the parallel version of the MCTDH code, which are briefly discussed. We also show that the infrared spectrum can be computed, to a very good approximation, within D(2d) symmetry, instead of the G(16) symmetry used before, in which the complete rotation of one water molecule with respect to the other is allowed, thus simplifying the dynamical problem.
Medication calculation skills of graduating nursing students in Finland.
Grandell-Niemi, H; Hupli, M; Leino-Kilpi, H
2001-01-01
The aim of this study was to describe the basic mathematical proficiency and the medication calculation skills of graduating nursing students in Finland. A further concern was how students experienced the teaching of medication calculation. We wanted to find out whether these experiences were associated with various background factors and the students' medication calculation skills. In spring 1997 the population of graduating nursing students in Finland numbered around 1280; the figure for the whole year was 2640. A convenience sample of 204 students completed a questionnaire specially developed for this study. The instrument included structured questions, statements and a medication calculation test. The response rate was 88%. Data analysis was based on descriptive statistics. The students found it hard to learn mathematics and medication calculation skills. Those who evaluated their mathematical and medication calculation skills as sufficient successfully solved the problems included in the questionnaire. It was felt that the introductory course on medication calculation was uninteresting and poorly organised. Overall the students' mathematical skills were inadequate. One-fifth of the students failed to pass the medication calculation test. A positive correlation was shown between the student's grade in mathematics (Sixth Form College) and her skills in medication calculation.
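The positive correlation reported at the end is a plain Pearson coefficient between two paired score lists. A minimal sketch; the paired values below are hypothetical, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical mathematics grades vs. medication calculation test scores:
grades = [4, 5, 6, 7, 8, 9, 10]
scores = [55, 60, 58, 70, 75, 80, 85]
r = pearson_r(grades, scores)  # strongly positive for this made-up sample
```

An r close to +1 corresponds to the study's finding that students with better mathematics grades also did better on the calculation test.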
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purtov, P.A.; Salikhov, K.M.
1987-09-01
The semiclassical HFI description is applicable to calculating the integral CIDNP effect in weak fields. The effect has been calculated for radicals with sufficiently numerous magnetically equivalent nuclei (n ≥ 5), in satisfactory agreement with CIDNP calculations based on a quantum-mechanical description of radical-pair spin dynamics.
Learning moment-based fast local binary descriptor
NASA Astrophysics Data System (ADS)
Bellarbi, Abdelkader; Zenati, Nadia; Otmane, Samir; Belghit, Hayet
2017-03-01
Recently, binary descriptors have attracted significant attention due to their speed and low memory consumption; however, using intensity differences to calculate the binary descriptive vector is not efficient enough. We propose an approach to binary description called POLAR_MOBIL, in which we perform binary tests between geometrical and statistical information using moments in the patch instead of the classical intensity binary test. In addition, we introduce a learning technique used to select an optimized set of binary tests with low correlation and high variance. This approach offers high distinctiveness against affine transformations and appearance changes. An extensive evaluation on well-known benchmark datasets reveals the robustness and the effectiveness of the proposed descriptor, as well as its good performance in terms of low computation complexity when compared with state-of-the-art real-time local descriptors.
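The core idea, replacing intensity-difference binary tests with comparisons of region moments inside the patch, can be sketched in a few lines. The region pairs and moment orders below are hypothetical placeholders for the learned, low-correlation test set the paper selects:

```python
def patch_moment(patch, p, q):
    """Raw geometric moment m_pq = sum over pixels of x^p * y^q * I(x, y)."""
    return sum(
        (x ** p) * (y ** q) * v
        for y, row in enumerate(patch)
        for x, v in enumerate(row)
    )

def subpatch(patch, region):
    """Extract the (x0, y0, w, h) sub-region of a row-major patch."""
    x0, y0, w, h = region
    return [row[x0:x0 + w] for row in patch[y0:y0 + h]]

def moment_descriptor(patch, tests):
    """One bit per test: compare the same moment over two sub-regions
    (a hypothetical stand-in for the learned binary test set)."""
    return [
        1 if patch_moment(subpatch(patch, ra), p, q)
             > patch_moment(subpatch(patch, rb), p, q) else 0
        for (ra, rb, p, q) in tests
    ]

# 4x4 patch whose right half is bright, with two mirror-image m00 tests:
patch = [[0, 0, 10, 10]] * 4
tests = [((0, 0, 2, 4), (2, 0, 2, 4), 0, 0),
         ((2, 0, 2, 4), (0, 0, 2, 4), 0, 0)]
bits = moment_descriptor(patch, tests)
```

Because each bit compares aggregated region statistics rather than two raw pixels, the resulting vector is less sensitive to noise and small appearance changes, at a modest extra cost per test.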
What is too much variation? The null hypothesis in small-area analysis.
Diehr, P; Cain, K; Connell, F; Volinn, E
1990-01-01
A small-area analysis (SAA) in health services research often calculates surgery rates for several small areas, compares the largest rate to the smallest, notes that the difference is large, and attempts to explain this discrepancy as a function of service availability, physician practice styles, or other factors. SAAs are often difficult to interpret because there is little theoretical basis for determining how much variation would be expected under the null hypothesis that all of the small areas have similar underlying surgery rates and that the observed variation is due to chance. We developed a computer program to simulate the distribution of several commonly used descriptive statistics under the null hypothesis, and used it to examine the variability in rates among the counties of the state of Washington. The expected variability when the null hypothesis is true is surprisingly large, and becomes worse for procedures with low incidence, for smaller populations, when there is variability among the populations of the counties, and when readmissions are possible. The characteristics of four descriptive statistics were studied and compared. None was uniformly good, but the chi-square statistic had better performance than the others. When we reanalyzed five journal articles that presented sufficient data, the results were usually statistically significant. Since SAA research today is tending to deal with low-incidence events, smaller populations, and measures where readmissions are possible, more research is needed on the distribution of small-area statistics under the null hypothesis. New standards are proposed for the presentation of SAA results. PMID:2312306
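The paper's central point, that large max/min rate ratios arise by chance alone when every area shares the same true rate, is easy to reproduce by Monte Carlo. A minimal sketch; the parameters below are illustrative, not the Washington county data:

```python
import random

def simulate_extremal_ratio(n_areas, population, rate, n_sim, seed=0):
    """Distribution of the max/min small-area rate ratio under the null
    hypothesis that every area shares the same true event rate."""
    rng = random.Random(seed)
    ratios = []
    for _ in range(n_sim):
        rates = []
        for _ in range(n_areas):
            cases = sum(1 for _ in range(population) if rng.random() < rate)
            rates.append(max(cases, 1) / population)  # guard min against zero
        ratios.append(max(rates) / min(rates))
    return ratios

# 30 areas of 2,000 people with a common 1% surgery rate: even under the
# null, the largest area rate is typically a multiple of the smallest.
ratios = simulate_extremal_ratio(n_areas=30, population=2000, rate=0.01, n_sim=50)
```

Comparing an observed max/min ratio against this simulated null distribution is exactly the kind of benchmark the authors argue SAA studies need before attributing variation to practice styles or service availability.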
Sneck, Sami; Saarnio, Reetta; Isola, Arja; Boigu, Risto
2016-01-01
Medication administration is an important task of registered nurses. According to previous studies, nurses lack theoretical knowledge and drug calculation skills, and knowledge-based mistakes do occur in clinical practice. Finnish health care organizations started to develop systematic verification processes for medication competence at the end of the last decade. No studies have yet examined nurses' theoretical knowledge and drug calculation skills as measured by these online exams. The aim of this study was to describe the medication competence of Finnish nurses according to theoretical and drug calculation exams. A descriptive correlational design was adopted. Participants and settings: All nurses who participated in the online exam in three Finnish hospitals between 1.1.2009 and 31.05.2014 were included in the study (n=2479). Quantitative methods such as Pearson's chi-squared tests, analysis of variance (ANOVA) with post hoc Tukey tests and Pearson's correlation coefficient were used to test for relationships between dependent and independent variables. The majority of nurses mastered the theoretical knowledge needed in medication administration, but 5% of the nurses struggled to pass the drug calculation exam. Theoretical knowledge and drug calculation skills were better in acute care units than in the other units, and younger nurses achieved better results in both exams than their older colleagues. The differences found in this study were statistically significant but small. Nevertheless, even small deficiencies in theoretical knowledge and drug calculation skills should be addressed. It is important to identify the nurses who struggle in the exams and to plan targeted educational interventions to support them. The next step is to study whether verification of medication competence has an effect on patient safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
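The group comparisons above rest on Pearson's chi-squared statistic. As a hedged illustration (the pass/fail counts below are invented, not the study's data), the statistic for a 2x2 table can be computed directly from observed and expected cell counts:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]],
    e.g. hypothetical pass/fail counts in acute care vs. other units."""
    (a, b), (c, d) = table
    n = a + b + c + d
    chi2 = 0.0
    for obs, row, col in ((a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)):
        expected = row * col / n   # row total * column total / grand total
        chi2 += (obs - expected) ** 2 / expected
    return chi2
```

The resulting value is compared against the chi-squared distribution with one degree of freedom to judge significance.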
NASA Technical Reports Server (NTRS)
Peters, C.; Kampe, F. (Principal Investigator)
1980-01-01
The mathematical description and implementation of the statistical estimation procedure known as the Houston integrated spatial/spectral estimator (HISSE) are discussed. HISSE is based on a normal mixture model and is designed to take advantage of the spectral and spatial information of LANDSAT data pixels, utilizing the initial classification and clustering information provided by the AMOEBA algorithm. HISSE calculates parametric estimates of class proportions which reduce the error inherent in estimates derived from the typical classify-and-count procedures common to nonparametric clustering algorithms. It also singles out spatial groupings of pixels which are most suitable for labeling classes. These calculations are designed to aid the analyst/interpreter in labeling patches with a crop class label. Finally, HISSE's initial performance on an actual LANDSAT agricultural ground truth data set is reported.
Mean-field approximation for spacing distribution functions in classical systems.
González, Diego Luis; Pimpinelli, Alberto; Einstein, T L
2012-01-01
We propose a mean-field method to calculate approximately the spacing distribution functions p^(n)(s) in one-dimensional classical many-particle systems. We compare our method with two other commonly used methods, the independent interval approximation and the extended Wigner surmise. In our mean-field approach, p^(n)(s) is calculated from a set of Langevin equations, which are decoupled by using a mean-field approximation. We find that in spite of its simplicity, the mean-field approximation provides good results in several systems. We offer many examples illustrating that the three previously mentioned methods give a reasonable description of the statistical behavior of the system. The physical interpretation of each method is also discussed. © 2012 American Physical Society
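A simple baseline for the spacing distributions discussed above is the non-interacting (Poisson) case, where the nearest-neighbor distribution is p^(0)(s) = exp(-s) after normalizing spacings to unit mean. The sketch below is only this textbook baseline, not the paper's mean-field method:

```python
import random

def normalized_spacings(n_points=5000, seed=0):
    """Nearest-neighbor spacings, scaled to unit mean, for points dropped
    uniformly on a line segment (the non-interacting, Poisson case)."""
    rng = random.Random(seed)
    xs = sorted(rng.random() for _ in range(n_points))
    gaps = [b - a for a, b in zip(xs, xs[1:])]
    mean = sum(gaps) / len(gaps)
    return [g / mean for g in gaps]

spacings = normalized_spacings()
# For Poisson statistics p^(0)(s) = exp(-s), so P(s < 1) = 1 - 1/e ~ 0.632
frac_below_mean = sum(1 for s in spacings if s < 1) / len(spacings)
```

Interacting systems (e.g. Wigner-like level repulsion) push this distribution away from the exponential form, which is what the mean-field approximation is designed to capture.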
Spriestersbach, Albert; Röhrig, Bernd; du Prel, Jean-Baptist; Gerhold-Ay, Aslihan; Blettner, Maria
2009-09-01
Descriptive statistics are an essential part of biometric analysis and a prerequisite for the understanding of further statistical evaluations, including the drawing of inferences. When data are well presented, it is usually obvious whether the author has collected and evaluated them correctly and in keeping with accepted practice in the field. Statistical variables in medicine may be of either the metric (continuous, quantitative) or categorical (nominal, ordinal) type. Easily understandable examples are given. Basic techniques for the statistical description of collected data are presented and illustrated with examples. The goal of a scientific study must always be clearly defined. The definition of the target value or clinical endpoint determines the level of measurement of the variables in question. Nearly all variables, whatever their level of measurement, can be usefully presented graphically and numerically. The level of measurement determines what types of diagrams and statistical values are appropriate. There are also different ways of presenting combinations of two independent variables graphically and numerically. The description of collected data is indispensable. If the data are of good quality, valid and important conclusions can already be drawn when they are properly described. Furthermore, data description provides a basis for inferential statistics.
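The distinction drawn above between metric and categorical variables determines which descriptive statistics apply. A minimal sketch with invented data (the variable names and values are illustrative only):

```python
import statistics
from collections import Counter

# Metric (continuous) variable: summarize with location and dispersion
ages = [34, 45, 29, 61, 50, 42, 38]            # hypothetical patient ages
age_summary = (statistics.mean(ages), statistics.median(ages),
               statistics.stdev(ages))

# Categorical (nominal) variable: summarize with frequencies
blood_groups = ["A", "O", "O", "B", "A", "O"]  # hypothetical observations
absolute = Counter(blood_groups)
relative = {k: v / len(blood_groups) for k, v in absolute.items()}
```

Means and standard deviations are meaningful only for the metric variable; for the nominal one, absolute and relative frequencies (and bar charts rather than histograms) are the appropriate description.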
Mathematical ability of first year undergraduate paramedic students-A before and after study.
Eastwood, Kathryn; Boyle, Malcolm; Kim, Visal; Stam, Nathan; Williams, Brett
2015-11-01
An ability to accurately perform drug calculations unassisted is an essential skill for all health professionals, with various occupation-specific stressors exacerbating mathematical deficiencies. The objective of this study was to determine the unaided mathematical ability of first-year undergraduate paramedic students before and after mathematics and drug calculation tutorials. Students were administered a questionnaire containing demographic, drug calculation and arithmetic questions during week one of the semester, before the tutorials. During the semester, students participated in three 2-hour tutorials which included both mathematical and drug calculation questions, without the assistance of computational devices. At the end of the semester there was a summative drug calculation examination, of which five key questions were compared to similar questions from the first questionnaire. Descriptive statistics describe the demographic data, with a paired t-test comparing the questionnaire and exam results. Drug calculation and mathematical ability was markedly improved following the tutorials: the mean score of correct answers was 1.74 (SD 1.4) before and 4.14 (SD 0.93) after, p<0.0001. When comparing the correct results for the same question type, there were statistically significant differences in four of five drug calculations: volume of drug drawn up 10 vs. 57, p<0.0001; infusion rate 29 vs. 31, p=0.717; drip rate 16 vs. 54, p<0.0001; volume from a syringe 30 vs. 59, p<0.0001; and drug dose 42 vs. 62, p<0.0001. Total errors reduced from 188 to 45. First-year undergraduate paramedic students initially demonstrated a poor ability to complete mathematical and drug calculations without the assistance of computational devices. This improved significantly following appropriate education and practice. Further research is required to determine the retention of this ability over time. Copyright © 2015 Elsevier Ltd. All rights reserved.
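The before/after comparison above uses a paired t-test, which operates on within-student score differences. A minimal sketch with hypothetical scores (not the study's data):

```python
import math
import statistics

def paired_t(before, after):
    """Paired t statistic (df = n - 1) for before/after scores of the
    same students: mean difference over its standard error."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical five-question scores for five students (not study data):
t = paired_t([1, 2, 1, 3, 2], [4, 5, 3, 5, 4])
```

Pairing matters here: the test compares each student to themselves, so between-student variability does not inflate the error term.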
Statistics of some atmospheric turbulence records relevant to aircraft response calculations
NASA Technical Reports Server (NTRS)
Mark, W. D.; Fischer, R. W.
1981-01-01
Methods for characterizing atmospheric turbulence are described. The methods illustrated include maximum likelihood estimation of the integral scale and intensity of records obeying the von Karman transverse power spectral form, constrained least-squares estimation of the parameters of a parametric representation of autocorrelation functions, estimation of the power spectral density of the instantaneous variance of a record with temporally fluctuating variance, and estimation of the probability density functions of various turbulence components. Descriptions of the computer programs used in the computations are given, and a full listing of these programs is included.
Overholser, Brian R; Sowinski, Kevin M
2007-12-01
Biostatistics is the application of statistics to biologic data. The field of statistics can be broken down into 2 fundamental parts: descriptive and inferential. Descriptive statistics are commonly used to categorize, display, and summarize data. Inferential statistics can be used to make predictions based on a sample obtained from a population or some large body of information. It is these inferences that are used to test specific research hypotheses. This 2-part review will outline important features of descriptive and inferential statistics as they apply to commonly conducted research studies in the biomedical literature. Part 1 in this issue will discuss fundamental topics of statistics and data analysis. Additionally, some of the most commonly used statistical tests found in the biomedical literature will be reviewed in Part 2 in the February 2008 issue.
Tuuli, Methodius G; Odibo, Anthony O
2011-08-01
The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Andrew; Haass, Michael; Rintoul, Mark Daniel
GazeAppraise advances the state of the art of gaze pattern analysis using methods that simultaneously analyze spatial and temporal characteristics of gaze patterns. GazeAppraise enables novel research in visual perception and cognition; for example, using shape features as distinguishing elements to assess individual differences in visual search strategy. Given a set of point-to-point gaze sequences, hereafter referred to as scanpaths, the method constructs multiple descriptive features for each scanpath. Once the scanpath features have been calculated, they are used to form a multidimensional vector representing each scanpath and cluster analysis is performed on the set of vectors from all scanpaths. An additional benefit of this method is the identification of causal or correlated characteristics of the stimuli, subjects, and visual task through statistical analysis of descriptive metadata distributions within and across clusters.
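The feature-vector step described above can be sketched as follows. The feature names here are invented for illustration and are not GazeAppraise's actual feature set; the point is only that each scanpath reduces to a fixed-length numeric vector suitable for clustering:

```python
import math

def scanpath_features(points):
    """Descriptive shape features for one scanpath, given as a list of
    (x, y) fixation points; features chosen for illustration only."""
    segments = [math.dist(p, q) for p, q in zip(points, points[1:])]
    xs, ys = zip(*points)
    return {
        "length": sum(segments),                               # total path length
        "bbox_area": (max(xs) - min(xs)) * (max(ys) - min(ys)),
        "drift": math.dist(points[0], points[-1]),             # start-to-end
    }
```

Applying this to every scanpath yields the multidimensional vectors on which a standard clustering algorithm (e.g. k-means) can then operate.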
Final excitation energy of fission fragments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmidt, Karl-Heinz; Jurado, Beatriz
We study how the excitation energy of the fully accelerated fission fragments is built up. It is stressed that only the intrinsic excitation energy available before scission can be exchanged between the fission fragments to achieve thermal equilibrium. This is in contradiction with most models used to calculate prompt neutron emission, where it is assumed that the total excitation energy of the final fragments is shared between the fragments by the condition of equal temperatures. We also study the intrinsic excitation-energy partition in statistical equilibrium for different level-density descriptions as a function of the total intrinsic excitation energy of the fissioning system. Excitation energies are found to be strongly enhanced in the heavy fragment, if the level density follows a constant-temperature behavior at low energies, e.g., in the composed Gilbert-Cameron description.
Comparative Research of Navy Voluntary Education at Operational Commands
2017-03-01
Keywords: return on investment (ROI), logistic regression, multivariate analysis, descriptive statistics, Markov, time-series, linear programming. [Remaining text is report-form and table-of-contents residue: Descriptive Statistics Tables; Privacy Considerations.]
Linear models for calculating digestible energy for sheep diets.
Fonnesbeck, P V; Christiansen, M L; Harris, L E
1981-05-01
Equations for estimating the digestible energy (DE) content of sheep diets were generated from the chemical contents and a factorial description of diets fed to lambs in digestion trials. The diet factors were two forages (alfalfa and grass hay), harvested at three stages of maturity (late vegetative, early bloom and full bloom), fed in two ingredient combinations (all hay or a 50:50 hay and corn grain mixture) and prepared by two forage texture processes (coarsely chopped or finely chopped and pelleted). The 2 x 3 x 2 x 2 factorial arrangement produced 24 diet treatments. These were replicated twice, for a total of 48 lamb digestion trials. In model 1 regression equations, DE was calculated directly from chemical composition of the diet. In model 2, regression equations predicted the percentage of digested nutrient from the chemical contents of the diet and then DE of the diet was calculated as the sum of the gross energy of the digested organic components. Expanded forms of model 1 and model 2 were also developed that included diet factors as qualitative indicator variables to adjust the regression constant and regression coefficients for the diet description. The expanded forms of the equations accounted for significantly more variation in DE than did the simple models and more accurately estimated DE of the diet. Information provided by the diet description proved as useful as chemical analyses for the prediction of digestibility of nutrients. The statistics indicate that, with model 1, neutral detergent fiber and plant cell wall analyses provided as much information for the estimation of DE as did model 2 with the combined information from crude protein, available carbohydrate, total lipid, cellulose and hemicellulose. Regression equations are presented for estimating DE with the most currently analyzed organic components, including linear and curvilinear variables and diet factors that significantly reduce the standard error of the estimate. 
To estimate the DE of a diet, the user selects the equation that makes the most effective use of the available chemical analyses and diet description.
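The model 1 approach above is ordinary least-squares regression of DE on chemical composition. The single-predictor sketch below is illustrative only; the paper's equations use several chemical components plus diet-factor indicator variables, and the numbers here are invented:

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x; an illustrative stand-in
    for a model 1 regression of diet DE on one chemical component."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx                 # slope
    return my - b * mx, b         # intercept, slope

# Hypothetical fiber percentages vs. DE values (not the trial data):
a, b = fit_line([40, 50, 60, 70], [3.1, 2.8, 2.5, 2.2])
```

The fitted slope being negative matches the expected direction: higher fiber content predicts lower digestible energy.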
Analysis of Professional and Pre-Accession Characteristics and Junior Naval Officer Performance
2018-03-01
[Table-of-contents fragments only: Navy Performance Evaluation System; Data Description; Summary Statistics; Descriptive Statistics.]
The statistical average of optical properties for alumina particle cluster in aircraft plume
NASA Astrophysics Data System (ADS)
Li, Jingying; Bai, Lu; Wu, Zhensen; Guo, Lixin
2018-04-01
We establish a model in which the monomer radius and number of alumina particle clusters in a plume follow lognormal distributions. Based on the Multi-Sphere T-Matrix (MSTM) theory, we provide a method for finding the statistical average of the optical properties of alumina particle clusters in a plume, analyze the effect of different distributions and different detection wavelengths on this statistical average, and compare the statistical average optical properties under the alumina particle cluster model established in this study with those under three simplified alumina particle models. The calculation results show that the monomer number of an alumina particle cluster and its size distribution have a considerable effect on its statistical average optical properties. The statistical averages of the optical properties at common detection wavelengths exhibit obvious differences, which have a great effect on modeling the IR and UV radiation properties of the plume. Compared with the three simplified models, the alumina particle cluster model herein features both higher extinction and scattering efficiencies. Therefore, an accurate description of the scattering properties of alumina particles in an aircraft plume is of great significance for the study of plume radiation properties.
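The statistical averaging step above amounts to integrating a size-dependent property over the lognormal size distribution. The sketch below is a toy stand-in: the property function and parameters are hypothetical, and the real calculation averages MSTM-computed optical efficiencies rather than an arbitrary function:

```python
import random

def lognormal_average(prop, mu, sigma, n=20000, seed=3):
    """Monte Carlo statistical average of a size-dependent property over
    monomer radii drawn from a lognormal(mu, sigma) distribution."""
    rng = random.Random(seed)
    return sum(prop(rng.lognormvariate(mu, sigma)) for _ in range(n)) / n

# Sanity check: <r> for lognormal(mu=0, sigma=0.5) is exp(sigma**2 / 2)
mean_radius = lognormal_average(lambda r: r, 0.0, 0.5)
```

Replacing the lambda with a scattering-efficiency function of radius would give the population-averaged optical property at a chosen wavelength.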
A program code generator for multiphysics biological simulation using markup languages.
Amano, Akira; Kawabata, Masanari; Yamashita, Yoshiharu; Rusty Punzalan, Florencio; Shimayoshi, Takao; Kuwabara, Hiroaki; Kunieda, Yoshitoshi
2012-01-01
To cope with the complexity of biological function simulation models, model representation with description languages is becoming popular. However, the simulation software itself becomes complex in these environments, and it is therefore difficult to modify the simulation conditions, target computation resources, or calculation methods. Complex biological function simulation software involves 1) model equations, 2) boundary conditions and 3) calculation schemes. A description model file helps with the first point and partly with the second, but the third is difficult to handle because varied calculation schemes are required for simulation models constructed from two or more elementary models. We introduce a simulation software generation system that uses a description-language-based specification of the coupling calculation scheme together with the cell model description file. With this software, we can easily generate biological simulation code with a variety of coupling calculation schemes. To show the efficiency of our system, an example of a coupling calculation scheme with three elementary models is shown.
NASA Astrophysics Data System (ADS)
Adlmann, Franz A.; Herbel, Jörg; Korolkovas, Airidas; Bliersbach, Andreas; Toperverg, Boris; Van Herck, Walter; Pálsson, Gunnar K.; Kitchen, Brian; Wolff, Max
2018-04-01
Grazing incidence neutron scattering experiments offer surface sensitivity by reflecting from an interface at momentum transfers close to total external reflection. Under these conditions the penetration depth is strongly non-linear and may change by many orders of magnitude. This fact imposes severe challenges for depth-resolved experiments, since the brilliance of neutron beams is relatively low in comparison to e.g. synchrotron radiation. In this article we use probability density functions to calculate the contribution of scattering at different distances from an interface to the intensities registered on the detector. Our method has the particular advantage that the depth sensitivity is directly extracted from the scattering pattern itself. Hence, for perfectly known samples exact resolution functions can be calculated, and vice versa. We show that any tails in the resolution function, e.g. Gaussian shaped, hinder depth-resolved experiments. More importantly, we provide means for a descriptive statistical analysis of detector images with respect to the scattering contributions and show that even with perfect resolution, near-surface scattering is hardly accessible.
Descriptive Statistical Techniques for Librarians. 2nd Edition.
ERIC Educational Resources Information Center
Hafner, Arthur W.
A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…
Joaquim, Ana; Custódio, Sandra; Savva-Bordalo, Joana; Chacim, Sérgio; Carvalhais, Inês; Lombo, Liliana; Lopes, Heitor; Araújo, António; Gomes, Rui
2018-03-01
Burnout is a professional syndrome associated with stress caused by overwork. Our aim was to calculate the prevalence of burnout and stress among medical residents of Oncology, Haematology and Radiotherapy in Portugal, and to determine predictors of burnout and stress. An anonymous questionnaire was administered (n = 118). Statistical analysis consisted of descriptive and inferential analyses. The prevalence of burnout and stress was calculated to be 45.2% and 50%, respectively. The dimensions that generated higher levels of stress were 'dealing with patients' and 'overwork'. Burnout was directly related to the stress dimension 'overwork'. The prevalence of burnout in Portuguese oncological residents is as high as in other European countries and in the U.S. Therefore, interventional strategies can be designed.
Theoretical study of the kinetics of reactions of the monohalogenated methanes with atomic chlorine.
Brudnik, Katarzyna; Twarda, Maria; Sarzyński, Dariusz; Jodkowski, Jerzy T
2013-04-01
Ab initio calculations at the G2 level were used in a theoretical description of the kinetics and mechanism of the hydrogen abstraction reactions from fluoro-, chloro- and bromomethane by chlorine atoms. The profiles of the potential energy surfaces show that the mechanism of the reactions under investigation is complex and consists of two elementary steps in the case of CH3F+Cl and of three for CH3Cl+Cl and CH3Br+Cl. The heights of the energy barrier related to the H-abstraction are 8-10 kJ mol(-1); the lowest value corresponds to CH3Cl+Cl and the highest one to CH3F+Cl. The rate constants were calculated using the theoretical method based on the RRKM theory and the simplified version of the statistical adiabatic channel model. The kinetic equations derived in this study [Formula: see text] and [Formula: see text] allow a description of the kinetics of the reactions under investigation in the temperature range of 200-3000 K. The kinetics of reactions of the entirely deuterated reactants were also included in the kinetic analysis. Results of ab initio calculations show that the D-abstraction process involves an energy barrier 5 kJ mol(-1) higher than that for H-abstraction from the corresponding non-deuterated reactant molecule. The derived analytical equations for the reactions CD3X+Cl, CH2X+HCl and CD2X+DCl (X = F, Cl and Br) are a substantial supplement to the kinetic data necessary for the description and modeling of processes of importance in atmospheric chemistry.
Teif, Vladimir B
2007-01-01
The transfer matrix methodology is proposed as a systematic tool for the statistical-mechanical description of DNA-protein-drug binding involved in gene regulation. We show that a genetic system of several cis-regulatory modules is calculable using this method, considering explicitly the site-overlapping, competitive, cooperative binding of regulatory proteins, their multilayer assembly and DNA looping. In the methodological section, the matrix models are solved for the basic types of short- and long-range interactions between DNA-bound proteins, drugs and nucleosomes. We apply the matrix method to gene regulation at the O(R) operator of phage lambda. The transfer matrix formalism allowed the description of the lambda-switch at a single-nucleotide resolution, taking into account the effects of a range of inter-protein distances. Our calculations confirm previously established roles of the contact CI-Cro-RNAP interactions. Concerning long-range interactions, we show that while the DNA loop between the O(R) and O(L) operators is important at the lysogenic CI concentrations, the interference between the adjacent promoters P(R) and P(RM) becomes more important at small CI concentrations. A large change in the expression pattern may arise in this regime due to anticooperative interactions between DNA-bound RNA polymerases. The applicability of the matrix method to more complex systems is discussed.
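The transfer-matrix idea above can be sketched in miniature. The toy below is not the paper's lambda-switch model: it computes the partition function for ligands binding single sites of a 1-D lattice, with a statistical weight K per bound site and an extra cooperativity factor w per adjacent bound pair, which is the simplest instance of the method:

```python
def partition_function(n_sites, K, w):
    """Transfer-matrix partition function for nearest-neighbor
    cooperative binding on a 1-D lattice of n_sites sites."""
    vec = [1.0, K]                         # (last site free, last site bound)
    for _ in range(n_sites - 1):
        free = vec[0] + vec[1]             # next site free
        bound = K * (vec[0] + w * vec[1])  # next site bound (+cooperativity)
        vec = [free, bound]
    return vec[0] + vec[1]
```

With w = 1 binding is independent and the partition function reduces to (1 + K)**n_sites; occupancies and their concentration dependence follow by differentiating log Z with respect to log K.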
An Overview of Interrater Agreement on Likert Scales for Researchers and Practitioners
O'Neill, Thomas A.
2017-01-01
Applications of interrater agreement (IRA) statistics for Likert scales are plentiful in research and practice. IRA may be implicated in job analysis, performance appraisal, panel interviews, and any other approach to gathering systematic observations. Any rating system involving subject-matter experts can also benefit from IRA as a measure of consensus. Further, IRA is fundamental to aggregation in multilevel research, which is becoming increasingly common in order to address nesting. Although several technical descriptions of a few specific IRA statistics exist, this paper aims to provide a tractable orientation to common IRA indices to support application. The introductory overview is written with the intent of facilitating contrasts among IRA statistics by critically reviewing equations, interpretations, strengths, and weaknesses. Statistics considered include rwg, rwg*, r′wg, rwg(p), average deviation (AD), awg, standard deviation (Swg), and the coefficient of variation (CVwg). Equations support quick calculation and contrasting of different agreement indices. The article also includes a “quick reference” table and three figures in order to help readers identify how IRA statistics differ and how interpretations of IRA will depend strongly on the statistic employed. A brief consideration of recommended practices involving statistical and practical cutoff standards is presented, and conclusions are offered in light of the current literature. PMID:28553257
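Of the indices listed above, rwg is the most common and is quick to compute: it is one minus the ratio of the observed rating variance to the error variance of a uniform (no-agreement) null distribution on an A-point scale, (A**2 - 1) / 12. A minimal sketch with invented ratings:

```python
import statistics

def r_wg(ratings, n_options):
    """Single-item interrater agreement r_wg for one target rated by
    several judges on an n_options-point Likert scale."""
    sigma2_eu = (n_options ** 2 - 1) / 12    # uniform-null error variance
    return 1 - statistics.variance(ratings) / sigma2_eu
```

Perfect agreement gives 1.0; ratings more polarized than the uniform null can produce negative values, one of the interpretive quirks the article contrasts across indices.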
Gates, Allison; Gates, Michelle; Duarte, Gonçalo; Cary, Maria; Becker, Monika; Prediger, Barbara; Vandermeer, Ben; Fernandes, Ricardo M; Pieper, Dawid; Hartling, Lisa
2018-06-13
Systematic reviews (SRs) of randomised controlled trials (RCTs) can provide the best evidence to inform decision-making, but their methodological and reporting quality varies. Tools exist to guide the critical appraisal of quality and risk of bias in SRs, but evaluations of their measurement properties are limited. We will investigate the interrater reliability (IRR), usability, and applicability of A MeaSurement Tool to Assess systematic Reviews (AMSTAR), AMSTAR 2, and Risk Of Bias In Systematic reviews (ROBIS) for SRs in the fields of biomedicine and public health. An international team of researchers at three collaborating centres will undertake the study. We will use a random sample of 30 SRs of RCTs investigating therapeutic interventions indexed in MEDLINE in February 2014. Two reviewers at each centre will appraise the quality and risk of bias in each SR using AMSTAR, AMSTAR 2, and ROBIS. We will record the time to complete each assessment and for the two reviewers to reach consensus for each SR. We will extract the descriptive characteristics of each SR, the included studies, participants, interventions, and comparators. We will also extract the direction and strength of the results and conclusions for the primary outcome. We will summarise the descriptive characteristics of the SRs using means and standard deviations, or frequencies and proportions. To test for interrater reliability between reviewers and between the consensus agreements of reviewer pairs, we will use Gwet's AC1 statistic. For comparability to previous evaluations, we will also calculate weighted Cohen's kappa and Fleiss' kappa statistics. To estimate usability, we will calculate the mean time to complete the appraisal and to reach consensus for each tool. To inform applications of the tools, we will test for statistical associations between quality scores and risk of bias judgments, and the results and conclusions of the SRs.
Appraising the methodological and reporting quality of SRs is necessary to determine the trustworthiness of their conclusions. Which tool may be most reliably applied and how the appraisals should be used is uncertain; the usability of newly developed tools is unknown. This investigation of common (AMSTAR) and newly developed (AMSTAR 2, ROBIS) tools will provide empiric data to inform their application, interpretation, and refinement.
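Among the reliability statistics named in the protocol above, Cohen's kappa is the most familiar: observed agreement corrected for the agreement expected by chance from each rater's marginal frequencies. A minimal sketch with invented ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical judgments, e.g.
    hypothetical 'low'/'high' risk-of-bias calls over a set of reviews."""
    n = len(rater1)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_exp = sum(c1[k] * c2[k] for k in c1) / n ** 2   # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)
```

Gwet's AC1 replaces the chance-agreement term to reduce kappa's sensitivity to skewed marginal distributions, which is why the protocol prefers it while reporting kappa for comparability.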
Single-gender mathematics and science classes and the effects on urban middle school boys and girls
NASA Astrophysics Data System (ADS)
Sudler, Dawn M.
This study compared the differences in the Criterion-Referenced Competency Test (CRCT) mathematics and science achievement scores of boys and girls in Grade 7 at two urban middle schools. The data allowed the researcher to determine to what degree boys and girls in Grade 7 differ in their mathematics and science achievements within a single-gender environment versus a coeducational learning environment. The study compared differences between boys and girls in Grade 7 within a single-gender environment in the subjects of mathematics and science, as measured by the CRCT assessments. The study also compared differences between boys and girls in Grade 7 within a coeducational environment in the same subjects. Two middle schools were used within the study: one identified as a single-gender school (Middle School A), the other as a coeducational school (Middle School B). This quantitative study used a descriptive research design. CRCT scores for the subjects of mathematics and science were taken during the spring of 2008 from both middle schools. Data were analyzed using descriptive statistics and independent t-test calculations. Frequency statistics were used to compare performance levels across samples. The data were described with means, standard deviations, standard error means, frequencies, and percentages. This method provided a clear description of how the samples scored on the spring 2008 CRCT mathematics and science assessments.
Writing to Learn Statistics in an Advanced Placement Statistics Course
ERIC Educational Resources Information Center
Northrup, Christian Glenn
2012-01-01
This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…
The Greyhound Strike: Using a Labor Dispute to Teach Descriptive Statistics.
ERIC Educational Resources Information Center
Shatz, Mark A.
1985-01-01
A simulation exercise of a labor-management dispute is used to teach psychology students some of the basics of descriptive statistics. Using comparable data sets generated by the instructor, students work in small groups to develop a statistical presentation that supports their particular position in the dispute. (Author/RM)
High Accuracy Transistor Compact Model Calibrations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hembree, Charles E.; Mar, Alan; Robertson, Perry J.
2015-09-01
Typically, transistors are modeled by the application of calibrated nominal and range models. These models consist of differing parameter values that describe the location and the upper and lower limits of a distribution of some transistor characteristic such as current capacity. Correspondingly, when using this approach, high degrees of accuracy of the transistor models are not expected, since the set of models is a surrogate for a statistical description of the devices. The use of these types of models describes expected performance considering the extremes of process or transistor deviations. In contrast, circuits that have very stringent accuracy requirements require modeling techniques with higher accuracy. Since these accurate models have low error in transistor descriptions, they can be used to describe part-to-part variations as well as to give an accurate description of a single circuit instance. Thus, models that meet these stipulations also enable the quantification of margins with respect to a functional threshold and of the uncertainties in these margins. Given this need, new high-accuracy model calibration techniques for bipolar junction transistors have been developed and are described in this report.
NASA Astrophysics Data System (ADS)
Niedermeier, Dennis; Ervens, Barbara; Clauss, Tina; Voigtländer, Jens; Wex, Heike; Hartmann, Susan; Stratmann, Frank
2014-01-01
In a recent study, the Soccer ball model (SBM) was introduced for modeling and/or parameterizing heterogeneous ice nucleation processes. The model applies classical nucleation theory. It allows for a consistent description of both apparently singular and stochastic ice nucleation behavior, by distributing contact angles over the nucleation sites of a particle population assuming a Gaussian probability density function. The original SBM utilizes the Monte Carlo technique, which hampers its usage in atmospheric models, as fairly time-consuming calculations must be performed to obtain statistically significant results. Thus, we have developed a simplified and computationally more efficient version of the SBM. We successfully used the new SBM to parameterize experimental nucleation data of, e.g., bacterial ice nucleation. Both SBMs give identical results; however, the new model is computationally less expensive as confirmed by cloud parcel simulations. Therefore, it is a suitable tool for describing heterogeneous ice nucleation processes in atmospheric models.
Rear-End Crashes: Problem Size Assessment And Statistical Description
DOT National Transportation Integrated Search
1993-05-01
Keywords: research and development (R&D); advanced vehicle control & safety systems (AVCSS); intelligent vehicle initiative (IVI). This document presents problem size assessments and statistical crash descriptions for rear-end crashes, inc...
Version 2.0 Visual Sample Plan (VSP): UXO Module Code Description and Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, Richard O.; Wilson, John E.; O'Brien, Robert F.
2003-05-06
The Pacific Northwest National Laboratory (PNNL) is developing statistical methods for determining the amount of geophysical surveying conducted along transects (swaths) that is needed to achieve specified levels of confidence of finding target areas (TAs) of anomalous readings and possibly unexploded ordnance (UXO) at closed, transferring, and transferred (CTT) Department of Defense (DoD) ranges and other sites. The statistical methods developed by PNNL have been coded into the UXO module of the Visual Sample Plan (VSP) software, which is being developed by PNNL with support from the DoD, the U.S. Department of Energy (DOE), and the U.S. Environmental Protection Agency (EPA). (The VSP software and VSP Users Guide (Hassig et al., 2002) may be downloaded from http://dqo.pnl.gov/vsp.) This report describes and documents the statistical methods developed and the calculations and verification testing that have been conducted to verify that VSP's implementation of these methods is correct and accurate.
Statistics in the pharmacy literature.
Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R
2004-09-01
Research in statistical methods is essential for maintaining the high quality of the published literature. To update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals, we obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-square (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified, to facilitate the appropriate appraisal and consequent utilization of the information available in research articles.
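The tallying procedure this abstract describes (recording, per article, which statistical terms appear, then reporting the percentage of articles using each) can be sketched as follows; the article term lists here are hypothetical, not the study's data:

```python
from collections import Counter

# hypothetical per-article term sets standing in for the reviewers' records
articles = [
    {"percentage", "mean", "standard deviation", "chi-square"},
    {"percentage", "mean", "t-test"},
    {"percentage", "range"},
    {"mean", "standard deviation", "ANOVA"},
]

# count each term at most once per article, then convert to percent of articles
counts = Counter(term for terms in articles for term in terms)
pct = {term: 100 * n / len(articles) for term, n in counts.items()}
```

Using sets per article matches the paper's reporting unit (percentage of articles using a term), rather than raw occurrence counts.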
Cavallario, Julie M; Van Lunen, Bonnie L
2015-07-01
The examination of the appropriate professional degree for preparation as an athletic trainer is of interest to the profession. Descriptive information concerning universal outcomes is needed to understand the effect of a degree change. To obtain and compare descriptive information related to professional athletic training programs and a potential degree change and to determine if any of these factors contribute to success on existing universal outcome measures. Cross-sectional study. Web-based survey. We contacted 364 program directors; 178 (48.9%; 163 undergraduate, 15 postbaccalaureate) responded. The survey consisted of 46 questions: 45 questions that dealt with 5 themes (institutional demographics [n = 13], program admissions [n = 6], program outcomes [n = 10], program design [n = 9], faculty and staff [n = 7]) and 1 optional question. Descriptive statistics for all programs were calculated. We compared undergraduate and postbaccalaureate programs by examining universal outcome variables. Descriptive statistics demonstrated that 33 programs could not support postbaccalaureate degrees, and a substantial loss of faculty could occur if the degree requirement changed (553 graduate assistants, 642 potentially underqualified instructors). Postbaccalaureate professional programs had higher 2011-2012 first-time Board of Certification (BOC) passing rates (U = 464.5, P = .001), 3-year aggregate first-time BOC passing rates (U = 451.5, P = .001), and employment rates for 2011-2012 graduates employed within athletic training (U = 614.0, P = .01). Linear multiple-regression models demonstrated that program and institution type contributed to the variance of the first-time BOC passing rates and the 3-year aggregate first-time BOC passing rates (P < .05). Students in postbaccalaureate athletic training programs performed better in universal outcome measures. 
Our data supported the concerns that this transition could result in the loss of some programs and an additional immediate strain on current staff due to potential staffing changes and the loss of graduate assistant positions.
Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces the indices used for multicollinearity diagnosis, the basic principle of principal component regression, and a method for determining the 'best' equation. An example describes how to perform principal component regression analysis with SPSS 10.0, covering the full calculation process of the principal component regression and the operation of the linear regression, factor analysis, descriptives, compute variable, and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity; with SPSS, the analysis is simplified, faster, and accurate.
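The SPSS workflow described above can be approximated outside SPSS. A minimal numpy sketch of principal component regression (standardize the predictors, extract components from their correlation matrix, regress on the leading component, and map the coefficients back), using simulated collinear data rather than the paper's example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)      # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 1.0 + 2.0 * x1 + rng.normal(scale=0.1, size=n)

# standardize predictors, then diagonalize their correlation matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]             # sort components by variance
eigvecs = eigvecs[:, order]

# regress y on the leading principal component only (k = 1)
k = 1
T = Z @ eigvecs[:, :k]
A = np.column_stack([np.ones(n), T])
gamma, *_ = np.linalg.lstsq(A, y, rcond=None)

# map component coefficients back to standardized-predictor coefficients
beta = eigvecs[:, :k] @ gamma[1:]
```

Because the regression is on the (orthogonal) components rather than the collinear predictors themselves, the coefficient estimates are stable even though x1 and x2 are almost perfectly correlated.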
The method of expected number of deaths, 1786-1886-1986.
Keiding, N
1987-04-01
"The method of expected number of deaths is an integral part of standardization of vital rates, which is one of the oldest statistical techniques. The expected number of deaths was calculated in 18th century actuarial mathematics...but the method seems to have been forgotten, and was reinvented in connection with 19th century studies of geographical and occupational variations of mortality.... It is noted that standardization of rates is intimately connected to the study of relative mortality, and a short description of very recent developments in the methodology of that area is included." (SUMMARY IN FRE) excerpt
ERIC Educational Resources Information Center
Perrett, Jamis J.
2012-01-01
This article demonstrates how textbooks differ in their description of the term "experimental unit". Advanced Placement Statistics teachers and students are often limited in their statistical knowledge by the information presented in their classroom textbook. Definitions and descriptions differ among textbooks as well as among different…
2016-02-02
Excerpts: Descriptive Statistics for Enlisted Service Applicants and Accessions; Summary Statistics for Applicants and Accessions for Enlisted Service; Applicants and...utilization among Soldiers screened using TAPAS. Section 2 of this report includes the descriptive statistics AMSARA compiles and publishes.
Hayat, Matthew J.; Powell, Amanda; Johnson, Tessa; Cadwell, Betsy L.
2017-01-01
Statistical literacy and knowledge is needed to read and understand the public health literature. The purpose of this study was to quantify basic and advanced statistical methods used in public health research. We randomly sampled 216 published articles from seven top tier general public health journals. Studies were reviewed by two readers and a standardized data collection form completed for each article. Data were analyzed with descriptive statistics and frequency distributions. Results were summarized for statistical methods used in the literature, including descriptive and inferential statistics, modeling, advanced statistical techniques, and statistical software used. Approximately 81.9% of articles reported an observational study design and 93.1% of articles were substantively focused. Descriptive statistics in table or graphical form were reported in more than 95% of the articles, and statistical inference reported in more than 76% of the studies reviewed. These results reveal the types of statistical methods currently used in the public health literature. Although this study did not obtain information on what should be taught, information on statistical methods being used is useful for curriculum development in graduate health sciences education, as well as making informed decisions about continuing education for public health professionals. PMID:28591190
van Meer, R; Gritsenko, O V; Baerends, E J
2014-10-14
In recent years, several benchmark studies on the performance of large sets of functionals in time-dependent density functional theory (TDDFT) calculations of excitation energies have been performed. The tested functionals do not approximate exact Kohn-Sham orbitals and orbital energies closely. We highlight the advantages of (close to) exact Kohn-Sham orbitals and orbital energies for a simple description, very often as just a single orbital-to-orbital transition, of molecular excitations. Benchmark calculations are performed with the statistical average of orbital potentials (SAOP) functional for the potential [J. Chem. Phys. 2000, 112, 1344; 2001, 114, 652], which approximates the true Kohn-Sham potential much better than LDA, GGA, mGGA, and hybrid potentials do. An accurate Kohn-Sham potential not only performs satisfactorily for calculated vertical excitation energies of both valence and Rydberg transitions but also exhibits appealing properties of the KS orbitals, including occupied orbital energies close to ionization energies, virtual-occupied orbital energy gaps very close to excitation energies, and realistic shapes of virtual orbitals, leading to straightforward interpretation of most excitations as single orbital transitions. We stress that such advantages are completely lost in time-dependent Hartree-Fock and partly in hybrid approaches. Many excitations and excitation energies calculated with local density, generalized gradient, and hybrid functionals are spurious. There is, with an accurate KS, or even the LDA or GGA potentials, nothing problematic about the "band gap" in molecules: the HOMO-LUMO gap is close to the first excitation energy (the optical gap).
C-statistic fitting routines: User's manual and reference guide
NASA Technical Reports Server (NTRS)
Nousek, John A.; Farwana, Vida
1991-01-01
The computer program discussed can read several input files and provide a best-fit set of values for the functions provided by the user, using either the C-statistic or the chi-squared statistic. The program consists of one main routine and several functions and subroutines. Detailed descriptions of each function and subroutine are presented, along with a brief description of the C-statistic and the reasons for its application.
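For readers unfamiliar with the C-statistic (Cash statistic), which is the appropriate fit statistic for Poisson-distributed counts where chi-squared is biased, here is a sketch of one common form of it; this is a general illustration, not the manual's own code:

```python
import math

def cash_stat(observed, model):
    """One common form of the Cash statistic for Poisson-distributed counts:
    C = 2 * sum(m_i - d_i + d_i * ln(d_i / m_i)),
    with the d*ln(d/m) term taken as 0 when d == 0.
    C approaches the chi-squared statistic in the large-count limit."""
    c = 0.0
    for d, m in zip(observed, model):
        c += m - d
        if d > 0:
            c += d * math.log(d / m)
    return 2.0 * c
```

Unlike chi-squared, this form stays well defined for bins with zero observed counts, which is why it is favored for sparse X-ray spectra.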
Sample size considerations for clinical research studies in nuclear cardiology.
Chiuzan, Cody; West, Erin A; Duong, Jimmy; Cheung, Ken Y K; Einstein, Andrew J
2015-12-01
Sample size calculation is an important element of research design that investigators need to consider in the planning stage of the study. Funding agencies and research review panels request a power analysis, for example, to determine the minimum number of subjects needed for an experiment to be informative. Calculating the right sample size is crucial to gaining accurate information and ensures that research resources are used efficiently and ethically. The simple question "How many subjects do I need?" does not always have a simple answer. Before calculating the sample size requirements, a researcher must address several aspects, such as purpose of the research (descriptive or comparative), type of samples (one or more groups), and data being collected (continuous or categorical). In this article, we describe some of the most frequent methods for calculating the sample size with examples from nuclear cardiology research, including for t tests, analysis of variance (ANOVA), non-parametric tests, correlation, Chi-squared tests, and survival analysis. For the ease of implementation, several examples are also illustrated via user-friendly free statistical software.
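As a concrete instance of the t-test case mentioned above, the usual normal-approximation formula for two equal groups, n per group = 2((z_{1-a/2} + z_{1-b}) * sigma / delta)^2, can be computed with the standard library alone (the effect size below is a hypothetical example, not from the article):

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided,
    two-sample comparison of means. Slightly underestimates the exact
    t-based answer for small n."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)            # e.g. 0.84 for 80% power
    return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)
```

For a medium standardized effect (delta/sigma = 0.5) at 80% power and a two-sided 5% significance level, this gives 63 subjects per group.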
Mobile Applications for Type 2 Diabetes Risk Estimation: a Systematic Review.
Fijacko, Nino; Brzan, Petra Povalej; Stiglic, Gregor
2015-10-01
Screening for chronic diseases like type 2 diabetes can be done using different methods and various risk tests. This study presents a review of type 2 diabetes risk-estimation mobile applications, focusing on their functionality and the availability of information on the underlying risk calculators. Only 9 of the 31 reviewed mobile applications, featured in three major mobile application stores, disclosed the name of the risk calculator used for assessing the risk of type 2 diabetes. Even more concerning, none of the reviewed applications mentioned that they collect data from users to improve the performance of their risk-estimation calculators, or offered users descriptive statistics of the results from users who had already used the application. For that purpose, the questionnaires used for risk calculation should be upgraded to include the most recent blood sugar level measurements from users. Although mobile applications represent great future potential for health applications, developers still do not put enough emphasis on informing the user of the underlying methods used to estimate the risk for a specific clinical condition.
Study of fatigue crack propagation in Ti-1Al-1Mn based on the calculation of cold work evolution
NASA Astrophysics Data System (ADS)
Plekhov, O. A.; Kostina, A. A.
2017-05-01
This work proposes a numerical method for lifetime assessment of metallic materials based on consideration of the energy balance at the crack tip. The method rests on evaluating the stored energy per loading cycle. To calculate the stored and dissipated parts of the deformation energy, an elasto-plastic phenomenological model of the energy balance in metals under deformation and failure was proposed. The key point of the model is a strain-type internal variable describing the energy storage process. This parameter is introduced, based on the statistical description of defect evolution in metals, as a second-order tensor, and has the meaning of an additional strain due to the initiation and growth of defects. The fatigue crack rate was calculated in the framework of a stationary-crack approach (several loading cycles at each crack length were considered to estimate the energy balance at the crack tip). The application of the proposed algorithm is illustrated by calculating the lifetime of a Ti-1Al-1Mn compact tension specimen under cyclic loading.
Multispectral scanner system parameter study and analysis software system description, volume 2
NASA Technical Reports Server (NTRS)
Landgrebe, D. A. (Principal Investigator); Mobasseri, B. G.; Wiersma, D. J.; Wiswell, E. R.; Mcgillem, C. D.; Anuta, P. E.
1978-01-01
The author has identified the following significant results. The integration of the available methods provided the analyst with the unified scanner analysis package (USAP), whose flexibility and versatility were superior to many previous integrated techniques. The USAP consisted of three main subsystems: (1) a spatial path, (2) a spectral path, and (3) a set of analytic classification accuracy estimators which evaluated system performance. The spatial path consisted of satellite and/or aircraft data, a data correlation analyzer, the scanner IFOV, and a random noise model; its output was fed into the analytic classification accuracy predictor. The spectral path consisted of laboratory and/or field spectral data, EXOSYS data retrieval, optimum spectral function calculation, data transformation, and statistics calculation; its output was fed into the stratified posterior performance estimator.
Validating Future Force Performance Measures (Army Class): Concluding Analyses
2016-06-01
Excerpts: Table 3.10, Descriptive Statistics and Intercorrelations for LV Final Predictor Factor Scores; Table 4.7, Descriptive Statistics for Analysis Criteria. ...Soldier attrition and performance: Dependability (Non-Delinquency), Adjustment, Physical Conditioning, Leadership, Work Orientation, and Agreeableness.
Energy Weighted Angular Correlations Between Hadrons Produced in Electron-Positron Annihilation.
NASA Astrophysics Data System (ADS)
Strharsky, Roger Joseph
Electron-positron annihilation at large center of mass energy produces many hadronic particles. Experimentalists then measure the energies of these particles in calorimeters. This study investigated correlations between the angular locations of one or two such calorimeters and the angular orientation of the electron beam in the laboratory frame of reference. The calculation of these correlations includes weighting by the fraction of the total center of mass energy which the calorimeter measures. Starting with the assumption that the reaction proceeds through the intermediate production of a single quark/anti-quark pair, a simple statistical model was developed to provide a phenomenological description of the distribution of final state hadrons. The model distributions were then used to calculate the one- and two-calorimeter correlation functions. Results of these calculations were compared with available data and several predictions were made for those quantities which had not yet been measured. Failure of the model to reproduce all of the data was discussed in terms of quantum chromodynamics, a fundamental theory which includes quark interactions.
Predicting Subsequent Myopia in Initially Pilot-Qualified USAFA Cadets.
1985-12-27
Excerpts: Refraction Measurement; 4.0 Results; 4.1 Descriptive Statistics; 4.2 Predictive Statistics. ...mentioned), and three were missing a status. The data of the subject who was commissionable were dropped from the statistical analyses. Of the 91...relatively equal numbers of participants from all classes will become obvious within the results. 4.1 Descriptive Statistics: In the original plan
Evidence-based orthodontics. Current statistical trends in published articles in one journal.
Law, Scott V; Chudasama, Dipak N; Rinchuse, Donald J
2010-09-01
To ascertain the number, type, and overall usage of statistics in American Journal of Orthodontics and Dentofacial Orthopedics (AJODO) articles for 2008. These data were then compared to data from three previous years: 1975, 1985, and 2003. The frequency and distribution of statistics used in the AJODO original articles for 2008 were dichotomized into those using statistics and those not using statistics. Statistical procedures were then broadly divided into descriptive statistics (mean, standard deviation, range, percentage) and inferential statistics (t-test, analysis of variance). Descriptive statistics were used to make comparisons. In 1975, 1985, 2003, and 2008, AJODO published 72, 87, 134, and 141 original articles, respectively. The percentage of original articles using statistics was 43.1% in 1975, 75.9% in 1985, 94.0% in 2003, and 92.9% in 2008; original articles using statistics stayed relatively the same from 2003 to 2008, with only a small 1.1% decrease. The percentage of articles using inferential statistical analyses was 23.7% in 1975, 74.2% in 1985, 92.9% in 2003, and 84.4% in 2008. Comparing AJODO publications in 2003 and 2008, there was an 8.5% increase in articles using only descriptive statistics (from 7.1% to 15.6%), and an 8.5% decrease in articles using inferential statistics (from 92.9% to 84.4%).
Blum, Thomas; Chowdhury, Saumitra; Hayakawa, Masashi; Izubuchi, Taku
2015-01-09
The most compelling possibility for a new law of nature beyond the four fundamental forces comprising the standard model of high-energy physics is the discrepancy between measurements and calculations of the muon anomalous magnetic moment. Until now a key part of the calculation, the hadronic light-by-light contribution, has only been accessible from models of QCD, the quantum description of the strong force, whose accuracy at the required level may be questioned. A first principles calculation with systematically improvable errors is needed, along with the upcoming experiments, to decisively settle the matter. For the first time, the form factor that yields the light-by-light scattering contribution to the muon anomalous magnetic moment is computed in such a framework, lattice QCD+QED and QED. A nonperturbative treatment of QED is used and checked against perturbation theory. The hadronic contribution is calculated for unphysical quark and muon masses, and only the diagram with a single quark loop is computed for which statistically significant signals are obtained. Initial results are promising, and the prospect for a complete calculation with physical masses and controlled errors is discussed.
Precalculus teachers' perspectives on using graphing calculators: an example from one curriculum
NASA Astrophysics Data System (ADS)
Karadeniz, Ilyas; Thompson, Denisse R.
2018-01-01
Graphing calculators are hand-held technological tools currently used in mathematics classrooms. Teachers' perspectives on using graphing calculators are important in terms of exploring what teachers think about using such technology in advanced mathematics courses, particularly precalculus courses. A descriptive intrinsic case study was conducted to analyse the perspectives of 11 teachers using graphing calculators with potential Computer Algebra System (CAS) capability while teaching Functions, Statistics, and Trigonometry, a precalculus course for 11th-grade students developed by the University of Chicago School Mathematics Project. Data were collected from multiple sources as part of a curriculum evaluation study conducted during the 2007-2008 school year. Although all teachers were using the same curriculum that integrated CAS into the instructional materials, teachers had mixed views about the technology. Graphing calculator features were used much more than CAS features, with many teachers concerned about the use of CAS because of pressures from external assessments. In addition, several teachers found it overwhelming to learn a new technology at the same time they were learning a new curriculum. The results have implications for curriculum developers and others working with teachers to update curriculum and the use of advanced technologies simultaneously.
do Nascimento, Paulo Roberto; Westphal, Marcia Faria; Moreira, Rafael da Silveira; Baltar, Valéria Troncoso; Moysés, Simone Tetu; Zioni, Fabiola; Minowa, Evelin
2014-01-01
In recent years, several local social agendas, such as Agenda 21 and Healthy Cities, have sought to improve the quality of life and health of the population. The aim was to identify how social agendas are impacting living conditions and health in municipalities of the five regions of Brazil. Through an ecological longitudinal study, the agendas' effects on the social determinants of health were measured in 105 municipalities, using indicators related to the eight dimensions of the Millennium Development Goals (MDGs). Indicators were also calculated for another 175 non-exposed municipalities. Descriptive statistics were calculated for each group of municipalities at three moments: the year of the agenda implementation, then 3 and 6 years later. Models were adjusted by the generalized estimating equations (GEE) method to assess the effects of the agendas, time, and their interaction. Nonparametric analysis of variance was used for the ordinal data with repeated measures. Impacts of the agendas were detected for reduction of hunger and increased universal access to education: 'percentage of children under one year with protein/caloric undernourishment' (interaction effect: p = 0.02) and 'age-grade distortion in the 8th grade of fundamental education' (interaction effect: p < 0.001). The comparative discussion between model results and descriptive statistics recommends, in further research, extending the period of investigation, using composite indexes, improving the methodology for apprehending the impacts of diffuse social policies for development, and using 'mixed methodologies' integrating quantitative and qualitative tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujiwara, K., E-mail: ku.fujiwara@screen.co.jp; Department of Mechanical Engineering, Graduate School of Engineering, Osaka University, 2-1 Yamadaoka, Suita, Osaka 565-0871; Shibahara, M., E-mail: siba@mech.eng.osaka-u.ac.jp
A classical molecular dynamics simulation was conducted for a system composed of fluid molecules between two planar solid surfaces, whose interactions are described by the 12-6 Lennard-Jones form. This paper presents a general description of the pressure components and interfacial tension at a fluid-solid interface obtained by the perturbative method on the basis of statistical thermodynamics, proposes a method to consider the pressure components tangential to an interface which are affected by interactions with solid atoms, and applies this method to the calculation system. The description of the perturbative method is extended to subsystems, and the local pressure components and interfacial tension at a liquid-solid interface are obtained and examined in one and two dimensions. The results are compared with those obtained by two alternative methods: (a) an evaluation of the intermolecular force acting on a plane, and (b) the conventional method based on the virial expression. The accuracy of the numerical results is examined through comparison of the results obtained by each method. The calculated local pressure components and interfacial tension of the fluid at a liquid-solid interface agreed well with the results of the two alternative methods at each local position in one dimension. In two dimensions, the results showed a characteristic profile of the tangential pressure component which depended on the direction tangential to the liquid-solid interface, and which agreed with that obtained by the evaluation of the intermolecular force acting on a plane in the present study. Such good agreement suggests that the perturbative method on the basis of statistical thermodynamics used in this study is valid for obtaining the local pressure components and interfacial tension at a liquid-solid interface.
Descriptive profile of people with diabetes who use the Puerto Rico Quitline.
Cabrera-Serrano, Alex; Ramos-Colón, Miriam V; Rivera-Alvarado, Abraham; Cases-Rosario, Antonio; Ramos, Jessica Irizarry
2012-01-01
To provide a descriptive profile of the people with diabetes (PWD) who received the services of the Puerto Rico Quitline (PRQ) during 2008, compared to non-diabetic people (NDP), to establish significant statistical differences. Using a cross-sectional study methodology, the Quitline database was analyzed. Ninety-four percent of the 1,137 people who received the services of the PRQ during 2008 and completed all the interviews were included in the analysis. Frequency distributions and mean calculations were performed to describe the PWD. Chi-square tests, odds ratios, t tests, and 95% confidence intervals were calculated to identify statistically significant differences between the PWD and NDP. Nearly 11 percent (10.9%) of the people who received the services of the PRQ during 2008 and completed all the interviews reported a diabetes diagnosis. Health conditions were reported by 95.7% of PWD vs. 62.3% of NDP (P < .01). People with diabetes were more likely to have hypertension (P < .01), circulatory problems (P < .01), and respiratory infections (P = .02) than NDP. They also reported a higher mean number of smoking years than the NDP (P < .01), but the PWD were less likely to use menthol cigarettes than NDP (P = .01). A physician's recommendation is the only reason for trying to quit smoking with a statistically significant difference between the PWD and the NDP (P = .02). The mean number of alcoholic beverages consumed per day for the PWD was 8 and for the NDP it was 5 (P < .01). This study provides important evidence that can help increase the chances of success in the smoking cessation process in the PWD who access the services of the Quitline program.
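The odds ratios and 95% confidence intervals reported in studies like this one follow the standard 2×2-table calculation, sketched here with invented counts rather than the Quitline data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# hypothetical counts: a condition among one group of callers vs. another
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
```

A confidence interval that excludes 1 corresponds to a statistically significant association at the 5% level, matching the P-value comparisons in the abstract.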
Job Satisfaction DEOCS 4.1 Construct Validity Summary
2017-08-01
focuses more specifically on satisfaction with the job. Included is a review of the 4.0 description and items, followed by the proposed modifications to...the factor. The DEOCS 4.0 description provided for job satisfaction is “the perception of personal fulfillment in a specific vocation, and sense of...piloting items on the DEOCS; (4) examining the descriptive statistics, exploratory factor analysis results, and aggregation statistics; and (5
Mirmohammadi, Seyyed Jalil; Hafezi, Rahmatollah; Mehrparvar, Amir Houshang; Gerdfaramarzi, Raziyeh Soltani; Mostaghaci, Mehrdad; Nodoushan, Reza Jafari; Rezaeian, Bibiseyedeh
2013-01-01
Anthropometric data can be used to identify the physical dimensions of equipment, furniture, clothing and workstations. The use of poorly designed furniture that fails to fulfil the users' anthropometric dimensions, has a negative impact on human health. In this study, we measured some anthropometric dimensions of Iranian children from different ethnicities. A total of 12,731 Iranian primary school children aged 7-11 years were included in the study and their static anthropometric dimensions were measured. Descriptive statistics such as mean, standard deviation and key percentiles were calculated. All dimensions were compared among different ethnicities and different genders. This study showed significant differences in a set of 22 anthropometric dimensions with regard to gender, age and ethnicity. Turk boys and Arab girls were larger than their contemporaries in different ages. According to the results of this study, difference between genders and among different ethnicities should be taken into account by designers and manufacturers of school furniture. In this study, we measured 22 static anthropometric dimensions of 12,731 Iranian primary school children aged 7-11 years from different ethnicities. Descriptive statistics such as mean, standard deviation and key percentiles were measured for each dimension. This study showed significant differences in a set of 22 anthropometric dimensions in different genders, ages and ethnicities.
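The descriptive statistics named in this abstract (mean, standard deviation, key percentiles) can be computed with Python's statistics module. The simulated stature values below are hypothetical stand-ins, not the study's measurements.

```python
import random
import statistics

# Illustrative only: simulated stature values in mm (hypothetical mean/SD).
random.seed(0)
stature = [random.gauss(1320, 60) for _ in range(1000)]

mean = statistics.mean(stature)
sd = statistics.stdev(stature)
# statistics.quantiles with n=100 returns the 1st..99th percentile cut points.
pct = statistics.quantiles(stature, n=100)
p5, p50, p95 = pct[4], pct[49], pct[94]
print(f"mean={mean:.0f}  sd={sd:.0f}  P5={p5:.0f}  P50={p50:.0f}  P95={p95:.0f}")
```

Key percentiles such as P5 and P95 are the values furniture designers typically work from, since they bracket most of the user population.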
Došler, Anita Jug; Skubic, Metka; Mivšek, Ana Polona
2014-09-01
Mobbing, defined as sustained harassment among workers, in particular towards subordinates, merits investigation. This study aims to investigate Slovenian midwifery students' (2nd and 3rd year students of midwifery at the Faculty for Health Studies Ljubljana, the single educational institution for midwives in Slovenia) perception of mobbing, since the management of acceptable behavioural interrelationships in the midwifery profession forms already during the study, through professional socialization. A descriptive and causal-nonexperimental method with a questionnaire was used. Basic descriptive statistics and measures for calculating statistical significance were carried out with SPSS software, version 20.0. All necessary ethical measures were taken into consideration during the study to protect participants. The results revealed that several participants experienced mobbing during the study (82.3%); 58.8% of them during their practical training and 23.5% from midwifery teachers. Students are often anxious and nervous when facing clinical settings (60.8%) or before faculty commitments (exams, presentations, etc.) (41.2%). Many of them (40.4%) estimate that mobbing affected their health. They did not show effective strategies for solving relationship problems. According to the findings, everyone involved in midwifery education, but above all students, should be provided with more knowledge and skills on the successful management of conflict situations.
NASA Technical Reports Server (NTRS)
Davis, B. J.; Feiveson, A. H.
1975-01-01
Results are presented of CITARS data processing in raw form. Tables of descriptive statistics are given along with descriptions and results of inferential analyses. The inferential results are organized by questions which CITARS was designed to answer.
Prison Radicalization: The New Extremist Training Grounds?
2007-09-01
distributing and collecting survey data , and the data analysis. The analytical methodology includes descriptive and inferential statistical methods, in... statistical analysis of the responses to identify significant correlations and relationships. B. SURVEY DATA COLLECTION To effectively access a...Q18, Q19, Q20, and Q21. Due to the exploratory nature of this small survey, data analyses were confined mostly to descriptive statistics and
A two-component rain model for the prediction of attenuation and diversity improvement
NASA Technical Reports Server (NTRS)
Crane, R. K.
1982-01-01
A new model was developed to predict attenuation statistics for a single Earth-satellite or terrestrial propagation path. The model was extended to provide predictions of the joint occurrences of specified or higher attenuation values on two closely spaced Earth-satellite paths. The joint statistics provide the information required to obtain diversity gain or diversity advantage estimates. The new model is meteorologically based. It was tested against available Earth-satellite beacon observations and terrestrial path measurements. The model employs the rain climate region descriptions of the Global rain model. The rms deviation between the predicted and observed attenuation values for the terrestrial path data was 35 percent, a result consistent with the expectations of the Global model when the rain rate distribution for the path is not used in the calculation. Within the United States the rms deviation between measurement and prediction was 36 percent but worldwide it was 79 percent.
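One hedged reading of the figure of merit quoted here is the root-mean-square of the relative deviation between predicted and observed attenuation, expressed in percent. A minimal sketch with made-up attenuation values (not the paper's data):

```python
import math

def rms_percent_deviation(predicted, observed):
    """RMS relative deviation between model predictions and measurements,
    in percent; one plausible reading of the abstract's rms deviation."""
    rel = [(p - o) / o for p, o in zip(predicted, observed)]
    return 100 * math.sqrt(sum(r * r for r in rel) / len(rel))

# Hypothetical attenuation values in dB.
obs = [3.0, 6.5, 10.0, 14.0]
pred = [3.5, 5.8, 11.2, 13.1]
print(round(rms_percent_deviation(pred, obs), 1))  # → 12.0
```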
Simulation of magnetoelastic response of iron nanowire loop
NASA Astrophysics Data System (ADS)
Huang, Junping; Peng, Xianghe; Wang, Zhongchang; Hu, Xianzhi
2018-03-01
We analyzed the magnetoelastic responses of one-dimensional iron nanowire loop systems with quantum statistical mechanics, treating the particles in the systems as identical bosons with an arbitrary integer spin. Under the assumptions adopted, we demonstrated that the Hamiltonian of the system can be separated into two parts, corresponding to two Ising subsystems describing the particle spin and the particle displacement, respectively. Because the energy of particle motion at the atomic scale is quantized, a stricter constraint should be imposed on the particle-displacement Ising subsystem. Making use of existing results for the Ising system, the partition function of the system was derived in two parts, corresponding respectively to the two Ising subsystems. The Gibbs distribution was then obtained by statistical mechanics, and a description of the magnetoelastic response was derived. The magnetoelastic responses were predicted with the developed approach, and the comparison with the results calculated with VASP demonstrates the validity of the developed approach.
Schenk, Emily R; Nau, Frederic; Fernandez-Lima, Francisco
2015-06-01
The ability to correlate experimental ion mobility data with candidate structures from theoretical modeling provides a powerful analytical and structural tool for the characterization of biomolecules. In the present paper, a theoretical workflow is described to generate and assign candidate structures for experimental trapped ion mobility and H/D exchange (HDX-TIMS-MS) data following molecular dynamics simulations and statistical filtering. The applicability of the theoretical predictor is illustrated for a peptide and protein example with multiple conformations and kinetic intermediates. The described methodology yields a low computational cost and a simple workflow by incorporating statistical filtering and molecular dynamics simulations. The workflow can be adapted to different IMS scenarios and CCS calculators for a more accurate description of the IMS experimental conditions. For the case of the HDX-TIMS-MS experiments, molecular dynamics in the "TIMS box" accounts for a better sampling of the molecular intermediates and local energy minima.
Low, Diana H P; Motakis, Efthymios
2013-10-01
Binding free energy calculations obtained through molecular dynamics simulations reflect intermolecular interaction states through a series of independent snapshots. Typically, the free energies of multiple simulated series (each with slightly different starting conditions) need to be estimated. Previous approaches carry out this task by moving averages at certain decorrelation times, assuming that the system comes from a single conformation description of binding events. Here, we discuss a more general approach that uses statistical modeling, wavelets denoising and hierarchical clustering to estimate the significance of multiple statistically distinct subpopulations, reflecting potential macrostates of the system. We present the deltaGseg R package that performs macrostate estimation from multiple replicated series and allows molecular biologists/chemists to gain physical insight into the molecular details that are not easily accessible by experimental techniques. deltaGseg is a Bioconductor R package available at http://bioconductor.org/packages/release/bioc/html/deltaGseg.html.
α-induced reactions on 115In: Cross section measurements and statistical model analysis
NASA Astrophysics Data System (ADS)
Kiss, G. G.; Szücs, T.; Mohr, P.; Török, Zs.; Huszánk, R.; Gyürky, Gy.; Fülöp, Zs.
2018-05-01
Background: α-nucleus optical potentials are basic ingredients of statistical model calculations used in nucleosynthesis simulations. While the nucleon+nucleus optical potential is fairly well known, for the α+nucleus optical potential several different parameter sets exist, and large deviations, sometimes reaching an order of magnitude, are found between the cross section predictions calculated using different parameter sets. Purpose: A measurement of the radiative α-capture and the α-induced reaction cross sections on the nucleus 115In at low energies allows a stringent test of statistical model predictions. Since experimental data are scarce in this mass region, this measurement can be an important input to test the global applicability of α+nucleus optical model potentials and further ingredients of the statistical model. Methods: The reaction cross sections were measured by means of the activation method. The produced activities were determined by off-line detection of the γ rays and characteristic x rays emitted during the electron capture decay of the produced Sb isotopes. The 115In(α,γ)119Sb and 115In(α,n)118mSb reaction cross sections were measured between Ec.m. = 8.83 and 15.58 MeV, and the 115In(α,n)118gSb reaction was studied between Ec.m. = 11.10 and 15.58 MeV. The theoretical analysis was performed within the statistical model. Results: The simultaneous measurement of the (α,γ) and (α,n) cross sections allowed us to determine a best-fit combination of all parameters for the statistical model. The α+nucleus optical potential is identified as the most important input for the statistical model. The best fit is obtained for the new Atomki-V1 potential, and good reproduction of the experimental data is also achieved for the first version of the Demetriou potentials and the simple McFadden-Satchler potential.
The nucleon optical potential, the γ-ray strength function, and the level density parametrization are also constrained by the data, although there is no unique best-fit combination. Conclusions: The best-fit calculations allow us to extrapolate the low-energy (α,γ) cross section of 115In to the astrophysical Gamow window with reasonable uncertainties. However, further improvements of the α-nucleus potential are still required for a global description of elastic (α,α) scattering and α-induced reactions in a wide range of masses and energies.
2016-11-15
participants who were followed for the development of back pain for an average of 3.9 years. Methods. Descriptive statistics and longitudinal...health, military personnel, occupational health, outcome assessment, statistics, survey methodology. Level of Evidence: 3. Spine 2016;41:1754–1763...based on the National Health and Nutrition Examination Survey.21 Statistical Analysis Descriptive and univariate analyses compared characteristics
1985-09-01
4 C/SCSC Terms and Definitions … 5 Cost Performance Report Analysis (CPRA) Program … 6 Description of CPRA Terms and Formulas … The test statistic is then calculated as F* = [SSE1/(n1 - 2)] / [SSE2/(n2 - 2)]; the critical F value is F(α; n1 - 2, n2 - 2). … SIGNIF F = .0000 … Table B.4: General Linear Test for EAC1 and EAC5 (mean, standard deviation, and case counts).
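The garbled formula in this snippet appears to be an F-type ratio of residual mean squares from two fitted regression lines, each with n - 2 degrees of freedom. A hedged sketch of that reading, with made-up residual sums of squares:

```python
def f_star(sse1, n1, sse2, n2):
    """Hedged reconstruction of the snippet's test statistic: the ratio
    of residual mean squares from two fitted lines (n - 2 df each).
    Compare the result against a tabulated critical F value."""
    return (sse1 / (n1 - 2)) / (sse2 / (n2 - 2))

# Hypothetical residual sums of squares and sample sizes.
print(f_star(120.0, 14, 80.0, 18))  # → 2.0
```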
Nuclear Deformation at Finite Temperature
NASA Astrophysics Data System (ADS)
Alhassid, Y.; Gilbreth, C. N.; Bertsch, G. F.
2014-12-01
Deformation, a key concept in our understanding of heavy nuclei, is based on a mean-field description that breaks the rotational invariance of the nuclear many-body Hamiltonian. We present a method to analyze nuclear deformations at finite temperature in a framework that preserves rotational invariance. The auxiliary-field Monte Carlo method is used to generate a statistical ensemble and calculate the probability distribution associated with the quadrupole operator. Applying the technique to nuclei in the rare-earth region, we identify model-independent signatures of deformation and find that deformation effects persist to temperatures higher than the spherical-to-deformed shape phase-transition temperature of mean-field theory.
NASA Astrophysics Data System (ADS)
González, Diego Luis; Pimpinelli, Alberto; Einstein, T. L.
2011-07-01
We study the configurational structure of the point-island model for epitaxial growth in one dimension. In particular, we calculate the island gap and capture zone distributions. Our model is based on an approximate description of nucleation inside the gaps. Nucleation is described by the joint probability density pnXY(x,y), which represents the probability density to have nucleation at position x within a gap of size y. Our proposed functional form for pnXY(x,y) describes excellently the statistical behavior of the system. We compare our analytical model with extensive numerical simulations. Our model retains the most relevant physical properties of the system.
SPY: a new scission-point model based on microscopic inputs to predict fission fragment properties
NASA Astrophysics Data System (ADS)
Panebianco, Stefano; Dubray, Nöel; Goriely, Stéphane; Hilaire, Stéphane; Lemaître, Jean-François; Sida, Jean-Luc
2014-04-01
Despite the difficulty in describing the whole fission dynamics, the main fragment characteristics can be determined in a static approach based on a so-called scission-point model. Within this framework, a new Scission-Point model for the calculations of fission fragment Yields (SPY) has been developed. This model, initially based on the approach developed by Wilkins in the late seventies, consists in performing a static energy balance at scission, where the two fragments are supposed to be completely separated so that their macroscopic properties (mass and charge) can be considered as fixed. Given the knowledge of the system state density, averaged quantities such as mass and charge yields, mean kinetic and excitation energy can then be extracted in the framework of a microcanonical statistical description. The main advantage of the SPY model is the introduction of one of the most up-to-date microscopic descriptions of the nucleus for the individual energy of each fragment and, in the future, for their state density. These quantities are obtained in the framework of HFB calculations using the Gogny nucleon-nucleon interaction, ensuring an overall coherence of the model. Starting from a description of the SPY model and its main features, a comparison between the SPY predictions and experimental data will be discussed for some specific cases, from light nuclei around mercury to major actinides. Moreover, extensive predictions over the whole chart of nuclides will be discussed, with particular attention to their implication in stellar nucleosynthesis. Finally, future developments, mainly concerning the introduction of microscopic state densities, will be briefly discussed.
Tanihara, Shinichi
2015-01-01
Uncoded diagnoses in health insurance claims (HICs) may introduce bias into Japanese health statistics dependent on computerized HICs. This study's aim was to identify the causes and characteristics of uncoded diagnoses. Uncoded diagnoses from computerized HICs (outpatient, inpatient, and the diagnosis procedure-combination per-diem payment system [DPC/PDPS]) submitted to the National Health Insurance Organization of Kumamoto Prefecture in May 2010 were analyzed. The text documentation accompanying the uncoded diagnoses was used to classify diagnoses in accordance with the International Classification of Diseases-10 (ICD-10). The text documentation was also classified into four categories using the standard descriptions of diagnoses defined in the master files of the computerized HIC system: 1) standard descriptions of diagnoses, 2) standard descriptions with a modifier, 3) non-standard descriptions of diagnoses, and 4) unclassifiable text documentation. Using these classifications, the proportions of uncoded diagnoses by ICD-10 disease category were calculated. Of the uncoded diagnoses analyzed (n = 363 753), non-standard descriptions of diagnoses for outpatient, inpatient, and DPC/PDPS HICs comprised 12.1%, 14.6%, and 1.0% of uncoded diagnoses, respectively. The proportion of uncoded diagnoses with standard descriptions with a modifier for Diseases of the eye and adnexa was significantly higher than the overall proportion of uncoded diagnoses among every HIC type. The pattern of uncoded diagnoses differed by HIC type and disease category. Evaluating the proportion of uncoded diagnoses in all medical facilities and developing effective coding methods for diagnoses with modifiers, prefixes, and suffixes should reduce the number of uncoded diagnoses in computerized HICs and improve the quality of HIC databases.
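The four-category percentages reported here come down to a frequency count over classified records; a minimal sketch with hypothetical classifications (not the study's data):

```python
from collections import Counter

# Hypothetical classification labels for 100 uncoded-diagnosis records.
records = (["standard"] * 60 + ["standard+modifier"] * 25
           + ["non-standard"] * 12 + ["unclassifiable"] * 3)

counts = Counter(records)
total = sum(counts.values())
# Percentage of each category, rounded to one decimal as in the abstract.
proportions = {k: round(100 * v / total, 1) for k, v in counts.items()}
print(proportions)
```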
NASA Astrophysics Data System (ADS)
Noguere, Gilles; Archier, Pascal; Bouland, Olivier; Capote, Roberto; Jean, Cyrille De Saint; Kopecky, Stefan; Schillebeeckx, Peter; Sirakov, Ivan; Tamagno, Pierre
2017-09-01
A consistent description of the neutron cross sections from thermal energy up to the MeV region is challenging. One of the first steps consists in optimizing the optical model parameters using average resonance parameters, such as the neutron strength functions. They can be derived from a statistical analysis of the resolved resonance parameters, or calculated with the generalized form of the SPRT method by using scattering matrix elements provided by optical model calculations. One of the difficulties is to establish the contributions of the direct and compound nucleus reactions. This problem was solved by using a slightly modified average R-Matrix formula with an equivalent hard sphere radius deduced from the phase shift originating from the potential. The performances of the proposed formalism are illustrated with results obtained for the 238U+n nuclear systems.
Popov, I; Valašková, J; Štefaničková, J; Krásnik, V
2017-01-01
A substantial part of the population suffers from some kind of refractive error. It is envisaged that their prevalence may change with the development of society. The aim of this study is to determine the prevalence of refractive errors using calculations based on the Gullstrand schematic eye model. We used the Gullstrand schematic eye model to calculate refraction retrospectively. Refraction was presented as the need for glasses correction at a vertex distance of 12 mm. The necessary data was obtained using the optical biometer Lenstar LS900. Data which could not be obtained due to the limitations of the device was substituted by theoretical data from the Gullstrand schematic eye model. Only analyses from the right eyes were presented. The data was interpreted using descriptive statistics, Pearson correlation and the t-test. The statistical tests were conducted at a level of significance of 5%. Our sample included 1663 patients (665 male, 998 female) within the age range of 19 to 96 years. Average age was 70.8 ± 9.53 years. Average refraction of the eye was 2.73 ± 2.13D (males 2.49 ± 2.34, females 2.90 ± 2.76). The mean absolute error from emmetropia was 3.01 ± 1.58 (males 2.83 ± 2.95, females 3.25 ± 3.35). 89.06% of the sample was hyperopic, 6.61% was myopic and 4.33% emmetropic. We did not find any correlation between refraction and age. Females were more hyperopic than males. We did not find any statistically significant hypermetropic shift of refraction with age. According to our estimation, the calculations of refractive errors using the Gullstrand schematic eye model showed a significant hypermetropic shift of more than +2D. Our results could be used in future for comparing the prevalence of refractive errors using the same methods we used. Key words: refractive errors, refraction, Gullstrand schematic eye model, population, emmetropia.
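A sketch of how refractions might be binned into the three prevalence categories the abstract reports. The ±0.5 D emmetropia band and the sample refractions are assumptions for illustration, not the paper's criteria or data:

```python
def classify(refraction_d, tol=0.5):
    """Label an eye by its refraction in dioptres.
    The +/-0.5 D emmetropia band is an assumed cut-off."""
    if refraction_d > tol:
        return "hyperopic"
    if refraction_d < -tol:
        return "myopic"
    return "emmetropic"

sample = [2.5, -1.25, 0.25, 4.0, -3.0, 0.0]   # hypothetical refractions (D)
labels = [classify(r) for r in sample]
share = {g: labels.count(g) / len(labels)
         for g in ("hyperopic", "myopic", "emmetropic")}
print(share)
```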
The Sport Students’ Ability of Literacy and Statistical Reasoning
NASA Astrophysics Data System (ADS)
Hidayah, N.
2017-03-01
The ability of literacy and statistical reasoning is very important for students at sport education colleges, because the materials for statistical learning can be taken from their many activities, such as sport competitions, the results of tests and measurements, predicting achievement based on training, finding connections among variables, and others. This research tries to describe sport education college students' ability of literacy and statistical reasoning related to the identification of data types, probability, table interpretation, description and explanation using bar or pie graphics, explanation of variability, and the interpretation, calculation and explanation of mean, median, and mode through an instrument. This instrument was tested on 50 college students majoring in sport; only 26% of all students scored above 30%, while the others remained below 30%. Observing all subjects, 56% of students have the ability to identify data classifications, 49% of students have the ability to read, display and interpret tables through graphics, 27% of students have the ability in probability, 33% of students have the ability to describe variability, and 16.32% of students have the ability to read, calculate and describe mean, median and mode. The result of this research shows that the sport students' ability of literacy and statistical reasoning has not been adequate and students' statistical study has not reached concept comprehension, literacy ability training and statistical reasoning, so it is critical to increase the sport students' ability of literacy and statistical reasoning.
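The mean/median/mode items in the instrument can be illustrated directly with Python's statistics module; the quiz scores below are hypothetical:

```python
import statistics

# Hypothetical quiz scores, used only to illustrate the three measures.
scores = [55, 60, 60, 70, 75, 80, 95]

print(statistics.mean(scores))    # arithmetic mean of the scores
print(statistics.median(scores))  # middle value of the sorted data → 70
print(statistics.mode(scores))    # most frequent value → 60
```

Note that the three measures generally differ on skewed data, which is exactly the distinction the instrument asks students to explain.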
Regression modeling of ground-water flow
Cooley, R.L.; Naff, R.L.
1985-01-01
Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)
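The linear-regression machinery the text reviews rests on the ordinary least-squares fit; a minimal sketch with hypothetical observations (not data from the text), showing the closed-form slope and intercept:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x, the linear building
    block of the regression methods described above (sketch only)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx          # slope
    a = my - b * mx        # intercept
    return a, b

# Hypothetical head observations vs. distance.
a, b = fit_line([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
print(round(a, 2), round(b, 2))  # → 0.15 1.94
```

Nonlinear regression, as used in the text for flow models, iterates a linearized version of this fit around the current parameter estimates.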
Statistics in three biomedical journals.
Pilcík, T
2003-01-01
In this paper we analyze the use of statistics and associated problems in three Czech biological journals in the year 2000. We investigated 23 articles in Folia Biologica, 60 articles in Folia Microbiologica, and 88 articles in Physiological Research. Most publications with statistical content used descriptive statistics and the t-test. The most common mistakes were the absence of a reference to the statistical software used and an insufficient description of the data. We have compared our results with the results of similar studies in some other medical journals. The use of important statistical methods is comparable with that in most medical journals, and the proportion of articles in which the applied method is described insufficiently is moderately low.
Unlawful Discrimination DEOCS 4.1 Construct Validity Summary
2017-08-01
Included is a review of the 4.0 description and items, followed by the proposed modifications to the factor. The current DEOCS (4.0) contains multiple...Officer (E7 – E9) 586 10.8% Junior Officer (O1 – O3) 474 9% Senior Officer (O4 and above) 391 6.1% Descriptive Statistics and Reliability This section...displays descriptive statistics for the items on the Unlawful Discrimination scale. All items had a range from 1 to 7 (strongly disagree to strongly
Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J
2009-04-01
An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has the potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency in results across multiple subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1st and 99th percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.
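The probabilistic analysis can be sketched in miniature: perturb two hypothetical landmark positions with a 4 mm standard deviation, as in the study, and examine the 1st-to-99th percentile envelope of a single planar angle (a stand-in for the full 3-D Euler-angle description; the landmark coordinates and Monte Carlo approach here are illustrative assumptions).

```python
import math
import random
import statistics

def angle_deg(p, q):
    """Planar elevation angle (degrees) of segment p -> q."""
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

random.seed(42)
sd = 4.0  # landmark-location SD in mm, as in the abstract
shoulder, elbow = (0.0, 0.0), (50.0, 250.0)  # hypothetical landmarks (mm)

# Monte Carlo: jitter both landmarks and recompute the angle.
samples = []
for _ in range(5000):
    s = tuple(c + random.gauss(0, sd) for c in shoulder)
    e = tuple(c + random.gauss(0, sd) for c in elbow)
    samples.append(angle_deg(s, e))

pct = statistics.quantiles(samples, n=100)
envelope = pct[98] - pct[0]  # 1st-to-99th percentile spread, in degrees
print(round(envelope, 1))
```

Longer segments are less sensitive to the same landmark noise, which mirrors the study's finding that sensitivity depends on which landmarks are perturbed.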
Self-Esteem and Academic Achievement of High School Students
ERIC Educational Resources Information Center
Moradi Sheykhjan, Tohid; Jabari, Kamran; Rajeswari, K.
2014-01-01
The primary purpose of this study was to determine the influence of self-esteem on academic achievement among high school students in Miandoab City, Iran. The methodology of the research is descriptive and correlational; descriptive and inferential statistics were used to analyze the data. The statistical population includes male and female high…
Authorization of Animal Experiments Is Based on Confidence Rather than Evidence of Scientific Rigor
Vogt, Lucile; Reichlin, Thomas S; Nathues, Christina; Würbel, Hanno
2016-01-01
Accumulating evidence indicates high risk of bias in preclinical animal research, questioning the scientific validity and reproducibility of published research findings. Systematic reviews found low rates of reporting of measures against risks of bias in the published literature (e.g., randomization, blinding, sample size calculation) and a correlation between low reporting rates and inflated treatment effects. That most animal research undergoes peer review or ethical review would offer the possibility to detect risks of bias at an earlier stage, before the research has been conducted. For example, in Switzerland, animal experiments are licensed based on a detailed description of the study protocol and a harm–benefit analysis. We therefore screened applications for animal experiments submitted to Swiss authorities (n = 1,277) for the rates at which the use of seven basic measures against bias (allocation concealment, blinding, randomization, sample size calculation, inclusion/exclusion criteria, primary outcome variable, and statistical analysis plan) were described and compared them with the reporting rates of the same measures in a representative sub-sample of publications (n = 50) resulting from studies described in these applications. Measures against bias were described at very low rates, ranging on average from 2.4% for statistical analysis plan to 19% for primary outcome variable in applications for animal experiments, and from 0.0% for sample size calculation to 34% for statistical analysis plan in publications from these experiments. Calculating an internal validity score (IVS) based on the proportion of the seven measures against bias, we found a weak positive correlation between the IVS of applications and that of publications (Spearman’s rho = 0.34, p = 0.014), indicating that the rates of description of these measures in applications partly predict their rates of reporting in publications. 
These results indicate that the authorities licensing animal experiments are lacking important information about experimental conduct that determines the scientific validity of the findings, which may be critical for the weight attributed to the benefit of the research in the harm–benefit analysis. Similar to manuscripts getting accepted for publication despite poor reporting of measures against bias, applications for animal experiments may often be approved based on implicit confidence rather than explicit evidence of scientific rigor. Our findings shed serious doubt on the current authorization procedure for animal experiments, as well as the peer-review process for scientific publications, which in the long run may undermine the credibility of research. Developing existing authorization procedures that are already in place in many countries towards a preregistration system for animal research is one promising way to reform the system. This would not only benefit the scientific validity of findings from animal experiments but also help to avoid unnecessary harm to animals for inconclusive research. PMID:27911892
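The reported Spearman correlation between application and publication IVS can be reproduced with a small pure-Python rank correlation; the IVS pairs below are hypothetical, not the study's data.

```python
def ranks(values):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical IVS pairs (application, publication).
apps = [0.1, 0.3, 0.2, 0.5, 0.4]
pubs = [0.0, 0.2, 0.3, 0.4, 0.3]
print(round(spearman_rho(apps, pubs), 2))  # → 0.82
```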
Authorization of Animal Experiments Is Based on Confidence Rather than Evidence of Scientific Rigor.
Vogt, Lucile; Reichlin, Thomas S; Nathues, Christina; Würbel, Hanno
2016-12-01
Accumulating evidence indicates high risk of bias in preclinical animal research, questioning the scientific validity and reproducibility of published research findings. Systematic reviews found low rates of reporting of measures against risks of bias in the published literature (e.g., randomization, blinding, sample size calculation) and a correlation between low reporting rates and inflated treatment effects. That most animal research undergoes peer review or ethical review would offer the possibility to detect risks of bias at an earlier stage, before the research has been conducted. For example, in Switzerland, animal experiments are licensed based on a detailed description of the study protocol and a harm-benefit analysis. We therefore screened applications for animal experiments submitted to Swiss authorities (n = 1,277) for the rates at which the use of seven basic measures against bias (allocation concealment, blinding, randomization, sample size calculation, inclusion/exclusion criteria, primary outcome variable, and statistical analysis plan) were described and compared them with the reporting rates of the same measures in a representative sub-sample of publications (n = 50) resulting from studies described in these applications. Measures against bias were described at very low rates, ranging on average from 2.4% for statistical analysis plan to 19% for primary outcome variable in applications for animal experiments, and from 0.0% for sample size calculation to 34% for statistical analysis plan in publications from these experiments. Calculating an internal validity score (IVS) based on the proportion of the seven measures against bias, we found a weak positive correlation between the IVS of applications and that of publications (Spearman's rho = 0.34, p = 0.014), indicating that the rates of description of these measures in applications partly predict their rates of reporting in publications. 
These results indicate that the authorities licensing animal experiments are lacking important information about experimental conduct that determines the scientific validity of the findings, which may be critical for the weight attributed to the benefit of the research in the harm-benefit analysis. Similar to manuscripts getting accepted for publication despite poor reporting of measures against bias, applications for animal experiments may often be approved based on implicit confidence rather than explicit evidence of scientific rigor. Our findings shed serious doubt on the current authorization procedure for animal experiments, as well as the peer-review process for scientific publications, which in the long run may undermine the credibility of research. Developing existing authorization procedures that are already in place in many countries towards a preregistration system for animal research is one promising way to reform the system. This would not only benefit the scientific validity of findings from animal experiments but also help to avoid unnecessary harm to animals for inconclusive research.
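The IVS comparison described above boils down to scoring each application and publication by the proportion of the seven anti-bias measures it describes, then rank-correlating the two scores. A minimal sketch, using entirely hypothetical IVS pairs (the study's raw data are not reproduced here) and a plain implementation of Spearman's rho as the Pearson correlation of average ranks:

```python
def average_ranks(values):
    """Assign 1-based ranks, giving tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var_x = sum((a - mx) ** 2 for a in rx)
    var_y = sum((b - my) ** 2 for b in ry)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical (application IVS, publication IVS) pairs; each IVS is the
# fraction of the 7 anti-bias measures described. NOT the study's data.
pairs = [(0.14, 0.0), (0.29, 0.14), (0.43, 0.29), (0.14, 0.14),
         (0.57, 0.29), (0.0, 0.0), (0.29, 0.43), (0.71, 0.57)]
apps = [a for a, _ in pairs]
pubs = [p for _, p in pairs]
rho = spearman_rho(apps, pubs)
```

The p-value attached to rho would additionally require a permutation test or a t-approximation, which the sketch omits.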
Monolithic ceramic analysis using the SCARE program
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.
1988-01-01
The Structural Ceramics Analysis and Reliability Evaluation (SCARE) computer program calculates the fast fracture reliability of monolithic ceramic components. The code is a post-processor to the MSC/NASTRAN general purpose finite element program. The SCARE program automatically accepts the MSC/NASTRAN output necessary to compute reliability. This includes element stresses, temperatures, volumes, and areas. The SCARE program computes two-parameter Weibull strength distributions from input fracture data for both volume and surface flaws. The distributions can then be used to calculate the reliability of geometrically complex components subjected to multiaxial stress states. Several fracture criteria and flaw types are available for selection by the user, including out-of-plane crack extension theories. The theoretical basis for the reliability calculations was proposed by Batdorf. These models combine linear elastic fracture mechanics (LEFM) with Weibull statistics to provide a mechanistic failure criterion. Other fracture theories included in SCARE are the normal stress averaging technique and the principle of independent action. The objective of this presentation is to summarize these theories, including their limitations and advantages, and to provide a general description of the SCARE program, along with example problems.
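For intuition, the simplest ingredient SCARE builds upon is the weakest-link, two-parameter Weibull form for volume flaws, where each finite element contributes a "risk of rupture" proportional to its volume and its stress raised to the Weibull modulus. The sketch below shows only that simplified uniaxial form with hypothetical element data and units; it is not SCARE's Batdorf multiaxial criterion, nor its surface-flaw or crack-extension models:

```python
import math

def weibull_failure_probability(elements, sigma_0, m):
    """
    Simplified two-parameter Weibull (volume-flaw) failure probability:
        Pf = 1 - exp(-sum_i V_i * (sigma_i / sigma_0)**m)
    where element i has volume V_i and uniaxial stress sigma_i,
    sigma_0 is the characteristic strength, and m the Weibull modulus.
    Compressive (non-positive) stresses contribute no risk here.
    """
    risk = sum(v * (s / sigma_0) ** m for v, s in elements if s > 0)
    return 1.0 - math.exp(-risk)

# Hypothetical element data: (volume, max principal stress), illustrative units.
elements = [(0.5, 120.0), (0.3, 150.0), (0.2, 90.0)]
pf = weibull_failure_probability(elements, sigma_0=200.0, m=10.0)
reliability = 1.0 - pf
```

A large modulus m concentrates the risk in the most highly stressed elements, which is why post-processing element stresses from the finite element solution is the natural workflow.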
Statistical Mechanical Model for Adsorption Coupled with SAFT-VR Mie Equation of State.
Franco, Luís F M; Economou, Ioannis G; Castier, Marcelo
2017-10-24
We extend the SAFT-VR Mie equation of state to calculate adsorption isotherms by considering explicitly the residual energy due to the confinement effect. Assuming a square-well potential for the fluid-solid interactions, the structure imposed by the fluid-solid interface is calculated using two different approaches: an empirical expression proposed by Travalloni et al. (Chem. Eng. Sci. 65, 3088-3099, 2010), and a new theoretical expression derived by applying the mean value theorem. Adopting the SAFT-VR Mie (Lafitte et al., J. Chem. Phys. 139, 154504, 2013) equation of state to describe the fluid-fluid interactions, and solving the phase equilibrium criteria, we calculate adsorption isotherms for light hydrocarbons adsorbed in a carbon molecular sieve and for carbon dioxide, nitrogen, and water adsorbed in a zeolite. Good results are obtained from the model using either approach. Nonetheless, the theoretical expression seems to correlate the experimental data better than the empirical one, possibly implying that a more reliable way to describe the structure ensures a better description of the thermodynamic behavior.
Tips and Tricks for Successful Application of Statistical Methods to Biological Data.
Schlenker, Evelyn
2016-01-01
This chapter discusses experimental design and the use of statistics to describe characteristics of data (descriptive statistics) and inferential statistics that test the hypothesis posed by the investigator. Inferential statistics, based on probability distributions, depend upon the type and distribution of the data. For data that are continuous, randomly and independently selected, and normally distributed, more powerful parametric tests such as Student's t test and analysis of variance (ANOVA) can be used. For non-normally distributed or skewed data, transformation of the data (using logarithms) may normalize the data, allowing use of parametric tests. Alternatively, with skewed data nonparametric tests can be utilized, some of which rely on data that are ranked prior to statistical analysis. Experimental designs and analyses need to balance the risk of committing type 1 errors (false positives) against that of type 2 errors (false negatives). For a variety of clinical studies that determine risk or benefit, relative risk ratios (randomized clinical trials and cohort studies) or odds ratios (case-control studies) are utilized. Although both use 2 × 2 tables, their premise and calculations differ. Finally, special statistical methods are applied to microarray and proteomics data, since the large number of genes or proteins evaluated increases the likelihood of false discoveries. Additional studies in separate samples are used to verify microarray and proteomic data. Examples in this chapter and references are available to help continued investigation of experimental designs and appropriate data analysis.
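The distinction the chapter draws between relative risk and odds ratio is easy to make concrete: both come from the same 2 × 2 table, but RR compares event probabilities while OR compares odds. A minimal worked example with made-up counts (not from the chapter):

```python
def relative_risk(a, b, c, d):
    """RR from a 2x2 table: rows = exposed/unexposed, cols = event/no event."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    return risk_exposed / risk_unexposed

def odds_ratio(a, b, c, d):
    """OR from the same 2x2 table: (a/b) / (c/d) = (a*d) / (b*c)."""
    return (a * d) / (b * c)

# Hypothetical cohort: 10 of 100 exposed and 5 of 100 unexposed had the event.
a, b, c, d = 10, 90, 5, 95
rr = relative_risk(a, b, c, d)   # risk doubles: 0.10 / 0.05
or_ = odds_ratio(a, b, c, d)     # odds ratio, slightly larger than RR
```

When the event is rare the two measures nearly coincide; as the event becomes common the OR drifts away from the RR, which is why case-control studies (which can only estimate the OR) must be interpreted with care.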
NASA Astrophysics Data System (ADS)
Jiang, Ying; Chen, Jeff Z. Y.
2013-10-01
This paper concerns establishing a theoretical basis and numerical scheme for studying the phase behavior of AB diblock copolymers made of wormlike chains. The general idea of a self-consistent field theory is the combination of the mean-field approach together with a statistical weight that describes the configurational properties of a polymer chain. In recent years, this approach has been extensively used for structural prediction of block copolymers, based on the Gaussian-model description of a polymer chain. The wormlike-chain model has played an important role in the description of polymer systems, covering the semiflexible-to-rod crossover of the polymer properties and the highly stretched regime, which the Gaussian-chain model has difficulty describing. Although the idea of developing a self-consistent field theory for wormlike chains can be traced back to early developments in polymer physics, the solution of such a theory has been limited due to technical difficulties. In particular, a challenge has been to develop a numerical algorithm enabling the calculation of the phase diagram containing three-dimensional structures for wormlike AB diblock copolymers. This paper describes a computational algorithm that combines a number of numerical tricks, which can be used for such a calculation. A phase diagram covering major parameter areas was constructed for the wormlike-chain system and reported by us, where the ratio between the total length and the persistence length of a constituent polymer is suggested as another tuning parameter for the microphase-separated structures; all detailed technical issues are carefully addressed in the current paper.
Histogram based analysis of lung perfusion of children after congenital diaphragmatic hernia repair.
Kassner, Nora; Weis, Meike; Zahn, Katrin; Schaible, Thomas; Schoenberg, Stefan O; Schad, Lothar R; Zöllner, Frank G
2018-05-01
To investigate a histogram based approach to characterize the distribution of perfusion in the whole left and right lung by descriptive statistics, and to show how histograms could be used to visually explore perfusion defects in two-year-old children after Congenital Diaphragmatic Hernia (CDH) repair. 28 children (age 24.2 ± 1.7 months; all left-sided hernia; 9 after extracorporeal membrane oxygenation therapy) underwent quantitative DCE-MRI of the lung. Segmentations of the left and right lung were manually drawn to mask the calculated pulmonary blood flow maps and then to derive histograms for each lung side. Individual and group-wise analyses of the histograms of the left and right lung were performed. The ipsilateral and contralateral lung show significant differences in shape and in descriptive statistics derived from the histogram (Wilcoxon signed-rank test, p < 0.05) at both the group and individual level. Subgroup analysis (patients with vs. without ECMO therapy) showed no significant differences using histogram-derived parameters. Histogram analysis can be a valuable tool to characterize and visualize whole lung perfusion of children after CDH repair. It allows for several possibilities to analyze the data: describing the perfusion differences between the right and left lung, and exploring and visualizing localized perfusion patterns in the 3D lung volume. Subgroup analysis will be possible given sufficient sample sizes. Copyright © 2017 Elsevier Inc. All rights reserved.
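The core of such an analysis is binning the voxelwise perfusion values of each masked lung into a fixed-width histogram and summarizing each side with descriptive statistics. A minimal sketch with hypothetical perfusion values (the function names, bin ranges, and data below are illustrative assumptions, not the study's pipeline):

```python
def histogram(values, lo, hi, nbins):
    """Fixed-width histogram over [lo, hi); values outside the range are ignored."""
    width = (hi - lo) / nbins
    counts = [0] * nbins
    for v in values:
        if lo <= v < hi:
            counts[int((v - lo) / width)] += 1
    return counts

def describe(values):
    """A few descriptive statistics of a distribution."""
    s = sorted(values)
    n = len(s)
    mean = sum(s) / n
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    return {"n": n, "mean": mean, "median": median, "min": s[0], "max": s[-1]}

# Hypothetical pulmonary-blood-flow values per lung (arbitrary units);
# in the study these would come from masked DCE-MRI perfusion maps.
left = [55, 60, 48, 72, 65, 58, 40, 51]
right = [90, 95, 88, 102, 97, 85, 110, 93]
left_hist = histogram(left, 0, 150, 15)
right_hist = histogram(right, 0, 150, 15)
left_stats, right_stats = describe(left), describe(right)
```

Comparing `left_hist` with `right_hist` (shape, shift, spread) is what makes side-to-side perfusion differences visually apparent before any formal test is run.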
Statistical variability and confidence intervals for planar dose QA pass rates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, Daniel W.; Nelms, Benjamin E.; Attwood, Kristopher
Purpose: The most common metric for comparing measured to calculated dose, such as for pretreatment quality assurance of intensity-modulated photon fields, is a pass rate (%) generated using percent difference (%Diff), distance-to-agreement (DTA), or some combination of the two (e.g., gamma evaluation). For many dosimeters, the grid of analyzed points corresponds to an array with a low areal density of point detectors. In these cases, the pass rates for any given comparison criteria are not absolute but exhibit statistical variability that is a function, in part, of the detector sampling geometry. In this work, the authors analyze the statistics of various methods commonly used to calculate pass rates and propose methods for establishing confidence intervals for pass rates obtained with low-density arrays. Methods: Dose planes were acquired for 25 prostate and 79 head and neck intensity-modulated fields via diode array and electronic portal imaging device (EPID), and matching calculated dose planes were created via a commercial treatment planning system. Pass rates for each dose plane pair (both centered to the beam central axis) were calculated with several common comparison methods: %Diff/DTA composite analysis and gamma evaluation, using absolute dose comparison with both local and global normalization. Specialized software was designed to selectively sample the measured EPID response (very high data density) down to discrete points to simulate low-density measurements. The software was used to realign the simulated detector grid at many simulated positions with respect to the beam central axis, thereby altering the low-density sampled grid. Simulations were repeated with 100 positional iterations using a 1 detector/cm² uniform grid, a 2 detector/cm² uniform grid, and similar random detector grids.
For each simulation, %/DTA composite pass rates were calculated with various %Diff/DTA criteria and for both local and global %Diff normalization techniques. Results: For the prostate and head/neck cases studied, the pass rates obtained with gamma analysis of high density dose planes were 2%-5% higher than respective %/DTA composite analysis on average (ranging as high as 11%), depending on tolerances and normalization. Meanwhile, the pass rates obtained via local normalization were 2%-12% lower than with global maximum normalization on average (ranging as high as 27%), depending on tolerances and calculation method. Repositioning of simulated low-density sampled grids leads to a distribution of possible pass rates for each measured/calculated dose plane pair. These distributions can be predicted using a binomial distribution in order to establish confidence intervals that depend largely on the sampling density and the observed pass rate (i.e., the degree of difference between measured and calculated dose). These results can be extended to apply to 3D arrays of detectors, as well. Conclusions: Dose plane QA analysis can be greatly affected by choice of calculation metric and user-defined parameters, and so all pass rates should be reported with a complete description of calculation method. Pass rates for low-density arrays are subject to statistical uncertainty (vs. the high-density pass rate), but these sampling errors can be modeled using statistical confidence intervals derived from the sampled pass rate and detector density. Thus, pass rates for low-density array measurements should be accompanied by a confidence interval indicating the uncertainty of each pass rate.
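The binomial treatment of a low-density pass rate can be sketched directly: with n detectors and k passing points, the observed pass rate is k/n, and a binomial confidence interval quantifies how much that rate could move if the grid sampled different points. The sketch below uses the Wilson score interval as one standard binomial interval (the paper's exact interval construction may differ); the detector counts are hypothetical:

```python
import math

def wilson_interval(passed, total, z=1.96):
    """Wilson score confidence interval for a binomial proportion (z=1.96 ~ 95%)."""
    p = passed / total
    denom = 1 + z**2 / total
    center = (p + z**2 / (2 * total)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / total + z**2 / (4 * total**2))
    return center - half, center + half

# Hypothetical low-density array: 180 of 200 sampled points pass the criteria.
passed, total = 180, 200
rate = passed / total                    # observed pass rate (0.90)
lo, hi = wilson_interval(passed, total)  # interval for the underlying pass rate
```

The interval widens as the detector density (and hence `total`) shrinks, which is exactly the argument for reporting low-density pass rates with an attached confidence interval.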
Statistical description and transport in stochastic magnetic fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vanden Eijnden, E.; Balescu, R.
1996-03-01
The statistical description of particle motion in a stochastic magnetic field is presented. Starting from the stochastic Liouville equation (or, hybrid kinetic equation) associated with the equations of motion of a test particle, the probability distribution function of the system is obtained for various magnetic fields and collisional processes. The influence of these two ingredients on the statistics of the particle dynamics is stressed. In all cases, transport properties of the system are discussed. © 1996 American Institute of Physics.
ERIC Educational Resources Information Center
Jabari, Kamran; Moradi Sheykhjan, Tohid
2015-01-01
The present study examined the relationship between stress among academic staff and students' satisfaction with their performance in Payame Noor University (PNU) of Miandoab City, Iran in 2014. The methodology of the research is descriptive and correlational; descriptive and inferential statistics were used to analyze the data. Statistical Society…
ERIC Educational Resources Information Center
Brattin, Barbara C.
Content analysis was performed on the top six core journals for 1990 in library and information science to determine the extent of research in the field. Articles (n=186) were examined for descriptive or inferential statistics and separately for the presence of mathematical models. Results show a marked (14%) increase in research for 1990,…
Compaction Behavior of Granular Materials
NASA Astrophysics Data System (ADS)
Endicott, Mark R.; Kenkre, V. M.; Glass, S. Jill; Hurd, Alan J.
1996-03-01
We report the results of our recent study of compaction of granular materials. A theoretical model is developed for the description of the compaction of granular materials exemplified by granulated ceramic powders. Its predictions are compared to observations of uniaxial compaction tests of ceramic granules of PMN-PT, spray dried alumina and rutile. The theoretical model employs a volume-based statistical mechanics treatment and an activation analogy. Results of a computer simulation of random packing of discs in two dimensions are also reported. The effects of the type of particle size distribution, and of other parameters of that distribution, on the calculated quantities are discussed. We examine the implications of the results of the simulation for the theoretical model.
A Documentary Analysis of Abstracts Presented in European Congresses on Adapted Physical Activity.
Sklenarikova, Jana; Kudlacek, Martin; Baloun, Ladislav; Causgrove Dunn, Janice
2016-07-01
The purpose of the study was to identify trends in research abstracts published in the books of abstracts of the European Congress of Adapted Physical Activity from 2004 to 2012. A documentary analysis of the contents of 459 abstracts was completed. Data were coded based on subcategories used in a previous study by Zhang, deLisle, and Chen (2006) and by Porretta and Sherrill (2005): number of authors, data source, sample size, type of disability, data analyses, type of study, and focus of study. Descriptive statistics calculated for each subcategory revealed an overall picture of the state and trends of scientific inquiry in adapted physical activity research in Europe.
Parton distributions in the LHC era
NASA Astrophysics Data System (ADS)
Del Debbio, Luigi
2018-03-01
Analyses of LHC (and other!) experiments require robust and statistically accurate determinations of the structure of the proton, encoded in the parton distribution functions (PDFs). The standard description of hadronic processes relies on factorization theorems, which allow a separation of process-dependent short-distance physics from the universal long-distance structure of the proton. Traditionally the PDFs are obtained from fits to experimental data. However, understanding the long-distance properties of hadrons is a nonperturbative problem, and lattice QCD can play a role in providing useful results from first principles. In this talk we compare the different approaches used to determine PDFs, and try to assess the impact of existing, and future, lattice calculations.
Study of optimum methods of optical communication
NASA Technical Reports Server (NTRS)
Harger, R. O.
1972-01-01
Optimum methods of optical communication accounting for the effects of the turbulent atmosphere and quantum mechanics, both by the semi-classical method and the full-fledged quantum theoretical model, are described. A concerted effort was made to apply the techniques of communication theory to the novel problems of optical communication through a careful study of realistic models and their statistical descriptions, the finding of appropriate optimum structures, the calculation of their performance, and, insofar as possible, their comparison with conventional and other suboptimal systems. In this unified way the bounds on performance and the structure of optimum communication systems for transmission of information, imaging, tracking, and estimation can be determined for optical channels.
Environmental flow allocation and statistics calculator
Konrad, Christopher P.
2011-01-01
The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft® Visual Basic® for Applications and implemented as a macro in Microsoft® Office Excel® 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
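The kinds of hydrologic statistics such a program computes from a daily series are straightforward to sketch. The following is a minimal illustration with a made-up daily flow record; the statistic definitions (mean flow, n-day minimum, exceedance flow) are common conventions and are not claimed to match EFASC's actual algorithms, which the abstract itself warns may differ from published USGS conventions:

```python
def mean_flow(flows):
    """Arithmetic mean of the daily flow series."""
    return sum(flows) / len(flows)

def n_day_min(flows, n):
    """Minimum n-day moving average (e.g., n=7 gives a 7-day low flow)."""
    windows = [sum(flows[i:i + n]) / n for i in range(len(flows) - n + 1)]
    return min(windows)

def exceedance(flows, pct):
    """Flow exceeded pct percent of the time (simple order-statistic estimate)."""
    s = sorted(flows, reverse=True)
    idx = int(round(pct / 100 * (len(s) - 1)))
    return s[idx]

# Hypothetical daily streamflow record (arbitrary units); a real run would
# read dates and flows from an input file, one value per day.
flows = [12, 10, 9, 8, 14, 20, 18, 15, 11, 9, 7, 6, 8, 10]
stats = {
    "mean": mean_flow(flows),
    "min3": n_day_min(flows, 3),   # lowest 3-day average in the record
    "q90": exceedance(flows, 90),  # flow exceeded 90% of the time
}
```

A synthetic "allocated" series could be produced by transforming `flows` under an allocation rule and re-running the same statistics, which mirrors the program's two modes.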
NASA Astrophysics Data System (ADS)
González, Diego Luis; Pimpinelli, Alberto; Einstein, T. L.
2017-07-01
We study the effect of hindered aggregation on the island formation process in a one- (1D) and two-dimensional (2D) point-island model for epitaxial growth with arbitrary critical nucleus size i. In our model, the attachment of monomers to preexisting islands is hindered by an additional attachment barrier, characterized by length la. For la=0 the islands behave as perfect sinks, while for la→∞ they behave as reflecting boundaries. For intermediate values of la, the system exhibits a crossover between two different kinds of processes, diffusion-limited aggregation and attachment-limited aggregation. We calculate the growth exponents of the density of islands and monomers for the low-coverage and aggregation regimes. The capture-zone (CZ) distributions are also calculated for different values of i and la. In order to obtain a good spatial description of the nucleation process, we propose a fragmentation model, which is based on an approximate description of nucleation inside the gaps for 1D and the CZs for 2D. In both cases, the nucleation is described by using two different physically rooted probabilities, which are related to the microscopic parameters of the model (i and la). We test our analytical model with extensive numerical simulations and previously established results. The proposed model describes the statistical behavior of the system excellently for arbitrary values of la and i = 1, 2, and 3.
Non-resonant multipactor--A statistical model
NASA Astrophysics Data System (ADS)
Rasch, J.; Johansson, J. F.
2012-12-01
High power microwave systems operating in vacuum or near vacuum run the risk of multipactor breakdown. In order to avoid multipactor, it is necessary to make theoretical predictions of critical parameter combinations. These treatments are generally based on the assumption of electrons moving in resonance with the electric field while traversing the gap between critical surfaces. Through comparison with experiments, it has been found that only for small system dimensions will the resonant approach give correct predictions. Apparently, the resonance is destroyed due to the statistical spread in electron emission velocity, and for a more valid description it is necessary to resort to rather complicated statistical treatments of the electron population and to extensive simulations. However, in the limit where resonance is completely destroyed it is possible to use a much simpler treatment, here called non-resonant theory. In this paper, we develop the formalism for this theory, use it to calculate universal curves for the existence of multipactor, and compare with previous results. Two important effects that lead to an increase in the multipactor threshold in comparison with the resonant prediction are identified. These are the statistical spread of impact speed, which leads to a lower average electron impact speed, and the impact of electrons in phase regions where the secondary electrons are immediately reabsorbed, leading to an effective removal of electrons from the discharge.
75 FR 4323 - Additional Quantitative Fit-testing Protocols for the Respiratory Protection Standard
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-27
... respirators (500 and 1000 for protocols 1 and 2, respectively). However, OSHA could not evaluate the results... the values of these descriptive statistics for revised PortaCount[supreg] QNFT protocols 1 (at RFFs of 100 and 500) and 2 (at RFFs of 200 and 1000). Table 2--Descriptive Statistics for RFFs of 100 and 200...
Teaching Statistics Online Using "Excel"
ERIC Educational Resources Information Center
Jerome, Lawrence
2011-01-01
As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…
Exploring Marine Corps Officer Quality: An Analysis of Promotion to Lieutenant Colonel
2017-03-01
Table-of-contents excerpt: G. Descriptive Statistics; 1. Dependent Variable Summary Statistics; 2. Performance; 4. Further Research; Appendix A. Summary Statistics of FITREP and…
Statistics of the geomagnetic secular variation for the past 5Ma
NASA Technical Reports Server (NTRS)
Constable, C. G.; Parker, R. L.
1986-01-01
A new statistical model is proposed for the geomagnetic secular variation over the past 5Ma. Unlike previous models, the model makes use of statistical characteristics of the present day geomagnetic field. The spatial power spectrum of the non-dipole field is consistent with a white source near the core-mantle boundary with Gaussian distribution. After a suitable scaling, the spherical harmonic coefficients may be regarded as statistical samples from a single giant Gaussian process; this is the model of the non-dipole field. The model can be combined with an arbitrary statistical description of the dipole and probability density functions and cumulative distribution functions can be computed for declination and inclination that would be observed at any site on Earth's surface. Global paleomagnetic data spanning the past 5Ma are used to constrain the statistics of the dipole part of the field. A simple model is found to be consistent with the available data. An advantage of specifying the model in terms of the spherical harmonic coefficients is that it is a complete statistical description of the geomagnetic field, enabling us to test specific properties for a general description. Both intensity and directional data distributions may be tested to see if they satisfy the expected model distributions.
Statistics of the geomagnetic secular variation for the past 5 m.y
NASA Technical Reports Server (NTRS)
Constable, C. G.; Parker, R. L.
1988-01-01
A new statistical model is proposed for the geomagnetic secular variation over the past 5Ma. Unlike previous models, the model makes use of statistical characteristics of the present day geomagnetic field. The spatial power spectrum of the non-dipole field is consistent with a white source near the core-mantle boundary with Gaussian distribution. After a suitable scaling, the spherical harmonic coefficients may be regarded as statistical samples from a single giant Gaussian process; this is the model of the non-dipole field. The model can be combined with an arbitrary statistical description of the dipole and probability density functions and cumulative distribution functions can be computed for declination and inclination that would be observed at any site on Earth's surface. Global paleomagnetic data spanning the past 5Ma are used to constrain the statistics of the dipole part of the field. A simple model is found to be consistent with the available data. An advantage of specifying the model in terms of the spherical harmonic coefficients is that it is a complete statistical description of the geomagnetic field, enabling us to test specific properties for a general description. Both intensity and directional data distributions may be tested to see if they satisfy the expected model distributions.
Hodge, Natalia; Evans, Carla A; Simmons, Kirt E; Fadavi, Shahrbanoo; Viana, Grace
2015-01-01
The purpose of this study was to assess the occlusal characteristics of individuals with growth hormone deficiency (GHD), idiopathic short stature (ISS), and Russell-Silver syndrome (RSS), and compare them to the means of a normal population. Data about the stage of dentition, diastema, maxillary transverse deficiency, overjet, overbite, molar classification, and maxillary and mandibular crowding were obtained from orthodontic screening notes and standardized clinical exams of children with growth disorders seen at screening events. The prevalence of these occlusal characteristics was calculated and compared to the pooled mean of a normal population as determined by the National Health and Nutrition Examination Survey studies. Twenty RSS subjects and 16 subjects with GHD or ISS were studied. The RSS cohort presented statistically significantly greater mean overbite, as well as mandibular and maxillary crowding, compared to the general population. Descriptive statistics were performed for the GHD and ISS group. Occlusal abnormalities are prevalent in children with growth disorders.
The effect of neck dissection on quality of life after chemoradiation.
Donatelli-Lassig, Amy Anne; Duffy, Sonia A; Fowler, Karen E; Ronis, David L; Chepeha, Douglas B; Terrell, Jeffrey E
2008-10-01
To determine differences in quality of life (QOL) between patients with head and neck cancer who receive chemoradiation versus chemoradiation and neck dissection. A prospective cohort study was conducted at two tertiary otolaryngology clinics and a Veterans Administration hospital. The sample comprised 103 oropharyngeal patients with Stage IV squamous cell carcinoma treated via chemoradiation +/- neck dissection. The intervention was a self-administered health survey to collect health, demographic, and QOL information pretreatment and 1 year later; the main outcome measure was QOL via the SF-36 and HNQoL. Descriptive statistics were calculated for health/clinical characteristics, demographics, and QOL scores. t tests evaluated changes in QOL over time. Sixty-five patients underwent chemoradiation and 38 patients underwent chemoradiation and neck dissection. Only the pain index of the SF-36 showed a significant difference between groups (P < 0.05), with the neck dissection group reporting greater pain. After post-treatment neck dissection, patients experience a statistically significant decrement in bodily pain domain scores, but other QOL scores are similar to those of patients who underwent chemoradiation alone.
The effect of neck dissection on quality of life after chemoradiation
Lassig, Amy Anne Donatelli; Duffy, Sonia A.; Fowler, Karen E.; Ronis, David L.; Chepeha, Douglas B.; Terrell, Jeffrey E.
2010-01-01
Objective To determine differences in QOL between head and neck cancer patients receiving chemoradiation versus chemoradiation and neck dissection. Methods A prospective cohort study was conducted at 2 tertiary otolaryngology clinics and a VA. Sample: 103 oropharyngeal Stage IV SCCA patients treated via chemoradiation +/− neck dissection. Intervention: self-administered health survey collecting health, demographic, and QOL information pretreatment and 1 year later. Main outcome measures: QOL via SF-36 and HNQoL. Descriptive statistics were calculated for health / clinical characteristics, demographics, and QOL scores. T-tests evaluated changes in QOL over time. Results 65 patients received chemoradiation and 38 chemoradiation + neck dissection. Only the pain index of the SF-36 showed a significant difference between groups (p<.05) with the neck dissection group reporting greater pain. Conclusions After post-treatment neck dissection, patients experience statistically significant decrement in bodily pain domain scores, but other QOL scores are similar to those of patients undergoing chemoradiation alone. PMID:18922336
Detection and Estimation of an Optical Image by Photon-Counting Techniques. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Wang, Lily Lee
1973-01-01
Statistical description of a photoelectric detector is given. The photosensitive surface of the detector is divided into many small areas, and the moment generating function of the photo-counting statistic is derived for large time-bandwidth product. The detection of a specified optical image in the presence of the background light by using the hypothesis test is discussed. The ideal detector based on the likelihood ratio from a set of numbers of photoelectrons ejected from many small areas of the photosensitive surface is studied and compared with the threshold detector and a simple detector which is based on the likelihood ratio by counting the total number of photoelectrons from a finite area of the surface. The intensity of the image is assumed to be Gaussian distributed spatially against the uniformly distributed background light. The numerical approximation by the method of steepest descent is used, and the calculations of the reliabilities for the detectors are carried out by a digital computer.
Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin
2018-01-01
A required step for presenting results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system used and import it into statistical software like SAS software or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement, and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After uploading the file, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are only stored in the application as long as the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analyses of statisticians, but it can be used as a starting point for their examination and reporting.
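The per-item descriptive statistics such a tool generates reduce, for each item, to a count of present and missing values plus either numeric summaries or a frequency table. A minimal sketch with hypothetical item data (the item names and data layout below are illustrative, not the CDISC ODM structure the application actually parses):

```python
def item_summary(values):
    """Basic descriptive statistics for one study item; None marks missing."""
    present = [v for v in values if v is not None]
    n = len(present)
    summary = {"n": n, "missing": len(values) - n}
    if n and all(isinstance(v, (int, float)) for v in present):
        # numeric item: location and range
        summary.update(mean=sum(present) / n, min=min(present), max=max(present))
    else:
        # categorical item: frequency table instead of mean/min/max
        freq = {}
        for v in present:
            freq[v] = freq.get(v, 0) + 1
        summary["frequencies"] = freq
    return summary

# Hypothetical collected study data, keyed by item name.
items = {
    "AGE": [34, 41, None, 29, 55],
    "SEX": ["F", "M", "F", "F", None],
}
report = {name: item_summary(vals) for name, vals in items.items()}
```

Generating such a report for every item in one pass, with no per-item programming, is precisely what removes the need for a trained statistician at the monitoring stage.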
NASA Astrophysics Data System (ADS)
Salvato, Steven Walter
The purpose of this study was to analyze questions within the chapters of a nontraditional general chemistry textbook and the four general chemistry textbooks most widely used by Texas community colleges in order to determine if the questions require higher- or lower-order thinking according to Bloom's taxonomy. The study employed quantitative methods. Bloom's taxonomy (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956) was utilized as the main instrument in the study. Additional tools were used to help classify the questions into the proper category of the taxonomy (McBeath, 1992; Metfessel, Michael, & Kirsner, 1969). The top four general chemistry textbooks used in Texas community colleges and Chemistry: A Project of the American Chemical Society (Bell et al., 2005) were analyzed during the fall semester of 2010 in order to categorize the questions within the chapters into one of the six levels of Bloom's taxonomy. Two coders were used to assess reliability. The data were analyzed using descriptive and inferential methods. The descriptive method involved calculation of the frequencies and percentages of coded questions from the books as belonging to the six categories of the taxonomy. Questions were dichotomized into higher- and lower-order thinking questions. The inferential methods involved chi-square tests of association to determine if there were statistically significant differences among the four traditional college general chemistry textbooks in the proportions of higher- and lower-order questions and if there were statistically significant differences between the nontraditional chemistry textbook and the four traditional general chemistry textbooks. Findings indicated statistically significant differences among the four textbooks frequently used in Texas community colleges in the number of higher- and lower-level questions. Statistically significant differences were also found among the four textbooks and the nontraditional textbook. 
After the analysis of the data, conclusions were drawn, implications for practice were delineated, and recommendations for future research were given.
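The chi-square test of association used to compare proportions of higher- and lower-order questions can be sketched in pure Python. The function computes only the Pearson statistic (significance would be judged against a chi-square critical value), and the counts below are illustrative, not data from the study:

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts of higher- vs lower-order questions in two textbooks
table = [[120, 380],   # textbook A: higher-order, lower-order
         [ 60, 440]]   # textbook B: higher-order, lower-order
print(round(chi_square(table), 2))
```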
A Multidisciplinary Approach for Teaching Statistics and Probability
ERIC Educational Resources Information Center
Rao, C. Radhakrishna
1971-01-01
The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)
The Performance and Retention of Female Navy Officers with a Military Spouse
2017-03-01
[Record contains only table-of-contents fragments: 2. Female Officer Retention and Dual-Military Couples; 3. Demographic Statistics; III. Data Description and Statistics; 2. Independent Variables; C. Summary Statistics]
Back to basics: an introduction to statistics.
Halfens, R J G; Meijers, J M M
2013-05-01
In the second article in the series, Professor Ruud Halfens and Dr Judith Meijers give an overview of statistics, both descriptive and inferential. They describe the first principles of statistics, including some relevant inferential tests.
Students' attitudes towards learning statistics
NASA Astrophysics Data System (ADS)
Ghulami, Hassan Rahnaward; Hamid, Mohd Rashid Ab; Zakaria, Roslinazairimah
2015-05-01
A positive attitude towards learning is vital for mastering the core content of the subject under study, and learning statistics, especially at the university level, is no exception. This study therefore investigates students' attitudes towards learning statistics. Six variables, or constructs, were identified: affect, cognitive competence, value, difficulty, interest, and effort. The instrument used was a questionnaire adopted and adapted from the reliable Survey of Attitudes Towards Statistics (SATS©) instrument. The study was conducted among undergraduate engineering students at a university on the East Coast of Malaysia. The respondents were students from different faculties who were taking the applied statistics course. The results are analysed descriptively and contribute to an understanding of students' attitudes towards the teaching and learning of statistics.
Reynolds, Richard J; Fenster, Charles B
2008-05-01
Pollinator importance, the product of visitation rate and pollinator effectiveness, is a descriptive parameter of the ecology and evolution of plant-pollinator interactions. Naturally, sources of its variation should be investigated, but the SE of pollinator importance has never been properly reported. Here, a Monte Carlo simulation study and a result from mathematical statistics on the variance of the product of two random variables are used to estimate the mean and confidence limits of pollinator importance for three visitor species of the wildflower, Silene caroliniana. Both methods provided similar estimates of mean pollinator importance and its interval if the sample size of the visitation and effectiveness datasets were comparatively large. These approaches allowed us to determine that bumblebee importance was significantly greater than clearwing hawkmoth, which was significantly greater than beefly. The methods could be used to statistically quantify temporal and spatial variation in pollinator importance of particular visitor species. The approaches may be extended for estimating the variance of more than two random variables. However, unless the distribution function of the resulting statistic is known, the simulation approach is preferable for calculating the parameter's confidence limits.
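For independent random variables the variance of a product has a closed form, Var(XY) = σx²σy² + σx²μy² + σy²μx², and a Monte Carlo simulation of the kind the abstract describes can be checked against it. The means and standard deviations below are hypothetical stand-ins for visitation rate and effectiveness, not the study's estimates:

```python
import random

random.seed(1)

# Hypothetical independent X (visitation rate) and Y (effectiveness),
# modelled as normal variables
mx, sx = 10.0, 2.0
my, sy = 0.5, 0.1

# Exact variance of the product of two independent random variables
var_exact = sx**2 * sy**2 + sx**2 * my**2 + sy**2 * mx**2

# Monte Carlo estimate of the same quantity
draws = [random.gauss(mx, sx) * random.gauss(my, sy) for _ in range(200_000)]
m = sum(draws) / len(draws)
var_mc = sum((d - m) ** 2 for d in draws) / (len(draws) - 1)

print(var_exact, round(var_mc, 3))
```

The simulation approach generalizes to confidence limits (via percentiles of the simulated products) when the distribution of the statistic is unknown, which is the advantage the authors note.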
Factors associated with mouth breathing in children with developmental disabilities.
de Castilho, Lia Silva; Abreu, Mauro Henrique Nogueira Guimarães; de Oliveira, Renata Batista; Souza E Silva, Maria Elisa; Resende, Vera Lúcia Silva
2016-01-01
To investigate the prevalence of and factors associated with mouth breathing among patients with developmental disabilities at a dental service. We analyzed 408 dental records. Mouth breathing was reported by the patients' parents and confirmed by direct observation. The other variables were as follows: history of asthma, bronchitis, palate shape, pacifier use, thumb sucking, nail biting, use of medications, gastroesophageal reflux, bruxism, gender, age, and diagnosis of the patient. Statistical analysis included descriptive analysis with ratio calculation and multiple logistic regression. Variables with p < 0.25 were included in the model to estimate the adjusted OR (95% CI), calculated by the forward stepwise method. Variables with p < 0.05 were kept in the model. Being male (p = 0.016) and use of centrally acting drugs (p = 0.001) were the variables that remained in the model. Among patients with developmental disabilities, boys and psychotropic drug users had a greater chance of being mouth breathers. © 2016 Special Care Dentistry Association and Wiley Periodicals, Inc.
Equations of state for explosive detonation products: The PANDA model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerley, G.I.
1994-05-01
This paper discusses a thermochemical model for calculating equations of state (EOS) for the detonation products of explosives. This model, which was first presented at the Eighth Detonation Symposium, is available in the PANDA code and is referred to here as "the PANDA model". The basic features of the PANDA model are as follows. (1) Statistical-mechanical theories are used to construct EOS tables for each of the chemical species that are to be allowed in the detonation products. (2) The ideal mixing model is used to compute the thermodynamic functions for a mixture of these species, and the composition of the system is determined from the assumption of chemical equilibrium. (3) For hydrocode calculations, the detonation product EOS are used in tabular form, together with a reactive burn model that allows description of shock-induced initiation and growth or failure as well as ideal detonation wave propagation. This model has been implemented in the three-dimensional Eulerian code, CTH.
Gee, Bryan M.; Lloyd, Kimberly; Devine, Nancy; Tyrrell, Erin; Evans, Trisha; Hill, Rebekah; Dineen, Stacee; Magalogo, Kristin
2016-01-01
Occupational therapists determine the dosage when establishing the plan of care for their pediatric clients. A content analysis was conducted using 123 pediatric occupational therapy outcomes studies from 9 scholarly international occupational therapy journals. The parameters of dosage were calculated using descriptive statistics in order to obtain a representation of dosage within the current body of pediatric occupational therapy outcomes studies. The results revealed that most studies reported only some of the dosage parameters. The average findings for the subcomponents of dosage were session length M = 58.7 minutes, duration of plan of care M = 12.1 weeks, session frequency M = 3.4 per week, and total hours of therapy M = 18.1 hours. This first attempt at describing and calculating dosage in pediatric occupational therapy practice indicates that the published literature lacks the evidence needed to adequately guide OT dosage decisions. Further research on dosage in pediatric occupational therapy practice is needed. PMID:26949547
Unbiased mean direction of paleomagnetic data and better estimate of paleolatitude
NASA Astrophysics Data System (ADS)
Hatakeyama, T.; Shibuya, H.
2010-12-01
In paleomagnetism, when only paleodirection data are available without paleointensities, we calculate Fisher-mean directions (I, D) and Fisher-mean VGP positions as the description of the mean field. However, Kono (1997) and Hatakeyama and Kono (2001) showed that these averaged directions are not unbiased estimates of the mean directions derived from the time-averaged field (TAF). Hatakeyama and Kono (2002) calculated TAF and paleosecular variation (PSV) models for the past 5 My, taking into account the biases that arise from averaging nonlinear functions, such as the summation of unit vectors in the Fisher statistics procedure. Here we present a zonal TAF model based on the Hatakeyama and Kono TAF model. Moreover, we introduce the bias angles in the mean direction due to the PSV and a method for determining true paleolatitudes, representative of the TAF, from paleodirections. This method will help tectonic studies, especially in the accurate estimation of paleolatitude in middle-latitude regions.
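The Fisher-mean direction referred to above is obtained by summing the unit vectors of the individual directions; a minimal sketch, with hypothetical site directions, might look like this (kappa is Fisher's precision parameter, (N-1)/(N-R)):

```python
import math

def fisher_mean(directions):
    """Fisher mean of (declination, inclination) pairs given in degrees."""
    x = y = z = 0.0
    for dec, inc in directions:
        d, i = math.radians(dec), math.radians(inc)
        x += math.cos(i) * math.cos(d)   # north component
        y += math.cos(i) * math.sin(d)   # east component
        z += math.sin(i)                 # down component
    n = len(directions)
    r = math.sqrt(x * x + y * y + z * z) # resultant length, <= n
    mean_dec = math.degrees(math.atan2(y, x)) % 360
    mean_inc = math.degrees(math.asin(z / r))
    kappa = (n - 1) / (n - r)            # Fisher precision parameter
    return mean_dec, mean_inc, kappa

# Hypothetical site directions (dec, inc) in degrees
dirs = [(358, 54), (5, 60), (2, 57), (352, 55), (7, 58)]
dec, inc, k = fisher_mean(dirs)
print(round(dec, 1), round(inc, 1), round(k, 1))
```

The bias the authors discuss arises exactly here: averaging unit vectors is a nonlinear operation, so the Fisher mean of directions drawn from a dispersed field does not coincide with the direction of the time-averaged field itself.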
Student's Conceptions in Statistical Graph's Interpretation
ERIC Educational Resources Information Center
Kukliansky, Ida
2016-01-01
Histograms, box plots and cumulative distribution graphs are popular graphic representations for statistical distributions. The main research question that this study focuses on is how college students deal with interpretation of these statistical graphs when translating graphical representations into analytical concepts in descriptive statistics.…
Peterson, Cynthia K; Saupe, Nadja; Buck, Florian; Pfirrmann, Christian W A; Zanetti, Marco; Hodler, Juerg
2010-12-01
The purpose of this study was to evaluate pain relief 20 to 30 minutes after diagnostic or therapeutic injections into the sternoclavicular joint and to compare patient outcomes based on the CT diagnosis. Informed consent was obtained from each patient. Ethics approval was not required. Fifty patients who had CT-guided injections of corticosteroid and local anesthetic into their sternoclavicular joints were included in the study. Preinjection and 20- to 30-minute postinjection visual analog scale data were recorded and compared with the imaging findings agreed on by consensus. Kappa statistics were calculated for the reliability of the imaging diagnosis. The percentage of patients improving after joint injection was calculated, and the risk ratio comparing the response of patients with osteoarthritis to those without osteoarthritis was computed. The correlation between the severity of each patient's osteoarthritis and the pain response was calculated using Spearman's correlation coefficient. Sixty-six percent of the patients reported clinically significant pain reduction between 20 and 30 minutes after injection. The proportion of patients with osteoarthritis who had a clinically significant response was 67%, compared with 64% for patients who did not have osteoarthritis. This difference was neither statistically nor clinically significant. There was no correlation between the severity of osteoarthritis and the amount of pain reduction (r = 0.03). The reliability of the imaging diagnosis was substantial. Two thirds of patients having sternoclavicular joint injections of corticosteroids and local anesthetics report clinically significant improvement regardless of the abnormalities detected on their CT images.
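The kappa statistic used for diagnosis reliability can be sketched as follows; the two raters' readings below are hypothetical, not the study's data:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical labels."""
    assert len(r1) == len(r2)
    n = len(r1)
    cats = sorted(set(r1) | set(r2))
    # Observed agreement
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement from each rater's marginal proportions
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# Hypothetical CT readings by two observers (OA = osteoarthritis)
rater1 = ["OA", "OA", "normal", "OA", "normal", "OA", "normal", "OA"]
rater2 = ["OA", "OA", "normal", "normal", "normal", "OA", "normal", "OA"]
print(round(cohens_kappa(rater1, rater2), 2))
```

On the common Landis and Koch scale, a kappa between 0.61 and 0.80 is labeled "substantial" agreement, the term the abstract uses.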
Faheem, Muhammad; Heyden, Andreas
2014-08-12
We report the development of a quantum mechanics/molecular mechanics free energy perturbation (QM/MM-FEP) method for modeling chemical reactions at metal-water interfaces. This novel solvation scheme combines plane-wave density functional theory (DFT), periodic electrostatic embedded cluster method (PEECM) calculations using Gaussian-type orbitals, and classical molecular dynamics (MD) simulations to obtain a free energy description of a complex metal-water system. We derive a potential of mean force (PMF) of the reaction system within the QM/MM framework. A fixed-size, finite ensemble of MM conformations is used to permit precise evaluation of the PMF of QM coordinates and its gradient defined within this ensemble. Local conformations of adsorbed reaction moieties are optimized using sequential MD-sampling and QM-optimization steps. An approximate reaction coordinate is constructed using a number of interpolated states, and the free energy difference between adjacent states is calculated using the QM/MM-FEP method. By avoiding on-the-fly QM calculations and by circumventing the challenges associated with statistical averaging during MD sampling, a computational speedup of multiple orders of magnitude is realized. The method is systematically validated against the results of ab initio QM calculations and demonstrated for C-C cleavage in double-dehydrogenated ethylene glycol on a Pt(111) model surface.
NASA Astrophysics Data System (ADS)
Menthe, R. W.; McColgan, C. J.; Ladden, R. M.
1991-05-01
The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single-rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three-dimensional compressible panel method which considers the effects of thin, cambered, multiple blades which may be highly swept. These airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The user's manual for the UAAP code is divided into five sections: general code description; input description; output description; system description; and error codes. The user must have access to IMSL10 libraries (MATH and SFUN) for the numerous calls made for Bessel functions and matrix inversion. For plotted output, users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.
Microscopic description of production cross sections including deexcitation effects
NASA Astrophysics Data System (ADS)
Sekizawa, Kazuyuki
2017-07-01
Background: At the forefront of nuclear science, the production of new neutron-rich isotopes is continuously pursued at accelerator laboratories all over the world. To explore the currently unknown territories of the nuclear chart far from stability, reliable theoretical predictions are indispensable. Purpose: To provide a reliable prediction of production cross sections taking into account secondary deexcitation processes, both particle evaporation and fission, a new method called TDHF+GEMINI is proposed, which combines the microscopic time-dependent Hartree-Fock (TDHF) theory with a sophisticated statistical compound-nucleus deexcitation model, GEMINI++. Methods: Low-energy heavy-ion reactions are described based on three-dimensional Skyrme-TDHF calculations. Using the particle-number projection method, production probabilities, total angular momenta, and excitation energies of primary reaction products are extracted from the TDHF wave function after collision. Production cross sections for secondary reaction products are evaluated employing GEMINI++. Results are compared with available experimental data and widely used grazing calculations. Results: The method is applied to describe cross sections for multinucleon transfer processes in 40Ca+124Sn (E_c.m. ≃ 128.54 MeV), 48Ca+124Sn (E_c.m. ≃ 125.44 MeV), 40Ca+208Pb (E_c.m. ≃ 208.84 MeV), 58Ni+208Pb (E_c.m. ≃ 256.79 MeV), 64Ni+238U (E_c.m. ≃ 307.35 MeV), and 136Xe+198Pt (E_c.m. ≃ 644.98 MeV) reactions at energies close to the Coulomb barrier. It is shown that the inclusion of secondary deexcitation processes, which are dominated by neutron evaporation in the present systems, substantially improves agreement with the experimental data. The magnitude of the evaporation effects is very similar to the one observed in grazing calculations. TDHF+GEMINI provides a better description of the absolute value of the cross sections for channels involving transfer of more than one proton, compared to the grazing results.
However, there remain discrepancies between the measurements and the calculated cross sections, indicating a limit of the theoretical framework that works with a single mean-field potential. Possible causes of the discrepancies are discussed. Conclusions: To perfectly reproduce experimental cross sections for multinucleon transfer processes, one should go beyond the standard self-consistent mean-field description. Nevertheless, the proposed method will provide valuable information to optimize production mechanisms of new neutron-rich nuclei through its microscopic, nonempirical predictions.
The assessment of Urban Storm Inundation
NASA Astrophysics Data System (ADS)
Setyandito, Oki; Wijayanti, Yureana; Alwan, Muhammad; Chayati, Cholilul; Meilani
2017-12-01
A sustainable and integrated plan for solving urban storm inundation problems is an urgent issue in Indonesia. Its basis should be a reliable and complete dataset of urban storm inundation areas in Indonesia, giving a clear description of inundation areas so that the best solution can be formulated. In this study, Statistics Indonesia data for thirty-three provinces were assessed for 2000 through 2012, providing data series of urban flood area, flood frequency, and land cover changes. Drainage system conditions in big cities should be well understood to ensure infrastructure condition and performance; where inundation occurs, a drainage system problem can be concluded. Inundation data are also important for future drainage system design. The study provides an estimate of urban storm inundation area based on calculations from Statistics Indonesia data. Moreover, the study is complemented by an analysis and review of the capacity of an existing drainage channel, using the case study of Mataram, West Nusa Tenggara. Rainfall data were obtained from three rainfall stations around Mataram City. The storm water quantity was calculated using three different approaches: 1) the Rational Method; 2) summation of existing inundation and surface runoff discharge; 3) discharge calculated from existing channel dimensions. The results of these approaches were then compared, and the storm water quantity gap was taken as the quantity of inundation. The results show that 36% of the drainage channels in the Brenyok Kanan River subsystem could not accommodate the storm water runoff in the area, causing inundation. The drainage channels should be redesigned using the design discharge from the Rational Method approach. In the lowest-lying areas, construction of a detention or storage pond is essential to prevent inundation. Furthermore, the benefits and drawbacks of the statistics database are discussed. Recommendations include using more refined urban land-use typologies that better represent the physical alteration of hydrological pathways.
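The Rational Method named as approach 1 reduces, in SI units, to Q = 0.278 C i A, with Q in m³/s, runoff coefficient C, rainfall intensity i in mm/h, and catchment area A in km². A sketch with hypothetical catchment values (not the Mataram design figures):

```python
def rational_q(c, i_mm_per_h, area_km2):
    """Peak discharge (m3/s) from the Rational Method, Q = 0.278 C i A."""
    return 0.278 * c * i_mm_per_h * area_km2

# Hypothetical urban subcatchment: runoff coefficient 0.7,
# design rainfall intensity 80 mm/h, area 1.5 km2
print(round(rational_q(0.7, 80.0, 1.5), 2))
```

The design check described in the abstract would then compare this peak discharge against the capacity computed from the existing channel dimensions.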
Lynch, Kimberly; Kendall, Mat; Shanks, Katherine; Haque, Ahmed; Jones, Emily; Wanis, Maggie G; Furukawa, Michael; Mostashari, Farzad
2014-02-01
Assess the Regional Extension Center (REC) program's progress toward its goal of supporting over 100,000 providers in small, rural, and underserved practices to achieve meaningful use (MU) of an electronic health record (EHR). Data collected January 2010 through June 2013 via monitoring and evaluation of the 4-year REC program. Descriptive study of 62 REC programs. Primary data collected from RECs were merged with nine other datasets, and descriptive statistics of progress by practice setting and penetration of targeted providers were calculated. RECs recruited almost 134,000 primary care providers (PCPs), or 44 percent of the nation's PCPs; 86 percent of these were using an EHR with advanced functionality and almost half (48 percent) have demonstrated MU. Eighty-three percent of Federally Qualified Health Centers and 78 percent of the nation's Critical Access Hospitals were participating with an REC. RECs have made substantial progress in assisting PCPs with adoption and MU of EHRs. This infrastructure supports small practices, community health centers, and rural and public hospitals to use technology for care delivery transformation and improvement. © Health Research and Educational Trust.
Stochastic effects in a thermochemical system with Newtonian heat exchange.
Nowakowski, B; Lemarchand, A
2001-12-01
We develop a mesoscopic description of stochastic effects in the Newtonian heat exchange between a diluted gas system and a thermostat. We explicitly study the homogeneous Semenov model involving a thermochemical reaction and neglecting consumption of reactants. The master equation includes a transition rate for the thermal transfer process, which is derived on the basis of the statistics for inelastic collisions between gas particles and walls of the thermostat. The main assumption is that the perturbation of the Maxwellian particle velocity distribution can be neglected. The transition function for the thermal process admits a continuous spectrum of temperature changes, and consequently, the master equation has a complicated integro-differential form. We perform Monte Carlo simulations based on this equation to study the stochastic effects in the Semenov system in the explosive regime. The dispersion of ignition times is calculated as a function of system size. For sufficiently small systems, the probability distribution of temperature displays transient bimodality during the ignition period. The results of the stochastic description are successfully compared with those of direct simulations of microscopic particle dynamics.
Expósito-Ruiz, Manuela; Sánchez-López, Juan; Ruiz-Bailén, Manuel; Rodríguez-Del Águila, María Del Mar
2017-01-01
To determine the frequency of use of Spanish pediatric emergency services and to describe user profiles and geographic variations. A descriptive study based on data from the Spanish National Health Survey. We calculated descriptive statistics and analyzed crude and adjusted odds ratios (ORs). Thirty-five percent of the 5495 respondents had come to an emergency department in the past year, and 88.1% of them had used the services of a Spanish national health service hospital. Factors associated with higher use of emergency services were male sex of the patient (OR, 1.202; 95% CI, 1.047-1.381), a higher educational level of the parents (OR, 1.255; 95% CI, 0.983-1.603), and younger age of the child (OR, 0.909; 95% CI, 0.894-0.924). Emergency department use varied widely from one Spanish community to another. There was a positive correlation between use and the presence of a foreign-born population (ρ = 0.495, P = .031). The rate of emergency department use is high in Spain. Variability between geographic areas is considerable, and some variation is explained by population characteristics.
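A crude odds ratio with its 95% confidence interval, of the kind reported above, can be computed from a 2×2 table via the log-odds standard error. The counts below are hypothetical, not the survey's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude OR and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: boys vs girls attending the emergency department
or_, lo, hi = odds_ratio(450, 950, 400, 1000)
print(round(or_, 3), round(lo, 3), round(hi, 3))
```

Adjusted ORs, as in the study, would instead come from exponentiating logistic regression coefficients that control for the other covariates.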
2015-12-01
[Record contains only table-of-contents fragments: Waivers; Appendix C. Descriptive Statistics; Summary Statistics of Dependent Variables (Table 5); Summary Statistics of Academics Variables (Table 6); Summary Statistics of Application Variables (Table 7); Table 8]
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2013 CFR
2013-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2012 CFR
2012-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2014 CFR
2014-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2011 CFR
2011-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
21 CFR 314.50 - Content and format of an application.
Code of Federal Regulations, 2010 CFR
2010-04-01
... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...
Micellar hexagonal phases in lyotropic liquid crystals
NASA Astrophysics Data System (ADS)
Amaral, L. Q.; Gulik, A.; Itri, R.; Mariani, P.
1992-09-01
The hexagonal cell parameter a of the sodium dodecyl lauryl sulfate and water system as a function of volume concentration c_v in phase Hα shows the functional behavior expected for micelles of finite length: a ~ c_v^(-1/3). The interpretation of X-ray data based on finite micelles leads to an alternative description of the hexagonal phase Hα: spherocylindrical micelles of constant radius with a length that may grow along the range of the Hα phase. Results are compared with recent statistical-mechanical calculations for the isotropic I-Hα transition. The absence of diffraction in the direction perpendicular to the hexagonal plane is ascribed to polydispersity of micellar length, which is also a necessary condition for the occurrence of direct I-Hα transitions.
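A scaling law like a ~ c_v^(-1/3) is typically verified by fitting the slope of log a against log c_v. The sketch below uses synthetic data generated to follow the power law exactly (the prefactor 50.0 and the concentration values are arbitrary), so the fitted slope recovers -1/3:

```python
import math

def loglog_slope(x, y):
    """Least-squares slope of log(y) versus log(x)."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

# Synthetic (concentration, cell parameter) pairs obeying a = k * cv^(-1/3)
cv = [0.10, 0.15, 0.20, 0.30, 0.40]
a = [50.0 * c ** (-1.0 / 3.0) for c in cv]
print(round(loglog_slope(cv, a), 3))
```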
Gazica, Michele W; Spector, Paul E
2016-01-01
Safety climate, violence prevention climate, and civility climate were independently developed and linked to domain-specific workplace hazards, although all three were designed to promote the physical and psychological safety of workers. To test domain specificity between conceptually related workplace climates and relevant workplace hazards. Data were collected from 368 persons employed in various industries and descriptive statistics were calculated for all study variables. Correlational and relative weights analyses were used to test for domain specificity. The three climate domains were similarly predictive of most workplace hazards, regardless of domain specificity. This study suggests that the three climate domains share a common higher order construct that may predict relevant workplace hazards better than any of the scales alone.
NASA Astrophysics Data System (ADS)
Payraudeau, S.; Tournoud, M. G.; Cernesson, F.
Distributed modelling in hydrology uses catchment subdivision to take physical characteristics into account. In this paper, we test the effect of the land use aggregation scheme on the catchment hydrological response. The evolution of intra-subcatchment land use is studied using statistical and entropy methods. The SCS-CN method is used to calculate effective rainfall, which is here taken as the hydrological response. Our purpose is to determine whether a critical threshold area exists that is appropriate for the application of hydrological modelling. Land use aggregation effects on effective rainfall are assessed on a small Mediterranean catchment. The results show that land use aggregation and the land use classification type have significant effects on hydrological modelling, and in particular on effective rainfall modelling.
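The SCS-CN effective-rainfall calculation follows the standard curve-number equations: potential retention S = 25400/CN - 254 (mm) and direct runoff Q = (P - Ia)²/(P - Ia + S) with the conventional initial abstraction Ia = 0.2S. A sketch with hypothetical rainfall and curve numbers (aggregating land use changes the effective CN, which is what drives the sensitivity the paper studies):

```python
def scs_cn_runoff(p_mm, cn):
    """SCS-CN effective rainfall (direct runoff) in mm,
    using the conventional initial abstraction Ia = 0.2 S."""
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s               # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0             # all rainfall abstracted, no runoff
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Hypothetical 100 mm storm over two land-use mixes
print(round(scs_cn_runoff(100.0, 85), 1))   # more urbanised (higher CN)
print(round(scs_cn_runoff(100.0, 70), 1))   # more vegetated (lower CN)
```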
NASA Astrophysics Data System (ADS)
Li, Qun; Chen, Qian; Chong, Jing
2017-12-01
In InAlN/GaN heterostructures, alloy clustering-induced InAlN conduction band fluctuations interact with electrons penetrating into the barrier layers and thus affect the electron transport. Based on the statistical description of InAlN compositional distribution, a theoretical model of the conduction band fluctuation scattering (CBFS) is presented. The model calculations show that the CBFS-limited mobility decreases with increasing two-dimensional electron gas sheet density and is inversely proportional to the squared standard deviation of In distribution. The AlN interfacial layer can effectively suppress the CBFS via decreasing the penetration probability. This model is directed towards understanding the transport properties in heterostructure materials with columnar clusters.
One Yard Below: Education Statistics from a Different Angle.
ERIC Educational Resources Information Center
Education Intelligence Agency, Carmichael, CA.
This report offers a different perspective on education statistics by highlighting rarely used "stand-alone" statistics on public education, inputs, outputs, and descriptions, and it uses interactive statistics that combine two or more statistics in an unusual way. It is a report that presents much evidence, but few conclusions. It is not intended…
A Bibliography of Statistical Applications in Geography, Technical Paper No. 9.
ERIC Educational Resources Information Center
Greer-Wootten, Bryn; And Others
Included in this bibliography are resource materials available to both college instructors and students on statistical applications in geographic research. Two stages of statistical development are treated in the bibliography. They are 1) descriptive statistics, in which the sample is the focus of interest, and 2) analytical statistics, in which…
Ene-Obong, Henrietta Nkechi; Onuoha, Nne Ola; Eme, Paul Eze
2017-11-01
This study examined gender roles, family relationships, food security, and nutritional status of households in Ohafia: a matrilineal society in Nigeria. A cross-sectional descriptive study was conducted. Multistage sampling technique was used to select 287 households from three villages: Akanu, Amangwu, and Elu. Qualitative and quantitative data collection methods were adopted, namely, focus group discussions and questionnaires. Anthropometric measurements (height and weight for mothers and children and Mid-Upper Arm Circumference for young children) were taken using standard techniques. The body mass index of women was calculated. All nutritional indices were compared with reference standards. Food insecurity was assessed using the Household Hunger Scale and Dietary Diversity Score, then analysed using SPSS (Statistical Product and Service Solutions) version 21. Data analysis used descriptive statistics. Most (91.2%) of the respondents were female. The matrilineal system known as ikwu nne or iri ala a nne (inheritance through mothers' lineage) is still in place but is changing. One important benefit of the system is the access to land by women. Whereas women participated actively in agriculture, food preparation, and care of family, the men were moving to off-farm activities. High prevalence of household food insecurity (66%) and signs of malnutrition including moderate to severe stunting (48.4%) and wasting (31.7%) in children, household hunger (34.5%), and overweight (27.5%) and obesity (19.2%) among mothers were observed. These communities urgently need gender sensitive food and nutrition interventions. © 2018 John Wiley & Sons Ltd.
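As context for the Household Hunger Scale used above, the FANTA scoring convention sums three items (each scored 0-2) into a 0-6 score with three categories. A sketch of that scoring, with the function name as an illustrative assumption:

```python
def household_hunger_category(item_scores):
    """Categorize a FANTA Household Hunger Scale score.

    item_scores: three frequency items, each scored 0-2, summed to 0-6.
    """
    total = sum(item_scores)
    if total <= 1:
        return "little to no hunger"
    if total <= 3:
        return "moderate hunger"
    return "severe hunger"
```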
Putz, Mihai V.
2009-01-01
The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended to many-electronic systems through the density functional closure relationship. Moreover, the use of the path integral formalism for electronic density prescription presents several advantages: it assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for the time-space evolution of quantum information; resembles the Schrödinger equation; and allows a quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism were presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix and/or the canonical density were rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for Bohr's quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along with the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as the Markovian generalizations of Becke-Edgecombe electronic localization functions - all advocate for the reliability of assuming the PI formalism of quantum mechanics as a versatile one, suited for analytical and/or computational modeling of a variety of fundamental physical and chemical reactivity concepts characterizing the (density driven) many-electronic systems. PMID:20087467
Urban pavement surface temperature. Comparison of numerical and statistical approach
NASA Astrophysics Data System (ADS)
Marchetti, Mario; Khalifa, Abderrahmen; Bues, Michel; Bouilloud, Ludovic; Martin, Eric; Chancibaut, Katia
2015-04-01
The forecast of pavement surface temperature is very specific to the context of urban winter maintenance, where it is used to manage snow plowing and the salting of roads. Such forecasts mainly rely on numerical models based on a description of the energy balance between the atmosphere, the buildings, and the pavement, with a canyon configuration. Nevertheless, there is a specific need for a physical description and numerical implementation of traffic in the energy flux balance. Traffic was originally considered as a constant. Many changes were made to a numerical model to describe as accurately as possible the effects of traffic on this urban energy balance, such as tire friction, the pavement-air exchange coefficient, and the net infrared flux balance. Experiments based on infrared thermography and radiometry were then conducted to quantify the effect of traffic on urban pavement surfaces. Based on meteorological data, the corresponding pavement temperature forecasts were calculated and compared with field measurements. Results indicated good agreement between the forecasts from the numerical model based on this energy balance approach and the measurements. A complementary forecast approach based on principal component analysis (PCA) and partial least-squares regression (PLS) was also developed, with data from thermal mapping using infrared radiometry. The forecast of pavement surface temperature from air temperature was obtained in the specific case of the urban configuration, with traffic taken into account in the measurements used for the statistical analysis. A comparison between results from the numerical model based on the energy balance and from PCA/PLS was then conducted, indicating the advantages and limits of each approach.
Research Education in Undergraduate Occupational Therapy Programs.
ERIC Educational Resources Information Center
Petersen, Paul; And Others
1992-01-01
Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)
Policy Safeguards and the Legitimacy of Highway Interdiction
2016-12-01
Table of contents excerpt: B. Bias within Law Enforcement; C. Statistical Data Gathering; 3. Controlling Discretion; 4. Statistical Data Collection for Traffic Stops; A. Description of Statistical Data Collected; B. Data Organization and Analysis.
Fish: A New Computer Program for Friendly Introductory Statistics Help
ERIC Educational Resources Information Center
Brooks, Gordon P.; Raffle, Holly
2005-01-01
All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…
Neyeloff, Jeruza L; Fuchs, Sandra C; Moreira, Leila B
2012-01-20
Meta-analyses are necessary to synthesize data obtained from primary research, and in many situations reviews of observational studies are the only available alternative. General-purpose statistical packages can meta-analyze data, but usually require external macros or coding. Commercial specialist software is available, but may be expensive and focused on a particular type of primary data. Most available software packages have limitations in dealing with descriptive data, and the graphical display of summary statistics such as incidence and prevalence is unsatisfactory. Analyses can be conducted using Microsoft Excel, but no previous guide was available. We constructed a step-by-step guide to performing a meta-analysis in a Microsoft Excel spreadsheet, using either fixed-effect or random-effects models. We have also developed a second spreadsheet capable of producing customized forest plots. It is possible to conduct a meta-analysis using only Microsoft Excel. More importantly, to our knowledge this is the first description of a method for producing a statistically adequate but graphically appealing forest plot summarizing descriptive data, using widely available software.
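The fixed-effect pooling step of such a meta-analysis is an inverse-variance weighted mean of the study estimates. A generic sketch of that step (not the guide's Excel formulas; a random-effects model would additionally estimate a between-study variance and inflate the standard errors accordingly):

```python
import math

def fixed_effect_meta(estimates, std_errors):
    """Inverse-variance fixed-effect pooling of study estimates.

    Returns (pooled_estimate, pooled_se). Estimates are assumed to be on
    a scale where a normal approximation holds (e.g. log odds ratios).
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se
```

Each study contributes in proportion to the precision of its estimate, which is exactly what a spreadsheet implementation computes cell by cell.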
Cold fission description with constant and varying mass asymmetries
NASA Astrophysics Data System (ADS)
Duarte, S. B.; Rodríguez, O.; Tavares, O. A. P.; Gonçalves, M.; García, F.; Guzmán, F.
1998-05-01
Different descriptions for varying the mass asymmetry in the fragmentation process are used to calculate the cold fission barrier penetrability. The relevance of the appropriate choice for both the description of the prescission phase and inertia coefficient to unify alpha decay, cluster radioactivity, and spontaneous cold fission processes in the same theoretical framework is explicitly shown. We calculate the half-life of all possible partition modes of nuclei of A>200 following the most recent Mass Table by Audi and Wapstra. It is shown that if one uses the description in which the mass asymmetry is maintained constant during the fragmentation process, the experimental half-life values and mass yield of 234U cold fission are satisfactorily reproduced.
ERIC Educational Resources Information Center
Sharief, Mostafa; Naderi, Mahin; Hiedari, Maryam Shoja; Roodbari, Omolbanin; Jalilvand, Mohammad Reza
2012-01-01
The aim of current study is to determine the strengths and weaknesses of descriptive evaluation from the viewpoint of principals, teachers and experts of Chaharmahal and Bakhtiari province. A descriptive survey was performed. Statistical population includes 208 principals, 303 teachers, and 100 executive experts of descriptive evaluation scheme in…
NASA Astrophysics Data System (ADS)
Schilling, Osvaldo F.
2016-11-01
The alternating Fe-Mn layered structures of the compounds FeMnAsxP1-x display properties which have been demonstrated experimentally to be very promising as far as commercial applications of the magnetocaloric effect are concerned. However, the theoretical literature on this and other families of magnetocaloric compounds still adopts simple molecular-field models in the description of important statistical mechanical properties such as the entropy variation that accompanies isothermal cycling of the applied magnetic field, as well as the temperature variation following adiabatic magnetic field cycles. In the present paper, a random phase approximation Green function theoretical treatment is applied to such structures. The advantages of such an approach are well known, since the details of the crystal structure are easily incorporated in the model and a precise description of correlations between neighboring spins can be obtained. We focus on a simple one-exchange-parameter Heisenberg model, and the observed first-order phase transitions are reproduced by the introduction of a biquadratic term in the Hamiltonian whose origin is related both to the magnetoelastic coupling with the phonon spectrum in these compounds and to the values of the spins in the Fe and Mn ions. The calculations are compared with experimental magnetocaloric data for the FeMnAsxP1-x compounds. In particular, the magnetic field dependence of the entropy variation at the transition temperature predicted from the Landau theory of continuous phase transitions is reproduced even in the case of discontinuous transitions.
González, Diego Luis; Pimpinelli, Alberto; Einstein, T L
2017-07-01
We study the effect of hindered aggregation on the island formation process in a one- (1D) and two-dimensional (2D) point-island model for epitaxial growth with arbitrary critical nucleus size i. In our model, the attachment of monomers to preexisting islands is hindered by an additional attachment barrier, characterized by length l_{a}. For l_{a}=0 the islands behave as perfect sinks, while for l_{a}→∞ they behave as reflecting boundaries. For intermediate values of l_{a}, the system exhibits a crossover between two different kinds of processes, diffusion-limited aggregation and attachment-limited aggregation. We calculate the growth exponents of the density of islands and monomers for the low-coverage and aggregation regimes. The capture-zone (CZ) distributions are also calculated for different values of i and l_{a}. In order to obtain a good spatial description of the nucleation process, we propose a fragmentation model, which is based on an approximate description of nucleation inside the gaps for 1D and the CZs for 2D. In both cases, nucleation is described using two different physically rooted probabilities, which are related to the microscopic parameters of the model (i and l_{a}). We test our analytical model against extensive numerical simulations and previously established results. The proposed model provides an excellent description of the statistical behavior of the system for arbitrary values of l_{a} and i=1, 2, and 3.
Student rating as an effective tool for teacher evaluation.
Aslam, Muhammad Nadeem
2013-01-01
The objective was to determine the effectiveness of students' ratings as a teacher evaluation tool, using a concurrent mixed method at King Edward Medical University, Lahore, from January to June 2010. An anonymous 5-point Likert scale survey questionnaire was administered to a single class of 310 students, and 12 students were selected for structured interviews based on non-probability purposive sampling. Informed consent was procured. The students were required to rate 6 teachers and to discuss the teachers' performance in detail. Quantitative data collected through the survey were analyzed using SPSS 15, and qualitative data were analyzed with the help of content analysis by identifying themes and patterns from thick descriptions. The student feedback was assessed for effectiveness in terms of its feasibility and as an indicator of teaching attributes. Descriptive statistics of the quantitative data obtained from the survey were used to calculate the mean and standard deviation for each teacher individually; this showed the average direction of the student ratings. The percentages of responses in terms of overall teaching effectiveness were 85.96% for teacher A, 65.53% for teacher B, 65.20% for teacher C, 69.62% for teacher D, 65.32% for teacher E, and 64.24% for teacher F. The structured interviews generated qualitative data which validated the students' views about the strengths and weaknesses of teachers, and helped to determine the effectiveness of their ratings and feedback. This simple rating system clearly showed its importance and hence can be used in institutions as a regular method of evaluating teaching faculty.
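The per-teacher summary described above (mean, standard deviation, and an overall-effectiveness percentage) can be sketched as follows; treating the percentage as the mean rating relative to the maximum Likert score is an assumption made here for illustration, as the abstract does not state how the percentages were derived:

```python
from statistics import mean, stdev

def rating_summary(scores, max_score=5):
    """Mean, sample SD, and mean-as-percentage-of-maximum for Likert ratings."""
    m = mean(scores)
    return m, stdev(scores), 100.0 * m / max_score
```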
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Tutorial (ST) is designed as a follow-up to Statistical Analysis of Research Data (SARD), held in April 2018. The tutorial will apply the general principles of statistical analysis of research data, including descriptive statistics, z- and t-tests of means and mean
Kottner, Jan; Halfens, Ruud
2010-05-01
Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportion of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in Dutch hospitals changed, a secondary data analysis of the annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands was conducted using different statistical approaches. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year, and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods, the prevalence rates of five hospitals varied by chance only. The results of the chi-square trend tests and the SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about a decrease in nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
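The P charts used in the SPC approach above plot each period's proportion against 3-sigma binomial limits around the overall proportion; points inside the limits are taken to vary by chance only. A minimal sketch (the function name is an illustrative assumption):

```python
import math

def p_chart_limits(event_counts, sample_sizes):
    """Centre line and per-sample 3-sigma limits for a P chart.

    Returns (p_bar, [(lower, upper), ...]); limits are clipped to [0, 1].
    """
    p_bar = sum(event_counts) / sum(sample_sizes)
    limits = []
    for n in sample_sizes:
        sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)))
    return p_bar, limits
```

Because the limits widen for smaller samples, a year with fewer at-risk patients needs a larger swing in prevalence before it signals a real change, which is one reason P charts and trend tests can disagree.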
Non-resonant multipactor-A statistical model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rasch, J.; Johansson, J. F.
2012-12-15
High power microwave systems operating in vacuum or near vacuum run the risk of multipactor breakdown. In order to avoid multipactor, it is necessary to make theoretical predictions of critical parameter combinations. These treatments are generally based on the assumption of electrons moving in resonance with the electric field while traversing the gap between critical surfaces. Through comparison with experiments, it has been found that only for small system dimensions will the resonant approach give correct predictions. Apparently, the resonance is destroyed due to the statistical spread in electron emission velocity, and for a more valid description it is necessary to resort to rather complicated statistical treatments of the electron population, and extensive simulations. However, in the limit where resonance is completely destroyed, it is possible to use a much simpler treatment, here called non-resonant theory. In this paper, we develop the formalism for this theory, use it to calculate universal curves for the existence of multipactor, and compare with previous results. Two important effects that lead to an increase in the multipactor threshold in comparison with the resonant prediction are identified. These are the statistical spread of impact speed, which leads to a lower average electron impact speed, and the impact of electrons in phase regions where the secondary electrons are immediately reabsorbed, leading to an effective removal of electrons from the discharge.
Maćków, Anna; Małachowska-Sobieska, Monika; Demczuk-Włodarczyk, Ewa; Sidorowska, Marta; Szklarska, Alicja; Lipowicz, Anna
2014-01-01
The aim of the study was to present the influence of neurophysiological hippotherapy on the transference of the centre of gravity (COG) among children with cerebral palsy (CP). The study involved 19 children aged 4-13 years suffering from CP who demonstrated an asymmetric (A/P) model of compensation. Body balance was studied with the Cosmogamma Balance Platform. An examination on this platform was performed before and after a session of neurophysiological hippotherapy. In order to compare the correlations and differences between the examinations, the results were analysed using Student's t-test for dependent samples, with p ≤ 0.05 as the level of statistical significance, and descriptive statistics were calculated. The mean value of the body's centre of gravity in the frontal plane (COG X) was 18.33 mm during the first examination, changing by 21.84 mm after neurophysiological hippotherapy towards deloading of the antigravity lower limb (p ≤ 0.0001). The other stabilographic parameters increased; however, only the change in the average speed of antero-posterior COG oscillation was statistically significant (p = 0.0354). One session of neurophysiological hippotherapy induced statistically significant changes in the position of the body's centre of gravity in the frontal plane and in the average speed of COG oscillation in the sagittal plane among CP children demonstrating an asymmetric model of compensation (A/P).
Quantum Monte Carlo studies of solvated systems
NASA Astrophysics Data System (ADS)
Schwarz, Kathleen; Letchworth Weaver, Kendra; Arias, T. A.; Hennig, Richard G.
2011-03-01
Solvation qualitatively alters the energetics of diverse processes from protein folding to reactions on catalytic surfaces. An explicit description of the solvent in quantum-mechanical calculations requires both a large number of electrons and exploration of a large number of configurations in the phase space of the solvent. These problems can be circumvented by including the effects of solvent through a rigorous classical density-functional description of the liquid environment, thereby yielding free energies and thermodynamic averages directly, while eliminating the need for explicit consideration of the solvent electrons. We have implemented and tested this approach within the CASINO Quantum Monte Carlo code. Our method is suitable for calculations in any basis within CASINO, including b-spline and plane wave trial wavefunctions, and is equally applicable to molecules, surfaces, and crystals. For our preliminary test calculations, we use a simplified description of the solvent in terms of an isodensity continuum dielectric solvation approach, though the method is fully compatible with more reliable descriptions of the solvent we shall employ in the future.
Sexual Assault Prevention and Response Climate DEOCS 4.1 Construct Validity Summary
2017-08-01
Excerpt: ...DEOCS, (7) examining variance and descriptive statistics, (8) examining the relationship among items/areas to reduce multicollinearity, and (9) selecting items that demonstrate the strongest scale properties. Included is a review of the 4.0 description and items, followed by the proposed ... See Tables 1-7 for the description of each measure and corresponding items. Table 1. DEOCS 4.0 Perceptions of Safety Measure Description.
Development of polytoxicomania in function of defence from psychoticism.
Nenadović, Milutin M; Sapić, Rosa
2011-01-01
The proportion of polytoxicomania in youth subpopulations has been growing steadily in recent decades, and this trend is pan-continental. Psychoticism is a psychological construct that assumes special basic dimensions of personality disintegration and of cognitive functions. Psychoticism may, in general, underlie the pathological functioning of youth and influence the patterns of thought, feeling, and action that cause dysfunction. The aim of this study was to determine the distribution of the basic dimensions of psychoticism in the commitment of youth to the abuse of psychoactive substances (PAS) in order to reduce disturbing intrapsychic experiences or the manifestation of psychotic symptoms. For the purpose of this study, two groups of respondents were formed, balanced by age, gender, and structure of family of origin (at least one parent alive). The study applied the DELTA-9 instrument for the assessment of cognitive disintegration, in order to establish psychoticism and operationalize it. The obtained results were statistically analyzed. Among the parameters of descriptive statistics, the arithmetic mean was calculated with measures of dispersion. A cross-tabular analysis of the variables tested was performed, and statistical significance was assessed with Pearson's chi-squared test and analysis of variance. Age structure and gender were approximately equally represented in the group of polytoxicomaniacs and the control group; testing did not confirm a statistically significant difference (p > 0.5). Polytoxicomaniacs differed significantly from the control group of respondents in most variables of psychoticism. Testing confirmed a high statistical significance of the differences in the variables of psychoticism between the groups of respondents, with p < 0.001 to p < 0.01. A statistically significant representation of the dimension of psychoticism in the polytoxicomaniac group was established.
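A Pearson chi-squared test of independence like the one used above reduces, for a 2x2 cross-tabulation (e.g. group by outcome), to a closed-form statistic. A sketch without continuity correction (the function name is illustrative):

```python
def chi2_2x2(a, b, c, d):
    """Pearson's chi-squared statistic for the 2x2 contingency table
    [[a, b], [c, d]], with 1 degree of freedom."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den
```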
The presence of factors concerning common executive dysfunction was emphasized.
Radiation from quantum weakly dynamical horizons in loop quantum gravity.
Pranzetti, Daniele
2012-07-06
We provide a statistical mechanical analysis of quantum horizons near equilibrium in the grand canonical ensemble. By matching the description of the nonequilibrium phase in terms of weakly dynamical horizons with a local statistical framework, we implement loop quantum gravity dynamics near the boundary. The resulting radiation process provides a quantum gravity description of the horizon evaporation. For large black holes, the spectrum we derive presents a discrete structure which could be potentially observable.
Lamont, Scott; Brunero, Scott
2018-05-19
Workplace violence prevalence has attracted significant attention within the international nursing literature, though little attention to non-mental-health settings and a lack of evaluation rigor have been identified within the review literature. The aim was to examine the effects of a workplace violence training program in relation to risk assessment and management practices, de-escalation skills, breakaway techniques, and confidence levels, within an acute hospital setting. A quasi-experimental study of nurses was conducted using pretest-posttest measurements of educational objectives and confidence levels, with two-week follow-up, at a 440-bed metropolitan tertiary referral hospital in Sydney, Australia. Participants were nurses working in specialties identified as 'high risk' for violence. A pre-post-test design was used, with participants attending a one-day workshop. The workshop evaluation comprised two validated questionnaires: the Continuing Professional Development Reaction questionnaire and the Confidence in Coping with Patient Aggression Instrument. Descriptive and inferential statistics were calculated. The paired t-test was used to assess the statistical significance of changes in clinical behaviour intention and confidence scores from pre- to post-intervention, and Cohen's d effect sizes were calculated to determine the extent of the significant results. Seventy-eight participants completed both pre- and post-workshop evaluation questionnaires. Statistically significant increases in behaviour intention scores were found in fourteen of the fifteen constructs relating to the three broad workshop objectives, and in confidence ratings, with medium to large effect sizes observed for some constructs. A significant increase in overall confidence in coping with patient aggression was also found post-test, with a large effect size. Positive results were observed from the workplace violence training.
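The paired t-test and Cohen's d described above can both be computed from the pre/post score differences. Defining d as the mean difference divided by the standard deviation of the differences is one common convention for paired designs (an assumption here, since conventions vary), and the names are illustrative:

```python
import math
from statistics import mean, stdev

def paired_t_and_cohens_d(pre, post):
    """Paired t statistic and Cohen's d for matched pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    sd = stdev(diffs)                  # sample SD, n - 1 denominator
    t = mean(diffs) / (sd / math.sqrt(n))
    d = mean(diffs) / sd               # standardized mean difference
    return t, d
```

Note that t = d * sqrt(n), so with larger samples even modest effect sizes reach statistical significance.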
Training needs to be complemented by a multi-faceted organisational approach which includes governance, quality, and review processes. Copyright © 2018 Elsevier Ltd. All rights reserved.
Manterola, Carlos; Busquets, Juli; Pascual, Marta; Grande, Luis
2006-02-01
The aim of this study was to determine the methodological quality of articles on therapeutic procedures published in Cirugía Española and to study its association with the publication year, center, and subject-matter. A bibliometric study that included all articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 was performed. All kinds of clinical designs were considered, excluding editorials, review articles, letters to editor, and experimental studies. The variables analyzed were: year of publication, center, design, and methodological quality. Methodological quality was determined by a valid and reliable scale. Descriptive statistics (calculation of means, standard deviation and medians) and analytical statistics (Pearson's chi2, nonparametric, ANOVA and Bonferroni tests) were used. A total of 244 articles were studied (197 case series [81%], 28 cohort studies [12%], 17 clinical trials [7%], 1 cross sectional study and 1 case-control study [0.8%]). The studies were performed mainly in Catalonia and Murcia (22% and 16%, respectively). The most frequent subject areas were soft tissue and hepatobiliopancreatic surgery (23% and 19%, respectively). The mean and median of the methodological quality score calculated for the entire series was 10.2 +/- 3.9 points and 9.5 points, respectively. Methodological quality significantly increased by publication year (p < 0.001). An association between methodological quality and subject area was observed but no association was detected with the center performing the study. The methodological quality of articles on therapeutic procedures published in Cirugía Española between 2001 and 2004 is low. However, a statistically significant trend toward improvement was observed.
Mandell, Jacob C; Rhodes, Jeffrey A; Shah, Nehal; Gaviola, Glenn C; Gomoll, Andreas H; Smith, Stacy E
2017-11-01
Accurate assessment of knee articular cartilage is clinically important. Although 3.0 Tesla (T) MRI is reported to offer improved diagnostic performance, literature regarding the clinical impact of MRI field strength is lacking. The purpose of this study is to compare the diagnostic performance of clinical MRI reports for assessment of cartilage at 1.5 and 3.0 T in comparison to arthroscopy. This IRB-approved retrospective study consisted of 300 consecutive knees in 297 patients who had routine clinical MRI and arthroscopy. Descriptions of cartilage from MRI reports of 165 knees at 1.5 T and 135 at 3.0 T were compared with arthroscopy. The sensitivity, specificity, percent of articular surfaces graded concordantly, and percent of articular surfaces graded within one grade of the arthroscopic grading were calculated for each articular surface at 1.5 and 3.0 T. Agreement between MRI and arthroscopy was calculated with the weighted-kappa statistic. Significance testing was performed utilizing the z-test after bootstrapping to obtain the standard error. The sensitivity, specificity, percent of articular surfaces graded concordantly, and percent of articular surfaces graded within one grade were 61.4%, 82.7%, 62.2%, and 77.5% at 1.5 T and 61.8%, 80.6%, 59.5%, and 75.6% at 3.0 T, respectively. The weighted kappa statistic was 0.56 at 1.5 T and 0.55 at 3.0 T. There was no statistically significant difference in any of these parameters between 1.5 and 3.0 T. Factors potentially contributing to the lack of diagnostic advantage of 3.0 T MRI are discussed.
Taylor, Cliff D.; Causey, J. Douglas; Denning, Paul; Hammarstrom, Jane M.; Hayes, Timothy S.; Horton, John D.; Kirschbaum, Michael J.; Parks, Heather L.; Wilson, Anna B.; Wintzer, Niki E.; Zientek, Michael L.
2013-01-01
Chapter 1 of this report summarizes a descriptive model of sediment-hosted stratabound copper deposits. General characteristics and subtypes of sediment-hosted stratabound copper deposits are described based upon worldwide examples. Chapter 2 provides a global database of 170 sediment-hosted copper deposits, along with a statistical evaluation of grade and tonnage data for stratabound deposits, a comparison of stratabound deposits in the CACB with those found elsewhere, a discussion of the distinctive characteristics of the subtypes of sediment-hosted copper deposits that occur within the CACB, and guidelines for using grade and tonnage distributions for assessment of undiscovered resources in sediment-hosted stratabound deposits in the CACB. Chapter 3 presents a new descriptive model of sediment-hosted structurally controlled replacement and vein (SCRV) copper deposits with descriptions of individual deposits of this type in the CACB and elsewhere. Appendix A describes a relational database of tonnage, grade, and other information for more than 100 sediment-hosted copper deposits in the CACB. These data are used to calculate the pre-mining mineral endowment for individual deposits in the CACB and serve as the basis for the grade and tonnage models presented in chapter 2. Appendix B describes three spatial databases (Esri shapefiles) for (1) point locations of more than 500 sediment-hosted copper deposits and prospects, (2) projected surface extent of 86 selected copper ore bodies, and (3) areal extent of 77 open pits, all within the CACB.
ERIC Educational Resources Information Center
Chan, Shiau Wei; Ismail, Zaleha
2014-01-01
The focus of assessment in statistics has gradually shifted from traditional assessment towards alternative assessment where more attention has been paid to the core statistical concepts such as center, variability, and distribution. In spite of this, there are comparatively few assessments that combine the significant three types of statistical…
Skinner, Carl G; Patel, Manish M; Thomas, Jerry D; Miller, Michael A
2011-01-01
Statistical methods are pervasive in medical research and general medical literature. Understanding general statistical concepts will enhance our ability to critically appraise the current literature and ultimately improve the delivery of patient care. This article intends to provide an overview of the common statistical methods relevant to medicine.
34 CFR 668.49 - Institutional fire safety policies and fire statistics.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false Institutional fire safety policies and fire statistics... fire statistics. (a) Additional definitions that apply to this section. Cause of fire: The factor or... statistics described in paragraph (c) of this section. (2) A description of each on-campus student housing...
Factors Affecting Hemodialysis Adequacy in Cohort of Iranian Patient with End Stage Renal Disease.
Shahdadi, Hosein; Balouchi, Abbas; Sepehri, Zahra; Rafiemanesh, Hosein; Magbri, Awad; Keikhaie, Fereshteh; Shahakzehi, Ahmad; Sarjou, Azizullah Arbabi
2016-08-01
There are many factors that can affect dialysis adequacy, such as the type of vascular access, filter type, device used, and the dose and route of erythropoiesis-stimulating agents (ESA). The aim of this study was to investigate factors affecting hemodialysis adequacy in a cohort of Iranian patients with end-stage renal disease (ESRD). This cross-sectional study was conducted on 133 hemodialysis patients referred to two dialysis units in Sistan-Baluchistan province, in the cities of Zabol and Iranshahr, Iran. We examined the effects of vascular access type, filter type, device used, and the dose, route of delivery, and type of ESA on hemodialysis adequacy. Dialysis adequacy was calculated using the Kt/V formula. A two-part questionnaire was used, covering demographic data as well as access type, filter type, device used for hemodialysis (HD), type of Eprex injection, route of administration, blood group, and hemoglobin response to ESA. The data were analyzed using the SPSS v16 statistical software. Descriptive statistics, the Mann-Whitney test, and multiple regression were used where applicable. Calculated dialysis adequacy ranged from 0.28 to 2.39. Of the patients, 76.7% were dialyzed via an arteriovenous fistula (AVF) and 23.3% used central venous catheters (CVC). There was no statistically significant difference in dialysis adequacy by vascular access type, device used for HD (Fresenius and B. Braun), or filter used for HD (p > 0.05). However, a significant difference was observed between dialysis adequacy and Eprex injection and patients' time of dialysis (p < 0.05). Subcutaneous ESA (Eprex) injection and dialysis shift (being dialyzed in the morning) can have a positive impact on dialysis adequacy. Patients should be educated that the type of device used for HD and the vascular access used have no significant effects on dialysis adequacy.
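The abstract reports adequacy via "the Kt/V formula" without specifying the variant; a common choice in practice is the second-generation Daugirdas single-pool equation, which is assumed here. A minimal sketch (all patient values below are hypothetical):

```python
import math

def sp_ktv(bun_pre, bun_post, hours, uf_liters, weight_kg):
    """Single-pool Kt/V via the second-generation Daugirdas equation.

    R is the post-/pre-dialysis BUN ratio; the 0.008*t term corrects
    for urea generation during the session, and the UF/W term accounts
    for convective removal by ultrafiltration.
    """
    r = bun_post / bun_pre
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_liters / weight_kg

# Hypothetical session: pre-BUN 100 mg/dL, post-BUN 40 mg/dL,
# 4-hour treatment, 2 L ultrafiltration, 70 kg post-dialysis weight.
print(round(sp_ktv(100, 40, 4, 2, 70), 2))  # 1.07
```

A value of roughly 1.2 or above per session is commonly taken as adequate, so a result near 1.07 would fall at the low end of the range reported in the abstract.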
Plastic Surgery Statistics in the US: Evidence and Implications.
Heidekrueger, Paul I; Juran, Sabrina; Patel, Anup; Tanna, Neil; Broer, P Niclas
2016-04-01
The American Society of Plastic Surgeons (ASPS) publishes yearly procedural statistics, collected through questionnaires and online via Tracking Operations and Outcomes for Plastic Surgeons (TOPS). The statistics, disaggregated by U.S. region, leave two important factors unaccounted for: (1) the underlying base population and (2) the number of surgeons performing the procedures. The presented analysis puts the regional distribution of surgeries into perspective and contributes to fulfilling the TOPS legislation objectives. ASPS statistics from 2005 to 2013 were analyzed by geographic region of the U.S. Using population estimates from the 2010 U.S. Census Bureau, procedures were calculated per 100,000 population. Then, based on the ASPS member roster, the rate of surgeries per surgeon by region was calculated, and these two variables were related to each other. In 2013, 1,668,420 esthetic surgeries were performed in the U.S., resulting in the following ASPS ranking: 1st Mountain/Pacific (Region 5; 502,094 procedures, 30% share), 2nd New England/Middle Atlantic (Region 1; 319,515, 19%), 3rd South Atlantic (Region 3; 310,441, 19%), 4th East/West South Central (Region 4; 274,282, 16%), and 5th East/West North Central (Region 2; 262,088, 16%). However, considering the underlying populations, the distribution and ranking appear different, displaying a smaller variance in surgical demand. Further, the number of surgeons and the rate of procedures show great regional variation. Demand for plastic surgery is influenced by patients' geographic background and varies among U.S. regions. While ASPS data provide important information, additional insight regarding the demand for surgical procedures can be gained by taking certain demographic factors into consideration. This journal requires that the authors assign a level of evidence to each article.
For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.
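The normalization described above, procedures per 100,000 population and per surgeon, can be sketched as follows. The procedure counts come from the abstract; the population and surgeon counts are hypothetical placeholders for the Census and ASPS roster figures:

```python
# Regional procedure counts from the abstract (2013 esthetic surgeries).
procedures = {"Region 5": 502_094, "Region 1": 319_515, "Region 3": 310_441}

# Hypothetical denominators; real values would come from the 2010 U.S.
# Census (population) and the ASPS member roster (surgeon counts).
population = {"Region 5": 71_945_553, "Region 1": 55_317_240, "Region 3": 61_774_970}
surgeons = {"Region 5": 1_500, "Region 1": 1_200, "Region 3": 1_100}

for region in procedures:
    per_100k = procedures[region] / population[region] * 100_000
    per_surgeon = procedures[region] / surgeons[region]
    print(f"{region}: {per_100k:.0f} per 100,000 population, "
          f"{per_surgeon:.0f} per surgeon")
```

With real denominators, a region with a large raw count but an even larger population can rank lower per capita, which is the effect the abstract describes.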
Gravitational self-interactions of a degenerate quantum scalar field
NASA Astrophysics Data System (ADS)
Chakrabarty, Sankha S.; Enomoto, Seishi; Han, Yaqi; Sikivie, Pierre; Todarello, Elisa M.
2018-02-01
We develop a formalism to help calculate in quantum field theory the departures from the description of a system by classical field equations. We apply the formalism to a homogeneous condensate with attractive contact interactions and to a homogeneous self-gravitating condensate in critical expansion. In their classical descriptions, such condensates persist forever. We show that in their quantum description, parametric resonance causes quanta to jump in pairs out of the condensate into all modes with wave vector less than some critical value. We calculate, in each case, the time scale over which the homogeneous condensate is depleted and after which a classical description is invalid. We argue that the duration of classicality of inhomogeneous condensates is shorter than that of homogeneous condensates.
NASA Astrophysics Data System (ADS)
Ruggles, Adam J.
2015-11-01
This paper presents improved statistical insight regarding the self-similar scalar mixing process of atmospheric hydrogen jets and the downstream region of under-expanded hydrogen jets. Quantitative planar laser Rayleigh scattering imaging is used to probe both jets. The self-similarity of statistical moments up to the sixth order (beyond the literature-established second order) is documented in both cases. This is achieved using a novel self-similar normalization method that facilitated a degree of statistical convergence that is typically limited to continuous, point-based measurements. This demonstrates that image-based measurements of a limited number of samples can be used for self-similar scalar mixing studies. Both jets exhibit the same radial trends of these moments, demonstrating that advanced atmospheric self-similarity can be applied in the analysis of under-expanded jets. Self-similar histograms away from the centerline are shown to be the combination of two distributions. The first is attributed to turbulent mixing. The second, a symmetric Poisson-type distribution centered on zero mass fraction, progressively becomes the dominant and eventually sole distribution at the edge of the jet. This distribution is attributed to shot noise-affected pure air measurements, rather than a diffusive superlayer at the jet boundary. This conclusion is reached after a rigorous measurement uncertainty analysis and inspection of pure air data collected with each hydrogen data set. A threshold based upon the measurement noise analysis is used to separate the turbulent and pure air data, and thus estimate intermittency. Beta-distributions (four parameters) are used to accurately represent the turbulent distribution moments. This combination of measured intermittency and four-parameter beta-distributions constitutes a new, simple approach to model scalar mixing.
Comparisons between global moments from the data and moments calculated using the proposed model show excellent agreement. This was attributed to the high quality of the measurements which reduced the width of the correctly identified, noise-affected pure air distribution, with respect to the turbulent mixing distribution. The ignitability of the atmospheric jet is determined using the flammability factor calculated from both kernel density estimated (KDE) PDFs and PDFs generated using the newly proposed model. Agreement between contours from both approaches is excellent. Ignitability of the under-expanded jet is also calculated using KDE PDFs. Contours are compared with those calculated by applying the atmospheric model to the under-expanded jet. Once again, agreement is excellent. This work demonstrates that self-similar scalar mixing statistics and ignitability of atmospheric jets can be accurately described by the proposed model. This description can be applied with confidence to under-expanded jets, which are more realistic of leak and fuel injection scenarios.
FAST COGNITIVE AND TASK ORIENTED, ITERATIVE DATA DISPLAY (FACTOID)
2017-06-01
approaches. As a result, the following assumptions guided our efforts in developing modeling and descriptive metrics for evaluation purposes... Application Evaluation. Our analytic workflow for evaluation is to first provide descriptive statistics about applications across metrics (performance... distributions for evaluation purposes because the goal of evaluation is accurate description, not inference (e.g., prediction). Outliers depicted
NASA Astrophysics Data System (ADS)
Sellaoui, Lotfi; Lima, Éder Cláudio; Dotto, Guilherme Luiz; Dias, Silvio L. P.; Ben Lamine, Abdelmottaleb
Two equilibrium models based on statistical physics, i.e., a monolayer model with single energy and a multilayer model with saturation, were developed and employed to assess the steric and energetic aspects of the adsorption of reactive violet 5 dye (RV-5) on cocoa shell activated carbon (AC) and commercial activated carbon (CAC) at different temperatures (from 298 to 323 K). The results showed that the multilayer model with saturation was able to represent the adsorption system. This model assumes that adsorption occurs by the formation of a certain number of layers. The n values ranged from 1.10 to 2.98, indicating that the adsorbate molecules interacted in an inclined position on the adsorbent surface and aggregated in solution. The study of the total number of formed layers (1 + L2) showed that steric hindrance is the dominant factor. The description of the adsorbate-adsorbent interactions by calculation of the adsorption energy indicated that the process was physisorption in nature, since the values were lower than 40 kJ mol-1.
NASA Technical Reports Server (NTRS)
Meinert, D. L.; Malone, D. L.; Voss, A. W. (Principal Investigator); Scarpace, F. L.
1980-01-01
LANDSAT MSS data from four different dates were extracted from computer tapes using a semiautomated digital data handling and analysis system. Reservoirs were extracted from the surrounding land matrix using a Band 7 density level slice of 3, and descriptive statistics, including the mean, variance, and ratios between bands, were calculated for each of the four bands. Significant correlations (≥ 0.80) were identified between the MSS statistics and many trophic indicators from ground-truth water quality data collected at 35 reservoirs in the greater Tennessee Valley region. Regression models were developed which gave significant estimates of each reservoir's trophic state as defined by its trophic state index and explained, in all four LANDSAT frames, at least 85 percent of the variability in the data. To illustrate the spatial variations within reservoirs as well as the relative variations between reservoirs, a table look-up elliptical classification was used in conjunction with each reservoir's trophic state index to classify each reservoir on a pixel-by-pixel basis and produce color-coded thematic representations.
Swimming path statistics of an active Brownian particle with time-dependent self-propulsion
NASA Astrophysics Data System (ADS)
Babel, S.; ten Hagen, B.; Löwen, H.
2014-02-01
Typically, in the description of active Brownian particles, a constant effective propulsion force is assumed, which is then subjected to fluctuations in orientation and translation, leading to a persistent random walk with an enlarged long-time diffusion coefficient. Here, we generalize previous results for the swimming path statistics to a time-dependent, and thus in many situations more realistic, propulsion which is a prescribed input. We analytically calculate both the noise-free and the noise-averaged trajectories for time-periodic propulsion under the action of an additional torque. In the deterministic case, such an oscillatory microswimmer moves on closed paths that can be much more complicated than the commonly observed straight lines and circles. When exposed to random fluctuations, the mean trajectories turn out to be self-similar curves which bear the characteristics of their noise-free counterparts. Furthermore, we consider a propulsion force which scales in time t as ∝ t^α (with α = 0, 1, 2, …) and analyze the resulting superdiffusive behavior. Our predictions are verifiable for diffusiophoretic artificial microswimmers with prescribed propulsion protocols.
Serafica, Reimund; Angosta, Alona D
2016-09-01
The purpose of this research study was to examine whether level of acculturation is a predictor of body mass index, waist circumference, and waist-hip ratio in Filipino Americans with hypertension in the United States. The Filipino Americans (N = 108) were recruited from a primary care clinic in the United States. Two instruments were used to collect and operationalize the variables: (1) a Socioeconomic/Demographic Questionnaire and (2) A Short Acculturation Scale for Filipino Americans. Descriptive statistics and partial least squares were used to calculate the results. The partial least squares path model identified acculturation as a predictor of body mass index, waist circumference, and waist-hip ratio among Filipino Americans. The positive path coefficient (β = 0.384) was statistically significant (t = 5.92, P < .001). Health care providers need to stress the importance of the degree of acculturation when developing culturally appropriate lifestyle and health promotion interventions among immigrant patients with hypertension. Copyright © 2016 American Society of Hypertension. Published by Elsevier Inc. All rights reserved.
Mathematical and Statistical Software Index. Final Report.
ERIC Educational Resources Information Center
Black, Doris E., Comp.
Brief descriptions are provided of general-purpose mathematical and statistical software, including 27 "stand-alone" programs, three subroutine systems, and two nationally recognized statistical packages, which are available in the Air Force Human Resources Laboratory (AFHRL) software library. This index was created to enable researchers…
Education Statistics Quarterly, Spring 2001.
ERIC Educational Resources Information Center
Education Statistics Quarterly, 2001
2001-01-01
The "Education Statistics Quarterly" gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products and funding opportunities developed over a 3-month period. Each issue…
What can 35 years and over 700,000 measurements tell us about noise exposure in the mining industry?
Roberts, Benjamin; Sun, Kan; Neitzel, Richard L
2017-01-01
To analyse over 700,000 cross-sectional measurements from the Mine Safety and Health Administration (MSHA) and develop statistical models to predict noise exposure for a worker. Descriptive statistics were used to summarise the data. Two linear regression models were used to predict noise exposure based on the MSHA permissible exposure limit (PEL) and action level (AL), respectively. Twofold cross-validation was used to compare the exposure estimates from the models to actual measurements. The mean difference and t-statistic were calculated for each job title to determine whether the model predictions were significantly different from the actual data. Measurements were acquired from MSHA through a Freedom of Information Act request. From 1979 to 2014, noise exposure has decreased. Measurements taken before the implementation of MSHA's revised noise regulation in 2000 were on average 4.5 dBA higher than after the law was implemented. Both models produced exposure predictions that were less than 1 dBA different from the holdout data. Overall noise levels in mines have been decreasing. However, this decrease has not been uniform across all mining sectors. The exposure predictions from the model will be useful to help predict hearing loss in workers in the mining industry.
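The twofold cross-validation comparison described above can be sketched with synthetic data standing in for the MSHA measurements; the predictors, coefficients, and noise levels below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the MSHA data: survey year and a sector code
# predict the measured noise dose (dBA). Coefficients are hypothetical.
n = 400
year = rng.integers(1979, 2015, n).astype(float)
sector = rng.integers(0, 4, n).astype(float)
noise = 95.0 - 0.2 * (year - 1979) + 1.5 * sector + rng.normal(0, 2.0, n)

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n), year, sector])

# Twofold cross-validation: fit on one half, predict the held-out half,
# and record the mean difference between predictions and measurements.
half = n // 2
diffs = []
for train, test in [(slice(0, half), slice(half, n)),
                    (slice(half, n), slice(0, half))]:
    beta, *_ = np.linalg.lstsq(X[train], noise[train], rcond=None)
    pred = X[test] @ beta
    diffs.append(np.mean(pred - noise[test]))

mean_diff = float(np.mean(diffs))
print(f"mean prediction difference: {mean_diff:.2f} dBA")
```

When the model family matches the data-generating process, the held-out mean difference hovers near zero, consistent with the under-1-dBA differences the abstract reports.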
Better prognostic marker in ICU - APACHE II, SOFA or SAP II!
Naqvi, Iftikhar Haider; Mahmood, Khalid; Ziaullaha, Syed; Kashif, Syed Mohammad; Sharif, Asim
2016-01-01
This study was designed to determine the comparative efficacy of different scoring systems in assessing the prognosis of critically ill patients. This was a retrospective study conducted in the medical intensive care unit (MICU) and high dependency unit (HDU), Medical Unit III, Civil Hospital, from April 2012 to August 2012. All patients over 16 years of age who fulfilled the criteria for MICU admission were included. Predicted mortality by APACHE II, SAP II and SOFA was calculated. Calibration and discrimination were used to assess the validity of each scoring model. A total of 96 patients with equal gender distribution were enrolled. The average APACHE II score in non-survivors (27.97 ± 8.53) was higher than in survivors (15.82 ± 8.79), with a statistically significant p value (<0.001). The average SOFA score in non-survivors (9.68 ± 4.88) was higher than in survivors (5.63 ± 3.63), with a statistically significant p value (<0.001). The average SAP II score in non-survivors (53.71 ± 19.05) was higher than in survivors (30.18 ± 16.24), with a statistically significant p value (<0.001). All three tested scoring models (APACHE II, SAP II and SOFA) would be accurate enough for a general description of our ICU patients. APACHE II showed better calibration and discrimination power than SAP II and SOFA.
Spector, Paul E.
2016-01-01
Background: Safety climate, violence prevention climate, and civility climate were independently developed and linked to domain-specific workplace hazards, although all three were designed to promote the physical and psychological safety of workers. Purpose: To test domain specificity between conceptually related workplace climates and relevant workplace hazards. Methods: Data were collected from 368 persons employed in various industries, and descriptive statistics were calculated for all study variables. Correlational and relative weights analyses were used to test for domain specificity. Results: The three climate domains were similarly predictive of most workplace hazards, regardless of domain specificity. Discussion: This study suggests that the three climate domains share a common higher order construct that may predict relevant workplace hazards better than any of the scales alone. PMID:27110930
Forecast of future aviation fuels: The model
NASA Technical Reports Server (NTRS)
Ayati, M. B.; Liu, C. Y.; English, J. M.
1981-01-01
A conceptual model of the commercial air transportation industry is developed which can be used to predict trends in economics, demand, and consumption. The methodology is based on digraph theory, which considers the interaction of variables and the propagation of changes. Air transportation economics are treated by examination of major variables, their relationships, historic trends, and calculation of regression coefficients. A description of the modeling technique and a compilation of historic airline industry statistics used to determine interaction coefficients are included. Results of model validations show negligible difference between actual and projected values over the twenty-eight year period of 1959 to 1976. A limited application of the method presents forecasts of air transportation industry demand, growth, revenue, costs, and fuel consumption to 2020 for two scenarios of future economic growth and energy consumption.
NASA Astrophysics Data System (ADS)
Obuchowski, Nancy A.; Bullen, Jennifer A.
2018-04-01
Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
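Accuracy metrics like the area under the ROC curve discussed above can be computed directly from reader scores. A minimal sketch using the Mann-Whitney formulation of the empirical AUC (the scores below are toy data, not from the cited study):

```python
import numpy as np

def empirical_auc(neg_scores, pos_scores):
    """Area under the empirical ROC curve, computed as the Mann-Whitney
    probability that a randomly chosen positive case scores higher than
    a randomly chosen negative case (ties count as one half)."""
    neg = np.asarray(neg_scores)
    pos = np.asarray(pos_scores)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy reader confidence scores (0-100) for benign vs. malignant lesions.
benign = [10, 20, 30, 40, 50]
malignant = [35, 55, 60, 80, 90]
print(empirical_auc(benign, malignant))  # 0.92
```

An AUC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect separation; comparing two tests then reduces to comparing their AUCs with an appropriate variance estimate, as the article describes.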
GAMBIT: the global and modular beyond-the-standard-model inference tool
NASA Astrophysics Data System (ADS)
Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian
2017-11-01
We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.
Accurate simulations of helium pick-up experiments using a rejection-free Monte Carlo method
NASA Astrophysics Data System (ADS)
Dutra, Matthew; Hinde, Robert
2018-04-01
In this paper, we present Monte Carlo simulations of helium droplet pick-up experiments with the intention of developing a robust and accurate theoretical approach for interpreting experimental helium droplet calorimetry data. Our approach is capable of capturing the evaporative behavior of helium droplets following dopant acquisition, allowing for a more realistic description of the pick-up process. Furthermore, we circumvent the traditional assumption of bulk helium behavior by utilizing density functional calculations of the size-dependent helium droplet chemical potential. The results of this new Monte Carlo technique are compared to commonly used Poisson pick-up statistics for simulations that reflect a broad range of experimental parameters. We conclude by offering an assessment of both of these theoretical approaches in the context of our observed results.
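The Poisson pick-up statistics used as the baseline for comparison assign probability P(k) = λ^k e^(−λ) / k! to a droplet capturing exactly k dopants, with λ the mean number of capture events. A minimal sketch (the mean value below is hypothetical):

```python
import math

def poisson_pickup(mean_pickups, k):
    """Probability that a droplet captures exactly k dopant atoms when
    captures are independent events with mean λ (Poisson statistics)."""
    lam = mean_pickups
    return lam**k * math.exp(-lam) / math.factorial(k)

# Hypothetical mean of 2 pick-up events per droplet transit of the cell.
lam = 2.0
probs = [poisson_pickup(lam, k) for k in range(20)]
print(round(sum(probs), 6))                                # ≈ 1.0
print(round(sum(k * p for k, p in enumerate(probs)), 6))   # ≈ 2.0
```

The Monte Carlo approach in the paper departs from this baseline precisely because evaporation after each capture changes the droplet size, so successive pick-ups are no longer independent with a fixed λ.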
Radar derived spatial statistics of summer rain. Volume 1: Experiment description
NASA Technical Reports Server (NTRS)
Katz, I.; Arnold, A.; Goldhirsh, J.; Konrad, T. G.; Vann, W. L.; Dobson, E. B.; Rowland, J. R.
1975-01-01
An experiment was performed at Wallops Island, Virginia, to obtain a statistical description of summer rainstorms. Its purpose was to obtain information needed for design of earth and space communications systems in which precipitation in the earth's atmosphere scatters or attenuates the radio signal. Rainstorms were monitored with the high resolution SPANDAR radar and the 3-dimensional structures of the storms were recorded on digital tape. The equipment, the experiment, and tabulated data obtained during the experiment are described.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Statistics. 1065.602 Section 1065.602... PROCEDURES Calculations and Data Requirements § 1065.602 Statistics. (a) Overview. This section contains equations and example calculations for statistics that are specified in this part. In this section we use...
Cole, William G.; Michael, Patricia; Blois, Marsden S.
1987-01-01
A computer program was created to use information about the statistical distribution of words in journal abstracts to make probabilistic judgments about the level of description (e.g. molecular, cell, organ) of medical text. Statistical analysis of 7,409 journal abstracts taken from three medical journals representing distinct levels of description revealed that many medical words seem to be highly specific to one or another level of description. For example, the word adrenoreceptors occurred only in the American Journal of Physiology, never in the Journal of Biological Chemistry or in the Journal of the American Medical Association. Such highly specific words occurred so frequently that the automatic classification program was able to classify correctly 45 out of 45 test abstracts, with 100% confidence. These findings are interpreted in terms of both a theory of the structure of medical knowledge and the pragmatics of automatic classification.
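The classification scheme described, scoring a text by how specific its words are to each level of description, can be sketched as a simple multinomial (naive Bayes style) classifier. The toy corpus, words, and labels below are illustrative stand-ins, not the study's actual data:

```python
import math
from collections import Counter

# Toy training corpus: abstracts labeled by level of description.
train = [
    ("molecular", "enzyme kinase phosphorylation substrate enzyme"),
    ("cell", "membrane mitochondria cytoplasm membrane receptor"),
    ("organ", "ventricle perfusion myocardium ventricle pressure"),
]

counts = {level: Counter(text.split()) for level, text in train}
totals = {level: sum(c.values()) for level, c in counts.items()}
vocab = {w for c in counts.values() for w in c}

def classify(abstract, alpha=1.0):
    """Pick the level whose word distribution best explains the abstract
    (multinomial log-likelihood with add-alpha smoothing, so words never
    seen at a level do not zero out its score)."""
    best, best_lp = None, float("-inf")
    for level in counts:
        lp = 0.0
        for w in abstract.split():
            p = (counts[level][w] + alpha) / (totals[level] + alpha * len(vocab))
            lp += math.log(p)
        if lp > best_lp:
            best, best_lp = level, lp
    return best

print(classify("kinase substrate phosphorylation"))  # molecular
```

Words that occur at only one level, like "adrenoreceptors" in the study, dominate the score for that level, which is why such a simple scheme can classify with high confidence.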
78 FR 34101 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-06
... and basic descriptive statistics on the quantity and type of consumer-reported patient safety events... conduct correlations, cross tabulations of responses and other statistical analysis. Estimated Annual...
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2016-01-01
This chapter discusses the ongoing development of combined uncertainty and error bound estimates for computational fluid dynamics (CFD) calculations subject to imposed random parameters and random fields. An objective of this work is the construction of computable error bound formulas for output uncertainty statistics that guide CFD practitioners in systematically determining how accurately CFD realizations should be approximated and how accurately uncertainty statistics should be approximated for output quantities of interest. Formal error bound formulas for moment statistics that properly account for the presence of numerical errors in CFD calculations and numerical quadrature errors in the calculation of moment statistics have been previously presented in [8]. In this past work, hierarchical node-nested dense and sparse tensor product quadratures are used to calculate moment statistics integrals. In the present work, a framework has been developed that exploits the hierarchical structure of these quadratures in order to simplify the calculation of an estimate of the quadrature error needed in error bound formulas. When signed estimates of realization error are available, this signed error may also be used to estimate output quantity of interest probability densities as a means to assess the impact of realization error on these density estimates. Numerical results are presented for CFD problems with uncertainty to demonstrate the capabilities of this framework.
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal
2016-01-01
This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods were found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase over the period in the use of several statistical methods: the t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods have been used in the research articles of PJMS and that their frequency of use increased from 2005 to 2015. However, descriptive statistics remained the most frequent method of statistical analysis in the published articles, while the cross-sectional design was the most common study design.
Applied statistics in ecology: common pitfalls and simple solutions
E. Ashley Steel; Maureen C. Kennedy; Patrick G. Cunningham; John S. Stanovick
2013-01-01
The most common statistical pitfalls in ecological research are those associated with data exploration, the logic of sampling and design, and the interpretation of statistical results. Although one can find published errors in calculations, the majority of statistical pitfalls result from incorrect logic or interpretation despite correct numerical calculations. There...
Quantitative metrics for assessment of chemical image quality and spatial resolution
Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.
2016-02-28
Rationale: Currently objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC), based on signal-to-noise related statistical measures on chemical image pixels, and corrected resolving power factor (cRPF), constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing and decreasing size of surface features. Conclusions: ChemIC and cRPF, respectively, were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.
End-of-life care practices of critical care nurses: A national cross-sectional survey.
Ranse, Kristen; Yates, Patsy; Coyer, Fiona
2016-05-01
The critical care context presents important opportunities for nurses to deliver skilled, comprehensive care to patients at the end of life and their families. Limited research has identified the actual end-of-life care practices of critical care nurses. To identify the end-of-life care practices of critical care nurses. A national cross-sectional online survey. The survey was distributed to members of an Australian critical care nursing association and 392 critical care nurses (response rate 25%) completed the survey. Exploratory factor analysis using principal axis factoring with oblique rotation was undertaken on survey responses to identify the domains of end-of-life care practice. Descriptive statistics were calculated for individual survey items. Exploratory factor analysis identified six domains of end-of-life care practice: information sharing, environmental modification, emotional support, patient and family centred decision-making, symptom management and spiritual support. Descriptive statistics identified a high level of engagement in information sharing and environmental modification practices and less frequent engagement in items from the emotional support and symptom management practice areas. The findings of this study identified domains of end-of-life care practice, and critical care nurse engagement in these practices. The findings highlight future training and practice development opportunities, including the need for experiential learning targeting the emotional support practice domain. Further research is needed to enhance knowledge of symptom management practices during the provision of end-of-life care to inform and improve practice in this area. Copyright © 2015 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.
Rethinking wave-kinetic theory applied to zonal flows
NASA Astrophysics Data System (ADS)
Parker, Jeffrey
2017-10-01
Over the past two decades, a number of studies have employed a wave-kinetic theory to describe fluctuations interacting with zonal flows. Recent work has uncovered a defect in this wave-kinetic formulation: the system is dominated by the growth of (arbitrarily) small-scale zonal structures. Theoretical calculations of linear growth rates suggest, and nonlinear simulations confirm, that this system leads to the concentration of zonal flow energy in the smallest resolved scales, irrespective of the numerical resolution. This behavior results from the assumption that zonal flows are extremely long wavelength, leading to the neglect of key terms responsible for conservation of enstrophy. A corrected theory, CE2-GO, is presented; it is free of these errors yet preserves the intuitive phase-space mathematical structure. CE2-GO properly conserves enstrophy as well as energy, and yields accurate growth rates of zonal flow. Numerical simulations are shown to be well-behaved and not dependent on box size. The steady-state limit simplifies into an exact wave-kinetic form which offers the promise of deeper insight into the behavior of wavepackets. The CE2-GO theory takes its place in a hierarchy of models as the geometrical-optics reduction of the more complete cumulant-expansion statistical theory CE2. The new theory represents the minimal statistical description, enabling an intuitive phase-space formulation and an accurate description of turbulence-zonal flow dynamics. This work was supported by an NSF Graduate Research Fellowship, a US DOE Fusion Energy Sciences Fellowship, and US DOE Contract Nos. DE-AC52-07NA27344 and DE-AC02-09CH11466.
Numerical Investigations of Moisture Distribution in a Selected Anisotropic Soil Medium
NASA Astrophysics Data System (ADS)
Iwanek, M.
2018-01-01
The moisture of a soil profile changes both in time and space and depends on many factors. Changes in the quantity of water in soil can be determined on the basis of in situ measurements, but numerical methods are increasingly used for this purpose. The quality of the results obtained using pertinent software packages depends on appropriate description and parameterization of the soil medium, so properly accounting for the soil anisotropy phenomenon gains importance. Although anisotropy can be taken into account in many numerical models, isotropic soil is often assumed in the research process. However, this assumption can produce incorrect results in simulations of water changes in a soil medium. In this article, results of numerical simulations of moisture distribution in a selected soil profile are presented. The calculations were conducted assuming both isotropic and anisotropic conditions. Empirical verification of the results obtained in the numerical investigations indicated statistically significant discrepancies between the two analyzed conditions; a better fit between measured and calculated moisture values was obtained when anisotropy was included in the simulation model.
Regression analysis for solving diagnosis problem of children's health
NASA Astrophysics Data System (ADS)
Cherkashina, Yu A.; Gerget, O. M.
2016-04-01
This paper presents the results of research devoted to the application of statistical techniques, namely regression analysis, to assess the health status of children in the neonatal period based on medical data (hemostatic parameters, parameters of blood tests, gestational age, vascular-endothelial growth factor) measured at 3-5 days of the children's life. A detailed description of the studied medical data is given, and a binary logistic regression procedure is discussed. Basic results of the research are presented: a classification table of predicted versus observed values is shown, and the overall percentage of correct recognition is determined. Regression equation coefficients are calculated, and the general regression equation is written based on them. Based on the results of the logistic regression, ROC analysis was performed: the sensitivity and specificity of the model are calculated and ROC curves are constructed. These mathematical techniques allow diagnostics of children's health with a high quality of recognition. The results make a significant contribution to the development of evidence-based medicine and have high practical importance in the professional activity of the author.
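The classification-table summary this abstract describes can be sketched in a few lines of Python. The outcome labels and predicted probabilities below are invented for illustration, not the study's neonatal data, and the 0.5 cut-off is an assumed default.

```python
# Given predicted probabilities from a (hypothetical) logistic regression
# and the observed binary outcomes, build the classification table and
# compute sensitivity, specificity, and overall percentage correct.
observed  = [1, 1, 1, 0, 0, 0, 1, 0]
predicted = [0.9, 0.8, 0.4, 0.2, 0.1, 0.6, 0.7, 0.3]

def confusion(observed, predicted, threshold=0.5):
    """Count true/false positives and negatives at the given cut-off."""
    tp = fp = tn = fn = 0
    for y, p in zip(observed, predicted):
        if p >= threshold:
            tp += y
            fp += 1 - y
        else:
            fn += y
            tn += 1 - y
    return tp, fp, tn, fn

tp, fp, tn, fn = confusion(observed, predicted)
sensitivity = tp / (tp + fn)          # true-positive rate
specificity = tn / (tn + fp)          # true-negative rate
overall = (tp + tn) / len(observed)   # overall percentage correct
print(tp, fp, tn, fn, sensitivity, specificity, overall)
```

Sweeping the threshold from 0 to 1 and plotting sensitivity against 1 − specificity at each cut-off yields the ROC curve mentioned above.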
The Molybdenum-Titanium Phase Diagram Evaluated from Ab Initio Calculations
2016-10-07
thermodynamic properties of this binary system are not well known and two conflicting descriptions of the β-phase stability have been presented in the...computational thermodynamics CALPHAD approach [13] and the Thermo-Calc software [14]. These studies led to two conflicting descriptions of the stability of...energy calculations, with an energy cutoff separating core and valence states of -6 Ry. 2.2. Thermodynamic modeling The formation enthalpy of a
XPOSE: the Exxon Nuclear revised LEOPARD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skogen, F.B.
1975-04-01
Main differences between XPOSE and LEOPARD codes used to generate fast and thermal neutron spectra and cross sections are presented. Models used for fast and thermal spectrum calculations as well as the depletion calculations considering U-238 chain, U-235 chain, xenon and samarium, fission products and boron-10 are described. A detailed description of the input required to run XPOSE and a description of the output are included. (FS)
2012 aerospace medical certification statistical handbook.
DOT National Transportation Integrated Search
2013-12-01
The annual Aerospace Medical Certification Statistical Handbook reports descriptive characteristics of all active U.S. civil aviation airmen and the aviation medical examiners (AMEs) that perform the required medical examinations. The 2012 annual...
Teaching Statistics Using Classic Psychology Research: An Activities-Based Approach
ERIC Educational Resources Information Center
Holmes, Karen Y.; Dodd, Brett A.
2012-01-01
In this article, we discuss a collection of active learning activities derived from classic psychology studies that illustrate the appropriate use of descriptive and inferential statistics. (Contains 2 tables.)
NASA Astrophysics Data System (ADS)
Zhu, Jun
Ru and Pt are candidate additional components for improving the high temperature properties of Ni-base superalloys. A thermodynamic description of the Ni-Al-Cr-Ru-Pt system, serving as an essential knowledge base for better alloy design and processing control, was developed in the present study by means of thermodynamic modeling coupled with experimental investigations of phase equilibria. To deal with the order/disorder transition occurring in Ni-base superalloys, a physically sound model, the Cluster/Site Approximation (CSA), was used to describe the fcc phases. The CSA offers computational advantages, without loss of accuracy, over the Cluster Variation Method (CVM) in the calculation of multicomponent phase diagrams, and it has been successfully applied to fcc phases in calculating technologically important Ni-Al-Cr phase diagrams. Our effort in this study focused on the two key ternary systems Ni-Al-Ru and Ni-Al-Pt. The CSA-calculated Ni-Al-Ru ternary phase diagrams are in good agreement with the experimental results in the literature and from the current study. A thermodynamic description of the quaternary Ni-Al-Cr-Ru system was obtained based on the descriptions of the lower-order systems, and the calculated results agree with experimental data available in the literature and in the current study. The Ni-Al-Pt system was thermodynamically modeled based on the limited experimental data available in the literature and obtained from the current study. With the help of the preliminary description, a number of alloy compositions were selected for further investigation, and the information obtained was used to improve the current modeling. A thermodynamic description of the Ni-Al-Cr-Pt quaternary was then obtained via extrapolation from its constituent lower-order systems. The thermodynamic description for Ni-base superalloys containing Al, Cr, Ru and Pt was obtained via extrapolation. It is believed to be reliable and useful to guide alloy design and further experimental investigation.
Alar-columellar and lateral nostril changes following tongue-in-groove rhinoplasty.
Shah, Ajul; Pfaff, Miles; Kinsman, Gianna; Steinbacher, Derek M
2015-04-01
Repositioning the medial crura cephalically onto the caudal septum (tongue-in-groove; TIG) allows alteration of the columella, ala, and nasal tip to address alar-columellar disproportion as seen from the lateral view. To date, quantitative analysis of nostril dimension, alar-columellar relationship, and nasal tip changes following the TIG rhinoplasty technique has not been described. The present study aims to evaluate post-operative lateral morphometric changes following TIG. Pre- and post-operative lateral views of a series of consecutive patients who underwent TIG rhinoplasty were produced from 3D images at multiple time points (≤2 weeks, 4-10 weeks, and >10 weeks post-operatively) for analysis. The 3D images were converted to 2D and set to scale. Exposed lateral nostril area, alar-columellar disproportion (divided into superior and inferior heights), nasolabial angle, nostril height, and nostril length were calculated and statistically analyzed using a pairwise t test. A P ≤ 0.05 was considered statistically significant. Ninety-four lateral views were analyzed from 20 patients (16 females; median age: 31.8). One patient had a history of current tobacco cigarette use. Lateral nostril area decreased significantly at all post-operative time points. Alar-columellar disproportion was reduced following TIG at all time points. The nasolabial angle increased significantly at ≤2 weeks, 4-10 weeks, and >10 weeks post-operatively. Nostril height and nostril length decreased at all post-operative time points. Morphometric analysis reveals a reduction in alar-columellar disproportion and lateral nostril show following TIG rhinoplasty. Tip rotation, as a function of nasolabial angle, also increased. These results provide quantitative substantiation for qualitative descriptions attributed to the TIG technique. Future studies will focus on area and volumetric measurements, and assessment of long-term stability.
This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.
USDA-ARS?s Scientific Manuscript database
The introduction to the second edition of the Compendium of Apple and Pear Diseases contains a general description of genus and species of commercial importance, some general information about growth and fruiting habits as well as recent production statistics. A general description of major scion c...
General Description of Fission Observables: GEF Model Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmidt, K.-H.; Jurado, B., E-mail: jurado@cenbg.in2p3.fr; Amouroux, C.
2016-01-15
The GEF (“GEneral description of Fission observables”) model code is documented. It describes the observables for spontaneous fission, neutron-induced fission and, more generally, for fission of a compound nucleus from any other entrance channel, with given excitation energy and angular momentum. The GEF model is applicable for a wide range of isotopes from Z = 80 to Z = 112 and beyond, up to excitation energies of about 100 MeV. The results of the GEF model are compared with fission barriers, fission probabilities, fission-fragment mass- and nuclide distributions, isomeric ratios, total kinetic energies, and prompt-neutron and prompt-gamma yields and energy spectra from neutron-induced and spontaneous fission. Derived properties of delayed neutrons and decay heat are also considered. The GEF model is based on a general approach to nuclear fission that explains a great part of the complex appearance of fission observables on the basis of fundamental laws of physics and general properties of microscopic systems and mathematical objects. The topographic theorem is used to estimate the fission-barrier heights from theoretical macroscopic saddle-point and ground-state masses and experimental ground-state masses. Motivated by the theoretically predicted early localisation of nucleonic wave functions in a necked-in shape, the properties of the relevant fragment shells are extracted. These are used to determine the depths and the widths of the fission valleys corresponding to the different fission channels and to describe the fission-fragment distributions and deformations at scission by a statistical approach. A modified composite nuclear-level-density formula is proposed. It respects some features in the superfluid regime that are in accordance with new experimental findings and with theoretical expectations.
These are a constant-temperature behaviour that is consistent with a considerably increased heat capacity, and an increased pairing condensation energy that is consistent with the collective enhancement of the level density. The exchange of excitation energy and nucleons between the nascent fragments on the way from saddle to scission is estimated according to statistical mechanics. As a result, excitation energy and unpaired nucleons are predominantly transferred to the heavy fragment in the superfluid regime. This description reproduces some rather peculiar observed features of the prompt-neutron multiplicities and of the even-odd effect in fission-fragment Z distributions. For completeness, some conventional descriptions are used for calculating pre-equilibrium emission, fission probabilities and statistical emission of neutrons and gamma radiation from the excited fragments. Preference is given to simple models that can also be applied to exotic nuclei, compared to more sophisticated models that need precise empirical input of nuclear properties, e.g. spectroscopic information. The approach reveals a high degree of regularity and provides considerable insight into the physics of the fission process. Fission observables can be calculated with a precision that complies with the needs for applications in nuclear technology without specific adjustments to measured data of individual systems. The GEF executable runs out of the box with no need for entering any empirical data. This unique feature is of valuable importance, because the number of systems and energies of potential significance for fundamental and applied science will never be possible to measure in full. The relevance of the approach for examining the consistency of experimental results and for evaluating nuclear data is demonstrated.
Laharz_py: GIS tools for automated mapping of lahar inundation hazard zones
Schilling, Steve P.
2014-01-01
Laharz_py is written in the Python programming language as a suite of tools for use in the ArcMap Geographic Information System (GIS). Primarily, Laharz_py is a computational model that uses statistical descriptions of areas inundated by past mass-flow events to forecast areas likely to be inundated by hypothetical future events. The forecasts use physically motivated and statistically calibrated power-law equations, each of the form A = cV^(2/3), relating mass-flow volume (V) to the planimetric or cross-sectional area (A) inundated by an average flow as it descends a given drainage. Calibration of the equations utilizes logarithmic transformation and linear regression to determine the best-fit values of c. The software uses values of V, an algorithm for identifying mass-flow source locations, and digital elevation models of topography to portray forecast hazard zones for lahars, debris flows, or rock avalanches on maps. Laharz_py offers two methods to construct areas of potential inundation for lahars: (1) selection of a range of plausible V values results in a set of nested hazard zones showing areas likely to be inundated by a range of hypothetical flows; and (2) the user selects a single volume and a confidence interval for the prediction. In either case, Laharz_py calculates the mean expected A and B values from each user-selected value of V; for the second case, a single value of V also yields two additional results representing the upper and lower values of the confidence interval of prediction. Calculation of these two bounding predictions requires the statistically calibrated prediction equations, a user-specified level of confidence, and t-distribution statistics to calculate the standard error of regression, standard error of the mean, and standard error of prediction. The portrayal of results from these two methods on maps compares the range of inundation areas due to prediction uncertainties with uncertainties in the selection of V values.
The Open-File Report document contains an explanation of how to install and use the software. The Laharz_py software includes an example data set for Mount Rainier, Washington. The second part of the documentation describes how to use all of the Laharz_py tools in an example dataset at Mount Rainier, Washington.
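The log-transform calibration of A = cV^(2/3) described in this entry can be sketched as follows. With the exponent fixed at 2/3, taking log10 of both sides gives log10 A = log10 c + (2/3) log10 V, so the least-squares estimate of log10 c is simply the mean residual. The volume-area pairs below are invented illustrative values, not Laharz_py calibration data.

```python
import math

# Hypothetical calibration set: mass-flow volumes (m^3) and the
# corresponding inundated areas (m^2) from past events.
V = [1e5, 1e6, 1e7]
A = [1.0e5, 4.8e5, 2.1e6]

# log10 A = log10 c + (2/3) log10 V, so with the exponent fixed the
# best-fit log10 c is the mean of the residuals.
log_c = sum(math.log10(a) - (2/3) * math.log10(v) for v, a in zip(V, A)) / len(V)
c = 10 ** log_c

def predicted_area(volume):
    """Mean expected inundated area for a hypothetical flow volume."""
    return c * volume ** (2/3)

print(c, predicted_area(5e6))
```

In the full model, the standard error of this regression feeds the t-distribution machinery that produces the upper and lower bounding predictions mentioned above.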
[Smoking impact on mortality in Spain in 2012].
Gutiérrez-Abejón, Eduardo; Rejas-Gutiérrez, Javier; Criado-Espegel, Paloma; Campo-Ortega, Eva P; Breñas-Villalón, María T; Martín-Sobrino, Nieves
2015-12-21
Smoking is an important public health problem and one of the main avoidable causes of morbidity and early mortality. The aim was to estimate the mortality attributable to smoking and its impact on premature mortality in Spain in the year 2012. Descriptive, cross-sectional study carried out on the Spanish population aged ≥ 18 years in 2012. The prevalence of smoking by age and sex was obtained from the National Health Survey 2011-2012, and the number of deaths by age, sex and cause was obtained from the vital statistics of the National Institute of Statistics. The proportion of deaths attributable to smoking was calculated according to sex and age group from the etiological fraction of the population. Likewise, the potential years of life lost (PYLL) and the mean potential years of life lost (MPYLL) were also calculated. In 2012, smoking caused 60,456 deaths, which accounted for 15.23% of all deaths. Trachea-bronchial-lung cancer in men and other cardiopathies in women contributed most to this mortality. The PYLL were 184,426, and the MPYLL were 3.25 years in men and 2.42 years in women. In 2012, every day, 125 men and 40 women died from smoking-related conditions. The smoking prevalence has diminished in comparison with previous years, but the number and percentage of deaths attributable to smoking have increased in the last 20 years. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.
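The attributable-mortality calculation this abstract describes rests on the population etiological (attributable) fraction, which combines exposure prevalence with relative risk. A minimal sketch, using the standard one-exposure-level formula PAF = p(RR − 1) / (1 + p(RR − 1)); the prevalence, relative risk, and death count below are invented examples, not the Spanish 2012 figures.

```python
def etiological_fraction(prevalence, relative_risk):
    """Population attributable fraction for a single exposure level."""
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

def attributable_deaths(deaths, prevalence, relative_risk):
    """Deaths in a sex/age/cause stratum attributable to the exposure."""
    return deaths * etiological_fraction(prevalence, relative_risk)

# Hypothetical stratum: 30% smoking prevalence, RR = 2.5 for death from
# the cause in question, 10,000 observed deaths.
paf = etiological_fraction(0.30, 2.5)
print(paf, attributable_deaths(10_000, 0.30, 2.5))
```

Summing the stratum-level attributable deaths across sexes, age groups, and causes gives the overall figure; PYLL then weights each attributable death by the years remaining to a reference age.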
Estimation of true height: a study in population-specific methods among young South African adults.
Lahner, Christen Renée; Kassier, Susanna Maria; Veldman, Frederick Johannes
2017-02-01
To investigate the accuracy of arm-associated height estimation methods in the calculation of true height compared with stretch stature in a sample of young South African adults. A cross-sectional descriptive design was employed. Pietermaritzburg, Westville and Durban, KwaZulu-Natal, South Africa, 2015. Convenience sample (N = 900) aged 18-24 years, which included an equal number of participants from both genders (150 per gender within each race group) stratified across race (Caucasian, Black African and Indian). Continuous variables that were investigated included: (i) stretch stature; (ii) total armspan; (iii) half-armspan; (iv) half-armspan ×2; (v) demi-span; (vi) demi-span gender-specific equation; (vii) WHO equation; and (viii) WHO-adjusted equations; as well as categorization according to gender and race. Statistical analysis was conducted using IBM SPSS Statistics Version 21.0. Significant correlations were identified between gender and height estimation measurements, with males being anatomically larger than females (P<0·001). Significant differences were documented when study participants were stratified according to race and gender (P<0·001). Anatomical similarities were noted between Indians and Black Africans, whereas Caucasians were anatomically different from the other race groups. Arm-associated height estimation methods were able to estimate true height; however, each method was specific to each gender and race group. Height can be calculated by using arm-associated measurements. Although universal equations for estimating true height exist, the use of equations that are race-, gender- and population-specific should be considered to enhance accuracy.
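One of the methods listed above, "half-armspan ×2", can be sketched as a simple comparison against measured stretch stature. The measurements below are invented illustrative values, not the study's data, and the mean bias shown is only one of several agreement statistics one might compute.

```python
from statistics import mean

# Hypothetical paired measurements (cm) for four participants.
stretch_stature = [170.2, 158.4, 181.0, 165.5]   # measured true height
half_armspan    = [85.5, 78.9, 91.2, 82.4]

# Estimate height as twice the half-armspan, then compute the mean bias
# (estimated minus measured) to gauge the method's accuracy.
estimated = [2 * h for h in half_armspan]
bias = mean(e - s for e, s in zip(estimated, stretch_stature))
print(estimated, bias)
```

Computing this bias separately within each gender and race stratum is what reveals the group-specific accuracy the study reports.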
2011 aerospace medical certification statistical handbook.
DOT National Transportation Integrated Search
2013-01-01
The annual Aerospace Medical Certification Statistical Handbook reports descriptive characteristics of all active U.S. civil aviation airmen and the aviation medical examiners (AMEs) that perform the required medical examinations. The 2011 annual han...
DOT National Transportation Integrated Search
2007-02-01
This annual edition of Large Truck Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks in 2005. Selected crash statistics on passenger vehicles are also presented for comparison pur...
Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio
2013-03-01
To evaluate the quality of data reporting and statistical methods performed in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of the drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a score of <6. Our findings document that only a few of the studies reviewed applied statistical methods and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.
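The 12-item quality score described in this abstract is, in essence, a count of satisfied checklist items. A minimal sketch; the item names below are invented stand-ins for the checklist's actual items, and only six of the twelve are shown.

```python
# Hypothetical checklist for one reviewed study: True if the item is
# satisfied. Summing booleans counts the satisfied items.
checklist = {
    "reports an outcome/exposure measure": True,
    "reports covariates of the measures": True,
    "reports a descriptive measure": True,
    "reports standard deviation alongside the mean": False,
    "applies a statistical test": True,
    "includes graphical representation": False,
    # ...six further items would complete the 12-item checklist
}
score = sum(checklist.values())
print(score)
```

Scores aggregated this way across the 22 reviewed studies give the mean of 8 and median of 9 reported above.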
2015-03-26
...to my reader, Lieutenant Colonel Robert Overstreet, for helping solidify my research, coaching me through the statistical analysis, and positive... Descriptive Statistics... common-method bias requires careful assessment of potential sources of bias and implementing procedural and statistical control methods. Podsakoff
Using Facebook Data to Turn Introductory Statistics Students into Consultants
ERIC Educational Resources Information Center
Childers, Adam F.
2017-01-01
Facebook provides businesses and organizations with copious data that describe how users are interacting with their page. This data affords an excellent opportunity to turn introductory statistics students into consultants to analyze the Facebook data using descriptive and inferential statistics. This paper details a semester-long project that…
ALISE Library and Information Science Education Statistical Report, 1999.
ERIC Educational Resources Information Center
Daniel, Evelyn H., Ed.; Saye, Jerry D., Ed.
This volume is the twentieth annual statistical report on library and information science (LIS) education published by the Association for Library and Information Science Education (ALISE). Its purpose is to compile, analyze, interpret, and report statistical (and other descriptive) information about library/information science programs offered by…
User embracement with risk classification in an emergency care unit: an evaluative study.
Hermida, Patrícia Madalena Vieira; Nascimento, Eliane Regina Pereira do; Echevarría-Guanilo, Maria Elena; Brüggemann, Odaléa Maria; Malfussi, Luciana Bihain Hagemann de
2018-01-01
Objective: To describe the evaluation of the Structure, Process and Outcome of User Embracement with Risk Classification in an Emergency Care Unit from the perspective of physicians and nurses. Method: An evaluative, descriptive, quantitative study developed in Santa Catarina. Data were collected using a validated and adapted instrument consisting of 21 items distributed across the dimensions of Structure (facilities), Process (activities and relationships in providing care) and Outcome (care effects). In the analysis, descriptive statistics and Mean Ranking and Mean Score calculations were applied. Results: The sample consisted of 37 participants. Of the 21 evaluated items, 11 (52.4%) had a Mean Ranking between 3 and 4, and none reached the maximum ranking (5 points). "Prioritization of severe cases" and "Primary care according to the severity of the case" reached the highest Mean Ranking (4.5), while "Flowchart discussion" had the lowest (2.1). The dimensions of Structure, Process and Outcome reached mean scores of 23.9, 21.9 and 25.5, respectively, indicating a Precarious evaluation (17.5 to 26.1 points). Conclusion: User Embracement with Risk Classification is precarious, especially regarding the Process dimension, which obtained the lowest satisfaction level from the participants.
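The Mean Ranking used in the study above is the average of the participants' 1-5 ratings for each item. A minimal sketch, with hypothetical responses attached to two item names taken from the abstract:

```python
from statistics import mean

# Hypothetical 1-5 Likert responses for two evaluation items (illustrative)
responses = {
    "Prioritization of severe cases": [5, 4, 5, 4, 5],
    "Flowchart discussion": [2, 2, 3, 1, 2],
}

# Mean Ranking per item: average of the ratings given by participants
mean_ranking = {item: mean(scores) for item, scores in responses.items()}

for item, mr in sorted(mean_ranking.items(), key=lambda kv: -kv[1]):
    print(f"{item}: {mr:.1f}")
```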
Perceived deprivation in active duty military nurse anesthetists.
Pearson, Julie A; Fallacaro, Michael D; Pellegrini, Joseph E
2009-02-01
There is a shortage of military Certified Registered Nurse Anesthetists (CRNAs). Relative deprivation is a perception of unfairness due to discrepancies between what one has and what one could or should have that is dependent on feelings (subjective data) and facts (objective data). Feelings of relative deprivation could contribute to the military CRNA shortage. The purposes of this study were to measure relative deprivation in active-duty military CRNAs and explore variables that correlate with relative deprivation. The descriptive, correlational study was conducted using a self-administered survey sent to 435 active-duty Army, Navy, and Air Force CRNAs. Surveys were distributed to subjects by mail and could be answered by mail or by secured website. Data were analyzed using descriptive and inferential statistics. Analysis of the data revealed a calculated response rate of 57.7%. There was no significant correlation (P < .05) between years as a CRNA, military pay, promotion opportunity, or scope of practice/autonomy and relative deprivation. Correlations of the psychological factors "wanting" and "deserving" with relative deprivation were significant (P < .001). Further research is indicated to identify definitive factors that can be modified to improve feelings of deprivation as they relate to retention and recruitment of military CRNAs.
McPherson, Amy C; Leo, Jennifer; Church, Paige; Lyons, Julia; Chen, Lorry; Swift, Judy
2014-01-01
Childhood obesity is a global health concern, but children with spina bifida in particular have unique interacting risk factors for increased weight. To identify and explore current clinical practices around weight assessment and management in pediatric spina bifida clinics, an online, self-report survey of healthcare professionals (HCPs) was conducted in all pediatric spina bifida clinics across Canada (15 clinics). Summary and descriptive statistics were calculated and descriptive thematic analysis was performed on free-text responses. Fifty-two responses across all 15 clinics indicated that weight and height were assessed and recorded most of the time using a wide variety of methods, although some HCPs questioned their suitability for children with spina bifida. Weight and height information was not routinely communicated to patients and their families, and HCPs identified considerable barriers to discussing weight-related information in consultations. Despite weight and height reportedly being measured regularly, HCPs expressed concern over the lack of appropriate assessment and classification tools. Communication across multi-disciplinary team members is required to ensure that children with weight-related issues are not inadvertently overlooked. Specific skill training around weight-related issues and optimizing consultation time should be explored further for HCPs working with this population.
Occupational accidents among mototaxi drivers.
Amorim, Camila Rego; de Araújo, Edna Maria; de Araújo, Tânia Maria; de Oliveira, Nelson Fernandes
2012-03-01
The use of motorcycles as a means of work has contributed to the increase in traffic accidents, in particular mototaxi accidents. The aim of this study was to estimate and characterize the incidence of occupational accidents among the mototaxi drivers registered in Feira de Santana, BA. This is a cross-sectional study with descriptive and census data. Of the 300 professionals registered at the Municipal Transportation Service, 267 were interviewed through a structured questionnaire. A descriptive analysis was then conducted and the incidence of accidents was estimated based on the variables studied. Relative risks were calculated and statistical significance was determined using the chi-square test and Fisher's exact test, considering p < 0.05. Logistic regression was used to perform simultaneous adjustment of variables. Occupational accidents were reported by 10.5% of mototaxi drivers. Injuries were mainly minor (48.7%), with 27% requiring leaves of absence from work. There was an association between days of work per week, fatigue in the lower limbs, musculoskeletal complaints, and accidents. Knowledge of the working conditions and accidents involved in this activity can be of great importance for the adoption of traffic education policies, and can help prevent accidents by improving the working conditions and lives of these professionals.
Spatial Distribution of Soil Fauna In Long Term No Tillage
NASA Astrophysics Data System (ADS)
Corbo, J. Z. F.; Vieira, S. R.; Siqueira, G. M.
2012-04-01
The soil is a complex system constituted by living beings and organic and mineral particles, whose components define its physical, chemical and biological properties. Soil fauna plays an important role in soil and may reflect and interfere with its functionality. These organisms' populations may be influenced by management practices, fertilization, liming and porosity, among other factors. Such changes may reduce the composition and distribution of the soil fauna community. Thus, this study aimed to determine the spatial variability of soil fauna in a consolidated no-tillage system. The experimental area is located at Instituto Agronômico in Campinas (São Paulo, Brazil). The sampling was conducted in a Rhodic Eutrudox under a no-tillage system, and 302 points distributed over a 3.2-hectare area in a regular grid of 10.00 m x 10.00 m were sampled. The soil fauna was sampled with the "pitfall trap" method, and traps remained in the area for seven days. Data were analyzed using descriptive statistics to determine the main statistical moments (mean, variance, coefficient of variation, standard deviation, skewness and kurtosis). Geostatistical tools were used to determine the spatial variability of the attributes using the experimental semivariogram. For the biodiversity analysis, the Shannon and Pielou indexes and richness were calculated for each sample. Geostatistics proved to be a great tool for mapping the spatial variability of groups from the soil epigeal fauna. The family Formicidae proved to be the most abundant and dominant in the study area. The parameters of descriptive statistics showed that all studied attributes had a lognormal frequency distribution for groups from the epigeal soil fauna. The exponential model was the best suited to the obtained data, both for groups of epigeal soil fauna (Acari, Araneae, Coleoptera, Formicidae and Coleoptera larva) and for the other biodiversity indexes.
The sampling scheme (10.00 m x 10.00 m) was not sufficient to detect the spatial variability for all groups of soil epigeal fauna found in this study.
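The Shannon and Pielou indexes mentioned above can be computed directly from the per-taxon counts of a single pitfall-trap sample. A minimal sketch with hypothetical catch counts (the taxa names follow the abstract; the numbers are illustrative):

```python
import math

# Hypothetical pitfall-trap catch for one sample point (counts per taxon)
catch = {"Formicidae": 120, "Acari": 30, "Araneae": 25,
         "Coleoptera": 15, "Coleoptera larva": 10}

n = sum(catch.values())
proportions = [c / n for c in catch.values()]

# Shannon diversity: H' = -sum(p_i * ln p_i)
shannon = -sum(p * math.log(p) for p in proportions)

# Pielou evenness: J' = H' / ln(S), where S is the taxon richness
richness = len(catch)
pielou = shannon / math.log(richness)

print(f"H' = {shannon:.3f}, richness = {richness}, J' = {pielou:.3f}")
```

An evenness near 1 would indicate taxa in similar abundance; here the dominance of Formicidae, as reported in the study, pulls J' well below 1.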
Xu, Yiling; Oh, Heesoo; Lagravère, Manuel O
2017-09-01
The purpose of this study was to locate traditionally-used landmarks in two-dimensional (2D) images and newly-suggested ones in three-dimensional (3D) images (cone-beam computed tomographies [CBCTs]) and determine possible relationships between them to categorize patients with Class II-1 malocclusion. CBCTs from 30 patients diagnosed with Class II-1 malocclusion were obtained from the University of Alberta Graduate Orthodontic Program database. The reconstructed images were downloaded and visualized using the software platform AVIZO®. Forty-two landmarks were chosen, and their coordinates were then obtained and analyzed using linear and angular measurements. Ten images were analyzed three times to determine the reliability and measurement error of each landmark using the Intra-Class Correlation coefficient (ICC). Descriptive statistics were computed using the SPSS statistical package to determine any relationships. ICC values were excellent for all landmarks in all axes, with the highest measurement error of 2 mm in the y-axis for the Gonion Left landmark. Linear and angular measurements were calculated using the coordinates of each landmark. Descriptive statistics showed that the linear and angular measurements used in the 2D images did not correlate well with the 3D images. The lowest standard deviation obtained was 0.6709 for S-GoR/N-Me, with a mean of 0.8016. The highest standard deviation was 20.20704 for ANS-InfraL, with a mean of 41.006. The traditional landmarks used for 2D malocclusion analysis show good reliability when transferred to 3D images. However, they did not reveal specific skeletal or dental patterns when analyzing 3D images for malocclusion. Thus, another technique should be considered when classifying 3D CBCT images for Class II-1 malocclusion. Copyright © 2017 CEO. Published by Elsevier Masson SAS. All rights reserved.
Readability of patient information pamphlets in urogynecology.
Reagan, Krista M L; O'Sullivan, David M; Harvey, David P; Lasala, Christine A
2015-01-01
The purpose of this study was to determine the reading level of frequently used patient information pamphlets and documents in the field of urogynecology. Urogynecology pamphlets were identified from a variety of sources. Readability was determined using 4 different accepted formulas: the Flesch-Kincaid Grade Level, the Simple Measure of Gobbledygook (SMOG) Index, the Coleman-Liau Index, and the Gunning Fog Index. The scores were calculated using an online calculator (http://www.readability-score.com). Descriptive statistics were used for analysis. The average of the 4 scores was calculated for each pamphlet. Subsequently, Z-scores were used to standardize the averages between the reading scales. Of the 40 documents reviewed, only a single pamphlet met the National Institutes of Health-recommended reading level. This document was developed by the American Urological Association and was specifically designated as a "Low-Literacy Brochure." The remainder of the patient education pamphlets, from both industry-sponsored and academic-sponsored sources, consistently rated above the recommended reading level for maximum comprehension. In summary, the majority of patient education pamphlets, from both industry-sponsored and academic-sponsored sources, are above the reading level recommended by the National Institutes of Health for maximum patient comprehension. Future work should be done to improve the educational resources available to patients by simplifying the verbiage in these documents.
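The readability formulas above reduce to simple arithmetic on word, sentence, and syllable counts. A minimal sketch, with hypothetical counts, of the Flesch-Kincaid Grade Level together with the Z-score standardization the authors used to compare averages across scales:

```python
from statistics import mean, stdev

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid Grade Level from raw text counts."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# Hypothetical counts for one pamphlet (illustrative)
grade = flesch_kincaid_grade(words=500, sentences=25, syllables=800)

# Z-score standardization: puts averages from different scales on a
# common footing (mean 0, SD 1) so pamphlets can be compared
scores = [9.8, 11.2, 12.5, 10.1, 13.0]   # hypothetical pamphlet averages
m, s = mean(scores), stdev(scores)
z = [(x - m) / s for x in scores]

print(f"grade = {grade:.2f}, z-scores = {[round(v, 2) for v in z]}")
```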
Statistical Tutorial | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. ST is designed as a follow up to Statistical Analysis of Research Data (SARD) held in April 2018. The tutorial will apply the general principles of statistical analysis of research data including descriptive statistics, z- and t-tests of means and mean differences, simple and multiple linear regression, ANOVA tests, and Chi-Squared distribution.
50 CFR Figure 1 to Part 679 - Bering Sea and Aleutian Islands Statistical and Reporting Areas
Code of Federal Regulations, 2011 CFR
2011-10-01
... Statistical and Reporting Areas 1 Figure 1 to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND... Islands Statistical and Reporting Areas ER15NO99.000 b. Coordinates Code Description 300 Russian waters... statistical area is the part of a reporting area contained in the EEZ. [64 FR 61983, Nov. 15, 1999; 65 FR...
50 CFR Figure 1 to Part 679 - Bering Sea and Aleutian Islands Statistical and Reporting Areas
Code of Federal Regulations, 2010 CFR
2010-10-01
... Statistical and Reporting Areas 1 Figure 1 to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND... Islands Statistical and Reporting Areas ER15NO99.000 b. Coordinates Code Description 300 Russian waters... statistical area is the part of a reporting area contained in the EEZ. [64 FR 61983, Nov. 15, 1999; 65 FR...
40 CFR Appendix IV to Part 265 - Tests for Significance
Code of Federal Regulations, 2010 CFR
2010-07-01
... introductory statistics texts. ... student's t-test involves calculation of the value of a t-statistic for each comparison of the mean... parameter with its initial background concentration or value. The calculated value of the t-statistic must...
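The t-statistic described in the regulation compares the sample mean of a monitored parameter with its initial background value. A minimal one-sample sketch with hypothetical concentration data (the monitoring context and numbers are illustrative, not from the CFR):

```python
from math import sqrt
from statistics import mean, stdev

def t_statistic(samples: list[float], background: float) -> float:
    """One-sample Student's t comparing a monitored parameter's
    sample mean with its initial background value."""
    n = len(samples)
    return (mean(samples) - background) / (stdev(samples) / sqrt(n))

# Hypothetical downgradient concentration measurements (mg/L)
obs = [5.1, 5.4, 4.9, 5.6, 5.2]
t = t_statistic(obs, background=4.8)
print(f"t = {t:.2f} with {len(obs) - 1} degrees of freedom")
```

The calculated t would then be compared against a tabulated critical value at the chosen significance level to decide whether the increase over background is significant.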
Gorgich, Enam Alhagh Charkhat; Barfroshan, Sanam; Ghoreishi, Gholamreza; Yaghoobi, Maryam
2016-01-01
Introduction and Aim: Medication errors are a serious problem worldwide and among the most common medical errors; they threaten patient safety and may even lead to death. The purpose of this study was to investigate the causes of medication errors and strategies for preventing them from the viewpoint of nurses and nursing students. Materials & Methods: This cross-sectional descriptive study was conducted on 327 nursing staff of Khatam-al-Anbia hospital and 62 intern nursing students in the nursing and midwifery school of Zahedan, Iran, enrolled through availability sampling in 2015. The data were collected with a valid and reliable questionnaire. To analyze the data, descriptive statistics, t-tests, and ANOVA were applied using SPSS 16 software. Findings: The results showed that the most common cause of medication errors among nurses was tiredness due to increased workload (97.8%), and among nursing students was drug calculation (77.4%). The most important preventive measure, in the opinion of nurses and nursing students, was reducing work pressure by increasing personnel in proportion to the number and condition of patients, and creating a dedicated medication-calculation unit. There was also a significant relationship between the type of ward and the mean number of medication errors in the two groups. Conclusion: Based on the results, it is recommended that nurse managers resolve the human resources problem and provide workshops and in-service education about preparing medications, side effects of drugs, and pharmacological knowledge. Using electronic medication cards is a measure that reduces medication errors. PMID:27045413
The Performance of Preparatory School Candidates at the United States Naval Academy
2001-09-01
1. Differences in Characteristics (p. 79); 2. Differences in... Coefficients (p. 42); Table 3.3 Applicant/Midshipman Background Characteristics (p. 45); Table 3.4 Descriptive Characteristics for Midshipmen by Accession Source (p. 46); Table 3.5 Descriptive Statistics for
2012-12-01
states.2 The research activities in this area are focused on the mathematical description of the dynamic behaviour within the hierarchy of organisations... Aviation, Infantry, Medical, Catering, Armoured, Artillery, Transport, Signals, and Ordnance). The corps structure essentially maps to that of the training... calculation of the casualty levels for personnel from a hypothetical force composed of infantry and armour. Table 2: Calculated Monthly Casualty Rates for
NASA Technical Reports Server (NTRS)
Middleton, W. D.; Lundry, J. L.
1975-01-01
An integrated system of computer programs has been developed for the design and analysis of supersonic configurations. The system uses linearized theory methods for the calculation of surface pressures and supersonic area rule concepts in combination with linearized theory for calculation of aerodynamic force coefficients. Interactive graphics are optional at the user's request. This part presents a general description of the system and describes the theoretical methods used.
Statistical process control: separating signal from noise in emergency department operations.
Pimentel, Laura; Barrueto, Fermin
2015-05-01
Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
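The control limits described above, calculated from the variation of data points from one another and from the mean, can be sketched for an individuals/moving-range (I-MR) chart. The constant 2.66 is the standard I-MR chart factor; the ED wait-time data are hypothetical:

```python
from statistics import mean

def imr_limits(data: list[float]) -> tuple[float, float, float]:
    """Center line and control limits for an individuals chart.
    LCL/UCL = xbar -/+ 2.66 * MRbar, where MRbar is the average
    moving range between consecutive observations."""
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    xbar = mean(data)
    mrbar = mean(moving_ranges)
    return xbar - 2.66 * mrbar, xbar, xbar + 2.66 * mrbar

# Hypothetical daily ED door-to-provider times in minutes (illustrative)
times = [32, 28, 35, 30, 41, 33, 29, 36]
lcl, center, ucl = imr_limits(times)
print(f"LCL = {lcl:.2f}, center = {center:.2f}, UCL = {ucl:.2f}")
```

A point outside the limits (or a non-random run within them) would signal special cause variation requiring managerial action; points scattered randomly within the limits reflect common cause variation, i.e. a stable process.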
Trends in motor vehicle traffic collision statistics, 1988-1997
DOT National Transportation Integrated Search
2001-02-01
This report presents descriptive statistics about Canadian traffic collisions during the ten-year period : from 1988 to 1997, focusing specifically on casualty collisions. Casualty collisions are defined as all : reportable motor vehicle crashes resu...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-23
... in Tables A and B. Table D--Borrower Closing Costs and Seller Concessions Descriptive Statistics by... accuracy of the statistical data illustrating the correlation between higher seller concessions and an...
42 CFR 402.7 - Notice of proposed determination.
Code of Federal Regulations, 2010 CFR
2010-10-01
... and a brief description of the statistical sampling technique CMS or OIG used. (3) The reason why the... is relying upon statistical sampling to project the number and types of claims or requests for...
Organizational Commitment DEOCS 4.1 Construct Validity Summary
2017-08-01
commitment construct that targets more specifically the workgroup frame of reference. Included is a review of the 4.0 description and items... followed by the proposed modifications to the factor. The DEOCS 4.0 description provided for organizational commitment is "members' dedication to the... (5) examining variance and descriptive statistics, and (6) selecting items that demonstrate the strongest scale properties. Table 1. DEOCS 4.0
ERIC Educational Resources Information Center
McClain, Robert L.; Wright, John C.
2014-01-01
A description of shot noise and the role it plays in absorption and emission measurements using photodiode and photomultiplier tube detection systems is presented. This description includes derivations of useful forms of the shot noise equation based on Poisson counting statistics. This approach can deepen student understanding of a fundamental…
A retrospective, descriptive study of shoulder outcomes in outpatient physical therapy.
Millar, A Lynn; Lasheway, Philip A; Eaton, Wendy; Christensen, Frances
2006-06-01
A retrospective, descriptive study of clients with shoulder dysfunction referred to physical therapy. To (1) describe the clinical and functional outcomes of clients with shoulder dysfunction following outpatient physical therapy, and (2) to compare the outcomes by type of shoulder dysfunction. Although individuals with shoulder dysfunction are commonly referred to physical therapy few large descriptive studies regarding outcomes following physical therapy are available. Data for 878 clients (468 female, 410 male) were retrieved and analyzed. This database was developed between 1997 and 2000 and included 4 outpatient facilities from 1 healthcare system in the southwest corner of Michigan. Clients were classified by type of shoulder dysfunction, and standardized tests were performed upon admittance and discharge to physical therapy. Descriptive and inferential statistics were calculated for all data. Of all clients, 55.1% had shoulder impingement, while 18.3% had postoperative repair, 8.9% had a frozen shoulder, 7.6% had a rotator cuff tear, 3.0% had shoulder instability, 2.1% were post fracture, and the remaining 4.9% had miscellaneous diagnoses. The average (+/-SD) age of the patients was 53.6 +/- 16.4 years, with an average (+/-SD) number of treatment sessions of 13.7 +/- 11.0. All groups showed significant changes following physical therapy intervention. Clients with diverse types of shoulder dysfunction demonstrated improvement in both clinical and functional measures at the conclusion of physical therapy, although it is not possible to determine whether these changes were due to the interventions or due to time. The type of shoulder dysfunction appears to affect the prognosis, thus expected outcomes should be based upon initial diagnosis and specific measures.
Properties of Deflagration Fronts and Models for Type IA Supernovae
NASA Astrophysics Data System (ADS)
Domínguez, I.; Höflich, P.
2000-01-01
Detailed models of the explosion of a white dwarf that include self-consistent calculations of the light curve and spectra provide a link between observational quantities and the underlying explosion model. These calculations assume spherical geometry and are based on parameterized descriptions of the burning front. Recently, the first multidimensional calculations for nuclear burning fronts have been performed. Although a fully consistent treatment of the burning fronts is beyond the current state of the art, these calculations provide a new and better understanding of the physics. Several new descriptions for flame propagation have been proposed by Khokhlov et al. and Niemeyer et al. Using various descriptions for the propagation of a nuclear deflagration front, we have studied the influence on the results of previous analyses of Type Ia supernovae, namely, the nucleosynthesis and structure of the expanding envelope. Our calculations are based on a set of delayed detonation models with parameters that give a good account of the optical and infrared light curves and of the spectral evolution. In this scenario, the burning front first propagates in a deflagration mode and subsequently turns into a detonation. The explosions and light curves are calculated using a one-dimensional Lagrangian radiation-hydro code including a detailed nuclear network. We find that the results of the explosion are rather insensitive to details of the description of the deflagration front, even if its speed and the time from the transition to detonation differ almost by a factor of 2. For a given white dwarf (WD) and a fixed transition density, the total production of elements changes by less than 10%, and the distribution in the velocity space changes by less than 7%. 
Qualitatively, this insensitivity of the final outcome of the explosion to the details of the flame propagation during the (slow) deflagration phase can be understood as follows: for plausible variations in the speed of the turbulent deflagration, the duration of this phase is several times longer than the sound crossing time in the initial WD. Therefore, the energy produced during the early nuclear burning can be redistributed over the entire WD, causing a slow preexpansion. In this intermediate state, the WD is still bound but its binding energy is reduced by the amount of nuclear energy. The expansion ratio depends mainly on the total amount of burning during the deflagration phase. Consequently, the conditions are very similar under which nuclear burning takes place during the subsequent detonation phase. In our example, the density and temperature at the burning front changes by less than 3%, and the expansion velocity changes by less than 10%. The burning conditions are very close to previous calculations which used a constant deflagration velocity. Based on a comparison with observations, those required low deflagration speeds (~2%-3% of the speed of sound). Exceptions to the similarity are the innermost layers of ~0.03-0.05 Msolar. Still, nuclear burning is in nuclear statistical equilibrium, but the rate of electron capture is larger for the new descriptions of the flame propagation. Consequently, the production of very neutron-rich isotopes is increased. In our example, close to the center Ye is about 0.44, compared to 0.46 in the model with constant deflagration speed. This increases the 48Ca production by more than a factor of 100 to 3.E-6 Msolar. Conclusions from previous analyses of light curves and spectra on the properties of the WD and the explosions will not change, and even with the new descriptions, the delayed detonation scenario is consistent with the observations. 
Namely, the conclusions with respect to the central density, the chemical structure of the progenitor, and the transition density from deflagration to detonation do not change. The reason for this similarity is that the total amount of burning during the long deflagration phase determines the restructuring of the WD prior to the detonation. Therefore, we do not expect that the precise microphysical prescription for the speed of a subsonic burning front has a significant effect on the outcome. However, at the current level of uncertainties for the burning front, the relation between properties of the burning front and of the initial white dwarf cannot be obtained from a comparison between observation and theoretical predictions by one-dimensional models. Multidimensional calculations are needed (1) to gain insight into the relations between model parameters such as central density and properties of the deflagration front, and its relation to the transition density between deflagration and detonation, and (2) to make use of the information on asphericity that is provided by polarization measurements. These questions are essential to test, estimate, and predict some of the evolutionary effects of SNe Ia and their use as cosmological yardsticks.
Statistical representation of multiphase flow
NASA Astrophysics Data System (ADS)
Subramaniam
2000-11-01
The relationship between two common statistical representations of multiphase flow, namely the single-point Eulerian statistical representation of two-phase flow (D. A. Drew, Ann. Rev. Fluid Mech. (15), 1983) and the Lagrangian statistical representation of a spray using the droplet distribution function (F. A. Williams, Phys. Fluids 1 (6), 1958), is established for spherical dispersed-phase elements. This relationship is based on recent work which relates the droplet distribution function to single-droplet pdfs starting from a Liouville description of a spray (Subramaniam, Phys. Fluids 10 (12), 2000). The Eulerian representation, which is based on a random-field model of the flow, is shown to contain different statistical information from the Lagrangian representation, which is based on a point-process model. The two descriptions are shown to be simply related for spherical, monodisperse elements in statistically homogeneous two-phase flow, whereas such a simple relationship is precluded by the inclusion of polydispersity and statistical inhomogeneity. The common origin of these two representations is traced to a more fundamental statistical representation of a multiphase flow, whose concepts derive from a theory for dense sprays recently proposed by Edwards (Atomization and Sprays 10 (3-5), 2000). The issue of what constitutes a minimally complete statistical representation of a multiphase flow is resolved.
Liu, Xuan L; Gheno, Thomas; Lindahl, Bonnie B; Lindwall, Greta; Gleeson, Brian; Liu, Zi-Kui
2015-01-01
The phase relations and thermodynamic properties of the condensed Al-Co-Cr ternary alloy system are investigated using first-principles calculations based on density functional theory (DFT) and phase-equilibria experiments that led to X-ray diffraction (XRD) and electron probe micro-analysis (EPMA) measurements. A thermodynamic description is developed by means of the calculations of phase diagrams (CALPHAD) method using experimental and computational data from the present work and the literature. Emphasis is placed on modeling the bcc-A2, B2, fcc-γ, and tetragonal-σ phases in the temperature range of 1173 to 1623 K. Liquid, bcc-A2 and fcc-γ phases are modeled using substitutional solution descriptions. First-principles special quasirandom structures (SQS) calculations predict a large bcc-A2 (disordered)/B2 (ordered) miscibility gap, in agreement with experiments. A partitioning model is then used for the A2/B2 phase to effectively describe the order-disorder transitions. The critically assessed thermodynamic description describes all phase equilibria data well. A2/B2 transitions are also shown to agree well with previous experimental findings.
Long-term strategy for the statistical design of a forest health monitoring system
Hans T. Schreuder; Raymond L. Czaplewski
1993-01-01
A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...
Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.
ERIC Educational Resources Information Center
Ojeda, Mario Miguel; Sahai, Hardeo
2002-01-01
Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…
Computers as an Instrument for Data Analysis. Technical Report No. 11.
ERIC Educational Resources Information Center
Muller, Mervin E.
A review of statistical data analysis involving computers as a multi-dimensional problem provides the perspective for consideration of the use of computers in statistical analysis and the problems associated with large data files. An overall description of STATJOB, a particular system for doing statistical data analysis on a digital computer,…
A Descriptive Study of Individual and Cross-Cultural Differences in Statistics Anxiety
ERIC Educational Resources Information Center
Baloglu, Mustafa; Deniz, M. Engin; Kesici, Sahin
2011-01-01
The present study investigated individual and cross-cultural differences in statistics anxiety among 223 Turkish and 237 American college students. A 2 x 2 between-subjects factorial multivariate analysis of covariance (MANCOVA) was performed on the six dependent variables which are the six subscales of the Statistical Anxiety Rating Scale.…
Children in the UK: Signposts to Statistics.
ERIC Educational Resources Information Center
Grey, Eleanor
This guide indicates statistical sources in the United Kingdom dealing with children and young people. Regular and occasional sources are listed in a three-column format including the name of the source, a brief description, and the geographic area to which statistics refer. Information is classified under 25 topic headings: abortions; accidents;…
An analysis of the relationship of flight hours and naval rotary wing aviation mishaps
2017-03-01
estimates found enough evidence to support indicators used for sequestration; high flight hours, night flight, and overwater flight had statistically significant effects on...
Practicing Statistics by Creating Exercises for Fellow Students
ERIC Educational Resources Information Center
Bebermeier, Sarah; Reiss, Katharina
2016-01-01
This article outlines the execution of a workshop in which students were encouraged to actively review the course contents on descriptive statistics by creating exercises for their fellow students. In a first-year statistics course in psychology, 39 out of 155 students participated in the workshop. In a subsequent evaluation, the workshop was…
Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)
Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal
2016-01-01
Objective: This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of the Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, which were then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods was found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher’s exact tests (n=205, 47.8%) and Student’s t-test (n=186, 43.4%). There was a significant increase in the use of several statistical methods over the time period: t-test, chi-square/Fisher’s exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods has been used in the research articles of PJMS and that their frequency increased from 2005 to 2015. Descriptive statistics was the most frequent method of statistical analysis in the published articles, and the cross-sectional design was the most common study design. PMID:27022365
Algorithm for computing descriptive statistics for very large data sets and the exa-scale era
NASA Astrophysics Data System (ADS)
Beekman, Izaak
2017-11-01
An algorithm for Single-point, Parallel, Online, Converging Statistics (SPOCS) is presented. It is suited for in situ analysis that traditionally would be relegated to post-processing, and it can be used to monitor statistical convergence and to estimate the error/residual in the quantity of interest, which is also useful for uncertainty quantification. Today, data may be generated at an overwhelming rate by numerical simulations and by proliferating sensing apparatuses in experiments and engineering applications. Monitoring descriptive statistics in real time lets costly computations and experiments be gracefully aborted if an error has occurred, and monitoring the level of statistical convergence allows them to be run for the shortest time required to obtain good results. The algorithm extends work by Pébay (Sandia Report SAND2008-6212): Pébay's algorithms are recast into a converging delta formulation with provably favorable properties. The mean, variance, covariances and arbitrary higher-order statistical moments are computed in one pass. The algorithm is tested using Sillero, Jiménez, & Moser's (2013, 2014) publicly available UPM high-Reynolds-number turbulent boundary layer data set, demonstrating numerical robustness, efficiency and other favorable properties.
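The single-pass update at the heart of such online statistics can be sketched as follows. This is a minimal illustration in the spirit of the Welford/Pébay update formulas for the mean and variance, not the SPOCS implementation itself; the function name is ours.

```python
# One-pass (online) mean and sample variance: each value is seen once,
# and the running statistics are updated incrementally (Welford-style).
def online_stats(stream):
    n = 0
    mean = 0.0
    m2 = 0.0  # running sum of squared deviations from the current mean
    for x in stream:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)  # uses the *updated* mean
    variance = m2 / (n - 1) if n > 1 else 0.0
    return n, mean, variance

n, mean, var = online_stats([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
```

Because each update touches only a few scalars, the same pattern extends to covariances and higher moments, and partial results from parallel streams can be merged, which is what makes in situ monitoring practical.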
Large truck and bus crash facts, 2010.
DOT National Transportation Integrated Search
2012-09-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2010. Selected crash statistics on passenger vehicles are also presented...
Large truck and bus crash facts, 2007.
DOT National Transportation Integrated Search
2009-03-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2007. Selected crash statistics on passenger vehicles are also presented...
Large truck and bus crash facts, 2008.
DOT National Transportation Integrated Search
2010-03-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2008. Selected crash statistics on passenger vehicles are also presented...
Large truck and bus crash facts, 2011.
DOT National Transportation Integrated Search
2013-10-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2011. Selected crash statistics on passenger vehicles are also presented...
Large truck and bus crash facts, 2013.
DOT National Transportation Integrated Search
2015-04-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2013. Selected crash statistics on passenger vehicles are also presented ...
Large truck and bus crash facts, 2009.
DOT National Transportation Integrated Search
2011-10-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2009. Selected crash statistics on passenger vehicles are also presented...
Large truck and bus crash facts, 2012.
DOT National Transportation Integrated Search
2014-06-01
This annual edition of Large Truck and Bus Crash Facts contains descriptive statistics about fatal, injury, and property damage only crashes involving large trucks and buses in 2012. Selected crash statistics on passenger vehicles are also presented ...
Realistic finite temperature simulations of magnetic systems using quantum statistics
NASA Astrophysics Data System (ADS)
Bergqvist, Lars; Bergman, Anders
2018-01-01
We have performed realistic atomistic simulations at finite temperatures using Monte Carlo and atomistic spin dynamics simulations incorporating quantum (Bose-Einstein) statistics. The description is much improved at low temperatures compared to the classical (Boltzmann) statistics normally used in these kinds of simulations, while at higher temperatures the classical statistics are recovered. This corrected low-temperature description is reflected in both the magnetization and the magnetic specific heat, the latter allowing for improved modeling of the magnetic contribution to free energies. A central property in the method is the magnon density of states at finite temperatures, and we have compared several different implementations for obtaining it. The method has no restrictions regarding the chemical and magnetic order of the considered materials. This is demonstrated by applying the method to elemental ferromagnetic systems, including Fe and Ni, as well as to Fe-Co random alloys and the ferrimagnetic system GdFe3.
Projected shell model description of N = 114 superdeformed isotone nuclei
NASA Astrophysics Data System (ADS)
Guo, R. S.; Chen, L. M.; Chou, C. H.
2006-03-01
A systematic description of the yrast superdeformed (SD) bands in N = 114, Z = 80-84 isotone nuclei using the projected shell model is presented. The calculated γ-ray energies, moment of inertia and M1 transitions are compared with the data for which spin is assigned. Excellent agreement with the available data for all isotones is obtained. The calculated electromagnetic properties provide a microscopic understanding of those measured nuclei. Some predictions in superdeformed nuclei are also discussed.
Low, Sheryl A; McCoy, Sarah Westcott; Beling, Janna; Adams, Janet
2011-01-01
This study investigated pediatric physical therapists' use of support walkers (SWs) for children with disabilities. An 8-page survey was mailed to 2500 randomly selected members of the Section on Pediatrics of the American Physical Therapy Association. Respondents to the survey included 513 pediatric physical therapists who were users of SWs. Descriptive statistics were calculated and themes were analyzed. Several SWs were reported as used most often to improve gait, mobility, participation at school, and interaction with peers. Use commonly included a month trial before purchase and 9 sessions of physical therapy to train a child for use in school. Reasons given for the use of SWs were improving impairments, functional limitations, and participation with peers. Pediatric physical therapists use SWs to increase postural control, mobility, and children's participation in school.
Effect of Graph Scale on Risky Choice: Evidence from Preference and Process in Decision-Making
Sun, Yan; Li, Shu; Bonini, Nicolao; Liu, Yang
2016-01-01
We investigate the effect of graph scale on risky choices. By (de)compressing the scale, we manipulate the relative physical distance between options on a given attribute in a coordinate graphical context. In Experiment 1, the risky choice changes as a function of the scale of the graph. In Experiment 2, we show that the type of graph scale also affects decision times. In Experiment 3, we examine the graph scale effect using real money among students who have taken statistics courses. The scale effects persist even when we control for variation in calculation ability and increase the seriousness with which participants view the consequences of their decisions. This finding is inconsistent with the descriptive invariance of preferences. The theoretical implications and practical applications of the findings are discussed. PMID:26771530
Spacing distribution functions for 1D point island model with irreversible attachment
NASA Astrophysics Data System (ADS)
Gonzalez, Diego; Einstein, Theodore; Pimpinelli, Alberto
2011-03-01
We study the configurational structure of the point island model for epitaxial growth in one dimension. In particular, we calculate the island gap and capture zone distributions. Our model is based on an approximate description of nucleation inside the gaps. Nucleation is described by the joint probability density p_xy^n(x, y), which represents the probability density for nucleation at position x within a gap of size y. Our proposed functional form for p_xy^n(x, y) describes the statistical behavior of the system excellently. We compare our analytical model with extensive numerical simulations. Our model retains the most relevant physical properties of the system. This work was supported by the NSF-MRSEC at the University of Maryland, Grant No. DMR 05-20471, with ancillary support from the Center for Nanophysics and Advanced Materials (CNAM).
Vibrational Mode-Specific Reaction of Methane on a Nickel Surface
NASA Astrophysics Data System (ADS)
Beck, Rainer D.; Maroni, Plinio; Papageorgopoulos, Dimitrios C.; Dang, Tung T.; Schmid, Mathieu P.; Rizzo, Thomas R.
2003-10-01
The dissociation of methane on a nickel catalyst is a key step in steam reforming of natural gas for hydrogen production. Despite substantial effort in both experiment and theory, there is still no atomic-scale description of this important gas-surface reaction. We report quantum state-resolved studies, using pulsed laser and molecular beam techniques, of vibrationally excited methane reacting on the nickel (100) surface. For doubly deuterated methane (CD2H2), we observed that the reaction probability with two quanta of excitation in one C-H bond was greater (by as much as a factor of 5) than with one quantum in each of two C-H bonds. These results clearly exclude the possibility of statistical models correctly describing the mechanism of this process and attest to the importance of full-dimensional calculations of the reaction dynamics.
Vibrational mode-specific reaction of methane on a nickel surface.
Beck, Rainer D; Maroni, Plinio; Papageorgopoulos, Dimitrios C; Dang, Tung T; Schmid, Mathieu P; Rizzo, Thomas R
2003-10-03
The dissociation of methane on a nickel catalyst is a key step in steam reforming of natural gas for hydrogen production. Despite substantial effort in both experiment and theory, there is still no atomic-scale description of this important gas-surface reaction. We report quantum state-resolved studies, using pulsed laser and molecular beam techniques, of vibrationally excited methane reacting on the nickel (100) surface. For doubly deuterated methane (CD2H2), we observed that the reaction probability with two quanta of excitation in one C-H bond was greater (by as much as a factor of 5) than with one quantum in each of two C-H bonds. These results clearly exclude the possibility of statistical models correctly describing the mechanism of this process and attest to the importance of full-dimensional calculations of the reaction dynamics.
Numerical calculations of turbulent swirling flow
NASA Technical Reports Server (NTRS)
Kubo, I.; Gouldin, F. C.
1974-01-01
Description of a numerical technique for solving axisymmetric, incompressible, turbulent swirling flow problems. Isothermal flow calculations are presented for a coaxial flow configuration of special interest. The calculation results are discussed in regard to their implications for the design of gas turbine combustors.
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2010 CFR
2010-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2013 CFR
2013-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2014 CFR
2014-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2012 CFR
2012-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
47 CFR 1.363 - Introduction of statistical data.
Code of Federal Regulations, 2011 CFR
2011-10-01
... case of sample surveys, there shall be a clear description of the survey design, including the... evidence in common carrier hearing proceedings, including but not limited to sample surveys, econometric... description of the experimental design shall be set forth, including a specification of the controlled...
Han, Kyunghwa; Jung, Inkyung
2018-05-01
This review article presents an assessment of trends in statistical methods and an evaluation of their appropriateness in articles published in the Archives of Plastic Surgery (APS) from 2012 to 2017. We reviewed 388 original articles published in APS between 2012 and 2017. We categorized the articles that used statistical methods according to the type of statistical method, the number of statistical methods, and the type of statistical software used. We checked whether there were errors in the description of statistical methods and results. A total of 230 articles (59.3%) published in APS between 2012 and 2017 used one or more statistical methods. Within these articles, there were 261 applications of statistical methods with continuous or ordinal outcomes, and 139 applications of statistical methods with categorical outcomes. The Pearson chi-square test (17.4%) and the Mann-Whitney U test (14.4%) were the most frequently used methods. Errors in describing statistical methods and results were found in 133 of the 230 articles (57.8%). Inadequate description of P-values was the most common error (39.1%). Among the 230 articles that used statistical methods, 71.7% provided details about the statistical software programs used for the analyses. SPSS was predominantly used in the articles that presented statistical analyses. We found that the use of statistical methods in APS has increased over the last 6 years. It seems that researchers have been paying more attention to the proper use of statistics in recent years. It is expected that these positive trends will continue in APS.
NASA Astrophysics Data System (ADS)
Zemenkova, M. Y.; Shabarov, A.; Shatalov, A.; Puldas, L.
2018-05-01
The problem of describing the pore space and calculating relative phase permeabilities (RPP) for two-phase filtration is considered. A technique for constructing a pore-network structure with constant and variable channel diameters is proposed. A design model for RPP based on capillary pressure curves is presented, taking into account the variability of diameters along the length of the pore channels. Using calculations for core samples from the Urnenskoye and Verkhnechonskoye deposits as an example, the possibilities of calculating RPP with a stochastic distribution of pore diameters and mean-flow diameters are shown.
ERIC Educational Resources Information Center
Shaw, W. M., Jr.
1993-01-01
Describes a study conducted on the cystic fibrosis (CF) database, a subset of MEDLINE, that investigated clustering structure and the effectiveness of cluster-based retrieval as a function of the exhaustivity of the uncontrolled subject descriptions. Results are compared to calculations for controlled descriptions based on Medical Subject Headings…
NASA Technical Reports Server (NTRS)
Taback, I.
1979-01-01
The discussion of vulnerability begins with a description of some of the electrical characteristics of fibers before defining how vulnerability calculations are done. The vulnerability results obtained to date are presented, and the discussion touches on post-exposure vulnerability. After a description of some shock hazard work now underway, the discussion leads into a description of the planned effort, and some preliminary conclusions are presented.
Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia
2016-01-01
Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis. PMID:27792763
Chen, Shi-Yi; Deng, Feilong; Huang, Ying; Li, Cao; Liu, Linhai; Jia, Xianbo; Lai, Song-Jia
2016-01-01
Although various computer tools have been elaborately developed to calculate a series of statistics in molecular population genetics for both small- and large-scale DNA data, there is no efficient and easy-to-use toolkit available yet for exclusively focusing on the steps of mathematical calculation. Here, we present PopSc, a bioinformatic toolkit for calculating 45 basic statistics in molecular population genetics, which could be categorized into three classes, including (i) genetic diversity of DNA sequences, (ii) statistical tests for neutral evolution, and (iii) measures of genetic differentiation among populations. In contrast to the existing computer tools, PopSc was designed to directly accept the intermediate metadata, such as allele frequencies, rather than the raw DNA sequences or genotyping results. PopSc is first implemented as the web-based calculator with user-friendly interface, which greatly facilitates the teaching of population genetics in class and also promotes the convenient and straightforward calculation of statistics in research. Additionally, we also provide the Python library and R package of PopSc, which can be flexibly integrated into other advanced bioinformatic packages of population genetics analysis.
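As a sketch of computing a population-genetic statistic directly from allele frequencies rather than from raw sequences (the working style the abstract describes), the following computes Nei's expected heterozygosity. The function name is illustrative and is not part of PopSc's actual API.

```python
# Nei's gene diversity (expected heterozygosity): He = 1 - sum(p_i^2),
# computed straight from allele frequencies -- the kind of intermediate
# metadata a toolkit like PopSc accepts as input.
def expected_heterozygosity(allele_freqs):
    assert abs(sum(allele_freqs) - 1.0) < 1e-9, "frequencies must sum to 1"
    return 1.0 - sum(p * p for p in allele_freqs)

# a locus with three alleles at frequencies 0.5, 0.3 and 0.2
he = expected_heterozygosity([0.5, 0.3, 0.2])
```

Working from frequencies keeps the calculation step cheap and decoupled from sequence processing, which is the design choice the toolkit's authors emphasize.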
A primer on multifactor productivity : description, benefits, and uses
DOT National Transportation Integrated Search
2008-04-01
This primer presents a description of multifactor : productivity (MFP) and its calculation. Productivity : is an important measure of the state of the : economy at various levels: firm, industry, sectoral, : and the macroeconomic. The method describe...
Parsimonious description for predicting high-dimensional dynamics
Hirata, Yoshito; Takeuchi, Tomoya; Horai, Shunsuke; Suzuki, Hideyuki; Aihara, Kazuyuki
2015-01-01
When we observe a system, we often cannot observe all its variables and may have some of its limited measurements. Under such a circumstance, delay coordinates, vectors made of successive measurements, are useful to reconstruct the states of the whole system. Although the method of delay coordinates is theoretically supported for high-dimensional dynamical systems, practically there is a limitation because the calculation for higher-dimensional delay coordinates becomes more expensive. Here, we propose a parsimonious description of virtually infinite-dimensional delay coordinates by evaluating their distances with exponentially decaying weights. This description enables us to predict the future values of the measurements faster because we can reuse the calculated distances, and more accurately because the description naturally reduces the bias of the classical delay coordinates toward the stable directions. We demonstrate the proposed method with toy models of the atmosphere and real datasets related to renewable energy. PMID:26510518
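The key computational point above, that distances under exponentially decaying weights can be reused, can be sketched with a recursive update. The squared-difference metric, the decay value, and all names here are illustrative assumptions, not the authors' code.

```python
# With exponentially decaying weights over the past, the weighted distance
# between two delay-coordinate histories admits an O(1) recursive update:
#   D_t = (x_t - y_t)^2 + decay * D_{t-1}
# so previously calculated distances are reused instead of recomputed.
def update_distance(prev_dist, x_t, y_t, decay=0.9):
    return (x_t - y_t) ** 2 + decay * prev_dist

# usage: stream two scalar time series and maintain their weighted distance
xs = [0.0, 1.0, 0.5]
ys = [0.0, 0.5, 0.5]
d = 0.0
for x, y in zip(xs, ys):
    d = update_distance(d, x, y)
```

The recursion is what makes "virtually infinite-dimensional" delay coordinates tractable: no finite embedding dimension ever has to be materialized.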
Lee, F K-H; Chan, C C-L; Law, C-K
2009-02-01
Contrast-enhanced computed tomography (CECT) has been used for delineation of the treatment target in radiotherapy. The altered Hounsfield units due to the injected contrast agent may affect radiation dose calculation. We investigated this effect in intensity-modulated radiotherapy (IMRT) of nasopharyngeal carcinoma (NPC). Dose distributions of 15 IMRT plans were recalculated on CECT. Dose statistics for organs at risk (OAR) and treatment targets were recorded for the plain CT-calculated and CECT-calculated plans, and the statistical significance of the differences was evaluated. Correlations were also tested among the magnitude of the calculated dose difference, tumor size, and level of contrast enhancement. Differences in nodal mean/median dose were statistically significant, but small (approximately 0.15 Gy for a 66 Gy prescription). In the vicinity of the carotid arteries, the difference in calculated dose was also statistically significant, but only with a mean of approximately 0.2 Gy. We did not observe any significant correlation between the difference in the calculated dose and the tumor size or level of enhancement. The results imply that the calculated dose difference is clinically insignificant and may be acceptable for IMRT planning.
ERIC Educational Resources Information Center
Karadag, Engin
2010-01-01
To assess research methods and analysis of statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertation from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…
2017-03-01
From the list of tables: Table 1, Descriptive Statistics for Control Variables by...; Descriptive Statistics for Control Variables by Gender (Random Subsample with Complete Survey). ...empirical analysis. Chapter IV describes the summary statistics and results. Finally, Chapter V offers concluding thoughts, study limitations, and
ERIC Educational Resources Information Center
Bailey, Thomas; Jenkins, Davis; Leinbach, Timothy
2005-01-01
This report summarizes the latest available national statistics on access and attainment by low income and minority community college students. The data come from the National Center for Education Statistics' (NCES) Integrated Postsecondary Education Data System (IPEDS) annual surveys of all postsecondary educational institutions and the NCES…
A First Assignment to Create Student Buy-In in an Introductory Business Statistics Course
ERIC Educational Resources Information Center
Newfeld, Daria
2016-01-01
This paper presents a sample assignment to be administered after the first two weeks of an introductory business focused statistics course in order to promote student buy-in. This assignment integrates graphical displays of data, descriptive statistics and cross-tabulation analysis through the lens of a marketing analysis study. A marketing sample…
A meta-analysis of research on science teacher education practices associated with inquiry strategy
NASA Astrophysics Data System (ADS)
Sweitzer, Gary L.; Anderson, Ronald D.
A meta-analysis was conducted of studies of teacher education having as measured outcomes one or more variables associated with inquiry teaching. Inquiry addresses those teacher behaviors that facilitate student acquisition of concepts and processes through strategies such as problem solving, uses of evidence, logical and analytical reasoning, clarification of values, and decision making. Studies which contained sufficient data for the calculation of an effect size were coded for 114 variables. These variables were divided into the following six major categories: study information and design characteristics, teacher and teacher trainee characteristics, student characteristics, treatment description, outcome description, and effect size calculation. A total of 68 studies resulting in 177 effect size calculations were coded. Mean effect sizes broken across selected variables were calculated.
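A standardized mean difference is the usual effect size coded in such a meta-analysis. The sketch below shows Cohen's d with a pooled standard deviation as a generic example; the study's exact formula and the numbers are illustrative.

```python
import math

# Cohen's d: standardized mean difference between a treatment and a control
# group, using the pooled (bias-uncorrected) standard deviation.
def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    pooled_var = ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    return (mean_t - mean_c) / math.sqrt(pooled_var)

# hypothetical study: treatment mean 52, control mean 48, common SD 10, n=30 each
d = cohens_d(mean_t=52.0, sd_t=10.0, n_t=30, mean_c=48.0, sd_c=10.0, n_c=30)
```

Effect sizes from many studies, once on this common scale, can then be averaged or broken out across coded variables as the abstract describes.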
NASA Technical Reports Server (NTRS)
Miller, R. D.; Anderson, L. R.
1979-01-01
The LOADS program L218, a digital computer program that calculates dynamic load coefficient matrices utilizing the force summation method, is described. The load equations are derived for a flight vehicle in straight and level flight and excited by gusts and/or control motions. In addition, sensor equations are calculated for use with an active control system. The load coefficient matrices are calculated for the following types of loads: translational and rotational accelerations, velocities, and displacements; panel aerodynamic forces; net panel forces; shears and moments. Program usage and a brief description of the analysis used are presented. A description of the design and structure of the program to aid those who will maintain and/or modify the program in the future is included.
What can 35 years and over 700,000 measurements tell us about noise exposure in the mining industry?
Roberts, Benjamin; Sun, Kan; Neitzel, Richard L.
2017-01-01
Objective: To analyze over 700,000 cross-sectional measurements from the Mine Safety and Health Administration (MSHA) and develop statistical models to predict noise exposure for a worker. Design: Descriptive statistics were used to summarize the data. Two linear regression models were used to predict noise exposure based on the MSHA permissible exposure limit (PEL) and action level (AL), respectively. Two-fold cross-validation was used to compare the exposure estimates from the models to actual measurements in the hold-out data. The mean difference and t-statistic were calculated for each job title to determine whether the model exposure predictions were significantly different from the actual data. Study Sample: Measurements were acquired from MSHA through a Freedom of Information Act request. Results: From 1979 to 2014 the average noise measurement decreased. Measurements taken before the implementation of MSHA's revised noise regulation in 2000 were on average 4.5 dBA higher than after the law came into effect. Both models produced mean exposure predictions that differed by less than 1 dBA from the hold-out data. Conclusion: Overall noise levels in mines have been decreasing, although the decrease has not been uniform across all mining sectors. The exposure predictions from the model will be useful for predicting hearing loss in workers in the mining industry. PMID:27871188
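The per-job-title comparison of model predictions against hold-out measurements amounts to a paired mean difference with a one-sample t-statistic on the differences. A minimal sketch, with made-up dBA values for illustration:

```python
import math

# Mean difference and paired t-statistic between model predictions and
# held-out observed measurements (e.g., for one job title). Values are
# illustrative, not from the MSHA data set.
def mean_diff_t(predicted, observed):
    diffs = [p - o for p, o in zip(predicted, observed)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    t = mean_d / math.sqrt(var_d / n)  # t on n-1 degrees of freedom
    return mean_d, t

md, t = mean_diff_t([85.0, 88.0, 90.0, 84.0], [84.0, 87.5, 89.0, 83.0])
```

A small mean difference with a non-significant t supports the abstract's conclusion that the model predictions track the hold-out data within about 1 dBA.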
Chi-Square Statistics, Tests of Hypothesis and Technology.
ERIC Educational Resources Information Center
Rochowicz, John A.
The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chi-square statistics and p-values for statistical…
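The chi-square calculation this record refers to can be carried out directly from a contingency table. A minimal sketch for a 2x2 table, with hypothetical counts (for df = 1, the p-value follows from the identity P(chi2 > x) = erfc(sqrt(x/2))):

```python
import math

def chi_square_2x2(table):
    """Pearson chi-square statistic and p-value (df = 1) for a 2x2 table."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / n   # expected count under independence
            chi2 += (table[i][j] - expected) ** 2 / expected
    # For 1 degree of freedom, chi2 is a squared standard normal.
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value

# Hypothetical counts (e.g. pass/fail by teaching method), not from the paper.
chi2, p = chi_square_2x2([[30, 10], [20, 40]])
```

For larger tables the same expected-count loop applies, but the p-value needs the chi-square distribution with (rows-1)(cols-1) degrees of freedom.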
Sherman, David M.
1986-01-01
A molecular orbital description, based on spin-unrestricted Xα-scattered wave calculations, is given for the electronic structures of mixed-valence iron oxides and silicates. The cluster calculations show that electron hopping and optical intervalence charge-transfer result from weak Fe-Fe bonding across shared edges of FeO6 coordination polyhedra. In agreement with Zener's double exchange model, Fe-Fe bonding is found to stabilize ferromagnetic coupling between Fe2+ and Fe3+ cations. © 1986.
The evaluation of reproductive health PhD program in Iran: The input indicators analysis.
AbdiShahshahani, Mahshid; Ehsanpour, Soheila; Yamani, Nikoo; Kohan, Shahnaz
2014-11-01
Appropriate quality achievement of a PhD program requires frequent assessment and discovery of the shortcomings in the program. Inputs, which are important elements of the curriculum, are frequently missed in evaluations. The purpose of this study was to evaluate the input indicators of the reproductive health PhD program in Iran based on the Context, Input, Process, and Product (CIPP) evaluation model. This is a descriptive and evaluative study based on the CIPP evaluation model. It was conducted in 2013 in four Iranian schools of nursing and midwifery of medical sciences universities. The statistical population consisted of four groups: heads of departments (n = 5), faculty members (n = 18), graduates (n = 12), and PhD students of reproductive health (n = 54). Data collection tools were five separate questionnaires comprising 37 indicators that were developed by the researcher. Content and face validity were evaluated based on the experts' indications. The Cronbach's alpha coefficient was calculated in order to establish the reliability of the questionnaires. Collected data were analyzed with SPSS software, using descriptive statistics (mean, frequency, percentage, and standard deviation), one-way analysis of variance (ANOVA), and least significant difference (LSD) post hoc tests to compare means between groups. The results indicated that the highest percentages of the heads of departments (80%), graduates (66.7%), and students (68.5%) evaluated the status of input indicators of the reproductive health PhD program as relatively appropriate, while most of the faculty members (66.7%) evaluated it as appropriate. It is suggested that the reasons for the relatively appropriate evaluation of input indicators be explored through further academic research and that the reproductive health PhD program be improved accordingly.
Velez-Montoya, Raul; Shusterman, Eugene Mark; López-Miranda, Miriam Jessica; Mayorquin-Ruiz, Mariana; Salcedo-Villanueva, Guillermo; Quiroz-Mercado, Hugo; Morales-Cantón, Virgilio
2010-03-24
Background To assess the reliability of the measurements obtained with the PalmScan™, when compared with another standardized A-mode ultrasound device, and to assess the consistency and correlation between the two methods. Methods Transversal, descriptive, and comparative study. We recorded the axial length (AL), anterior chamber depth (ACD) and lens thickness (LT) obtained with two A-mode ultrasounds (PalmScan™ A2000 and Eye Cubed™) using an immersion technique. We compared the measurements with a two-sample t-test. Agreement between the two devices was assessed with Bland-Altman plots and 95% limits of agreement. Results 70 eyes of 70 patients were enrolled in this study. The Eye Cubed™ measurements of AL and ACD were shorter than those taken by the PalmScan™. The differences were not statistically significant for AL (p < 0.4) but were significant for ACD (p < 0.001). The highest agreement between the two devices was obtained for the LT measurement; the PalmScan™ measurements were shorter, but not statistically significantly so (p < 0.2). Conclusions The values of AL and LT obtained with the two devices are not identical, but are within the limits of agreement. The agreement is not affected by the magnitude of the ocular dimensions (but only within the range of 20 mm to 27 mm of AL and 3.5 mm to 5.7 mm of LT). A correction of about 0.5 D could be considered if an intraocular lens is being calculated. However, due to the large variability of the results, the authors recommend discretion in using this conversion factor, and adjusting the power of the intraocular lenses based upon the personal experience of the surgeon. PMID:20334670
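The Bland-Altman 95% limits of agreement used in this study are simply the mean of the paired differences plus or minus 1.96 sample standard deviations of those differences. A minimal sketch with hypothetical paired axial-length readings (the numbers are invented, not the study's data):

```python
from statistics import mean, stdev

def limits_of_agreement(a, b):
    """Bland-Altman bias and 95% limits of agreement for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)   # 1.96 sample SDs of the differences
    return bias - half_width, bias, bias + half_width

# Hypothetical paired axial-length readings (mm) from two devices.
eye_cubed = [23.1, 24.0, 22.8, 25.3, 23.6, 24.4, 22.5, 23.9]
palmscan  = [23.0, 23.8, 22.8, 25.0, 23.5, 24.2, 22.5, 23.8]
lower, bias, upper = limits_of_agreement(eye_cubed, palmscan)
```

If the limits span zero, as here, a nonzero bias may still be clinically unimportant; the plot itself additionally checks whether the differences vary with the magnitude of the measurement.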
Fossil diatoms and neogene paleolimnology
Platt, Bradbury J.
1988-01-01
Diatoms have played an important role in the development of Neogene continental biostratigraphy and paleolimnology since the mid-19th Century. The history of progress in Quaternary diatom biostratigraphy has developed as a result of improved coring techniques that enable sampling sediments beneath existing lakes coupled with improved chronological control (including radiometric dating and varve enumeration), improved statistical treatment of fossil diatom assemblages (from qualitative description to influx calculations of diatom numbers or volumes), and improved ecological information about analogous living diatom associations. The last factor, diatom ecology, is the most critical in many ways, but progresses slowly. Fortunately, statistical comparison of modern diatom assemblages and insightful studies of the nutrient requirements of some common freshwater species are enabling diatom paleolimnologists to make more detailed interpretations of the Quaternary record than had been possible earlier, and progress in the field of diatom biology and ecology will continue to refine paleolimnological studies. The greater age and geologic setting of Tertiary diatomaceous deposits has prompted their study in the contexts of geologic history, biochronology and evolution. The distribution of diatoms of marine affinities in continental deposits has given geologists insights about tectonism and sea-level change, and the distribution of distinctive (extinct?) diatoms has found utilization both in making stratigraphic correlations between outcrops of diatomaceous deposits and in various types of biochronological studies that involve dating deposits in different areas. A continental diatom biochronologic scheme will rely upon evolution, such as the appearance of new genera within a family, in combination with regional environmental changes that are responsible for the wide distribution of distinctive diatom species. 
The increased use of the scanning electron microscope for the detailed descriptions of fossil diatoms will provide the basis for making more accurate correlations and identifications, and the micromorphological detail for speculations about evolutionary relationships. © 1988.
Watson, Estelle D; Oddie, Brydie; Constantinou, Demitri
2015-10-07
There is compelling evidence for the benefits of regular exercise during pregnancy, and medical practitioners (MPs) can play an important role in changing antenatal health behaviours. The purpose of this study was to assess the knowledge, attitudes and beliefs of South African MPs towards exercise during pregnancy. A convenience sample of ninety-six MPs working in the private health care sector, including General Practitioners (n = 58), Obstetricians/Gynaecologists (n = 33) and other Specialists (n = 5), participated in this cross-sectional, descriptive survey study. A 33-item questionnaire was distributed manually at medical practices and via email through an on-line survey tool. Descriptive statistics and frequency tables were calculated for all questions. Chi-squared and Fisher's exact tests were used to determine differences in response by age, speciality and years of practice (p < 0.05). The majority of practitioners (98%) believed that exercise during pregnancy is beneficial, and were knowledgeable about most of the expected benefits. Seventy-eight percent believed that providing exercise advice is an important part of prenatal care; however, only 19% provided informational pamphlets and few (24%) referred to exercise specialists. A large majority (83%) were unaware of the recommended exercise guidelines. Although age and years of practice played no role in this awareness, practitioners who focussed on obstetrics and gynaecology were more likely to be aware of the current guidelines than those in general practice (p < 0.001). Although the MPs were largely positive towards exercise during pregnancy, their advice did not always align with the current guidelines. Therefore, better dissemination of available research is warranted to bridge the gap between clinical knowledge and current recommendations for physical activity promotion.
Sordi, Marina de; Mourão, Lucia Figueiredo; Silva, Ariovaldo Armando da; Flosi, Luciana Claudia Leite
2009-01-01
Patients with dysphagia have impairments in many aspects, and an interdisciplinary approach is fundamental to define diagnosis and treatment. A joint approach in the clinical and videoendoscopy evaluation is paramount. To study the correlation between the clinical assessment (ACD) and the videoendoscopic (VED) assessment of swallowing by classifying the degree of severity and the qualitative/descriptive analyses of the procedures. This was a cross-sectional, descriptive and comparative study, held from March to December 2006 at the Otolaryngology/Dysphagia ward of a hospital in the countryside of São Paulo. Thirty dysphagic patients with different disorders were assessed by ACD and VED. The data were classified by means of severity scales and qualitative/descriptive analysis. The correlation between the ACD and VED severity scales pointed to a statistically significant low agreement (kappa = 0.4; p = 0.006). The correlation between the qualitative/descriptive analyses pointed to an excellent and statistically significant agreement (kappa = 0.962; p < 0.001) for the entire sample. The low agreement between the severity scales points to a need to perform both procedures, reinforcing VED as a feasible procedure. The descriptive qualitative analysis pointed to an excellent agreement, and such data reinforce the need to understand swallowing as a process.
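The kappa values reported here measure chance-corrected agreement between the two assessments: observed agreement minus the agreement expected from the raters' marginal frequencies, scaled by its maximum. A minimal sketch of Cohen's kappa, with hypothetical severity grades (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Expected agreement if the raters graded independently with these marginals.
    p_expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical severity grades from clinical (ACD) vs endoscopic (VED) assessment.
acd = [1, 1, 2, 2, 3, 3, 1, 2]
ved = [1, 1, 2, 3, 3, 3, 1, 1]
kappa = cohens_kappa(acd, ved)
```

By the usual rules of thumb, kappa around 0.4 (as for the severity scales) is only moderate agreement, while 0.96 (the qualitative analysis) is almost perfect.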
R is an open source language and environment for statistical computing and graphics that can also be used for both spatial analysis (i.e. geoprocessing and mapping of different types of spatial data) and spatial data analysis (i.e. the application of statistical descriptions and ...
WASP (Write a Scientific Paper) using Excel - 2: Pivot tables.
Grech, Victor
2018-02-01
Data analysis at the descriptive stage and the eventual presentation of results requires the tabulation and summarisation of data. This exercise should always precede inferential statistics. Pivot tables and pivot charts are one of Excel's most powerful and underutilised features, with tabulation functions that immensely facilitate descriptive statistics. Pivot tables permit users to dynamically summarise and cross-tabulate data, create tables in several dimensions, offer a range of summary statistics and can be modified interactively with instant outputs. Large and detailed datasets are thereby easily manipulated making pivot tables arguably the best way to explore, summarise and present data from many different angles. This second paper in the WASP series in Early Human Development provides pointers for pivot table manipulation in Excel™. Copyright © 2018 Elsevier B.V. All rights reserved.
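The cross-tabulation that an Excel pivot table performs (group rows by one field, columns by another, then aggregate) can be sketched in plain Python; pandas' `pivot_table` does the same in one call, but a dependency-free version makes the mechanics explicit. The records and field names below are invented for illustration.

```python
from collections import defaultdict

# Flat records, one per observation, as Excel would hold them in a sheet:
# (ward, sex, birth weight in kg) - hypothetical data.
records = [
    ("ward A", "female", 3.2), ("ward A", "female", 3.0), ("ward A", "male", 3.4),
    ("ward B", "female", 2.8), ("ward B", "male", 3.6), ("ward B", "male", 3.0),
]

# Group values into (row label, column label) cells, then aggregate each cell.
cells = defaultdict(list)
for ward, sex, weight in records:
    cells[(ward, sex)].append(weight)
pivot = {key: sum(vals) / len(vals) for key, vals in cells.items()}
```

Swapping the aggregation (count, sum, min, max) or the grouping keys reproduces the "many different angles" the paper describes; in Excel the same change is a drag-and-drop in the pivot field list.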
Effect size and statistical power in the rodent fear conditioning literature - A systematic review.
Carneiro, Clarissa F D; Moulin, Thiago C; Macleod, Malcolm R; Amaral, Olavo B
2018-01-01
Proposals to increase research reproducibility frequently call for focusing on effect sizes instead of p values, as well as for increasing the statistical power of experiments. However, it is unclear to what extent these two concepts are indeed taken into account in basic biomedical science. To study this in a real-case scenario, we performed a systematic review of effect sizes and statistical power in studies on learning of rodent fear conditioning, a widely used behavioral task to evaluate memory. Our search criteria yielded 410 experiments comparing control and treated groups in 122 articles. Interventions had a mean effect size of 29.5%, and amnesia caused by memory-impairing interventions was nearly always partial. Mean statistical power to detect the average effect size observed in well-powered experiments with significant differences (37.2%) was 65%, and was lower among studies with non-significant results. Only one article reported a sample size calculation, and our estimated sample size to achieve 80% power considering typical effect sizes and variances (15 animals per group) was reached in only 12.2% of experiments. Actual effect sizes correlated with effect size inferences made by readers on the basis of textual descriptions of results only when findings were non-significant, and neither effect size nor power correlated with study quality indicators, number of citations or impact factor of the publishing journal. In summary, effect sizes and statistical power have a wide distribution in the rodent fear conditioning literature, but do not seem to have a large influence on how results are described or cited. Failure to take these concepts into consideration might limit attempts to improve reproducibility in this field of science.
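The sample size calculation the review finds so rarely reported can be sketched with the standard normal approximation: per-group n grows with the squared sum of the alpha and power z-values and shrinks with the squared standardized effect size. The numbers below are conventional defaults, not the review's estimates (and the normal approximation runs slightly below t-based answers).

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample comparison.

    effect_size is the standardized difference (Cohen's d).
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

n_medium = n_per_group(0.5)   # a "medium" effect needs far more animals...
n_large = n_per_group(0.8)    # ...than a large one
```

The steep dependence on effect size is the review's point: groups powered for large effects are badly underpowered for the typical effects actually observed.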
Ries, Kernell G.
1999-01-01
A network of 148 low-flow partial-record stations was operated on streams in Massachusetts during the summers of 1989 through 1996. Streamflow measurements (including historical measurements), measured basin characteristics, and estimated streamflow statistics are provided in the report for each low-flow partial-record station. Also included for each station are location information, the streamflow-gaging stations for which flows were correlated to those at the low-flow partial-record station, years of operation, and remarks indicating human influences on streamflows at the station. Three or four streamflow measurements were made each year for three years during times of low flow to obtain nine or ten measurements for each station. Measured flows at the low-flow partial-record stations were correlated with same-day mean flows at a nearby gaging station to estimate streamflow statistics for the low-flow partial-record stations. The estimated streamflow statistics include the 99-, 98-, 97-, 95-, 93-, 90-, 85-, 80-, 75-, 70-, 65-, 60-, 55-, and 50-percent duration flows; the 7-day, 10-year and 2-year low flows; and the August median flow. Characteristics of the drainage basins for the stations that theoretically relate to the response of the station to climatic variations were measured from digital map data by use of an automated geographic information system procedure. Basin characteristics measured include drainage area; total stream length; mean basin slope; area of surficial stratified drift; area of wetlands; area of water bodies; and mean, maximum, and minimum basin elevation. Station descriptions and calculated streamflow statistics are also included in the report for the 50 continuous gaging stations used in correlations with the low-flow partial-record stations.
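A flow-duration statistic such as the 90-percent duration flow is the flow equalled or exceeded 90 percent of the time. One simple way to compute it from a daily record is the nearest-rank method sketched below (a toy record, not USGS data; agencies also use interpolated plotting-position formulas):

```python
import math

def duration_flow(daily_flows, percent_exceedance):
    """Nearest-rank flow-duration statistic: the flow equalled or exceeded
    the given percentage of the time."""
    ranked = sorted(daily_flows, reverse=True)            # largest flow first
    k = math.ceil(percent_exceedance * len(ranked) / 100)  # rank of the quantile
    return ranked[k - 1]

flows = list(range(1, 101))        # toy record: 100 daily flows, 1..100 cfs
q90 = duration_flow(flows, 90)     # flow exceeded 90 percent of the time
q50 = duration_flow(flows, 50)     # median duration flow
```

High-percent duration flows (Q90, Q99) characterize low-flow conditions, which is why they sit alongside the 7-day low flows and August median in the report's station statistics.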
Nursing teams: behind the charts.
Bae, Sung-Heui; Farasat, Alireza; Nikolaev, Alex; Seo, Jin Young; Foltz-Ramos, Kelly; Fabry, Donna; Castner, Jessica
2017-07-01
To examine the nature and characteristics of both received and provided mutual support in a social network within an acute care hospital unit. Current evidence regarding the social network of the health care workforce reveals the nature of social ties. Most studies of social network-related support that measured the characteristics of social support used self-reported perceptions from workers receiving support. There is a gap in studies that focus on back-up behaviour. The evaluation included a social network analysis of a nursing unit employing 54 staff members. A 12-item electronic survey was administered. Descriptive statistics were calculated using the Statistical Package for the Social Sciences. Social network analyses were carried out using UCINET, R 3.2.3 and Gephi. Based on the study findings, as providers of mutual support the nursing staff claimed to give their peers more help than these peers gave them credit for. Those who worked overtime provided more mutual support. Mutual support is a key teamwork characteristic, essential to quality and safety in hospital nursing teams, that can be evaluated using social network analysis. Because of the discrepancy between receiving and providing help, examining both receiver and provider networks is a superior approach to understanding mutual support. © 2017 John Wiley & Sons Ltd.
Jeddi, Fatemeh Rangraz; Farzandipoor, Mehrdad; Arabfard, Masoud; Hosseini, Azam Haj Mohammad
2014-04-01
The purpose of this study was to investigate the current situation and present a conceptual model for a clinical governance information system, using UML, in two sample hospitals. Although the use of information is one of the fundamental components of clinical governance, information management often receives little attention. A cross-sectional study was conducted from October 2012 to May 2013. Data were gathered through questionnaires and interviews in two sample hospitals. Face and content validity of the questionnaire were confirmed by experts. Data were collected from a pilot hospital, revisions were made, and the final questionnaire was prepared. Data were analyzed by descriptive statistics using SPSS 16 software. From the scenario derived from the questionnaires, UML diagrams were produced using Rational Rose 7 software. The results showed that only 32.14 percent of the indicators were calculated in the hospitals. No database had been designed, and 100 percent of the hospitals' clinical governance units required the creation of a database. The clinical governance units of the hospitals do not have access to all the indicators needed to perform their mission. Defining processes, drawing models, and creating a database are essential for designing information systems.
Vaidya, Prutha; Mahale, Swapna; Badade, Pallavi; Warang, Ayushya; Kale, Sunila; Kalekar, Lavanya
2017-01-01
Widespread interest in epidermal ridges developed only in the last several decades; however, it is still in its infancy in the world of dentistry. The word "dermatoglyphics" comes from two Greek words (derma: skin and glyphe: carve) and refers to the epidermal skin ridge formations which appear on the fingers, palms of the hands, and soles of the feet. This study aims to assess the relationship between fingerprints and chronic periodontitis. Two hundred patients were equally divided into a chronic periodontitis group and a periodontally healthy group. The fingerprint patterns of the participants were recorded with a rolling impression technique using duplicating ink on executive bond paper. The descriptive analysis of the data was presented as percentage frequency. The percentage frequencies of each pattern on each individual finger were calculated, and statistical tests were applied. The unpaired t-test was used for intergroup comparisons (P < 0.05). There were statistically significantly more whorls and fewer arches in both right and left hands in patients with chronic periodontitis. Dermatoglyphics can lead to early diagnosis, treatment, and better prevention of many genetic disorders of the oral cavity and other diseases whose etiology may be influenced directly or indirectly by genetic inheritance.
Schwalbe, Craig S; Gearing, Robin E; Mackenzie, Michael J; Brewer, Kathryne B; Ibrahim, Rawan W
2013-01-01
This study reports the prevalence of emotional and behavioral problems among youths placed in juvenile correctional facilities in Jordan and describes the effect of length of stay on mental health outcomes. The Youth Self Report (YSR) was administered to 187 adolescent males (mean age=16.4, SD=1.0) in all five juvenile detention facilities in Jordan in 2011. Descriptive statistics were calculated to estimate the prevalence of emotional and behavioral problems. Logistic regression models were estimated to evaluate the impact of placement length on mental health. Statistical models were weighted by the youth propensity to be 'long-stay' youths (>23 weeks) based on preplacement case characteristics. The prevalence of clinically significant emotional and behavioral problems was 84%. 46% had YSR scores above the clinical cutpoint in both the internalizing and externalizing subscales. 24% of youths reported suicidal ideation. The high prevalence of emotional and behavioral disorders was stable across placement for most YSR subscales. The prevalence of emotional and behavioral disorders among detained and incarcerated youth in Jordan mirrors the literature worldwide. These findings suggest that serious mental health problems for many youths persist throughout placement. Copyright © 2013 Elsevier Ltd. All rights reserved.
Effects of Platform Design on the Customer Experience in an Online Solar PV Marketplace
DOE Office of Scientific and Technical Information (OSTI.GOV)
OShaughnessy, Eric J.; Margolis, Robert M.; Leibowicz, Benjamin
We analyze a unique dataset of residential solar PV quotes offered in an online marketplace to understand how platform design changes affect customer outcomes. Three of the four design changes are associated with statistically significant and robust reductions in offer prices, though none of the policies were designed explicitly to reduce prices. The results suggest that even small changes in how prospective solar PV customers interact with installers can affect customer outcomes such as prices. Specifically, the four changes we evaluate are: 1) a customer map that shows potential new EnergySage registrants the locations of nearby customers; 2) a quote cap that precludes more than seven installers from bidding on any one customer; 3) a price guidance feature that informs installers about competitive prices in the customer's market before they submit quotes; and 4) no pre-quote messaging, to prohibit installers from contacting customers prior to offering quotes. We calculate descriptive statistics to investigate whether each design change accomplished its specific objectives. Then, we econometrically evaluate the impacts of the design changes on PV quote prices and purchase prices using a regression discontinuity approach.
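The regression discontinuity idea here is to regress the outcome on a running variable (time relative to the design change) plus an indicator for being past the cutoff; the indicator's coefficient estimates the jump attributable to the change. A minimal sketch with invented weekly prices (not EnergySage data; a real RD would also use bandwidths and separate slopes):

```python
# price ~ b0 + b1 * week + b2 * post, where post = 1 after the design change.
# Toy data: a slow downward trend plus a 0.20 $/W drop at week 10.

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 linear system."""
    m = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    x = [0.0] * 3
    for i in (2, 1, 0):   # back-substitution
        x[i] = (m[i][3] - sum(m[i][j] * x[j] for j in range(i + 1, 3))) / m[i][i]
    return x

weeks = list(range(20))
post = [1.0 if t >= 10 else 0.0 for t in weeks]
price = [3.50 - 0.01 * t - 0.20 * d for t, d in zip(weeks, post)]

# Ordinary least squares via the normal equations (X'X) beta = X'y.
X = [[1.0, t, d] for t, d in zip(weeks, post)]
XtX = [[sum(X[r][i] * X[r][j] for r in range(20)) for j in range(3)] for i in range(3)]
Xty = [sum(X[r][i] * price[r] for r in range(20)) for i in range(3)]
b0, b1, b2 = solve3(XtX, Xty)   # b2 estimates the discontinuity at the cutoff
```

Controlling for the trend (b1) is what separates the discontinuity estimate (b2) from a simple before/after comparison of means.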
Mohd Suki, Norazah; Chwee Lian, Jennifer Chiam; Suki, Norbayah Mohd
2009-01-01
In today's highly competitive health care environment, many private health care settings are now looking into customer service indicators to learn customers' perceptions and determine whether they are meeting customers' expectations in order to ensure that their customers are satisfied with the services. This research paper aims to investigate whether the human elements were more important than the nonhuman elements in private health care settings. We used the internationally renowned SERVQUAL five-dimension model plus three additional dimensions-courtesy, communication, and understanding of customers of the human element-when evaluating health care services. A total of 191 respondents from three private health care settings in the Klang Valley region of Malaysia were investigated. Descriptive statistics were calculated by the Statistical Package for Social Sciences (SPSS) computer program, version 15. Interestingly, the results suggested that customers nowadays have very high expectations especially when it comes to the treatment they are receiving. Overall, the research indicated that the human elements were more important than the nonhuman element in private health care settings. Hospital management should look further to improve on areas that have been highlighted. Implications for management practice and directions for future research are discussed.
Determinants of feedback retention in soccer players.
Januário, Nuno; Rosado, António; Mesquita, Isabel; Gallego, José; Aguilar-Parra, José M
2016-06-01
This study analyzed soccer players' retention of coaches' feedback during training sessions. We intended to determine whether the retention of information was influenced by the athletes' personal characteristics (age, gender and sports level), the quantity of information included in the coach's feedback (the number of ideas and redundancy), the athletes' perception of the relevance of the feedback information, and the athletes' motivation and attention levels. The study, conducted over 18 sessions of soccer practice, involved 12 coaches (8 males, 4 females) and 342 athletes (246 males, 96 females) aged between 10 and 18 years old. All coach and athlete interventions were transposed to a written protocol and submitted to content analysis. Descriptive statistics and multiple linear regression were calculated. The results showed that a substantial part of the information was not retained by the athletes; in 65.5% of cases, athletes experienced difficulty in completely reproducing the ideas of the coaches and, on average, the value of feedback retention was 57.0%. Six variables with a statistically significant value were found: gender, the athletes' sports level, redundancy, the number of transmitted ideas, the athletes' perception of the relevance of the feedback information, and the athletes' motivation level.
Physical fitness profile of professional Italian firefighters: differences among age groups.
Perroni, Fabrizio; Cignitti, Lamberto; Cortis, Cristina; Capranica, Laura
2014-05-01
Firefighters perform many tasks which require a high level of fitness, and their personal safety may be compromised by the physiological aging process. The aim of the study was to evaluate the strength (bench press), power (countermovement jump), sprint (20 m) and endurance (with and without Self-Contained Breathing Apparatus - S.C.B.A.) of 161 Italian firefighter recruits in relation to age groups (<25 yr; 26-30 yr; 31-35 yr; 36-40 yr; 41-42 yr). Descriptive statistics and an ANOVA were calculated to provide the physical fitness profile for each parameter and to assess differences (p < 0.05) among age groups. Anthropometric values showed an age effect for height and BMI, while performance values showed statistical differences in the strength, power and sprint tests and in the endurance test with S.C.B.A. Wearing the S.C.B.A., 14% of all recruits failed to complete the endurance test. We propose that firefighters participate in an assessment of work capacity and in specific fitness programs aimed at maintaining an optimal fitness level at all ages. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
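The one-way ANOVA used to compare age groups boils down to an F statistic: between-group variance (how far group means sit from the grand mean) divided by within-group variance. A minimal sketch with toy scores for three groups (invented numbers, not the study's measurements; the p-value would then come from the F distribution with the two df values):

```python
def one_way_f(groups):
    """One-way ANOVA F statistic: between-group vs within-group mean squares."""
    all_values = [v for g in groups for v in g]
    grand_mean = sum(all_values) / len(all_values)
    # Between-group sum of squares: group sizes times squared mean offsets.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of values around their own group mean.
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Toy scores for three hypothetical age groups.
f_stat = one_way_f([[1, 2, 3], [2, 3, 4], [4, 5, 6]])
```

A large F, as for the strength, power and sprint results here, means group means differ by more than within-group scatter alone would explain.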
Predictors of Errors of Novice Java Programmers
ERIC Educational Resources Information Center
Bringula, Rex P.; Manabat, Geecee Maybelline A.; Tolentino, Miguel Angelo A.; Torres, Edmon L.
2012-01-01
This descriptive study determined which of the sources of errors would predict the errors committed by novice Java programmers. Descriptive statistics revealed that the respondents perceived that they committed the identified eighteen errors infrequently. Thought error was perceived to be the main source of error during the laboratory programming…
The Status of Child Nutrition Programs in Colorado.
ERIC Educational Resources Information Center
McMillan, Daniel C.; Vigil, Herminia J.
This report provides descriptive and statistical data on the status of child nutrition programs in Colorado. The report contains descriptions of the National School Lunch Program, school breakfast programs, the Special Milk Program, the Summer Food Service Program, the Nutrition Education and Training Program, state dietary guidelines, Colorado…
NASA Technical Reports Server (NTRS)
Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.
1983-01-01
The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical tests results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs, and input and output data.
2016-06-01
Topics include theories of the mammalian visual system and the exploitation of descriptive text that may accompany a still image for improved inference. The focus of the Brown team was on single images. Keywords: computer vision, semantic description, street scenes, belief propagation, generative models, nonlinear filtering, sufficient statistics.
Physics-based statistical model and simulation method of RF propagation in urban environments
Pao, Hsueh-Yuan; Dvorak, Steven L.
2010-09-14
A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient closed-form parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.
Statistical Analysis of Research Data | Center for Cancer Research
Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data. The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.
[Application of statistics on chronic-diseases-relating observational research papers].
Hong, Zhi-heng; Wang, Ping; Cao, Wei-hua
2012-09-01
To assess the application of statistics in chronic-disease-related observational research papers recently published in Chinese Medical Association journals with an impact factor above 0.5. Using a self-developed criterion, two investigators independently assessed the application of statistics in each paper; differences of opinion were resolved through discussion. A total of 352 papers from 6 journals (the Chinese Journal of Epidemiology, Chinese Journal of Oncology, Chinese Journal of Preventive Medicine, Chinese Journal of Cardiology, Chinese Journal of Internal Medicine and Chinese Journal of Endocrinology and Metabolism) were reviewed. The rates of clear statement of research objectives, target population, sampling issues, inclusion criteria and variable definitions were 99.43%, 98.57%, 95.43%, 92.86% and 96.87%, respectively. The rates of correct description of quantitative and qualitative data were 90.94% and 91.46%, respectively. The rates of correctly expressed results for statistical inference methods related to quantitative data, qualitative data and modeling were 100%, 95.32% and 87.19%, respectively, and 89.49% of the conclusions responded directly to the research objectives. However, 69.60% of the papers did not state the exact name of the study design used, and 11.14% lacked a statement of the exclusion criteria. Only 5.16% of the papers clearly explained the sample size estimation, and only 24.21% clearly described the variable value assignment. The rate of describing the statistical software and database methods used was only 24.15%. In addition, 18.75% of the papers did not express the statistical inference methods sufficiently, and a quarter did not use 'standardization' appropriately.
Regarding statistical inference, only 24.12% of the papers described the prerequisites of the statistical tests used, while 9.94% did not employ the statistical inference method that should have been used. The main deficiencies in the application of statistics in chronic-disease-related observational research papers were as follows: lack of sample size determination, insufficient description of variable value assignment, statistical methods not introduced clearly or properly, and lack of consideration of the prerequisites for statistical inference.
Geochemistry of sediments in the Northern and Central Adriatic Sea
NASA Astrophysics Data System (ADS)
De Lazzari, A.; Rampazzo, G.; Pavoni, B.
2004-03-01
Major, minor and trace elements, loss on ignition, specific surface area, quantities of calcite and dolomite, qualitative mineralogical composition, grain-size distribution and organic micropollutants (PAH, PCB, DDT) were determined on surficial marine sediments sampled during the 1990 ASCOP (Adriatic Scientific Cooperative Program) cruise. Mineralogical composition and carbonate content of the samples were found to be comparable with data previously reported in the literature, whereas the geochemical composition and distribution of major, minor and trace elements for samples in international waters and in the central basin have never been reported before. The large amount of information contained in the variables of different origin was processed by means of a comprehensive approach which establishes the relations among the components through the mathematical-statistical calculation of principal components (factors). These account for the major part of the data variance, losing only marginal parts of the information, and are independent of the units of measure. The sample descriptors concerning natural components and contamination load are discussed by means of a statistical model based on an R-mode factor analysis calculating four significant factors which explain 86.8% of the total variance and represent important relationships between grain size, mineralogy, geochemistry and organic micropollutants. A description and an interpretation of factor composition is discussed on the basis of pollution inputs, basin geology and hydrodynamics. The areal distribution of the factors showed that the fine grain-size fraction, with oxides and hydroxides of colloidal origin, is the main means of transport and thus the principal link between the chemical, physical and granulometric elements in the Adriatic.
Solving large scale structure in ten easy steps with COLA
NASA Astrophysics Data System (ADS)
Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.
2013-06-01
We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_solar/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_solar/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
[Active Substance Index (AKS) percentile distribution in pediatric ages].
Henriquez-Pérez, Gladys; Rached-Paoli, Ingrid; Azuaje-Sánchez, Arelis
2009-12-01
The aim of this study was to discern the percentile distribution of the Active Substance Index (AKS) in boys and girls aged 4 to 9 years in order to obtain reference values for this indicator. This index was calculated in 3634 healthy and well-nourished children with normal stature from a poor urban community at Centro de Atención Nutricional Infantil Antímano (CANIA), within the period between January 1999 and December 2007. Children with a history of prematurity, pubertal growth spurts, or chronic pathologies, whether defined or under study, were excluded. The Dugdale & Griffiths two-skinfold equation for boys and girls shorter than 150 cm and 140 cm, respectively, was used to obtain the fat body mass required to estimate the AKS index. The variables were measured by standardized anthropometry technicians, with quality control every 4 months as recommended by international standards. Descriptive statistics of the AKS index and the variables used for its calculation were obtained, as well as index percentiles 3, 10, 25, 50, 75, 90, and 97. Tests applied included Kolmogorov-Smirnov, one-way ANOVA, Chi-square, Tukey and bivariate correlations (p < 0.05). The AKS index exhibited higher values in the boys, decreasing with age in both sexes, ranging from 1.28 to 1.04 in the boys and from 1.17 to 0.94 in the girls. Statistically significant differences were found for each age and sex. These results provide the AKS index percentile distribution values needed for nutritional assessments in pediatric ages. These values should be validated and their effectiveness should be studied.
Effect of esthetic core shades on the final color of IPS Empress all-ceramic crowns.
Azer, Shereen S; Ayash, Ghada M; Johnston, William M; Khalil, Moustafa F; Rosenstiel, Stephen F
2006-12-01
Clinically relevant assessment of all-ceramic crowns supported by esthetic composite resin foundations has not been evaluated with regard to color reproducibility. This in vitro study quantitatively evaluated the influence of different shades of composite resin foundations and resin cement on the final color of a leucite-reinforced all-ceramic material. A total of 128 disks were fabricated; 64 (20 x 1 mm) were made of all-ceramic material (IPS Empress) and 64 (20 x 4 mm) of composite resin (Tetric Ceram) in 4 different shades. The ceramic and composite resin disks were luted using 2 shades (A3 and Transparent) of resin cement (Variolink II). Color was measured using a colorimeter configured with a diffuse illumination/0-degree viewing geometry, and Commission Internationale de l'Eclairage (CIE) L*a*b* values were directly calculated. Descriptive statistical analysis was performed, and color differences (ΔE) for the average L*, a* and b* color parameters were calculated. Repeated measures analysis of variance (ANOVA) was used to compare mean values and SDs between the different color combinations (alpha=.05). The CIE L*a*b* color coordinate values showed no significant differences for variation in color parameters due to the effect of the different composite resin shades (P=.24) or cement shades (P=.12). The mean color difference (ΔE) value between the groups was 0.8. Within the limitations of this study, the use of different shades for composite resin cores and resin cements presented no statistically significant effect on the final color of IPS Empress all-ceramic material.
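The ΔE computed from L*, a* and b* values is, in its classic CIE76 form, simply the Euclidean distance in CIELAB space. A minimal sketch with hypothetical colorimeter readings (the two shade combinations below are invented for illustration):

```python
from math import sqrt

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two CIELAB colors."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical (L*, a*, b*) readings for two crown/core shade combinations
crown_a = (72.4, 1.8, 18.2)
crown_b = (72.9, 1.5, 17.8)

# Differences around 1 or below are often cited as imperceptible,
# consistent with the study's mean deltaE of 0.8 being clinically negligible
print(f"deltaE = {delta_e(crown_a, crown_b):.2f}")
```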
Evaluated teletherapy source library
Cox, Lawrence J.; Schach Von Wittenau, Alexis E.
2000-01-01
The Evaluated Teletherapy Source Library (ETSL) is a system of hardware and software that provides for maintenance of a library of useful phase space descriptions (PSDs) of teletherapy sources used in radiation therapy for cancer treatment. The PSDs are designed to be used by PEREGRINE, the all-particle Monte Carlo dose calculation system. ETSL also stores other relevant information such as monitor unit factors (MUFs) for use with the PSDs, results of PEREGRINE calculations using the PSDs, clinical calibration measurements, and geometry descriptions sufficient for calculational purposes. Not all of this information is directly needed by PEREGRINE. ETSL is also capable of acting as a repository for the Monte Carlo simulation history files from which the generic PSDs are derived.
Environmental statistics with S-Plus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Millard, S.P.; Neerchal, N.K.
1999-12-01
The combination of easy-to-use software with easy access to a description of the statistical methods (definitions, concepts, etc.) makes this book an excellent resource. One of the major features of this book is the inclusion of general information on environmental statistical methods and examples of how to implement these methods using the statistical software package S-Plus and the add-in modules Environmental-Stats for S-Plus, S+SpatialStats, and S-Plus for ArcView.
Cheyney, Melissa; Bovbjerg, Marit; Everson, Courtney; Gordon, Wendy; Hannibal, Darcy; Vedam, Saraswathi
2014-01-01
In 2004, the Midwives Alliance of North America's (MANA's) Division of Research developed a Web-based data collection system to gather information on the practices and outcomes associated with midwife-led births in the United States. This system, called the MANA Statistics Project (MANA Stats), grew out of a widely acknowledged need for more reliable data on outcomes by intended place of birth. This article describes the history and development of the MANA Stats birth registry and provides an analysis of the 2.0 dataset's content, strengths, and limitations. Data collection and review procedures for the MANA Stats 2.0 dataset are described, along with methods for the assessment of data accuracy. We calculated descriptive statistics for client demographics and contributing midwife credentials, and assessed the quality of data by calculating point estimates, 95% confidence intervals, and kappa statistics for key outcomes on pre- and postreview samples of records. The MANA Stats 2.0 dataset (2004-2009) contains 24,848 courses of care, 20,893 of which are for women who planned a home or birth center birth at the onset of labor. The majority of these records were planned home births (81%). Births were attended primarily by certified professional midwives (73%), and clients were largely white (92%), married (87%), and college-educated (49%). Data quality analyses of 9932 records revealed no differences between pre- and postreviewed samples for 7 key benchmarking variables (kappa, 0.98-1.00). The MANA Stats 2.0 data were accurately entered by participants; any errors in this dataset are likely random and not systematic. The primary limitation of the 2.0 dataset is that the sample was captured through voluntary participation; thus, it may not accurately reflect population-based outcomes. The dataset's primary strength is that it will allow for the examination of research questions on normal physiologic birth and midwife-led birth outcomes by intended place of birth. 
© 2014 by the American College of Nurse-Midwives.
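The kappa statistics used above to compare pre- and post-review samples measure agreement beyond chance. A sketch of Cohen's kappa for two paired codings (the `cohens_kappa` helper and the example codings are hypothetical, not part of MANA Stats; values near 1, like the 0.98-1.00 reported, indicate near-perfect agreement):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two equal-length lists of categorical codes."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    # Observed agreement: fraction of positions where the codes match
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement by chance, from each rater's marginal frequencies
    expected = sum((rater1.count(l) / n) * (rater2.count(l) / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical pre- vs. post-review codings of one benchmarking variable
pre  = ["home", "home", "center", "home", "center", "home"]
post = ["home", "home", "center", "home", "home",   "home"]
print(f"kappa = {cohens_kappa(pre, post):.2f}")
```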
ERIC Educational Resources Information Center
Barber, Betsy; Ball, Rhonda
This project description is designed to show how graphing calculators and calculator-based laboratories (CBLs) can be used to explore topics in physics and health sciences. The activities address such topics as respiration, heart rate, and the circulatory system. Teaching notes and calculator instructions are included as are blackline masters. (MM)
Statistical Package User’s Guide.
1980-08-01
Contents include STACH (nonparametric descriptive statistics) and CHIRA (coefficient of concordance). Test data: the programs were tested using data from John Neter and William Wasserman, Applied Linear Statistical Models. Input requires the length of the data file and a new file name (not the same as the raw data file); printout is as optioned only. Ranked data are used for program CHIRA.
Anger and depression levels of mothers with premature infants in the neonatal intensive care unit.
Kardaşözdemir, Funda; Akgün Şahin, Zümrüt
2016-02-04
The aim of this study was to examine the anger and depression levels of mothers who had a premature infant in the NICU, and the factors affecting them. This descriptive study was performed in the level I and II NICU units of three state hospitals in Turkey. The data were collected with a demographic questionnaire, the Beck Depression Inventory and the Anger Expression Scale. Descriptive statistics, parametric and nonparametric statistical tests and Pearson correlation were used in the data analysis. Mothers whose infants were under care in the NICU had moderate depression. Mothers' educational level, income level and the gender of the infants were statistically significant factors (p < 0.05). A positive relationship between depression and trait anger scores was statistically significant, and a negative relationship between depression and anger-control scores was also statistically significant (p < 0.05). Based on these results, it is recommended that mothers at risk of depression and anger in the NICU be evaluated by nurses, and that nurses develop their counseling roles.
McMullan, Miriam; Jones, Ray; Lea, Susan
2010-04-01
This paper is a report of a correlational study of the relations of age, status, experience and drug calculation ability to numerical ability of nursing students and Registered Nurses. Competent numerical and drug calculation skills are essential for nurses as mistakes can put patients' lives at risk. A cross-sectional study was carried out in 2006 in one United Kingdom university. Validated numerical and drug calculation tests were given to 229 second year nursing students and 44 Registered Nurses attending a non-medical prescribing programme. The numeracy test was failed by 55% of students and 45% of Registered Nurses, while 92% of students and 89% of nurses failed the drug calculation test. Independent of status or experience, older participants (> or = 35 years) were statistically significantly more able to perform numerical calculations. There was no statistically significant difference between nursing students and Registered Nurses in their overall drug calculation ability, but nurses were statistically significantly more able than students to perform basic numerical calculations and calculations for solids, oral liquids and injections. Both nursing students and Registered Nurses were statistically significantly more able to perform calculations for solids, oral liquids and injections than calculations for drug percentages, drip and infusion rates. To prevent deskilling, Registered Nurses should continue to practise and refresh all the different types of drug calculations as often as possible with regular (self)-testing of their ability. Time should be set aside in curricula for nursing students to learn how to perform basic numerical and drug calculations. This learning should be reinforced through regular practice and assessment.
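The solid, oral-liquid and injection calculations tested in such studies typically reduce to the standard nursing formula, volume = (prescribed dose / stock strength) x stock volume. A minimal sketch with an invented oral-liquid example (the numbers are illustrative, not from the study's test):

```python
def dose_volume(prescribed_mg, stock_mg, stock_ml):
    """'What you want, over what you've got, times what it's in.'"""
    return prescribed_mg / stock_mg * stock_ml

# Hypothetical example: 250 mg prescribed; stock suspension is 125 mg per 5 mL
print(f"{dose_volume(250, 125, 5):.1f} mL")  # 10.0 mL
```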
Tonkin, Matthew J.; Tiedeman, Claire; Ely, D. Matthew; Hill, Mary C.
2007-01-01
The OPR-PPR program calculates the Observation-Prediction (OPR) and Parameter-Prediction (PPR) statistics that can be used to evaluate the relative importance of various kinds of data to simulated predictions. The data considered fall into three categories: (1) existing observations, (2) potential observations, and (3) potential information about parameters. The first two are addressed by the OPR statistic; the third is addressed by the PPR statistic. The statistics are based on linear theory and measure the leverage of the data, which depends on the location, the type, and possibly the time of the data being considered. For example, in a ground-water system the type of data might be a head measurement at a particular location and time. As a measure of leverage, the statistics do not take into account the value of the measurement. As linear measures, the OPR and PPR statistics require minimal computational effort once sensitivities have been calculated. Sensitivities need to be calculated for only one set of parameter values; commonly these are the values estimated through model calibration. OPR-PPR can calculate the OPR and PPR statistics for any mathematical model that produces the necessary OPR-PPR input files. In this report, OPR-PPR capabilities are presented in the context of using the ground-water model MODFLOW-2000 and the universal inverse program UCODE_2005. The method used to calculate the OPR and PPR statistics is based on the linear equation for prediction standard deviation. 
Using sensitivities and other information, OPR-PPR calculates (a) the percent increase in the prediction standard deviation that results when one or more existing observations are omitted from the calibration data set; (b) the percent decrease in the prediction standard deviation that results when one or more potential observations are added to the calibration data set; or (c) the percent decrease in the prediction standard deviation that results when potential information on one or more parameters is added.
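The OPR statistic above can be illustrated with a toy linear calculation. All sensitivities below are invented, and unit observation weights and unit error variance are assumed (simplifications the real OPR-PPR program does not impose); the prediction standard deviation is sqrt(dz^T (X^T X)^-1 dz) for sensitivity matrix X and prediction sensitivity vector dz:

```python
from math import sqrt

def pred_sd(X, dz):
    """Linear prediction standard deviation for a two-parameter model,
    assuming unit weights and unit error variance; (X^T X)^-1 done analytically."""
    a = sum(r[0] * r[0] for r in X)
    b = sum(r[0] * r[1] for r in X)
    d = sum(r[1] * r[1] for r in X)
    det = a * d - b * b
    inv = [[d / det, -b / det], [-b / det, a / det]]
    var = sum(dz[i] * inv[i][j] * dz[j] for i in range(2) for j in range(2))
    return sqrt(var)

# Hypothetical sensitivities of 4 head observations, and of the prediction,
# to the same 2 parameters
X = [[1.0, 0.2], [0.8, 0.5], [0.3, 0.9], [0.6, 0.4]]
dz = [0.7, 0.6]

base = pred_sd(X, dz)
omit = pred_sd(X[1:], dz)            # drop the first observation
opr = 100 * (omit - base) / base     # percent increase in prediction sd
print(f"OPR for observation 1: {opr:.1f}% increase")
```

A large OPR value flags an observation whose removal would substantially inflate prediction uncertainty, i.e. an observation with high leverage on the prediction.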
Statistics of high-level scene context.
Greene, Michelle R
2013-01-01
Context is critical for recognizing environments and for searching for objects within them: contextual associations have been shown to modulate reaction time and object recognition accuracy, as well as influence the distribution of eye movements and patterns of brain activations. However, we have not yet systematically quantified the relationships between objects and their scene environments. Here I seek to fill this gap by providing descriptive statistics of object-scene relationships. A total of 48,167 objects were hand-labeled in 3499 scenes using the LabelMe tool (Russell et al., 2008). From these data, I computed a variety of descriptive statistics at three different levels of analysis: the ensemble statistics that describe the density and spatial distribution of unnamed "things" in the scene; the bag of words level where scenes are described by the list of objects contained within them; and the structural level where the spatial distribution and relationships between the objects are measured. The utility of each level of description for scene categorization was assessed through the use of linear classifiers, and the plausibility of each level for modeling human scene categorization is discussed. Of the three levels, ensemble statistics were found to be the most informative (per feature), and also best explained human patterns of categorization errors. Although a bag of words classifier had similar performance to human observers, it had a markedly different pattern of errors. However, certain objects are more useful than others, and ceiling classification performance could be achieved using only the 64 most informative objects. As object location tends not to vary as a function of category, structural information provided little additional information.
Additionally, these data provide valuable information on natural scene redundancy that can be exploited for machine vision, and can help the visual cognition community to design experiments guided by statistics rather than intuition.
Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2003
Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Moran, Seth C.; Sanchez, John J.; McNutt, Stephen R.; Estes, Steve; Paskievitch, John
2004-01-01
The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988. The primary objectives of this program are the near real time seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents the calculated earthquake hypocenter and phase arrival data, and changes in the seismic monitoring program for the period January 1 through December 31, 2003. The AVO seismograph network was used to monitor the seismic activity at twenty-seven volcanoes within Alaska in 2003. These include Mount Wrangell, Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, Katmai volcanic cluster (Snowy Mountain, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin), Aniakchak Crater, Mount Veniaminof, Pavlof Volcano, Mount Dutton, Isanotski Peaks, Shishaldin Volcano, Fisher Caldera, Westdahl Peak, Akutan Peak, Makushin Volcano, Okmok Caldera, Great Sitkin Volcano, Kanaga Volcano, Tanaga Volcano, and Mount Gareloi. Monitoring highlights in 2003 include: continuing elevated seismicity at Mount Veniaminof in January-April (volcanic unrest began in August 2002), volcanogenic seismic swarms at Shishaldin Volcano throughout the year, and low-level tremor at Okmok Caldera throughout the year. Instrumentation and data acquisition highlights in 2003 were the installation of subnetworks on Tanaga and Gareloi Islands, the installation of broadband stations on Akutan Volcano and Okmok Caldera, and the establishment of telemetry for the Okmok Caldera subnetwork.
AVO located 3911 earthquakes in 2003. This catalog includes: (1) a description of instruments deployed in the field and their locations; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of velocity models used for earthquake locations; (4) a summary of earthquakes located in 2003; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2003.
Bieber, Frederick R; Buckleton, John S; Budowle, Bruce; Butler, John M; Coble, Michael D
2016-08-31
The evaluation and interpretation of forensic DNA mixture evidence faces greater interpretational challenges due to increasingly complex mixture evidence. Such challenges include: casework involving low quantity or degraded evidence leading to allele and locus dropout; allele sharing of contributors leading to allele stacking; and differentiation of PCR stutter artifacts from true alleles. There is variation in the statistical approaches used to evaluate the strength of the evidence when inclusion of a specific known individual(s) is determined, and the approaches used must be supportable. There are concerns that methods utilized for interpretation of complex forensic DNA mixtures may not be implemented properly in some casework. Similar questions are being raised in a number of U.S. jurisdictions, leading to some confusion about mixture interpretation for current and previous casework. Key elements necessary for the interpretation and statistical evaluation of forensic DNA mixtures are described. The most common method for statistical evaluation of DNA mixtures in many parts of the world, including the USA, is the Combined Probability of Inclusion/Exclusion (CPI/CPE); exposition and elucidation of this method and a protocol for its use are the focus of this article. Formulae and other supporting materials are provided. Guidance and details of a DNA mixture interpretation protocol are provided for application of the CPI/CPE method in the analysis of more complex forensic DNA mixtures. This description, in turn, should help reduce the variability of interpretation with application of this methodology and thereby improve the quality of DNA mixture interpretation throughout the forensic community.
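The core CPI arithmetic is simple: at each locus, sum the population frequencies of all alleles observed in the mixture and square the sum; the combined value is the product across loci. A minimal sketch with invented allele frequencies (real casework applies stochastic thresholds, dropout considerations and other corrections this sketch omits):

```python
from functools import reduce

def cpi(locus_allele_freqs):
    """Combined Probability of Inclusion: product over loci of
    (sum of frequencies of all alleles observed in the mixture) squared."""
    return reduce(lambda acc, freqs: acc * sum(freqs) ** 2, locus_allele_freqs, 1.0)

# Hypothetical allele frequencies observed at three loci of a mixed profile
loci = [
    [0.10, 0.20, 0.05],   # locus 1: three alleles detected
    [0.15, 0.25],         # locus 2: two alleles detected
    [0.30, 0.10, 0.08],   # locus 3: three alleles detected
]
p_inclusion = cpi(loci)
# Probability that a random, unrelated person would be included as a
# possible contributor; CPE = 1 - CPI
print(f"CPI = {p_inclusion:.6f}")
```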
Yagüe-Sebastián, M M; Coscollar-Escartín, I; Muñoz-Albadalejo, P; López-Canales, M C; Villaverde-Royo, M V; Gutiérrez-Moreno, F
2013-09-01
To describe the prescribing of topical non-steroidal anti-inflammatory drugs (NSAIDs) in an urban health center (Zaragoza, Spain). A cross-sectional descriptive study was conducted on subjects belonging to an urban center, studied during the year 2010. The sample size was calculated for a confidence level of 95%; from a total of 843 prescriptions analyzed, 150 cases were selected by simple random sampling. Prevalences and confidence intervals were calculated using the statistical package STATA 9.1. The most used drug was diclofenac, in 27.33% of cases (95% CI: 20.65-34.88). NSAIDs were most used in females. In 18% of the cases the area of application was the knee, followed by the lower back in 15% (95% CI: 10.22-21.78). There were no adverse reactions. Topical NSAIDs are frequently used in a basic health area. Current recommendations support their use on the knee and the hand, but not on the back, where their use is common. Topical NSAIDs decrease side effects and drug interactions, and their use is therefore recommended in patients on multiple drug therapy and in the elderly. Copyright © 2012 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Published by Elsevier España. All rights reserved.
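The reported 27.33% prevalence corresponds to 41 of the 150 sampled prescriptions. A simple Wald (normal-approximation) interval for that proportion, sketched below, gives limits close to (though not identical to) the interval the authors report, which was presumably computed by STATA with a different method:

```python
from math import sqrt

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = successes / n
    half = z * sqrt(p * (1 - p) / n)
    return p - half, p + half

# 41 of 150 sampled prescriptions were diclofenac (41/150 = 27.33%)
low, high = wald_ci(41, 150)
print(f"27.33% (approx. 95% CI: {low:.1%} - {high:.1%})")
```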
Lyssenko, Nathalie; Redies, Christoph; Hayn-Leichsenring, Gregor U
2016-01-01
One of the major challenges in experimental aesthetics is the uncertainty of the terminology used in experiments. In this study, we recorded terms that are spontaneously used by participants to describe abstract artworks and studied their relation to the second-order statistical image properties of the same artworks (Experiment 1). We found that the usage frequency of some structure-describing terms correlates with statistical image properties, such as PHOG Self-Similarity, Anisotropy and Complexity. Additionally, emotion-associated terms correlate with measured color values. Next, based on the most frequently used terms, we created five different rating scales (Experiment 2) and obtained ratings of participants for the abstract paintings on these scales. We found significant correlations between descriptive score ratings (e.g., between structure and subjective complexity), between evaluative and descriptive score ratings (e.g., between preference and subjective complexity/interest) and between descriptive score ratings and statistical image properties (e.g., between interest and PHOG Self-Similarity, Complexity and Anisotropy). Additionally, we determined the participants' personality traits as described in the 'Big Five Inventory' (Goldberg, 1990; Rammstedt and John, 2005) and correlated them with the ratings and preferences of individual participants. Participants with higher scores for Neuroticism showed preferences for objectively more complex images, as well as a different notion of the term complex when compared with participants with lower scores for Neuroticism. In conclusion, this study demonstrates an association between objectively measured image properties and the subjective terms that participants use to describe or evaluate abstract artworks. Moreover, our results suggest that the description of abstract artworks, their evaluation and the preference of participants for their low-level statistical properties are linked to personality traits.
Lyssenko, Nathalie; Redies, Christoph; Hayn-Leichsenring, Gregor U.
2016-01-01
One of the major challenges in experimental aesthetics is the uncertainty of the terminology used in experiments. In this study, we recorded terms that are spontaneously used by participants to describe abstract artworks and studied their relation to the second-order statistical image properties of the same artworks (Experiment 1). We found that the usage frequency of some structure-describing terms correlates with statistical image properties, such as PHOG Self-Similarity, Anisotropy and Complexity. Additionally, emotion-associated terms correlate with measured color values. Next, based on the most frequently used terms, we created five different rating scales (Experiment 2) and obtained ratings of participants for the abstract paintings on these scales. We found significant correlations between descriptive score ratings (e.g., between structure and subjective complexity), between evaluative and descriptive score ratings (e.g., between preference and subjective complexity/interest) and between descriptive score ratings and statistical image properties (e.g., between interest and PHOG Self-Similarity, Complexity and Anisotropy). Additionally, we determined the participants’ personality traits as described in the ‘Big Five Inventory’ (Goldberg, 1990; Rammstedt and John, 2005) and correlated them with the ratings and preferences of individual participants. Participants with higher scores for Neuroticism showed preferences for objectively more complex images, as well as a different notion of the term complex when compared with participants with lower scores for Neuroticism. In conclusion, this study demonstrates an association between objectively measured image properties and the subjective terms that participants use to describe or evaluate abstract artworks. Moreover, our results suggest that the description of abstract artworks, their evaluation and the preference of participants for their low-level statistical properties are linked to personality traits. 
PMID:27445933
A quantitative study of the clustering of polycyclic aromatic hydrocarbons at high temperatures.
Totton, Tim S; Misquitta, Alston J; Kraft, Markus
2012-03-28
The clustering of polycyclic aromatic hydrocarbon (PAH) molecules is investigated in the context of soot particle inception and growth using an isotropic potential developed from the benchmark PAHAP potential. This potential is used to estimate equilibrium constants of dimerisation for five representative PAH molecules based on a statistical mechanics model. Molecular dynamics simulations are also performed to study the clustering of homomolecular systems at a range of temperatures. The results from both sets of calculations demonstrate that at flame temperatures pyrene (C(16)H(10)) dimerisation cannot be a key step in soot particle formation and that much larger molecules (e.g. circumcoronene, C(54)H(18)) are required to form small clusters at flame temperatures. The importance of using accurate descriptions of the intermolecular interactions is demonstrated by comparing results to those calculated with a popular literature potential with an order of magnitude variation in the level of clustering observed. By using an accurate intermolecular potential we are able to show that physical binding of PAH molecules based on van der Waals interactions alone can only be a viable soot inception mechanism if concentrations of large PAH molecules are significantly higher than currently thought.
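The statistical-mechanics route from a dimer binding free energy to an equilibrium constant of dimerisation can be summarized as follows (the notation is assumed for illustration, not taken from the paper):

```latex
K_{\mathrm{eq}}(T) = \exp\!\left(-\frac{\Delta G_{\mathrm{dim}}(T)}{k_B T}\right),
\qquad
\Delta G_{\mathrm{dim}}(T) = G_{\mathrm{dimer}}(T) - 2\,G_{\mathrm{monomer}}(T)
```

Because k_B T grows with temperature while the van der Waals binding energy does not, the equilibrium constant falls steeply at flame temperatures, which is the abstract's argument for why pyrene dimerisation alone cannot seed soot.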
Correlating N2 and CH4 adsorption on microporous carbon using a new analytical model
Sun, Jielun; Chen, S.; Rood, M.J.; Rostam-Abadi, M.
1998-01-01
A new pore size distribution (PSD) model is developed to readily describe PSDs of microporous materials with an analytical expression. Results from this model can be used to calculate the corresponding adsorption isotherm to compare the calculated isotherm to the experimental isotherm. This aspect of the model provides another check on the validity of the model's results. The model is developed on the basis of a 3-D adsorption isotherm equation that is derived from statistical mechanical principles. Least-squares error minimization is used to solve the PSD without any preassumed distribution function. In comparison with several well-accepted analytical methods from the literature, this 3-D model offers a relatively realistic PSD description for select reference materials, including activated-carbon fibers. N2 and CH4 adsorption is correlated using the 3-D model for commercial carbons BPL and AX-21. Predicted CH4 adsorption isotherms at 296 K based on N2 adsorption at 77 K are in reasonable agreement with experimental CH4 isotherms. Use of the model is also described for characterizing PSDs of tire-derived activated carbons and coal-derived activated carbons for air-quality control applications.
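The least-squares recovery of a PSD without any preassumed distribution function can be sketched as a discretized nonnegative least-squares problem; the kernel, pore-width grid, and "true" PSD below are invented placeholders, not the paper's 3-D isotherm equation:

```python
import numpy as np
from scipy.optimize import nnls

# Toy discretization: isotherm(p) = sum over widths w of kernel(p, w) * psd(w)
pressures = np.linspace(0.01, 0.99, 40)
widths = np.linspace(0.4, 2.0, 25)           # pore widths, nm (hypothetical grid)

# Hypothetical Langmuir-like kernel: narrower pores fill at lower pressure
K = pressures[:, None] / (pressures[:, None] + 0.05 * widths[None, :])

true_psd = np.exp(-0.5 * ((widths - 0.9) / 0.15) ** 2)   # assumed PSD for the demo
isotherm = K @ true_psd                       # synthetic "experimental" isotherm

# Solve for the PSD with no preassumed functional form, nonnegativity only
psd_fit, residual = nnls(K, isotherm)
print(round(residual, 6))                     # residual near zero: an exact
                                              # nonnegative solution exists
```

Comparing `K @ psd_fit` against the measured isotherm is the consistency check the abstract describes: a recovered PSD is only trusted if it reproduces the experimental isotherm.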
NASA Astrophysics Data System (ADS)
Miller, Steven D.
1995-05-01
Standard Monte Carlo methods used in photon diffusion score absorbed photons or statistical weight deposited within voxels comprising a mesh. An alternative approach to a stochastic description is considered for rapid surface flux calculations in finite media. Matrix elements are assigned to a spatial lattice whose function is to score vector intersections of scattered photons making transitions into either the forward or backward solid-angle half spaces. These complete matrix elements can be related to the directional fluxes within the lattice space. This model differentiates between ballistic, quasi-ballistic, and highly diffuse photon contributions, and effectively models the subsurface generation of a scattered light flux from a ballistic source. The connection between a path integral and diffusion is illustrated. Flux perturbations can be effectively illustrated for tissue-tumor-tissue and for three-layer systems with strong absorption in one or more layers. Under conditions where diffusion theory has difficulties, such as strong absorption, highly collimated sources, small finite volumes, and subsurface regions, the algorithm computes rapidly with good accuracy and complements other descriptions of photon diffusion. The model has the potential to do computations relevant to photodynamic therapy (PDT) and analysis of laser beam interaction with tissues.
Audiology practice management in South Africa: What audiologists know and what they should know
Kritzinger, Alta; Soer, Maggi
2015-01-01
Background In future, the South African Department of Health aims to purchase services from accredited private service providers. Successful private audiology practices can help to address issues of access, equity and quality of health services. It is not sufficient to be an excellent clinician, since audiology practices are businesses that must also be managed effectively. Objective The objective was to determine the existing and required levels of practice management knowledge as perceived by South African audiologists. Method An electronic descriptive survey was used to investigate audiology practice management amongst South African audiologists. A total of 147 respondents completed the survey. Results were analysed by calculating descriptive statistics. The Z-proportional test was used to identify significant differences between existing and required levels of practice management knowledge. Results Significant differences were found between existing and required levels of knowledge regarding all eight practice management tasks, particularly legal and ethical issues, marketing, and accounting. There were small differences in the knowledge required for practice management tasks amongst respondents working in public and private settings. Conclusion Irrespective of their work context, respondents showed that they need significant expansion of practice management knowledge in order to be successful, to compete effectively and to make sense of a complex marketplace. PMID:26809158
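The Z-proportional test mentioned here is a standard two-proportion z-test. A minimal sketch, with invented counts rather than the survey's actual data:

```python
from math import sqrt, erf

def two_proportion_z(success1, n1, success2, n2):
    """Two-sided z-test for a difference between two independent proportions."""
    p1, p2 = success1 / n1, success2 / n2
    p = (success1 + success2) / (n1 + n2)                  # pooled proportion
    z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Hypothetical counts: 60 of 147 respondents report adequate existing knowledge
# of a task, versus 120 of 147 reporting that knowledge of it is required.
z, p = two_proportion_z(60, 147, 120, 147)
print(round(z, 2), p < 0.05)    # prints: -7.18 True
```

A large |z| (equivalently, a small p-value) flags a significant gap between existing and required knowledge for that task.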
Fleminger, Jessica; Goldacre, Ben
2018-01-01
Trial registries are a key source of information for clinicians and researchers. While building OpenTrials, an open database of public trial information, we identified errors and omissions in registries, including discrepancies between descriptions of the same trial in different registries. We set out to ascertain the prevalence of discrepancies in trial completion status using a cohort of trials registered on both the European Union Clinical Trials Register (EUCTR) and ClinicalTrials.gov. We used matching titles and registry IDs provided by both registries to build a cohort of dual-registered trials. Completion statuses were compared; we calculated descriptive statistics on the prevalence of discrepancies. 11,988 dual-registered trials were identified. 1,496 did not provide a comparable completion status, leaving 10,492 trials. 16.2% were discrepant on completion status. The majority of discrepancies (90.5%) were a 'completed' trial on ClinicalTrials.gov inaccurately marked as 'ongoing' on EUCTR. Overall, 33.9% of dual-registered trials described as 'ongoing' on EUCTR were listed as 'completed' on ClinicalTrials.gov. Completion status on registries is commonly inaccurate. Previous work on publication bias may underestimate non-reporting. We describe simple steps registry owners and trialists could take to improve accuracy.
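The comparison step in this design reduces to tallying status pairs across registries. As an illustration, with a toy cohort of invented status pairs rather than the study's data:

```python
# Hypothetical dual-registry records: (EUCTR status, ClinicalTrials.gov status)
records = [
    ("Ongoing", "Completed"),
    ("Completed", "Completed"),
    ("Ongoing", "Ongoing"),
    ("Completed", "Completed"),
    ("Ongoing", "Completed"),
]

# Keep only records where both registries report a comparable status
comparable = [r for r in records if all(s in ("Ongoing", "Completed") for s in r)]
discrepant = [r for r in comparable if r[0] != r[1]]
prevalence = 100 * len(discrepant) / len(comparable)
print(f"{prevalence:.1f}% discrepant")   # 40.0% for this toy cohort
```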
Navarro-Sandoval, Cristina; Uriostegui-Espíritu, Lizbeth Carlota; Delgado-Quiñones, Edna Gabriela; Sahagún-Cuevas, Minerva Natalia
2017-01-01
According to the National Health and Nutrition Survey of 2012, more than a quarter of older adults (26.9%) have some degree of disability, requiring a primary caregiver to perform basic activities of daily living. The aim is to determine the prevalence of depression and burden in primary caregivers of elderly persons with physical dependence. A descriptive cross-sectional study with non-probability sampling that included the primary caregivers of elderly patients with physical dependence. The Barthel scale was used to measure the level of physical dependence in the elderly patients, while the Beck Depression Inventory and the Zarit scale were administered to the primary caregivers to assess depression and caregiver burden. A sample of 76 primary caregivers was calculated and descriptive statistical analysis was performed. Of the 76 primary caregivers, 55.3% had no depression, 32.9% had mild depression, and 11.8% had moderate depression. According to the Zarit scale, 40.8% had no burden, 44.7% had light burden, and 14.5% had intense burden. The role of primary caregiver is a stressful task that can interfere with family health; our role is therefore to provide care not only to dependent geriatric patients, but also to their caregivers.
Precalculus Teachers' Perspectives on Using Graphing Calculators: An Example from One Curriculum
ERIC Educational Resources Information Center
Karadeniz, Ilyas; Thompson, Denisse R.
2018-01-01
Graphing calculators are hand-held technological tools currently used in mathematics classrooms. Teachers' perspectives on using graphing calculators are important in terms of exploring what teachers think about using such technology in advanced mathematics courses, particularly precalculus courses. A descriptive intrinsic case study was conducted…
Filter Tuning Using the Chi-Squared Statistic
NASA Technical Reports Server (NTRS)
Lilly-Salkowski, Tyler B.
2017-01-01
This paper examines the use of the Chi-square statistic as a means of evaluating filter performance. The goal of the process is to characterize the filter performance in the metric of covariance realism. The Chi-squared statistic is the value calculated to determine the realism of a covariance based on the prediction accuracy and the covariance values at a given point in time. Once calculated, it is the distribution of this statistic that provides insight on the accuracy of the covariance. The process of tuning an Extended Kalman Filter (EKF) for Aqua and Aura support is described, including examination of the measurement errors of available observation types, and methods of dealing with potentially volatile atmospheric drag modeling. Predictive accuracy and the distribution of the Chi-squared statistic, calculated from EKF solutions, are assessed.
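The covariance-realism check described here can be sketched in a few lines: for a prediction error vector e and claimed covariance P, the statistic is e'P⁻¹e, which should follow a chi-square distribution with n degrees of freedom when P is realistic. The 3-state example below is hypothetical, not the Aqua/Aura filter:

```python
import numpy as np

def chi_squared_statistic(error, covariance):
    """Mahalanobis-style statistic e' P^-1 e for one prediction error."""
    e = np.asarray(error, dtype=float)
    P = np.asarray(covariance, dtype=float)
    return float(e @ np.linalg.solve(P, e))

# Hypothetical check: draw errors from the claimed covariance and verify that
# the statistic's sample mean approaches n (the state dimension).
rng = np.random.default_rng(0)
P = np.diag([4.0, 1.0, 0.25])           # claimed 3x3 covariance
errors = rng.multivariate_normal(np.zeros(3), P, size=5000)
eps = [chi_squared_statistic(e, P) for e in errors]
print(round(np.mean(eps), 2))           # should be near 3 (degrees of freedom)
```

In tuning, a sample mean well above n suggests the filter's covariance is too optimistic; well below n, too conservative.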
Statistical analysis of vehicle crashes in Mississippi based on crash data from 2010 to 2014.
DOT National Transportation Integrated Search
2017-08-15
Traffic crash data from 2010 to 2014 were collected by Mississippi Department of Transportation (MDOT) and extracted for the study. Three tasks were conducted in this study: (1) geographic distribution of crashes; (2) descriptive statistics of crash ...
Using Carbon Emissions Data to "Heat Up" Descriptive Statistics
ERIC Educational Resources Information Center
Brooks, Robert
2012-01-01
This article illustrates using carbon emissions data in an introductory statistics assignment. The carbon emissions data has desirable characteristics including: choice of measure; skewness; and outliers. These complexities allow research and public policy debate to be introduced. (Contains 4 figures and 2 tables.)
Statistical mechanics of economics I
NASA Astrophysics Data System (ADS)
Kusmartsev, F. V.
2011-02-01
We show that statistical mechanics is useful in the description of financial crises and economics. Taking a large number of instant snapshots of a market over an interval of time, we construct their ensembles and study their statistical interference. This results in a probability description of the market and gives capital, money, income, wealth and debt distributions, which in most cases take the form of the Bose-Einstein distribution. In addition, statistical mechanics provides the main market equations and laws that govern the correlations between the amount of money, debt, product, prices and number of retailers. We applied the relations found to a study of the evolution of the economy of the USA between 1996 and 2008, and observe that over that time the income of the majority of the population is well described by the Bose-Einstein distribution, whose parameters differ from year to year. Each financial crisis corresponds to a peak in the absolute activity coefficient. The analysis correctly indicates the past crises and predicts the future one.
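The Bose-Einstein form referred to above can be written explicitly. In this setting the energy variable would play the role of income, with the chemical potential and temperature as the year-dependent fitted parameters (a sketch of the standard distribution, not notation from the paper):

```latex
f(\varepsilon) = \frac{1}{e^{(\varepsilon - \mu)/k_B T} - 1}
```

Refitting the two parameters year by year is what lets the shape of the income distribution track the business cycle.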
A study of the United States coal resources
NASA Technical Reports Server (NTRS)
Ferm, J. C.; Muthig, P. J.
1982-01-01
Geologically significant coal resources were identified. Statistically controlled tonnage estimates for each resource type were prepared. Particular emphasis was placed on the identification and description of coals in terms of seam thickness, inclination, depth of cover, discontinuities caused by faulting and igneous intrusion, and occurrence as isolated or multiseam deposits. The national resource was organized into six major coal provinces: the Appalachian Plateau, the Interior Basins, the Gulf Coastal Plain, the Rocky Mountain Basins, the High Plains, and North Alaska. Each basin within a province was blocked into subareas of homogeneous coal thickness. Total coal tonnage for a subarea was estimated from an analysis of the cumulative coal thickness derived from borehole or surface section records and subsequently categorized in terms of seam thickness, dip, overburden, multiseam proportions, coal quality, and tonnage impacted by severe faulting and igneous intrusions. Confidence intervals were calculated for both subarea and basin tonnage estimates.
An analysis of urban collisions using an artificial intelligence model.
Mussone, L; Ferrari, A; Oneta, M
1999-11-01
Traditional studies on road accidents estimate the effect of variables (such as vehicular flows, road geometry, and vehicle characteristics) on the number of accidents. A descriptive statistical analysis of the accidents used in the model over the period 1992-1995 is presented. The paper describes an alternative method based on artificial neural networks (ANN) to build a model for the analysis of vehicular accidents in Milan. The ANN model quantifies the degree of danger of urban intersections under different scenarios. The first result is methodological: the innovative use of ANN makes it possible to model urban vehicular accidents. Other results concern the model outputs: intersection complexity may determine a higher accident index depending on how the intersection is regulated, and the highest index for pedestrians being run over occurs at non-signalised intersections at night.
Schulman-Green, Dena; Ercolano, Elizabeth; Lacoursiere, Sheryl; Ma, Tony; Lazenby, Mark; McCorkle, Ruth
2011-06-01
Institute of Medicine reports have identified gaps in health care professionals' knowledge of palliative and end-of-life care, recommending improved education. Our purpose was to develop and administer a Web-based survey to identify the educational needs of multidisciplinary health care professionals who provide this care in Connecticut to inform educational initiatives. We developed an 80-item survey and recruited participants through the Internet and in person. Descriptive and correlational statistics were calculated on 602 surveys. Disciplines reported greater agreement on items related to their routine tasks. Reported needs included dealing with cultural and spiritual matters and having supportive resources at work. Focus groups confirmed results that are consistent with National Consensus Project guidelines for quality palliative care and indicate the End-of-Life Nursing Education Consortium modules for education.
Ehresmann, Bernd; de Groot, Marcel J; Alex, Alexander; Clark, Timothy
2004-01-01
New molecular descriptors based on statistical descriptions of the local ionization potential, local electron affinity, and the local polarizability at the surface of the molecule are proposed. The significance of these descriptors has been tested by calculating them for the Maybridge database in addition to our set of 26 descriptors reported previously. The new descriptors show little correlation with those already in use. Furthermore, the principal components of the extended set of descriptors for the Maybridge data show that especially the descriptors based on the local electron affinity extend the variance in our set of descriptors, which we have previously shown to be relevant to physical properties. The first nine principal components are shown to be most significant. As an example of the usefulness of the new descriptors, we have set up a QSPR model for boiling points using both the old and new descriptors.
Succession Planning in State Health Agencies in the United States: A Brief Report.
Harper, Elizabeth; Leider, Jonathon P; Coronado, Fatima; Beck, Angela J
2017-11-02
Approximately 25% of the public health workforce plans to retire by 2020. Succession planning is a core capability of the governmental public health enterprise; however, limited data are available regarding these efforts in state health agencies (SHAs). We analyzed 2016 Workforce Gaps Survey data regarding succession planning in SHAs using the US Office of Personnel Management's (OPM's) succession planning model, including 6 domains and 27 activities. Descriptive statistics were calculated for all 41 responding SHAs. On average, SHAs self-reported adequately addressing 11 of 27 succession planning activities, with 93% of SHAs adequately addressing 1 or more activities and 61% adequately addressing 1 or more activities in each domain. The majority of OPM-recommended succession planning activities are not being addressed, and limited succession planning occurs across SHAs. Greater activity in the OPM-identified succession planning domains may help SHAs contend with significant turnover and better preserve institutional knowledge.
Reiber, Hansotto
2016-06-01
The physiological and biophysical knowledge base for interpretations of cerebrospinal fluid (CSF) data and reference ranges are essential for the clinical pathologist and neurochemist. With the popular description of the CSF flow dependent barrier function, the dynamics and concentration gradients of blood-derived, brain-derived and leptomeningeal proteins in CSF, or the specificity-independent functions of B-lymphocytes in brain, the neurologist, psychiatrist, neurosurgeon and neuropharmacologist may also find essentials for diagnosis, research or development of therapies. This review may help to replace outdated ideas such as "leakage" models of the barriers, linear immunoglobulin index interpretations or CSF electrophoresis. Calculations, interpretations and analytical pitfalls are described for albumin quotients, quantitation of immunoglobulin synthesis in Reibergrams, oligoclonal IgG, IgM analysis, the polyspecific (MRZ) antibody reaction, the statistical treatment of CSF data and general quality assessment in the CSF laboratory. The diagnostic relevance is documented in an accompanying review.
NASA Technical Reports Server (NTRS)
Canfield, R. C.; Ricchiazzi, P. J.
1980-01-01
An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete redistribution case, yet it achieves a major reduction in computational requirements.
Mukherjee, Sutapa; Rodrigues, Ema; Weker, Robert; Palmer, Lyle J; Christiani, David C
2002-12-01
A repeated measures short-term prospective study was performed in boilermakers to determine occupational polycyclic aromatic hydrocarbon (PAH) exposure using the biomarker, 1-hydroxypyrene (1-OHP). Two work sites were studied; an apprentice school (metal fume exposure) and a boiler overhaul (residual oil fly ash [ROFA] and metal fume exposure). Pre- and postshift urine samples (n = 241; 41 male subjects) were analyzed for cotinine and 1-OHP. Descriptive statistics and generalized estimating equations were calculated. At the apprentice school cross-shift 1-OHP levels did not significantly differ. At the overhaul 1-OHP levels increased during the week in smokers and nonsmokers; in nonsmokers the 1-OHP level increased significantly postshift compared to preshift. In conclusion this study suggests that boilermakers exposed to occupational particulates are exposed to PAH. The urinary 1-OHP level may be a useful biomarker of PAH exposure in boilermakers exposed to ROFA, particularly in nonsmokers.
Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler
2016-01-01
This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
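The Monte Carlo propagation of elemental uncertainties described here can be sketched in a few lines; the derived quantity, distributions, and numbers below are hypothetical placeholders, not the facility's actual measurement model:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000                         # number of Monte Carlo draws

# Hypothetical elemental uncertainties (mean, standard deviation):
rho = rng.normal(1.20, 0.01, N)     # density measurement
v   = rng.normal(50.0, 0.50, N)     # velocity measurement

# Derived variable of interest, recomputed for every draw
q = 0.5 * rho * v**2                # dynamic pressure (example only)

print(round(q.mean(), 1), round(q.std(), 1))
```

Sampling every elemental uncertainty and recomputing the derived variable yields its empirical distribution; the standard deviation of that distribution is the propagated uncertainty, and separating random from systematic contributions amounts to sampling those error sources separately.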
Validation of extremes within the Perfect-Predictor Experiment of the COST Action VALUE
NASA Astrophysics Data System (ADS)
Hertig, Elke; Maraun, Douglas; Wibig, Joanna; Vrac, Mathieu; Soares, Pedro; Bartholy, Judith; Pongracz, Rita; Mares, Ileana; Gutierrez, Jose Manuel; Casanueva, Ana; Alzbutas, Robertas
2016-04-01
Extreme events are of widespread concern due to their damaging consequences for natural and anthropogenic systems. From science to applications, the statistical attributes of extremes (rarity, infrequent occurrence, and low probability) are coupled with their strong socio-economic impact. Specific end-user needs regarding information about extreme events depend on the type of application, but the common element is the request for easily accessible climate change information with a clear description of its uncertainties and limitations. Within the Perfect-Predictor Experiment of the COST Action VALUE, extreme indices modelled by a wide range of downscaling methods are compared to reference indices calculated from observational data. The experiment uses reference data from a selection of 86 weather stations representative of the different climates in Europe. Results are presented for temperature and precipitation extremes and include aspects of the marginal distribution as well as spell-length related aspects.
Neri, Elizabeth M; Stringer, Kate J; Spadaro, Antonia J; Ballman, Marie R; Grunbaum, Jo Anne
2015-03-01
This study examined the roles academic researchers can play to inform policy and environmental strategies that promote health and prevent disease. Prevention Research Centers (PRCs) engage in academic-community partnerships to conduct applied public health research. Interviews were used to collect data on the roles played by 32 PRCs to inform policy and environmental strategies that were implemented between September 2009 and September 2010. Descriptive statistics were calculated in SAS 9.2. A difference in roles played was observed depending on whether strategies were policy or environmental. Of the policy initiatives, the most common roles were education, research, and partnership. In contrast, the most prevalent roles the PRCs played in environmental approaches were research and providing health promotion resources. Academic research centers play various roles to help inform policy and environmental strategies. © 2014 Society for Public Health Education.
Nursing Activities Score: nursing work load in a burns Intensive Care Unit1
Camuci, Marcia Bernadete; Martins, Júlia Trevisan; Cardeli, Alexandrina Aparecida Maciel; Robazzi, Maria Lúcia do Carmo Cruz
2014-01-01
Objective to evaluate the nursing work load in a Burns Intensive Care Unit according to the Nursing Activities Score. Method an exploratory, descriptive cross-sectional study with a quantitative approach. The Nursing Activities Score was used for data collection between October 2011 and May 2012, totalling 1,221 measurements obtained from 50 patients' hospital records. Qualitative variables were described in tables; quantitative variables were summarized with statistical measures. Results the mean Nursing Activities Score was 70.4% and the median was 70.3%, corresponding to the percentage of time spent on direct patient care in 24 hours. Conclusion the Nursing Activities Score provided information on the process of caring for patients hospitalized in a Burns Intensive Care Unit, and indicated a high work load for the nursing team of the sector studied. PMID:26107842
Dynamic Conductivity and Partial Ionization in Warm, Dense Hydrogen
NASA Astrophysics Data System (ADS)
Zaghoo, M.; Silvera, I. F.
2017-10-01
A theoretical description for optical conduction experiments in dense fluid hydrogen is presented. Different quantum statistical approaches are used to describe the mechanism of electron transport in hydrogen's high-temperature dense phase. We show that at the onset of the metallic transition, optical conduction could be described by a strong rise in the atomic polarizability, resulting from increased ionization; whereas in the highly degenerate limit, the Ziman weak-scattering model better describes the observed saturation of reflectance. In the highly degenerate region, the inclusion of partial ionization effects provides excellent agreement with experimental results. Hydrogen's fluid metallic state is revealed to be a partially ionized free-electron plasma. These results provide a crucial benchmark for ab initio calculations as well as an important guide for future experiments. Research supported by DOE Stockpile Stewardship Academic Alliance Program, Grant DE-FG52-10NA29656, and NASA Earth and Space Science Fellowship Program, Award NNX14AP17H.
Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.
2011-01-01
Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.
NASA Astrophysics Data System (ADS)
Lemaître, J.-F.; Dubray, N.; Hilaire, S.; Panebianco, S.; Sida, J.-L.
2013-12-01
Our purpose is to determine fission fragment characteristics in the framework of a scission point model named SPY, for Scission Point Yields. This approach can be considered a theoretical laboratory for studying the fission mechanism, since it gives access to the correlation between the fragments' properties and their nuclear structure, such as shell correction, pairing, collective degrees of freedom, and odd-even effects. Which ones are dominant in the final state? What is the impact of the compound nucleus structure? The SPY model consists in a statistical description of the fission process at the scission point, where fragments are completely formed and well separated with fixed properties. The most important property of the model relies on the nuclear structure of the fragments, which is derived from full quantum microscopic calculations. This approach allows computing the fission final state of extremely exotic nuclei which are inaccessible to most of the fission models currently available.
How Good Are Statistical Models at Approximating Complex Fitness Landscapes?
du Plessis, Louis; Leventhal, Gabriel E.; Bonhoeffer, Sebastian
2016-01-01
Fitness landscapes determine the course of adaptation by constraining and shaping evolutionary trajectories. Knowledge of the structure of a fitness landscape can thus predict evolutionary outcomes. Empirical fitness landscapes, however, have so far only offered limited insight into real-world questions, as the high dimensionality of sequence spaces makes it impossible to exhaustively measure the fitness of all variants of biologically meaningful sequences. We must therefore revert to statistical descriptions of fitness landscapes that are based on a sparse sample of fitness measurements. It remains unclear, however, how much data are required for such statistical descriptions to be useful. Here, we assess the ability of regression models accounting for single and pairwise mutations to correctly approximate a complex quasi-empirical fitness landscape. We compare approximations based on various sampling regimes of an RNA landscape and find that the sampling regime strongly influences the quality of the regression. On the one hand it is generally impossible to generate sufficient samples to achieve a good approximation of the complete fitness landscape, and on the other hand systematic sampling schemes can only provide a good description of the immediate neighborhood of a sequence of interest. Nevertheless, we obtain a remarkably good and unbiased fit to the local landscape when using sequences from a population that has evolved under strong selection. Thus, current statistical methods can provide a good approximation to the landscape of naturally evolving populations. PMID:27189564
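A regression accounting for single and pairwise mutations can be sketched on a toy binary landscape; the sequence length, effect sizes, and epistatic pairs below are invented for illustration, not the RNA landscape used in the study:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
L, n = 5, 200                             # sequence length, sample size

# Toy binary genotypes and a landscape with known single + pairwise effects
X = rng.integers(0, 2, size=(n, L)).astype(float)
beta_single = rng.normal(0, 1, L)
beta_pair = {(0, 1): 0.8, (2, 4): -0.5}   # assumed epistatic pairs
y = X @ beta_single + sum(b * X[:, i] * X[:, j] for (i, j), b in beta_pair.items())

# Design matrix: intercept + single-site terms + all pairwise interaction terms
pairs = list(combinations(range(L), 2))
D = np.column_stack([np.ones(n), X] + [X[:, i] * X[:, j] for i, j in pairs])
coef, *_ = np.linalg.lstsq(D, y, rcond=None)

fit = D @ coef
r2 = 1 - np.sum((y - fit) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(r2, 3))   # noise-free toy landscape: fit is essentially exact
```

On a real landscape the fit degrades with higher-order epistasis and sparse sampling, which is exactly the dependence on sampling regime that the study quantifies.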
Attitude towards Pre-Marital Genetic Screening among Students of Osun State Polytechnics in Nigeria
ERIC Educational Resources Information Center
Odelola, J. O.; Adisa, O.; Akintaro, O. A.
2013-01-01
This study investigated the attitude towards pre-marital genetic screening among students of Osun State Polytechnics. A descriptive survey design was used for the study. The instrument for data collection was a self-developed, structured questionnaire in a four-point Likert-scale format. Descriptive statistics of frequency counts and percentages were…
Basic School Teachers' Perceptions about Curriculum Design in Ghana
ERIC Educational Resources Information Center
Abudu, Amadu Musah; Mensah, Mary Afi
2016-01-01
This study focused on teachers' perceptions about curriculum design and barriers to their participation. The sample size was 130 teachers, who responded to a questionnaire. The analyses made use of descriptive statistics and qualitative descriptions. The study found that the level of teachers' participation in curriculum design is low. The results further…
Descriptive and dynamic psychiatry: a perspective on DSM-III.
Frances, A; Cooper, A M
1981-09-01
The APA Task Force on Nomenclature and Statistics attempted to make DSM-III a descriptive nosology that is atheoretical in regard to etiology. The authors believe that a sharp polarity between morphological classification and explanatory formulation is artificial and misleading, and they critically review DSM-III from a psychodynamic perspective. They compare and contrast the descriptive orientation in psychiatry with the psychodynamic orientation and conclude that the two approaches overlap, that they are complementary and necessary to each other, and that there is a descriptive data base underlying dynamic psychiatry which may be usefully included in future nomenclatures.
NASA Astrophysics Data System (ADS)
Avakyan, L. A.; Heinz, M.; Skidanenko, A. V.; Yablunovski, K. A.; Ihlemann, J.; Meinertz, J.; Patzig, C.; Dubiel, M.; Bugaev, L. A.
2018-01-01
The formation of a localized surface plasmon resonance (SPR) spectrum of randomly distributed gold nanoparticles in the surface layer of silicate float glass, generated and implanted by UV ArF-excimer laser irradiation of a thin gold layer sputter-coated on the glass surface, was studied by the T-matrix method, which enables particle agglomeration to be taken into account. The experimental technique used is promising for the production of submicron patterns of plasmonic nanoparticles (given by laser masks or gratings) without damage to the glass surface. Analysis of the applicability of the multi-spheres T-matrix (MSTM) method to the studied material was performed through calculations of SPR characteristics for differently arranged and structured gold nanoparticles (gold nanoparticles in solution, particle pairs, and core-shell silver-gold nanoparticles) for which either experimental data or results of modeling by other methods are available. For the studied gold nanoparticles in glass, it was revealed that the theoretical description of their SPR spectrum requires consideration of the plasmon coupling between particles, which can be done effectively by MSTM calculations. The obtained statistical distributions over particle sizes and over interparticle distances demonstrated saturation behavior with respect to the number of particles under consideration, which enabled us to determine the effective aggregate of particles sufficient to form the SPR spectrum. The suggested technique for fitting an experimental SPR spectrum of gold nanoparticles in glass, by varying the geometrical parameters of the particle aggregate in recurring calculations of the spectrum by the MSTM method, enabled us to determine statistical characteristics of the aggregate: the average distance between particles, the average size, and the size distribution of the particles.
The fitting strategy of the SPR spectrum presented here can be applied to nanoparticles of any nature and in various substances, and, in principle, can be extended for particles with non-spherical shapes, like ellipsoids, rod-like and other T-matrix-solvable shapes.
Interactive application of quadratic expansion of chi-square statistic to nonlinear curve fitting
NASA Technical Reports Server (NTRS)
Badavi, F. F.; Everhart, Joel L.
1987-01-01
This report contains a detailed theoretical description of an all-purpose, interactive curve-fitting routine that is based on P. R. Bevington's description of the quadratic expansion of the Chi-Square statistic. The method is implemented in the associated interactive, graphics-based computer program. Taylor's expansion of Chi-Square is first introduced, and justifications for retaining only the first term are presented. From the expansion, a set of n simultaneous linear equations is derived, then solved by matrix algebra. A brief description of the code is presented along with the limited number of changes that are required to customize the program for a particular task. To evaluate the performance of the method and the goodness of nonlinear curve fitting, two typical engineering problems are examined and the graphical and tabular output of each is discussed. A complete listing of the entire package is included as an appendix.
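The quadratic-expansion method the report implements can be sketched in a few lines: each iteration builds the curvature matrix and gradient from a weighted Jacobian and solves the resulting simultaneous linear equations for the parameter correction (the Gauss-Newton reading of Bevington's expansion). The exponential-decay example below is illustrative, not one of the report's engineering cases.

```python
import numpy as np

def fit_quadratic_expansion(f, grad, p0, x, y, sigma, n_iter=30):
    """Minimize chi^2 = sum(((y - f(x, p)) / sigma)**2) via the quadratic
    expansion: each iteration solves alpha @ dp = beta, where
    alpha = J.T @ J (curvature matrix) and beta = J.T @ r, with J the
    sigma-weighted Jacobian and r the weighted residual vector."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = (y - f(x, p)) / sigma
        J = grad(x, p) / sigma[:, None]
        dp = np.linalg.solve(J.T @ J, J.T @ r)
        p = p + dp
    return p

# Illustrative fit: exponential decay y = a * exp(-b * x), noiseless data.
model = lambda x, p: p[0] * np.exp(-p[1] * x)
jacobian = lambda x, p: np.column_stack(
    [np.exp(-p[1] * x), -p[0] * x * np.exp(-p[1] * x)]
)
x = np.linspace(0.0, 4.0, 50)
y = model(x, [2.0, 0.7])
sigma = np.full_like(x, 0.05)
p_fit = fit_quadratic_expansion(model, jacobian, [1.5, 0.8], x, y, sigma)
```

Production codes typically add Marquardt-style damping to the curvature matrix for robustness far from the minimum; the undamped step above shows only the core linear-algebra structure.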
Impact of Student Calculator Use on the 2013 NAEP Twelfth-Grade Mathematics Assessment
ERIC Educational Resources Information Center
Klecker, Beverly M.; Klecker, Richard L.
2014-01-01
This descriptive research study examined 2013 NAEP 12th-grade mathematics scores by students' use of graphing calculators in math classes and the kind of calculator students used during NAEP assessment. NAEP Data Explorer analysis included two questions from Student Factors: How often do you use these different kinds of calculators in math class?…
Stevens' orbital reduction factor in ionic clusters
NASA Astrophysics Data System (ADS)
Gajek, Z.; Mulak, J.
1985-11-01
General expressions for the reduction coefficients of matrix elements of the angular momentum operator in ionic clusters or molecular systems have been derived. The reduction in this approach results from overlap and covalency effects and plays an important role in reconciling magnetic and spectroscopic experimental data. The formulated expressions make possible a phenomenological description of the effect with two independent parameters for typical equidistant clusters. Some detailed calculations also suggest the possibility of a one-parameter description. The results of these calculations for some ionic uranium compounds are presented as an example.
Self-Consistent-Field Calculation on Lithium Hydride for Undergraduates.
ERIC Educational Resources Information Center
Rioux, Frank; Harriss, Donald K.
1980-01-01
Describes a self-consistent-field (SCF) linear-combination-of-atomic-orbitals molecular-orbital calculation on the valence electrons of lithium hydride using the method of Roothaan. This description is intended for undergraduate physics students.
Quasi-Monochromatic Visual Environments and the Resting Point of Accommodation
1988-01-01
accommodation. No statistically significant differences were revealed to support the possibility of color mediated differential regression to resting...discussed with respect to the general findings of the total sample as well as the specific behavior of individual participants. The summarized statistics ...remaining ten varied considerably with respect to the averaged trends reported in the above descriptive statistics as well as with respect to precision
NASA Technical Reports Server (NTRS)
Geyser, L. C.
1978-01-01
A digital computer program, DYGABCD, was developed that generates linearized, dynamic models of simulated turbofan and turbojet engines. DYGABCD is based on an earlier computer program, DYNGEN, that is capable of calculating simulated nonlinear steady-state and transient performance of one- and two-spool turbojet engines or two- and three-spool turbofan engines. Most control design techniques require linear system descriptions. For multiple-input/multiple-output systems such as turbine engines, state space matrix descriptions of the system are often desirable. DYGABCD computes the state space matrices commonly referred to as the A, B, C, and D matrices required for a linear system description. The report discusses the analytical approach and provides a user's manual, FORTRAN listings, and a sample case.
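A state-space (A, B, C, D) description of the kind DYGABCD produces can be exercised with a few lines of generic simulation code. The matrices below are illustrative placeholders (a stable two-state discrete-time system), not an engine model.

```python
import numpy as np

# Illustrative stable two-state discrete-time system, not an engine model:
#   x[k+1] = A x[k] + B u[k],   y[k] = C x[k] + D u[k]
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

def simulate(A, B, C, D, u_seq, x0):
    """Propagate the linear state-space model over an input sequence."""
    x, outputs = np.asarray(x0, dtype=float), []
    for u in u_seq:
        u_vec = np.array([u])
        outputs.append((C @ x + D @ u_vec)[0])  # record output first
        x = A @ x + B @ u_vec                   # then advance the state
    return np.array(outputs)

# Step response: the output settles at the DC gain C (I - A)^-1 B = 5.
y = simulate(A, B, C, D, u_seq=[1.0] * 50, x0=[0.0, 0.0])
```

The same four matrices feed directly into standard control-design tooling, which is the point of extracting them from a nonlinear engine simulation.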
Physics-based statistical learning approach to mesoscopic model selection.
Taverniers, Søren; Haut, Terry S; Barros, Kipton; Alexander, Francis J; Lookman, Turab
2015-11-01
In materials science and many other research areas, models are frequently inferred without considering their generalization to unseen data. We apply statistical learning using cross-validation to obtain an optimally predictive coarse-grained description of a two-dimensional kinetic nearest-neighbor Ising model with Glauber dynamics (GD) based on the stochastic Ginzburg-Landau equation (sGLE). The latter is learned from GD "training" data using a log-likelihood analysis, and its predictive ability for various complexities of the model is tested on GD "test" data independent of the data used to train the model. Using two different error metrics, we perform a detailed analysis of the error between magnetization time trajectories simulated using the learned sGLE coarse-grained description and those obtained using the GD model. We show that both for equilibrium and out-of-equilibrium GD training trajectories, the standard phenomenological description using a quartic free energy does not always yield the most predictive coarse-grained model. Moreover, increasing the amount of training data can shift the optimal model complexity to higher values. Our results are promising in that they pave the way for the use of statistical learning as a general tool for materials modeling and discovery.
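The model-selection logic described above, fitting at several complexities and scoring each on held-out data, can be sketched with k-fold cross-validation, using polynomial degree as a stand-in for model complexity. The data here are synthetic, not the Ising/sGLE trajectories of the paper.

```python
import numpy as np

def cv_error(x, y, degree, k=5, seed=0):
    """Mean squared held-out error of a polynomial fit of the given
    degree, estimated by k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(x))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        coef = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coef, x[fold])
        errs.append(np.mean((y[fold] - pred) ** 2))
    return float(np.mean(errs))

# Synthetic "training" data: a smooth signal plus observation noise.
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 120)
y = np.sin(3 * x) + rng.normal(0.0, 0.15, x.size)

# Choose model complexity by held-out error, not by in-sample fit:
# in-sample error always decreases with degree, held-out error does not.
best = min(range(1, 10), key=lambda d: cv_error(x, y, d))
```
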
Performing Inferential Statistics Prior to Data Collection
ERIC Educational Resources Information Center
Trafimow, David; MacDonald, Justin A.
2017-01-01
Typically, in education and psychology research, the investigator collects data and subsequently performs descriptive and inferential statistics. For example, a researcher might compute group means and use the null hypothesis significance testing procedure to draw conclusions about the populations from which the groups were drawn. We propose an…
Inside Rural Pennsylvania: A Statistical Profile.
ERIC Educational Resources Information Center
Center for Rural Pennsylvania, Harrisburg.
Graphs, data tables, maps, and written descriptions give a statistical overview of rural Pennsylvania. A section on rural demographics covers population changes, racial and ethnic makeup, age cohorts, and families and income. Pennsylvania's rural population, the nation's largest, has increased more than its urban population since 1950, with the…
Education Statistics Quarterly, Summer 2002.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2002-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message…
Education Statistics Quarterly, Spring 2002.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2002-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message…
Notes on numerical reliability of several statistical analysis programs
Landwehr, J.M.; Tasker, Gary D.
1999-01-01
This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.
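The flavor of such a reliability benchmark can be reproduced with NASTY-style values: a huge common offset with tiny spread exposes the cancellation error in the textbook one-pass variance formula, while the two-pass formula stays exact. The five values below are illustrative, not the actual ANASTY set.

```python
# NASTY-style check: a huge common offset with tiny spread. The exact
# sample variance of these five values is 2.5 (the offset cancels out).
xs = [1e8 + d for d in (1.0, 2.0, 3.0, 4.0, 5.0)]
n = len(xs)
mean = sum(xs) / n

# "Calculator" one-pass formula: subtracts two nearly equal large
# numbers, so it is numerically unreliable on data like this.
one_pass = (sum(x * x for x in xs) - n * mean * mean) / (n - 1)

# Two-pass formula: subtract the mean first, then square. Stable.
two_pass = sum((x - mean) ** 2 for x in xs) / (n - 1)
```

Comparing `one_pass` and `two_pass` against the known value 2.5 is exactly the kind of check a benchmark of statistical software performs.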
From creation and annihilation operators to statistics
NASA Astrophysics Data System (ADS)
Hoyuelos, M.
2018-01-01
A procedure to derive the partition function of non-interacting particles with exotic or intermediate statistics is presented. The partition function is directly related to the associated creation and annihilation operators that obey some specific commutation or anti-commutation relations. The cases of Gentile statistics, quons, Polychronakos statistics, and ewkons are considered. Ewkons statistics was recently derived from the assumption of free diffusion in energy space (Hoyuelos and Sisterna, 2016); an ideal gas of ewkons has negative pressure, a feature that makes them suitable for the description of dark energy.
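For the simplest of the listed cases, Gentile statistics (maximum occupancy $p$ per state), the single-mode grand partition function that such a procedure must reproduce is the standard textbook result:

```latex
Z_p(\beta,\mu)=\sum_{n=0}^{p}e^{-n\beta(\varepsilon-\mu)}
=\frac{1-e^{-(p+1)\beta(\varepsilon-\mu)}}{1-e^{-\beta(\varepsilon-\mu)}},
\qquad
\langle n\rangle=\frac{1}{\beta}\,\frac{\partial \ln Z_p}{\partial\mu},
```

with $p=1$ recovering Fermi-Dirac and $p\to\infty$ Bose-Einstein occupation. (This is orientation only; the paper's derivation proceeds through the creation and annihilation operators' commutation relations themselves.)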
Earth Observation System Flight Dynamics System Covariance Realism
NASA Technical Reports Server (NTRS)
Zaidi, Waqar H.; Tracewell, David
2016-01-01
This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics.
Parallel auto-correlative statistics with VTK.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre; Bennett, Janine Camille
2013-08-01
This report summarizes existing statistical engines in VTK and presents both the serial and parallel auto-correlative statistics engines. It is a sequel to [PT08, BPRT09b, PT09, BPT09, PT10], which studied the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, and order statistics engines. The ease of use of the new parallel auto-correlative statistics engine is illustrated by means of C++ code snippets, and algorithm verification is provided. This report justifies the design of the statistics engines with parallel scalability in mind, and provides scalability and speed-up analysis results for the auto-correlative statistics engine.
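The auto-correlative statistic itself is simple to state apart from any particular engine: the correlation of a series with a lagged copy of itself. A minimal sketch (the AR(1)-style series is illustrative; the VTK engine's API is not reproduced here):

```python
import numpy as np

def autocorrelation(x, lag):
    """Sample autocorrelation at a given lag: correlation of the series
    with a lagged copy of itself, using deviations about the mean."""
    x = np.asarray(x, dtype=float)
    if lag == 0:
        return 1.0
    d = x - x.mean()
    return float(np.sum(d[:-lag] * d[lag:]) / np.sum(d * d))

# AR(1)-style toy series: autocorrelation should decay with lag.
rng = np.random.default_rng(2)
x = np.zeros(1000)
for t in range(1, 1000):
    x[t] = 0.8 * x[t - 1] + rng.normal()

r1 = autocorrelation(x, 1)   # near the AR coefficient 0.8
r5 = autocorrelation(x, 5)   # near 0.8**5, roughly 0.33
```

A parallel engine distributes the sums in the numerator and denominator across ranks and reduces them, which is why the statistic scales well.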
ERIC Educational Resources Information Center
Ramananantoandro, Ramanantsoa
1988-01-01
Presented is a description of a BASIC program to be used on an IBM microcomputer for calculating and plotting synthetic seismic-reflection traces for multilayered earth models. Discusses finding raypaths for given source-receiver offsets using the "shooting method" and calculating the corresponding travel times. (Author/CW)
ERIC Educational Resources Information Center
Watson, Silvana Maria R.; Lopes, João; Oliveira, Célia; Judge, Sharon
2018-01-01
Purpose: The purpose of this descriptive study is to investigate why some elementary children have difficulties mastering addition and subtraction calculation tasks. Design/methodology/approach: The researchers have examined error types in addition and subtraction calculation made by 697 Portuguese students in elementary grades. Each student…
Simplified Numerical Description of SPT Operations
NASA Technical Reports Server (NTRS)
Manzella, David H.
1995-01-01
A simplified numerical model of the plasma discharge within the SPT-100 stationary plasma thruster was developed to aid in understanding thruster operation. A one dimensional description was used. Non-axial velocities were neglected except for the azimuthal electron velocity. A nominal operating condition of 4.5 mg/s of xenon anode flow was considered with 4.5 Amperes of discharge current, and a peak radial magnetic field strength of 130 Gauss. For these conditions, the calculated results indicated ionization fractions of 0.99 near the thruster exit with a potential drop across the discharge of approximately 250 Volts. Peak calculated electron temperatures were found to be sensitive to the choice of total ionization cross section for ionization of atomic xenon by electron bombardment and ranged from 51 eV to 60 eV. The calculated ionization fraction, potential drop, and electron number density agree favorably with previous experiments. Calculated electron temperatures are higher than previously measured.
Computed potential energy surfaces for chemical reactions
NASA Technical Reports Server (NTRS)
Heinemann, K.; Walch, Stephen P.
1992-01-01
The work on the NH + NO system which was described in the last progress report was written up and a draft of the manuscript is included in the appendix. The appendix also contains a draft of a manuscript on an Ar + H + H surface. New work which was completed in the last six months includes the following: (1) calculations on the (1)CH2 + H2O, H2 + HCOH, and H2 + H2CO product channels in the CH3 + OH reaction; (2) calculations for the NH2 + O reaction; (3) calculations for the CH3 + O2 reaction; and (4) calculations for CH3O and the two decomposition channels--CH2OH and H + H2CO. Detailed descriptions of this work will be given in manuscripts; however, brief descriptions of the CH3 + OH and CH3 + O2 projects are given.
Hazelton, Lara; Allen, Michael; MacLeod, Tanya; LeBlanc, Constance; Boudreau, Michelle
2016-01-01
Understanding of statistical terms used to measure treatment effect is important for evidence-informed medical teaching and practice. We explored knowledge of these terms among clinical faculty who instruct and mentor a continuum of medical learners to inform medical faculty learning needs. This was a mixed methods study that used a questionnaire to measure a health professional's understanding of measures of treatment effect and a focus group to explore perspectives on learning, applying, and teaching these terms. We analyzed questionnaire data using descriptive statistics and focus group data using thematic analysis. We analyzed responses from clinical faculty who were physicians and completed all sections of the questionnaire (n = 137). Overall, approximately 55% were highly confident in their understanding of statistical terms; self-reported understanding was highest for number needed to treat (77%). Only 26% of respondents correctly responded to all comprehension questions; however, 80% correctly responded to at least one of these questions. There was a significant association between self-reported understanding and the ability to correctly calculate these terms. A focus group with clinical/medical faculty (n = 4) revealed themes of mentorship, support and resources, and beliefs about the value of statistical literacy. We found that half of clinical faculty members are highly confident in their understanding of relative and absolute terms. Despite the limitations of self-assessment data, our study provides some evidence that self-assessment can be reliable. Recognizing that faculty development is not mandatory for clinical faculty in many centers, and the notion that faculty may benefit from mentorship in critical appraisal topics, it may be appropriate to first engage and support influential clinical faculty rather than using a broad strategy to achieve universal statistical literacy.
Second, senior leadership in medical education should support continuous learning by providing paid, protected time for faculty to incorporate evidence in their teaching.
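The statistical terms the questionnaire targeted (number needed to treat, absolute and relative risk reduction) are all one-line calculations from trial event rates. A minimal sketch with hypothetical counts, purely to illustrate the definitions:

```python
# Hypothetical trial counts, chosen only to illustrate the definitions.
events_control, n_control = 30, 200   # 15% event rate in controls
events_treated, n_treated = 20, 200   # 10% event rate with treatment

cer = events_control / n_control      # control event rate
eer = events_treated / n_treated      # experimental event rate

arr = cer - eer                       # absolute risk reduction: 0.05
rr = eer / cer                        # relative risk: about 0.67
rrr = 1.0 - rr                        # relative risk reduction: about 0.33
nnt = 1.0 / arr                       # number needed to treat: 20
```

The contrast between the relative figure (a 33% reduction) and the absolute one (treat 20 patients to prevent one event) is precisely why confusion between these terms matters for practice.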
A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.
Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin
2017-06-01
Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation also leads to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, the sample covariance may not be a precise estimate if the sample size is very limited, which is usually the case for data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed by the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication.
We implemented the T2-statistic into an R package T2GA, which is available at https://github.com/roqe/T2GA.
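The core of such a T2-type statistic, a quadratic form in the mean expression ratios with a covariance built from external confidence scores rather than from the (small) sample, can be sketched as follows. The pathway, scores, and data are toy stand-ins; the actual scoring and integration procedure is in the authors' T2GA package.

```python
import numpy as np

def t2_statistic(log_ratios, sigma):
    """T2-type quadratic form n * m' sigma^-1 m, where m is the mean
    log expression ratio over n samples and sigma is a knowledge-based
    covariance (built from interaction scores, not from the sample)."""
    X = np.asarray(log_ratios, dtype=float)
    n = X.shape[0]
    m = X.mean(axis=0)
    return float(n * m @ np.linalg.solve(sigma, m))

# Toy 3-protein pathway: off-diagonal structure from hypothetical
# confidence scores (stand-ins for STRING/HitPredict retrievals),
# blended with the identity to keep sigma positive definite.
conf = np.array([[1.0, 0.8, 0.2],
                 [0.8, 1.0, 0.5],
                 [0.2, 0.5, 1.0]])
sigma = 0.4 * conf + 0.6 * np.eye(3)

null_ratios = np.array([[ 0.1, -0.2,  0.05],
                        [-0.1,  0.1,  0.00],
                        [ 0.2,  0.0, -0.10],
                        [ 0.0, -0.1,  0.10]])
shifted = null_ratios + np.array([1.5, 1.2, 0.9])  # coordinated change

t2_null = t2_statistic(null_ratios, sigma)   # small: no mean shift
t2_shift = t2_statistic(shifted, sigma)      # large: pathway perturbed
```

Fixing the covariance from prior knowledge sidesteps the instability of a sample covariance estimated from only a handful of mass-spectrometry runs, which is the design choice the abstract motivates.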
Education Statistics Quarterly, Fall 2002.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2003-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…
BLS Machine-Readable Data and Tabulating Routines.
ERIC Educational Resources Information Center
DiFillipo, Tony
This report describes the machine-readable data and tabulating routines that the Bureau of Labor Statistics (BLS) is prepared to distribute. An introduction discusses the LABSTAT (Labor Statistics) database and the BLS policy on release of unpublished data. Descriptions summarizing data stored in 25 files follow this format: overview, data…
Education Statistics Quarterly, Fall 2001.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2001-01-01
The publication gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message from…
ERIC Educational Resources Information Center
Tryon, Warren W.; Lewis, Charles
2009-01-01
Tryon presented a graphic inferential confidence interval (ICI) approach to analyzing two independent and dependent means for statistical difference, equivalence, replication, indeterminacy, and trivial difference. Tryon and Lewis corrected the reduction factor used to adjust descriptive confidence intervals (DCIs) to create ICIs and introduced…
Examples of Data Analysis with SPSS-X.
ERIC Educational Resources Information Center
MacFarland, Thomas W.
Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics using SPSS-X Release 3.0 for VAX/UNIX. Statistical measures covered include Chi-square analysis; Spearman's rank correlation coefficient; Student's t-test with two independent samples; Student's t-test with a paired sample; One-way analysis…
Statistics as Unbiased Estimators: Exploring the Teaching of Standard Deviation
ERIC Educational Resources Information Center
Wasserman, Nicholas H.; Casey, Stephanie; Champion, Joe; Huey, Maryann
2017-01-01
This manuscript presents findings from a study about the knowledge for and planned teaching of standard deviation. We investigate how understanding variance as an unbiased (inferential) estimator--not just a descriptive statistic for the variation (spread) in data--is related to teachers' instruction regarding standard deviation, particularly…
Education Statistics Quarterly. Volume 5, Issue 1.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2003-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data product, and funding opportunities developed over a 3-month period. Each issue also contains a message…
Education Statistics Quarterly, Winter 2001.
ERIC Educational Resources Information Center
Dillow, Sally, Ed.
2002-01-01
This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…
76 FR 60817 - Notice of Proposed Information Collection Requests
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-30
... Statistics (NCES) is seeking a three-year clearance for a new survey data collection for the College... most recent data are available. The clearance being requested is to survey the institutions on this... and sector specific findings from the CATE using descriptive statistics. The main cost areas showing...
Basic Statistical Concepts and Methods for Earth Scientists
Olea, Ricardo A.
2008-01-01
INTRODUCTION Statistics is the science of collecting, analyzing, interpreting, modeling, and displaying masses of numerical data primarily for the characterization and understanding of incompletely known systems. Over the years, these objectives have led to a fair amount of analytical work to achieve, substantiate, and guide descriptions and inferences.
NASA Technical Reports Server (NTRS)
Cavalleri, R. J.; Agnone, A. M.
1972-01-01
A computer program for calculating internal supersonic flow fields with chemical reactions and shock waves typical of supersonic combustion chambers with either wall or mid-stream injectors is described. The usefulness and limitations of the program are indicated. The program manual and listing are presented along with a sample calculation.
Mousavi, Seyed Mohammad Hadi; Dargahi, Hossein; Mohammadi, Sara
2016-10-01
Creating a safe health care system requires the establishment of High Reliability Organizations (HROs), which reduce errors and increase the level of safety in hospitals. This model focuses on improving reliability through better process design, building a culture of accreditation, and leveraging human factors. The present study intends to determine the readiness of hospitals of Tehran University of Medical Sciences for the establishment of the HROs model, from the viewpoint of the managers of these hospitals. This is a descriptive-analytical study carried out in 2013-2014. The research population consists of 105 senior and middle managers of 15 hospitals of Tehran University of Medical Sciences. The data collection tool was a 55-question researcher-made questionnaire, which included six elements of HROs to assess the level of readiness for establishing the HROs model from the managers' point of view. The validity of the questionnaire was established through the content validity method using 10 experts in the area of hospital accreditation, and its reliability was calculated through the test-retest method, with a correlation coefficient of 0.90. The response rate was 90 percent. A Likert scale was used for the questions, and data analysis was conducted through SPSS version 21. Descriptive statistics were presented via tables, normal distributions of data, and means. Analytical methods, including the t-test, Mann-Whitney, Spearman, and Kruskal-Wallis tests, were used for inferential statistics. The study showed that, from the viewpoint of the senior and middle managers of the hospitals considered in this study, these hospitals are indeed ready for acceptance and establishment of the HROs model. A significant relationship was shown between the HROs model and its elements and demographic details of the managers, such as their age, work experience, management experience, and level of management.
Although the studied hospitals, as viewed by their managers, are capable of attaining the goals of HROs, many challenges remain. It is therefore suggested that a detailed audit of the hospitals' current status with respect to the different characteristics of HROs be conducted, and that workshops be held for medical and non-medical employees and managers of hospitals as an influencing factor; a subsequent re-assessment process can then help move the hospitals from their current position towards an HRO culture.
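Test-retest reliability of the kind reported above (a correlation coefficient of 0.90 between two administrations of the questionnaire) is simply the Pearson correlation of paired scores. A minimal sketch with hypothetical scores, not the study's data:

```python
# Hypothetical paired scores from two administrations of an instrument.
test1 = [42, 37, 50, 45, 39, 48, 41, 44]
test2 = [40, 38, 49, 46, 38, 47, 43, 45]

n = len(test1)
m1, m2 = sum(test1) / n, sum(test2) / n

# Pearson r = covariance / (sd1 * sd2), with sample (n - 1) denominators.
cov = sum((a - m1) * (b - m2) for a, b in zip(test1, test2)) / (n - 1)
s1 = (sum((a - m1) ** 2 for a in test1) / (n - 1)) ** 0.5
s2 = (sum((b - m2) ** 2 for b in test2) / (n - 1)) ** 0.5
r = cov / (s1 * s2)   # test-retest reliability coefficient
```

An r near 1 means respondents are ranked consistently across administrations, which is what a reliability coefficient of 0.90 asserts for the study's questionnaire.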
North Carolina Migrant Education Program. 1971 Project Evaluation Reports, Vol. I.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh.
Evaluation reports for 10 of the 23 1971 Summer Migrant Projects in North Carolina are presented in Volume I of this compilation. Each report contains the following information: (1) descriptive statistics and results of student achievement; (2) description of the project as obtained from site team reports and other available information; and (3)…
The Social Profile of Students in Basic General Education in Ecuador: A Data Analysis
ERIC Educational Resources Information Center
Buri, Olga Elizabeth Minchala; Stefos, Efstathios
2017-01-01
The objective of this study is to examine the social profile of students who are enrolled in Basic General Education in Ecuador. Both a descriptive and a multidimensional statistical analysis were carried out based on the data provided by the National Survey of Employment, Unemployment and Underemployment in 2015. The descriptive analysis shows the…
Policymakers Dependence on Evidence in Education Decision Making in Oyo State Ministry of Education
ERIC Educational Resources Information Center
Babalola, Joel B.; Gbolahan, Sowunmi
2016-01-01
This study investigated policymakers' dependence on evidence in education decision making in the Oyo State Ministry of Education. The study was conducted under a descriptive survey design; 44 of the 290 policymakers of the Ministry and Board of Education across the State were purposively selected for the study. Descriptive statistics of frequency…
User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.
1982-01-01
This User's Manual contains a complete description of the computer codes known as the AXISYMMETRIC DIFFUSER DUCT code or ADD code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculation with experimental flows. The input/output and general use of the code is described in the first volume. The second volume contains a detailed description of the code including the global structure of the code, list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code which generates coordinate systems for arbitrary axisymmetric ducts.
Analysis of statistical misconception in terms of statistical reasoning
NASA Astrophysics Data System (ADS)
Maryati, I.; Priatna, N.
2018-05-01
Reasoning skill is needed by everyone in the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, and interpret information and to draw conclusions from it. Developing this skill can be done at various levels of education. However, the skill is low because many people, students included, assume that statistics is just the ability to count and use formulas. Students still have a negative attitude toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course with respect to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on their statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. If 65 is the minimal value for meeting the standard of course competence, the students' mean values fall below the standard. The results of the misconception study emphasized which subtopics should be considered. Based on the assessment results, students' misconceptions occurred in: 1) writing mathematical sentences and symbols well, 2) understanding basic definitions, and 3) determining the concept to be used in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.
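The comparison reported above (class means of 49.7 and 51.8 against a passing standard of 65) is a direct application of descriptive statistics. A minimal sketch with simulated scores, since the study's raw data are not published here:

```python
import statistics

def summarize(scores, standard=65):
    """Return mean, sample SD, and whether the class mean meets the standard."""
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)  # sample standard deviation
    return mean, sd, mean >= standard

# Simulated misconception-test scores, for illustration only
scores = [38, 45, 52, 49, 61, 44, 57, 50, 43, 58]
mean, sd, meets = summarize(scores)
print(f"mean={mean:.1f} sd={sd:.1f} meets standard: {meets}")
```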
Application of pedagogy reflective in statistical methods course and practicum statistical methods
NASA Astrophysics Data System (ADS)
Julie, Hongki
2017-08-01
The subjects Elementary Statistics, Statistical Methods, and Statistical Methods Practicum aim to equip Mathematics Education students with descriptive and inferential statistics. Students' understanding of descriptive and inferential statistics is important in the Mathematics Education Department, especially for those whose final project involves quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their data, and to establish the relationships between the independent and dependent variables defined in their research. In fact, when students carried out final projects involving quantitative research, it was not rare to find them making mistakes in the steps of drawing conclusions and errors in choosing the hypothesis-testing procedure. As a result, they reached incorrect conclusions. This is a fatal mistake for anyone doing quantitative research. Several things were gained from the implementation of reflective pedagogy in the teaching-learning process of the Statistical Methods and Statistical Methods Practicum courses, namely: 1. Twenty-two students passed the course and one student did not. 2. The highest grade, A, was achieved by 18 students. 3. According to all the students, they could develop their critical stance and build caring for each other through the learning process in this course. 4. All students agreed that, through the learning process they underwent in the course, they could build caring for each other.
Designing an Error Resolution Checklist for a Shared Manned-Unmanned Environment
2010-06-01
Reframing Serial Murder Within Empirical Research.
Gurian, Elizabeth A
2017-04-01
Empirical research on serial murder is limited due to the lack of consensus on a definition, the continued use of primarily descriptive statistics, and linkage to popular culture depictions. These limitations also inhibit our understanding of these offenders and affect credibility in the field of research. Therefore, this comprehensive overview of a sample of 508 cases (738 total offenders, including partnered groups of two or more offenders) provides analyses of solo male, solo female, and partnered serial killers to elucidate statistical differences and similarities in offending and adjudication patterns among the three groups. This analysis of serial homicide offenders not only supports previous research on offending patterns present in the serial homicide literature but also reveals that empirically based analyses can enhance our understanding beyond traditional case studies and descriptive statistics. Further research based on these empirical analyses can aid in the development of more accurate classifications and definitions of serial murderers.
NASA Astrophysics Data System (ADS)
Guala, M.; Liu, M.
2017-12-01
The kinematics of sediment particles is investigated by non-intrusive imaging methods to provide a statistical description of bedload transport in conditions near the threshold of motion. In particular, we focus on the cyclic transition between motion and rest regimes to quantify the waiting time statistics inferred to be responsible for anomalous diffusion, and so far elusive. Despite obvious limitations in the spatio-temporal domain of the observations, we are able to identify the probability distributions of the particle step time and length, velocity, acceleration, waiting time, and thus distinguish which quantities exhibit well converged mean values, based on the thickness of their respective tails. The experimental results shown here for four different transport conditions highlight the importance of the waiting time distribution and represent a benchmark dataset for the stochastic modeling of bedload transport.
NASA Astrophysics Data System (ADS)
Csordás, A.; Graham, R.; Szépfalusy, P.; Vattay, G.
1994-01-01
One wall of Artin's billiard on the Poincaré half-plane is replaced by a one-parameter (cp) family of nongeodetic walls. A brief description of the classical phase space of this system is given. In the quantum domain, the continuous and gradual transition from Poisson-like to Gaussian-orthogonal-ensemble (GOE) level statistics, due to small perturbations breaking the symmetry responsible for the "arithmetic chaos" at cp=1, is studied. Another GOE-to-Poisson transition, due to the mixed phase space at large perturbations, is also investigated. A satisfactory description of the intermediate level statistics by the Brody distribution was found in both cases. The study supports the existence of a scaling region around cp=1. A finite-size scaling relation for the Brody parameter as a function of 1-cp and the number of levels considered can be established.
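The Brody distribution used above to interpolate between Poisson (q = 0) and GOE Wigner (q = 1) level-spacing statistics has the standard closed form P_q(s) = (q+1) b s^q exp(-b s^(q+1)) with b = Gamma((q+2)/(q+1))^(q+1), where q is the Brody parameter. A sketch, assuming unit mean spacing:

```python
import math

def brody_pdf(q, s):
    """Brody level-spacing distribution with unit mean spacing.
    q = 0 reproduces the Poisson result exp(-s);
    q = 1 reproduces the GOE Wigner surmise (pi/2) s exp(-pi s^2 / 4)."""
    b = math.gamma((q + 2) / (q + 1)) ** (q + 1)
    return (q + 1) * b * s ** q * math.exp(-b * s ** (q + 1))

# Interpolation between the two limiting statistics at s = 1
for q in (0.0, 0.5, 1.0):
    print(q, round(brody_pdf(q, 1.0), 4))
```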
A two-component rain model for the prediction of attenuation statistics
NASA Technical Reports Server (NTRS)
Crane, R. K.
1982-01-01
A two-component rain model has been developed for calculating attenuation statistics. In contrast to most other attenuation prediction models, the two-component model calculates the occurrence probability for volume cells or debris attenuation events. The model performed significantly better than the International Radio Consultative Committee model when used for predictions on earth-satellite paths. It is expected that the model will have applications in modeling the joint statistics required for space diversity system design, the statistics of interference due to rain scatter at attenuating frequencies, and the duration statistics for attenuation events.
Mayo, Charles S; Yao, John; Eisbruch, Avraham; Balter, James M; Litzenberg, Dale W; Matuszak, Martha M; Kessler, Marc L; Weyburn, Grant; Anderson, Carlos J; Owen, Dawn; Jackson, William C; Haken, Randall Ten
2017-01-01
To develop statistical dose-volume histogram (DVH)-based metrics and a visualization method to quantify the comparison of treatment plans with historical experience and among different institutions. The descriptive statistical summary (ie, median, first and third quartiles, and 95% confidence intervals) of volume-normalized DVH curve sets of past experiences was visualized through the creation of statistical DVH plots. Detailed distribution parameters were calculated and stored in JavaScript Object Notation files to facilitate management, including transfer and potential multi-institutional comparisons. In the treatment plan evaluation, structure DVH curves were scored against computed statistical DVHs and weighted experience scores (WESs). Individual, clinically used, DVH-based metrics were integrated into a generalized evaluation metric (GEM) as a priority-weighted sum of normalized incomplete gamma functions. Historical treatment plans for 351 patients with head and neck cancer, 104 with prostate cancer who were treated with conventional fractionation, and 94 with liver cancer who were treated with stereotactic body radiation therapy were analyzed to demonstrate the usage of statistical DVH, WES, and GEM in a plan evaluation. A shareable dashboard plugin was created to display statistical DVHs and integrate GEM and WES scores into a clinical plan evaluation within the treatment planning system. Benchmarking with normal tissue complication probability scores was carried out to compare the behavior of GEM and WES scores. DVH curves from historical treatment plans were characterized and presented, with difficult-to-spare structures (ie, frequently compromised organs at risk) identified. Quantitative evaluations by GEM and/or WES compared favorably with the normal tissue complication probability Lyman-Kutcher-Burman model, transforming a set of discrete threshold-priority limits into a continuous model reflecting physician objectives and historical experience. 
Statistical DVH offers an easy-to-read, detailed, and comprehensive way to visualize the quantitative comparison with historical experiences and among institutions. WES and GEM metrics offer a flexible means of incorporating discrete threshold-prioritizations and historic context into a set of standardized scoring metrics. Together, they provide a practical approach for incorporating big data into clinical practice for treatment plan evaluations.
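The GEM described above is a priority-weighted sum of normalized incomplete gamma functions. The published parameterization is not reproduced here, but a hedged sketch of the building block, the regularized lower incomplete gamma function P(a, x), and a hypothetical priority-weighted sum (the shape parameter and value/limit scaling below are illustrative assumptions, not the authors' formula):

```python
import math

def reg_inc_gamma(a, x, terms=200):
    """Regularized lower incomplete gamma P(a, x) via its power series.
    P(1, x) reduces to 1 - exp(-x)."""
    if x <= 0:
        return 0.0
    total, term = 0.0, 1.0 / a
    for n in range(1, terms):
        total += term
        term *= x / (a + n)
    return total * x ** a * math.exp(-x) / math.gamma(a)

def gem_sketch(metrics, weights):
    """Hypothetical priority-weighted sum of gamma-normalized metric scores.
    metrics: list of (value, limit) pairs; weights: priorities summing to 1.
    Shape parameter 2.0 is an illustrative assumption."""
    return sum(w * reg_inc_gamma(2.0, v / lim)
               for w, (v, lim) in zip(weights, metrics))

# Two hypothetical DVH metrics, one near its limit and one well under it
print(round(gem_sketch([(28.0, 30.0), (10.0, 60.0)], [0.7, 0.3]), 3))
```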
Effect of early childhood protein-energy malnutrition on permanent dentition dental caries.
Reyes-Perez, Elisandra; Borrell, Luisa N; Katz, Ralph V; Gebrian, Bette J; Prophete, Samuel; Psoter, Walter J
2014-01-01
The objective of this study is to determine the effect of early childhood protein-energy malnutrition (ECPEM) on decayed, missing, filled tooth (DMFT) scores in the permanent dentition of rural Haitian adolescents aged 11-19 years (n = 1,006). We used data from a retrospective cohort that was developed from the Haitian Health Foundation database and merged records on weight-for-age covering the birth through 5-year-old period for all enrolled participants. Dental examinations and interviewer-administered structured questionnaires on demographic and socioeconomic status, and relative sugar consumption were completed in 1,058 participants aged 11-19 years. The ECPEM was defined based on weight-for-age of the subjects during their first 5 years of life that were converted to Z-scores based on the National Center for Health Statistics referent database. Descriptive statistics were calculated. DMFT was regressed on ECPEM adjusting for age, sex, current body mass index Z-score, socioeconomic status, relative sugar consumption, and number of permanent teeth present assuming a Poisson distribution. Questionable malnutrition [rate ratio (RR) = 0.72; 95 percent confidence interval (CI), 0.61-0.86] and malnutrition (RR = 0.58; 95 percent CI, 0.49-0.69) were associated with a statistically significant lower DMFT in Haitian adolescents. ECPEM status is inversely associated with DMFT in Haitian participants. Further follow-up of these same participants will be recommended to evaluate the potential caries catch-up effect. © 2013 American Association of Public Health Dentistry.
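The rate ratios and 95% confidence intervals reported above (e.g., RR = 0.58; 95% CI, 0.49-0.69) come from adjusted Poisson regression; for a single unadjusted comparison, the equivalent hand calculation is RR = (e1/n1)/(e0/n0) with a log-scale Wald interval. A sketch with made-up counts, not the study's data:

```python
import math

def rate_ratio_ci(e1, n1, e0, n0, z=1.96):
    """Unadjusted rate ratio with a log-scale Wald 95% CI.
    e = event count (e.g., total DMFT), n = person denominator."""
    rr = (e1 / n1) / (e0 / n0)
    se = math.sqrt(1 / e1 + 1 / e0)  # SE of log(RR) for Poisson counts
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts for illustration only
rr, lo, hi = rate_ratio_ci(e1=290, n1=500, e0=400, n0=400)
print(f"RR={rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```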
Azizollah, Arbabisarjou; Abolghasem, Farhang; Mohammad Amin, Dadgar
2015-12-14
An organization's effort is directed at achieving a common goal, and many constructs are needed for organizations to function. Organizational culture and organizational commitment are key concepts in management. The objective of the current research is to study the relationship between organizational culture and organizational commitment among the personnel of Zahedan University of Medical Sciences. This is a descriptive-correlational study. The statistical population was the whole tenured staff of Zahedan University of Medical Sciences who worked for the organization in 2012-2013. A random sampling method was used and 165 samples were chosen. Two standardized questionnaires, on organizational culture (Schein, 1984) and organizational commitment (Meyer & Allen, 2002), were applied. The face and construct validity of the questionnaires were approved by lecturers in management and other experts. The reliability of the organizational culture and organizational commitment questionnaires was 0.89 and 0.88 respectively, by Cronbach's alpha coefficient. All statistical calculations were performed using Statistical Package for the Social Sciences version 21.0 (SPSS Inc., Chicago, IL, USA). The level of significance was set at P<0.05. The findings of the study showed that there was a significant relationship between organizational culture and organizational commitment (P-value=0.027). The results also showed significant relations between organizational culture and affective commitment (P-value=0.009), organizational culture and continuance commitment (P-value=0.009), and organizational culture and normative commitment (P-value=0.009).
Azizollah, Arbabisarjou; Abolghasem, Farhang; Amin, Dadgar Mohammad
2016-01-01
Background and Objective: An organization's effort is directed at achieving a common goal, and many constructs are needed for organizations to function. Organizational culture and organizational commitment are key concepts in management. The objective of the current research is to study the relationship between organizational culture and organizational commitment among the personnel of Zahedan University of Medical Sciences. Materials and Methods: This is a descriptive-correlational study. The statistical population was the whole tenured staff of Zahedan University of Medical Sciences who worked for the organization in 2012-2013. A random sampling method was used and 165 samples were chosen. Two standardized questionnaires, on organizational culture (Schein, 1984) and organizational commitment (Meyer & Allen, 2002), were applied. The face and construct validity of the questionnaires were approved by lecturers in management and other experts. The reliability of the organizational culture and organizational commitment questionnaires was 0.89 and 0.88 respectively, by Cronbach's alpha coefficient. All statistical calculations were performed using Statistical Package for the Social Sciences version 21.0 (SPSS Inc., Chicago, IL, USA). The level of significance was set at P<0.05. Findings: The findings of the study showed that there was a significant relationship between organizational culture and organizational commitment (P-value=0.027). The results also showed significant relations between organizational culture and affective commitment (P-value=0.009), organizational culture and continuance commitment (P-value=0.009), and organizational culture and normative commitment (P-value=0.009). PMID:26925884
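Cronbach's alpha, reported above as the reliability coefficient (0.89 and 0.88), is computed from item variances and total-score variance as alpha = k/(k-1) * (1 - sum of item variances / variance of totals). A sketch with hypothetical Likert responses, not the study's data:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score lists (same respondents).
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(statistics.pvariance(it) for it in items)
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

# Hypothetical Likert responses (rows = items, columns = respondents)
items = [
    [4, 3, 5, 2, 4, 3],
    [5, 3, 4, 2, 4, 2],
    [4, 2, 5, 3, 5, 3],
]
print(round(cronbach_alpha(items), 2))
```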
29 CFR 2520.102-2 - Style and format of summary plan description.
Code of Federal Regulations, 2014 CFR
2014-07-01
... plan description shall be written in a manner calculated to be understood by the average plan... to these participants, offering them assistance. The assistance provided need not involve written... assistance provided need not involve written materials, but shall be given in the non-English language common...
29 CFR 2520.102-2 - Style and format of summary plan description.
Code of Federal Regulations, 2012 CFR
2012-07-01
... plan description shall be written in a manner calculated to be understood by the average plan... to these participants, offering them assistance. The assistance provided need not involve written... assistance provided need not involve written materials, but shall be given in the non-English language common...
29 CFR 2520.102-2 - Style and format of summary plan description.
Code of Federal Regulations, 2011 CFR
2011-07-01
... plan description shall be written in a manner calculated to be understood by the average plan... to these participants, offering them assistance. The assistance provided need not involve written... assistance provided need not involve written materials, but shall be given in the non-English language common...
29 CFR 2520.102-2 - Style and format of summary plan description.
Code of Federal Regulations, 2013 CFR
2013-07-01
... plan description shall be written in a manner calculated to be understood by the average plan... to these participants, offering them assistance. The assistance provided need not involve written... assistance provided need not involve written materials, but shall be given in the non-English language common...
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac
1987-01-01
A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis is presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac
1987-01-01
A preliminary uncertainty analysis has been performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
Statistical Properties of SEE Rate Calculation in the Limits of Large and Small Event Counts
NASA Technical Reports Server (NTRS)
Ladbury, Ray
2007-01-01
This viewgraph presentation reviews the statistical properties of single-event effect (SEE) rate calculations. The goal of SEE rate calculation is to bound the SEE rate; the question is by how much. The presentation covers: (1) understanding errors on SEE cross sections, (2) methodology: maximum likelihood and confidence contours, (3) tests with simulated data, and (4) applications.
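Bounding a rate from a small event count is a Poisson interval problem: a standard exact upper limit solves P(X <= n; mu) = 1 - CL for mu. A sketch using simple bisection, as one common approach rather than necessarily the method in the presentation:

```python
import math

def poisson_cdf(n, mu):
    """P(X <= n) for a Poisson distribution with mean mu."""
    term, total = math.exp(-mu), 0.0
    for k in range(n + 1):
        total += term
        term *= mu / (k + 1)
    return total

def upper_limit(n, cl=0.95):
    """Exact upper confidence limit on the Poisson mean given n observed events.
    For n = 0 at 95% this recovers the familiar -ln(0.05) ~ 3 events."""
    lo, hi = 0.0, 10.0 * (n + 5)  # bracket: cdf decreases in mu
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n, mid) > 1.0 - cl:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Zero observed upsets in fluence F still bounds the cross section: sigma < upper_limit(0) / F
print(round(upper_limit(0), 3))
```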
Unified Description of Inelastic Propensity Rules for Electron Transport through Nanoscale Junctions
NASA Astrophysics Data System (ADS)
Paulsson, Magnus; Frederiksen, Thomas; Ueba, Hiromu; Lorente, Nicolás; Brandbyge, Mads
2008-06-01
We present a method to analyze the results of first-principles based calculations of electronic currents including inelastic electron-phonon effects. This method allows us to determine the electronic and vibrational symmetries in play, and hence to obtain the so-called propensity rules for the studied systems. We show that only a few scattering states—namely those belonging to the most transmitting eigenchannels—need to be considered for a complete description of the electron transport. We apply the method on first-principles calculations of four different systems and obtain the propensity rules in each case.
NASA Astrophysics Data System (ADS)
Lin, Shu; Wang, Rui; Xia, Ning; Li, Yongdong; Liu, Chunliang
2018-01-01
Statistical multipactor theories are critical prediction approaches for multipactor breakdown determination. However, these approaches still require a trade-off between calculation efficiency and accuracy. This paper presents an improved stationary statistical theory for efficient threshold analysis of two-surface multipactor. A general integral equation over the distribution function of the electron emission phase, with both single-sided and double-sided impacts considered, is formulated. The modeling results indicate that the improved stationary statistical theory can not only obtain equally good accuracy of multipactor threshold calculation as the nonstationary statistical theory, but also achieve high calculation efficiency concurrently. By using this improved stationary statistical theory, the total time consumption in calculating full multipactor susceptibility zones of parallel plates can be decreased by as much as a factor of four relative to the nonstationary statistical theory. It also shows that the effect of single-sided impacts is indispensable for accurate multipactor prediction of coaxial lines and is more significant for high-order multipactor. Finally, the influence of secondary emission yield (SEY) properties on the multipactor threshold is further investigated. It is observed that the first cross energy and the energy range between the first cross and the SEY maximum both play a significant role in determining the multipactor threshold, which agrees with numerical simulation results in the literature.